WorldWideScience

Sample records for carlo image simulation-differential

  1. Medical Imaging Image Quality Assessment with Monte Carlo Methods

    Science.gov (United States)

    Michail, C. M.; Karpetas, G. E.; Fountos, G. P.; Kalyvas, N. I.; Martini, Niki; Koukou, Vaia; Valais, I. G.; Kandarakis, I. S.

    2015-09-01

    The aim of the present study was to assess the image quality of PET scanners through a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model, developed with the GATE MC package; reconstructed images were obtained with the STIR software for tomographic image reconstruction, using cluster computing. The PET scanner simulated in this study was the GE DiscoveryST. A plane source consisting of a TLC plate was simulated by a layer of silica gel on an aluminum (Al) foil substrate, immersed in an 18F-FDG bath solution (1 MBq). Image quality was assessed in terms of the Modulation Transfer Function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed with the maximum likelihood estimation (MLE) OSMAPOSL algorithm. OSMAPOSL reconstruction was assessed using various subsets (3 to 21) and iterations (1 to 20), as well as various beta (hyper)parameter values. MTF values were found to increase up to the 12th iteration, remaining almost constant thereafter. MTF improves with lower beta values. The simulated PET evaluation method based on the TLC plane source can also be useful in research for the further development of PET and SPECT scanners through GATE simulations.
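
    The MTF estimation step in studies like this one generically reduces to Fourier-transforming a line spread function (LSF) extracted from the reconstructed plane-source image. Below is a minimal numpy sketch of that step, with a hypothetical Gaussian LSF standing in for a measured profile; it is not the paper's actual analysis code.

```python
import numpy as np

def mtf_from_lsf(lsf, pixel_mm):
    """Estimate the MTF as the normalized magnitude of the Fourier
    transform of a line spread function (LSF)."""
    lsf = lsf - lsf.min()          # remove baseline offset
    lsf = lsf / lsf.sum()          # normalize to unit area
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]             # MTF(0) = 1 by convention
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)  # cycles/mm
    return freqs, mtf

# Hypothetical Gaussian LSF sampled on a 1 mm pixel grid
lsf = np.exp(-0.5 * (np.arange(-32, 32) / 2.0) ** 2)
freqs, mtf = mtf_from_lsf(lsf, pixel_mm=1.0)
print(freqs[mtf > 0.1].max())      # frequency where the MTF falls to ~10%
```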

  2. Monte Carlo studies for medical imaging detector optimization

    Science.gov (United States)

    Fois, G. R.; Cisbani, E.; Garibaldi, F.

    2016-02-01

    This work reports on Monte Carlo optimization studies of detection systems for Molecular Breast Imaging with radionuclides and for Bremsstrahlung Imaging in nuclear medicine. Molecular Breast Imaging requires competing detector performances: high efficiency and high spatial resolution. In this direction, an innovative device has been proposed which combines images from two different, and somewhat complementary, detectors at opposite sides of the breast. The dual-detector design allows for spot compression and significantly improves the performance of the overall system if all components are well tuned and the layout and processing are carefully optimized; here the Monte Carlo simulation represents a valuable tool. In recent years, the potential of Bremsstrahlung Imaging in internal radiotherapy (with beta- radiopharmaceuticals) has clearly emerged; Bremsstrahlung Imaging is currently performed with existing detectors generally used for single photon radioisotopes. We are evaluating the possibility of adapting an existing compact gamma camera and optimizing its performance by Monte Carlo for Bremsstrahlung imaging with photons emitted by the beta- decay of 90Y.

  3. Bayesian inference and Markov chain Monte Carlo in imaging

    Science.gov (United States)

    Higdon, David M.; Bowsher, James E.

    1999-05-01

    Over the past 20 years, many problems in Bayesian inference that were previously intractable have become fairly routine to deal with, using a computationally intensive technique for exploring the posterior distribution called Markov chain Monte Carlo (MCMC). Primarily because of insufficient computing capabilities, most MCMC applications have been limited to rather standard statistical models. However, with the computing power of modern workstations, a fully Bayesian approach with MCMC is now possible for many imaging applications. Such an approach can be quite useful because it not only leads to `point' estimates of an underlying image or emission source, but also gives a means for quantifying uncertainties regarding the image. This paper gives an overview of Bayesian image analysis and focuses on applications relevant to medical imaging. Particular focus is on prior image models and on outlining MCMC methods for these models.
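
    As a toy illustration of the MCMC approach described above, the following sketch runs a random-walk Metropolis sampler on a 1D "image" with a Gaussian likelihood and a pairwise-smoothness (Markov random field) prior; the posterior samples yield both a point estimate (the mean) and an uncertainty map (the standard deviation). All model parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(x, y, sigma=1.0, beta=5.0):
    # Gaussian likelihood plus a pairwise smoothness (MRF) prior
    loglik = -0.5 * np.sum((y - x) ** 2) / sigma ** 2
    logprior = -beta * np.sum((x[1:] - x[:-1]) ** 2)
    return loglik + logprior

def metropolis(y, n_iter=5000, step=0.1):
    x = y.copy()                                  # start at the data
    lp = log_posterior(x, y)
    samples = []
    for _ in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_posterior(prop, y)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        samples.append(x.copy())
    return np.array(samples)

y = rng.standard_normal(16)                       # toy "image" (1D for brevity)
samples = metropolis(y)
posterior_mean = samples[1000:].mean(axis=0)      # point estimate
posterior_std = samples[1000:].std(axis=0)        # per-pixel uncertainty
```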

  4. Microscopic imaging through turbid media Monte Carlo modeling and applications

    CERN Document Server

    Gu, Min; Deng, Xiaoyuan

    2015-01-01

    This book provides a systematic introduction to the principles of microscopic imaging through tissue-like turbid media in terms of Monte Carlo simulation. It describes various gating mechanisms based on the physical differences between unscattered and scattered photons, and methods for microscopic image reconstruction using the concept of the effective point spread function. Imaging an object embedded in a turbid medium is a challenging problem in physics as well as in biophotonics. A turbid medium surrounding an object under inspection causes multiple scattering, which degrades contrast, resolution and signal-to-noise ratio. Biological tissues are typically turbid media. Microscopic imaging through a tissue-like turbid medium can provide higher resolution than transillumination imaging, in which no objective is used. This book serves as a valuable reference for engineers and scientists working on microscopy of tissue-like turbid media.
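
    The core of such Monte Carlo models of turbid media is sampling photon scattering angles from an anisotropic phase function, most often Henyey-Greenstein. The following sketch draws scattering-angle cosines with the standard inverse-CDF formula; the anisotropy value g = 0.9 is a typical tissue-like assumption, not a value taken from the book.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_hg_costheta(g, n):
    """Draw scattering-angle cosines from the Henyey-Greenstein phase
    function with anisotropy factor g (g = 0 is isotropic scattering)."""
    xi = rng.random(n)
    if abs(g) < 1e-6:
        return 2.0 * xi - 1.0                      # isotropic limit
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

# Tissue-like forward scattering: g around 0.9
cos_theta = sample_hg_costheta(0.9, 100_000)
print("mean cos(theta):", cos_theta.mean())        # should be close to g
```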

  5. Monte Carlo simulations in small animal PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Branco, Susana [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)], E-mail: susana.silva@fc.ul.pt; Jan, Sebastien [Service Hospitalier Frederic Joliot, CEA/DSV/DRM, Orsay (France); Almeida, Pedro [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)

    2007-10-01

    This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated to small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited to modeling the microPET FOCUS system and to implementing realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The microPET FOCUS simulation model with GATE has been validated for spatial resolution, counting-rate performance, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using {sup 18}F{sup -} and [{sup 18}F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results show that it is possible to simulate small animal PET acquisitions under realistic conditions, and these simulations are expected to be useful for improving quantitative analysis in PET mouse body studies.

  6. Monte Carlo simulations in small animal PET imaging

    International Nuclear Information System (INIS)

    This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated to small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited to modeling the microPET FOCUS system and to implementing realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The microPET FOCUS simulation model with GATE has been validated for spatial resolution, counting-rate performance, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using 18F- and [18F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results show that it is possible to simulate small animal PET acquisitions under realistic conditions, and these simulations are expected to be useful for improving quantitative analysis in PET mouse body studies.

  7. Monte Carlo simulations of landmine detection using neutron backscattering imaging

    Energy Technology Data Exchange (ETDEWEB)

    Datema, Cor P. E-mail: c.datema@iri.tudelft.nl; Bom, Victor R.; Eijk, Carel W.E. van

    2003-11-01

    Neutron backscattering is a technique that has been successfully applied to the detection of non-metallic landmines. Most of the effort in this field has concentrated on single detectors that are scanned across the soil. Here, two new approaches are presented in which a two-dimensional image of the hydrogen distribution in the soil is made. The first method uses an array of position-sensitive {sup 3}He tubes placed in close proximity to the soil. The second method is based on coded aperture imaging: thermal neutrons from the soil are projected onto a detector which is typically placed one to several meters above the soil. Both methods use a pulsed D/D neutron source. The Monte Carlo simulation package GEANT4 was used to investigate the performance of both imaging systems.

  8. Image based Monte Carlo Modeling for Computational Phantom

    Science.gov (United States)

    Cheng, Mengyun; Wang, Wen; Zhao, Kai; Fan, Yanchang; Long, Pengcheng; Wu, Yican

    2014-06-01

    The evaluation of the effects of ionizing radiation and of the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, helping to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct realistic computational phantoms. However, manual description and verification of the models for Monte Carlo (MC) simulation are tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed by the FDS Team (Advanced Nuclear Energy Research Team, http://www.fds.org.cn) as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can automatically convert CT/segmented sectioned images into computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested on several medical and sectioned image sets and has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created with MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in a Treatment Planning System (TPS), as well as radiation exposure of the human body in radiation protection.
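
    The essential image-to-phantom conversion that a tool like MCAM automates can be pictured as mapping each voxel's segmentation label to a material ID and mass density. The sketch below is a deliberately reduced, hypothetical version of that step (three materials instead of Rad-HUMAN's 46 organs/tissues) and does not reflect MCAM's actual implementation.

```python
import numpy as np

# Hypothetical label table; a real phantom such as Rad-HUMAN distinguishes
# 46 organs/tissues, each with its own composition and density.
DENSITY_G_CM3 = {0: 0.0012,   # air
                 1: 1.04,     # soft tissue
                 2: 1.85}     # bone

def segmented_to_voxel_phantom(seg):
    """Convert a segmented (label) image volume into the per-voxel
    material-ID and mass-density arrays a Monte Carlo transport code
    consumes; a toy version of image-based automatic modeling."""
    lut = np.array([DENSITY_G_CM3[k] for k in sorted(DENSITY_G_CM3)])
    material = seg.astype(np.int32)   # here label index == material ID
    density = lut[material]           # g/cm^3 per voxel
    return material, density

seg = np.zeros((64, 64, 64), dtype=np.uint8)   # hypothetical segmentation
seg[16:48, 16:48, 16:48] = 1                   # soft-tissue block
seg[28:36, 28:36, 28:36] = 2                   # embedded bone
material, density = segmented_to_voxel_phantom(seg)
```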

  9. Robust Image Denoising using a Virtual Flash Image for Monte Carlo Ray Tracing

    DEFF Research Database (Denmark)

    Moon, Bochang; Jun, Jong Yun; Lee, JongHyeob;

    2013-01-01

    We propose an efficient and robust image-space denoising method for noisy images generated by Monte Carlo ray tracing methods. Our method is based on two new concepts: virtual flash images and homogeneous pixels. Inspired by recent developments in flash photography, virtual flash images emulate...... parameters. To highlight the benefits of our method, we apply our method to two Monte Carlo ray tracing methods, photon mapping and path tracing, with various input scenes. We demonstrate that using virtual flash images and homogeneous pixels with a standard denoising method outperforms state...... values. While denoising each pixel, we consider only homogeneous pixels—pixels that are statistically equivalent to each other. This makes it possible to define a stochastic error bound of our method, and this bound goes to zero as the number of ray samples goes to infinity, irrespective of denoising...

  10. GPU based Monte Carlo for PET image reconstruction: parameter optimization

    International Nuclear Information System (INIS)

    This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method, all the physical effects in a PET system are taken into account, so superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphics Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required by the iterative reconstruction algorithm, so to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data; this allows the best image quality to be achieved with the least possible computational effort. Based on this research, recommendations can be given for effective partitioning of computational effort across the iterations in limited-time reconstructions. (author)
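
    A simple way to picture the sample-count optimization described above is a batch-wise stopping rule: keep simulating batches of positron decays until the statistical uncertainty of a region-of-interest estimate falls below a target. The sketch below, including the simulate_batch interface and all numbers, is a hypothetical stand-in for the paper's method.

```python
import numpy as np

rng = np.random.default_rng(9)

def decays_needed(simulate_batch, batch=10**5, rel_err=0.01, max_batches=200):
    """Grow the number of simulated positron decays in batches until the
    relative standard error of a region-of-interest estimate falls
    below the target rel_err."""
    vals = []
    for n in range(1, max_batches + 1):
        vals.append(simulate_batch(batch))           # one MC batch -> ROI value
        if n >= 2:
            sem = np.std(vals, ddof=1) / np.sqrt(n)  # standard error of mean
            if sem / np.mean(vals) < rel_err:
                return n * batch
    return max_batches * batch

# Hypothetical batch simulator: a noisy estimate of an ROI mean of ~5.0
print(decays_needed(lambda b: 5.0 + 0.1 * rng.standard_normal()))
```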

  11. Monte Carlo simulation of the image formation process in portal imaging

    International Nuclear Information System (INIS)

    We have written Monte Carlo programs to simulate the formation of radiological images. Our code is used to propagate a simulated x-ray fluence through each component of an existing video-based portal imaging system. This simulated fluence consists of a 512×512 pixel image containing both contrast-detail patterns and checker patterns to assess the spatial resolution of the simulated portal imager. All of the components of the portal imaging system were modeled as a cascade of eight linear stages. Using this code, one can assess the visual impact of changing components in the imaging chain by changing the appropriate probability density function. Virtual experiments were performed to assess the visual impact of replacing the lens and TV camera with an amorphous silicon array, and the effect of scattered radiation on portal images.
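
    A cascaded linear-systems model of this kind can be sketched as a chain of stochastic stages acting on the input fluence: quantum selection, gain, spatial spreading, and additive readout noise. The following is an illustrative four-stage toy (the paper models eight stages); all stage parameters are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)

def cascade(fluence, eta=0.3, gain=50.0, blur_sigma=1.5, read_noise=2.0):
    """Propagate a photon-fluence image through a simplified cascade of
    linear stages: quantum selection, stochastic gain, blur, noise."""
    q = rng.poisson(fluence)                    # incident quanta
    detected = rng.binomial(q, eta)             # stage 1: quantum selection
    light = rng.poisson(detected * gain)        # stage 2: stochastic gain
    blurred = gaussian_filter(light.astype(float), blur_sigma)  # stage 3: spread
    return blurred + rng.normal(0.0, read_noise, fluence.shape) # stage 4: readout

fluence = np.full((512, 512), 100.0)            # uniform input fluence
fluence[200:312, 200:312] += 30.0               # a contrast-detail insert
image = cascade(fluence)
```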

  12. Monte Carlo modeling of ultrasound probes for image guided radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Bazalova-Carter, Magdalena, E-mail: bazalova@uvic.ca [Department of Radiation Oncology, Stanford University, Stanford, California 94305 and Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 2Y2 (Canada); Schlosser, Jeffrey [SoniTrack Systems, Inc., Palo Alto, California 94304 (United States); Chen, Josephine [Department of Radiation Oncology, UCSF, San Francisco, California 94143 (United States); Hristov, Dimitre [Department of Radiation Oncology, Stanford University, Stanford, California 94305 (United States)

    2015-10-15

    Purpose: To build Monte Carlo (MC) models of two ultrasound (US) probes and to quantify the effect of beam attenuation due to the US probes for radiation therapy delivered under real-time US image guidance. Methods: MC models of two Philips US probes, an X6-1 matrix-array transducer and a C5-2 curved-array transducer, were built in the EGSnrc, BEAMnrc, and DOSXYZnrc codes, based on their megavoltage (MV) CT images acquired in a Tomotherapy machine with a 3.5 MV beam. Mass densities in the probes were assigned based on an electron density calibration phantom consisting of cylinders with mass densities between 0.2 and 8.0 g/cm{sup 3}. Beam attenuation due to the US probes in horizontal (for both probes) and vertical (for the X6-1 probe) orientation was measured in a solid water phantom for 6 and 15 MV (15 × 15) cm{sup 2} beams with a 2D ionization chamber array and radiographic films at 5 cm depth. The MC models of the US probes were validated by comparison of the measured dose distributions with dose distributions predicted by MC. Attenuation of depth dose in the (15 × 15) cm{sup 2} beams and in small circular beams due to the presence of the probes was assessed by means of MC simulations. Results: The 3.5 MV CT number to mass density calibration curve was found to be linear with R{sup 2} > 0.99. The maximum mass densities in the X6-1 and C5-2 probes were found to be 4.8 and 5.2 g/cm{sup 3}, respectively. Dose profile differences between MC simulations and measurements of less than 3% for US probes in horizontal orientation were found, with the exception of the penumbra region. The largest dose difference, 6%, was observed in dose profiles of the X6-1 probe placed in vertical orientation, which was attributed to inadequate modeling of the probe cable. Gamma analysis of the simulated and measured doses showed that over 96% of measurement points passed the 3%/3 mm criteria for both probes placed in horizontal orientation and for the X6-1 probe in vertical orientation.

  13. TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Badal, A [U.S. Food and Drug Administration (CDRH/OSEL), Silver Spring, MD (United States); Zbijewski, W [Johns Hopkins University, Baltimore, MD (United States); Bolch, W [University of Florida, Gainesville, FL (United States); Sechopoulos, I [Emory University, Atlanta, GA (United States)

    2014-06-15

    Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primary ratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10{sup 7} x-rays/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the

  14. Monte Carlo Simulations of Arterial Imaging with Optical Coherence Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Amendt, P.; Estabrook, K.; Everett, M.; London, R.A.; Maitland, D.; Zimmerman, G.; Colston, B.; da Silva, L.; Sathyam, U.

    2000-02-01

    The laser-tissue interaction code LATIS [London et al., Appl. Optics 36, 9068 (1998)] is used to analyze photon scattering histories representative of an optical coherence tomography (OCT) experiment performed at Lawrence Livermore National Laboratory. Monte Carlo photonics with Henyey-Greenstein anisotropic scattering is implemented and used to simulate signal discrimination of intravascular structure. An analytic model is developed and used to obtain a scaling-law relation for optimization of the OCT signal and to validate the Monte Carlo photonics. The appropriateness of the Henyey-Greenstein phase function is studied by direct comparison with more detailed Mie scattering theory using an ensemble of spherical dielectric scatterers. Modest differences are found between the two prescriptions for describing photon angular scattering in tissue. In particular, the Mie scattering phase functions provide less overall reflectance signal but more signal contrast compared to the Henyey-Greenstein formulation.

  15. TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging

    International Nuclear Information System (INIS)

    Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primary ratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10^7 x-rays/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the virtual

  16. A scatter correction method for Tl-201 images: A Monte Carlo investigation

    Energy Technology Data Exchange (ETDEWEB)

    Hademenos, G.J.; King, M.A. (Univ. of Massachusetts Medical Center, Worcester, MA (United States). Dept. of Nuclear Medicine); Ljungberg, M. (Lund Univ., (Sweden). Dept. of Radiation Physics); Zubal, G.; Harrell, C.R. (Yale Univ. School of Medicine, New Haven, CT (United States). Dept. of Diagnostic Radiology)

    1993-08-01

    Results from the application of a modified dual photopeak window (DPW) scatter correction method to Monte Carlo simulated Tl-201 emission images are presented. In the Monte Carlo investigation, individual simulations were performed for six radiation emissions of Tl-201. For each emission, point sources of Tl-201 were imaged at various locations in a water-filled elliptical tub phantom using three energy windows: two 12% windows abutted at 72 keV and a third 10 keV window placed to the right of the photopeak window (95.001 keV - 105.000 keV). The third window was used to estimate the spilldown contribution from the Tl-201 gamma rays in each of the two photopeak windows. Using the corrected counts in these two windows, the DPW method was applied to each point source image to estimate the scatter distribution. For point source images in both homogeneous and non-homogeneous attenuating media, the application of this modified version of DPW resulted in an approximately six-fold reduction in the scatter fraction and excellent agreement in the shape of the tails between the estimated scatter distribution and the Monte Carlo-simulated truth. The method was also applied to two views of an extended cardiac distribution within an anthropomorphic phantom, again resulting in at least a six-fold improvement between the scatter estimate and the Monte Carlo-simulated true scatter.
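
    The DPW idea reduces, per pixel, to regressing the photopeak scatter against the ratio of counts in the two abutted sub-windows. The sketch below illustrates that arithmetic; the regression coefficients a and b are hypothetical placeholders for values that would be calibrated from point-source data, and this is not the authors' code.

```python
import numpy as np

def dpw_primary_estimate(w_lower, w_upper, a=0.5, b=-0.1):
    """Dual photopeak window (DPW) correction: per-pixel scatter is
    estimated from the lower/upper sub-window count ratio via a
    calibrated linear regression, then subtracted from the total."""
    total = w_lower + w_upper
    ratio = np.divide(w_lower, w_upper,
                      out=np.zeros_like(total), where=w_upper > 0)
    scatter = np.clip(total * (a * ratio + b), 0.0, None)
    return total - scatter                 # scatter-corrected primary counts

rng = np.random.default_rng(3)
w_lower = rng.poisson(40.0, (64, 64)).astype(float)   # hypothetical counts
w_upper = rng.poisson(60.0, (64, 64)).astype(float)
primary = dpw_primary_estimate(w_lower, w_upper)
```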

  17. Monte Carlo simulation of gamma ray tomography for image reconstruction

    International Nuclear Information System (INIS)

    Monte Carlo simulations of an object of known density and shape were validated against gamma ray tomography in static experiments. An aluminum half-moon piece placed inside a steel pipe was the MC simulation test object, which was also measured by means of gamma ray transmission. The wall effect of the steel pipe due to the irradiation geometry of a single source-detector-pair tomograph was evaluated by comparison with theoretical data. The MCNPX code requires a defined geometry for each photon trajectory, which practically prevents its direct use for tomography reconstruction simulation. The solution was to write a program in the Delphi language to automate the creation of input files. Simulations of tomography data with the automated MCNPX code were carried out and validated against experimental data; the data produced in this sequence were stored in a databank. The experimental setup used a Cesium-137 isotopic radioactive source (7.4 × 10^9 Bq) and a NaI(Tl) scintillation detector with a (51 × 51) × 10^-3 m crystal coupled to a multichannel analyzer, with a stainless steel tube of 0.154 m internal diameter and 0.014 m wall thickness. The results show that the MCNPX simulation code adapted to automated input files is useful for generating the data matrix M(θ,t) of a computerized gamma ray tomograph for any object of known density and regular shape. Experimental validation used the RMSE of the gamma ray paths and of the attenuation coefficient data. (author)
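
    The data matrix M(θ,t) mentioned above is, for transmission tomography, a set of Beer-Lambert line integrals taken at successive view angles. A minimal sketch, with a hypothetical pipe-plus-half-moon attenuation map loosely echoing the test object; it is not the authors' Delphi/MCNPX tool chain.

```python
import numpy as np
from scipy.ndimage import rotate

def transmission_sinogram(mu, n_angles=180, i0=1.0e6):
    """Build the data matrix M(theta, t) of a first-generation
    transmission tomograph: at each view angle, line-integrate the
    attenuation map mu (linear attenuation times pixel size, per pixel)
    and convert to transmitted intensity via Beer-Lambert."""
    thetas = np.linspace(0.0, 180.0, n_angles, endpoint=False)
    M = np.empty((n_angles, mu.shape[1]))
    for i, th in enumerate(thetas):
        proj = rotate(mu, th, reshape=False, order=1).sum(axis=0)
        M[i] = i0 * np.exp(-proj)          # expected counts per ray
    return M

# Hypothetical phantom: aluminum half-moon inside a steel pipe wall
mu = np.zeros((128, 128))
yy, xx = np.mgrid[-64:64, -64:64]
r = np.hypot(xx, yy)
mu[(r > 55) & (r < 60)] = 0.8              # pipe wall
mu[(r < 40) & (xx < 0)] = 0.3              # half-moon insert
sino = transmission_sinogram(mu)
```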

  18. Monte Carlo simulation of gamma ray tomography for image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Guedes, Karlos A.N.; Moura, Alex; Dantas, Carlos; Melo, Silvio; Lima, Emerson, E-mail: karlosguedes@hotmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Meric, Ilker [University of Bergen (Norway)

    2015-07-01

    Monte Carlo simulations of an object of known density and shape were validated against gamma ray tomography in static experiments. An aluminum half-moon piece placed inside a steel pipe was the MC simulation test object, which was also measured by means of gamma ray transmission. The wall effect of the steel pipe due to the irradiation geometry of a single source-detector-pair tomograph was evaluated by comparison with theoretical data. The MCNPX code requires a defined geometry for each photon trajectory, which practically prevents its direct use for tomography reconstruction simulation. The solution was to write a program in the Delphi language to automate the creation of input files. Simulations of tomography data with the automated MCNPX code were carried out and validated against experimental data; the data produced in this sequence were stored in a databank. The experimental setup used a Cesium-137 isotopic radioactive source (7.4 × 10^9 Bq) and a NaI(Tl) scintillation detector with a (51 × 51) × 10^-3 m crystal coupled to a multichannel analyzer, with a stainless steel tube of 0.154 m internal diameter and 0.014 m wall thickness. The results show that the MCNPX simulation code adapted to automated input files is useful for generating the data matrix M(θ,t) of a computerized gamma ray tomograph for any object of known density and regular shape. Experimental validation used the RMSE of the gamma ray paths and of the attenuation coefficient data. (author)

  19. Monte Carlo evaluation of the Filtered Back Projection method for image reconstruction in proton computed tomography

    International Nuclear Information System (INIS)

    In this paper the use of the Filtered Back Projection (FBP) algorithm to reconstruct tomographic images from high energy (200-250 MeV) proton beams is investigated. The algorithm has been studied in detail with a Monte Carlo approach, and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus, developed by our group, has been fully simulated exploiting the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, and spatial and density resolution.
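
    For orientation, the FBP algorithm itself can be stated in a few lines: ramp-filter each parallel projection in Fourier space, then back-project (smear) the filtered profiles across the image at their view angles. The sketch below assumes a parallel-beam sinogram of line integrals (for pCT these would be derived from proton energy loss) and is not the group's reconstruction code.

```python
import numpy as np
from scipy.ndimage import rotate

def fbp(sinogram):
    """Minimal filtered back projection for parallel-beam data of shape
    (n_angles, n_detectors): ramp filter in Fourier space, then smear
    each filtered profile back across the image at its view angle."""
    n_angles, n_det = sinogram.shape
    freqs = np.fft.rfftfreq(n_det)                         # ramp filter |f|
    filtered = np.fft.irfft(np.fft.rfft(sinogram, axis=1) * np.abs(freqs),
                            n=n_det, axis=1)
    recon = np.zeros((n_det, n_det))
    for i, theta in enumerate(np.linspace(0.0, 180.0, n_angles,
                                          endpoint=False)):
        smear = np.tile(filtered[i], (n_det, 1))           # back-project view
        recon += rotate(smear, theta, reshape=False, order=1)
    return recon * np.pi / n_angles

sino = np.zeros((180, 64)); sino[:, 30:34] = 1.0           # toy line integrals
img = fbp(sino)
```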

  20. Monte Carlo evaluation of the Filtered Back Projection method for image reconstruction in proton computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Cirrone, G.A.P., E-mail: cirrone@lns.infn.it [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Bucciolini, M. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Bruzzi, M. [Energetic Department, University of Florence, Via S. Marta 3, I-50139 Florence (Italy); Candiano, G. [Laboratorio di Tecnologie Oncologiche HSR, Giglio Contrada, Pietrapollastra-Pisciotto, 90015 Cefalu, Palermo (Italy); Civinini, C. [National Institute for Nuclear Physics INFN, Section of Florence, Via G. Sansone 1, Sesto Fiorentino, I-50019 Florence (Italy); Cuttone, G. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Guarino, P. [Nuclear Engineering Department, University of Palermo, Via... Palermo (Italy); Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Lo Presti, D. [Physics Department, University of Catania, Via S. Sofia 64, I-95123 Catania (Italy); Mazzaglia, S.E. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Pallotta, S. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Randazzo, N. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Sipala, V. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Physics Department, University of Catania, Via S. Sofia 64, I-95123 Catania (Italy); Stancampiano, C. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); and others

    2011-12-01

    In this paper the use of the Filtered Back Projection (FBP) algorithm to reconstruct tomographic images from high energy (200-250 MeV) proton beams is investigated. The algorithm has been studied in detail with a Monte Carlo approach, and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus, developed by our group, has been fully simulated exploiting the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, and spatial and density resolution.

  1. Patient-specific CT dose determination from CT images using Monte Carlo simulations

    Science.gov (United States)

    Liang, Qing

    Radiation dose from computed tomography (CT) has become a public concern with the increasing application of CT as a diagnostic modality, which has generated a demand for patient-specific CT dose determinations. This thesis work aims to provide a clinically applicable, Monte-Carlo-based CT dose calculation tool based on patient CT images. The source spectrum was simulated based on half-value layer measurements. Analytical calculations along with the measured flux distribution were used to estimate the bowtie-filter geometry. Relative source output at different points in a cylindrical phantom was measured and compared with Monte Carlo simulations to verify the determined spectrum and bowtie-filter geometry. Sensitivity tests were designed with four spectra with the same kVp and different half-value layers, and showed that the relative output at different locations in a phantom is sensitive to different beam qualities. An mAs-to-dose conversion factor was determined with in-air measurements using an Exradin A1SL ionization chamber. Longitudinal dose profiles were measured with thermoluminescent dosimeters (TLDs) and compared with the Monte-Carlo-simulated dose profiles to verify the mAs-to-dose conversion factor. Using only the CT images to perform Monte Carlo simulations would cause dose underestimation due to the lack of a scatter region. This scenario was demonstrated with a cylindrical phantom study. Four different methods of image extrapolation from the existing CT images and the Scout images were proposed. The results show that performing image extrapolation beyond the scan region improves the dose calculation accuracy under both step-and-shoot and helical scan modes. Two clinical studies were designed, and comparisons were performed between the current CT dose metrics and the Monte-Carlo-based organ dose determination techniques proposed in this work. The results showed that the current CT dosimetry failed to show dose differences between patients with the same

  2. Tomographic image of prompt gamma ray from boron neutron capture therapy: A Monte Carlo simulation study

    International Nuclear Information System (INIS)

    In boron neutron capture therapy (BNCT), neutron captures in 10B are used for radiation therapy. The emission point of the characteristic 478 keV prompt gamma rays coincides with the neutron capture point. If these prompt gamma rays are detected by external instruments such as a gamma camera or a single photon emission computed tomography (SPECT) system, the therapy region can be monitored during treatment using images. A feasibility study and analysis of images reconstructed from many projections (128) were conducted. The optimization of the detection system and a detailed neutron generator simulation were beyond the scope of this study. The possibility of extracting a 3D BNCT-SPECT image was confirmed using Monte Carlo simulation and the OSEM algorithm. The quality of the prompt gamma ray SPECT image obtained from BNCT was evaluated quantitatively using three different boron uptake regions and was shown to depend on their locations and sizes. The prospects for obtaining an actual BNCT-SPECT image were also estimated from the quality of the simulated image and the simulation conditions. When multiple tumor regions are to be treated with BNCT, the preceding imaging research provides BNCT facilities with a reasonable model for determining how many useful images can be obtained from SPECT. However, because the scope of this research was limited to checking the feasibility of 3D BNCT-SPECT image reconstruction using multiple projections, along with an evaluation of the image, some simulation conditions were taken from previous studies. In the future, a simulation will be conducted that includes optimized conditions for an actual BNCT facility, along with an imaging process for motion correction in BNCT. Although an excessively long simulation time was required to obtain enough events for image reconstruction, the feasibility of acquiring a 3D BNCT-SPECT image using multiple projections was confirmed using a Monte Carlo simulation, and a quantitative image analysis was

  3. Ideal-observer computation in medical imaging with use of Markov-chain Monte Carlo techniques

    Science.gov (United States)

    Kupinski, Matthew A.; Hoppin, John W.; Clarkson, Eric; Barrett, Harrison H.

    2003-03-01

    The ideal observer sets an upper limit on the performance of an observer on a detection or classification task. The performance of the ideal observer can be used to optimize hardware components of imaging systems and also to determine another observer's relative performance in comparison with the best possible observer. The ideal observer employs complete knowledge of the statistics of the imaging system, including the noise and object variability. Thus computing the ideal observer for images (large-dimensional vectors) is burdensome without severely restricting the randomness in the imaging system, e.g., assuming a flat object. We present a method for computing the ideal-observer test statistic and performance by using Markov-chain Monte Carlo techniques when we have a well-characterized imaging system, knowledge of the noise statistics, and a stochastic object model. We demonstrate the method by comparing three different parallel-hole collimator imaging systems in simulation.
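
    The Monte Carlo idea here can be miniaturized: estimate the ideal observer's likelihood ratio by averaging the data likelihood over objects sampled from the stochastic object model under each hypothesis. The sketch below uses a hypothetical Gaussian noise model and random-background object model, far simpler than the paper's imaging-system models.

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(4)

def log_likelihood(g, f, sigma=1.0):
    # Gaussian measurement-noise model p(g | f)
    return -0.5 * np.sum((g - f) ** 2) / sigma ** 2

def ideal_observer_llr(g, sample_object, signal, n_mc=2000):
    """Monte Carlo estimate of the ideal observer's log likelihood
    ratio: average the data likelihood over random objects drawn from
    the stochastic object model under each hypothesis."""
    log_h0 = [log_likelihood(g, sample_object()) for _ in range(n_mc)]
    log_h1 = [log_likelihood(g, sample_object() + signal) for _ in range(n_mc)]
    return logsumexp(log_h1) - logsumexp(log_h0)

# Hypothetical object model (random background) and a faint known signal
def sample_object(n=32):
    return 10.0 + 0.5 * rng.standard_normal(n)

signal = np.zeros(32); signal[12:20] = 0.8
g = sample_object() + signal + rng.standard_normal(32)  # signal-present image
print(ideal_observer_llr(g, sample_object, signal))     # > 0 favors "present"
```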

  4. Polarization imaging of multiply-scattered radiation based on integral-vector Monte Carlo method

    International Nuclear Information System (INIS)

    A new integral-vector Monte Carlo method (IVMCM) is developed to analyze the transfer of polarized radiation in 3D multiple-scattering particle-laden media. The method is based on a 'successive order of scattering series' expression of the integral formulation of the vector radiative transfer equation (VRTE), allowing efficient statistical tools to be applied to improve the convergence of Monte Carlo calculations of integrals. After validation against reference results in plane-parallel layer backscattering configurations, the model is applied to a cubic container filled with uniformly distributed monodispersed particles and irradiated by a monochromatic narrow collimated beam. 2D lateral images of effective Mueller matrix elements are calculated for spherical and fractal aggregate particles. Detailed analysis of multiple scattering regimes, which are very similar for unpolarized radiation transfer, allows the sensitivity of polarization imaging to particle size and morphology to be identified.

  5. Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study

    OpenAIRE

    Kim, Jin Sung; Shin, Jung Suk; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih

    2015-01-01

    Two full rotating gantries with different nozzles (a multipurpose nozzle with MLC and a scanning dedicated nozzle), together with a conventional cyclotron system, have been installed and are under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate the neutron dose equivalent per therapeutic dose, H/D, delivered to x-ray imaging equipment under various treatment conditions with Monte Carlo simulation. At first, we investigated H/D with the various modifications...

  6. Monte Carlo simulation of secondary electron images for real sample structures in scanning electron microscopy.

    Science.gov (United States)

    Zhang, P; Wang, H Y; Li, Y G; Mao, S F; Ding, Z J

    2012-01-01

    Monte Carlo simulation methods for the study of electron beam interaction with solids have been mostly concerned with specimens of simple geometry. In this article, we propose a simulation algorithm for treating arbitrarily complex structures in a real sample. The method is based on finite element triangular mesh modeling of the sample geometry and a space subdivision for accelerating the simulation. Simulation of secondary electron images in scanning electron microscopy has been performed for gold particles on a carbon substrate. Comparison of the simulation result with an experimental image confirms that this method is effective for modeling the complex morphology of a real sample.

  7. Novel imaging and quality assurance techniques for ion beam therapy a Monte Carlo study

    CERN Document Server

    Rinaldi, I; Jäkel, O; Mairani, A; Parodi, K

    2010-01-01

    Ion beams exhibit a finite and well defined range in matter together with an “inverted” depth-dose profile, the so-called Bragg peak. These favourable physical properties may enable superior tumour-dose conformality for high precision radiation therapy. On the other hand, they introduce the issue of sensitivity to range uncertainties in ion beam therapy. Although these uncertainties are typically taken into account when planning the treatment, correct delivery of the intended ion beam range has to be assured to prevent undesired underdosage of the tumour or overdosage of critical structures outside the target volume. Therefore, it is necessary to define dedicated Quality Assurance procedures to enable in-vivo range verification before or during therapeutic irradiation. For these purposes, Monte Carlo transport codes are very useful tools to support the development of novel imaging modalities for ion beam therapy. In the present work, we present calculations performed with the FLUKA Monte Carlo code and pr...

  8. Pinhole X-ray Fluorescence Imaging of Gadolinium Nanoparticles: A Preliminary Monte Carlo Study

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Seong Moon; Sung, Won Mo; Ye, Sung Joon [Seoul National University, Seoul (Korea, Republic of)

    2014-10-15

    X-ray fluorescence imaging is a modality for element-specific imaging of a subject through analysis of the characteristic x-rays produced by the interaction of high atomic number elements with incoming x-rays. Previous studies have utilized a polychromatic x-ray source to investigate the production of in vivo x-ray fluorescence images for the assessment of concentrations and locations of gold nanoparticles. However, previous efforts have so far been unable to detect low concentrations, such as 0.001% gold by weight, which is an expected concentration accumulated in tumors. We examined the feasibility of a monochromatic synchrotron x-ray implementation of pinhole x-ray fluorescence imaging by Monte Carlo simulations using MCNP5. In the current study, gadolinium (Gd) nanoparticles, which have been widely used as a contrast agent in magnetic resonance imaging and as a dose enhancer in radiation therapy, were chosen for tumor targeting. Since a monochromatic x-ray source is used, the increased x-ray fluorescence signal allows the detection of low concentrations of Gd. Two monochromatic x-ray beam energies, 50.5 keV, near the K-edge energy (50.207 keV) of Gd, and 55 keV, were compared through their respective imaging results. Using Monte Carlo simulations, the feasibility of imaging low concentrations of Gd nanoparticles (e.g., 0.001 wt%) with x-ray fluorescence using monochromatic synchrotron x-rays of two different energies was shown. In the case of imaging a single Gd column inserted in the center of a water phantom, the fluorescence signals from 0.05 wt% and 0.1 wt% Gd columns irradiated with a 50.5 keV photon beam were higher than those irradiated with 55 keV. Below 0.05 wt%, no significant differences were found.

  9. Pinhole X-ray Fluorescence Imaging of Gadolinium Nanoparticles: A Preliminary Monte Carlo Study

    International Nuclear Information System (INIS)

    X-ray fluorescence imaging is a modality for element-specific imaging of a subject through analysis of the characteristic x-rays produced by the interaction of high atomic number elements with incoming x-rays. Previous studies have utilized a polychromatic x-ray source to investigate the production of in vivo x-ray fluorescence images for the assessment of concentrations and locations of gold nanoparticles. However, previous efforts have so far been unable to detect low concentrations, such as 0.001% gold by weight, which is an expected concentration accumulated in tumors. We examined the feasibility of a monochromatic synchrotron x-ray implementation of pinhole x-ray fluorescence imaging by Monte Carlo simulations using MCNP5. In the current study, gadolinium (Gd) nanoparticles, which have been widely used as a contrast agent in magnetic resonance imaging and as a dose enhancer in radiation therapy, were chosen for tumor targeting. Since a monochromatic x-ray source is used, the increased x-ray fluorescence signal allows the detection of low concentrations of Gd. Two monochromatic x-ray beam energies, 50.5 keV, near the K-edge energy (50.207 keV) of Gd, and 55 keV, were compared through their respective imaging results. Using Monte Carlo simulations, the feasibility of imaging low concentrations of Gd nanoparticles (e.g., 0.001 wt%) with x-ray fluorescence using monochromatic synchrotron x-rays of two different energies was shown. In the case of imaging a single Gd column inserted in the center of a water phantom, the fluorescence signals from 0.05 wt% and 0.1 wt% Gd columns irradiated with a 50.5 keV photon beam were higher than those irradiated with 55 keV. Below 0.05 wt%, no significant differences were found.

  10. Characterization of array scintillation detector for follicle thyroid 2D imaging acquisition using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Image acquisition methods applied to nuclear medicine and radiobiology are valuable for determining thyroid anatomy and for detecting disorders associated with follicular cells. Monte Carlo (MC) simulation has also been used in problems related to radiation detection for mapping medical images, since data processing became compatible with personal computers (PCs). This work presents a study to find the scintillation inorganic detector array best suited to coupling to a specific light photosensor, a charge coupled device (CCD), through a fiber-optic plate in order to map the follicles of the thyroid gland. The goal is to choose the type of detector that fits the suggested application, with a spatial resolution of 10 μm and good detector efficiency. The methodology results are useful for mapping a follicle image using gamma radiation emission. A source-detector simulation is performed using the MCNP4B (Monte Carlo N-Particle) general code, considering different source energies, detector materials and geometries, including pixel sizes and reflector types. The results demonstrate that with the MCNP4B code it is possible to search for useful parameters related to the systems used in nuclear medicine, specifically in radiobiology applied to endocrine physiology studies, to acquire images of thyroid follicles. (author)

  11. Coded aperture coherent scatter imaging for breast cancer detection: a Monte Carlo evaluation

    Science.gov (United States)

    Lakshmanan, Manu N.; Morris, Robert E.; Greenberg, Joel A.; Samei, Ehsan; Kapadia, Anuj J.

    2016-03-01

    It is known that conventional x-ray imaging provides a maximum contrast between cancerous and healthy fibroglandular breast tissues of 3% based on their linear x-ray attenuation coefficients at 17.5 keV, whereas the coherent scatter signal provides a maximum contrast of 19% based on their differential coherent scatter cross sections. In order to exploit this potential contrast, we evaluate the performance of a coded-aperture coherent scatter imaging system for breast cancer detection and investigate its accuracy using Monte Carlo simulations. In the simulations we modeled our experimental system, which consists of a raster-scanned pencil beam of x-rays, a bismuth-tin coded aperture mask comprising a repeating slit pattern with 2-mm periodicity, and a linear array of 128 detector pixels with 6.5-keV energy resolution. The scanned breast tissue comprised a 3-cm sample taken from a patient-based XCAT breast phantom containing a tomosynthesis-based realistic simulated lesion. The differential coherent scatter cross section was reconstructed at each pixel in the image using an iterative reconstruction algorithm. Each pixel in the reconstructed image was then classified as either air or the type of breast tissue with which its normalized reconstructed differential coherent scatter cross section had the highest correlation coefficient. Comparison of the final tissue classification results with the ground-truth image showed that the coded aperture imaging technique has a cancerous-pixel detection sensitivity (correct identification of cancerous pixels), specificity (correctly ruling out healthy pixels as not being cancer) and accuracy of 92.4%, 91.9% and 92.0%, respectively. Our Monte Carlo evaluation of our experimental coded aperture coherent scatter imaging system shows that it is able to exploit the greater contrast available from coherently scattered x-rays to increase the accuracy of detecting cancerous regions within the breast.
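
    The per-pixel classification rule described above is simple to state in code: assign each pixel to the reference material whose differential coherent scatter cross section correlates best with the reconstructed one. A sketch with hypothetical reference curves, not the authors' pipeline:

```python
import numpy as np

def classify_pixels(recon, references):
    """Assign each pixel's reconstructed differential coherent-scatter
    cross section to the reference material (air, healthy, cancerous,
    ...) with the highest correlation coefficient."""
    labels = np.empty(recon.shape[0], dtype=int)
    for i, spectrum in enumerate(recon):
        corrs = [np.corrcoef(spectrum, ref)[0, 1] for ref in references]
        labels[i] = int(np.argmax(corrs))
    return labels

# Hypothetical cross-section curves sampled at 64 momentum-transfer bins
rng = np.random.default_rng(5)
references = [rng.random(64) for _ in range(4)]    # air + 3 tissue types
recon = np.stack([references[2] + 0.1 * rng.standard_normal(64)
                  for _ in range(100)])            # 100 pixels of tissue 2
print((classify_pixels(recon, references) == 2).mean())  # fraction correct
```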

  12. Monte Carlo Modeling of Cascade Gamma Rays in 86Y PET imaging: Preliminary results

    OpenAIRE

    Zhu, Xuping; El Fakhri, Georges

    2009-01-01

    86Y is a PET agent that could be used as an ideal surrogate to allow personalized dosimetry in 90Y radionuclide therapy. However, 86Y also emits cascade gamma rays. We have developed a Monte Carlo program based on SimSET to model cascade gamma rays in PET imaging. The new simulation was validated with the GATE simulation package. Agreements within 15% were found in spatial resolution, apparent scatter fraction (ratio of coincidences outside peak regions in line source sinograms), singles and ...

  13. Sequential Monte Carlo Methods for Joint Detection and Tracking of Multiaspect Targets in Infrared Radar Images

    Directory of Open Access Journals (Sweden)

    Marcelo G. S. Bruno

    2008-01-01

    We present in this paper a sequential Monte Carlo methodology for joint detection and tracking of a multiaspect target in image sequences. Unlike the traditional contact/association approach found in the literature, the proposed methodology enables integrated, multiframe target detection and tracking incorporating statistical models for target aspect, target motion, and background clutter. Two implementations of the proposed algorithm are discussed using, respectively, a resample-move (RS) particle filter and an auxiliary particle filter (APF). Our simulation results suggest that the APF configuration slightly outperforms the RS filter in scenarios with stealthy targets.
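
    For readers unfamiliar with the machinery, a minimal bootstrap (sequential importance resampling) particle filter looks like the following; it tracks a 1D random-walk target in Gaussian noise, a drastic simplification of the paper's multiaspect target, motion, and clutter models.

```python
import numpy as np

rng = np.random.default_rng(6)

def particle_filter(observations, n_particles=500, q=0.5, r=1.0):
    """Bootstrap (SIR) particle filter for a 1D random-walk target
    observed in Gaussian noise: propagate, weight, resample."""
    particles = rng.normal(0.0, 5.0, n_particles)   # prior over position
    estimates = []
    for z in observations:
        particles += rng.normal(0.0, q, n_particles)          # motion model
        weights = np.exp(-0.5 * (z - particles) ** 2 / r ** 2)  # likelihood
        weights /= weights.sum()
        idx = rng.choice(n_particles, n_particles, p=weights)  # resample
        particles = particles[idx]
        estimates.append(particles.mean())          # posterior-mean estimate
    return np.array(estimates)

truth = np.cumsum(rng.normal(0.0, 0.5, 50))         # simulated trajectory
obs = truth + rng.normal(0.0, 1.0, 50)              # noisy measurements
track = particle_filter(obs)
```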

  14. Sequential Monte Carlo Methods for Joint Detection and Tracking of Multiaspect Targets in Infrared Radar Images

    Directory of Open Access Journals (Sweden)

    Anton G. Pavlov

    2008-02-01

    We present in this paper a sequential Monte Carlo methodology for joint detection and tracking of a multiaspect target in image sequences. Unlike the traditional contact/association approach found in the literature, the proposed methodology enables integrated, multiframe target detection and tracking incorporating statistical models for target aspect, target motion, and background clutter. Two implementations of the proposed algorithm are discussed using, respectively, a resample-move (RS) particle filter and an auxiliary particle filter (APF). Our simulation results suggest that the APF configuration slightly outperforms the RS filter in scenarios with stealthy targets.

  15. William, a voxel model of child anatomy from tomographic images for Monte Carlo dosimetry calculations

    International Nuclear Information System (INIS)

    Full text: Medical imaging provides two-dimensional pictures of the human internal anatomy from which a three-dimensional model of organs and tissues suitable for calculation of dose from radiation may be constructed. Diagnostic CT provides the greatest exposure to radiation per examination and the frequency of CT examination is high. Estimates of dose from diagnostic radiography are still determined from data derived from geometric models (rather than anatomical models), models scaled from adult bodies (rather than bodies of children) and CT scanner hardware that is no longer used. The aim of anatomical modelling is to produce a mathematical representation of internal anatomy that has organs of realistic size, shape and positioning. The organs and tissues are represented by a great many cuboidal volumes (voxels). The conversion of medical images to voxels is called segmentation; on completion, every pixel in an image is assigned to a tissue or organ. Segmentation is time consuming. An image processing package is used to identify organ boundaries in each image. Thirty to forty tomographic voxel models of anatomy have been reported in the literature. Each model is of an individual, or a composite from several individuals. Images of children are particularly scarce, so there remains a need for more paediatric anatomical models. I am working on segmenting 'William', a set of 368 PET-CT images from head to toe of a seven year old boy. William will be used for Monte Carlo calculations of dose from CT examination using a simulated modern CT scanner.

  16. Creation of a Reference Image with Monte Carlo Simulations for Online EPID Verification of Daily Patient Setup

    Energy Technology Data Exchange (ETDEWEB)

    Descalle, M-A; Chuang, C; Pouliot, J

    2002-01-30

    Patient positioning accuracy remains an issue for external beam radiotherapy. Currently, kilovoltage verification images are used as references by clinicians to compare the actual patient treatment position with the planned position. These images are qualitatively different from treatment-time megavoltage portal images. This study investigates the feasibility of using PEREGRINE, a 3D Monte Carlo calculation engine, to create reference images for portal image comparisons. Portal images were acquired using an amorphous-silicon flat-panel EPID for (1) the head and pelvic sections of an anthropomorphic phantom with 7-8 mm displacements applied, and (2) a prostate patient on five treatment days. Planning CT scans were used to generate simulated reference images with PEREGRINE. A correlation algorithm quantified the setup deviations between simulated and portal images. Monte Carlo simulated images exhibit similar qualities to portal images; the phantom slabs appear clearly. Initial positioning differences and applied displacements were detected and quantified. We find that images simulated with Monte Carlo methods can be used as reference images to detect and quantify setup errors during treatment.
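
    The correlation step described above can be sketched with FFT-based cross-correlation: the location of the correlation peak between the simulated reference and the portal image gives the in-plane setup shift. The example below uses random stand-in images and a known displacement; it is not the study's registration code.

```python
import numpy as np

def setup_shift(reference, portal):
    """Estimate the in-plane setup deviation (in pixels) between a
    simulated reference image and a measured portal image from the
    peak of their FFT-based cross-correlation."""
    R = np.fft.fft2(reference - reference.mean())
    P = np.fft.fft2(portal - portal.mean())
    xcorr = np.fft.ifft2(P * np.conj(R)).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    return tuple(p if p <= s // 2 else p - s       # wrap to signed offsets
                 for p, s in zip(peak, xcorr.shape))

rng = np.random.default_rng(7)
reference = rng.random((256, 256))                  # stand-in anatomy
portal = np.roll(reference, (7, -3), axis=(0, 1))   # 7 px down, 3 px left
portal += 0.05 * rng.standard_normal(portal.shape)  # imaging noise
print(setup_shift(reference, portal))               # approximately (7, -3)
```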

  17. Optimal design of Anger camera for bremsstrahlung imaging: Monte Carlo evaluation.

    Directory of Open Access Journals (Sweden)

    Stephan Walrand

    2014-06-01

    A conventional Anger camera is not adapted to bremsstrahlung imaging and, as a result, even using a reduced energy acquisition window, geometric x-rays represent less than 15% of the recorded events. This increases noise, limits the contrast, and reduces the quantification accuracy. Monte Carlo simulations of energy spectra showed that a camera based on a 30 mm thick BGO crystal and equipped with a high-energy pinhole collimator is well adapted to bremsstrahlung imaging. The total scatter contamination is reduced by a factor of ten versus a conventional NaI camera equipped with a high-energy parallel-hole collimator, enabling acquisition using an extended energy window ranging from 50 to 350 keV. By using the recorded event energy in the reconstruction method, a shorter acquisition time and a reduced orbit range become usable, allowing the design of a simplified mobile gantry. This is more convenient for use in a busy catheterization room. After injecting a safe activity, a fast SPECT could be performed without moving the catheter tip in order to assess the liver dosimetry and estimate the additional safe activity that could still be injected. Further long-running Monte Carlo simulations of realistic acquisitions will allow the quantification capability of such a system to be assessed. Simultaneously, a dedicated bremsstrahlung prototype camera reusing PMT-BGO blocks from a retired PET system is currently under design for further evaluation.

  18. Monte Carlo simulation of CD-SEM images for linewidth and critical dimension metrology.

    Science.gov (United States)

    Li, Y G; Zhang, P; Ding, Z J

    2013-01-01

    In the semiconductor industry, strict critical dimension control using a critical dimension scanning electron microscope (CD-SEM) is an urgent near-term task. A Monte Carlo simulation model for the study of CD-SEM images has been established, based on Mott's cross-section for electron elastic scattering and the full Penn dielectric function formalism for electron inelastic scattering and the associated secondary electron (SE) production. In this work, a systematic calculation of CD-SEM line-scan profiles and 2D images of trapezoidal Si lines has been performed, taking into account different experimental factors including electron beam conditions (primary energy, probe size), line geometry (width, height, foot/corner rounding, sidewall angle, and roughness), material properties, and SE signal detection. The influence of these factors on critical dimension metrology is investigated, laying the groundwork for a future comprehensive model-based library. PMID:22887037
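    As a rough illustration of the core simulation loop, the toy sketch below traces one electron using exponentially distributed free paths and screened-Rutherford angular sampling. This is a deliberately simplified stand-in: the record's model uses Mott cross-sections and the Penn dielectric formalism, and the mean free path, material (Si, Z = 14) and energy-loss rate here are assumed toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

def scatter_angle(E_keV: float, Z: float = 14.0) -> float:
    """Sample a polar scattering angle from the screened Rutherford
    cross-section (a common simplification; not the Mott data)."""
    alpha = 3.4e-3 * Z**0.67 / E_keV          # screening parameter
    u = rng.random()
    cos_t = 1.0 - 2.0 * alpha * u / (1.0 + alpha - u)
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

# Trace one 10 keV electron in 2D until it slows below 0.5 keV.
x, z, theta, E = 0.0, 0.0, 0.0, 10.0          # nm, nm, rad, keV
while E > 0.5:
    s = -10.0 * np.log(rng.random())          # assumed 10 nm mean free path
    x += s * np.sin(theta)
    z += s * np.cos(theta)
    theta += rng.choice((-1.0, 1.0)) * scatter_angle(E)
    E -= 0.01 * s                             # crude continuous slowing-down
print(f"electron stopped near x = {x:.1f} nm, depth z = {z:.1f} nm")
```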

  19. Simulation of Astronomical Images from Optical Survey Telescopes using a Comprehensive Photon Monte Carlo Approach

    CERN Document Server

    Peterson, J R; Kahn, S M; Rasmussen, A P; Peng, E; Ahmad, Z; Bankert, J; Chang, C; Claver, C; Gilmore, D K; Grace, E; Hannel, M; Hodge, M; Lorenz, S; Lupu, A; Meert, A; Nagarajan, S; Todd, N; Winans, A; Young, M

    2015-01-01

    We present a comprehensive methodology for the simulation of astronomical images from optical survey telescopes. We use a photon Monte Carlo approach to construct images by sampling photons from models of astronomical source populations, and then simulating those photons through the system as they interact with the atmosphere, telescope, and camera. We demonstrate that all physical effects for optical light that determine the shapes, locations, and brightnesses of individual stars and galaxies can be accurately represented in this formalism. By using large scale grid computing, modern processors, and an efficient implementation that can produce 400,000 photons/second, we demonstrate that even very large optical surveys can now be simulated. We demonstrate that we are able to: 1) construct kilometer scale phase screens necessary for wide-field telescopes, 2) reproduce atmospheric point-spread-function moments using a fast novel hybrid geometric/Fourier technique for non-diffraction limited telescopes, 3) accurately reproduce the expected spot diagrams for complex aspheric optical designs, and 4) recover system effective area predicted from analytic photometry integrals.

  20. A novel image reconstruction methodology based on inverse Monte Carlo analysis for positron emission tomography

    Science.gov (United States)

    Kudrolli, Haris A.

    2001-04-01

    A three dimensional (3D) reconstruction procedure for Positron Emission Tomography (PET) based on inverse Monte Carlo analysis is presented. PET is a medical imaging modality which employs a positron emitting radio-tracer to give functional images of an organ's metabolic activity. This makes PET an invaluable tool in the detection of cancer and for in-vivo biochemical measurements. There are a number of analytical and iterative algorithms for image reconstruction of PET data. Analytical algorithms are computationally fast, but the assumptions intrinsic in the line integral model limit their accuracy. Iterative algorithms can apply accurate models for reconstruction and give improvements in image quality, but at an increased computational cost. These algorithms require the explicit calculation of the system response matrix, which may not be easy to calculate. This matrix gives the probability that a photon emitted from a certain source element will be detected in a particular detector line of response. The 'Three Dimensional Stochastic Sampling' (SS3D) procedure implements iterative algorithms in a manner that does not require the explicit calculation of the system response matrix. It uses Monte Carlo techniques to simulate the process of photon emission from a source distribution and interaction with the detector. This technique has the advantage of being able to model complex detector systems and also take into account the physics of gamma ray interaction within the source and detector systems, which leads to an accurate image estimate. A series of simulation studies was conducted to validate the method using the Maximum Likelihood - Expectation Maximization (ML-EM) algorithm. The accuracy of the reconstructed images was improved by using an algorithm that required a priori knowledge of the source distribution. Means to reduce the computational time for reconstruction were explored by using parallel processors and algorithms that had faster convergence rates.
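    The ML-EM iteration the record builds on has a compact closed form; a minimal dense-matrix sketch follows. It assumes an explicit system response matrix for clarity, which is exactly what SS3D avoids by sampling the forward model with Monte Carlo instead.

```python
import numpy as np

def mlem(A: np.ndarray, y: np.ndarray, n_iter: int = 20) -> np.ndarray:
    """Maximum Likelihood - Expectation Maximization reconstruction.

    A : (n_detectors, n_voxels) system response matrix; A[i, j] is the
        probability that an emission in voxel j is detected in LOR i.
    y : measured counts per line of response.
    """
    x = np.ones(A.shape[1])                  # uniform initial estimate
    sens = A.sum(axis=0)                     # per-voxel sensitivity
    for _ in range(n_iter):
        proj = A @ x                         # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

Each iteration multiplies the current estimate by the back-projected ratio of measured to predicted counts, which keeps the estimate non-negative by construction.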

  1. Monte Carlo feasibility study for image guided surgery: from direct beta minus detection to Cerenkov luminescence imaging

    Science.gov (United States)

    Gigliotti, C. R.; Altabella, L.; Boschi, F.; Spinelli, A. E.

    2016-07-01

    The goal of this work is to compare the performance of different beta minus detection strategies for image guided surgery or ex vivo tissue analysis. In particular we investigated Cerenkov luminescence imaging (CLI) with and without the use of a radiator, direct and indirect beta detection, and bremsstrahlung imaging using beta emitters commonly employed in nuclear medicine. Monte Carlo simulations were implemented using the GAMOS plug-in for GEANT4, considering a slab of muscle and a radioactive source (32P or 90Y) placed at 0.5 mm depth. We estimated the gain that can be obtained in terms of produced photons using different materials placed on the slab as Cerenkov radiators; we then focused on the number of exiting photons and their spatial distribution for the different strategies. The use of a radiator to enhance the Cerenkov signal reduces the spatial resolution because of the increased optical spread. We found that direct beta detection and CLI are the best approaches in terms of resolution, while the use of a thin scintillator increases the signal but degrades the spatial resolution. Bremsstrahlung imaging yields a lower signal and does not represent the best choice for image guided surgery. CLI represents a more flexible approach for image guided surgery or ex vivo tissue analysis using beta emitters.

  2. Monte Carlo simulation in proton computed tomography: a study of image reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Inocente, Guilherme Franco; Stenico, Gabriela V.; Hormaza, Joel Mesa [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Inst. de Biociencias. Dept. de Fisica e Biofisica

    2012-07-01

    Full text: Radiation is one of the most widely used methods for cancer treatment. In this context, therapy with proton beams has emerged as an alternative to conventional radiotherapy. Proton therapy offers several advantages to the treated patient compared with more conventional methods: the dose deposited along the path, especially in the healthy tissues neighbouring the tumor, is smaller, and the accuracy of treatment is much better. To carry out the treatment, the patient undergoes planning based on images for visualization and localization of the target volume. The main method for obtaining these images is X-ray computed tomography (XCT). For treatment with proton beams, this imaging technique can generate some uncertainties. The purpose of this project is to study the feasibility of reconstructing images generated from irradiation with proton beams, thereby reducing some inaccuracies, since the same type of radiation is used for planning and treatment, and also drastically reducing localization errors, since planning can be done at the same place and just before the patient is treated. This study aims to obtain a relationship between the intrinsic interaction properties of photons and protons with matter. For this, we use computational simulation based on the Monte Carlo method with the codes SRIM 2008 and MCNPX v2.5.0 to reconstruct images using the technique employed in conventional computed tomography. (author)

  3. Monte Carlo Simulation of X-rays Multiple Refractive Scattering from Fine Structure Objects imaged with the DEI Technique

    CERN Document Server

    Khromova, A N; Arfelli, F; Menk, R H; Besch, H J; Plothow-Besch, H; 10.1109/NSSMIC.2004.1466758

    2010-01-01

    In this work we present a novel 3D Monte Carlo photon transport program for the simulation of multiple refractive scattering, based on the refractive properties of X-rays in highly scattering media such as lung tissue. Multiple scattering not only reduces the quality of the image, but also carries information on the internal structure of the object. This information can be exploited using imaging modalities such as Diffraction Enhanced Imaging (DEI). To study the effect of multiple scattering, a Monte Carlo program was developed that simulates multiple refractive scattering of X-ray photons on monodisperse PMMA (poly-methyl-methacrylate) microspheres representing alveoli in lung tissue. Finally, the results of the Monte Carlo program were compared to measurements taken at the SYRMEP beamline at Elettra (Trieste, Italy) on special phantoms, showing good agreement between the two data sets.

  4. Momentum transfer Monte Carlo model for the simulation of laser speckle contrast imaging (Conference Presentation)

    Science.gov (United States)

    Regan, Caitlin; Hayakawa, Carole K.; Choi, Bernard

    2016-03-01

    Laser speckle imaging (LSI) enables measurement of relative blood flow in microvasculature and perfusion in tissues. To determine the impact of tissue optical properties and perfusion dynamics on speckle contrast, we developed a computational simulation of laser speckle contrast imaging. We used a discrete absorption-weighted Monte Carlo simulation to model the transport of light in tissue. We simulated optical excitation of a uniform flat light source and tracked the momentum transfer of photons as they propagated through a simulated tissue geometry. With knowledge of the probability distribution of momentum transfer occurring in various layers of the tissue, we calculated the expected laser speckle contrast arising with coherent excitation using both reflectance and transmission geometries. We simulated light transport in a single homogeneous tissue while independently varying either absorption (0.001-100 mm^-1), reduced scattering (0.1-10 mm^-1), or anisotropy (0.05-0.99) over a range of values relevant to blood and commonly imaged tissues. We observed that contrast decreased by 49% with an increase in optical scattering, and increased by 130% with absorption (exposure time = 1 ms). We also explored how speckle contrast was affected by the depth (0-1 mm) and flow speed (0-10 mm/s) of a dynamic vascular inclusion. This model of speckle contrast is important to increase our understanding of how parameters such as perfusion dynamics, vessel depth, and tissue optical properties affect laser speckle imaging.
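    For reference, speckle contrast itself is the simple local statistic K = sigma / mean computed over a sliding window of the raw speckle image; the sketch below computes a contrast map that way. The window size is an arbitrary assumption, and the record's contribution, predicting K from simulated momentum-transfer distributions, is not attempted here.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw: np.ndarray, window: int = 7) -> np.ndarray:
    """Local speckle contrast K = sigma / mean over a sliding window,
    the quantity from which LSI derives relative flow maps."""
    img = raw.astype(float)
    mean = uniform_filter(img, window)
    mean_sq = uniform_filter(img**2, window)
    sigma = np.sqrt(np.maximum(mean_sq - mean**2, 0.0))
    return sigma / np.maximum(mean, 1e-12)
```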

  5. Monte Carlo validation of optimal material discrimination using spectral x-ray imaging

    CERN Document Server

    Nik, Syen J; Watts, Richard; Dale, Tony; Currie, Bryn; Meyer, Juergen

    2014-01-01

    The validation of a previous work on the optimization of material discrimination in spectral x-ray imaging is reported. Using Monte Carlo simulations based on the BEAMnrc package, material decomposition was performed on the projection images of phantoms containing up to three materials. The simulated projection data were first decomposed into material basis images by minimizing the z-score between expected and simulated counts. Statistical analysis was performed for the pixels within the region-of-interest consisting of contrast material(s) in the BEAMnrc simulations. With the consideration of scattered radiation and a realistic scanning geometry, the theoretical optima of the energy bin borders provided by the algorithm were shown to have an accuracy of ±2 keV for the decomposition of two and three materials. Finally, the signal-to-noise ratio predicted by the theoretical model was also validated. The counts per pixel needed to achieve a specific imaging aim can therefore be estimated using the validated model.
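    A decomposition of this flavor can be sketched compactly. The example below fits two material thicknesses to counts in two energy bins by minimizing a z-score-style statistic between expected and measured counts; the attenuation coefficients, open-beam counts and bin structure are invented toy values, not those of the study.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical two-material, two-bin model: counts in each energy bin follow
# Beer-Lambert attenuation through thicknesses t (mm) of two basis materials.
MU = np.array([[0.50, 0.20],    # material 1: attenuation in bins 1, 2 (1/mm)
               [0.05, 0.15]])   # material 2
N0 = np.array([1.0e5, 8.0e4])   # open-beam counts per bin

def expected_counts(t: np.ndarray) -> np.ndarray:
    return N0 * np.exp(-MU.T @ t)

def z_score(t: np.ndarray, measured: np.ndarray) -> float:
    mu = expected_counts(t)
    return float(np.sum((measured - mu) ** 2 / np.maximum(mu, 1.0)))

rng = np.random.default_rng(3)
measured = rng.poisson(expected_counts(np.array([2.0, 5.0])))  # truth: 2, 5 mm
fit = minimize(z_score, x0=np.array([1.0, 1.0]), args=(measured,),
               method="Nelder-Mead")
print("recovered thicknesses (mm):", fit.x.round(2))
```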

  6. Monte Carlo simulation of breast tumor imaging properties with compact, discrete gamma cameras

    International Nuclear Information System (INIS)

    The authors describe Monte Carlo simulation results for breast tumor imaging using a compact, discrete gamma camera. The simulations were designed to analyze and optimize camera design, particularly collimator configuration and detector pixel size. Simulated planar images of 5-15 mm diameter tumors in a phantom patient (including a breast, torso, and heart) were generated for imaging distances of 5-55 mm, pixel sizes of 2 x 2 mm^2 to 4 x 4 mm^2, and hexagonal and square hole collimators with sensitivities from 4,000 to 16,000 counts/mCi/sec. Other factors considered included T/B (tumor-to-background tissue uptake ratio) and detector energy resolution. Image properties were quantified by computing the observed tumor fwhm (full-width at half-maximum) and S/N (sum of detected tumor events divided by the statistical noise). Results suggest that hexagonal and square hole collimators perform comparably, that higher sensitivity collimators provide higher tumor S/N with little increase in the observed tumor fwhm, that smaller pixels only slightly improve tumor fwhm and S/N, and that improved detector energy resolution has little impact on either the observed tumor fwhm or the observed tumor S/N.

  7. Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study

    CERN Document Server

    Kim, Jin Sung; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih

    2015-01-01

    Two full rotating gantries with different nozzles (a multipurpose nozzle with MLC, and a scanning dedicated nozzle), together with a conventional cyclotron system, are installed and under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate the neutron dose equivalent per therapeutic dose, H/D, to x-ray imaging equipment under various treatment conditions with Monte Carlo simulation. First, we investigated H/D with various modifications of the beam line devices (scattering, scanning, multi-leaf collimator, aperture, compensator) at isocenter and at 20, 40 and 60 cm distance from isocenter, and compared with other research groups. Next, we investigated the neutron dose at the x-ray equipment used for real-time imaging under various treatment conditions. Our investigation showed 0.07-0.19 mSv/Gy at the x-ray imaging equipment according to the various treatment options and, interestingly, a 50% neutron dose reduction effect at the flat panel detector was observed due to the multi-lea...

  8. Fast Monte Carlo Simulation for Patient-specific CT/CBCT Imaging Dose Calculation

    CERN Document Server

    Jia, Xun; Gu, Xuejun; Jiang, Steve B

    2011-01-01

    Recently, X-ray imaging dose from computed tomography (CT) or cone beam CT (CBCT) scans has become a serious concern. Patient-specific imaging dose calculation has been proposed for the purpose of dose management. While Monte Carlo (MC) dose calculation can be quite accurate for this purpose, it suffers from low computational efficiency. In response to this problem, we have successfully developed an MC dose calculation package, gCTD, on GPU architecture under the NVIDIA CUDA platform for fast and accurate estimation of the x-ray imaging dose received by a patient during a CT or CBCT scan. Techniques have been developed particularly for the GPU architecture to achieve high computational efficiency. Dose calculations using CBCT scanning geometry in a homogeneous water phantom and a heterogeneous Zubal head phantom have shown good agreement between gCTD and EGSnrc, indicating the accuracy of our code. In terms of improved efficiency, it is found that gCTD attains a speed-up of ~400 times in the homogeneous water ...

  9. Atmospheric correction of Earth-observation remote sensing images by Monte Carlo method

    Indian Academy of Sciences (India)

    Hanane Hadjit; Abdelaziz Oukebdane; Ahmad Hafid Belbachir

    2013-10-01

    In earth observation, atmospheric particles severely contaminate, through absorption and scattering, the electromagnetic signal reflected from the Earth's surface. Land surface characterization would benefit greatly if these atmospheric effects could be removed from the imagery to retrieve a surface reflectance that characterizes the surface properties; this is the purpose of atmospheric correction. Given the geometric parameters of the studied image and estimates of the parameters describing the state of the atmosphere, it is possible to evaluate the atmospheric reflectance and the upward and downward transmittances, which contribute to the corruption of the data obtained from the image. To that end, an atmospheric correction algorithm for high spectral resolution data over land surfaces has been developed. It is designed to obtain the main atmospheric parameters needed for image correction and the interpretation of optical observations. It also estimates the optical characteristics of Earth-observation imagery (LANDSAT and SPOT). The physics underlying solar radiation propagation, taking into account multiple scattering and the sphericity of the atmosphere, has been treated using Monte Carlo techniques.

  10. Optimal processing for gel electrophoresis images: Applying Monte Carlo Tree Search in GelApp.

    Science.gov (United States)

    Nguyen, Phi-Vu; Ghezal, Ali; Hsueh, Ya-Chih; Boudier, Thomas; Gan, Samuel Ken-En; Lee, Hwee Kuan

    2016-08-01

    In biomedical research, gel band size estimation in electrophoresis analysis is a routine process. To facilitate and automate this process, numerous software tools have been released, notably the GelApp mobile app. However, band detection accuracy is limited by a band detection algorithm that cannot adapt to variations in input images. To address this, we used the Monte Carlo Tree Search with Upper Confidence Bound (MCTS-UCB) method to efficiently search for optimal image processing pipelines for the band detection task, thereby improving the segmentation algorithm. Incorporating this into GelApp, we report a significant enhancement of gel band detection accuracy by 55.9 ± 2.0% for protein polyacrylamide gels, and 35.9 ± 2.5% for DNA SYBR green agarose gels. This implementation is a proof-of-concept demonstrating MCTS-UCB as a strategy for optimizing general image segmentation. The improved version of GelApp, GelApp 2.0, is freely available on both the Google Play Store (Android) and the Apple App Store (iOS).
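    The UCB selection rule at the heart of MCTS-UCB is easy to state in code. The sketch below shows generic UCB1 child selection; the candidate operator names and statistics are invented for illustration and are not from the GelApp pipeline.

```python
import math

def ucb_select(children, c: float = math.sqrt(2)):
    """Upper Confidence Bound (UCB1) child selection as used in MCTS:
    balances exploitation (mean reward) against exploration (visit count)."""
    total_visits = sum(child["visits"] for child in children)
    def ucb(child):
        if child["visits"] == 0:
            return float("inf")          # always try unvisited children first
        mean = child["reward"] / child["visits"]
        return mean + c * math.sqrt(math.log(total_visits) / child["visits"])
    return max(children, key=ucb)

# Hypothetical children: candidate image-processing operators with running stats.
children = [{"op": "median_blur", "visits": 10, "reward": 6.2},
            {"op": "otsu_threshold", "visits": 4, "reward": 3.1},
            {"op": "canny_edges", "visits": 0, "reward": 0.0}]
print(ucb_select(children)["op"])   # -> 'canny_edges' (unvisited)
```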

  11. Evaluation of scattered radiation from radiographic intensifying screen on dental image contrast using Monte Carlo code

    International Nuclear Information System (INIS)

    Most dental imaging is performed by means of an imaging system consisting of a film/screen combination. Fluorescent intensifying screens for X-ray films are used in order to reduce the radiation dose: they produce visible light, which increases the efficiency of the film. In addition, the primary radiation can be scattered elastically (Rayleigh scattering) and inelastically (Compton scattering), which degrades the image resolution. Scattered radiation produced in Gd2O2S:Tb intensifying screens was simulated using a Monte Carlo radiation transport code, EGS4. The magnitude of the scattered radiation striking the film is typically quantified using the scatter-to-primary ratio and the scatter fraction. The angular distribution of the intensity of the scattered radiation (the sum of both scattering effects) was simulated, showing that the ratio of secondary-to-primary radiation incident on the X-ray film is about 5.67% and 3.28% and the scatter fraction is about 5.27% and 3.18% for the front and back screen, respectively, over the range from 0 to π rad. (author)

  12. Improving time domain fluorescence lifetime imaging with an adaptive Monte Carlo data inflation (AMDI) algorithm

    Science.gov (United States)

    Leray, Aymeric; Trinel, Dave; Spriet, Corentin; Usson, Yves; Heliot, Laurent

    2011-07-01

    Fluorescence Lifetime Imaging Microscopy (FLIM) is a powerful technique which gives access to the local environment of fluorophores in living cells. However, to correctly estimate all lifetime parameters, time domain FLIM imaging requires a high number of photons and consequently a long laser exposure time, which is not compatible with the observation of dynamic molecular events and which induces cellular stress. To reduce this exposure time, we have developed an original approach to statistically inflate the number of collected photons. This approach, called Adaptive Monte Carlo Data Inflation (AMDI), combines the well-known bootstrap technique with an adaptive Parzen kernel. We have evaluated its potential on experimental FLIM data in vivo. We have demonstrated that our robust method allows fluorescence lifetimes to be estimated precisely with exposure times reduced by up to a factor of 50 for mono-exponential decays (corresponding to a minimum of 20 photons/pixel) and 10 for bi-exponential decays (corresponding to a minimum of 5000 photons/pixel), in comparison with the standard fitting method. Furthermore, thanks to AMDI, we demonstrate that it becomes possible to accurately estimate all fitting parameters in FRET experiments without constraining any parameter. An additional benefit of our technique is that it improves the spatial resolution of the FLIM images by reducing the commonly used spatial binning factor.
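    The core of the approach, bootstrap resampling smoothed by a Parzen (kernel) density, can be sketched in a few lines. Note this uses a fixed Gaussian bandwidth for simplicity, whereas AMDI adapts the kernel locally; the lifetime and photon counts are toy values.

```python
import numpy as np

rng = np.random.default_rng(1)

def smoothed_bootstrap(arrival_times, n_inflated, bandwidth):
    """Inflate a photon arrival-time sample by bootstrap resampling plus
    Gaussian (Parzen) kernel jitter - a smoothed bootstrap, in the spirit
    of AMDI (the published method adapts the kernel width locally)."""
    draws = rng.choice(arrival_times, size=n_inflated, replace=True)
    return draws + rng.normal(0.0, bandwidth, size=n_inflated)

# Example: 20 detected photons from a tau = 2.5 ns mono-exponential decay.
photons = rng.exponential(2.5, size=20)
inflated = smoothed_bootstrap(photons, n_inflated=2000, bandwidth=0.2)
print(f"lifetime estimate: {inflated.mean():.2f} ns")  # exp mean equals tau
```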

  13. Monte Carlo study of a 3D Compton imaging device with GEANT4

    CERN Document Server

    Lenti, M; 10.1016/j.nima.2011.06.060

    2011-01-01

    In this paper we investigate, with a detailed Monte Carlo simulation based on Geant4, the novel approach [Nucl. Instrum. Methods A588 (2008) 457] to 3D imaging with photon scattering. A monochromatic and well collimated gamma beam is used to illuminate the object to be imaged and the Compton-scattered photons are detected by means of a surrounding germanium strip detector. The impact position and the energy of the photons are measured with high precision and the scattering position along the beam axis is calculated. We study as an application of this technique the case of brain imaging but the results can be applied as well to situations where a lighter object, with localized variations of density, is embedded in a denser container. We report here the attainable sensitivity in the detection of density variations as a function of the beam energy, the depth inside the object and the size and density of the inclusions. Using a 600 keV gamma beam, for an inclusion with a density increase of 30% with respect to the surrounding tissue and a thickness along the beam of 5 mm, we obtain at midbrain position a resolution of about 2 mm and a contrast of 12%.

  14. Monte Carlo study of a 3D Compton imaging device with GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Lenti, M., E-mail: lenti@fi.infn.it [Sezione dell' INFN di Firenze, via G. Sansone 1, I-50019 Sesto F. (Italy); Veltri, M., E-mail: michele.veltri@uniurb.it [Sezione dell' INFN di Firenze, via G. Sansone 1, I-50019 Sesto F. (Italy); Dipartimento di Matematica, Fisica e Informatica, Universita di Urbino, via S. Chiara 27, I-61029 Urbino (Italy)

    2011-10-21

    In this paper we investigate, with a detailed Monte Carlo simulation based on Geant4, the novel approach of Lenti (2008) to 3D imaging with photon scattering. A monochromatic and well collimated gamma beam is used to illuminate the object to be imaged and the photons Compton scattered are detected by means of a surrounding germanium strip detector. The impact position and the energy of the photons are measured with high precision and the scattering position along the beam axis is calculated. We study as an application of this technique the case of brain imaging but the results can be applied as well to situations where a lighter object, with localized variations of density, is embedded in a denser container. We report here the attainable sensitivity in the detection of density variations as a function of the beam energy, the depth inside the object and size and density of the inclusions. Using a 600 keV gamma beam, for an inclusion with a density increase of 30% with respect to the surrounding tissue and thickness along the beam of 5 mm, we obtain at midbrain position a resolution of about 2 mm and a contrast of 12%. In addition the simulation indicates that for the same gamma beam energy a complete brain scan would result in an effective dose of about 1 mSv.
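    The scattering-position calculation described in both records follows directly from Compton kinematics. Below is a minimal sketch under simplifying assumptions: the beam travels along the z-axis through the origin, the event is a single forward Compton scatter, and the detector hit lies downstream of the vertex; the example energies and positions are invented.

```python
import numpy as np

ME_C2 = 511.0  # electron rest energy (keV)

def scatter_angle(E0: float, E_scattered: float) -> float:
    """Compton kinematics: cos(theta) = 1 - me*c^2 * (1/E' - 1/E0)."""
    cos_t = 1.0 - ME_C2 * (1.0 / E_scattered - 1.0 / E0)
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

def vertex_z(hit_xyz, E0, E_scattered):
    """Scattering position along a beam assumed to travel along +z through
    the origin, given the measured hit position and photon energies."""
    x, y, z = hit_xyz
    theta = scatter_angle(E0, E_scattered)
    r = np.hypot(x, y)              # radial distance of the hit from the axis
    return z - r / np.tan(theta)    # assumes the hit is downstream (z > z_v)

# Example: 600 keV beam photon scattered to 400 keV, detected at (50, 0, 80) mm.
print(f"vertex at z = {vertex_z((50.0, 0.0, 80.0), 600.0, 400.0):.1f} mm")
```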

  15. Drug quantification in turbid media by fluorescence imaging combined with light-absorption correction using white Monte Carlo simulations

    DEFF Research Database (Denmark)

    Xie, Haiyan; Liu, Haichun; Svenmarker, Pontus;

    2011-01-01

    ... in vivo by the fluorescence imaging technique. In this paper we present a novel approach to compensate for the light absorption in homogeneous turbid media, both for the excitation and emission light, utilizing time-resolved fluorescence white Monte Carlo simulations combined with the Beer-Lambert law...
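    The Beer-Lambert correction invoked by the record amounts to dividing the measured signal by the attenuation accumulated along the excitation and emission paths, I = I0 * exp(-mu_a * L). The sketch below is a minimal homogeneous-medium illustration; the absorption coefficients and path lengths are assumed values, and in the paper the path information comes from white Monte Carlo simulations.

```python
import numpy as np

def beer_lambert_correction(measured, mu_a_ex, mu_a_em, path_ex, path_em):
    """Compensate a fluorescence signal for absorption along the excitation
    and emission paths using the Beer-Lambert law (homogeneous medium)."""
    attenuation = np.exp(-(mu_a_ex * path_ex + mu_a_em * path_em))
    return measured / attenuation

# Example: 5 mm effective paths, mu_a = 0.03 / mm at both wavelengths.
print(beer_lambert_correction(1000.0, 0.03, 0.03, 5.0, 5.0))
```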

  16. Monte Carlo simulation of novel breast imaging modalities based on coherent x-ray scattering

    Science.gov (United States)

    Ghammraoui, Bahaa; Badal, Andreu

    2014-07-01

    We present upgraded versions of MC-GPU and penEasy_Imaging, two open-source Monte Carlo codes for the simulation of radiographic projections and CT, that have been extended and validated to account for the effect of molecular interference in coherent x-ray scatter. The codes were first validated by comparing simulated and measured energy dispersive x-ray diffraction (EDXRD) spectra. A second validation was performed by evaluating the rejection factor of a focused anti-scatter grid. To exemplify the capabilities of the new codes, the modified MC-GPU code was used to examine the possibility of characterizing breast tissue composition and microcalcifications in a volume of interest inside a whole breast phantom using EDXRD, and to simulate a coherent scatter computed tomography (CSCT) system based on a first-generation CT acquisition geometry. It was confirmed that EDXRD and CSCT have the potential to characterize tissue composition inside a whole breast. The GPU-accelerated code was able to simulate, in just a few hours, a complete CSCT acquisition composed of 9758 independent pencil-beam projections. In summary, it has been shown that the presented software can be used for fast and accurate simulation of novel breast imaging modalities relying on scattering measurements and can therefore assist in the characterization and optimization of promising modalities currently under development.

  17. Predicting image blur in proton radiography: comparisons between measurements and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    von Wittenau, A; Aufderheide, M B; Henderson, G L

    2010-05-07

    Given the cost and lead-times involved in high-energy proton radiography, it is prudent to model proposed radiographic experiments to see if the images predicted would return useful information. We recently modified our raytracing transmission radiography modeling code HADES to perform simplified Monte Carlo simulations of the transport of protons in a proton radiography beamline. Beamline objects include the initial diffuser, vacuum magnetic fields, windows, angle-selecting collimators, and objects described as distorted 2D (planar or cylindrical) meshes or as distorted 3D hexahedral meshes. We present an overview of the algorithms used for the modeling and code timings for simulations through typical 2D and 3D meshes. We next calculate expected changes in image blur as scattering materials are placed upstream and downstream of a resolution test object (a 3 mm thick sheet of tantalum, into which 0.4 mm wide slits have been cut), and as the current supplied to the focusing magnets is varied. We compare and contrast the resulting simulations with the results of measurements obtained at the 800 MeV Los Alamos LANSCE Line-C proton radiography facility.

  18. Predicting image blur in proton radiography: Comparisons between measurements and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Schach von Wittenau, Alexis E., E-mail: schachvonwittenau1@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States); Aufderheide, Maurice; Henderson, Gary [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States)

    2011-10-01

    Given the cost and lead-times involved in high-energy proton radiography, it is prudent to model proposed radiographic experiments to see if the images predicted would return useful information. We recently modified our raytracing transmission radiography modeling code HADES to perform simplified Monte Carlo simulations of the transport of protons in a proton radiography beamline. Beamline objects include the initial diffuser, vacuum magnetic fields, windows, angle-selecting collimators, and objects described as distorted 2D (planar or cylindrical) meshes or as distorted 3D hexahedral meshes. We describe the algorithms used for simulations through typical 2D and 3D meshes. We calculate expected changes in image blur as scattering materials are placed upstream and downstream of a resolution test object (a 3 mm thick sheet of tantalum, into which 0.4 mm wide slits have been cut), and as the current supplied to the focusing magnets is varied. We compare and contrast the resulting simulations with the results of measurements obtained at the 800 MeV Los Alamos LANSCE Line-C proton radiography facility.

  19. Map Building and Monte Carlo Localization Using Global Appearance of Omnidirectional Images

    Directory of Open Access Journals (Sweden)

    Oscar Reinoso

    2010-12-01

    In this paper we deal with the problem of map building and localization of a mobile robot in an environment using the information provided by an omnidirectional vision sensor mounted on the robot. Our main objective is to study the feasibility of techniques based on the global appearance of a set of omnidirectional images captured by this vision sensor to solve this problem. First, we study how to globally describe the visual information so that it correctly represents locations and the geometrical relationships between these locations. Then, we integrate this information using an approach based on a spring-mass-damper model to create a topological map of the environment. Once the map is built, we propose the use of a Monte Carlo localization approach to estimate the most probable pose of the vision system and its trajectory within the map. We perform a comparison in terms of computational cost and localization error. The experimental results we present have been obtained with real indoor omnidirectional images.
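    Monte Carlo localization maintains a cloud of weighted pose hypotheses (particles); each update propagates them through a motion model, reweights them by how well the observation matches what each pose predicts, and resamples. A generic single-step sketch follows; the motion-noise level and the appearance-likelihood function are assumptions, since the record does not specify them.

```python
import numpy as np

rng = np.random.default_rng(2)

def mcl_step(particles, weights, motion, measurement, likelihood_fn):
    """One Monte Carlo localization update.

    particles     : (N, d) array of pose hypotheses.
    motion        : (d,) odometry increment since the last update.
    likelihood_fn : likelihood_fn(pose, measurement) compares the observed
                    global appearance descriptor with the one expected at pose.
    """
    # Propagate through the motion model with assumed Gaussian noise.
    particles = particles + motion + rng.normal(0.0, 0.05, particles.shape)
    # Reweight by measurement likelihood and normalize.
    weights = weights * np.array([likelihood_fn(p, measurement) for p in particles])
    weights /= weights.sum()
    # Resample proportionally to weight, then reset to uniform weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```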

  20. The use of computed tomography images in Monte Carlo treatment planning

    Science.gov (United States)

    Bazalova, Magdalena

    Monte Carlo (MC) dose calculations cannot accurately assess the dose delivered to the patient during radiotherapy unless the patient anatomy is well known. This thesis focuses on the conversion of patient computed tomography (CT) images into MC geometry files. Metal streaking artifacts and their effect on MC dose calculations are first studied. A correction algorithm is applied to artifact-corrupted images and dose errors due to density and tissue mis-assignment are quantified in a phantom and a patient study. The correction algorithm and MC dose calculations for various treatment beams are also investigated using phantoms with real hip prostheses. As a result of this study, we suggest that a metal artifact correction algorithm should be a part of any MC treatment planning. By means of MC simulations, scatter is proven to be a major cause of metal artifacts. The use of dual-energy CT (DECT) for a novel tissue segmentation scheme is thoroughly investigated. First, MC simulations are used to determine the optimal beam filtration for an accurate DECT material extraction. DECT is then tested on a CT scanner with a phantom, and good agreement in the extraction of two material properties, the relative electron density ρe and the effective atomic number Z, is found. Compared to the conventional tissue segmentation based on ρe differences, the novel tissue segmentation scheme uses differences in both ρe and Z. The phantom study demonstrates that the novel method based on ρe and Z information works well and makes MC dose calculations more accurate. This thesis demonstrates that DECT suppresses streaking artifacts from brachytherapy seeds. Brachytherapy MC dose calculations using single-energy CT images with artifacts and DECT images with suppressed artifacts are performed and the effect of artifact reduction is investigated. The patient and canine DECT studies also show that image noise and object motion are very important factors in DECT. A solution for reduction...

  1. Generation of scintigraphic images in a virtual dosimetry trial based on Monte Carlo modelling

    International Nuclear Information System (INIS)

    Full text of publication follows. Aim: the purpose of dosimetry calculations in therapeutic nuclear medicine is to maximize tumour absorbed dose while minimizing normal tissue toxicities. However, a wide heterogeneity of dosimetric approaches is observed: there is no standardized dosimetric protocol to date. The DosiTest project (www.dositest.com) intends to identify critical steps in the dosimetry chain by implementing clinical dosimetry in different nuclear medicine departments, on scintigraphic images generated by Monte Carlo simulation from the same virtual patient. This study presents the different steps contributing to image generation, following the imaging protocol of a given participating centre, Milan's European Institute of Oncology (IEO). Materials and methods: the chosen clinical application is that of 111In-pentetreotide (Octreoscan™). Pharmacokinetic data from the literature are used to derive a compartmental model. The kinetic rates between 6 compartments (liver, spleen, kidneys, blood, urine, remainder of body) were obtained from WinSaam [3]: the activity in each compartment is known at any time point. The TestDose [1] software (the computing architecture of DosiTest) implements the NURBS-based phantom NCAT-WB [2] to generate anatomical data for the virtual patient. The IEO gamma-camera was modelled with GATE [4] v6.2. Scintigraphic images were simulated for each compartment and the resulting projections were weighted by the respective pharmacokinetics of each compartment. The final step consisted of aggregating the compartments to generate the resulting image. Results: following IEO's imaging protocol, planar and tomographic image simulations were generated at various time points. Computation times (on a 480 virtual cores computing cluster) for 'step and shoot' whole body simulations (5 steps/time point) and acceptable statistics were: 10 days for extra-vascular fluid, 28 h for blood, 12 h for liver, 7 h for kidneys, and 1-2 h for...
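    A compartmental model of the kind derived here is a linear system of first-order kinetics, dA/dt = K^T A. The sketch below solves a hypothetical three-compartment toy (blood to liver to urine) standing in for the six-compartment 111In-pentetreotide model; all rate constants are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical rate matrix (1/h): K[i, j] is the transfer rate from
# compartment i to j; diagonal entries hold the total outflow from i.
# Order: blood, liver, urine (urine is a sink).
K = np.array([[-0.5, 0.4, 0.1],
              [ 0.0, -0.2, 0.2],
              [ 0.0, 0.0, 0.0]])

def dA(t, A):
    """Linear first-order kinetics: dA_j/dt = sum_i K[i, j] * A_i."""
    return K.T @ A

sol = solve_ivp(dA, (0.0, 24.0), y0=[1.0, 0.0, 0.0],
                t_eval=np.linspace(0.0, 24.0, 5))
print(sol.y.round(3))   # activity per compartment vs. time (arbitrary units)
```

Total activity is conserved by construction (each row of K sums to zero), which is a quick sanity check on any rate matrix of this form.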

  2. SIMULATION OF ASTRONOMICAL IMAGES FROM OPTICAL SURVEY TELESCOPES USING A COMPREHENSIVE PHOTON MONTE CARLO APPROACH

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, J. R.; Peng, E.; Ahmad, Z.; Bankert, J.; Grace, E.; Hannel, M.; Hodge, M.; Lorenz, S.; Lupu, A.; Meert, A.; Nagarajan, S.; Todd, N.; Winans, A.; Young, M. [Department of Physics and Astronomy, Purdue University, West Lafayette, IN 47907 (United States); Jernigan, J. G. [Space Sciences Laboratory, University of California, Berkeley, CA 94720 (United States); Kahn, S. M.; Rasmussen, A. P.; Chang, C.; Gilmore, D. K. [Kavli Institute for Particle Astrophysics and Cosmology, Stanford University, Stanford, CA 94305 (United States); Claver, C., E-mail: peters11@purdue.edu [National Optical Astronomy Observatory, Tucson, AZ 85719 (United States)

    2015-05-15

    We present a comprehensive methodology for the simulation of astronomical images from optical survey telescopes. We use a photon Monte Carlo approach to construct images by sampling photons from models of astronomical source populations, and then simulating those photons through the system as they interact with the atmosphere, telescope, and camera. We demonstrate that all physical effects for optical light that determine the shapes, locations, and brightnesses of individual stars and galaxies can be accurately represented in this formalism. By using large scale grid computing, modern processors, and an efficient implementation that can produce 400,000 photons s^-1, we demonstrate that even very large optical surveys can now be simulated. We demonstrate that we are able to (1) construct kilometer scale phase screens necessary for wide-field telescopes, (2) reproduce atmospheric point-spread function moments using a fast novel hybrid geometric/Fourier technique for non-diffraction limited telescopes, (3) accurately reproduce the expected spot diagrams for complex aspheric optical designs, and (4) recover system effective area predicted from analytic photometry integrals. This new code, the Photon Simulator (PhoSim), is publicly available. We have implemented the Large Synoptic Survey Telescope design, and it can be extended to other telescopes. We expect that because of the comprehensive physics implemented in PhoSim, it will be used by the community to plan future observations, interpret detailed existing observations, and quantify systematics related to various astronomical measurements. Future development and validation by comparisons with real data will continue to improve the fidelity and usability of the code.

  3. Monte Carlo Modeling of Cascade Gamma Rays in 86Y PET imaging: Preliminary results

    Science.gov (United States)

    Zhu, Xuping; El Fakhri, Georges

    2011-01-01

    86Y is a PET agent that could be used as an ideal surrogate to allow personalized dosimetry in 90Y radionuclide therapy. However, 86Y also emits cascade gamma rays. We have developed a Monte Carlo program based on SimSET to model cascade gamma rays in PET imaging. The new simulation was validated against the GATE simulation package. Agreements within 15% were found in spatial resolution, apparent scatter fraction (ratio of coincidences outside peak regions in line source sinograms), singles and coincidence statistics, and detected photon energy distribution within the PET energy window. A 20% discrepancy was observed in the absolute scatter fraction, likely caused by differences in the tracking of higher-energy cascade gamma photons. On average the new simulation is 6 times faster than GATE, and the computing time can be further improved by using variance reduction techniques currently available in SimSET. Comparison with phantom acquisitions showed agreement in spatial resolutions and the general shape of projection profiles; however, the standard scatter correction method on the scanner is not directly applicable to 86Y PET as it leads to incorrect scatter fractions. The new simulation was used to characterize 86Y PET. Compared with conventional 18F PET, in which the major contamination at low count rates comes from scattered events, cascade gamma-involved events are more important in 86Y PET. The two types of contamination have completely different distribution patterns, which should be considered when correcting for their effects. Our approach will be further improved in the future in the modeling of random coincidences and tracking of high-energy photons, and simulation results will be used for the development of correction methods in 86Y PET. PMID:19521011

  4. Monte Carlo modeling of cascade gamma rays in {sup 86}Y PET imaging: preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Xuping; El Fakhri, Georges [Radiology Department, Massachusetts General Hospital and Harvard Medical School, 55 Fruit Street, Boston, Massachusetts, MA (United States)], E-mail: xzhu4@Partners.org, E-mail: elfakhri@pet.mgh.harvard.edu

    2009-07-07

    86Y is a PET agent that could be used as an ideal surrogate to allow personalized dosimetry in 90Y radionuclide therapy. However, 86Y also emits cascade gamma rays. We have developed a Monte Carlo program based on SimSET (Simulation System for Emission Tomography) to model cascade gamma rays in PET imaging. The new simulation was validated with the GATE simulation package. Agreements within 15% were found in spatial resolution, apparent scatter fraction (ratio of coincidences outside peak regions in line source sinograms), single and coincidence statistics and detected photons energy distribution within the PET energy window. A discrepancy of 20% was observed in the absolute scatter fraction, likely caused by differences in the tracking of higher energy cascade gamma photons. On average, the new simulation is 6 times faster than GATE, and the computing time can be further improved by using variance reduction techniques currently available in SimSET. Comparison with phantom acquisitions showed agreements in spatial resolutions and the general shape of projection profiles; however, the standard scatter correction method on the scanner is not directly applicable to 86Y PET as it leads to incorrect scatter fractions. The new simulation was used to characterize 86Y PET. Compared with conventional 18F PET, in which major contamination at low count rates comes from scattered events, cascade gamma-involved events are more important in 86Y PET. The two types of contaminations have completely different distribution patterns, which should be considered for the corrections of their effects. Our approach will be further improved in the future in the modeling of random coincidences and tracking of high-energy photons, and simulation results will be used for the development of correction methods in 86Y PET.

  5. Study of the point spread function (PSF) for 123I SPECT imaging using Monte Carlo simulation

    Science.gov (United States)

    Cot, A.; Sempau, J.; Pareto, D.; Bullich, S.; Pavía, J.; Calviño, F.; Ros, D.

    2004-07-01

    The iterative reconstruction algorithms employed in brain single-photon emission computed tomography (SPECT) allow some quantitative parameters of the image to be improved. These algorithms require accurate modelling of the so-called point spread function (PSF). Nowadays, most in vivo neurotransmitter SPECT studies employ pharmaceuticals radiolabelled with 123I. In addition to an intense line at 159 keV, the decay scheme of this radioisotope includes some higher energy gammas which may have a non-negligible contribution to the PSF. The aim of this work is to study this contribution for two low-energy high-resolution collimator configurations, namely, the parallel and the fan beam. The transport of radiation through the material system is simulated with the Monte Carlo code PENELOPE. We have developed a main program that deals with the intricacies associated with tracking photon trajectories through the geometry of the collimator and detection systems. The simulated PSFs are partly validated with a set of experimental measurements that use the 511 keV annihilation photons emitted by a 18F source. Sensitivity and spatial resolution have been studied, showing that a significant fraction of the detection events in the energy window centred at 159 keV (up to approximately 49% for the parallel collimator) are originated by higher energy gamma rays, which contribute to the spatial profile of the PSF mostly outside the 'geometrical' region dominated by the low-energy photons. Therefore, these high-energy counts are to be considered as noise, a fact that should be taken into account when modelling PSFs for reconstruction algorithms. We also show that the fan beam collimator gives higher signal-to-noise ratios than the parallel collimator for all the source positions analysed.

  6. Accurate study of FosPeg® distribution in a mouse model using fluorescence imaging technique and fluorescence white monte carlo simulations

    DEFF Research Database (Denmark)

    Xie, Haiyan; Liu, Haichun; Svenmarker, Pontus;

    2010-01-01

    Fluorescence imaging is used for quantitative in vivo assessment of drug concentration. Light attenuation in tissue is compensated for through Monte Carlo simulations. The intrinsic fluorescence intensity, directly proportional to the drug concentration, could be obtained.

  7. Development of virtual CT DICOM images of patients with tumors: application for TPS and Monte Carlo dose evaluation

    International Nuclear Information System (INIS)

    A novel procedure for the generation of a realistic virtual computed tomography (CT) image of a patient, using the advanced Boundary REPresentation (BREP)-based model MASH, has been implemented. This method can be used in radiotherapy assessment. It is shown that it is possible to introduce an artificial cancer, which can be modeled using mesh surfaces. The use of virtual CT images based on BREP models presents several advantages with respect to CT images of actual patients, such as automation, control and flexibility. As an example, two artificial cases, namely a brain and a prostate cancer, were created through the generation of images and tumor/organ contours. As a secondary objective, the described methodology has been used to generate input files for treatment planning system (TPS) and Monte Carlo dose evaluation. In this paper, we consider treatment plans generated assuming dose delivery via active proton beam scanning performed with the INFN-IBA TPS kernel. Additionally, Monte Carlo simulations of the two treatment plans were carried out with GATE/GEANT4. The work demonstrates the feasibility of the approach based on BREP modeling to produce virtual CT images. In conclusion, this study highlights the benefits of using digital phantom models capable of representing different anatomical structures and varying tumors across different patients. These models could be useful for assessing radiotherapy treatment planning systems (TPS) and computer simulations for the evaluation of the absorbed dose. (author)

  8. Patient-specific scatter correction in clinical cone beam computed tomography imaging made possible by the combination of Monte Carlo simulations and a ray tracing algorithm

    DEFF Research Database (Denmark)

    Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto;

    2013-01-01

    Purpose. Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability to predict the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from...

  9. Collimator and energy window optimization for ⁹⁰Y bremsstrahlung SPECT imaging: A SIMIND Monte Carlo study.

    Science.gov (United States)

    Roshan, Hoda Rezaei; Mahmoudian, Babak; Gharepapagh, Esmaeil; Azarm, Ahmadreza; Islamian, Jalil Pirayesh

    2016-02-01

    Treatment efficacy of radioembolization using Yttrium-90 (90Y) microspheres is assessed by 90Y bremsstrahlung single photon emission computed tomography (SPECT) imaging following radioembolization. The radioisotopic image has the potential of providing a reliable activity map of the 90Y microsphere distribution. One of the main reasons for the poor image quality in 90Y bremsstrahlung SPECT imaging is the continuous and broad energy spectrum of the bremsstrahlung photons. Furthermore, collimator geometry plays a major role in the spatial resolution, sensitivity and image contrast. Given the relatively poor quality of 90Y bremsstrahlung SPECT images, we set out to optimize the medium-energy (ME) parallel-hole collimator and the energy window. The Siemens e.cam gamma camera equipped with a ME collimator, together with a voxelized phantom, was simulated with the SImulating Medical Imaging Nuclear Detectors (SIMIND) program. We used the SIMIND Monte Carlo program to generate 90Y bremsstrahlung SPECT projections of the digital Jaszczak phantom. The phantom consists of six hot spheres ranging from 9.5 to 31.8 mm in diameter, which are used to evaluate image contrast. In order to assess the effect of the energy window on image contrast, three energy windows, ranging from 60 to 160 keV, 160 to 400 keV, and 60 to 400 keV, were set on the 90Y bremsstrahlung spectrum. In addition, the effect of the hole diameter of the ME collimator on image contrast and the bremsstrahlung spectrum was investigated. For fixed collimator and septa thickness values (3.28 cm and 1.14 mm, respectively), a hole diameter range (2.35-3.3 mm) was chosen based on an appropriate balance between spatial resolution and sensitivity. The optimal energy window for 90Y bremsstrahlung SPECT imaging was the extended energy window from 60 to 400 keV, and the optimal hole diameter of the ME collimator was 3.3 mm. Geometry of the ME parallel-hole collimator and energy...

  10. Theoretical and Monte Carlo optimization of a stacked three-layer flat-panel x-ray imager for applications in multi-spectral diagnostic medical imaging

    Science.gov (United States)

    Lopez Maurino, Sebastian; Badano, Aldo; Cunningham, Ian A.; Karim, Karim S.

    2016-03-01

    We propose a new design of a stacked three-layer flat-panel x-ray detector for dual-energy (DE) imaging. Each layer consists of its own scintillator of individual thickness and an underlying thin-film-transistor-based flat-panel. Three images are obtained simultaneously in the detector during the same x-ray exposure, thereby eliminating any motion artifacts. The detector operation is two-fold: a conventional radiography image can be obtained by combining all three layers' images, while a DE subtraction image can be obtained from the front and back layers' images, where the middle layer acts as a mid-filter that helps achieve spectral separation. We proceed to optimize the detector parameters for two sample imaging tasks that could particularly benefit from this new detector by obtaining the best possible signal to noise ratio per root entrance exposure using well-established theoretical models adapted to fit our new design. These results are compared to a conventional DE temporal subtraction detector and a single-shot DE subtraction detector with a copper mid-filter, both of which underwent the same theoretical optimization. The findings are then validated using advanced Monte Carlo simulations for all optimized detector setups. Given the performance expected from initial results and the recent decrease in price for digital x-ray detectors, the simplicity of the three-layer stacked imager approach appears promising to usher in a new generation of multi-spectral digital x-ray diagnostics.

  11. Validation of the GATE Monte Carlo simulation platform for modelling a CsI(Tl) scintillation camera dedicated to small animal imaging

    CERN Document Server

    Lazaro, D; Loudos, G; Strul, D; Santin, G; Giokaris, N; Donnarieix, D; Maigne, L; Spanoudaki, V; Styliaris, S; Staelens, S; Breton, V

    2004-01-01

    Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT4 Application for Tomographic Emission) is a new Monte Carlo simulation platform based on GEANT4 dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype of scintillation camera dedicated to small animal imaging and consisting of a CsI(Tl) crystal array coupled to a position sensitive photomultiplier tube. The relevance of GATE to model the camera prototype was assessed by comparing simulated 99mTc point spread functions, energy spectra, sensitivities, scatter fractions and image of a capillary phantom with the corresponding experimental measurements. Results showed an excellent agreement between simulated and experimental data: experimental spatial resolutions were predicted with an error less than 100 μm. The difference between experimental...

  12. A Monte Carlo study of the effect of coded-aperture material and thickness on neutron imaging

    International Nuclear Information System (INIS)

    In this paper, a coded-aperture design for a scintillator-based neutron imaging system has been studied using a series of Monte Carlo simulations, with the aim of optimising a system making use of the EJ-426 neutron scintillator detector. This type of scintillator has a low sensitivity to gamma rays and is therefore particularly useful for neutron detection in a mixed radiation environment. Simulations have been conducted using varying coded-aperture materials and different coded-aperture thicknesses. From this, neutron images have been produced and compared qualitatively and quantitatively for each case to find the best material for the MURA (modified uniformly redundant array) pattern. The neutron images generated also allow observations of how differing coded-aperture thicknesses affect the system. A system was simulated in which a neutron-sensitive scintillator is used in conjunction with a MURA coded aperture to detect and locate a neutron-emitting point source centralised in the system. A comparison between the results for the different coded-aperture thicknesses is discussed via the calculation of the system error between the reconstructed source location and the actual location. As the system is small scale with a relatively large step along the axis (0.5 cm), the smaller error values provide satisfactory results, corresponding to only a few centimetres difference between the reconstructed and actual source locations. A general trend of increasing error can be deduced both as the thickness of the coded-aperture material decreases and as the capture cross-section of the material decreases. (authors)
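    For reference, a MURA mask is built from quadratic residues modulo a prime; the sketch below generates the standard square MURA pattern (after Gottesman and Fenimore's construction). The prime in the example is arbitrary; the record does not state the mask rank used.

```python
import numpy as np

def mura(p: int) -> np.ndarray:
    """Generate a p x p MURA (modified uniformly redundant array) mask for
    prime p; 1 = open element, 0 = opaque element."""
    residues = {(k * k) % p for k in range(1, p)}   # quadratic residues mod p
    c = np.array([1 if i in residues else -1 for i in range(p)])
    A = np.zeros((p, p), dtype=int)
    for i in range(p):
        for j in range(p):
            if i == 0:
                A[i, j] = 0              # first row closed
            elif j == 0:
                A[i, j] = 1              # first column open (except corner)
            elif c[i] * c[j] == 1:
                A[i, j] = 1              # open where residue signs agree
    return A

print(mura(5))   # small example; practical masks use larger primes
```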

  13. Monte Carlo simulation studies on scintillation detectors and image reconstruction of brain-phantom tumors in TOFPET

    Directory of Open Access Journals (Sweden)

    Mondal Nagendra

    2009-01-01

    This study presents Monte Carlo simulation (MCS) results on the detection efficiencies, spatial resolutions and resolving powers of time-of-flight (TOF) PET detector systems. Cerium-activated lutetium oxyorthosilicate (Lu2SiO5:Ce, in short LSO), barium fluoride (BaF2) and BriLanCe 380 (cerium-doped lanthanum tribromide, in short LaBr3) scintillation crystals are studied in view of their good time and energy resolutions and short decay times. The results of MCS based on GEANT show that the spatial resolution, detection efficiency and resolving power of LSO are better than those of BaF2 and LaBr3, although it possesses inferior time and energy resolutions. Instead of the conventional position reconstruction method, the newly established image reconstruction method (discussed in previous work) is applied to produce high-quality images. Validation is an important step to ensure that this imaging method fulfills its purpose; it is carried out by reconstructing images of two tumors in a brain phantom.

  14. Evaluation of the respiratory motion effect in small animal PET images with GATE Monte Carlo simulations

    OpenAIRE

    Branco, Susana; Almeida, Pedro; Jan, Sébastien

    2011-01-01

    The rapid growth in genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging. Small animal imaging has been applied frequently to mice and rats, which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows subjects to be used as their own controls, reducing interanimal variability...

  15. Ideal-observer computation in medical imaging with use of Markov-chain Monte Carlo techniques

    OpenAIRE

    Kupinski, Matthew A.; Hoppin, John W.; Clarkson, Eric; Barrett, Harrison H.

    2003-01-01

    The ideal observer sets an upper limit on the performance of an observer on a detection or classification task. The performance of the ideal observer can be used to optimize hardware components of imaging systems and also to determine another observer's relative performance in comparison with the best possible observer. The ideal observer employs complete knowledge of the statistics of the imaging system, including the noise and object variability. Thus computing the ideal observer for images...

  16. Sensitivity study for CT image use in Monte Carlo treatment planning

    Science.gov (United States)

    Verhaegen, Frank; Devic, Slobodan

    2005-03-01

    An important step in Monte Carlo treatment planning (MCTP), which is commonly performed uncritically, is segmentation of the patient CT data into a voxel phantom for dose calculation. In addition to assigning mass densities to voxels, as is done in conventional TP, this entails assigning media. Mis-assignment of media can potentially lead to significant dose errors in MCTP. In this work, a test phantom with exact-known composition was used to study CT segmentation errors and to quantify subsequent MCTP inaccuracies. For our test cases, we observed dose errors in some regions of up to 10% for 6 and 15 MV photons, more than 30% for an 18 MeV electron beam and more than 40% for 250 kVp photons. It is concluded that a careful CT calibration with a suitable phantom is essential. Generic calibrations and the use of commercial CT phantoms have to be critically assessed.
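
    In sketch form, the segmentation step under scrutiny maps each voxel's CT number to a mass density and a medium before the MC dose calculation. The following is a minimal illustration assuming a generic piecewise-linear ramp and HU thresholds; the breakpoints and media are placeholder assumptions, not the calibration assessed in the paper.

```python
import numpy as np

def hu_to_density(hu):
    """Piecewise-linear CT ramp (illustrative, not the paper's calibration)."""
    hu = np.asarray(hu, dtype=float)
    rho = np.where(hu < 100.0,
                   1.0 + hu / 1000.0,            # air (-1000 HU) through soft tissue
                   1.1 + (hu - 100.0) / 1700.0)  # shallower bone ramp
    return np.clip(rho, 1e-3, None)

MEDIA_BINS = [-950.0, -150.0, 100.0]             # air | lung | soft tissue | bone
MEDIA = ["air", "lung", "soft_tissue", "bone"]

def assign_medium(hu):
    """Assign a medium by thresholding the CT number (thresholds assumed)."""
    return MEDIA[int(np.digitize([hu], MEDIA_BINS)[0])]

print(assign_medium(-400.0), hu_to_density(-400.0))  # lung, rho ~ 0.6 g/cm^3
```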

  17. Cerenkov luminescence imaging of human breast cancer: a Monte Carlo simulations study

    International Nuclear Information System (INIS)

    Cerenkov luminescence imaging (CLI) is a novel molecular imaging technique based on the detection of Cerenkov light produced by beta particles travelling through biological tissues. In this paper we used simulations of 18F and 90Y sources to investigate the possibility of detecting Cerenkov luminescence in human breast tissue, in order to evaluate the potential of the CLI technique in a clinical setting. A human breast digital phantom was obtained from an 18F-FDG CT-PET scan. The spectral features of the breast surface emission were obtained, as well as the simulated images obtainable with a cooled CCD detector. The simulated images revealed a signal-to-noise ratio equal to 6 for a 300 s acquisition time. We concluded that a dedicated human Cerenkov imaging detector can be designed to offer a valid low-cost alternative to diagnostic techniques in nuclear medicine, in particular allowing the detection of beta-minus emitters used in radiotherapy.
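
    The physics underlying CLI sets a simple energy threshold: a charged particle radiates Cerenkov light only when beta > 1/n. The snippet below evaluates the corresponding electron kinetic-energy threshold; the tissue refractive index of 1.40 is an assumed illustrative value.

```python
import math

M_E_C2_MEV = 0.511  # electron rest energy, MeV

def cerenkov_threshold_mev(n: float) -> float:
    """Electron kinetic-energy threshold for Cerenkov emission: beta > 1/n."""
    gamma_th = 1.0 / math.sqrt(1.0 - 1.0 / n**2)
    return M_E_C2_MEV * (gamma_th - 1.0)

print(f"water  (n = 1.33): {cerenkov_threshold_mev(1.33) * 1e3:.0f} keV")
print(f"tissue (n ~ 1.40, assumed): {cerenkov_threshold_mev(1.40) * 1e3:.0f} keV")
```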

  18. A practical cone-beam CT scatter correction method with optimized Monte Carlo simulations for image-guided radiation therapy

    Science.gov (United States)

    Xu, Yuan; Bai, Ti; Yan, Hao; Ouyang, Luo; Pompos, Arnold; Wang, Jing; Zhou, Linghong; Jiang, Steve B.; Jia, Xun

    2015-05-01

    Cone-beam CT (CBCT) has become the standard image guidance tool for patient setup in image-guided radiation therapy. However, due to its large illumination field, scattered photons severely degrade its image quality. While kernel-based scatter correction methods have been used routinely in the clinic, it is still desirable to develop Monte Carlo (MC) simulation-based methods due to their accuracy. However, the high computational burden of the MC method has prevented routine clinical application. This paper reports our recent development of a practical method of MC-based scatter estimation and removal for CBCT. In contrast with conventional MC approaches that estimate scatter signals using a scatter-contaminated CBCT image, our method used a planning CT image for MC simulation, which has the advantages of accurate image intensity and absence of image truncation. In our method, the planning CT was first rigidly registered with the CBCT. Scatter signals were then estimated via MC simulation. After scatter signals were removed from the raw CBCT projections, a corrected CBCT image was reconstructed. The entire workflow was implemented on a GPU platform for high computational efficiency. Strategies such as projection denoising, CT image downsampling, and interpolation along the angular direction were employed to further enhance the calculation speed. We studied the impact of key parameters in the workflow on the resulting accuracy and efficiency, based on which the optimal parameter values were determined. Our method was evaluated in numerical simulation, phantom, and real patient cases. In the simulation cases, our method reduced mean HU errors from 44 to 3 HU and from 78 to 9 HU in the full-fan and the half-fan cases, respectively. In both the phantom and the patient cases, image artifacts caused by scatter, such as ring artifacts around the bowtie area, were reduced. With all the techniques employed, we achieved computation time of less than 30 s including the
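
    The correction chain described above (MC scatter estimates from the registered planning CT, projection denoising, interpolation along the angular direction, subtraction before reconstruction) can be sketched as follows. This is a simplified illustration, not the authors' GPU implementation; the smoothing width and the coarse angular sampling are assumptions echoing the strategies named in the abstract.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_projections(raw, scatter_coarse, coarse_idx, n_views):
    """Subtract MC-estimated scatter from CBCT projections.

    raw:            (n_views, ny, nx) measured, scatter-contaminated projections
    scatter_coarse: (n_coarse, ny, nx) noisy MC scatter estimates at a view subset
    coarse_idx:     view indices where MC scatter was simulated (e.g. every 4th)
    """
    # de-noise the low-photon-count MC estimates in the projection plane
    sc = gaussian_filter(scatter_coarse, sigma=(0, 3, 3))
    # interpolate scatter across gantry angle for the skipped views
    full = np.empty_like(raw)
    for iy in range(raw.shape[1]):
        for ix in range(raw.shape[2]):
            full[:, iy, ix] = np.interp(np.arange(n_views), coarse_idx, sc[:, iy, ix])
    # subtract, guarding against non-physical negative intensities
    return np.clip(raw - full, 1e-6, None)
```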

  19. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    Science.gov (United States)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed.
    Catalogue identifier: AERO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland

  20. Monte Carlo simulations of the dose from imaging with GE eXplore 120 micro-CT using GATE

    International Nuclear Information System (INIS)

    Purpose: Small animals are increasingly used as translational models in preclinical imaging studies involving microCT, during which the subjects can be exposed to large amounts of radiation. While the radiation levels are generally sublethal, studies have shown that low-level radiation can change physiological parameters in mice. In order to rule out any influence of radiation on the outcome of such experiments, or resulting deterministic effects in the subjects, the levels of radiation involved need to be addressed. The aim of this study was to investigate the radiation dose delivered by the GE eXplore 120 microCT non-invasively using Monte Carlo simulations in GATE and to compare results to previously obtained experimental values. Methods: Tungsten X-ray spectra were simulated at 70, 80, and 97 kVp using an analytical tool and their half-value layers were simulated for spectra validation against experimentally measured values of the physical X-ray tube. A Monte Carlo model of the microCT system was set up and four protocols that are regularly applied to live animal scanning were implemented. The computed tomography dose index (CTDI) inside a PMMA phantom was derived and multiple field of view acquisitions were simulated using the PMMA phantom, a representative mouse and rat. Results: Simulated half-value layers agreed with experimentally obtained results within a 7% error window. The CTDI ranged from 20 to 56 mGy and closely matched experimental values. Derived organ doses in mice reached 459 mGy in bones and up to 200 mGy in soft tissue organs using the highest energy protocol. Dose levels in rats were lower due to the increased mass of the animal compared to mice. The uncertainty of all dose simulations was below 14%. Conclusions: Monte Carlo simulations proved a valuable tool to investigate the 3D dose distribution in animals from microCT. Small animals, especially mice (due to their small volume), receive large amounts of radiation from the GE eXplore 120

  1. Monte Carlo simulations of the dose from imaging with GE eXplore 120 micro-CT using GATE

    Energy Technology Data Exchange (ETDEWEB)

    Bretin, Florian; Bahri, Mohamed Ali; Luxen, André; Phillips, Christophe; Plenevaux, Alain; Seret, Alain, E-mail: aseret@ulg.ac.be [Cyclotron Research Centre, University of Liège, Sart Tilman B30, Liège 4000 (Belgium)

    2015-10-15

    Purpose: Small animals are increasingly used as translational models in preclinical imaging studies involving microCT, during which the subjects can be exposed to large amounts of radiation. While the radiation levels are generally sublethal, studies have shown that low-level radiation can change physiological parameters in mice. In order to rule out any influence of radiation on the outcome of such experiments, or resulting deterministic effects in the subjects, the levels of radiation involved need to be addressed. The aim of this study was to investigate the radiation dose delivered by the GE eXplore 120 microCT non-invasively using Monte Carlo simulations in GATE and to compare results to previously obtained experimental values. Methods: Tungsten X-ray spectra were simulated at 70, 80, and 97 kVp using an analytical tool and their half-value layers were simulated for spectra validation against experimentally measured values of the physical X-ray tube. A Monte Carlo model of the microCT system was set up and four protocols that are regularly applied to live animal scanning were implemented. The computed tomography dose index (CTDI) inside a PMMA phantom was derived and multiple field of view acquisitions were simulated using the PMMA phantom, a representative mouse and rat. Results: Simulated half-value layers agreed with experimentally obtained results within a 7% error window. The CTDI ranged from 20 to 56 mGy and closely matched experimental values. Derived organ doses in mice reached 459 mGy in bones and up to 200 mGy in soft tissue organs using the highest energy protocol. Dose levels in rats were lower due to the increased mass of the animal compared to mice. The uncertainty of all dose simulations was below 14%. Conclusions: Monte Carlo simulations proved a valuable tool to investigate the 3D dose distribution in animals from microCT. Small animals, especially mice (due to their small volume), receive large amounts of radiation from the GE eXplore 120
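
    The CTDI metric reported above has a standard definition that is easy to reproduce: the single-rotation dose profile is integrated over +/-50 mm and normalised by the nominal beam width, and centre and periphery measurements in the phantom can be combined into a weighted index. A sketch under those textbook definitions (the abstract does not state that the weighted variant was used):

```python
import numpy as np

def ctdi100(dose_profile_mgy, z_mm, n_slices, slice_thickness_mm):
    """CTDI100: dose profile integrated over +/-50 mm, divided by N*T."""
    mask = np.abs(z_mm) <= 50.0
    return np.trapz(dose_profile_mgy[mask], z_mm[mask]) / (n_slices * slice_thickness_mm)

def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CTDI from centre and periphery holes of the PMMA phantom."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0
```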

  2. Dynamic 99mTc-MAG3 renography: images for quality control obtained by combining pharmacokinetic modelling, an anthropomorphic computer phantom and Monte Carlo simulated scintillation camera imaging

    Science.gov (United States)

    Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael

    2013-05-01

    In dynamic renal scintigraphy, the main interest is the radiopharmaceutical redistribution as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based images for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked, regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images including effects of photon attenuation, scattering, limited spatial resolution and noise, are simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.
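
    The key property of the PK model, conservation of the total tracer between time points, is easy to see in a toy linear compartment chain (plasma to kidneys to bladder). The structure and rate constants below are illustrative assumptions, not the published MAG3 model.

```python
import numpy as np
from scipy.integrate import solve_ivp

K_PK, K_KB = 0.12, 0.05  # 1/min: plasma->kidney uptake, kidney->bladder washout (assumed)

def rhs(t, y):
    plasma, kidney, bladder = y
    return [-K_PK * plasma,
            K_PK * plasma - K_KB * kidney,
            K_KB * kidney]

# injected activity normalised to 1; time-activity curves over 40 min
sol = solve_ivp(rhs, (0.0, 40.0), [1.0, 0.0, 0.0],
                t_eval=np.linspace(0.0, 40.0, 81))

# total tracer is preserved between time points, as the phantom requires
assert np.allclose(sol.y.sum(axis=0), 1.0)
```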

  3. Design considerations for a C-shaped PET system, dedicated to small animal brain imaging, using GATE Monte Carlo simulations

    Science.gov (United States)

    Efthimiou, N.; Papadimitroulas, P.; Kostou, T.; Loudos, G.

    2015-09-01

    Commercial clinical and preclinical PET scanners rely on the full cylindrical geometry for whole body scans as well as for dedicated organs. In this study we propose the construction of a low-cost dual-head C-shaped PET system dedicated to small animal brain imaging. Monte Carlo simulation studies were performed using the GATE toolkit to evaluate the optimum design in terms of sensitivity, distortions in the FOV and spatial resolution. The PET model is based on SiPMs and BGO pixelated arrays. Four different configurations, with C-angles of 0°, 15°, 30° and 45° between the modules, were considered. Geometrical phantoms were used for the evaluation process. STIR software, extended by an efficient multi-threaded ray tracing technique, was used for the image reconstruction. The algorithm automatically adjusts the size of the FOV according to the shape of the detector's geometry. The results showed a sensitivity improvement of ∼15% for the 45° C-angle compared to the 0° case, and the spatial resolution for the 45° C-angle was found to be 2 mm.

  4. Assessment of array scintillation detector for follicle thyroid 2-D image acquisition using Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Carlos Borges da; Santanna, Claudio Reis de [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mails: borges@ien.gov.br; santanna@ien.gov.br; Braz, Delson [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE). Lab. de Instrumentacao Nuclear]. E-mail: delson@lin.ufrj.br; Carvalho, Denise Pires de [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Inst. de Biofisica Carlos Chagas Filho. Lab. de Fisiologia Endocrina]. E-mail: dencarv@ufrj.br

    2007-07-01

    This work presents a study to identify an adequate inorganic scintillation detector array to be coupled to a specific light photosensor, a charge-coupled device (CCD), through a fiber-optic plate. The goal is to choose the type of detector that suits two-dimensional image acquisition of thyroid cell tissue with high resolution and detection efficiency, in order to map a follicle image using gamma radiation emission. Point- and volumetric-source detector simulations were performed with the MCNP4B general-purpose code, considering different source energies, detector materials and geometries, including pixel sizes and reflector types. In this study, simulations were performed for 7 x 7 and 127 x 127 arrays using CsI(Tl) and BGO scintillation crystals, with pixel sizes ranging from 1 x 1 cm² to 10 x 10 µm² and crystal thicknesses ranging from 1 mm to 10 mm. The effect of all these parameters was investigated to find the source-detector system that results in an image with the best contrast detail. The results showed that it is possible to design a specific imaging system suitable for in-vitro studies, specifically in radiobiology applied to endocrine physiology. (author)

  5. 3D imaging using combined neutron-photon fan-beam tomography: A Monte Carlo study.

    Science.gov (United States)

    Hartman, J; Yazdanpanah, A Pour; Barzilov, A; Regentova, E

    2016-05-01

    The application of combined neutron-photon tomography to 3D imaging is examined using MCNP5 simulations for objects of simple shapes and different materials. Two-dimensional transmission projections were simulated for fan-beam scans using 2.5 MeV deuterium-deuterium and 14 MeV deuterium-tritium neutron sources, and high-energy X-ray sources of 1 MeV, 6 MeV and 9 MeV. Photons enable assessment of electron density and the related mass density, while neutrons aid in estimating the product of density and the material-specific microscopic cross section; the ratio between the two provides the composition, while CT allows shape evaluation. Using the developed imaging technique, objects and their material compositions have been visualized. PMID:26953978
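
    The density/cross-section ratio idea can be sketched in a few lines. The simplification assumed here is that the photon mass-attenuation coefficient is roughly material-independent at these energies (Compton-dominated), so the photon transmission yields areal density alone and the neutron transmission then isolates the material-specific cross section; the transmission values are made up.

```python
import numpy as np

def areal_density_g_cm2(t_photon, mu_m=0.03):
    """Photon channel: T = exp(-mu_m * rho * t) -> areal density rho*t (g/cm^2).
    mu_m (cm^2/g) assumed roughly material-independent in the Compton regime."""
    return -np.log(t_photon) / mu_m

def neutron_sigma_per_mass(t_neutron, rho_t):
    """Neutron channel: T = exp(-(N*sigma)*t); dividing the log-transmission by
    the areal density leaves a material-specific cross section per unit mass."""
    return -np.log(t_neutron) / rho_t

rho_t = areal_density_g_cm2(0.60)              # hypothetical photon transmission
print(neutron_sigma_per_mass(0.45, rho_t))     # cm^2/g: characterises composition
```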

  6. SU-E-J-205: Monte Carlo Modeling of Ultrasound Probes for Real-Time Ultrasound Image-Guided Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Hristov, D; Schlosser, J; Bazalova, M [Stanford Universtiy, Stanford, CA (United States); Chen, J [UCSF Comprehensive Cancer Center, Lafayette, CA (United States)

    2014-06-01

    Purpose: To quantify the effect of ultrasound (US) probe beam attenuation for radiation therapy delivered under real-time US image guidance by means of Monte Carlo (MC) simulations. Methods: MC models of two Philips US probes, an X6-1 matrix-array transducer and a C5-2 curved-array transducer, were built based on their CT images in the EGSnrc BEAMnrc and DOSXYZnrc codes. Due to the metal parts, the probes were scanned in a Tomotherapy machine with a 3.5 MV beam. Mass densities in the probes were assigned based on an electron density calibration phantom consisting of cylinders with mass densities between 0.2–8.0 g/cm{sup 3}. Beam attenuation due to the probes was measured in a solid water phantom for a 6 MV and 15 MV 15x15 cm{sup 2} beam delivered on a Varian Trilogy linear accelerator. The dose was measured with the PTW-729 ionization chamber array at two depths and compared to MC simulations. The extreme case beam attenuation expected in robotic US image guided radiotherapy for probes in upright position was quantified by means of MC simulations. Results: The 3.5 MV CT number to mass density calibration curve was found to be linear with R{sup 2} > 0.99. The maximum mass densities were 4.6 and 4.2 g/cm{sup 3} in the C5-2 and X6-1 probe, respectively. Gamma analysis of the simulated and measured doses revealed that over 98% of measurement points passed the 3%/3mm criteria for both probes and measurement depths. The extreme attenuation for probes in upright position was found to be 25% and 31% for the C5-2 and X6-1 probe, respectively, for both 6 and 15 MV beams at 10 cm depth. Conclusion: MC models of two US probes used for real-time image guidance during radiotherapy have been built. As a Result, radiotherapy treatment planning with the imaging probes in place can now be performed. J Schlosser is an employee of SoniTrack Systems, Inc. D Hristov has financial interest in SoniTrack Systems, Inc.

  7. TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma

    Energy Technology Data Exchange (ETDEWEB)

    Sisniega, A; Zbijewski, W; Stayman, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Yorkston, J [Carestream Health (United States); Aygun, N [Department of Radiology, Johns Hopkins University (United States); Koliatsos, V [Department of Neurology, Johns Hopkins University (United States); Siewerdsen, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Department of Radiology, Johns Hopkins University (United States)

    2014-06-15

    Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in the projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding 3.5 minute run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate uniformity improvement of 18 HU and contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-to-noise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT dedicated to brain

  8. Concurrent Reflectance Confocal Microscopy and Laser Doppler Flowmetry to Improve Skin Cancer Imaging: A Monte Carlo Model and Experimental Validation.

    Science.gov (United States)

    Mowla, Alireza; Taimre, Thomas; Lim, Yah Leng; Bertling, Karl; Wilson, Stephen J; Prow, Tarl W; Soyer, H Peter; Rakić, Aleksandar D

    2016-01-01

    Optical interrogation of suspicious skin lesions is standard care in the management of skin cancer worldwide. Morphological and functional markers of malignancy are often combined to improve expert human diagnostic power. We propose the evaluation of the combination of two independent optical biomarkers of skin tumours concurrently. The morphological modality of reflectance confocal microscopy (RCM) is combined with the functional modality of laser Doppler flowmetry, which is capable of quantifying tissue perfusion. To realize the idea, we propose laser feedback interferometry as an implementation of RCM, which is able to detect the Doppler signal in addition to the confocal reflectance signal. Based on the proposed technique, we study numerical models of skin tissue incorporating two optical biomarkers of malignancy: (i) abnormal red blood cell velocities and concentrations and (ii) anomalous optical properties manifested through tissue confocal reflectance, using Monte Carlo simulation. We also conduct a laboratory experiment on a microfluidic channel containing a dynamic turbid medium, to validate the efficacy of the technique. We quantify the performance of the technique by examining a signal to background ratio (SBR) in both the numerical and experimental models, and it is shown that both simulated and experimental SBRs improve consistently using this technique. This work indicates the feasibility of an optical instrument, which may have a role in enhanced imaging of skin malignancies. PMID:27598157

  9. Concurrent Reflectance Confocal Microscopy and Laser Doppler Flowmetry to Improve Skin Cancer Imaging: A Monte Carlo Model and Experimental Validation

    Science.gov (United States)

    Mowla, Alireza; Taimre, Thomas; Lim, Yah Leng; Bertling, Karl; Wilson, Stephen J.; Prow, Tarl W.; Soyer, H. Peter; Rakić, Aleksandar D.

    2016-01-01

    Optical interrogation of suspicious skin lesions is standard care in the management of skin cancer worldwide. Morphological and functional markers of malignancy are often combined to improve expert human diagnostic power. We propose the evaluation of the combination of two independent optical biomarkers of skin tumours concurrently. The morphological modality of reflectance confocal microscopy (RCM) is combined with the functional modality of laser Doppler flowmetry, which is capable of quantifying tissue perfusion. To realize the idea, we propose laser feedback interferometry as an implementation of RCM, which is able to detect the Doppler signal in addition to the confocal reflectance signal. Based on the proposed technique, we study numerical models of skin tissue incorporating two optical biomarkers of malignancy: (i) abnormal red blood cell velocities and concentrations and (ii) anomalous optical properties manifested through tissue confocal reflectance, using Monte Carlo simulation. We also conduct a laboratory experiment on a microfluidic channel containing a dynamic turbid medium, to validate the efficacy of the technique. We quantify the performance of the technique by examining a signal to background ratio (SBR) in both the numerical and experimental models, and it is shown that both simulated and experimental SBRs improve consistently using this technique. This work indicates the feasibility of an optical instrument, which may have a role in enhanced imaging of skin malignancies. PMID:27598157

  10. Monte-Carlo simulation of pinhole collimator of a small field of view gamma camera for small animal imaging

    Institute of Scientific and Technical Information of China (English)

    ZHU Jie; MA Wenyan; ZHU Yufeng; MA Hongguang; WU Yuelei; HU Huasi; ZHANG Boping; HUO Yonggang; LIU Silu; JIAN Bin; WANG Zhaomin

    2009-01-01

    Needs in scintimammography applications, especially small animal cardiac imaging, have led to the development of a small field of view, high spatial resolution gamma camera with a pinhole collimator. However, an ideal pinhole collimator must strike a compromise between spatial resolution and sensitivity. In order to design a pinhole collimator with optimized sensitivity and spatial resolution, the spatial resolution and the geometric sensitivity response as a function of the source-to-collimator distance were obtained by means of Monte Carlo simulation for a small field of view gamma camera with pinhole collimators of various hole diameters. The results show that, at a source-to-collimator distance of 3 cm, the camera with pinholes of 1 mm, 1.5 mm and 2 mm diameter has spatial resolutions of 1.5 mm, 2.25 mm and 3 mm and geometric sensitivities of 0.016%, 0.022% and 0.036%, respectively. We chose a pinhole collimator with a hole diameter of 1.2 mm for our gamma camera design, based on the trade-off between sensitivity and resolution.
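
    The resolution/sensitivity compromise discussed above follows textbook pinhole relations. The sketch below assumes a pinhole-to-detector distance of 6 cm (a hypothetical value chosen for illustration; with it the geometric resolution reproduces the reported figures, while the analytic sensitivity omits the penetration effects the MC simulation captures):

```python
def pinhole_resolution_mm(d_mm, b_cm, a_cm):
    """Geometric object-plane resolution: R = d * (1 + b/a),
    b = source-to-pinhole distance, a = pinhole-to-detector distance."""
    return d_mm * (1.0 + b_cm / a_cm)

def pinhole_sensitivity(d_mm, b_cm):
    """On-axis geometric efficiency: g = d^2 / (16 * b^2)."""
    d_cm = d_mm / 10.0
    return d_cm**2 / (16.0 * b_cm**2)

for d in (1.0, 1.5, 2.0):  # hole diameters, mm
    print(f"d = {d} mm: R = {pinhole_resolution_mm(d, b_cm=3.0, a_cm=6.0):.2f} mm, "
          f"g = {pinhole_sensitivity(d, b_cm=3.0):.4%}")
```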

  11. Markov Chain Monte Carlo Random Effects Modeling in Magnetic Resonance Image Processing Using the BRugs Interface to WinBUGS

    Directory of Open Access Journals (Sweden)

    David G. Gadian

    2011-10-01

    A common feature of many magnetic resonance image (MRI) data processing methods is the voxel-by-voxel (a voxel is a volume element) manner in which the processing is performed. In general, however, MRI data are expected to exhibit some level of spatial correlation, rendering an independent-voxels treatment inefficient in its use of the data. Bayesian random effect models are expected to be more efficient owing to their information-borrowing behaviour. To illustrate the Bayesian random effects approach, this paper outlines a Markov chain Monte Carlo (MCMC) analysis of a perfusion MRI dataset, implemented in R using the BRugs package. BRugs provides an interface to WinBUGS and its GeoBUGS add-on. WinBUGS is a widely used programme for performing MCMC analyses, with a focus on Bayesian random effect models. A simultaneous modeling of both voxels (restricted to a region of interest) and multiple subjects is demonstrated. Despite the low signal-to-noise ratio in the magnetic resonance signal intensity data, useful model signal intensity profiles are obtained. The merits of random effects modeling are discussed in comparison with the alternative approaches based on region-of-interest averaging and repeated independent voxels analysis. This paper focuses on perfusion MRI for the purpose of illustration, the main proposition being that random effects modeling is expected to be beneficial in many other MRI applications in which the signal-to-noise ratio is a limiting factor.

  12. Concurrent Reflectance Confocal Microscopy and Laser Doppler Flowmetry to Improve Skin Cancer Imaging: A Monte Carlo Model and Experimental Validation

    Directory of Open Access Journals (Sweden)

    Alireza Mowla

    2016-09-01

    Optical interrogation of suspicious skin lesions is standard care in the management of skin cancer worldwide. Morphological and functional markers of malignancy are often combined to improve expert human diagnostic power. We propose the evaluation of the combination of two independent optical biomarkers of skin tumours concurrently. The morphological modality of reflectance confocal microscopy (RCM) is combined with the functional modality of laser Doppler flowmetry, which is capable of quantifying tissue perfusion. To realize the idea, we propose laser feedback interferometry as an implementation of RCM, which is able to detect the Doppler signal in addition to the confocal reflectance signal. Based on the proposed technique, we study numerical models of skin tissue incorporating two optical biomarkers of malignancy: (i) abnormal red blood cell velocities and concentrations and (ii) anomalous optical properties manifested through tissue confocal reflectance, using Monte Carlo simulation. We also conduct a laboratory experiment on a microfluidic channel containing a dynamic turbid medium, to validate the efficacy of the technique. We quantify the performance of the technique by examining a signal to background ratio (SBR) in both the numerical and experimental models, and it is shown that both simulated and experimental SBRs improve consistently using this technique. This work indicates the feasibility of an optical instrument, which may have a role in enhanced imaging of skin malignancies.

  13. Concurrent Reflectance Confocal Microscopy and Laser Doppler Flowmetry to Improve Skin Cancer Imaging: A Monte Carlo Model and Experimental Validation.

    Science.gov (United States)

    Mowla, Alireza; Taimre, Thomas; Lim, Yah Leng; Bertling, Karl; Wilson, Stephen J; Prow, Tarl W; Soyer, H Peter; Rakić, Aleksandar D

    2016-09-01

    Optical interrogation of suspicious skin lesions is standard care in the management of skin cancer worldwide. Morphological and functional markers of malignancy are often combined to improve expert human diagnostic power. We propose the evaluation of the combination of two independent optical biomarkers of skin tumours concurrently. The morphological modality of reflectance confocal microscopy (RCM) is combined with the functional modality of laser Doppler flowmetry, which is capable of quantifying tissue perfusion. To realize the idea, we propose laser feedback interferometry as an implementation of RCM, which is able to detect the Doppler signal in addition to the confocal reflectance signal. Based on the proposed technique, we study numerical models of skin tissue incorporating two optical biomarkers of malignancy: (i) abnormal red blood cell velocities and concentrations and (ii) anomalous optical properties manifested through tissue confocal reflectance, using Monte Carlo simulation. We also conduct a laboratory experiment on a microfluidic channel containing a dynamic turbid medium, to validate the efficacy of the technique. We quantify the performance of the technique by examining a signal to background ratio (SBR) in both the numerical and experimental models, and it is shown that both simulated and experimental SBRs improve consistently using this technique. This work indicates the feasibility of an optical instrument, which may have a role in enhanced imaging of skin malignancies.

  14. Experimental Component Characterization, Monte-Carlo-Based Image Generation and Source Reconstruction for the Neutron Imaging System of the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, C A; Moran, M J

    2007-08-21

    The Neutron Imaging System (NIS) is one of seven ignition target diagnostics under development for the National Ignition Facility. The NIS is required to record hot-spot (13-15 MeV) and downscattered (6-10 MeV) images with a resolution of 10 microns and a signal-to-noise ratio (SNR) of 10 at the 20% contour. The NIS is a valuable diagnostic since the downscattered neutrons reveal the spatial distribution of the cold fuel during an ignition attempt, providing important information in the case of a failed implosion. The present study explores the parameter space of several line-of-sight (LOS) configurations that could serve as the basis for the final design. Six commercially available organic scintillators were experimentally characterized for their light emission decay profile and neutron sensitivity. The samples showed a long lived decay component that makes direct recording of a downscattered image impossible. The two best candidates for the NIS detector material are: EJ232 (BC422) plastic fibers or capillaries filled with EJ399B. A Monte Carlo-based end-to-end model of the NIS was developed to study the imaging capabilities of several LOS configurations and verify that the recovered sources meet the design requirements. The model includes accurate neutron source distributions, aperture geometries (square pinhole, triangular wedge, mini-penumbral, annular and penumbral), their point spread functions, and a pixelated scintillator detector. The modeling results show that a useful downscattered image can be obtained by recording the primary peak and the downscattered images, and then subtracting a decayed version of the former from the latter. The difference images need to be deconvolved in order to obtain accurate source distributions. The images are processed using a frequency-space modified-regularization algorithm and low-pass filtering. The resolution and SNR of these sources are quantified by using two surrogate sources. The simulations show that all LOS

  15. SU-C-201-06: Utility of Quantitative 3D SPECT/CT Imaging in Patient Specific Internal Dosimetry of 153-Samarium with GATE Monte Carlo Package

    Energy Technology Data Exchange (ETDEWEB)

    Fallahpoor, M; Abbasi, M [Tehran University of Medical Sciences, Vali-Asr Hospital, Tehran, Tehran (Iran, Islamic Republic of); Sen, A [University of Houston, Houston, TX (United States); Parach, A [Shahid Sadoughi University of Medical Sciences, Yazd, Yazd (Iran, Islamic Republic of); Kalantari, F [UT Southwestern Medical Center, Dallas, TX (United States)

    2015-06-15

    Purpose: Patient-specific 3-dimensional (3D) internal dosimetry in targeted radionuclide therapy is essential for efficient treatment. Two major steps to achieve reliable results are: 1) generating quantitative 3D images of the radionuclide distribution and attenuation coefficients and 2) using a reliable method for dose calculation based on the activity and attenuation maps. In this research, internal dosimetry for 153-samarium (153-Sm) was performed from SPECT/CT images coupled with the GATE Monte Carlo package. Methods: A 50-year-old woman with bone metastases from breast cancer was prescribed 153-Sm treatment (gamma: 103 keV; beta: 0.81 MeV). A SPECT/CT scan was performed with a Siemens Symbia T scanner. SPECT and CT images were registered using the default registration software. SPECT quantification was achieved by compensating for all image-degrading factors, including body attenuation, Compton scattering and the collimator-detector response (CDR). The triple energy window method was used to estimate and eliminate the scattered photons. Iterative ordered-subsets expectation maximization (OSEM) with correction for attenuation and distance-dependent CDR was used for image reconstruction. Bilinear energy mapping was used to convert Hounsfield units in the CT image to an attenuation map. Organ borders were defined with the itk-SNAP toolkit by segmentation on the CT image. GATE was then used for the internal dose calculation. The specific absorbed fractions (SAFs) and S-values were reported according to the MIRD schema. Results: The results showed that the largest SAFs and S-values are in osseous organs, as expected. The S-value for the lung is the highest after the spine, which can be important in 153-Sm therapy. Conclusion: We present the utility of SPECT/CT images and Monte Carlo for patient-specific dosimetry as a reliable and accurate method. It has several advantages over template-based methods or simplified dose estimation methods. With the advent of high-speed computers, Monte Carlo can be used for treatment planning
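
    Once the S-value matrix is in hand, the MIRD-schema dose accumulation is a small sum: D(target) = sum over sources of A_tilde(source) * S(target <- source). The sketch below uses placeholder organ names, S values and cumulated activities, not numbers from the study.

```python
s_values = {("liver", "liver"): 3.2e-5,      # mGy per MBq*s, placeholders
            ("liver", "spine"): 1.1e-6,
            ("spine", "spine"): 9.8e-5,
            ("spine", "liver"): 1.4e-6}
a_tilde = {"liver": 5.0e6, "spine": 1.2e6}   # cumulated activities, MBq*s (placeholders)

def organ_dose(target):
    """MIRD schema: sum source contributions A_tilde(src) * S(target <- src)."""
    return sum(a_tilde[src] * s_values[(target, src)] for src in a_tilde)

for organ in ("liver", "spine"):
    print(f"{organ}: {organ_dose(organ):.1f} mGy")
```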

  16. Monte Carlo investigations of the effect of beam divergence on thick, segmented crystalline scintillators for radiotherapy imaging

    Science.gov (United States)

    Wang, Yi; El-Mohri, Youcef; Antonuk, Larry E.; Zhao, Qihua

    2010-07-01

    The use of thick, segmented scintillators in electronic portal imagers offers the potential for significant improvement in x-ray detection efficiency compared to conventional phosphor screens. Such improvement substantially increases the detective quantum efficiency (DQE), leading to the possibility of achieving soft-tissue visualization at clinically practical (i.e. low) doses using megavoltage (MV) cone-beam computed tomography. While these DQE increases are greatest at zero spatial frequency, they are diminished at higher frequencies as a result of degradation of spatial resolution due to lateral spreading of secondary radiation within the scintillator—an effect that is more pronounced for thicker scintillators. The extent of this spreading is even more accentuated for radiation impinging the scintillator at oblique angles of incidence due to beam divergence. In this paper, Monte Carlo simulations of radiation transport, performed to investigate and quantify the effects of beam divergence on the imaging performance of MV imagers based on two promising scintillators (BGO and CsI:Tl), are reported. In these studies, 10-40 mm thick scintillators, incorporating low-density polymer, or high-density tungsten septal walls, were examined for incident angles corresponding to that encountered at locations up to ~15 cm from the central beam axis (for an imager located 130 cm from a radiotherapy x-ray source). The simulations demonstrate progressively more severe spatial resolution degradation (quantified in terms of the effect on the modulation transfer function) as a function of increasing angle of incidence (as well as of the scintillator thickness). Since the noise power behavior was found to be largely independent of the incident angle, the dependence of the DQE on the incident angle is therefore primarily determined by the spatial resolution. The observed DQE degradation suggests that 10 mm thick scintillators are not strongly affected by beam divergence for

  17. Validation of the GATE Monte Carlo simulation platform for modelling a CsI(Tl) scintillation camera dedicated to small-animal imaging

    Science.gov (United States)

    Lazaro, D.; Buvat, I.; Loudos, G.; Strul, D.; Santin, G.; Giokaris, N.; Donnarieix, D.; Maigne, L.; Spanoudaki, V.; Styliaris, S.; Staelens, S.; Breton, V.

    2004-01-01

    Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT4 application for tomographic emission) is a new Monte Carlo simulation platform based on GEANT4 dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype of scintillation camera dedicated to small-animal imaging and consisting of a CsI(Tl) crystal array coupled to a position-sensitive photomultiplier tube. The relevance of GATE to model the camera prototype was assessed by comparing simulated 99mTc point spread functions, energy spectra, sensitivities, scatter fractions and image of a capillary phantom with the corresponding experimental measurements. Results showed an excellent agreement between simulated and experimental data: experimental spatial resolutions were predicted with an error less than 100 µm. The difference between experimental and simulated system sensitivities for different source-to-collimator distances was within 2%. Simulated and experimental scatter fractions in a [98-182 keV] energy window differed by less than 2% for sources located in water. Simulated and experimental energy spectra agreed very well between 40 and 180 keV. These results demonstrate the ability and flexibility of GATE for simulating original detector designs. The main weakness of GATE concerns the long computation time it requires: this issue is currently under investigation by the GEANT4 and the GATE collaborations.

  18. Monte Carlo-based compensation for patient scatter, detector scatter, and crosstalk contamination in In-111 SPECT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Stephen C. [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States)]. E-mail: scmoore@bwh.harvard.edu; Ouyang, Jinsong [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States); Park, Mi-Ae [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States); El Fakhri, Georges [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States)

    2006-12-20

    We have incorporated Monte Carlo (MC)-based estimates of patient scatter, detector scatter, and crosstalk into an iterative reconstruction algorithm, and compared its performance to that of a general spectral (GS) approach. We extended the MC-based reconstruction algorithm of de Jong et al. by (1) using the 'Delta scattering' method to determine photon interaction points, (2) simulating scatter maps for many energy bins simultaneously, and (3) decoupling the simulation of the object and detector by using pre-stored point spread functions (PSF) that included all collimator and detector effects. A numerical phantom was derived from a segmented CT scan of a torso phantom. The relative values of In-111 activity concentration simulated in soft tissue, liver, spine, left lung, right lung, and five spherical tumors (1.3-2.0 cm diam.) were 1.0, 1.5, 1.5, 0.3, 0.5, and 10.0, respectively. GS scatter projections were incorporated additively in an OSEM reconstruction (6 subsets × 10 projections × 2 photopeak windows). After three iterations, GS scatter projections were replaced by MC-estimated scatter projections for two additional iterations. MC-based compensation was quantitatively compared to GS-based compensation after five iterations. The bias of organ activity estimates ranged from -13% to -6.5% (GS), and from -1.4% to +5.0% (MC); tumor bias ranged from -20.0% to +10.0% for GS (mean ± std. dev. = -4.3 ± 11.9%), and from -2.2% to +18.8% for MC (+4.1 ± 8.6%). Image noise in all organs was less with MC than with GS.
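
    Incorporating a scatter estimate additively in OSEM means the forward model becomes y ~ Poisson(Ax + s), and the multiplicative update divides by Ax + s rather than Ax. A dense-matrix sketch of that update follows (clinical implementations use matched projector/backprojector operators instead of an explicit A):

```python
import numpy as np

def osem_with_scatter(y, A, s, n_subsets=6, n_iter=5):
    """OSEM with an additive scatter term: y ~ Poisson(A @ x + s).
    y: measured counts (n_bins,), A: system matrix (n_bins, n_voxels),
    s: estimated scatter counts per bin (n_bins,)."""
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for sub in subsets:
            A_s = A[sub]
            fp = A_s @ x + s[sub]                         # forward project + scatter
            ratio = y[sub] / np.maximum(fp, 1e-12)
            sens = np.maximum(A_s.T @ np.ones(len(sub)), 1e-12)
            x *= (A_s.T @ ratio) / sens                   # multiplicative update
    return x
```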

  19. Comparison Between Linear and Nonlinear Models of Mixed Pixels in Remote Sensing Satellite Images Based on Cierniewski Surface BRDF Model by Means of Monte Carlo Ray Tracing Simulation

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-04-01

    A comparative study of linear and nonlinear mixed-pixel models, in which a pixel of a remote sensing satellite image is composed of several ground-cover materials mixed together, is conducted for remote sensing satellite image analysis. The mixed-pixel models are based on the Cierniewski ground-surface reflectance model. The comparison is conducted by means of Monte Carlo ray tracing (MCRT) simulations. Through the simulation study, the difference between the linear and nonlinear mixed-pixel models is clarified, and the simulation model is validated.
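
    The linear half of the comparison is the standard mixing model r = M f, with endmember reflectances in the columns of M and nonnegative abundances f summing to one; the nonlinear model adds inter-material scattering terms that a linear inversion ignores. A sketch with made-up two-endmember reflectances:

```python
import numpy as np
from scipy.optimize import nnls

def unmix(r, M):
    """Nonnegative least-squares abundance estimate, renormalised to sum to one."""
    f, _ = nnls(M, r)
    return f / f.sum()

M = np.array([[0.12, 0.45],   # three bands x two endmembers (soil, vegetation);
              [0.18, 0.30],   # reflectances made up for illustration
              [0.25, 0.55]])
print(unmix(M @ np.array([0.7, 0.3]), M))  # recovers ~[0.7, 0.3]
```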

  20. Fast Monte Carlo based joint iterative reconstruction for simultaneous 99mTc/ 123I SPECT imaging.

    Science.gov (United States)

    Ouyang, Jinsong; El Fakhri, Georges; Moore, Stephen C

    2007-08-01

    Simultaneous 99mTc/123I SPECT allows the assessment of two physiological functions under identical conditions. The separation of these radionuclides is difficult, however, because their energies are close. Most energy-window-based scatter correction methods do not fully model either physical factors or patient-specific activity and attenuation distributions. We have developed a fast Monte Carlo (MC) simulation-based multiple-radionuclide and multiple-energy joint ordered-subset expectation-maximization (JOSEM) iterative reconstruction algorithm, MC-JOSEM. MC-JOSEM simultaneously corrects for scatter and cross talk as well as detector response within the reconstruction algorithm. We evaluated MC-JOSEM for simultaneous brain perfusion (99mTc-HMPAO) and neurotransmission (123I-altropane) SPECT. MC simulations of 99mTc and 123I studies were generated separately and then combined to mimic simultaneous 99mTc/123I SPECT. All the details of photon transport through the brain, the collimator, and detector, including Compton and coherent scatter, septal penetration, and backscatter from components behind the crystal, were modeled. We reconstructed images from simultaneous dual-radionuclide projections in three ways. First, we reconstructed the photopeak-energy-window projections (with an asymmetric energy window for 123I) using the standard ordered-subsets expectation-maximization algorithm (NSC-OSEM). Second, we used standard OSEM to reconstruct 99mTc photopeak-energy-window projections, while including an estimate of scatter from a Compton-scatter energy window (SC-OSEM). Third, we jointly reconstructed both 99mTc and 123I images using projection data associated with two photopeak energy windows and an intermediate-energy window using MC-JOSEM. For 15 iterations of reconstruction, the bias and standard deviation of 99mTc activity estimates in several brain structures were calculated for NSC-OSEM, SC-OSEM, and MC-JOSEM, using images reconstructed from primary

  1. A 3D Monte Carlo Method for Estimation of Patient-specific Internal Organs Absorbed Dose for (99m)Tc-hynic-Tyr(3)-octreotide Imaging.

    Science.gov (United States)

    Momennezhad, Mehdi; Nasseri, Shahrokh; Zakavi, Seyed Rasoul; Parach, Ali Asghar; Ghorbani, Mahdi; Asl, Ruhollah Ghahraman

    2016-01-01

    Single-photon emission computed tomography (SPECT)-based tracers are easily available and more widely used than positron emission tomography (PET)-based tracers, and SPECT imaging still remains the most prevalent nuclear medicine imaging modality worldwide. The aim of this study is to implement an image-based Monte Carlo method for patient-specific three-dimensional (3D) absorbed dose calculation in patients after injection of (99m)Tc-hydrazinonicotinamide (hynic)-Tyr(3)-octreotide as a SPECT radiotracer. (99m)Tc patient-specific S values and the absorbed doses were calculated with the GATE code for each source-target organ pair in four patients who were imaged for suspected neuroendocrine tumors. Each patient underwent multiple whole-body planar scans as well as SPECT imaging over a period of 1-24 h after intravenous injection of (99m)Tc-hynic-Tyr(3)-octreotide. The patient-specific S values calculated by the GATE Monte Carlo code and the corresponding S values obtained by the MIRDOSE program differed on average within 4.3% for self-irradiation, and within 69.6% on average for cross-irradiation. However, the agreement between the total organ doses calculated by the GATE code and the MIRDOSE program was reasonably good for all patients (the percentage difference was about 4.6% on average). Normal and tumor absorbed doses calculated with GATE were slightly higher than those calculated with the MIRDOSE program. The average ratio of GATE absorbed doses to MIRDOSE was 1.07 ± 0.11 (ranging from 0.94 to 1.36). According to the results, it is proposed that when cross-organ irradiation is dominant, a comprehensive approach such as GATE Monte Carlo dosimetry be used, since it provides more reliable dosimetric results. PMID:27134562

  2. A Monte Carlo simulation study of the effect of energy windows in computed tomography images based on an energy-resolved photon counting detector

    Science.gov (United States)

    Lee, Seung-Wan; Choi, Yu-Na; Cho, Hyo-Min; Lee, Young-Jin; Ryu, Hyun-Ju; Kim, Hee-Joung

    2012-08-01

    The energy-resolved photon counting detector provides spectral information that can be used to generate images. Novel imaging methods, including K-edge imaging, projection-based energy weighting imaging and image-based energy weighting imaging, are based on the energy-resolved photon counting detector and can be realized by using various energy windows or energy bins. The location and width of the energy windows or energy bins are important because these techniques generate an image using the spectral information they define. In this study, the reconstructed images acquired with K-edge imaging, projection-based energy weighting imaging and image-based energy weighting imaging were simulated using Monte Carlo simulation. The effect of the energy windows or energy bins was investigated with respect to the contrast, coefficient-of-variation (COV) and contrast-to-noise ratio (CNR), and the three images were compared with respect to the CNR. We modeled an x-ray computed tomography system based on a CdTe energy-resolved photon counting detector and a polymethylmethacrylate phantom containing iodine, gadolinium and blood. To acquire K-edge images, the lower energy thresholds were fixed at the K-edge absorption energies of iodine and gadolinium and the energy window widths were increased from 1 to 25 bins. The energy weighting factors optimized for iodine, gadolinium and blood were calculated from 5, 10, 15, 19 and 33 energy bins. We assigned the calculated energy weighting factors to the images acquired at each energy bin. In the K-edge images, the contrast and COV decreased when the energy window width was increased. The CNR increased as a function of the energy window width and decreased above a specific energy window width. When the number of energy bins was increased from 5 to 15, the contrast increased in the projection-based energy weighting images. There is a little difference in the contrast, when the number of energy bin is
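
    The figures of merit used throughout the comparison, and the image-based weighting step itself, are compact enough to sketch directly (the ROI masks and weights are whatever the study design supplies; nothing here reproduces the paper's optimized weights):

```python
import numpy as np

def cnr(img, roi_mask, bg_mask):
    """Contrast-to-noise ratio between a target ROI and a background ROI."""
    return abs(img[roi_mask].mean() - img[bg_mask].mean()) / img[bg_mask].std()

def energy_weighted_image(bin_images, weights):
    """Image-based energy weighting: normalised weighted sum of per-bin images.
    bin_images: (n_bins, ny, nx); weights: (n_bins,)."""
    w = np.asarray(weights, dtype=float)
    return np.tensordot(w / w.sum(), np.asarray(bin_images), axes=1)
```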

  3. Dose profile measurement using an imaging plate: Evaluation of filters using Monte Carlo simulation of 4 MV x-rays

    International Nuclear Information System (INIS)

    Computed radiography (CR) is gradually replacing film. The application of CR to two-dimensional profile and off-axis ratio (OAR) measurement using an imaging plate (IP) in a CR system is currently under discussion. However, a well-known problem for IPs in dosimetry is that they use high atomic number (Z) materials, such as Ba, which have an energy dependency in photon interactions. Although there are some reports that it is possible to compensate for the energy dependency with metal filters, the appropriate thicknesses of these filters and where they should be located have not been investigated. The purpose of this study is to find the most suitable filter for use with an IP as a dosimetric tool. Monte Carlo simulation (Geant4 8.1) was used to determine the filter that minimizes the measurement error in OAR measurements of 4 MV x-rays. In this simulation, the material and thickness of the filter and the distance between the IP and the filter were varied to determine the filter conditions that gave the best fit to the MC-calculated OAR in water. With regard to the filter material, we found that using a higher-Z, higher-density material increased the effectiveness of the filter. Also, increasing the distance between the filter and the IP reduced the effectiveness, whereas increasing the thickness of the filter increased it. The results showed that the filter conditions most consistent with the calculated OAR in water were the IP sandwiched between two 2 mm thick lead filters at a distance of 5 mm from the IP, or the IP sandwiched directly between two 1 mm lead filters. Using these filters, we measured the OAR at 10 cm depth with a 100 cm source-to-surface distance and a 10×10 cm² field size at the surface. These measurements showed agreement within 2.0% and 2.0% in the field, and within 1.1% and 0.6% out of the field, by using 2 and 1 mm

  4. A Monte Carlo template-based analysis for very high definition imaging atmospheric Cherenkov telescopes as applied to the VERITAS telescope array

    CERN Document Server


    2015-01-01

    We present a sophisticated likelihood reconstruction algorithm for shower-image analysis of imaging Cherenkov telescopes. The reconstruction algorithm is based on the comparison of the camera pixel amplitudes with the predictions from a Monte Carlo based model. Shower parameters are determined by maximising a likelihood function over the shower fit parameters, using a numerical non-linear optimisation technique. A related reconstruction technique has already been developed by the CAT and the H.E.S.S. experiments, and provides a more precise direction and energy reconstruction of the photon-induced shower than the second-moment analysis of the camera image. Examples are shown of the performance of the analysis on simulated gamma-ray data from the VERITAS array.

  5. Monte Carlo Simulation of Scattered Light with Shear Waves Generated by Acoustic Radiation Force for Acousto-Optic Imaging

    International Nuclear Information System (INIS)

    A Monte Carlo method for simulating multiply scattered coherent light carrying information on shear-wave propagation in scattering media is presented. The established Monte Carlo algorithm mainly concerns the optical phase variations caused by the displacement of light scatterers under acoustic-radiation-force-induced shear waves. Both the distributions and the temporal behavior of the optical phase increments at the probe locations are obtained, and from these the shear-wave speed is evaluated quantitatively. It is noted that the phase increments exactly track the propagation of shear waves induced by focused-ultrasound radiation force. In addition, the attenuation of the shear waves is demonstrated in the simulation results. Using linear regression, the shear wave speed, which is set to 2.1 m/s in the simulation, is estimated to be 2.18 m/s and 2.35 m/s at time sampling intervals of 0.2 ms and 0.5 ms, respectively.
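
    The regression step at the end is simple: with the arrival times of the phase-increment peaks recorded at several lateral positions, the slope of position against time is the shear-wave speed. A synthetic illustration (data made up around the paper's 2.1 m/s setting):

```python
import numpy as np

rng = np.random.default_rng(0)
positions_mm = np.array([2.0, 4.0, 6.0, 8.0, 10.0])         # lateral probe offsets
arrival_ms = positions_mm / 2.1 + rng.normal(0.0, 0.05, 5)  # true speed 2.1 m/s

slope, _ = np.polyfit(arrival_ms, positions_mm, 1)          # mm/ms equals m/s
print(f"estimated shear-wave speed: {slope:.2f} m/s")
```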

  6. Using adaptive neuro-fuzzy inference system technique for crosstalk correction in simultaneous {sup 99m}Tc/{sup 201}Tl SPECT imaging: A Monte Carlo simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Heidary, Saeed, E-mail: saeedheidary@aut.ac.ir; Setayeshi, Saeed, E-mail: setayesh@aut.ac.ir

    2015-01-11

    This work presents a Monte Carlo simulation-based study that uses two adaptive neuro-fuzzy inference systems (ANFIS) for crosstalk compensation of simultaneous 99mTc/201Tl dual-radioisotope SPECT imaging. We have compared two neuro-fuzzy systems, based on fuzzy c-means (FCM) and subtractive (SUB) clustering. Our approach incorporates image acquisition in eight energy windows from 28 keV to 156 keV, including the two main photopeaks of 201Tl (77 keV ± 10%) and 99mTc (140 keV ± 10%). The Geant4 application in emission tomography (GATE) is used as the Monte Carlo simulator for three cylindrical phantoms and a NURBS-based cardiac torso (NCAT) phantom study. Three separate acquisitions, two single-isotope and one dual-isotope, were performed in this study. Crosstalk- and scatter-corrected projections are reconstructed by an iterative ordered-subsets expectation maximization (OSEM) algorithm, which models the non-uniform attenuation in the projection/back-projection. The ANFIS-FCM/SUB structures are tuned to create three to sixteen fuzzy rules for modeling the photon crosstalk of the two radioisotopes. Applying seven to nine fuzzy rules leads to the greatest overall improvement in contrast and bias. ANFIS-FCM is found to outperform ANFIS-SUB owing to its speed and accuracy.

  7. Using adaptive neuro-fuzzy inference system technique for crosstalk correction in simultaneous 99mTc/201Tl SPECT imaging: A Monte Carlo simulation study

    Science.gov (United States)

    Heidary, Saeed; Setayeshi, Saeed

    2015-01-01

    This work presents a Monte Carlo simulation study that uses two adaptive neuro-fuzzy inference systems (ANFIS) for crosstalk compensation in simultaneous 99mTc/201Tl dual-radioisotope SPECT imaging. Two neuro-fuzzy systems, based on fuzzy c-means (FCM) and subtractive (SUB) clustering, are compared. The approach acquires images in eight energy windows from 28 keV to 156 keV, covering the two main photopeaks of 201Tl (77 keV ± 10%) and 99mTc (140 keV ± 10%). The Geant4 application for emission tomography (GATE) is used as the Monte Carlo simulator for three cylindrical phantoms and a NURBS-based cardiac torso (NCAT) phantom. Three separate acquisitions, two single-isotope and one dual-isotope, were performed. Crosstalk- and scatter-corrected projections are reconstructed with an iterative ordered-subsets expectation maximization (OSEM) algorithm that models the non-uniform attenuation in the projection/back-projection. The ANFIS-FCM/SUB structures are tuned to create three to sixteen fuzzy rules for modeling the photon crosstalk between the two radioisotopes. Applying seven to nine fuzzy rules yields the largest overall improvement in contrast and bias, and ANFIS-FCM outperforms ANFIS-SUB owing to its faster convergence and more accurate results.

  8. A Monte Carlo simulation study of an improved K-edge log-subtraction X-ray imaging using a photon counting CdTe detector

    Science.gov (United States)

    Lee, Youngjin; Lee, Amy Candy; Kim, Hee-Joung

    2016-09-01

    Recently, significant effort has been devoted to the development of photon counting detectors (PCDs) based on CdTe for applications in X-ray imaging systems, motivated by the higher image quality they offer. In particular, the K-edge subtraction (KES) imaging technique with a PCD can improve image quality and increase the contrast resolution of a target material by exploiting a contrast agent. Building on this technique, we present an improved K-edge log-subtraction (KELS) imaging technique. KELS imaging with PCDs can be realized by varying the subtraction energy width of the energy windows. In this study, the effects of the KELS technique and of the subtraction energy width were investigated with respect to contrast, standard deviation, and CNR using a Monte Carlo simulation. We simulated a CdTe-based PCD X-ray imaging system and a polymethylmethacrylate (PMMA) phantom containing various iodine contrast agents. To acquire KELS images, phantom images were acquired in energy windows above and below the K-edge absorption energy of the iodine contrast agent (33.2 keV). According to the results, both the contrast and the standard deviation decreased as the subtraction energy width of the energy window increased. Also, the CNR of the KELS images is higher than that of images acquired over the whole energy range. In particular, the maximum CNR gains of the KELS images over the whole-energy-range images were factors of 11.33, 8.73, and 8.29 for iodine contrast agents of 1, 2, and 3 mm diameter, respectively. The optimum subtraction energy widths were 5, 4, and 3 keV for the 1, 2, and 3 mm diameter iodine contrast agents, respectively. In conclusion, we successfully established an improved KELS imaging technique and optimized the subtraction energy width of the energy window, and based on
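
    The log-subtraction at the heart of KELS is compact enough to sketch. The toy example below forms a K-edge subtraction image from synthetic below-edge and above-edge intensities and computes a CNR; the attenuation values and geometry are illustrative stand-ins, not the simulated CdTe system.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)

# Iodine attenuates much more strongly just above its K-edge (33.2 keV),
# which is what the subtraction exploits. Toy attenuation values (a.u.):
mu_low, mu_high = 0.2, 1.0
iodine = np.zeros(shape)
iodine[24:40, 24:40] = 1.0                   # iodine insert

# Synthetic counts in the below-edge and above-edge energy windows.
I_low = rng.poisson(1e4 * np.exp(-mu_low * iodine))
I_high = rng.poisson(1e4 * np.exp(-mu_high * iodine))

# Log-subtraction image: near zero for iodine-free pixels, positive
# where the contrast agent is present.
kels = np.log(I_low + 1.0) - np.log(I_high + 1.0)

signal = kels[24:40, 24:40].mean()
background = kels[:16, :16]
cnr = (signal - background.mean()) / background.std()
print("CNR of the K-edge subtraction image: %.1f" % cnr)
```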

  9. The effect of magnification on the image quality and the radiation dose in X-ray digital mammography: a Monte Carlo simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yu-Na; Kim, Hee-Joung; Park, Hye-Suk; Lee, Chang-Lae; Cho, Hyo-Min; Lee, Seung-Wan; Ryu, Hyun-Ju [Yonsei University, Wonju (Korea, Republic of)

    2010-09-15

    There have been many efforts to advance X-ray digital mammography technology in order to enhance the early detection of breast pathology. The purpose of this study was to evaluate the image quality and the radiation dose of magnification X-ray digital mammography using the Geant4 Application for Tomographic Emission (GATE). We developed a Monte Carlo model of an X-ray digital mammographic system, present a technique for magnification, and discuss how magnification affects image quality. The simulated system consists of an X-ray source, a compression paddle, a supporting plate, and an imaging plate (IP) of computed radiography (CR). The degree of magnification ranged from 1.0 to 2.0. We designed a semi-cylindrical phantom with a thickness of 45 mm and a radius of 50 mm to evaluate the image quality after magnification. The phantom was made of polymethyl methacrylate (PMMA) and contained four spherical specks with diameters of 750, 500, 250, and 100 {mu}m to simulate microcalcifications. The simulations were performed with an X-ray energy spectrum calculated using the spectrum processor SRS-78. A combination of a molybdenum anode and a molybdenum filter (Mo/Mo) was used for the mammographic X-ray tube. The effects of the degree of magnification were investigated in terms of both the contrast-to-noise ratio (CNR) and the average glandular dose (AGD). The results show that the CNR increased with the degree of magnification and decreased with increasing breast glandularity, while the AGD showed only a minor increase with magnification. Based on these results, magnification of mammographic images can be used to obtain high image quality with an increased CNR. Our X-ray digital mammographic system model with GATE may serve as a basis for future studies of X-ray imaging characteristics.

  10. Study on efficiency of time computation in x-ray imaging simulation based on Monte Carlo algorithm using graphics processing unit

    Science.gov (United States)

    Setiani, Tia Dwi; Suprijadi; Haryanto, Freddy

    2016-03-01

    Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study investigated the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of the radiographic images and a comparison of the image quality obtained from the GPU and CPU simulations are evaluated in this paper. The simulations were run serially on a CPU and in parallel on two GPUs with 384 and 2304 cores. In the GPU simulations, each core computes one photon, so a large number of photons are calculated simultaneously. Results show that the simulations on the GPU were significantly accelerated compared to the CPU: the simulations on the 2304-core GPU ran about 64-114 times faster than on the CPU, while those on the 384-core GPU ran about 20-31 times faster than on a single CPU core. Another result shows that the optimum image quality was obtained from 10^8 histories upwards and for energies from 60 keV to 90 keV. Statistical analysis shows that the quality of the GPU and CPU images is essentially the same.
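
    The one-photon-per-core mapping can be illustrated with a vectorised toy attenuation problem, where each array element is one photon history advanced simultaneously (numpy standing in for the CUDA threads of MC-GPU); the slab thickness and attenuation coefficient are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_photons = 10**6          # cf. the ~1e8 histories quoted above
mu = 0.2                   # toy linear attenuation coefficient (1/cm)
thickness = 5.0            # slab thickness (cm)

# Sample free path lengths for all photons at once; photons whose first
# interaction lies beyond the slab are transmitted without interacting.
path = rng.exponential(1.0 / mu, n_photons)
transmitted = np.count_nonzero(path > thickness)

print("MC transmission:     %.4f" % (transmitted / n_photons))
print("analytic exp(-mu*t): %.4f" % np.exp(-mu * thickness))
```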

  11. Volumetric x-ray coherent scatter imaging of cancer in resected breast tissue: a Monte Carlo study using virtual anthropomorphic phantoms

    International Nuclear Information System (INIS)

    Breast cancer patients undergoing surgery often choose breast conserving surgery (BCS), in which only the breast tumor is removed, instead of mastectomy. If post-surgical analysis, such as histological assessment of the resected tumor, reveals insufficient healthy tissue margins around the cancerous tumor, the patient must undergo another surgery to remove the missed tumor tissue. Such re-excisions are reported to occur in 20%–70% of BCS patients. A real-time surgical margin assessment technique that is fast and consistently accurate could greatly reduce the number of re-excisions performed in BCS. We describe here a tumor margin assessment method based on x-ray coherent scatter computed tomography (CSCT) imaging and demonstrate its utility for surgical margin assessment using Monte Carlo simulations. A CSCT system modeled in Geant4 was used to simulate two virtual CSCT scans of anthropomorphic phantoms resembling surgically resected tissue. The resulting images were volume-rendered and found to distinguish cancerous tumors embedded in complex distributions of adipose and fibroglandular breast tissue (as expected in the breast). The images exhibited sufficient spatial and spectral (i.e. momentum transfer) resolution to classify the tissue in any given voxel as healthy or cancerous. ROC analysis of the classification accuracy revealed an area under the curve of up to 0.97. These results indicate that coherent scatter imaging is promising as a fast and accurate surgical margin assessment technique. (paper)
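
    The ROC figure quoted above reduces to a rank statistic that is easy to compute. A minimal AUC computation on synthetic healthy/cancerous voxel scores, using the Mann-Whitney rank formula, is sketched below.

```python
import numpy as np

rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.0, 1.0, 500),    # healthy voxels
                         rng.normal(1.8, 1.0, 200)])   # cancerous voxels
labels = np.concatenate([np.zeros(500), np.ones(200)])

# Rank-based AUC (Mann-Whitney U): assign ranks to all scores, then compare
# the rank sum of the positive class against its minimum possible value.
order = np.argsort(scores)
ranks = np.empty(len(scores))
ranks[order] = np.arange(1, len(scores) + 1)
n_pos, n_neg = labels.sum(), (1 - labels).sum()
auc = (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
print("AUC = %.3f" % auc)
```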

  12. Assessment of myocardial metabolic rate of glucose by means of Bayesian ICA and Markov Chain Monte Carlo methods in small animal PET imaging

    Science.gov (United States)

    Berradja, Khadidja; Boughanmi, Nabil

    2016-09-01

    In dynamic cardiac PET FDG studies, assessment of the myocardial metabolic rate of glucose (MMRG) requires knowledge of the blood input function (IF). The IF can be obtained by manual or automatic blood sampling cross-calibrated with PET, but these procedures are cumbersome, invasive and generate uncertainties. The IF is also contaminated by spillover of radioactivity from the adjacent myocardium, which can cause substantial error in the estimated MMRG. In this study, we show that the IF can be extracted from the images in a rat heart study with 18F-fluorodeoxyglucose (18F-FDG) by means of Independent Component Analysis (ICA) based on Bayesian theory and a Markov chain Monte Carlo (MCMC) sampling method (BICA). Images of rat hearts were acquired with the Sherbrooke small-animal PET scanner. A region of interest (ROI) was drawn around the rat image and decomposed into blood and tissue components using BICA. The statistical analysis showed a significant difference (p < 0.05) between the MMRG obtained with the IF extracted by BICA and that obtained with the IF extracted from measured images corrupted with spillover.
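
    As a simplified stand-in for the BICA decomposition, the sketch below separates a synthetic blood input function and tissue curve from mixed ROI time-activity curves using plain FastICA from scikit-learn; the Bayesian prior and MCMC sampling of the actual method are not reproduced.

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 60, 120)                  # minutes
blood = np.exp(-t / 5.0)                     # toy input function
tissue = 1.0 - np.exp(-t / 20.0)             # toy myocardial uptake

# Each ROI pixel is a mixture of the two kinetics plus noise.
rng = np.random.default_rng(0)
weights = rng.random((200, 2))
pixels = weights @ np.vstack([blood, tissue])
pixels += rng.normal(0.0, 0.01, pixels.shape)

# Treat time points as samples and pixels as features to recover the
# two underlying temporal sources (blood and tissue curves).
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(pixels.T).T      # shape (2, n_time_points)
print(sources.shape)
```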

  13. A Monte Carlo investigation of low-Z target image quality generated in a linear accelerator using Varian's VirtuaLinac

    International Nuclear Information System (INIS)

    Purpose: The focus of this work was to demonstrate and validate VirtuaLinac with clinical photon beams and to investigate the implementation of low-Z targets in a TrueBeam linear accelerator (Linac) using Monte Carlo modeling. Methods: VirtuaLinac, a cloud-based web application utilizing the Geant4 Monte Carlo code, was used to model the Linac treatment head components. Particles were propagated through the lower portion of the treatment head using BEAMnrc. Dose distributions and spectral distributions were calculated using DOSXYZnrc and BEAMdp, respectively. For validation, 6 MV flattened and flattening filter free (FFF) photon beams were generated and compared to measurements for square fields 10 and 40 cm wide and for diagonal profiles at dmax. Two low-Z targets were investigated: a 2.35 MeV carbon target and the proposed 2.50 MeV commercial imaging target for the TrueBeam platform. A 2.35 MeV carbon target was also simulated in a 2100EX Clinac using BEAMnrc. Contrast simulations were made by scoring the dose in the phosphor layer of an IDU20 aSi detector after propagation through a 4 or 20 cm thick phantom composed of water and ICRP bone. Results: Measured and modeled depth dose curves for the 6 MV flattened and FFF beams agree within 1% for 98.3% of points at depths greater than 0.85 cm. Ninety-three percent or more of the points analyzed for the diagonal profiles had a gamma value less than one for criteria of 1.5 mm and 1.5%. The two low-Z target photon spectra produced in TrueBeam are harder than that of the carbon target in the Clinac: the percent dose at 10 cm depth is greater by 3.6% and 8.9%; the fraction of photons in the diagnostic energy range (25–150 keV) is lower by 10% and 28%; and contrasts are lower by factors of 1.1 and 1.4 (4 cm thick phantom) and 1.03 and 1.4 (20 cm thick phantom) for the TrueBeam 2.35 MV/carbon and commercial imaging beams, respectively. Conclusions: VirtuaLinac is a promising new tool for Monte Carlo modeling of novel
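
    The gamma criterion quoted above (1.5 mm / 1.5%) can be evaluated with a few lines of code. Below is a minimal 1D gamma-index sketch for profile comparison; clinical tools use optimised multi-dimensional searches, and the profiles here are synthetic.

```python
import numpy as np

def gamma_1d(x, ref, meas, dta_mm=1.5, dd_frac=0.015):
    """Gamma values of `meas` against `ref`, both sampled at positions x (mm)."""
    norm = dd_frac * ref.max()               # global dose-difference criterion
    g = np.empty(len(x))
    for i, (xi, mi) in enumerate(zip(x, meas)):
        dist2 = ((x - xi) / dta_mm) ** 2     # distance-to-agreement term
        dose2 = ((ref - mi) / norm) ** 2     # dose-difference term
        g[i] = np.sqrt(np.min(dist2 + dose2))
    return g

x = np.linspace(0, 100, 201)                 # mm
ref = np.exp(-((x - 50) / 20) ** 2)          # toy reference profile
meas = ref * 1.01 + 0.002                    # slightly perturbed measurement
print("pass rate: %.1f%%" % (100 * np.mean(gamma_1d(x, ref, meas) <= 1.0)))
```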

  14. A Monte Carlo study of a high resolution $\gamma$-detector for small organ imaging in Nuclear Medicine

    CERN Document Server

    Ortigão, C

    2004-01-01

    A reliable Monte Carlo simulation study is of significant importance for evaluating the performance of a gamma-ray detector and for finding compromises between spatial resolution, sensitivity and energy resolution. The development of a GEANT3-based simulation package for a new compact gamma camera is described in this report. The simulation takes into account the interaction of gamma-rays in the crystal and the production and transport of scintillation photons, and allows an accurate radiation transport description of photon attenuation in high-Z collimators for SPECT applications. In order to find the best configuration, different detector arrangements were explored, namely different scintillation crystals, coatings, reflector properties and polishing types. The conventional detector system, based on PMT light readout, was compared with an HPD system. Different collimators were studied for high-resolution applications with compact gamma cameras.

  15. Octree indexing of DICOM images for voxel number reduction and improvement of Monte Carlo simulation computing efficiency

    International Nuclear Information System (INIS)

    The purpose of the present study is to introduce a compression algorithm for the CT (computed tomography) data used in Monte Carlo simulations. Performing simulations on CT data implies large computational costs and memory requirements, since the number of voxels in such data typically reaches hundreds of millions. CT data, however, contain homogeneous regions which can be regrouped into larger voxels without affecting the simulation's accuracy. Exploiting this property, we propose an octree-based compression algorithm: in homogeneous regions the algorithm replaces groups of voxels with a smaller number of larger voxels, reducing the voxel count while preserving the critical high-density-gradient areas. Results obtained with the algorithm on both phantom and clinical data show that compression rates of up to 75% are possible without losing the dosimetric accuracy of the simulation
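
    A minimal sketch of the octree idea, assuming a cubic power-of-two volume: blocks whose density spread falls below a tolerance are merged into single leaves, while high-gradient regions are split down to single voxels. The tolerance and toy volume are illustrative, not the paper's settings.

```python
import numpy as np

def octree_compress(vol, tol=20.0):
    """Return a list of (z, y, x, size, mean_value) leaf nodes."""
    leaves = []
    def recurse(z, y, x, size):
        block = vol[z:z+size, y:y+size, x:x+size]
        # Merge the block into one leaf if it is (nearly) homogeneous.
        if size == 1 or block.max() - block.min() <= tol:
            leaves.append((z, y, x, size, float(block.mean())))
            return
        h = size // 2                       # otherwise split into 8 octants
        for dz in (0, h):
            for dy in (0, h):
                for dx in (0, h):
                    recurse(z + dz, y + dy, x + dx, h)
    recurse(0, 0, 0, vol.shape[0])
    return leaves

# Toy 64^3 "CT": uniform background with one dense insert (high gradient).
vol = np.zeros((64, 64, 64))
vol[20:30, 20:30, 20:30] = 1000.0
leaves = octree_compress(vol)
print("voxels: %d -> leaves: %d (%.1f%% reduction)"
      % (vol.size, len(leaves), 100 * (1 - len(leaves) / vol.size)))
```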

  16. Development of virtual patient models for permanent implant brachytherapy Monte Carlo dose calculations: interdependence of CT image artifact mitigation and tissue assignment

    International Nuclear Information System (INIS)

    This work investigates and compares CT image metallic artifact reduction (MAR) methods and tissue assignment schemes (TAS) for the development of virtual patient models for permanent implant brachytherapy Monte Carlo (MC) dose calculations. Four MAR techniques are investigated to mitigate seed artifacts in post-implant CT images of a homogeneous phantom and eight prostate patients: a raw sinogram approach using the original CT scanner data and three methods (simple threshold replacement (STR), 3D median filter, and virtual sinogram) requiring only the reconstructed CT image. Virtual patient models are developed using six TAS, ranging from the AAPM-ESTRO-ABG TG-186 basic approach of assigning uniform-density tissues (resulting in a model not dependent on MAR) to more complex models assigning prostate, calcification, and mixtures of prostate and calcification using CT-derived densities. The EGSnrc user code BrachyDose is employed to calculate dose distributions. All four MAR methods eliminate bright seed-spot artifacts, and the image-based methods mitigate artifacts comparably to the raw sinogram approach. However, each MAR technique has limitations: STR is unable to mitigate low CT number artifacts, the median filter blurs the image, which challenges the preservation of tissue heterogeneities, and both sinogram approaches introduce new streaks. Large local dose differences are generally due to differences in voxel tissue type rather than mass density. The largest differences in target dose metrics (D90, V100, V150), over 50% lower than for the other models, occur when uncorrected CT images are used with TAS that consider calcifications. Metrics found using models which include calcifications are generally a few percent lower than for prostate-only models. Generally, metrics from any MAR method and any TAS which considers calcifications agree within 6%. Overall, the studied MAR methods and TAS show promise for further retrospective MC dose
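
    Of the image-based methods, simple threshold replacement is the easiest to sketch. The toy example below replaces CT numbers above a threshold with a soft-tissue value; the threshold and replacement value are illustrative assumptions, and, as noted above, STR leaves the low-CT-number streaks untouched.

```python
import numpy as np

def simple_threshold_replacement(ct, threshold=1500.0, replacement=40.0):
    """Replace CT numbers (HU) above `threshold` with a soft-tissue value.
    Mitigates bright seed artifacts only; dark streaks are unaffected."""
    out = ct.copy()
    out[out > threshold] = replacement
    return out

ct = np.random.default_rng(0).normal(40.0, 15.0, (128, 128))  # toy CT slice (HU)
ct[60:62, 60:62] = 3000.0                                     # bright seed artifact
corrected = simple_threshold_replacement(ct)
print("max HU before: %.0f, after: %.0f" % (3000.0, corrected.max()))
```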

  17. Development of virtual patient models for permanent implant brachytherapy Monte Carlo dose calculations: interdependence of CT image artifact mitigation and tissue assignment

    Science.gov (United States)

    Miksys, N.; Xu, C.; Beaulieu, L.; Thomson, R. M.

    2015-08-01

    This work investigates and compares CT image metallic artifact reduction (MAR) methods and tissue assignment schemes (TAS) for the development of virtual patient models for permanent implant brachytherapy Monte Carlo (MC) dose calculations. Four MAR techniques are investigated to mitigate seed artifacts in post-implant CT images of a homogeneous phantom and eight prostate patients: a raw sinogram approach using the original CT scanner data and three methods (simple threshold replacement (STR), 3D median filter, and virtual sinogram) requiring only the reconstructed CT image. Virtual patient models are developed using six TAS, ranging from the AAPM-ESTRO-ABG TG-186 basic approach of assigning uniform-density tissues (resulting in a model not dependent on MAR) to more complex models assigning prostate, calcification, and mixtures of prostate and calcification using CT-derived densities. The EGSnrc user code BrachyDose is employed to calculate dose distributions. All four MAR methods eliminate bright seed-spot artifacts, and the image-based methods mitigate artifacts comparably to the raw sinogram approach. However, each MAR technique has limitations: STR is unable to mitigate low CT number artifacts, the median filter blurs the image, which challenges the preservation of tissue heterogeneities, and both sinogram approaches introduce new streaks. Large local dose differences are generally due to differences in voxel tissue type rather than mass density. The largest differences in target dose metrics (D90, V100, V150), over 50% lower than for the other models, occur when uncorrected CT images are used with TAS that consider calcifications. Metrics found using models which include calcifications are generally a few percent lower than for prostate-only models. Generally, metrics from any MAR method and any TAS which considers calcifications agree within 6%. Overall, the studied MAR methods and TAS show promise for further retrospective MC dose

  18. Quantification of rat brain SPECT with 123I-ioflupane: evaluation of different reconstruction methods and image degradation compensations using Monte Carlo simulation

    Science.gov (United States)

    Roé-Vellvé, N.; Pino, F.; Falcon, C.; Cot, A.; Gispert, J. D.; Marin, C.; Pavía, J.; Ros, D.

    2014-08-01

    SPECT studies with 123I-ioflupane facilitate the diagnosis of Parkinson’s disease (PD). The effect of image degradation on quantification has been extensively evaluated in human studies, but its impact on studies of experimental PD models is still unclear. The aim of this work was to assess the effect of compensating for the degrading phenomena on the quantification of small-animal SPECT studies using 123I-ioflupane. This assessment enabled us to evaluate the feasibility of quantitatively detecting small pathological changes using different reconstruction methods and levels of compensation for the image-degrading phenomena. Monte Carlo simulated studies of a rat phantom were reconstructed and quantified. Compensations for the point spread function (PSF), scattering, attenuation and the partial volume effect were progressively included in the quantification protocol. A linear relationship was found between the calculated and the simulated specific uptake ratio (SUR) in all cases. Noise reduction during the reconstruction process was the most relevant factor for significantly distinguishing disease stages, followed by PSF compensation. The smallest detectable SUR interval was determined by biological variability rather than by image degradation or coregistration errors. The quantification methods that gave the best results allowed us to distinguish PD stages with SUR values as close as 0.5 using groups of six rats per stage.

  19. Assessment of a Monte-Carlo simulation of SPECT recordings from a new-generation heart-centric semiconductor camera: from point sources to human images

    International Nuclear Information System (INIS)

    Geant4 application for tomographic emission (GATE), a Monte-Carlo simulation platform, has previously been used for optimizing tomoscintigraphic images recorded with scintillation Anger cameras, but not with the new-generation heart-centric cadmium–zinc–telluride (CZT) cameras. Using the GATE platform, this study aimed at simulating the SPECT recordings of one of these new CZT cameras and at assessing the simulation by direct comparison between simulated and actually recorded data, ranging from point sources to human images. The geometry and movement of the detectors, as well as their respective energy responses, were modeled for the CZT ‘D.SPECT’ camera in the GATE platform. Both simulated and actual recorded data were obtained from: (1) point and linear sources of 99mTc for comparative assessment of detection sensitivity and spatial resolution, (2) a cardiac insert filled with a 99mTc solution for comparative assessment of contrast-to-noise ratio and sharpness of the myocardial borders and (3) a patient with myocardial infarction, using segmented cardiac magnetic resonance images. Most of the simulated images exhibited high concordance with the actual images, with relative differences of only: (1) 0.5% for detection sensitivity, (2) 6.7% for spatial resolution, and (3) 2.6% for contrast-to-noise ratio and 5.0% for sharpness index on the cardiac insert placed in a diffusing environment. There was also good concordance between actual and simulated gated-SPECT patient images for the delineation of the myocardial infarction area, although the quality of the simulated images was clearly superior, with increases of around 50% in both contrast-to-noise ratio and sharpness index. SPECT recordings from a new heart-centric CZT camera can be simulated with the GATE software with high concordance relative to the actual physical properties of this camera. These simulations may be conducted up to the stage of human SPECT images even if further refinement is

  20. Monte Carlo simulation of small OpenPET prototype with 11C beam irradiation: effects of secondary particles on in-beam imaging

    International Nuclear Information System (INIS)

    In-beam positron emission tomography (PET) can visualize the irradiated field through positron emitters (β+ decay). In particle therapy, many kinds of secondary particles are produced by nuclear interactions, and these affect PET imaging. Our purpose in this work was to evaluate the effects of secondary particles on in-beam PET imaging using the Monte Carlo simulation code Geant4, reproducing an experiment with a small OpenPET prototype in which a PMMA phantom was irradiated with a 11C beam. The number of particles incident on the detectors and their spectra, the background coincidences for the PET scan, and the reconstructed images were evaluated for three periods: spill-time (beam irradiation), pause-time (acceleration of the particles) and beam-off time (after the final spill). For spill-time, we tested a background-reduction technique proposed in the literature, in which coincidence events correlated with the accelerator radiofrequency are discarded (RF gating), and we identified the background generation processes. During spill-time, most background coincidences were caused by prompt gamma rays, and only 1.4% of the total coincidences were genuine β+ signals. In contrast, for pause-time and beam-off time, more than 75% of the total coincidence events were signals. Using these coincidence events, we failed to reconstruct images during spill-time but obtained successful reconstructions for pause-time and beam-off time, consistent with the experimental results. From the simulation, we found that removing materials from the beam line and using the RF-gating technique improved the signal-to-noise ratio for spill-time. From an additional simulation with range-shifter-less irradiation and RF gating, we showed the feasibility of image reconstruction during spill-time. (paper)
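
    The RF-gating idea reduces to a phase cut on coincidence timestamps. A minimal sketch with synthetic timestamps and a hypothetical RF period and rejection window follows.

```python
import numpy as np

rf_period_ns = 200.0           # hypothetical accelerator RF period
reject_window = (0.0, 40.0)    # phase window (ns) dominated by prompt gammas

rng = np.random.default_rng(0)
timestamps = rng.uniform(0.0, 1e6, 100000)   # coincidence times (ns)

# Fold each timestamp onto the RF phase and discard events inside the
# prompt-gamma-correlated window.
phase = np.mod(timestamps, rf_period_ns)
keep = (phase < reject_window[0]) | (phase >= reject_window[1])
print("kept %.0f%% of coincidences" % (100 * keep.mean()))
```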

  1. Feasibility testing of a pre-clinical coded aperture phase contrast imaging configuration using a simple fast Monte Carlo simulator

    OpenAIRE

    Kavanagh, A.; Olivo, A.; Speller, R.; Vojnovic, B.

    2013-01-01

    A simple method of simulating possible coded aperture phase contrast X-ray imaging apparatus is presented. The method is based on ray tracing, with the rays treated ballistically within a voxelized sample and with the phase-shift-induced angular deviations and absorptions applied at a plane in the middle of the sample. For the particular case of a coded aperture phase contrast configuration suitable for small animal pre-clinical imaging we present results obtained using a high resolution voxe...

  2. The feasibility of polychromatic cone-beam x-ray fluorescence computed tomography (XFCT) imaging of gold nanoparticle-loaded objects: a Monte Carlo study

    International Nuclear Information System (INIS)

    A recent study investigated the feasibility of developing a bench-top x-ray fluorescence computed tomography (XFCT) system capable of determining the spatial distribution and concentration of gold nanoparticles (GNPs) in vivo using a diagnostic-energy-range polychromatic (i.e. 110 kVp) pencil-beam source. In this follow-up study, we examined the feasibility of a polychromatic cone-beam implementation of XFCT by Monte Carlo (MC) simulations using the MCNP5 code. In the current MC model, cylindrical columns of various sizes (5-10 mm in diameter) containing water loaded with GNPs (0.1-2% gold by weight) were inserted into a 5 cm diameter cylindrical polymethyl methacrylate phantom. The phantom was then irradiated by a lead-filtered 110 kVp x-ray source, and the resulting gold fluorescence and Compton-scattered photons were collected by a series of energy-sensitive tallies after passing through lead parallel-hole collimators. A maximum-likelihood iterative reconstruction algorithm was implemented to reconstruct the image of the GNP-loaded objects within the phantom. The attenuation of both the primary beam through the phantom and the gold fluorescence photons en route to the detector was corrected during image reconstruction. Accurate images of the GNP-containing phantom were successfully reconstructed for three different phantom configurations, with both the spatial distribution and the relative concentration of GNPs well identified. The pixel intensity of regions containing GNPs was linearly proportional to the gold concentration. The current MC study strongly suggests the possibility of developing a bench-top, polychromatic, cone-beam XFCT system for in vivo imaging.
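
    The reconstruction step can be illustrated with a generic MLEM loop. In the sketch below a random toy system matrix stands in for the fluorescence detection geometry (with the attenuation corrections folded into it); it shows the multiplicative update, not the study's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_vox, n_det = 50, 80
A = rng.random((n_det, n_vox))             # toy system matrix
x_true = np.zeros(n_vox)
x_true[20:25] = 1.0                        # GNP-loaded voxels
y = rng.poisson(A @ x_true * 100) / 100.0  # noisy fluorescence measurements

x = np.ones(n_vox)                         # uniform initial estimate
sens = A.sum(axis=0)                       # sensitivity (back-projection of ones)
for _ in range(50):
    proj = A @ x + 1e-12
    x *= (A.T @ (y / proj)) / sens         # multiplicative MLEM update
print(np.round(x[18:27], 2))
```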

  3. Monte Carlo simulations of GeoPET experiments: 3D images of tracer distributions (18F, 124I and 58Co) in Opalinus clay, anhydrite and quartz

    Science.gov (United States)

    Zakhnini, Abdelhamid; Kulenkampff, Johannes; Sauerzapf, Sophie; Pietrzyk, Uwe; Lippmann-Pipke, Johanna

    2013-08-01

    Understanding conservative fluid flow and reactive tracer transport in soils and rock formations requires quantitative transport visualization methods in 3D+t. After a decade of research and development we established GeoPET as a non-destructive method with unrivalled sensitivity and selectivity and adequate spatial and temporal resolution, by applying Positron Emission Tomography (PET), a nuclear medicine imaging method, to dense rock material. Requirements for reaching the physical limit of image resolution of nearly 1 mm are (a) a high-resolution PET camera, like our ClearPET scanner (Raytest), and (b) appropriate correction methods for the scatter and attenuation of 511 keV photons in the dense geological material, which are far more significant there than in human and small-animal body tissue (water). Here we present data from Monte Carlo simulations (MCS) reproducing selected GeoPET experiments. The MCS consider all nuclear physical processes involved in the measurement with the ClearPET system and allow us to quantify the sensitivity of the method and the scatter fractions in geological media as functions of the material (quartz, Opalinus clay and anhydrite, compared to water), the PET isotope (18F, 58Co and 124I), and geometric system parameters. The synthetic data sets obtained by MCS are the basis for detailed performance assessment studies enabling image quality improvements. A scatter correction method is applied exemplarily by subtracting projections of simulated scattered coincidences from the experimental data sets prior to image reconstruction with an iterative reconstruction process.
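
    The exemplary scatter correction amounts to a projection-domain subtraction before reconstruction. A minimal sketch with synthetic sinograms follows; clipping at zero avoids negative counts.

```python
import numpy as np

rng = np.random.default_rng(0)
trues = rng.poisson(100.0, (180, 128)).astype(float)    # toy true sinogram
scatter = rng.poisson(30.0, (180, 128)).astype(float)   # MC-simulated scatter
measured = trues + scatter

# Subtract the simulated scatter projections before reconstruction,
# clipping so that no projection bin goes negative.
corrected = np.clip(measured - scatter, 0.0, None)
print("mean counts: measured %.1f -> corrected %.1f"
      % (measured.mean(), corrected.mean()))
```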

  4. GATE Monte Carlo simulations for variations of an integrated PET/MR hybrid imaging system based on the Biograph mMR model

    International Nuclear Information System (INIS)

    A simulation toolkit, GATE (Geant4 Application for Tomographic Emission), was used to develop an accurate Monte Carlo (MC) simulation of a fully integrated 3T PET/MR hybrid imaging system (Siemens Biograph mMR). The PET/MR components of the Biograph mMR were simulated to allow a detailed study of the effect of variations of the system design on the PET performance, which are not easy to access and measure on a real PET/MR system. The 3T static magnetic field of the MR system was taken into account in all Monte Carlo simulations. The MC model was validated against actual measurements performed on the PET/MR system by following the NEMA (National Electrical Manufacturers Association) NU 2-2007 standard. The comparison of simulated and experimental performance measurements included spatial resolution, sensitivity, scatter fraction, and count rate capability. The validated system model was then used for two different applications. The first application investigated the effect of an extension of the PET field-of-view on the PET performance of the PET/MR system. The second application simulated a modified system timing resolution and coincidence time window of the PET detector electronics in order to model time-of-flight (TOF) PET detection. A dedicated phantom was modeled to investigate the impact of TOF on overall PET image quality. Simulation results showed that the overall divergence between simulated and measured data was less than 10%. Varying the detector geometry showed that the system sensitivity and the noise equivalent count rate of the PET/MR system increased progressively with an increasing number of axial detector block rings, as expected. TOF-based PET reconstructions of the modeled phantom showed an improvement in signal-to-noise ratio and image contrast over the conventional non-TOF PET reconstructions. In conclusion, the validated MC simulation model of an integrated PET/MR system with an overall

  5. Comparative study using Monte Carlo methods of the radiation detection efficiency of LSO, LuAP, GSO and YAP scintillators for use in positron emission imaging (PET)

    International Nuclear Information System (INIS)

    The radiation detection efficiency of four scintillators employed, or designed to be employed, in positron emission tomography (PET) was evaluated as a function of crystal thickness by applying Monte Carlo methods. The scintillators studied were Lu2SiO5 (LSO), LuAlO3 (LuAP), Gd2SiO5 (GSO) and YAlO3 (YAP). Crystal thicknesses ranged from 0 to 50 mm. The study was performed with a previously generated photon-transport Monte Carlo code. All photon track and energy histories were recorded, and the energy transferred or absorbed in the scintillator medium was calculated together with the energy redistributed and retransported as secondary characteristic fluorescence radiation. Various parameters were calculated, e.g. the fraction of the incident photon energy absorbed, transmitted or redistributed as fluorescence radiation, the scatter-to-primary ratio, and the photon and energy distribution within each scintillator block. Most significantly, the fraction of the incident photon energy absorbed was found to increase with increasing crystal thickness, tending to form a plateau above 30 mm thickness. For the LSO, LuAP, GSO and YAP scintillators, respectively, this fraction had the value of 44.8, 36.9 and 45.7% at 10 mm thickness and 96.4, 93.2 and 96.9% at 50 mm thickness. Within the plateau area, approximately 57-59%, 59-63%, 52-63% and 58-61% of this fraction was due to scattered and reabsorbed radiation for the LSO, GSO, YAP and LuAP scintillators, respectively. In all cases, a negligible fraction (<0.1%) of the absorbed energy was found to escape the crystal as fluorescence radiation
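
    The thickness trend reported above can be approximated from primary interactions alone: the interaction probability grows as 1 - exp(-mu * t) and saturates with thickness. The sketch below uses rough literature values for the linear attenuation coefficients at 511 keV and ignores scatter re-absorption and fluorescence escape, so the numbers are only indicative.

```python
import numpy as np

# Rough linear attenuation coefficients at 511 keV (1/cm) - indicative only.
mu_511 = {"LSO": 0.87, "LuAP": 0.90, "GSO": 0.70, "YAP": 0.45}
for name, mu in mu_511.items():
    # Probability that an incident 511 keV photon interacts in thickness t.
    fractions = [1.0 - np.exp(-mu * t / 10.0) for t in (10, 30, 50)]  # t in mm
    print(name, " ".join("t=%dmm: %.1f%%" % (t, 100 * f)
                         for t, f in zip((10, 30, 50), fractions)))
```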

  6. Comprehensive Evaluations of Cone-beam CT dose in Image-guided Radiation Therapy via GPU-based Monte Carlo simulations

    CERN Document Server

    Montanari, Davide; Silvestri, Chiara; Graves, Yan J; Yan, Hao; Cervino, Laura; Rice, Roger; Jiang, Steve B; Jia, Xun

    2013-01-01

    Cone beam CT (CBCT) has been widely used for patient setup in image-guided radiation therapy (IGRT). Radiation dose from CBCT scans has become a clinical concern. The purposes of this study are (1) to commission a GPU-based Monte Carlo (MC) dose calculation package, gCTD, for the Varian On-Board Imaging (OBI) system and test its calculation accuracy, and (2) to quantitatively evaluate CBCT dose from the OBI system in typical IGRT scan protocols. We first conducted dose measurements in a water phantom. The x-ray source model parameters used in gCTD were obtained through a commissioning process. gCTD accuracy is demonstrated by comparing calculations with measurements in water and in CTDI phantoms. Twenty-five brain cancer patients are used to study dose in a standard-dose head protocol, and 25 prostate cancer patients are used to study dose in a pelvis protocol and a pelvis spotlight protocol. The mean dose to each organ is calculated, and the mean dose to the 2% of voxels that have the highest dose is also computed to quantify the maximum dose. It is fo...

  7. Specific absorbed fractions from the image-based VIP-Man body model and EGS4-VLSI Monte Carlo code: internal electron emitters

    Science.gov (United States)

    Chao, T. C.; Xu, X. G.

    2001-04-01

    VIP-Man is a whole-body anatomical model newly developed at Rensselaer from the high-resolution colour images of the National Library of Medicine's Visible Human Project. This paper summarizes the use of VIP-Man and the Monte Carlo method to calculate specific absorbed fractions for internal electron emitters. A specially designed EGS4 user code, EGS4-VLSI, was developed to handle the extremely large number of image voxels contained in VIP-Man. Monoenergetic and isotropic electron emitters with energies from 100 keV to 4 MeV are considered to be uniformly distributed in 26 organs. This paper presents, for the first time, results of internal electron exposures based on a realistic whole-body tomographic model. Because VIP-Man has many organs and tissues that were previously not well defined (or not available) in other models, the efforts at Rensselaer and elsewhere bring an unprecedented opportunity to significantly improve internal dosimetry.
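
    For reference, the quantity tabulated in such studies is compact: the specific absorbed fraction is the absorbed fraction of emitted energy divided by the target organ mass. A one-line sketch with placeholder numbers (not VIP-Man results) follows.

```python
def specific_absorbed_fraction(e_absorbed, e_emitted, target_mass_kg):
    """SAF (1/kg): fraction of emitted energy absorbed per unit target mass."""
    return (e_absorbed / e_emitted) / target_mass_kg

# Placeholder numbers: 0.35 MeV absorbed of 1.0 MeV emitted, 1.8 kg target.
print("%.3e 1/kg" % specific_absorbed_fraction(0.35, 1.0, 1.8))
```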

  8. SPECIAL ISSUE DEVOTED TO MULTIPLE RADIATION SCATTERING IN RANDOM MEDIA: Estimate of the melanin content in human hairs by the inverse Monte-Carlo method using a system for digital image analysis

    Science.gov (United States)

    Bashkatov, A. N.; Genina, Elina A.; Kochubei, V. I.; Tuchin, Valerii V.

    2006-12-01

    Based on digital image analysis and the inverse Monte-Carlo method, a proximate analysis method is developed and the optical properties of hairs of different types are estimated in three spectral ranges corresponding to three colour components. The scattering and absorption properties of hairs are separated for the first time by using the inverse Monte-Carlo method. The content of different types of melanin in hair is estimated from the absorption coefficient. It is shown that the dominant type of melanin in dark hair is eumelanin, whereas in light hair pheomelanin dominates.

  9. Monte Carlo techniques in radiation therapy

    CERN Document Server

    Verhaegen, Frank

    2013-01-01

    Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices, as well as to exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book, the first of its kind, addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...

  10. Comprehensive Monte-Carlo simulator for optimization of imaging parameters for high sensitivity detection of skin cancer at the THz

    Science.gov (United States)

    Ney, Michael; Abdulhalim, Ibrahim

    2016-03-01

    Skin cancer detection at its early stages has been the focus of a large number of experimental and theoretical studies during the past decades. Among these studies, two prominent approaches presenting high potential are reflectometric sensing in the THz wavelength region and polarimetric imaging techniques in the visible wavelengths. While the contrast mechanism and source of sensitivity of THz radiation to cancer-related tissue alterations has been considered to be mainly the elevated water content in cancerous tissue, the polarimetric approach has been verified to enable cancerous tissue differentiation based on cancer-induced structural alterations of the tissue. This study examines combining the THz and polarimetric approaches in order to achieve higher detection sensitivity than purely reflectometric THz measurements. For this purpose, a comprehensive MC simulation of radiative transfer in a complex skin tissue model fitted to the THz domain has been developed, which considers the skin's stratified structure, optical dispersion models of the tissue materials, surface roughness, scatterers, and substructure organelles. Additionally, a narrow-beam Mueller matrix differential analysis technique is suggested for assessing skin cancer induced changes in the polarimetric image, enabling the tissue model and MC simulation to be utilized for determining the imaging parameters that yield maximal detection sensitivity.

  11. Monte Carlo simulation of a quantum noise limited Čerenkov detector based on air-spaced light guiding taper for megavoltage x-ray imaging

    Energy Technology Data Exchange (ETDEWEB)

    Teymurazyan, A. [Imaging Research, Sunnybrook Health Sciences Centre, Department of Medical Biophysics, University of Toronto, Toronto M4N 3M5 (Canada); Rowlands, J. A. [Imaging Research, Sunnybrook Health Sciences Centre, Department of Medical Biophysics, University of Toronto, Toronto M4N 3M5 (Canada); Thunder Bay Regional Research Institute (TBRRI), Thunder Bay P7A 7T1 (Canada); Department of Radiation Oncology, University of Toronto, Toronto M5S 3E2 (Canada); Pang, G., E-mail: geordi.pang@sunnybrook.ca [Imaging Research, Sunnybrook Health Sciences Centre, Department of Medical Biophysics, University of Toronto, Toronto M4N 3M5 (Canada); Department of Radiation Oncology, University of Toronto, Toronto M5S 3E2 (Canada); Odette Cancer Centre, Toronto M4N 3M5 (Canada); Department of Physics, Ryerson University, Toronto M5B 2K3 (Canada)

    2014-04-15

    Purpose: Electronic Portal Imaging Devices (EPIDs) have been widely used in radiation therapy and are still needed on linear accelerators (Linacs) equipped with kilovoltage cone beam CT (kV-CBCT) or MRI systems. Our aim is to develop a new high quantum efficiency (QE) Čerenkov Portal Imaging Device (CPID) that is quantum noise limited at dose levels corresponding to a single Linac pulse. Methods: Recently a new CPID concept for MV x-ray imaging in radiation therapy was introduced. It relies on the Čerenkov effect for x-ray detection. The proposed design consisted of a matrix of optical fibers aligned with the incident x-rays and coupled to an active matrix flat panel imager (AMFPI) for image readout. A weakness of that design is that too few Čerenkov light photons reach the AMFPI for each incident x-ray, so an AMFPI with avalanche gain is required to overcome the readout noise for the portal imaging application. In this work the authors propose to replace the optical fibers in the CPID with cladding-free light guides suspended in air. The air between the light guides takes on the role of the cladding layer found in a regular optical fiber. Since air has a significantly lower refractive index (∼1 versus 1.38 for a typical cladding layer), a much higher light collection efficiency is achieved. Results: A Monte Carlo simulation of the new design has been conducted to investigate its feasibility. Detector quantities such as quantum efficiency (QE), spatial resolution (MTF), and frequency-dependent detective quantum efficiency (DQE) have been evaluated, and the detector signal and the quantum noise have been compared to the readout noise. Conclusions: Our studies show that the modified CPID has a QE and DQE more than an order of magnitude greater than those of current clinical systems, and yet a spatial resolution similar to that of current low-QE flat-panel based EPIDs. Furthermore it was demonstrated that the new CPID does not require an
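
    The cladding argument can be checked on the back of an envelope: for an internal isotropic emitter, a meridional-ray estimate traps a solid-angle fraction (1 - n_clad/n_core)/2 toward each end of the guide. The core index below is a hypothetical value, since the abstract quotes only the cladding indices.

```python
n_core = 1.6   # hypothetical light-guide index (not quoted in the abstract)
for n_clad, label in ((1.38, "polymer cladding"), (1.00, "air cladding")):
    # Rays steeper than the critical angle at the wall escape; the guided
    # cone toward one end subtends a fraction (1 - n_clad/n_core)/2 of 4*pi.
    trapped = 0.5 * (1.0 - n_clad / n_core)
    print("%s: %.1f%% of emitted photons guided to one end"
          % (label, 100 * trapped))
```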

  12. Quantification of dopaminergic neurotransmission SPECT studies with {sup 123}I-labelled radioligands. A comparison between different imaging systems and data acquisition protocols using Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Crespo, Cristina; Aguiar, Pablo [Universitat de Barcelona - IDIBAPS, Unitat de Biofisica i Bioenginyeria, Departament de Ciencies Fisiologiques I, Facultat de Medicina, Barcelona (Spain); Gallego, Judith [Universitat Politecnica de Catalunya, Institut de Tecniques Energetiques, Barcelona (Spain); Institut de Bioenginyeria de Catalunya, Barcelona (Spain); Cot, Albert [Universitat de Barcelona - IDIBAPS, Unitat de Biofisica i Bioenginyeria, Departament de Ciencies Fisiologiques I, Facultat de Medicina, Barcelona (Spain); Universitat Politecnica de Catalunya, Seccio d' Enginyeria Nuclear, Departament de Fisica i Enginyeria Nuclear, Barcelona (Spain); Falcon, Carles; Ros, Domenec [Universitat de Barcelona - IDIBAPS, Unitat de Biofisica i Bioenginyeria, Departament de Ciencies Fisiologiques I, Facultat de Medicina, Barcelona (Spain); CIBER en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona (Spain); Bullich, Santiago [Hospital del Mar, Center for Imaging in Psychiatry, CRC-MAR, Barcelona (Spain); Pareto, Deborah [CIBER en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona (Spain); PRBB, Institut d' Alta Tecnologia, Barcelona (Spain); Sempau, Josep [Universitat Politecnica de Catalunya, Institut de Tecniques Energetiques, Barcelona (Spain); CIBER en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona (Spain); Lomena, Francisco [IDIBAPS, Servei de Medicina Nuclear, Hospital Clinic, Barcelona (Spain); Calvino, Francisco [Universitat Politecnica de Catalunya, Institut de Tecniques Energetiques, Barcelona (Spain); Universitat Politecnica de Catalunya, Seccio d' Enginyeria Nuclear, Departament de Fisica i Enginyeria Nuclear, Barcelona (Spain); Pavia, Javier [CIBER en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona (Spain); IDIBAPS, Servei de Medicina Nuclear, Hospital Clinic, Barcelona (Spain)

    2008-07-15

    {sup 123}I-labelled radioligands are commonly used for single-photon emission computed tomography (SPECT) imaging of the dopaminergic system to study dopamine transporter binding. The aim of this work was to compare the quantitative capabilities of two different SPECT systems through Monte Carlo (MC) simulation. The SimSET MC code was employed to generate simulated projections of a numerical phantom for two gamma cameras equipped with a parallel and a fan-beam collimator, respectively. A fully 3D iterative reconstruction algorithm was used to compensate for attenuation, the spatially variant point spread function (PSF) and scatter. A post-reconstruction partial volume effect (PVE) compensation was also developed. For both systems, correction for all degradations together with PVE compensation resulted in recovery factors of the theoretical specific uptake ratio (SUR) close to 100%. For a SUR value of 4, the recovered SUR for the parallel imaging system was 33% for a reconstruction without corrections (OSEM), 45% with attenuation correction (OSEM-A), 56% for a 3D reconstruction with attenuation and PSF corrections (OSEM-AP), 68% for OSEM-AP with scatter correction (OSEM-APS) and 97% for OSEM-APS plus PVE compensation (OSEM-APSV). For the fan-beam imaging system, the recovered SUR was 41% without corrections, 55% for OSEM-A, 65% for OSEM-AP, 75% for OSEM-APS and 102% for OSEM-APSV. Our findings indicate that correction for degradations increases the quantification accuracy, with PVE compensation playing a major role in SUR quantification. The proposed methodology allows us to reach similar SUR values for different SPECT systems, thereby enabling reliable standardisation in multicentre studies. (orig.)

  13. An experimental approach to improve the Monte Carlo modelling of offline PET/CT-imaging of positron emitters induced by scanned proton beams

    Science.gov (United States)

    Bauer, J.; Unholtz, D.; Kurz, C.; Parodi, K.

    2013-08-01

    We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The experimental strategy presented here constitutes a pragmatic inverse approach to overcoming the known uncertainties in modelling positron-emitter production that stem from the lack of reliable cross-section data in the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility, in which the irradiation-induced tissue activation in the patient is monitored shortly after treatment delivery by means of a commercial PET/CT scanner and compared to an MC-simulated activity expectation derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield; for this particular application, the code is coupled to externally provided cross-section data for several proton-induced reactions. Studying the positron-emitting radionuclide yield experimentally in homogeneous phantoms provides access to the fundamental production channels. Five different materials were therefore irradiated by monoenergetic proton pencil beams at various energies, and the induced β+ activity was subsequently acquired with a commercial full-ring PET/CT scanner. By analysing dynamically reconstructed PET images, we are able to determine separately the spatial distributions of the different radionuclide concentrations at the start of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow modelling of the absolute level of measured β+ activity induced in the investigated

  14. Dosimetric effect of the statistical noise of the CT image in Monte Carlo simulation of radiotherapy treatments; Efecto dosimetrico del ruido estadistico de la imagen TC en la simulacion Monte Carlo de tratamientos de radioterapia

    Energy Technology Data Exchange (ETDEWEB)

    Laliena Bielsa, V.; Jimenez Albericio, F. J.; Gandia Martinez, A.; Font Gomez, J. A.; Mengual Gil, M. A.; Andres Redondo, M. M.

    2013-07-01

    This source of uncertainty is not exclusive to the Monte Carlo method; it will be present in any algorithm that takes heterogeneity corrections into account. Although the uncertainty described above is expected to be small, the objective of this work is to quantify it as a function of the CT study. (Author)

  15. Using Rose’s metal alloy as a pinhole collimator material in preclinical small-animal imaging: A Monte Carlo evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Mikael, E-mail: Mikael.Peterson@med.lu.se; Strand, Sven-Erik; Ljungberg, Michael [Department of Medical Radiation Physics, Clinical Science, Lund University, Lund 221 85 (Sweden)

    2015-04-15

    Purpose: Pinhole collimation is the most common method for high-resolution preclinical single photon emission computed tomography (SPECT) imaging. The collimators are usually constructed from dense materials with high atomic numbers, such as gold and platinum, which are expensive and not always flexible in the fabrication step. In this work, the authors investigated the properties of a fusible alloy called Rose’s metal and its potential for preclinical pinhole imaging. Compared to current standard pinhole materials such as gold and platinum, Rose’s metal has a lower density and a relatively low effective atomic number. However, it is inexpensive, has a low melting point, and does not contract when solidifying. Once cast, the piece can be machined with high precision. The aim of this study was to evaluate the imaging properties of Rose’s metal and compare them with those of standard materials. Methods: After validating their Monte Carlo code against published data and analytical calculations, the authors investigated different pinhole geometries by varying the collimator material, acceptance angle, aperture diameter, and photon incidence angle. The penetration-to-scatter and penetration-to-total ratios, the sensitivity, and the spatial resolution were determined for gold, tungsten, and Rose’s metal for two radionuclides, {sup 99}Tc{sup m} and {sup 125}I. Results: The Rose’s metal pinhole-imaging simulations show higher penetration/total and scatter/total ratios. For example, the penetration/total ratio is 50% for gold and 75% for Rose’s metal when simulating {sup 99}Tc{sup m} with a 0.3 mm aperture diameter and a 60° acceptance angle. However, the degradation in spatial resolution remained below 10% relative to gold for acceptance angles below 40° and aperture diameters larger than 0.5 mm. Conclusions: The extra penetration and scatter associated with Rose’s metal contribute to degradation in the
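
    The trade-offs scanned in this study can be previewed with textbook pinhole formulas: Anger's penetration-corrected effective diameter, the geometric object-plane resolution, and the on-axis sensitivity. The attenuation coefficients below are rough order-of-magnitude assumptions for ~140 keV photons, not the paper's values.

```python
import numpy as np

def d_effective(d_cm, mu_cm, alpha_rad):
    """Anger's penetration-corrected effective pinhole diameter."""
    return np.sqrt(d_cm * (d_cm + (2.0 / mu_cm) * np.tan(alpha_rad / 2.0)))

def resolution_cm(d_eff, det_dist, obj_dist):
    """Geometric object-plane resolution of a pinhole camera."""
    return d_eff * (det_dist + obj_dist) / det_dist

def sensitivity(d_eff, obj_dist):
    """On-axis geometric efficiency of a pinhole aperture."""
    return d_eff ** 2 / (16.0 * obj_dist ** 2)

alpha = np.deg2rad(60.0)                       # acceptance angle
for name, mu in (("gold", 37.0), ("Rose's metal", 15.0)):   # rough mu (1/cm)
    de = d_effective(0.03, mu, alpha)          # 0.3 mm aperture
    print("%-12s d_eff=%.3f cm  R=%.3f cm  S=%.2e"
          % (name, de, resolution_cm(de, 10.0, 3.0), sensitivity(de, 3.0)))
```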

  16. MO-G-17A-04: Internal Dosimetric Calculations for Pediatric Nuclear Imaging Applications, Using Monte Carlo Simulations and High-Resolution Pediatric Computational Models

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)

    2014-06-15

    Purpose: Our purpose is to evaluate the absorbed dose administered in pediatric nuclear imaging studies. Monte Carlo simulations incorporating pediatric computational models can serve as a reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5–14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms into the GATE simulations. The resolution of the phantoms was set to 2 mm{sup 3}. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals commonly used in SPECT and PET applications are tested, following the EANM pediatric dosage protocol. The biodistributions of the isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5 year old girl after whole-body exposure to {sup 99m}Tc-SestaMIBI, 30 minutes after administration. Heart, kidneys, liver, ovaries, pancreas and brain are the most critical organs, for which the S-factors are calculated. The statistical uncertainty of the simulation procedure was kept below 5%. The S-factors for each target organ are calculated in Gy/(MBq*sec), with the highest dose being absorbed in the kidneys and pancreas (9.29*10{sup 10} and 0.15*10{sup 10}, respectively). Conclusion: An approach for accurate dosimetry in pediatric models is presented, creating a reference dosage dataset for several radionuclides in children's computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and

  17. Comprehensive evaluations of cone-beam CT dose in image-guided radiation therapy via GPU-based Monte Carlo simulations

    Science.gov (United States)

    Montanari, Davide; Scolari, Enrica; Silvestri, Chiara; Jiang Graves, Yan; Yan, Hao; Cervino, Laura; Rice, Roger; Jiang, Steve B.; Jia, Xun

    2014-03-01

    Cone beam CT (CBCT) has been widely used for patient setup in image-guided radiation therapy (IGRT). Radiation dose from CBCT scans has become a clinical concern. The purposes of this study are (1) to commission a graphics processing unit (GPU)-based Monte Carlo (MC) dose calculation package, gCTD, for the Varian On-Board Imaging (OBI) system and test its calculation accuracy, and (2) to quantitatively evaluate CBCT dose from the OBI system in typical IGRT scan protocols. We first conducted dose measurements in a water phantom. The x-ray source model parameters used in gCTD were obtained through a commissioning process. gCTD accuracy is demonstrated by comparing calculations with measurements in water and in CTDI phantoms. Twenty-five brain cancer patients are used to study dose in a standard-dose head protocol, and 25 prostate cancer patients are used to study dose in a pelvis protocol and a pelvis spotlight protocol. The mean dose to each organ is calculated, and the mean dose to the 2% of voxels that have the highest dose is also computed to quantify the maximum dose. It is found that the mean dose to an organ varies considerably among patients. Moreover, the dose distribution is highly non-homogeneous inside an organ. The maximum dose is found to be 1-3 times higher than the mean dose depending on the organ, and up to eight times higher for the entire body due to the very high dose region in bony structures. High computational efficiency has also been observed in our studies: the MC dose calculation time is less than 5 min for a typical case.
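
    The two dose metrics are straightforward to compute from a voxel dose array, as the toy sketch below shows (the dose values are synthetic).

```python
import numpy as np

dose = np.random.default_rng(0).gamma(2.0, 0.5, 100_000)  # toy voxel doses (cGy)
mean_dose = dose.mean()
d_top2 = dose[dose >= np.percentile(dose, 98)].mean()     # "max dose" surrogate
print("mean dose %.2f cGy, mean of hottest 2%% of voxels %.2f cGy"
      % (mean_dose, d_top2))
```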

  18. Monte Carlo study on the sensitivity of prompt gamma imaging to proton range variations due to interfractional changes in prostate cancer patients

    Science.gov (United States)

    Schmid, S.; Landry, G.; Thieke, C.; Verhaegen, F.; Ganswindt, U.; Belka, C.; Parodi, K.; Dedes, G.

    2015-12-01

    Proton range verification based on prompt gamma imaging is increasingly considered in proton therapy. Tissue heterogeneity normal to the beam direction or near the end of range may considerably degrade the ability of prompt gamma imaging to detect proton range shifts. The goal of this study was to systematically investigate the accuracy and precision of range detection from prompt gamma emission profiles across treatment fractions of intensity modulated proton therapy for prostate cancer, using a comprehensive clinical dataset of 15 CT scans from 5 patients. Monte Carlo simulations using Geant4 were performed to generate spot-by-spot dose distributions and prompt gamma emission profiles for prostate treatment plans. The prompt gammas were scored at their point of emission. Three CT scans of each patient were used to evaluate the impact of inter-fractional changes on proton range. The range shifts deduced from the comparison of prompt gamma emission profiles in the planning CT and subsequent CTs were then correlated to the corresponding range shifts deduced from the dose distributions for individual pencil beams. The distributions of range shift differences between prompt gamma and dose were evaluated in terms of precision (defined as half the 95% inter-percentile range, IPR) and accuracy (median). In total, about 1700 individual proton pencil beams were investigated. The IPR of the relative range shift differences between the dose profiles and the prompt gamma profiles varied between ±1.4 mm and ±2.9 mm when using the more robust profile shifting analysis. The median was smaller than 1 mm. Methods to identify and reject spots that are unreliable for range verification due to range mixing were derived, and resulted in an average 10% spot rejection, clearly improving the prompt gamma-dose correlation. This work supports the view that prompt gamma imaging can offer a reliable indicator of range changes due to anatomical variations and tissue heterogeneity
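
    The precision and accuracy definitions above translate directly into a few lines of NumPy; the sketch below applies them to synthetic per-beam range-shift differences (the Gaussian data is purely illustrative).

    ```python
    import numpy as np

    # Sketch: precision and accuracy of prompt-gamma range verification, using
    # the paper's definitions. `shift_diff_mm` stands for per-pencil-beam
    # differences between prompt-gamma and dose range shifts (hypothetical data).
    rng = np.random.default_rng(1)
    shift_diff_mm = rng.normal(loc=0.3, scale=1.0, size=1700)

    accuracy = np.median(shift_diff_mm)                  # systematic offset
    p2_5, p97_5 = np.percentile(shift_diff_mm, [2.5, 97.5])
    precision = 0.5 * (p97_5 - p2_5)                     # half the 95% IPR
    print(f"accuracy (median): {accuracy:.2f} mm, precision: +/-{precision:.2f} mm")
    ```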

  19. Characterization of a scintillation detector array for 2D imaging of the thyroid follicular region using Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Carlos Borges da

    2007-05-15

    Image acquisition methods applied to nuclear medicine and radiobiology are a valuable research tool for determining thyroid anatomy and seeking disorders associated with follicular cells. Monte Carlo (MC) simulation has also been used in problems related to radiation detection in order to map medical images, since data processing has become compatible with personal computers (PCs). This work presents an innovative study to find an adequate inorganic scintillation detector array that could be coupled to a specific light photosensor, a charge-coupled device (CCD), through a fiber-optic plate in order to map the follicles of the thyroid gland. The goal is to choose the type of detector that fits the application suggested here, with a spatial resolution of 10 μm and good detector efficiency. The methodology results are useful for mapping a follicle image using gamma radiation emission. A source-detector simulation is performed using the MCNP4B (Monte Carlo for Neutron Photon transport) general code, considering different source energies, detector materials and geometries, including pixel sizes and reflector types. The results demonstrate that with the MCNP4B code it is possible to search for useful parameters related to the systems used in nuclear medicine, specifically in radiobiology applied to endocrine physiology studies, for acquiring images of thyroid follicles. (author)

  20. An experimental approach to improve the Monte Carlo modelling of offline PET/CT-imaging of positron emitters induced by scanned proton beams.

    Science.gov (United States)

    Bauer, J; Unholtz, D; Kurz, C; Parodi, K

    2013-08-01

    We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation-induced tissue activation in the patient is monitored shortly after treatment delivery by means of a commercial PET/CT scanner and compared to an MC-simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data for several proton-induced reactions. Studying the positron-emitting radionuclide yield experimentally in homogeneous phantoms provides access to the fundamental production channels. To this end, five different materials were irradiated by monoenergetic proton pencil beams at various energies and the induced β+ activity subsequently acquired with a commercial full-ring PET/CT scanner. From the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow modelling of the absolute level of measured β+ activity induced in the investigated
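
    As a hedged sketch of the forward model being tuned, the snippet below evaluates a thin-target activation integral, emitters per proton per depth slab, from a toy depth-energy relation and a toy cross section; the study itself relies on FLUKA with externally provided cross sections, and every number here is illustrative.

    ```python
    import numpy as np

    # Toy forward model: positron emitters produced per incident proton in each
    # depth slab, Y(z) = n_target * sigma(E(z)) * dz (thin-target approximation).
    z = np.linspace(0, 10, 100)                 # depth in water (cm)
    dz = z[1] - z[0]
    E = np.maximum(150.0 - 15.0 * z, 0.0)       # crude proton energy vs depth (MeV)
    sigma_mb = np.where(E > 17.0, 60.0, 0.0)    # flat 60 mb above a toy 17 MeV threshold
    n_O = 3.3e22                                # oxygen nuclei per cm^3 in water (approx.)

    yield_per_slab = n_O * (sigma_mb * 1e-27) * dz   # 1 mb = 1e-27 cm^2
    print("emitters per proton:", yield_per_slab.sum())
    ```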

  1. Development and implementation in the Monte Carlo code PENELOPE of a new virtual source model for radiotherapy photon beams and portal image calculation

    Science.gov (United States)

    Chabert, I.; Barat, E.; Dautremer, T.; Montagu, T.; Agelou, M.; Croc de Suray, A.; Garcia-Hernandez, J. C.; Gempp, S.; Benkreira, M.; de Carlan, L.; Lazaro, D.

    2016-07-01

    This work aims at developing a generic virtual source model (VSM) preserving all existing correlations between variables stored in a Monte Carlo pre-computed phase space (PS) file, for dose calculation and high-resolution portal image prediction. The reference PS file was calculated using the PENELOPE code, after the flattening filter (FF) of an Elekta Synergy 6 MV photon beam. Each particle was represented in a mobile coordinate system by its radial position (r_s) in the PS plane, its energy (E), and its polar and azimuthal angles (φ_d and θ_d), describing the particle deviation compared to its initial direction after bremsstrahlung, and the deviation orientation. Three sub-sources were created by sorting out particles according to their last interaction location (target, primary collimator or FF). For each sub-source, 4D correlated histograms were built by storing E, r_s, φ_d and θ_d values. Five different adaptive binning schemes were studied to construct 4D histograms of the VSMs, to ensure efficient histogram handling as well as an accurate reproduction of the details of the E, r_s, φ_d and θ_d distributions. The five resulting VSMs were then implemented in PENELOPE. Their accuracy was first assessed in the PS plane, by comparing E, r_s, φ_d and θ_d distributions with those obtained from the reference PS file. Second, dose distributions computed in water, using the VSMs and the reference PS file located below the FF, and also after collimation in both water and heterogeneous phantom, were compared using a 1.5%-0 mm and a 2%-0 mm global gamma index, respectively. Finally, portal images were calculated without and with phantoms in the beam. The model was then evaluated using a 1%-0 mm global gamma index. Performance of a mono-source VSM was also investigated and led, as with the multi-source model, to excellent results when combined with an adaptive binning scheme.
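
    The core of the VSM is resampling from correlated multidimensional histograms. The sketch below shows the idea in 2D for (E, r_s), with a synthetic, correlated phase-space sample; the paper's model is 4D and includes adaptive binning, which is omitted here.

    ```python
    import numpy as np

    # Sketch: store correlated variables in a joint histogram and resample them
    # jointly, so correlations survive. 2D (E, r_s) shown for brevity.
    rng = np.random.default_rng(2)

    # Hypothetical phase-space sample with an E-r correlation (off-axis softening)
    r = rng.uniform(0, 20, 200_000)                   # radial position (cm)
    E = rng.gamma(2.0, 1.0, r.size) * (1 - 0.02 * r)  # energy (MeV), r-dependent

    H, E_edges, r_edges = np.histogram2d(E, r, bins=(64, 64))

    def sample(n):
        """Draw (E, r) pairs from the joint histogram, with in-bin jitter."""
        p = H.ravel() / H.sum()
        idx = rng.choice(p.size, size=n, p=p)
        iE, ir = np.unravel_index(idx, H.shape)
        Es = E_edges[iE] + rng.random(n) * np.diff(E_edges)[iE]
        rs = r_edges[ir] + rng.random(n) * np.diff(r_edges)[ir]
        return Es, rs

    Es, rs = sample(10_000)
    print("correlation preserved:", np.corrcoef(Es, rs)[0, 1])
    ```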

  3. Exploring Monte Carlo methods

    CERN Document Server

    Dunn, William L

    2012-01-01

    Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle problem"
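
    Since the record is cut off at Buffon's needle, a classic worked example: a needle of length L dropped on a floor ruled with lines spaced d apart (L ≤ d) crosses a line with probability 2L/(πd), which yields a Monte Carlo estimate of π.

    ```python
    import numpy as np

    # Buffon's needle: estimate pi from the crossing frequency.
    rng = np.random.default_rng(3)
    L, d, n = 1.0, 2.0, 1_000_000

    y = rng.uniform(0, d / 2, n)          # distance of needle centre to nearest line
    theta = rng.uniform(0, np.pi / 2, n)  # acute angle between needle and the lines
    crossings = np.count_nonzero(y <= (L / 2) * np.sin(theta))

    pi_estimate = 2 * L * n / (d * crossings)
    print(pi_estimate)
    ```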

  4. SAN CARLOS APACHE PAPERS.

    Science.gov (United States)

    ROESSEL, ROBERT A., JR.

    THE FIRST SECTION OF THIS BOOK COVERS THE HISTORICAL AND CULTURAL BACKGROUND OF THE SAN CARLOS APACHE INDIANS, AS WELL AS AN HISTORICAL SKETCH OF THE DEVELOPMENT OF THEIR FORMAL EDUCATIONAL SYSTEM. THE SECOND SECTION IS DEVOTED TO THE PROBLEMS OF TEACHERS OF THE INDIAN CHILDREN IN GLOBE AND SAN CARLOS, ARIZONA. IT IS DIVIDED INTO THREE PARTS--(1)…

  5. The San Carlo Opera House

    DEFF Research Database (Denmark)

    Holm, Bent

    2005-01-01

    A contextualization of the San Carlo opera house within a cultural-historical framework of representation, with particular focus on the concept of napolalità.

  6. Comparison of two accelerators for Monte Carlo radiation transport calculations, Nvidia Tesla M2090 GPU and Intel Xeon Phi 5110p coprocessor: A case study for X-ray CT imaging dose calculation

    International Nuclear Information System (INIS)

    Highlights: • A new Monte Carlo photon transport code ARCHER-CT for CT dose calculations is developed to execute on the GPU and coprocessor. • ARCHER-CT is verified against MCNP. • The GPU code on an Nvidia M2090 GPU is 5.15–5.81 times faster than the parallel CPU code on an Intel X5650 6-core CPU. • The coprocessor code on an Intel Xeon Phi 5110p coprocessor is 3.30–3.38 times faster than the CPU code. - Abstract: Hardware accelerators are currently becoming increasingly important in boosting high performance computing systems. In this study, we tested the performance of two accelerator models, Nvidia Tesla M2090 GPU and Intel Xeon Phi 5110p coprocessor, using a new Monte Carlo photon transport package called ARCHER-CT that we have developed for fast CT imaging dose calculation. The package contains three components, ARCHER-CT(CPU), ARCHER-CT(GPU) and ARCHER-CT(COP), designed to run on the multi-core CPU, GPU and coprocessor architectures respectively. A detailed GE LightSpeed Multi-Detector Computed Tomography (MDCT) scanner model and a family of voxel patient phantoms are included in the code to calculate absorbed dose to radiosensitive organs under user-specified scan protocols. The results from ARCHER agree well with those from the production code Monte Carlo N-Particle eXtended (MCNPX). It is found that all the code components are significantly faster than parallel MCNPX run on 12 MPI processes, and that the GPU and coprocessor codes are 5.15–5.81 and 3.30–3.38 times faster than the parallel ARCHER-CT(CPU), respectively. The M2090 GPU performs better than the 5110p coprocessor in our specific test. In addition, a heterogeneous computation mode, in which the CPU and the hardware accelerator work concurrently, can increase the overall performance by 13–18%.

  7. Comparison of two accelerators for Monte Carlo radiation transport calculations, NVIDIA Tesla M2090 GPU and Intel Xeon Phi 5110p coprocessor: a case study for X-ray CT imaging dose calculation

    International Nuclear Information System (INIS)

    Hardware accelerators are currently becoming increasingly important in boosting high performance computing systems. In this study, we tested the performance of two accelerator models, NVIDIA Tesla M2090 GPU and Intel Xeon Phi 5110p coprocessor, using a new Monte Carlo photon transport package called ARCHER-CT we have developed for fast CT imaging dose calculation. The package contains three code variants, ARCHER-CT(CPU), ARCHER-CT(GPU) and ARCHER-CT(COP) to run in parallel on the multi-core CPU, GPU and coprocessor architectures respectively. A detailed GE LightSpeed Multi-Detector Computed Tomography (MDCT) scanner model and a family of voxel patient phantoms were included in the code to calculate absorbed dose to radiosensitive organs under specified scan protocols. The results from ARCHER agreed well with those from the production code Monte Carlo N-Particle eXtended (MCNPX). It was found that all the code variants were significantly faster than the parallel MCNPX running on 12 MPI processes, and that the GPU and coprocessor performed equally well, being 2.89-4.49 and 3.01-3.23 times faster than the parallel ARCHER-CT(CPU) running with 12 hyper-threads. (authors)

  8. Count rate studies of a box-shaped PET breast imaging system comprised of position-sensitive avalanche photodiodes utilizing Monte Carlo simulation.

    Science.gov (United States)

    Foudray, Angela M K; Habte, Frezghi; Chinn, Garry; Zhang, Jin; Levin, Craig S

    2006-01-01

    We are investigating a high-sensitivity, high-resolution positron emission tomography (PET) system for clinical use in the detection, diagnosis and staging of breast cancer. Using conventional figures of merit, design parameters were evaluated for count rate performance, module dead time, and construction complexity. The detector system modeled comprises extremely thin position-sensitive avalanche photodiodes coupled to lutetium oxyorthosilicate scintillation crystals. Previous investigations of detector geometries with Monte Carlo indicated that one of the largest impacts on sensitivity is local scintillation crystal density, when considering systems having the same average scintillation crystal densities (same crystal packing fraction and system solid-angle coverage). Our results show the system has very good scatter and randoms rejection at clinical activity ranges (approximately 200 μCi). PMID:17645997
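
    Count-rate performance in PET is conventionally summarized by the noise-equivalent count rate; the abstract does not give its formula, so the snippet below shows the standard textbook definition with hypothetical rates.

    ```python
    # Sketch of the standard PET count-rate figure of merit (noise-equivalent
    # count rate); all rates below are hypothetical.
    def necr(trues_cps, scatter_cps, randoms_cps, k=1.0):
        """NECR = T^2 / (T + S + k*R); k=2 when randoms are estimated by
        delayed-window subtraction, 1 when they are known exactly."""
        T, S, R = trues_cps, scatter_cps, randoms_cps
        return T**2 / (T + S + k * R)

    print(necr(trues_cps=50_000, scatter_cps=15_000, randoms_cps=10_000, k=2.0))
    ```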

  9. Monte Carlo Radiative Transfer

    CERN Document Server

    Whitney, Barbara A

    2011-01-01

    I outline methods for calculating the solution of radiative transfer by Monte Carlo (MCRT) in scattering, absorption and emission processes of dust and gas, including polarization. I provide a bibliography of relevant papers on methods with astrophysical applications.

  10. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  11. The Analysis of the Patterns of Radiation-Induced DNA Damage Foci by a Stochastic Monte Carlo Model of DNA Double Strand Breaks Induction by Heavy Ions and Image Segmentation Software

    Science.gov (United States)

    Ponomarev, Artem; Cucinotta, F.

    2011-01-01

    To create a generalized mechanistic model of DNA damage in human cells that will generate analytical and image data corresponding to experimentally observed DNA damage foci, and will help to improve experimental foci yields by simulating spatial foci patterns and resolving problems with quantitative image analysis. Material and Methods: The analysis of patterns of RIFs (radiation-induced foci) produced by low- and high-LET (linear energy transfer) radiation was conducted using a Monte Carlo model that combines the heavy ion track structure with characteristics of the human genome at the level of chromosomes. The foci patterns were also simulated in the maximum projection plane for flat nuclei. Some data analysis was done with the help of image segmentation software that identifies individual classes of RIFs and colocalized RIFs, which is of importance to some experimental assays that assign DNA damage a dual phosphorescent signal. Results: The model predicts the spatial and genomic distributions of DNA DSBs (double strand breaks) and associated RIFs in a human cell nucleus for a particular dose of either low- or high-LET radiation. We used the model to perform analyses for different irradiation scenarios. In the beam-parallel-to-the-disk-of-a-flattened-nucleus scenario we found that the foci appeared to be merged due to their high density, while in the perpendicular-beam scenario the foci appeared as one bright spot per hit. The statistics and spatial distribution of regions of densely arranged foci, termed DNA foci chains, were predicted numerically using this model. Another analysis evaluated the number of ion hits per nucleus, which were visible as streaks of closely located foci. In a further analysis, our image segmentation software determined foci yields directly from images with single-class or colocalized foci. Conclusions: We showed that DSB clustering needs to be taken into account to determine the true DNA damage foci yield, which helps to

  12. CERN honours Carlo Rubbia

    CERN Document Server

    2009-01-01

    Carlo Rubbia turned 75 on March 31, and CERN held a symposium to mark his birthday and pay tribute to his impressive contribution to both CERN and science. Carlo Rubbia, 4th from right, together with the speakers at the symposium. On 7 April CERN hosted a celebration marking Carlo Rubbia’s 75th birthday and 25 years since he was awarded the Nobel Prize for Physics. "Today we will celebrate 100 years of Carlo Rubbia" joked CERN’s Director-General, Rolf Heuer, in his opening speech, "75 years of his age and 25 years of the Nobel Prize." Rubbia received the Nobel Prize along with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. During the symposium, which was held in the Main Auditorium, several eminent speakers gave lectures on areas of science to which Carlo Rubbia made decisive contributions. Among those who spoke were Michel Spiro, Director of the French National Insti...

  13. Development of a Monte Carlo code for simulation of a Z-pinch driven fusion neutron imaging diagnostic system

    Institute of Scientific and Technical Information of China (English)

    贾清刚; 张天奎; 张凤娜; 胡华四

    2013-01-01

    A simulation platform for a Z-pinch neutron coded-aperture imaging system was developed with a Monte Carlo code based on the Geant4 simulation toolkit, providing a complete simulation of all key components of the fusion neutron coded-imaging diagnostic system. All physical processes involved in reality are taken into account in the simulation. The light distribution image formed in the scintillator array by neutrons passing through the coded aperture was obtained for a low neutron yield (on the order of 10^10). Three image reconstruction algorithms, Wiener filtering, Richardson-Lucy (RL) and a genetic algorithm (GA), were employed to reconstruct the very low signal-to-noise ratio (SNR) images obtained at low neutron yield, and the effects of SNR and neutron yield on reconstruction performance were compared. The results show that the genetic algorithm is very robust for reconstructing coded neutron images with a low SNR, and that the accuracy of the genetic algorithm reconstruction is proportional to the SNR of the coded neutron image.
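
    Of the three reconstruction algorithms compared, Richardson-Lucy is the easiest to sketch; the snippet below implements the standard RL iteration on a 1D toy coded image, with a Gaussian stand-in for the coded-aperture system response.

    ```python
    import numpy as np

    # Richardson-Lucy deconvolution for a 1D coded image; `psf` is a hypothetical
    # stand-in for the coded-aperture system response.
    def richardson_lucy(observed, psf, iterations=50):
        psf_flip = psf[::-1]
        estimate = np.full_like(observed, observed.mean())
        for _ in range(iterations):
            blurred = np.convolve(estimate, psf, mode="same")
            ratio = observed / np.maximum(blurred, 1e-12)
            estimate *= np.convolve(ratio, psf_flip, mode="same")
        return estimate

    rng = np.random.default_rng(4)
    truth = np.zeros(128); truth[60:68] = 100.0
    psf = np.exp(-0.5 * (np.arange(-8, 9) / 3.0) ** 2); psf /= psf.sum()
    observed = rng.poisson(np.convolve(truth, psf, mode="same")).astype(float)
    print(np.abs(richardson_lucy(observed, psf) - truth).mean())
    ```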

  14. The Virtual Monte Carlo

    CERN Document Server

    Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas

    2003-01-01

    The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.

  15. Carlo Caso (1940 - 2007)

    CERN Multimedia

    Leonardo Rossi

    Carlo Caso (1940 - 2007) Our friend and colleague Carlo Caso passed away on July 7th, after several months of courageous fight against cancer. Carlo spent most of his scientific career at CERN, taking an active part in the experimental programme of the laboratory. His long and fruitful involvement in particle physics started in the sixties, in the Genoa group led by G. Tomasini. He then carried out several experiments using the CERN liquid hydrogen bubble chambers - first the 2000HBC and later BEBC - to study various facets of the production and decay of meson and baryon resonances. He later formed his own group and joined the NA27 Collaboration to exploit the EHS Spectrometer with a rapid cycling bubble chamber as vertex detector. Amongst their many achievements, they were the first to measure, with excellent precision, the lifetime of the charmed D mesons. At the start of the LEP era, Carlo and his group moved to the DELPHI experiment, participating in the construction and running of the HPC electromagnetic c...

  16. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a PowerPoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of Monte Carlo. Welcome to Los Alamos, the birthplace of “Monte Carlo” for computational physics. Stanislaw Ulam, John von Neumann, and Nicholas Metropolis are credited as the founders of modern Monte Carlo methods. The name “Monte Carlo” was chosen in reference to the Monte Carlo Casino in Monaco (purportedly a place where Ulam’s uncle went to gamble). The central idea (for us) – to use computer-generated “random” numbers to determine expected values or estimate equation solutions – has since spread to many fields. "The first thoughts and attempts I made to practice [the Monte Carlo Method] were suggested by a question which occurred to me in 1946 as I was convalescing from an illness and playing solitaires. The question was what are the chances that a Canfield solitaire laid out with 52 cards will come out successfully? After spending a lot of time trying to estimate them by pure combinatorial calculations, I wondered whether a more practical method than “abstract thinking” might not be to lay it out say one hundred times and simply observe and count the number of successful plays... Later [in 1946], I described the idea to John von Neumann, and we began to plan actual calculations." - Stanislaw Ulam.

  17. Monte Carlo dose mapping on deforming anatomy

    Science.gov (United States)

    Zhong, Hualiang; Siebers, Jeffrey V.

    2009-10-01

    This paper proposes a Monte Carlo-based energy and mass congruent mapping (EMCM) method to calculate the dose on deforming anatomy. Different from dose interpolation methods, EMCM separately maps each voxel's deposited energy and mass from a source image to a reference image with a displacement vector field (DVF) generated by deformable image registration (DIR). EMCM was compared with other dose mapping methods: energy-based dose interpolation (EBDI) and trilinear dose interpolation (TDI). These methods were implemented in EGSnrc/DOSXYZnrc, validated using a numerical deformable phantom and compared for clinical CT images. On the numerical phantom with an analytically invertible deformation map, EMCM mapped the dose exactly the same as its analytic solution, while EBDI and TDI had average dose errors of 2.5% and 6.0%. For a lung patient's IMRT treatment plan, EBDI and TDI differed from EMCM by 1.96% and 7.3% in the lung patient's entire dose region, respectively. As a 4D Monte Carlo dose calculation technique, EMCM is accurate and its speed is comparable to 3D Monte Carlo simulation. This method may serve as a valuable tool for accurate dose accumulation as well as for 4D dosimetry QA.
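
    The energy/mass congruent mapping idea reduces to: transport energy and mass separately through the deformation, then divide. A 1D toy version, with the DVF collapsed to a precomputed voxel-to-voxel mapping, is sketched below; it illustrates the principle only and is not the DOSXYZnrc implementation.

    ```python
    import numpy as np

    # EMCM principle in 1D: accumulate energy and mass congruently, then divide.
    n_src, n_ref = 100, 100
    energy_src = np.random.default_rng(5).random(n_src)   # J per source voxel
    mass_src = np.full(n_src, 1e-3)                       # kg per source voxel

    # Hypothetical DVF reduced to "source voxel i contributes to reference voxel j"
    target = np.clip((np.arange(n_src) * 1.02).astype(int), 0, n_ref - 1)

    energy_ref = np.zeros(n_ref)
    mass_ref = np.zeros(n_ref)
    np.add.at(energy_ref, target, energy_src)   # accumulate deposited energy
    np.add.at(mass_ref, target, mass_src)       # accumulate voxel mass

    dose_ref = np.divide(energy_ref, mass_ref,
                         out=np.zeros(n_ref), where=mass_ref > 0)  # Gy = J/kg
    print(dose_ref[:5])
    ```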

  18. Feasibility of a CdTe-based SPECT for high-resolution low-dose small animal imaging: a Monte Carlo simulation study

    International Nuclear Information System (INIS)

    Dedicated single-photon emission computed tomography (SPECT) systems based on pixelated semiconductors such as cadmium telluride (CdTe) are in development to study small animal models of human disease. In an effort to develop a high-resolution, low-dose system for small animal imaging, we compared a CdTe-based SPECT system and a conventional NaI(Tl)-based SPECT system in terms of spatial resolution, sensitivity, contrast, and contrast-to-noise ratio (CNR). In addition, we investigated the radiation absorbed dose and calculated a figure of merit (FOM) for both SPECT systems. Using the conventional NaI(Tl)-based SPECT system, we achieved a spatial resolution of 1.66 mm at a 30 mm source-to-collimator distance, and resolved 2.4 mm hot rods. Using the newly developed CdTe-based SPECT system, we achieved a spatial resolution of 1.32 mm FWHM at a 30 mm source-to-collimator distance, and resolved 1.7 mm hot rods. The sensitivities at a 30 mm source-to-collimator distance were 115.73 counts/sec/MBq and 83.38 counts/sec/MBq for the CdTe-based and conventional NaI(Tl)-based SPECT systems, respectively. To compare quantitative measurements in the mouse brain, we calculated the CNR for images from both systems. The CNR from the CdTe-based SPECT system was 4.41, while that from the conventional NaI(Tl)-based system was 3.11 when the injected striatal dose was 160 Bq/voxel. The CNR increased as a function of injected dose in both systems. The FOM of the CdTe-based SPECT system was superior to that of the conventional NaI(Tl)-based system, and the highest FOM was achieved with the CdTe-based SPECT at a dose of 40 Bq/voxel injected into the striatum. Thus, a CdTe-based SPECT system showed significant improvement in performance compared with a conventional system in terms of spatial resolution, sensitivity, and CNR, while reducing the radiation dose to the small animal subject. Herein, we discuss the feasibility of a CdTe-based SPECT system for high
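
    The CNR used to compare the two systems is the usual ROI statistic; a minimal sketch with hypothetical count distributions:

    ```python
    import numpy as np

    # Contrast-to-noise ratio between a target ROI and a reference region;
    # ROI definitions and values below are hypothetical.
    def cnr(roi, background):
        return (roi.mean() - background.mean()) / background.std()

    rng = np.random.default_rng(6)
    striatum = rng.normal(160, 20, 500)    # counts in target ROI (e.g. striatum)
    background = rng.normal(100, 18, 500)  # counts in reference region
    print(f"CNR = {cnr(striatum, background):.2f}")
    ```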

  19. Monte Carlo and nonlinearities

    CERN Document Server

    Dauchet, Jérémi; Blanco, Stéphane; Caliot, Cyril; Charon, Julien; Coustet, Christophe; Hafi, Mouna El; Eymet, Vincent; Farges, Olivier; Forest, Vincent; Fournier, Richard; Galtier, Mathieu; Gautrais, Jacques; Khuong, Anaïs; Pelissier, Lionel; Piaud, Benjamin; Roger, Maxime; Terrée, Guillaume; Weitz, Sebastian

    2016-01-01

    The Monte Carlo method is widely used to numerically predict systems behaviour. However, its powerful incremental design assumes a strong premise which has severely limited application so far: the estimation process must combine linearly over dimensions. Here we show that this premise can be alleviated by projecting nonlinearities on a polynomial basis and increasing the configuration-space dimension. Considering phytoplankton growth in light-limited environments, radiative transfer in planetary atmospheres, electromagnetic scattering by particles and concentrated-solar-power-plant productions, we prove the real world usability of this advance on four test-cases that were so far regarded as impracticable by Monte Carlo approaches. We also illustrate an outstanding feature of our method when applied to sharp problems with interacting particles: handling rare events is now straightforward. Overall, our extension preserves the features that made the method popular: addressing nonlinearities does not compromise o...

  20. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a PowerPoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
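
    A worked example of the first sampling method in that outline, inverse transform sampling: for an exponential distribution, inverting F(x) = 1 - exp(-λx) gives x = -ln(1-u)/λ for uniform u in [0, 1).

    ```python
    import numpy as np

    # Inverse transform sampling for an exponential distribution.
    rng = np.random.default_rng(7)
    lam = 2.0
    u = rng.random(1_000_000)
    x = -np.log1p(-u) / lam   # log1p(-u) = log(1 - u), numerically stable

    print(x.mean(), "should be close to", 1 / lam)
    ```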

  1. CERN honours Carlo Rubbia

    CERN Multimedia

    2009-01-01

    On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency and Professor at the IUSS School for Advanced Studies in Pavia will speak about his work with Carlo Rubbia. Finally, Hans Joachim Sch...

  2. CERN honours Carlo Rubbia

    CERN Multimedia

    2009-01-01

    On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency, will speak about his work with Carlo Rubbia. Finally, Hans Joachim Schellnhuber of the Potsdam Institute for Climate Research and Sven Kul...

  3. Who Writes Carlos Bulosan?

    Directory of Open Access Journals (Sweden)

    Charlie Samuya Veric

    2001-12-01

    Full Text Available The importance of Carlos Bulosan in Filipino and Filipino-American radical history and literature is indisputable. His eminence spans the Pacific, and he is known, diversely, as a radical poet, fictionist, novelist, and labor organizer. Author of the canonical America Is in the Heart, Bulosan is celebrated for chronicling the conditions in America in his time, such as racism and unemployment. In the history of criticism on Bulosan's life and work, however, there is an undeclared general consensus that views Bulosan and his work as coherent, permanent texts of radicalism and anti-imperialism. Central to the existence of such a tradition of critical reception are the generations of critics who, in more ways than one, control the discourse on and of Carlos Bulosan. This essay inquires into the sphere of the critical reception that orders, for our time and for the time ahead, the reading and interpretation of Bulosan. What eye and seeing, the essay asks, determine the perception of Bulosan as the angel of radicalism? What is obscured in constructing Bulosan as an immutable figure of the political? What light does the reader conceive when the personal is brought into the open and situated against the political? The essay explores the answers to these questions in Bulosan's loving letters to various friends, strangers, and white American women. The presence of these interrogations, the essay believes, will ultimately secure the continuing importance of Carlos Bulosan to radical literature and history.

  4. Optimization of Monte Carlo simulations

    OpenAIRE

    Bryskhe, Henrik

    2009-01-01

    This thesis considers several different techniques for optimizing Monte Carlo simulations. The Monte Carlo system used is PENELOPE, but most of the techniques are applicable to other systems. The two major techniques are the use of the graphics card to do geometry calculations, and ray tracing. Using the graphics card provides a very efficient way to do fast ray-triangle intersections. Ray tracing provides an approximation of Monte Carlo simulation but is much faster to perform. A program was ...

  5. Images

    Data.gov (United States)

    National Aeronautics and Space Administration — Images for the website main pages and all configurations. The upload and access points for the other images are: Website Template RSW images BSCW Images HIRENASD...

  6. Monte Carlo techniques

    International Nuclear Information System (INIS)

    The course ''Monte Carlo Techniques'' will try to give a general overview of how to build up a method based on a given theory, allowing you to compare the outcome of an experiment with that theory. Concepts related to the construction of the method, such as random variables, distributions of random variables, generation of random variables and random-based numerical methods, will be introduced in this course. Examples of some of the current theories in High Energy Physics describing the e+e- annihilation processes (QED, Electro-Weak, QCD) will also be briefly introduced. A second step in the employment of this method is related to the detector. The interactions that a particle can undergo on its way through the detector, as well as the response of the different materials that make up the detector, will be covered in this course. An example of a detector from the LEP era, in which these techniques are applied, will close the course. (orig.)

  7. MCMini: Monte Carlo on GPGPU

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Laboratory

    2012-07-25

    MCMini is a proof of concept that demonstrates the possibility for Monte Carlo neutron transport using OpenCL with a focus on performance. This implementation, written in C, shows that tracing particles and calculating reactions on a 3D mesh can be done in a highly scalable fashion. These results demonstrate a potential path forward for MCNP or other Monte Carlo codes.

  8. Monte Carlo methods for electromagnetics

    CERN Document Server

    Sadiku, Matthew NO

    2009-01-01

    Until now, novices had to painstakingly dig through the literature to discover how to use Monte Carlo techniques for solving electromagnetic problems. Written by one of the foremost researchers in the field, Monte Carlo Methods for Electromagnetics provides a solid understanding of these methods and their applications in electromagnetic computation. Including much of his own work, the author brings together essential information from several different publications.Using a simple, clear writing style, the author begins with a historical background and review of electromagnetic theory. After addressing probability and statistics, he introduces the finite difference method as well as the fixed and floating random walk Monte Carlo methods. The text then applies the Exodus method to Laplace's and Poisson's equations and presents Monte Carlo techniques for handing Neumann problems. It also deals with whole field computation using the Markov chain, applies Monte Carlo methods to time-varying diffusion problems, and ...
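
    The fixed random walk method mentioned above estimates the potential at a point as the expected boundary value where a random walk started there first exits the domain; a minimal grid sketch for Laplace's equation follows (the boundary conditions are illustrative).

    ```python
    import numpy as np

    # Fixed random walk Monte Carlo for Laplace's equation on a square grid:
    # V(x0, y0) = E[ boundary value at the walk's first exit point ].
    rng = np.random.default_rng(8)

    def boundary_value(x, y, n):
        # Square domain [0,n]x[0,n]: V=100 on the top edge, 0 elsewhere (toy BC)
        return 100.0 if y == n else 0.0

    def potential(x0, y0, n=20, walks=20_000):
        total = 0.0
        for _ in range(walks):
            x, y = x0, y0
            while 0 < x < n and 0 < y < n:
                step = rng.integers(4)    # symmetric nearest-neighbour walk
                if step == 0: x += 1
                elif step == 1: x -= 1
                elif step == 2: y += 1
                else: y -= 1
            total += boundary_value(x, y, n)
        return total / walks

    print(potential(10, 10))  # the analytic value at the centre is 25.0
    ```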

  9. SU-E-CAMPUS-I-05: Internal Dosimetric Calculations for Several Imaging Radiopharmaceuticals in Preclinical Studies and Quantitative Assessment of the Mouse Size Impact On Them. Realistic Monte Carlo Simulations Based On the 4D-MOBY Model

    Energy Technology Data Exchange (ETDEWEB)

    Kostou, T; Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)

    2014-06-15

    Purpose: Commonly used radiopharmaceuticals were tested to define the most important dosimetric factors in preclinical studies. Dosimetric calculations were applied in two different whole-body mouse models, with varying organ sizes, to determine their impact on absorbed doses and S-values. The influence of organ mass was evaluated with computational models and Monte Carlo (MC) simulations. Methods: MC simulations were executed in GATE to determine the dose distribution in the 4D digital MOBY mouse phantom. Two mouse models, of 28 and 34 g respectively, were constructed based on realistic preclinical exams to calculate the absorbed doses and S-values of five radionuclides commonly used in SPECT/PET studies (18F, 68Ga, 177Lu, 111In and 99mTc). Radionuclide biodistributions were obtained from the literature. Realistic statistics (uncertainty lower than 4.5%) were acquired using the standard physical model in Geant4. Comparisons of the dosimetric calculations on the two different phantoms for each radiopharmaceutical are presented. Results: Dose per organ in mGy was calculated for all radiopharmaceuticals. The two models differed by 0.69% in their brain masses, while the largest differences were observed in the marrow (18.98%) and thyroid (18.65%) masses. Furthermore, S-values of the most important target organs were calculated for each isotope, with the whole mouse body selected as the source organ. Differences in the S-factors were observed in the 6.0–30.0% range. Tables with all the calculations were developed as reference dosimetric data. Conclusion: Accurate dose per organ and the most appropriate S-values are derived for specific preclinical studies. The impact of the mouse model size is rather high (up to 30% for a 17.65% difference in total mass), and thus accurate definition of the organ mass is a crucial parameter for self-absorbed S-value calculation. Our goal is to extend the study to accurate estimations in small animal imaging, whereas it is known

  10. Parallelizing Monte Carlo with PMC

    Energy Technology Data Exchange (ETDEWEB)

    Rathkopf, J.A.; Jones, T.R.; Nessett, D.M.; Stanberry, L.C.

    1994-11-01

    PMC (Parallel Monte Carlo) is a system of generic interface routines that allows easy porting of Monte Carlo packages of large-scale physics simulation codes to Massively Parallel Processor (MPP) computers. By loading various versions of PMC, simulation code developers can configure their codes to run in several modes: serial, Monte Carlo runs on the same processor as the rest of the code; parallel, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on other MPP processor(s); distributed, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on a different machine. This multi-mode approach allows maintenance of a single simulation code source regardless of the target machine. PMC handles passing of messages between nodes on the MPP, passing of messages between a different machine and the MPP, distributing work between nodes, and providing independent, reproducible sequences of random numbers. Several production codes have been parallelized under the PMC system. Excellent parallel efficiency in both the distributed and parallel modes results if sufficient workload is available per processor. Experiences with a Monte Carlo photonics demonstration code and a Monte Carlo neutronics package are described.

  11. Monte Carlo Simulation and Experimental Characterization of a Dual Head Gamma Camera

    CERN Document Server

    Rodrigues, S; Abreu, M C; Santos, N; Rato-Mendes, P; Peralta, L

    2007-01-01

    The GEANT4 Monte Carlo simulation and experimental characterization of the Siemens E.Cam Dual Head gamma camera hosted in the Particular Hospital of Algarve have been carried out. Imaging tests of thyroid and other phantoms were made in situ and compared with the results obtained from the Monte Carlo simulation.

  12. Accelerated GPU based SPECT Monte Carlo simulations

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-01

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Moreover, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency

  13. TARC: Carlo Rubbia's Energy Amplifier

    CERN Multimedia

    Laurent Guiraud

    1997-01-01

    Transmutation by Adiabatic Resonance Crossing (TARC) is Carlo Rubbia's energy amplifier. This CERN experiment demonstrated that long-lived fission fragments, such as 99Tc, can be efficiently destroyed.

  14. Modelling cerebral blood oxygenation using Monte Carlo XYZ-PA

    Science.gov (United States)

    Zam, Azhar; Jacques, Steven L.; Alexandrov, Sergey; Li, Youzhi; Leahy, Martin J.

    2013-02-01

    Continuous monitoring of cerebral blood oxygenation is critically important for the management of many life-threatening conditions. Non-invasive monitoring of cerebral blood oxygenation with a photoacoustic technique offers advantages over current invasive and non-invasive methods. We introduce Monte Carlo XYZ-PA to model the energy deposition in 3D and the time-resolved pressures and velocity potential based on the energy absorbed by the biological tissue. This paper outlines the benefits of using Monte Carlo XYZ-PA for optimization of photoacoustic measurement and imaging. To the best of our knowledge this is the first fully integrated tool for photoacoustic modelling.

  15. Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds

    Science.gov (United States)

    Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.

    2012-11-01

    A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper-atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on the fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.

  16. Proton Upset Monte Carlo Simulation

    Science.gov (United States)

    O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.

    2009-01-01

    The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as low Earth orbit, lunar orbit, and the like) from proton bombardment, based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (>200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.

  17. Synchronous Parallel Kinetic Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Martínez, E; Marian, J; Kalos, M H

    2006-12-14

    A novel parallel kinetic Monte Carlo (kMC) algorithm formulated on the basis of perfect time synchronicity is presented. The algorithm provides an exact generalization of any standard serial kMC model and is trivially implemented in parallel architectures. We demonstrate the mathematical validity and parallel performance of the method by solving several well-understood problems in diffusion.
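
    For context, the serial residence-time (BKL/Gillespie-style) kMC step that such parallel algorithms generalize looks like this; the rates are hypothetical.

    ```python
    import numpy as np

    # One residence-time kMC step: pick an event with probability proportional
    # to its rate, then advance the clock by an exponential increment.
    rng = np.random.default_rng(9)

    def kmc_step(rates, t):
        total = rates.sum()
        # select event k with probability rates[k] / total
        k = np.searchsorted(np.cumsum(rates), rng.random() * total)
        # advance time: dt ~ Exp(total)
        t += -np.log(rng.random()) / total
        return k, t

    rates = np.array([0.1, 2.0, 0.5])  # hypothetical diffusion-hop rates (1/s)
    t = 0.0
    for _ in range(5):
        k, t = kmc_step(rates, t)
        print(f"event {k} at t = {t:.3f} s")
    ```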

  18. Monte Carlo Particle Lists: MCPL

    CERN Document Server

    Kittelmann, Thomas; Knudsen, Erik B; Willendrup, Peter; Cai, Xiao Xiao; Kanaki, Kalliopi

    2016-01-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.

  19. Monte Carlo simulation of tomography techniques using the platform Gate

    International Nuclear Information System (INIS)

    Simulations play a key role in functional imaging, with applications ranging from scanner design, scatter correction, protocol optimisation. GATE (Geant4 for Application Tomography Emission) is a platform for Monte Carlo Simulation. It is based on Geant4 to generate and track particles, to model geometry and physics process. Explicit modelling of time includes detector motion, time of flight, tracer kinetics. Interfaces to voxellised models and image reconstruction packages improve the integration of GATE in the global modelling cycle. In this work Monte Carlo simulations are used to understand and optimise the gamma camera's performances. We study the effect of the distance between source and collimator, the diameter of the holes and the thick of the collimator on the spatial resolution, energy resolution and efficiency of the gamma camera. We also study the reduction of simulation's time and implement a model of left ventricle in GATE. (Author). 7 refs

  20. Microlens assembly error analysis for light field camera based on Monte Carlo method

    Science.gov (United States)

    Li, Sai; Yuan, Yuan; Zhang, Hao-Wei; Liu, Bin; Tan, He-Ping

    2016-08-01

    This paper describes numerical analysis of microlens assembly errors in light field cameras using the Monte Carlo method. Assuming that there were no manufacturing errors, home-built program was used to simulate images of coupling distance error, movement error and rotation error that could appear during microlens installation. By researching these images, sub-aperture images and refocus images, we found that the images present different degrees of fuzziness and deformation for different microlens assembly errors, while the subaperture image presents aliasing, obscured images and other distortions that result in unclear refocus images.

  1. Monte Carlo dose calculation in dental amalgam phantom

    OpenAIRE

    Mohd Zahri Abdul Aziz; Yusoff, A. L.; N D Osman; R. Abdullah; Rabaie, N. A.; M S Salikin

    2015-01-01

    It has become a great challenge in the modern radiation treatment to ensure the accuracy of treatment delivery in electron beam therapy. Tissue inhomogeneity has become one of the factors for accurate dose calculation, and this requires complex algorithm calculation like Monte Carlo (MC). On the other hand, computed tomography (CT) images used in treatment planning system need to be trustful as they are the input in radiotherapy treatment. However, with the presence of metal amalgam in treatm...

  2. Kinematics of multigrid Monte Carlo

    International Nuclear Information System (INIS)

    We study the kinematics of multigrid Monte Carlo algorithms by means of acceptance rates for nonlocal Metropolis update proposals. An approximation formula for acceptance rates is derived. We present a comparison of different coarse-to-fine interpolation schemes in free field theory, where the formula is exact. The predictions of the approximation formula for several interacting models are well confirmed by Monte Carlo simulations. The following rule is found: for a critical model with fundamental Hamiltonian H(φ), absence of critical slowing down can only be expected if the expansion of ⟨H(φ+ψ)⟩ in terms of the shift ψ contains no relevant (mass) term. We also introduce a multigrid update procedure for nonabelian lattice gauge theory and study the acceptance rates for gauge group SU(2) in four dimensions. (orig.)
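
    A minimal illustration of measuring Metropolis acceptance rates as a function of proposal size, on a single-site Gaussian "free field" H(φ) = φ²/2 rather than a full multigrid scheme:

    ```python
    import numpy as np

    # Metropolis acceptance rate for proposals phi -> phi + psi on H(phi) = phi^2/2.
    rng = np.random.default_rng(10)

    def acceptance_rate(step, n=100_000):
        phi, accepted = 0.0, 0
        for _ in range(n):
            psi = rng.uniform(-step, step)
            dH = 0.5 * (phi + psi) ** 2 - 0.5 * phi ** 2
            if rng.random() < np.exp(-dH):  # Metropolis rule: min(1, e^{-dH})
                phi += psi
                accepted += 1
        return accepted / n

    for step in (0.5, 2.0, 8.0):
        print(f"proposal width {step}: acceptance {acceptance_rate(step):.2f}")
    ```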

  3. Neural Adaptive Sequential Monte Carlo

    OpenAIRE

    Gu, Shixiang; Ghahramani, Zoubin; Turner, Richard E

    2015-01-01

    Sequential Monte Carlo (SMC), or particle filtering, is a popular class of methods for sampling from an intractable target distribution using a sequence of simpler intermediate distributions. Like other importance sampling-based methods, performance is critically dependent on the proposal distribution: a bad proposal can lead to arbitrarily inaccurate estimates of the target distribution. This paper presents a new method for automatically adapting the proposal using an approximation of the Kullback-Leibler divergence.
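
    A minimal bootstrap particle filter, the baseline SMC method that adaptive proposals improve on, for a 1D random walk observed in Gaussian noise (all parameters illustrative):

    ```python
    import numpy as np

    # Bootstrap particle filter: propose from the prior, weight by the
    # likelihood, resample.
    rng = np.random.default_rng(11)
    n_steps, n_particles = 50, 1000

    x_true, observations = 0.0, []
    for _ in range(n_steps):
        x_true += rng.normal(0, 1)                 # latent random walk
        observations.append(x_true + rng.normal(0, 2))

    particles = np.zeros(n_particles)
    for y in observations:
        particles += rng.normal(0, 1, n_particles)           # prior proposal
        weights = np.exp(-0.5 * ((y - particles) / 2) ** 2)  # Gaussian likelihood
        weights /= weights.sum()
        idx = rng.choice(n_particles, n_particles, p=weights)  # resample
        particles = particles[idx]

    print("final estimate:", particles.mean(), "truth:", x_true)
    ```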

  4. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes a multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607-617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level, forward Euler Monte Carlo method. The present work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Dzougoutov et al., Lect. Notes Comput. Sci. Eng. 44:59-88, Springer, Berlin, 2005; Moon et al., Stoch. Anal. Appl. 23(3):511-558, 2005; Moon et al., Contemp. Math. 383:325-343, Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Szepessy et al., Comm. Pure Appl. Math. 54(10):1169-1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL), from O(TOL^-3) when using a single-level version of the adaptive algorithm to O((TOL^-1 log(TOL))^2).
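
    The telescoping idea can be sketched with a fixed (non-adaptive) two-level estimator for an Itô SDE, here geometric Brownian motion with forward Euler and coupled Brownian increments; the paper's contribution, adaptive path-dependent time steps, is not reproduced here.

    ```python
    import numpy as np

    # Two-level Monte Carlo estimate of E[X_T] for dX = mu*X dt + sigma*X dW.
    rng = np.random.default_rng(12)
    mu, sigma, T, x0 = 0.05, 0.2, 1.0, 1.0

    def euler_coarse(n_paths, n_steps):
        dt = T / n_steps
        x = np.full(n_paths, x0)
        for _ in range(n_steps):
            x += mu * x * dt + sigma * x * rng.normal(0, np.sqrt(dt), n_paths)
        return x

    def coupled_correction(n_paths, n_coarse):
        """E[X_fine - X_coarse] with the coarse path driven by the summed
        fine-level Brownian increments (the MLMC coupling)."""
        dt_f = T / (2 * n_coarse)
        xf = np.full(n_paths, x0); xc = np.full(n_paths, x0)
        for _ in range(n_coarse):
            dW1 = rng.normal(0, np.sqrt(dt_f), n_paths)
            dW2 = rng.normal(0, np.sqrt(dt_f), n_paths)
            xf += mu * xf * dt_f + sigma * xf * dW1
            xf += mu * xf * dt_f + sigma * xf * dW2
            xc += mu * xc * 2 * dt_f + sigma * xc * (dW1 + dW2)  # same noise
        return xf - xc

    estimate = euler_coarse(100_000, 4).mean() + coupled_correction(10_000, 4).mean()
    print(estimate, "exact:", x0 * np.exp(mu * T))
    ```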

  5. Monomial Gamma Monte Carlo Sampling

    OpenAIRE

    Zhang, Yizhe; Wang, Xiangyu; Chen, Changyou; Fan, Kai; Carin, Lawrence

    2016-01-01

    We unify slice sampling and Hamiltonian Monte Carlo (HMC) sampling by demonstrating their connection under the canonical transformation from Hamiltonian mechanics. This insight enables us to extend HMC and slice sampling to a broader family of samplers, called monomial Gamma samplers (MGS). We analyze theoretically the mixing performance of such samplers by proving that the MGS draws samples from a target distribution with zero-autocorrelation, in the limit of a single parameter. This propert...

  6. Monte Carlo simulation of a prototype photodetector used in radiotherapy

    CERN Document Server

    Kausch, C; Albers, D; Schmidt, R; Schreiber, B

    2000-01-01

    The imaging performance of prototype electronic portal imaging devices (EPID) has been investigated. Monte Carlo simulations have been applied to calculate the modulation transfer function (MTF( f )), the noise power spectrum (NPS( f )) and the detective quantum efficiency (DQE( f )) for different new types of EPID, which consist of a detector combination of metal or polyethylene (PE), a phosphor layer of Gd₂O₂S and a flat array of photodiodes. The simulated results agree well with measurements. Based on the simulated results, possible optimization of these devices is discussed.
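
    Of the three metrics, the MTF is the easiest to sketch: it is the modulus of the Fourier transform of the area-normalized line spread function (LSF). In the Python sketch below, the Gaussian LSF and the pixel pitch are illustrative assumptions, not the simulated detector response of the paper.

        import numpy as np

        # Presampled MTF from a line spread function: MTF(f) = |FT of normalized LSF|.
        pixel_pitch = 0.4                       # mm, assumed detector sampling
        x = (np.arange(256) - 128) * pixel_pitch
        lsf = np.exp(-0.5 * (x / 0.6) ** 2)     # toy Gaussian LSF, sigma = 0.6 mm
        lsf /= lsf.sum()                        # unit area so that MTF(0) = 1

        mtf = np.abs(np.fft.rfft(lsf))
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch)   # cycles/mm
        for f, m in list(zip(freqs, mtf))[:5]:
            print(f"{f:.3f} cycles/mm: MTF = {m:.3f}")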

  7. Melting of Single Lipid Components in Binary Lipid Mixtures: A Comparison between FTIR Spectroscopy, DSC and Monte Carlo Simulations

    CERN Document Server

    Fidorra, M; Seeger, H M

    2007-01-01

    Monte Carlo (MC) simulations, Differential Scanning Calorimetry (DSC) and Fourier Transform InfraRed (FTIR) spectroscopy were used to study the melting behavior of single lipid components in two-component membranes of 1,2-Dimyristoyl-D54-sn-Glycero-3-Phosphocholine (DMPC-d54) and 1,2-Distearoyl-sn-Glycero-3-Phosphocholine (DSPC). Microscopic information on the temperature-dependent melting of the single lipid species could be obtained using FTIR. The measured microscopic behavior was well described by the results of the MC simulations. These simulations also allowed us to calculate heat capacity profiles as determined with DSC, which provide macroscopic information about melting enthalpies and entropy changes that is not accessible with FTIR. The MC simulations therefore allowed us to link the two different experimental approaches of FTIR and DSC.

  8. Cuartel San Carlos. Yacimiento veterano

    Directory of Open Access Journals (Sweden)

    Mariana Flores

    2007-01-01

    Full Text Available The Cuartel San Carlos is a national historic monument (1986) from the end of the eighteenth century (1785-1790), marked by numerous setbacks during its construction and by having withstood the earthquakes of 1812 and 1900. In 2006, the body responsible for its custody, the Instituto de Patrimonio Cultural of the Ministry of Culture, carried out three stages of archaeological exploration covering the back courtyard (Traspatio), the central courtyard (Patio Central) and the east and west wings of the building. This paper reviews the analysis of the archaeological documentation obtained at the site through that project, named EACUSAC (Estudio Arqueológico del Cuartel San Carlos), which also represents the third campaign conducted at the site. The importance of this historic site lies in its role in the events that gave rise to power struggles during the emergence of the Republic and in the political events of the twentieth century. The site likewise yielded a broad sample of archaeological materials that document everyday military life, as well as the internal social dynamics that took place in the San Carlos as a strategic place for the defense of the different regimes the country passed through, from the era of Spanish imperialism to the present day.

  9. Carlos Restrepo. Un verdadero Maestro

    OpenAIRE

    Pelayo Correa

    2009-01-01

    Carlos Restrepo was the first professor of Pathology and an illustrious member of the group of pioneers who founded the Facultad de Medicina of the Universidad del Valle. These pioneers converged on Cali in the 1950s, possessed of a renovating and creative spirit that undertook, with great success, the task of changing the academic culture of the Valle del Cauca. They found a peaceable society, one that enjoyed the generosity of its surroundings and had no desire to break with centuries-old traditions...

  10. Monte Carlo primer for health physicists

    International Nuclear Information System (INIS)

    The basic ideas and principles of Monte Carlo calculations are presented in the form of a primer for health physicists. A simple integral with a known answer is evaluated by two different Monte Carlo approaches. Random numbers, which underlie Monte Carlo work, are discussed, and a sample table of random numbers generated by a hand calculator is presented. Monte Carlo calculations of dose and linear energy transfer (LET) from 100-keV neutrons incident on a tissue slab are discussed. The random-number table is used in a hand calculation of the initial sequence of events for a 100-keV neutron entering the slab. Some pitfalls in Monte Carlo work are described. While this primer addresses mainly the bare bones of Monte Carlo, a final section briefly describes some of the more sophisticated techniques used in practice to reduce variance and computing time.
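
    In the spirit of the primer's opening example, the sketch below evaluates a simple integral with a known answer (the integral of x² on [0, 1], which equals 1/3) by two different Monte Carlo approaches, sample-mean and hit-or-miss; the integrand and sample size are illustrative choices, not necessarily the primer's.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100000

        # 1. Sample-mean (crude) Monte Carlo: I ~ average of f(U) for uniform U.
        u = rng.random(n)
        i_mean = np.mean(u ** 2)

        # 2. Hit-or-miss: fraction of random points in the unit square under the curve.
        x, y = rng.random(n), rng.random(n)
        i_hit = np.mean(y < x ** 2)

        print("sample-mean:", i_mean, " hit-or-miss:", i_hit, " exact: 1/3")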

  11. Monte Carlo Treatment Planning for Advanced Radiotherapy

    DEFF Research Database (Denmark)

    Cronholm, Rickard

    for commissioning of a Monte Carlo model of a medical linear accelerator, ensuring agreement with measurements within 1% for a range of situations, is presented. The resulting Monte Carlo model was validated against measurements for a wider range of situations, including small field output factors, and agreement...... modulating the intensity of the field during the irradiation. The workflow described has the potential to fully model the dynamic delivery, including gantry rotation during irradiation, of modern radiotherapy. Three cornerstones of Monte Carlo Treatment Planning are identified: Building, commissioning...... and validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii) and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc.) to a Monte Carlo input file (iii). A protocol...

  12. Parameter estimation in deformable models using Markov chain Monte Carlo

    Science.gov (United States)

    Chalana, Vikram; Haynor, David R.; Sampson, Paul D.; Kim, Yongmin

    1997-04-01

    Deformable models have gained much popularity recently for many applications in medical imaging, such as image segmentation, image reconstruction, and image registration. Such models are very powerful because various kinds of information can be integrated together in an elegant statistical framework. Each such piece of information is typically associated with a user-defined parameter. The values of these parameters can have a significant effect on the results generated using these models. Despite the popularity of deformable models for various applications, not much attention has been paid to the estimation of these parameters. In this paper we describe systematic methods for the automatic estimation of these deformable model parameters. These methods are derived by posing the deformable models as a Bayesian inference problem. Our parameter estimation methods use Markov chain Monte Carlo methods for generating samples from highly complex probability distributions.
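
    A minimal sketch of the underlying machinery: random-walk Metropolis sampling of a single regularization weight from a posterior, standing in for a deformable-model parameter. The toy precision-style likelihood, weak prior and proposal scale are assumptions for illustration, not the authors' model.

        import numpy as np

        rng = np.random.default_rng(5)

        # Toy posterior for a smoothness weight `lam` given squared residuals `r`.
        r = rng.normal(scale=0.5, size=100) ** 2

        def log_post(lam):
            if lam <= 0:
                return -np.inf
            # Gaussian-precision likelihood in lam with a weak exponential prior.
            return 0.5 * r.size * np.log(lam) - 0.5 * lam * r.sum() - 0.1 * lam

        lam, chain = 1.0, []
        for _ in range(20000):
            prop = lam + 0.2 * rng.normal()          # random-walk proposal
            if np.log(rng.random()) < log_post(prop) - log_post(lam):
                lam = prop                           # Metropolis accept
            chain.append(lam)
        print("posterior mean of lam:", np.mean(chain[5000:]))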

  13. Mean field simulation for Monte Carlo integration

    CERN Document Server

    Del Moral, Pierre

    2013-01-01

    In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Marko

  14. Multidimensional stochastic approximation Monte Carlo.

    Science.gov (United States)

    Zablotskiy, Sergey V; Ivanov, Victor A; Paul, Wolfgang

    2016-06-01

    Stochastic Approximation Monte Carlo (SAMC) has been established as a mathematically founded powerful flat-histogram Monte Carlo method, used to determine the density of states, g(E), of a model system. We show here how it can be generalized for the determination of multidimensional probability distributions (or equivalently densities of states) of macroscopic or mesoscopic variables defined on the space of microstates of a statistical mechanical system. This establishes this method as a systematic way for coarse graining a model system, or, in other words, for performing a renormalization group step on a model. We discuss the formulation of the Kadanoff block spin transformation and the coarse-graining procedure for polymer models in this language. We also apply it to a standard case in the literature of two-dimensional densities of states, where two competing energetic effects are present, g(E₁, E₂). We show when and why care has to be exercised when obtaining the microcanonical density of states g(E₁+E₂) from g(E₁, E₂). PMID:27415383
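
    The flat-histogram idea can be sketched for a small system: the Python code below estimates the density of states g(E) of a 10-spin Ising ring using an SAMC-style decreasing gain factor (the full SAMC prescription also involves a desired sampling distribution; this is a simplification). The lattice size, gain schedule and run length are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(6)

        L = 10
        spins = rng.choice([-1, 1], size=L)
        energy = lambda s: -np.sum(s * np.roll(s, 1))     # periodic 1-D Ising ring
        E_levels = np.arange(-L, L + 1, 4)                # attainable energies on a ring
        log_g = dict.fromkeys(E_levels, 0.0)              # running estimate of log g(E)

        E = energy(spins)
        t0 = 1000.0
        for t in range(1, 200001):
            i = rng.integers(L)
            spins[i] *= -1                                # single spin-flip proposal
            E_new = energy(spins)
            # Accept with probability g(E_old)/g(E_new) to flatten the energy histogram.
            if np.log(rng.random()) < log_g[E] - log_g[E_new]:
                E = E_new
            else:
                spins[i] *= -1                            # reject: flip back
            log_g[E] += t0 / max(t0, t)                   # decreasing gain factor
        print({int(e): round(v - min(log_g.values()), 2) for e, v in log_g.items()})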

  15. A Monte Carlo Model of Light Propagation in Nontransparent Tissue

    Institute of Scientific and Technical Information of China (English)

    姚建铨; 朱水泉; 胡海峰; 王瑞康

    2004-01-01

    To sharpen the imaging of structures, it is vital to develop a convenient and efficient quantitative algorithm for optical coherence tomography (OCT) sampling. In this paper a new Monte Carlo model is set up, and the propagation of light in bio-tissue is analyzed by means of mathematical and physical equations. We study how the intensities of Class 1 and Class 2 light at different wavelengths change with permeation depth, how the Class 1 (signal) light intensity changes with probing depth, and how the angularly resolved diffuse reflectance and diffuse transmittance change with exit angle. The results show that the Monte Carlo simulation is consistent with the theoretical data.
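
    A stripped-down version of such a photon random walk (exponential free paths, implicit absorption, isotropic rescattering in a slab) is sketched below in Python; the optical coefficients and slab thickness are illustrative assumptions, and the model ignores refractive-index mismatch and scattering anisotropy.

        import numpy as np

        rng = np.random.default_rng(7)

        mu_s, mu_a, depth = 10.0, 0.1, 1.0       # scattering/absorption (1/mm), slab (mm)
        n_photons, reflected, transmitted = 20000, 0.0, 0.0
        for _ in range(n_photons):
            z, uz, w = 0.0, 1.0, 1.0             # depth, direction cosine, photon weight
            while True:
                step = -np.log(rng.random()) / (mu_s + mu_a)   # exponential free path
                z += uz * step
                if z < 0.0:
                    reflected += w               # escaped back through the surface
                    break
                if z > depth:
                    transmitted += w             # escaped through the far side
                    break
                w *= mu_s / (mu_s + mu_a)        # implicit absorption weighting
                uz = 2.0 * rng.random() - 1.0    # isotropic rescatter
        print("diffuse reflectance:", reflected / n_photons,
              "transmittance:", transmitted / n_photons)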

  16. 1-D EQUILIBRIUM DISCRETE DIFFUSION MONTE CARLO

    Energy Technology Data Exchange (ETDEWEB)

    T. EVANS; ET AL

    2000-08-01

    We present a new hybrid Monte Carlo method for 1-D equilibrium diffusion problems in which the radiation field coexists with matter in local thermodynamic equilibrium. This method, the Equilibrium Discrete Diffusion Monte Carlo (EqDDMC) method, combines Monte Carlo particles with spatially discrete diffusion solutions. We verify the EqDDMC method with computational results from three slab problems. The EqDDMC method represents an incremental step toward applying this hybrid methodology to non-equilibrium diffusion, where it could be simultaneously coupled to Monte Carlo transport.

  17. Comparative evaluation of photon cross section libraries for materials of interest in PET Monte Carlo simulations

    CERN Document Server

    Zaidi, H

    1999-01-01

    The many applications of Monte Carlo modelling in nuclear medicine imaging make it desirable to increase the accuracy and computational speed of Monte Carlo codes. The accuracy of Monte Carlo simulations strongly depends on the accuracy of the probability functions and thus on the cross section libraries used for photon transport calculations. A comparison between different photon cross section libraries and parametrizations implemented in Monte Carlo simulation packages developed for positron emission tomography and the most recent Evaluated Photon Data Library (EPDL97), developed by the Lawrence Livermore National Laboratory, was performed for several human tissues and common detector materials for energies from 1 keV to 1 MeV. Different photon cross section libraries and parametrizations show quite large variations as compared to the EPDL97 coefficients. The latter library is more accurate and was carefully designed in the form of look-up tables providing efficient data storage, access, and management. Toge...

  18. El lenguaje de Carlos Alonso

    Directory of Open Access Journals (Sweden)

    Bárbara Bustamante

    2005-10-01

    Full Text Available The talent of Carlos Alonso (Argentina, 1929) has succeeded in forging a language with a style of its own. His drawings, paintings, pastels and inks, collages and prints fixed the projection of his subjectivity in the visual field. Both image and word make explicit a critical vision of reality that places the viewer under tension, forcing on him a reflexive stance committed to the message; this is the aspect most emphasized by art historians. The present study, however, aims to focus on the iconic and plastic aspects of his work.

  19. Monte Carlo lattice program KIM

    International Nuclear Information System (INIS)

    The Monte Carlo program KIM solves the steady-state linear neutron transport equation for a fixed-source problem or, by successive fixed-source runs, for the eigenvalue problem, in a two-dimensional thermal reactor lattice. Fluxes and reaction rates are the main quantities computed by the program, from which power distribution and few-group averaged cross sections are derived. The simulation ranges from 10 MeV to zero and includes anisotropic and inelastic scattering in the fast energy region, the epithermal Doppler broadening of the resonances of some nuclides, and the thermalization phenomenon by taking into account the thermal velocity distribution of some molecules. Besides the well known combinatorial geometry, the program allows complex configurations to be represented by a discrete set of points, an approach greatly improving calculation speed

  20. General Monte Carlo code MONK

    International Nuclear Information System (INIS)

    The Monte Carlo code MONK is a general program written to provide a high degree of flexibility to the user. MONK is distinguished by its detailed representation of nuclear data in point form, i.e., the cross-section is tabulated at specific energies instead of the more usual group representation. The nuclear data are unadjusted in the point form, but recently the code has been modified to accept adjusted group data as used in fast and thermal reactor applications. The various geometrical handling capabilities and importance sampling techniques are described. In addition to the nuclear data aspects, the following features are also described: geometrical handling routines, tracking cycles, neutron source and output facilities. 12 references. (U.S.)

  1. Challenges of Monte Carlo Transport

    Energy Technology Data Exchange (ETDEWEB)

    Long, Alex Roberts [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Computational Physics and Methods (CCS-2)

    2016-06-10

    These are slides from a presentation for Parallel Summer School at Los Alamos National Laboratory. Solving discretized partial differential equations (PDEs) of interest can require a large number of computations. We can identify concurrency to allow parallel solution of discrete PDEs. Simulated particle histories can be used to solve the Boltzmann transport equation. Particle histories are independent in neutral particle transport, making them amenable to parallel computation. Physical parameters and method type determine the data dependencies of particle histories. Data requirements shape parallel algorithms for Monte Carlo. Then, Parallel Computational Physics and Parallel Monte Carlo are discussed and, finally, the results are given. The mesh passing method greatly simplifies the IMC implementation and allows simple load-balancing. Using MPI windows and passive, one-sided RMA further simplifies the implementation by removing target synchronization. The author is very interested in implementations of PGAS that may allow further optimization for one-sided, read-only memory access (e.g. Open SHMEM). The MPICH_RMA_OVER_DMAPP option and library is required to make one-sided messaging scale on Trinitite; Moonlight scales poorly. Interconnect-specific libraries or functions are likely necessary to ensure performance. BRANSON has been used to directly compare the current standard method to a proposed method on idealized problems. The mesh passing algorithm performs well on problems that are designed to show the scalability of the particle passing method. BRANSON can now run load-imbalanced, dynamic problems. Potential avenues of improvement in the mesh passing algorithm will be implemented and explored. A suite of test problems that stress DD methods will elucidate a possible path forward for production codes.

  3. Carlos Restrepo. Un verdadero Maestro

    Directory of Open Access Journals (Sweden)

    Pelayo Correa

    2009-12-01

    Full Text Available Carlos Restrepo was the first professor of Pathology and an illustrious member of the group of pioneers who founded the Facultad de Medicina of the Universidad del Valle. These pioneers converged on Cali in the 1950s, possessed of a renovating and creative spirit that undertook, with great success, the task of changing the academic culture of the Valle del Cauca. They found a peaceable society that enjoyed the generosity of its surroundings and had no desire to break with centuries-old traditions of a simple and contented way of life. When their children had the desire and the ability to pursue university studies, especially in medicine, families sent them to cooler climates, which supposedly favored brain function and the accumulation of knowledge. The pioneers of medical education in the Valle del Cauca, largely recruited from national and foreign universities, knew very well that the local environment is no obstacle to a first-class university education. Carlos Restrepo was a prototype of this spirit of change and of the intellectual formation of the new generations. He showed it in many ways, in good part through his cheerful, extroverted, optimistic temperament and his easy, contagious laughter. But this amiable side of his personality did not hide his formative mission; he demanded dedication and hard work from his students, faithfully recorded in memorable caricatures that exaggerated his occasionally explosive temper. The group of pioneers dedicated itself with a spirit of total commitment (full time and exclusive dedication) and organized the new Facultad into well-defined and structured departments: Anatomy, Biochemistry, Physiology, Pharmacology, Pathology, Internal Medicine, Surgery, Obstetrics and Gynecology, Psychiatry and Preventive Medicine. The departments integrated their primary functions of teaching, research and service to the community. The center

  4. Comparison of an adaptive neuro-fuzzy inference system and an artificial neural network in the cross-talk correction of simultaneous 99 m Tc / 201Tl SPECT imaging using a GATE Monte-Carlo simulation

    Science.gov (United States)

    Heidary, Saeed; Setayeshi, Saeed; Ghannadi-Maragheh, Mohammad

    2014-09-01

    The aim of this study is to compare the adaptive neuro-fuzzy inference system (ANFIS) and the artificial neural network (ANN) for estimating the cross-talk contamination of 99mTc/201Tl image acquisition in the 201Tl energy window (77 keV ± 15%). GATE (Geant4 Application for Tomographic Emission) is employed due to its ability to simulate multiple radioactive sources concurrently. Two kinds of phantoms are used: digital (two) and physical (one). In both the real and the simulation studies, data acquisition is carried out using eight energy windows. The ANN and the ANFIS are implemented in MATLAB, and the GATE results are used as a training data set. Three indications are evaluated and compared. The ANFIS method yields better outcomes for two indications (Spearman's rank correlation coefficient and contrast) and for both phantom results in each category. The maximum image bias, which is the third indication, is found to be 6% more than that of the ANN.

  5. Geant4 Simulation of Neutron Penumbral Imaging

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yu-lai; WANG Qiang; YANG Lu; LI Yong

    2012-01-01

    The penumbral imaging technique is an effective analysis method for Inertial Confinement Fusion (ICF) neutron imaging. To meet the needs of neutron penumbral imaging, neutron transport in penumbral imaging systems was simulated using the Monte Carlo program Geant4, and a two-dimensional image was obtained.

  6. Nuclear Reactor Imaging by Cosmic Ray Muon Tomography with Monte Carlo Simulation

    Institute of Scientific and Technical Information of China (English)

    以恒冠; 岳晓光; 曾志; 于百蕙; 赵自然; 王学武; 程建平; 王义; 曾鸣; 罗志飞

    2014-01-01

    Cosmic ray muon tomography can be used to image the core of a Generation II pressurized water reactor (PWR), so that the status of the core can be detected and monitored even when conventional methods fail under severe nuclear accidents. In this paper, a detailed simulation model based on the major structures of a pressurized water reactor was established in the Geant4 program. Reconstructed images of the reactor core were obtained and image noise reduction was applied. Simulation results demonstrate that high-Z materials in the reactor core can be identified clearly with cosmic ray muon tomography, with the outline of the nuclear fuel clearly visible. It is feasible to use large-zenith-angle cosmic ray muons to image and monitor pressurized water reactor cores; the method can be realized with several 8 m × 8 m position-sensitive detectors in about 3 months.

  7. MATLAB platform for Monte Carlo planning and dosimetry experimental evaluation

    International Nuclear Information System (INIS)

    A new platform for full Monte Carlo treatment planning, with an independent experimental evaluation, is presented that can be integrated into clinical practice. The tool has proved its usefulness and efficiency and now forms part of the workflow of our research group, where it is used for the generation of results that are duly reviewed and published. The software integrates numerous image processing algorithms with planning optimization algorithms, supporting the whole process of MCTP planning from a single interface. In addition, it is a flexible and accurate tool for the evaluation of experimental dosimetric data for the quality control of actual treatments. (Author)

  8. Monte carlo simulation for soot dynamics

    KAUST Repository

    Zhou, Kun

    2012-01-01

    A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate the soot dynamics. Detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurement available in literature. The origin of the bimodal distribution of particle size distribution is revealed with quantitative proof.

  9. Fast quantum Monte Carlo on a GPU

    CERN Document Server

    Lutsyshyn, Y

    2013-01-01

    We present a scheme for the parallelization of quantum Monte Carlo on graphical processing units, focusing on bosonic systems and variational Monte Carlo. We use asynchronous execution schemes with shared memory persistence, and obtain an excellent acceleration: compared with single-core execution, the GPU-accelerated code runs over 100× faster. The CUDA code is provided along with the package that is necessary to execute variational Monte Carlo for a system representing liquid helium-4. The program was benchmarked on several models of Nvidia GPU, including the Fermi GTX560 and M2090, and the latest Kepler-architecture K20 GPU. Kepler-specific optimization is discussed.

  10. Advanced computers and Monte Carlo

    International Nuclear Information System (INIS)

    High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as the Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described, and some of its properties and resource requirements are identified for comparison with the corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programmer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables

  11. A Monte-Carlo-Based Network Method for Source Positioning in Bioluminescence Tomography

    OpenAIRE

    Zhun Xu; Xiaolei Song; Xiaomeng Zhang; Jing Bai

    2007-01-01

    We present an approach based on the improved Levenberg–Marquardt (LM) algorithm for backpropagation (BP) neural networks to estimate the light source position in bioluminescent imaging. For solving the forward problem, the table-based random sampling algorithm (TBRS), a fast Monte Carlo simulation method ...

  12. 11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing

    CERN Document Server

    Nuyens, Dirk

    2016-01-01

    This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.

  13. Monte Carlo methods for particle transport

    CERN Document Server

    Haghighat, Alireza

    2015-01-01

    The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...

  14. Frontiers of quantum Monte Carlo workshop: preface

    International Nuclear Information System (INIS)

    The introductory remarks, table of contents, and list of attendees are presented from the proceedings of the conference, Frontiers of Quantum Monte Carlo, which appeared in the Journal of Statistical Physics

  15. "Shaakal" Carlos kaebas arreteerija kohtusse / Margo Pajuste

    Index Scriptorium Estoniae

    Pajuste, Margo

    2006-01-01

    Also published in: Postimees : na russkom jazõke, 3 July, p. 11. The imprisoned notorious terrorist Carlos "the Jackal" has brought a court case against his one-time arrester. He accuses the former head of the French intelligence service of kidnapping.

  16. Simulation and the Monte Carlo method

    CERN Document Server

    Rubinstein, Reuven Y

    2016-01-01

    Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...

  17. Smart detectors for Monte Carlo radiative transfer

    CERN Document Server

    Baes, Maarten

    2008-01-01

    Many optimization techniques have been invented to reduce the noise that is inherent in Monte Carlo radiative transfer simulations. As the typical detectors used in Monte Carlo simulations do not take into account all the information contained in the impacting photon packages, there is still room to optimize this detection process and the corresponding estimate of the surface brightness distributions. We want to investigate how all the information contained in the distribution of impacting photon packages can be optimally used to decrease the noise in the surface brightness distributions and hence to increase the efficiency of Monte Carlo radiative transfer simulations. We demonstrate that the estimate of the surface brightness distribution in a Monte Carlo radiative transfer simulation is similar to the estimate of the density distribution in an SPH simulation. Based on this similarity, a recipe is constructed for smart detectors that take full advantage of the exact location of the impact of the photon pack...

  18. The Influence of Energy on the X-ray Voxel Monte Carlo Algorithm Based on Kilovoltage Cone Beam Computed Tomography Images for Dose Calculation

    Institute of Scientific and Technical Information of China (English)

    吴魁; 李光俊; 柏森

    2012-01-01

    This paper investigates how different beam energies affect the accuracy of dose calculation with the X-ray Voxel Monte Carlo (XVMC) algorithm on kilovoltage cone beam CT (CBCT) images. The CIRS model 062 phantom was used to calibrate the CT number to relative electron density tables for both CT and CBCT images. An anthropomorphic head-and-neck phantom (CDP) was scanned with CT and CBCT under identical setup conditions, and a locally advanced nasopharyngeal carcinoma case was simulated in the CDP. IMRT plans were designed in the Monaco treatment planning system with 6 MV and 15 MV photons, and doses were calculated with the XVMC algorithm on both the CT and CBCT images. After excluding errors introduced by rotational setup and other factors, the CT and CBCT plans were compared and the influence of beam energy was analyzed. Comparisons of DVHs, doses to targets and organs at risk, and target dose conformity and homogeneity all showed good agreement between the CT and CBCT plans; for most evaluation indices, the deviation between the CT and CBCT plans was smaller at 15 MV. Dose distributions were compared by gamma analysis with a 2 mm/2% criterion and a 10% threshold; the mean pass rates over all planes were 99.3% ± 0.47% at 6 MV and 99.4% ± 0.44% at 15 MV. The results show that, after recalibration of the relative electron density table, dose calculation with the XVMC algorithm on CBCT images is accurate, and the accuracy is higher at 15 MV.

  19. Monte Carlo simulation of granular fluids

    OpenAIRE

    Montanero, J. M.

    2003-01-01

    An overview of recent work on Monte Carlo simulations of a granular binary mixture is presented. The results are obtained numerically solving the Enskog equation for inelastic hard-spheres by means of an extension of the well-known direct Monte Carlo simulation (DSMC) method. The homogeneous cooling state and the stationary state reached using the Gaussian thermostat are considered. The temperature ratio, the fourth velocity moments and the velocity distribution functions are obtained for bot...

  20. Carlos II: el centenario olvidado

    Directory of Open Access Journals (Sweden)

    Luis Antonio RIBOT GARCÍA

    2009-12-01

    Full Text Available ABSTRACT: Starting from an initial reflection on the phenomenon of commemorations, the author considers the reasons why the third centenary of the death of Charles II will not give rise to any commemoration. Whatever assessments may be made of such celebrations, the truth is that, in this case, one might perhaps have brought the general public closer to one of the least known and most poorly valued monarchs in the history of Spain. More serious, however, is the fact that the shadow of ignorance and pejorative judgement also extends over the whole of his reign. Research on that period, though scarce, shows a rather different reality, in which decadence and the loss of international hegemony coexisted with important political initiatives and achievements, both in the monarchy's internal domain and in international relations.

  1. Realistic PET Monte Carlo Simulation With Pixelated Block Detectors, Light Sharing, Random Coincidences and Dead-Time Modeling

    OpenAIRE

    Guérin, Bastein; Fakhri, Georges El

    2008-01-01

    We have developed and validated a realistic simulation of random coincidences, pixelated block detectors, light sharing among crystal elements and dead-time in 2D and 3D positron emission tomography (PET) imaging based on the SimSET Monte Carlo simulation software. Our simulation was validated by comparison to a Monte Carlo transport code widely used for PET modeling, GATE, and to measurements made on a PET scanner.

  2. Therapeutic Applications of Monte Carlo Calculations in Nuclear Medicine

    CERN Document Server

    Sgouros, George

    2003-01-01

    This book examines the applications of Monte Carlo (MC) calculations in therapeutic nuclear medicine, from basic principles to computer implementations of software packages and their applications in radiation dosimetry and treatment planning. It is written for nuclear medicine physicists and physicians as well as radiation oncologists, and can serve as a supplementary text for medical imaging, radiation dosimetry and nuclear engineering graduate courses in science, medical and engineering faculties. With chapters written by recognised authorities in their particular fields, the book covers the entire range of MC applications in therapeutic medical and health physics, from its use in imaging prior to therapy to dose distribution modelling for targeted radiotherapy. The contributions discuss the fundamental concepts of radiation dosimetry, radiobiological aspects of targeted radionuclide therapy and the various components and steps required for implementing a dose calculation and treatment planning methodology in ...

  3. Characterization of parallel-hole collimator using Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Accuracy of in vivo activity quantification improves after the correction of penetrated and scattered photons. However, accurate assessment is not possible by physical experiment alone. We have used Monte Carlo simulation to accurately assess the contribution of penetrated and scattered photons in the photopeak window. Simulations were performed with the SIMIND (Simulation of Imaging Nuclear Detectors) Monte Carlo code. The simulations were set up so as to provide the geometric, penetration, and scatter components after each simulation and to write binary images to a data file. These components were analyzed graphically using Microsoft Excel (Microsoft Corporation, USA). Each binary image was imported into ImageJ and a logarithmic transformation was applied for visual assessment of image quality, plotting of profiles across the center of the images, and calculation of the full width at half maximum (FWHM) in the horizontal and vertical directions. The geometric, penetration, and scatter components at 140 keV for the low-energy general-purpose (LEGP) collimator were 93.20%, 4.13%, and 2.67%, respectively. The corresponding components for the low-energy high-resolution (LEHR), medium-energy general-purpose (MEGP), and high-energy general-purpose (HEGP) collimators were (94.06%, 3.39%, 2.55%), (96.42%, 1.52%, 2.06%), and (96.70%, 1.45%, 1.85%), respectively. For the MEGP collimator at 245 keV and the HEGP collimator at 364 keV, they were 89.10%, 7.08%, 3.82% and 67.78%, 18.63%, 13.59%, respectively. The LEGP and LEHR collimators are best for imaging 140 keV photons. The HEGP collimator can be used for 245 keV and 364 keV; however, corrections for penetration and scatter must be applied if one is interested in quantifying in vivo activity at 364 keV. Due to heavy penetration and scattering, 511 keV photons should not be imaged with the HEGP collimator.

  4. The Monte Carlo code MCBEND - where it is and where it's going

    International Nuclear Information System (INIS)

    The Monte Carlo method forms a cornerstone of the calculational procedures established in the UK for shielding design and assessment. The emphasis of the work in the shielding area is centred on the Monte Carlo code MCBEND. The work programme in support of the code is broadly directed towards utilisation of new hardware, the development of improved modelling algorithms, the development of new acceleration methods for specific applications and enhancements to the user image. This paper summarises the current status of MCBEND and reviews developments carried out over the past two years and planned for the future. (author)

  5. Development of Monte Carlo depletion code MCDEP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K. S.; Kim, K. Y.; Lee, J. C.; Ji, S. K. [KAERI, Taejon (Korea, Republic of)

    2003-07-01

    Monte Carlo neutron transport calculation has been used to obtain reference solutions in reactor physics analysis. The typical and widely used Monte Carlo transport code is MCNP (Monte Carlo N-Particle Transport Code), developed at Los Alamos National Laboratory. A drawback of Monte Carlo transport codes is their lack of capabilities for depletion and temperature-dependent calculations. In this research we developed MCDEP (Monte Carlo Depletion Code Package), using MCNP, with the capability of depletion calculation. This code package integrates MCNP with the depletion module of ORIGEN-2 using the matrix exponential method. It enables automatic MCNP and depletion calculations with only the initial MCNP and MCDEP inputs prepared by users. Depletion chains were simplified for efficiency of computing time and for the treatment of short-lived nuclides without cross section data. The results of MCDEP showed that the reactivity and pin power distributions for PWR fuel pins and assemblies are consistent with those of CASMO-3 and HELIOS.
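
    The matrix exponential step at the heart of such a transport-depletion coupling can be sketched as follows (in Python, assuming SciPy is available) for a hypothetical three-nuclide chain A → B → C; the rate constants, step length and densities are illustrative, not ORIGEN-2 data.

        import numpy as np
        from scipy.linalg import expm

        # Depletion/decay matrix for the chain A -> B -> C (C treated as stable).
        lam_a, lam_b = 1e-5, 5e-6          # effective removal rates (1/s), assumed
        A = np.array([[-lam_a,  0.0,   0.0],
                      [ lam_a, -lam_b, 0.0],
                      [ 0.0,    lam_b, 0.0]])
        n0 = np.array([1.0e24, 0.0, 0.0])  # initial nuclide densities
        dt = 30 * 24 * 3600.0              # one 30-day depletion step
        n1 = expm(A * dt) @ n0             # N(t + dt) = exp(A dt) N(t)
        print("end-of-step densities:", n1)

    In a coupled scheme, the transport code supplies the reaction rates that populate the matrix for each step, and the updated densities feed the next transport calculation.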

  6. Approaching Chemical Accuracy with Quantum Monte Carlo

    CERN Document Server

    Petruzielo, F R; Umrigar, C J

    2012-01-01

    A quantum Monte Carlo study of the atomization energies for the G2 set of molecules is presented. Basis size dependence of diffusion Monte Carlo atomization energies is studied with a single determinant Slater-Jastrow trial wavefunction formed from Hartree-Fock orbitals. With the largest basis set, the mean absolute deviation from experimental atomization energies for the G2 set is 3.0 kcal/mol. Optimizing the orbitals within variational Monte Carlo improves the agreement between diffusion Monte Carlo and experiment, reducing the mean absolute deviation to 2.1 kcal/mol. Moving beyond a single determinant Slater-Jastrow trial wavefunction, diffusion Monte Carlo with a small complete active space Slater-Jastrow trial wavefunction results in near chemical accuracy. In this case, the mean absolute deviation from experimental atomization energies is 1.2 kcal/mol. It is shown from calculations on systems containing phosphorus that the accuracy can be further improved by employing a larger active space.

  7. Quantitative Monte Carlo-based holmium-166 SPECT reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Elschot, Mattijs; Smits, Maarten L. J.; Nijsen, Johannes F. W.; Lam, Marnix G. E. H.; Zonnenberg, Bernard A.; Bosch, Maurice A. A. J. van den; Jong, Hugo W. A. M. de [Department of Radiology and Nuclear Medicine, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands); Viergever, Max A. [Image Sciences Institute, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands)

    2013-11-15

    Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 ({sup 166}Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative {sup 166}Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum.Methods: A fast Monte Carlo (MC) simulator was developed for simulation of {sup 166}Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full {sup 166}Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (A{sup est}) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six {sup 166}Ho RE patients.Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80

  8. Quantum Monte Carlo with Variable Spins

    CERN Document Server

    Melton, Cody A; Mitas, Lubos

    2016-01-01

    We investigate the inclusion of variable spins in electronic structure quantum Monte Carlo, with a focus on diffusion Monte Carlo with Hamiltonians that include spin-orbit interactions. Following our previous introduction of fixed-phase spin-orbit diffusion Monte Carlo (FPSODMC), we thoroughly discuss the details of the method and elaborate upon its technicalities. We present a proof for an upper-bound property for complex nonlocal operators, which allows for the implementation of T-moves to ensure the variational property. We discuss the time step biases associated with our particular choice of spin representation. Applications of the method are also presented for atomic and molecular systems. We calculate the binding energies and geometry of the PbH and Sn$_2$ molecules, as well as the electron affinities of the 6$p$ row elements in close agreement with experiments.

  9. Yours in Revolution: Retrofitting Carlos the Jackal

    Directory of Open Access Journals (Sweden)

    Samuel Thomas

    2013-09-01

    Full Text Available This paper explores the representation of ‘Carlos the Jackal’, the one-time ‘World’s Most Wanted Man’ and ‘International Face of Terror’ – primarily in cinema but also encompassing other forms of popular culture and aspects of Cold War policy-making. At the centre of the analysis is Olivier Assayas’s Carlos (2010), a transnational, five and a half hour film (first screened as a TV mini-series) about the life and times of the infamous militant. Concentrating on the various ways in which Assayas expresses a critical preoccupation with names and faces through complex formal composition, the project examines the play of abstraction and embodiment that emerges from the narrativisation of terrorist violence. Lastly, it seeks to engage with the hidden implications of Carlos in terms of the intertwined trajectories of formal experimentation and revolutionary politics.

  10. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

    This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters’ or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...

  11. CosmoPMC: Cosmology Population Monte Carlo

    CERN Document Server

    Kilbinger, Martin; Cappe, Olivier; Cardoso, Jean-Francois; Fort, Gersende; Prunet, Simon; Robert, Christian P; Wraith, Darren

    2011-01-01

    We present the public release of the Bayesian sampling algorithm for cosmology, CosmoPMC (Cosmology Population Monte Carlo). CosmoPMC explores the parameter space of various cosmological probes, and also provides a robust estimate of the Bayesian evidence. CosmoPMC is based on an adaptive importance sampling method called Population Monte Carlo (PMC). Various cosmology likelihood modules are implemented, and new modules can be added easily. The importance-sampling algorithm is written in C, and fully parallelised using the Message Passing Interface (MPI). Due to very little overhead, the wall-clock time required for sampling scales approximately with the number of CPUs. The CosmoPMC package contains post-processing and plotting programs, and in addition a Monte-Carlo Markov chain (MCMC) algorithm. The sampling engine is implemented in the library pmclib, and can be used independently. The software is available for download at http://www.cosmopmc.info.
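
    The PMC idea, iterated importance sampling with a proposal re-fitted to the weighted sample, can be sketched in one dimension; the Gaussian proposal family, toy target and sample sizes below are illustrative assumptions, not the CosmoPMC implementation.

        import numpy as np

        rng = np.random.default_rng(8)

        log_target = lambda x: -0.5 * (x - 3.0) ** 2    # unnormalized target density

        mu, sig = 0.0, 5.0                              # initial (deliberately poor) proposal
        for it in range(5):
            x = rng.normal(mu, sig, size=5000)
            # Importance weights: target over proposal (constants cancel on normalization).
            logw = log_target(x) - (-0.5 * ((x - mu) / sig) ** 2 - np.log(sig))
            w = np.exp(logw - logw.max())
            w /= w.sum()
            mu = np.sum(w * x)                          # moment-match the proposal
            sig = np.sqrt(np.sum(w * (x - mu) ** 2))
            ess = 1.0 / np.sum(w ** 2)                  # effective sample size
            print(f"iter {it}: mu={mu:.3f} sigma={sig:.3f} ESS={ess:.0f}")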

  12. SPQR: a Monte Carlo reactor kinetics code

    International Nuclear Information System (INIS)

    The SPQR Monte Carlo code has been developed to analyze fast reactor core accident problems where conventional methods are considered inadequate. The code is based on the adiabatic approximation of the quasi-static method. This initial version contains no automatic material motion or feedback. An existing Monte Carlo code is used to calculate the shape functions and the integral quantities needed in the kinetics module. Several sample problems have been devised and analyzed. Due to the large statistical uncertainty associated with the calculation of reactivity in accident simulations, the results, especially at later times, differ greatly from deterministic methods. It was also found that in large uncoupled systems, the Monte Carlo method has difficulty in handling asymmetric perturbations

  13. Correspondencia epistolar de Carlos Vicioso a Carlos Pau durante su estancia en Bicorp (Valencia [Epistolary correspondece between Carlos Vicioso and Carlos Pau during a stay in Bicorp (Valencia

    Directory of Open Access Journals (Sweden)

    Pedro Pablo Ferrer Gallego

    2012-07-01

    Full Text Available ABSTRACT: A set of letters sent by Carlos Vicioso to Carlos Pau during his stay in Bicorp (Valencia) between 1914 and 1915 is presented and discussed. The letters are held in the Archivo Histórico del Instituto Botánico de Barcelona. This correspondence marks the beginning of the scientific relationship between Vicioso and Pau, based at first on the queries Vicioso sent to the botanist from Segorbe for the determination of the species he collected in the Valencian locality and dispatched as herbarium sheets. Nowadays these voucher sheets are preserved in various national and foreign official herbaria, the fruit of the dispatch and exchange of material between Vicioso and other botanists of the time, principally Pau, Sennen and Font Quer.

  14. Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.; McNitt-Gray, Michael F. [Departments of Biomedical Physics and Radiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States); Mueller, Jonathon W. [United States Air Force, Keesler Air Force Base, Biloxi, Mississippi 39534 (United States); Cody, Dianna D. [University of Texas M.D. Anderson Cancer Center, Houston, Texas 77030 (United States); DeMarco, John J. [Departments of Biomedical Physics and Radiation Oncology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States)

    2015-02-15

    Purpose: The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. Methods: MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. Results: The calculated mean percent difference between TLD measurements and Monte Carlo simulations was −4.9% with standard deviation of 8.7% and a range of −22.7% to 5.7%. Conclusions: The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.

  15. Analytical positron range modelling in heterogeneous media for PET Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lehnert, Wencke; Meikle, Steven R [Discipline of Medical Radiation Sciences, Faculty of Health Sciences, University of Sydney, PO Box 170, Lidcombe NSW 1825 (Australia); Gregoire, Marie-Claude; Reilhac, Anthonin, E-mail: wlehnert@uni.sydney.edu.au [Australian Nuclear Science and Technology Organisation, Lucas Heights NSW 2234 (Australia)

    2011-06-07

    Monte Carlo simulation codes that model positron interactions along their tortuous path are expected to be accurate but are usually slow. A simpler and potentially faster approach is to model positron range from analytical annihilation density distributions. The aims of this paper were to efficiently implement and validate such a method, with the addition of medium heterogeneity representing a further challenge. The analytical positron range model was evaluated by comparing annihilation density distributions with those produced by the Monte Carlo simulator GATE and by quantitatively analysing the final reconstructed images of Monte Carlo simulated data. In addition, the influence of positronium formation on positron range and hence on the performance of Monte Carlo simulation was investigated. The results demonstrate that 1D annihilation density distributions for different isotope-media combinations can be fitted with Gaussian functions and hence be described by simple look-up-tables of fitting coefficients. Together with the method developed for simulating positron range in heterogeneous media, this allows for efficient modelling of positron range in Monte Carlo simulation. The level of agreement of the analytical model with GATE depends somewhat on the simulated scanner and the particular research task, but appears to be suitable for lower energy positron emitters, such as {sup 18}F or {sup 11}C. No reliable conclusion about the influence of positronium formation on positron range and simulation accuracy could be drawn.

  16. Modelling photon transport in non-uniform media for SPECT with a vectorized Monte Carlo code.

    Science.gov (United States)

    Smith, M F

    1993-10-01

    A vectorized Monte Carlo code has been developed for modelling photon transport in non-uniform media for single-photon-emission computed tomography (SPECT). The code is designed to compute photon detection kernels, which are used to build system matrices for simulating SPECT projection data acquisition and for use in matrix-based image reconstruction. Non-uniform attenuating and scattering regions are constructed from simple three-dimensional geometric shapes, in which the density and mass attenuation coefficients are individually specified. On a Stellar GS1000 computer, Monte Carlo simulations are performed between 1.6 and 2.0 times faster when the vector processor is utilized than when computations are performed in scalar mode. Projection data acquired with a clinical SPECT gamma camera for a line source in a non-uniform thorax phantom are well modelled by Monte Carlo simulations. The vectorized Monte Carlo code was used to simulate a {sup 99m}Tc SPECT myocardial perfusion study, and compensations for non-uniform attenuation and the detection of scattered photons improve activity estimation. The speed increase due to vectorization makes Monte Carlo simulation more attractive as a tool for modelling photon transport in non-uniform media for SPECT. PMID:8248288
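
    As a rough illustration of how such detection kernels feed matrix-based reconstruction, the sketch below runs a few MLEM iterations against a toy, randomly generated system matrix (all sizes and data are illustrative; this is not the vectorized code described above).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_pix, n_bins = 64, 96

    # Toy system matrix: element (i, j) is the probability that a photon emitted
    # in voxel j is detected in projection bin i (built from detection kernels).
    A = rng.random((n_bins, n_pix))
    A /= A.sum(axis=0, keepdims=True)

    x_true = rng.random(n_pix)                     # unknown activity distribution
    y = rng.poisson(1000.0 * (A @ x_true))         # noisy SPECT projection data

    x = np.ones(n_pix)                             # uniform initial estimate
    for _ in range(20):                            # MLEM iterations
        ratio = y / np.maximum(A @ x, 1e-12)
        x *= (A.T @ ratio) / A.sum(axis=0)
    print(float(np.corrcoef(x, x_true)[0, 1]))     # correlation with the truth
    ```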

  17. Yours in revolution : retrofitting Carlos the Jackal.

    OpenAIRE

    Samuel Thomas

    2013-01-01

    This paper explores the representation of ‘Carlos the Jackal’, the one-time ‘World’s Most Wanted Man’ and ‘International Face of Terror’ – primarily in cinema but also encompassing other forms of popular culture and aspects of Cold War policy-making. At the centre of the analysis is Olivier Assayas’s Carlos (2010), a transnational, five and a half hour film (first screened as a TV mini-series) about the life and times of the infamous militant. Concentrating on the various ways in which Assa...

  18. Geodesic Monte Carlo on Embedded Manifolds.

    Science.gov (United States)

    Byrne, Simon; Girolami, Mark

    2013-12-01

    Markov chain Monte Carlo methods explicitly defined on the manifold of probability distributions have recently been established. These methods are constructed from diffusions across the manifold and the solution of the equations describing geodesic flows in the Hamilton-Jacobi representation. This paper takes the differential geometric basis of Markov chain Monte Carlo further by considering methods to simulate from probability distributions that themselves are defined on a manifold, with common examples being classes of distributions describing directional statistics. Proposal mechanisms are developed based on the geodesic flows over the manifolds of support for the distributions, and illustrative examples are provided for the hypersphere and Stiefel manifold of orthonormal matrices. PMID:25309024
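
    On the unit hypersphere the geodesic flow used by such samplers has a closed form; a minimal sketch of one geodesic step (our illustration, not the authors' implementation):

    ```python
    import numpy as np

    def geodesic_step(x, v, t):
        """Advance (x, v) along the great-circle geodesic of the unit sphere.

        x: point with ||x|| = 1;  v: tangent velocity (x . v = 0);  t: step length.
        """
        speed = np.linalg.norm(v)
        if speed == 0.0:
            return x, v
        u = v / speed
        x_new = x * np.cos(speed * t) + u * np.sin(speed * t)
        v_new = speed * (u * np.cos(speed * t) - x * np.sin(speed * t))
        return x_new, v_new

    x = np.array([1.0, 0.0, 0.0])
    v = np.array([0.0, 0.5, 0.0])                # tangent vector at x
    x1, v1 = geodesic_step(x, v, 1.0)
    print(np.linalg.norm(x1), x1 @ v1)           # stays on sphere, stays tangent
    ```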

  19. Monte Carlo dose computation for IMRT optimization*

    Science.gov (United States)

    Laub, W.; Alber, M.; Birkner, M.; Nüsslin, F.

    2000-07-01

    A method which combines the accuracy of Monte Carlo dose calculation with a finite size pencil-beam based intensity modulation optimization is presented. The pencil-beam algorithm is employed to compute the fluence element updates for a converging sequence of Monte Carlo dose distributions. The combination is shown to improve results over the pencil-beam based optimization in a lung tumour case and a head and neck case. Inhomogeneity effects like a broader penumbra and dose build-up regions can be compensated for by intensity modulation.

  20. Monte Carlo simulation of granular fluids

    CERN Document Server

    Montanero, J M

    2003-01-01

    An overview of recent work on Monte Carlo simulations of a granular binary mixture is presented. The results are obtained by numerically solving the Enskog equation for inelastic hard spheres by means of an extension of the well-known direct simulation Monte Carlo (DSMC) method. The homogeneous cooling state and the stationary state reached using the Gaussian thermostat are considered. The temperature ratio, the fourth velocity moments and the velocity distribution functions are obtained for both cases. The shear viscosity characterizing the momentum transport in the thermostatted case is calculated as well. The simulation results are compared with analytical predictions showing an excellent agreement.
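
    The inelastic hard-sphere collision rule at the core of such a DSMC scheme can be written down generically (equal masses, coefficient of normal restitution alpha; an illustrative fragment, not the paper's full Enskog solver):

    ```python
    import numpy as np

    def inelastic_collision(v1, v2, sigma_hat, alpha):
        """Post-collision velocities of two equal-mass inelastic hard spheres.

        sigma_hat: unit vector along the line of centres at contact.
        alpha: coefficient of normal restitution (alpha = 1 is elastic).
        """
        g = v1 - v2                                   # relative velocity
        impulse = 0.5 * (1.0 + alpha) * (g @ sigma_hat) * sigma_hat
        return v1 - impulse, v2 + impulse

    rng = np.random.default_rng(2)
    v1, v2 = rng.standard_normal(3), rng.standard_normal(3)
    sigma_hat = rng.standard_normal(3)
    sigma_hat /= np.linalg.norm(sigma_hat)

    w1, w2 = inelastic_collision(v1, v2, sigma_hat, alpha=0.9)
    print(v1 + v2 - (w1 + w2))                        # momentum is conserved
    ```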

  1. Monte carlo simulations of organic photovoltaics.

    Science.gov (United States)

    Groves, Chris; Greenham, Neil C

    2014-01-01

    Monte Carlo simulations are a valuable tool to model the generation, separation, and collection of charges in organic photovoltaics where charges move by hopping in a complex nanostructure and Coulomb interactions between charge carriers are important. We review the Monte Carlo techniques that have been applied to this problem, and describe the results of simulations of the various recombination processes that limit device performance. We show how these processes are influenced by the local physical and energetic structure of the material, providing information that is useful for design of efficient photovoltaic systems.
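
    The hopping dynamics underlying such simulations is typically a kinetic Monte Carlo loop; a minimal sketch with Miller-Abrahams-style rates on a 1D chain (all parameter values are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_sites = 100
    energies = rng.normal(0.0, 0.1, n_sites)   # Gaussian site-energy disorder (eV)
    kT = 0.025                                 # thermal energy, ~room temp (eV)
    nu0 = 1e12                                 # attempt frequency (1/s)

    def hop_rate(i, j):
        """Miller-Abrahams rate between neighbouring sites i and j."""
        dE = energies[j] - energies[i]
        return nu0 * np.exp(-dE / kT) if dE > 0 else nu0

    site, t = 0, 0.0
    for _ in range(1000):
        neighbours = [(site - 1) % n_sites, (site + 1) % n_sites]
        rates = np.array([hop_rate(site, j) for j in neighbours])
        total = rates.sum()
        t += rng.exponential(1.0 / total)                  # waiting time
        site = neighbours[rng.choice(2, p=rates / total)]  # destination of hop
    print(site, t)
    ```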

  2. Monte Carlo dose distributions for radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Perucha, M.; Leal, A.; Rincon, M.; Carrasco, E. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica; Sanchez-Doblado, F. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica]|[Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Nunez, L. [Clinica Puerta de Hierro, Madrid (Spain). Servicio de Radiofisica; Arrans, R.; Sanchez-Calzado, J.A.; Errazquin, L. [Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Sanchez-Nieto, B. [Royal Marsden NHS Trust (United Kingdom). Joint Dept. of Physics]|[Inst. of Cancer Research, Sutton, Surrey (United Kingdom)

    2001-07-01

    The precision of radiosurgery treatment planning systems is limited by the approximations of their algorithms and by their dosimetrical input data. This fact is especially important in small fields. However, the Monte Carlo method is an accurate alternative as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distributions of a planning system and of Monte Carlo. Relative shifts have been measured and, furthermore, dose-volume histograms have been calculated for the target and adjacent organs at risk. (orig.)

  3. Monte Carlo simulation of neutron scattering instruments

    International Nuclear Information System (INIS)

    A library of Monte Carlo subroutines has been developed for the purpose of design of neutron scattering instruments. Using small-angle scattering as an example, the philosophy and structure of the library are described and the programs are used to compare instruments at continuous wave (CW) and long-pulse spallation source (LPSS) neutron facilities. The Monte Carlo results give a count-rate gain of a factor between 2 and 4 using time-of-flight analysis. This is comparable to scaling arguments based on the ratio of wavelength bandwidth to resolution width

  4. Monte Carlo dose distributions for radiosurgery

    International Nuclear Information System (INIS)

    The precision of radiosurgery treatment planning systems is limited by the approximations of their algorithms and by their dosimetrical input data. This fact is especially important in small fields. However, the Monte Carlo method is an accurate alternative as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distributions of a planning system and of Monte Carlo. Relative shifts have been measured and, furthermore, dose-volume histograms have been calculated for the target and adjacent organs at risk. (orig.)

  5. Monte Carlo dose calculation in dental amalgam phantom.

    Science.gov (United States)

    Aziz, Mohd Zahri Abdul; Yusoff, A L; Osman, N D; Abdullah, R; Rabaie, N A; Salikin, M S

    2015-01-01

    Ensuring the accuracy of treatment delivery in electron beam therapy has become a great challenge in modern radiation treatment. Tissue inhomogeneity is one of the factors that must be accounted for in accurate dose calculation, and this requires a complex calculation algorithm such as Monte Carlo (MC). On the other hand, the computed tomography (CT) images used in the treatment planning system need to be trustworthy, as they are the input to radiotherapy treatment. However, with metal amalgam present in the treatment volume, the input CT images show prominent streak artefacts and thus contribute a source of error in the dose calculation. A streak artefact reduction technique was therefore applied to correct the images and, as a result, better images were observed in terms of structure delineation and density assignment. Furthermore, the amalgam density data were corrected to provide the amalgam voxels with accurate density values. The dose uncertainties due to metal amalgam were reduced from 46% to as low as 2% at d80 (depth of the 80% dose beyond Zmax) using the presented strategies. Considering the number of vital and radiosensitive organs in the head and neck regions, this correction strategy is suggested for reducing calculation uncertainties in MC calculation. PMID:26500401

  7. Monte carlo dose calculation in dental amalgam phantom

    Directory of Open Access Journals (Sweden)

    Mohd Zahri Abdul Aziz

    2015-01-01

    Full Text Available Ensuring the accuracy of treatment delivery in electron beam therapy has become a great challenge in modern radiation treatment. Tissue inhomogeneity is one of the factors that must be accounted for in accurate dose calculation, and this requires a complex calculation algorithm such as Monte Carlo (MC). On the other hand, the computed tomography (CT) images used in the treatment planning system need to be trustworthy, as they are the input to radiotherapy treatment. However, with metal amalgam present in the treatment volume, the input CT images show prominent streak artefacts and thus contribute a source of error in the dose calculation. A streak artefact reduction technique was therefore applied to correct the images and, as a result, better images were observed in terms of structure delineation and density assignment. Furthermore, the amalgam density data were corrected to provide the amalgam voxels with accurate density values. The dose uncertainties due to metal amalgam were reduced from 46% to as low as 2% at d80 (depth of the 80% dose beyond Zmax) using the presented strategies. Considering the number of vital and radiosensitive organs in the head and neck regions, this correction strategy is suggested for reducing calculation uncertainties in MC calculation.

  8. Fast sequential Monte Carlo methods for counting and optimization

    CERN Document Server

    Rubinstein, Reuven Y; Vaisman, Radislav

    2013-01-01

    A comprehensive account of the theory and application of Monte Carlo methods Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the

  9. Monte Carlo methods in ab initio quantum chemistry: quantum Monte Carlo for molecules

    CERN Document Server

    Lester, William A; Reynolds, PJ

    1994-01-01

    This book presents the basic theory and application of the Monte Carlo method to the electronic structure of atoms and molecules. It assumes no previous knowledge of the subject, only a knowledge of molecular quantum mechanics at the first-year graduate level. A working knowledge of traditional ab initio quantum chemistry is helpful, but not essential.Some distinguishing features of this book are: Clear exposition of the basic theory at a level to facilitate independent study. Discussion of the various versions of the theory: diffusion Monte Carlo, Green's function Monte Carlo, and release n

  10. Use of Monte Carlo Methods in brachytherapy; Uso del metodo de Monte Carlo en braquiterapia

    Energy Technology Data Exchange (ETDEWEB)

    Granero Cabanero, D.

    2015-07-01

    The Monte Carlo method has become a fundamental tool for brachytherapy dosimetry, mainly because it avoids the difficulties associated with experimental dosimetry. In brachytherapy, the main handicap of experimental dosimetry is the high dose gradient near the sources, where small uncertainties in the positioning of the detectors lead to large uncertainties in the dose. This presentation mainly reviews the procedure for calculating dose distributions around a source using the Monte Carlo method, showing the difficulties inherent in these calculations. In addition, we briefly review other applications of the Monte Carlo method in brachytherapy dosimetry, such as its use in advanced calculation algorithms, in shielding barrier calculations, and in obtaining dose distributions around applicators. (Author)

  11. On the use of stochastic approximation Monte Carlo for Monte Carlo integration

    KAUST Repository

    Liang, Faming

    2009-03-01

    The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for the problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.

  12. Application of Monte Carlo methods in tomotherapy and radiation biophysics

    Science.gov (United States)

    Hsiao, Ya-Yun

    Helical tomotherapy is an attractive treatment for cancer therapy because highly conformal dose distributions can be achieved while the on-board megavoltage CT provides simultaneous images for accurate patient positioning. The convolution/superposition (C/S) dose calculation methods typically used for tomotherapy treatment planning may overestimate skin (superficial) doses by 3-13%. Although more accurate than C/S methods, Monte Carlo (MC) simulations are too slow for routine clinical treatment planning. However, the computational requirements of MC can be reduced by developing a source model for the parts of the accelerator that do not change from patient to patient. This source model then becomes the starting point for additional simulations of the penetration of radiation through the patient. In the first section of this dissertation, a source model for helical tomotherapy is constructed by condensing information from MC simulations into a series of analytical formulas. The percentage depth dose and beam profiles calculated by MC using the source model agree within 2% with measurements for a wide range of field sizes, which suggests that the proposed source model provides an adequate representation of the tomotherapy head for dose calculations. Monte Carlo methods are a versatile technique for simulating many physical, chemical and biological processes. In the second major part of this thesis, a new methodology is developed to simulate the induction of DNA damage by low-energy photons. First, the PENELOPE Monte Carlo radiation transport code is used to estimate the spectrum of initial electrons produced by photons. The initial electron spectra are then combined with DNA damage yields for monoenergetic electrons from the fast Monte Carlo damage simulation (MCDS) developed earlier by Semenenko and Stewart (Purdue University). Single- and double-strand break yields predicted by the proposed methodology are in good agreement (1%) with the results of published

  13. Poster — Thur Eve — 14: Improving Tissue Segmentation for Monte Carlo Dose Calculation using DECT

    Energy Technology Data Exchange (ETDEWEB)

    Di Salvio, A.; Bedwani, S.; Carrier, J-F. [Centre hospitalier de l'Université de Montréal (Canada); Bouchard, H. [National Physical Laboratory, Teddington (United Kingdom)

    2014-08-15

    Purpose: To improve Monte Carlo dose calculation accuracy through a new tissue segmentation technique with dual energy CT (DECT). Methods: Electron density (ED) and effective atomic number (EAN) can be extracted directly from DECT data with a stoichiometric calibration method. Images are acquired with Monte Carlo CT projections using the user code egs-cbct and reconstructed using an FDK backprojection algorithm. Calibration is performed using projections of a numerical RMI phantom. A weighted parameter algorithm then uses both EAN and ED to assign materials to voxels from DECT simulated images. This new method is compared to a standard tissue characterization from single energy CT (SECT) data using a segmented calibrated Hounsfield unit (HU) to ED curve. Both methods are compared to the reference numerical head phantom. Monte Carlo simulations on uniform phantoms of different tissues using dosxyz-nrc show discrepancies in depth-dose distributions. Results: Both SECT and DECT segmentation methods show similar performance assigning soft tissues. Performance is however improved with DECT in regions with higher density, such as bones, where it assigns materials correctly 8% more often than segmentation with SECT, considering the same set of tissues and simulated clinical CT images, i.e. including noise and reconstruction artifacts. Furthermore, Monte Carlo results indicate that kV photon beam depth-dose distributions can double between two tissues of density higher than muscle. Conclusions: A direct acquisition of ED and the added information of EAN with DECT data improves tissue segmentation and increases the accuracy of Monte Carlo dose calculation in kV photon beams.
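
    A schematic version of such a weighted (ED, EAN) assignment (reference tissue values and the weight are placeholders; the abstract does not specify the actual algorithm parameters):

    ```python
    # Hypothetical reference tissues: (relative electron density, effective Z).
    tissues = {
        "lung":   (0.26, 7.6),
        "muscle": (1.04, 7.6),
        "bone":   (1.70, 12.3),
    }

    def assign_material(ed, ean, w=0.5):
        """Assign the tissue minimizing a weighted distance in (ED, EAN) space."""
        def dist(ref):
            ref_ed, ref_ean = ref
            return (w * abs(ed - ref_ed) / ref_ed
                    + (1.0 - w) * abs(ean - ref_ean) / ref_ean)
        return min(tissues, key=lambda name: dist(tissues[name]))

    print(assign_material(1.65, 11.9))   # -> 'bone'
    ```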

  14. Scalable Domain Decomposed Monte Carlo Particle Transport

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.

  15. Accelerating Hasenbusch's acceleration of hybrid Monte Carlo

    International Nuclear Information System (INIS)

    Hasenbusch has proposed splitting the pseudo-fermionic action into two parts, in order to speed up Hybrid Monte Carlo simulations of QCD. We have tested a different splitting, also using clover-improved Wilson fermions. An additional speed-up between 5 and 20% over the original proposal was achieved in production runs. (orig.)

  16. A comparison of Monte Carlo generators

    CERN Document Server

    Golan, Tomasz

    2014-01-01

    A comparison of the GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: proton multiplicity, total visible energy, most energetic proton momentum, and the $\pi^+$ two-dimensional energy vs cosine distribution.

  17. Advances in Monte Carlo computer simulation

    Science.gov (United States)

    Swendsen, Robert H.

    2011-03-01

    Since the invention of the Metropolis method in 1953, Monte Carlo methods have been shown to provide an efficient, practical approach to the calculation of physical properties in a wide variety of systems. In this talk, I will discuss some of the advances in the MC simulation of thermodynamic systems, with an emphasis on optimization to obtain a maximum of useful information.

  18. Monte Carlo methods beyond detailed balance

    NARCIS (Netherlands)

    Schram, Raoul D.; Barkema, Gerard T.

    2015-01-01

    Monte Carlo algorithms are nearly always based on the concepts of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed balance algorithms, starting from a conventional algorithm satisfying detailed balance.

  19. Using CIPSI nodes in diffusion Monte Carlo

    CERN Document Server

    Caffarel, Michel; Giner, Emmanuel; Scemama, Anthony

    2016-01-01

    Several aspects of the recently proposed DMC-CIPSI approach, consisting in using selected Configuration Interaction (SCI) approaches such as CIPSI (Configuration Interaction using a Perturbative Selection done Iteratively) to build accurate nodes for diffusion Monte Carlo (DMC) calculations, are presented and discussed. The main ideas are illustrated with a number of calculations for diatomic molecules and for the benchmark G1 set.

  20. A note on simultaneous Monte Carlo tests

    DEFF Research Database (Denmark)

    Hahn, Ute

    In this short note, Monte Carlo tests of goodness of fit for data of the form X(t), t ∈ I are considered, that reject the null hypothesis if X(t) leaves an acceptance region bounded by an upper and lower curve for some t in I. A construction of the acceptance region is proposed that complies to a...

  1. Monte Carlo Renormalization Group: a review

    International Nuclear Information System (INIS)

    The logic and the methods of Monte Carlo Renormalization Group (MCRG) are reviewed. A status report of results for 4-dimensional lattice gauge theories derived using MCRG is presented. Existing methods for calculating the improved action are reviewed and evaluated. The Gupta-Cordery improved MCRG method is described and compared with the standard one. 71 refs., 8 figs

  2. Juan Carlos D'Olivo: A portrait

    Science.gov (United States)

    Aguilar-Arévalo, Alexis A.

    2013-06-01

    This report attempts to give a brief biographical sketch of the academic life of Juan Carlos D'Olivo, researcher and teacher at the Instituto de Ciencias Nucleares of UNAM, devoted to advancing the fields of High Energy Physics and Astroparticle Physics in Mexico and Latin America.

  3. Optical monitoring of rheumatoid arthritis: Monte Carlo generated reconstruction kernels

    Science.gov (United States)

    Minet, O.; Beuthan, J.; Hielscher, A. H.; Zabarylo, U.

    2008-06-01

    Optical imaging in biomedicine is governed by light absorption and scattering interactions with the microscopic and macroscopic constituents of the medium. Therefore, the light scattering characteristics of human tissue correlate with the stage of some diseases. In the near infrared range the scattering events, with a coefficient approximately two orders of magnitude greater than that of absorption, play the dominant role. When measuring the optical parameters, variations were discovered that correlate with rheumatoid arthritis of a small joint. The potential of an experimental setup for transilluminating the finger joint with a laser diode and the pattern of stray light detection are demonstrated. The scattering caused by skin contains no useful information, and it can be removed by a deconvolution technique to enhance the diagnostic value of this non-invasive optical method. Monte Carlo simulations ensure both the construction of the corresponding point spread function and the theoretical verification of the stray light picture in rather complex geometry.

  4. How is history written? / Carlo Ginzburg; interviewed by Marek Tamm

    Index Scriptorium Estoniae

    Ginzburg, Carlo

    2007-01-01

    An overview of the works of C. Ginzburg, professor of European cultures at Pisa. Previously published as: 'Märgid, jäljed ja tõendid: intervjuu Carlo Ginzburgiga' [Signs, traces and evidence: an interview with Carlo Ginzburg] // Ginzburg, Carlo. Juust ja vaglad [The Cheese and the Worms]. Tallinn, 2000, pp. 262-271.

  5. Neutron stimulated emission computed tomography: a Monte Carlo simulation approach

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, A C [Department of Biomedical Engineering, Duke University, 136 Hudson Hall, Durham, NC 27708 (United States); Harrawood, B P [Duke Advance Imaging Labs, Department of Radiology, 2424 Erwin Rd, Suite 302, Durham, NC 27705 (United States); Bender, J E [Department of Biomedical Engineering, Duke University, 136 Hudson Hall, Durham, NC 27708 (United States); Tourassi, G D [Duke Advance Imaging Labs, Department of Radiology, 2424 Erwin Rd, Suite 302, Durham, NC 27705 (United States); Kapadia, A J [Department of Biomedical Engineering, Duke University, 136 Hudson Hall, Durham, NC 27708 (United States)

    2007-10-21

    A Monte Carlo simulation has been developed for neutron stimulated emission computed tomography (NSECT) using the GEANT4 toolkit. NSECT is a new approach to biomedical imaging that allows spectral analysis of the elements present within the sample. In NSECT, a beam of high-energy neutrons interrogates a sample and the nuclei in the sample are stimulated to an excited state by inelastic scattering of the neutrons. The characteristic gammas emitted by the excited nuclei are captured in a spectrometer to form multi-energy spectra. Currently, a tomographic image is formed using a collimated neutron beam to define the line integral paths for the tomographic projections. These projection data are reconstructed to form a representation of the distribution of individual elements in the sample. To facilitate the development of this technique, a Monte Carlo simulation model has been constructed from the GEANT4 toolkit. This simulation includes modeling of the neutron beam source and collimation, the samples, the neutron interactions within the samples, the emission of characteristic gammas, and the detection of these gammas in a Germanium crystal. In addition, the model allows the absorbed radiation dose to be calculated for internal components of the sample. NSECT presents challenges not typically addressed in Monte Carlo modeling of high-energy physics applications. In order to address issues critical to the clinical development of NSECT, this paper will describe the GEANT4 simulation environment and three separate simulations performed to accomplish three specific aims. First, comparison of a simulation to a tomographic experiment will verify the accuracy of both the gamma energy spectra produced and the positioning of the beam relative to the sample. Second, parametric analysis of simulations performed with different user-defined variables will determine the best way to effectively model low energy neutrons in tissue, which is a concern with the high hydrogen content in

  6. Evaluation of the material assignment method used by a Monte Carlo treatment planning system.

    Science.gov (United States)

    Isambert, A; Brualla, L; Lefkopoulos, D

    2009-12-01

    An evaluation of the conversion process from Hounsfield units (HU) to material composition in computerised tomography (CT) images, employed by the Monte Carlo based treatment planning system ISOgray (DOSIsoft), is presented. A boundary in the HU for the material conversion between "air" and "lung" materials was determined based on a study using 22 patients. The dosimetric consequence of the new boundary was quantitatively evaluated for a lung patient plan.
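
    A schematic HU-to-material conversion with an adjustable air/lung boundary might look as follows (all threshold values are placeholders, not the boundary determined in the study):

    ```python
    def hu_to_material(hu, air_lung_boundary=-950):
        """Map a CT number (HU) to a material label for Monte Carlo transport."""
        if hu < air_lung_boundary:     # the boundary studied in the paper
            return "air"
        if hu < -200:
            return "lung"
        if hu < 100:
            return "soft tissue"
        return "bone"

    for hu in (-980, -600, 0, 400):
        print(hu, hu_to_material(hu))
    ```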

  7. Convergence measure and some parallel aspects of Markov-chain Monte Carlo algorithms

    Science.gov (United States)

    Malfait, Maurits J.; Roose, Dirk; Vandermeulen, Dirk

    1993-10-01

    We examine methods to assess the convergence of Markov chain Monte Carlo (MCMC) algorithms and to accelerate their execution via parallel computing. We propose a convergence measure based on the deviations between simultaneously running MCMC algorithms. We also examine the acceleration of MCMC algorithms when independent parallel samplers are used, and report on some experiments with coupled samplers. As applications we use small Ising model simulations and a larger medical image processing algorithm.
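
    One simple way to realize such a deviation-based measure (our reading of the idea, not necessarily the authors' exact statistic) is to compare running means of independently seeded chains:

    ```python
    import numpy as np

    def chain_deviation(chains):
        """Max pairwise deviation between running means of parallel chains.

        chains: array of shape (n_chains, n_samples). Small values suggest
        the simultaneously running samplers agree on the target mean.
        """
        means = np.cumsum(chains, axis=1) / np.arange(1, chains.shape[1] + 1)
        return means.max(axis=0) - means.min(axis=0)

    rng = np.random.default_rng(4)
    # Toy example: four "chains" sampling the same normal target.
    chains = rng.normal(0.0, 1.0, size=(4, 5000))
    dev = chain_deviation(chains)
    print(dev[10], dev[-1])           # deviation shrinks as the chains run
    ```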

  8. Monte Carlo simulations of fluid vesicles

    Science.gov (United States)

    Sreeja, K. K.; Ipsen, John H.; Kumar, P. B. Sunil

    2015-07-01

    Lipid vesicles are closed two dimensional fluid surfaces that are studied extensively as model systems for understanding the physical properties of biological membranes. Here we review the recent developments in the Monte Carlo techniques for simulating fluid vesicles and discuss some of their applications. The technique, which treats the membrane as an elastic sheet, is most suitable for the study of large scale conformations of membranes. The model can be used to study vesicles with fixed and varying topologies. Here we focus on the case of multi-component membranes with the local lipid and protein composition coupled to the membrane curvature leading to a variety of shapes. The phase diagram is more intriguing in the case of fluid vesicles having an in-plane orientational order that induce anisotropic directional curvatures. Methods to explore the steady state morphological structures due to active flux of materials have also been described in the context of Monte Carlo simulations.

  9. Hybrid Monte Carlo with Chaotic Mixing

    CERN Document Server

    Kadakia, Nirag

    2016-01-01

    We propose a hybrid Monte Carlo (HMC) technique applicable to high-dimensional multivariate normal distributions that effectively samples along chaotic trajectories. The method is predicated on the freedom of choice of the HMC momentum distribution, and due to its mixing properties, exhibits sample-to-sample autocorrelations that decay far faster than those in the traditional hybrid Monte Carlo algorithm. We test the methods on distributions of varying correlation structure, finding that the proposed technique produces superior covariance estimates, is less reliant on step-size tuning, and can even function with sparse or no momentum re-sampling. The method presented here is promising for more general distributions, such as those that arise in Bayesian learning of artificial neural networks and in the state and parameter estimation of dynamical systems.
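
    For reference, a bare-bones standard HMC step for a correlated Gaussian target, i.e. the baseline whose Gaussian momentum choice the chaotic-mixing variant generalizes (parameters are illustrative; the chaotic momentum distribution itself is not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    cov = np.array([[1.0, 0.8], [0.8, 1.0]])
    prec = np.linalg.inv(cov)

    def grad_u(q):
        return prec @ q                        # gradient of -log N(0, cov)

    def hmc_step(q, step=0.2, n_leap=10):
        p = rng.standard_normal(q.size)        # Gaussian momenta
        q_new, p_new = q.copy(), p.copy()
        for _ in range(n_leap):                # leapfrog integration
            p_new -= 0.5 * step * grad_u(q_new)
            q_new += step * p_new
            p_new -= 0.5 * step * grad_u(q_new)
        h_old = 0.5 * q @ prec @ q + 0.5 * p @ p
        h_new = 0.5 * q_new @ prec @ q_new + 0.5 * p_new @ p_new
        return q_new if rng.random() < np.exp(h_old - h_new) else q

    q, samples = np.zeros(2), []
    for _ in range(2000):
        q = hmc_step(q)
        samples.append(q)
    print(np.cov(np.array(samples).T))         # should approach cov
    ```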

  10. Monte Carlo Simulation for Particle Detectors

    CERN Document Server

    Pia, Maria Grazia

    2012-01-01

    Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...

  11. PHOTOS Monte Carlo and its theoretical accuracy

    CERN Document Server

    Was, Z; Nanava, G

    2008-01-01

    Because of the properties of QED, the bremsstrahlung corrections to decays of particles or resonances can be calculated, with good precision, separately from other effects. Thanks to the widespread use of event records, such calculations can be embodied in a separate module of the Monte Carlo simulation chains used in today's High Energy experiments. The PHOTOS Monte Carlo program has been used for this purpose for nearly 20 years now. In the following talk we review the main ideas and constraints which shaped the present version of the program and enabled its widespread use. Finally, we underline the importance of aspects related to the reliability of program results: event record contents and the implementation of channel-specific matrix elements.

  12. Composite biasing in Monte Carlo radiative transfer

    CERN Document Server

    Baes, Maarten; Lunttila, Tuomas; Bianchi, Simone; Camps, Peter; Juvela, Mika; Kuiper, Rolf

    2016-01-01

    Biasing or importance sampling is a powerful technique in Monte Carlo radiative transfer, and can be applied in different forms to increase the accuracy and efficiency of simulations. One of the drawbacks of the use of biasing is the potential introduction of large weight factors. We discuss a general strategy, composite biasing, to suppress the appearance of large weight factors. We use this composite biasing approach for two different problems faced by current state-of-the-art Monte Carlo radiative transfer codes: the generation of photon packages from multiple components, and the penetration of radiation through high optical depth barriers. In both cases, the implementation of the relevant algorithms is trivial and does not interfere with any other optimisation techniques. Through simple test models, we demonstrate the general applicability, accuracy and efficiency of the composite biasing approach. In particular, for the penetration of high optical depths, the gain in efficiency is spectacular for the spe...
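
    The weight bookkeeping behind such biasing can be shown with a two-component toy (our simplified sketch; real radiative transfer codes combine many components and path-length biasing): drawing from a mixture of the physical distribution and a uniform one bounds the weights by 1/(1 - beta).

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    p = np.array([0.99, 0.01])            # physical emission probabilities
    beta = 0.5                            # mixing parameter of the composite bias
    q = (1.0 - beta) * p + beta / p.size  # mixture of physical and uniform

    n = 100_000
    comp = rng.choice(p.size, size=n, p=q)
    weight = p[comp] / q[comp]            # importance weights

    for c in range(p.size):               # weighted estimator recovers p[c]
        print(c, weight[comp == c].sum() / n)
    print(weight.max(), 1.0 / (1.0 - beta))   # weights stay below the bound
    ```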

  13. No-compromise reptation quantum Monte Carlo

    International Nuclear Information System (INIS)

    Since its publication, the reptation quantum Monte Carlo algorithm of Baroni and Moroni (1999 Phys. Rev. Lett. 82 4745) has been applied to several important problems in physics, but its mathematical foundations are not well understood. We show that their algorithm is not of typical Metropolis-Hastings type, and we specify conditions required for the generated Markov chain to be stationary and to converge to the intended distribution. The time-step bias may add up, and in many applications only the middle of the reptile is important. Therefore, we propose an alternative, 'no-compromise reptation quantum Monte Carlo', to stabilize the middle of the reptile. (fast track communication)

  14. EU Commissioner Carlos Moedas visits SESAME

    CERN Multimedia

    CERN Bulletin

    2015-01-01

    The European Commissioner for research, science and innovation, Carlos Moedas, visited the SESAME laboratory in Jordan on Monday 13 April. When it begins operation in 2016, SESAME, a synchrotron light source, will be the Middle East’s first major international science centre, carrying out experiments ranging from the physical sciences to environmental science and archaeology.   CERN Director-General Rolf Heuer (left) and European Commissioner Carlos Moedas with the model SESAME magnet. © European Union, 2015.   Commissioner Moedas was accompanied by a European Commission delegation led by Robert-Jan Smits, Director-General of DG Research and Innovation, as well as Rolf Heuer, CERN Director-General, Jean-Pierre Koutchouk, coordinator of the CERN-EC Support for SESAME Magnets (CESSAMag) project and Princess Sumaya bint El Hassan of Jordan, a leading advocate of science in the region. They toured the SESAME facility together with SESAME Director, Khaled Tou...

  15. Monte Carlo Shell Model Mass Predictions

    International Nuclear Information System (INIS)

    The nuclear mass calculation is discussed in terms of large-scale shell model calculations. First, the development and limitations of the conventional shell model calculations are mentioned. In order to overcome the limitations, the Quantum Monte Carlo Diagonalization (QMCD) method has been proposed. The basic formulation and features of the QMCD method are presented as well as its application to the nuclear shell model, referred to as Monte Carlo Shell Model (MCSM). The MCSM provides us with a breakthrough in shell model calculations: the structure of low-lying states can be studied with realistic interactions for a nearly unlimited variety of nuclei. Thus, the MCSM can contribute significantly to the study of nuclear masses. An application to N∼20 unstable nuclei far from the β-stability line is mentioned

  16. Status of Monte Carlo at Los Alamos

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, W.L.; Cashwell, E.D.; Godfrey, T.N.K.; Schrandt, R.G.; Deutsch, O.L.; Booth, T.E.

    1980-05-01

    Four papers were presented by Group X-6 on April 22, 1980, at the Oak Ridge Radiation Shielding Information Center (RSIC) Seminar-Workshop on Theory and Applications of Monte Carlo Methods. These papers are combined into one report for convenience and because they are related to each other. The first paper (by Thompson and Cashwell) is a general survey about X-6 and MCNP and is an introduction to the other three papers. It can also serve as a resume of X-6. The second paper (by Godfrey) explains some of the details of geometry specification in MCNP. The third paper (by Cashwell and Schrandt) illustrates calculating flux at a point with MCNP; in particular, the once-more-collided flux estimator is demonstrated. Finally, the fourth paper (by Thompson, Deutsch, and Booth) is a tutorial on some variance-reduction techniques. It should be required reading for a fledgling Monte Carlo practitioner.

  17. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.

    2015-10-01

    In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.
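
    The telescoping-sum bookkeeping at the core of MLMC can be sketched generically (a toy level function stands in for the RVE or coarse-grid computations; in a real implementation the fine and coarse samples at each level share the same random input):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def sample_level(level, size):
        """Toy stand-in for a level-l computation (e.g. an RVE of a given size):
        approximations get more accurate, and more expensive, as level grows."""
        err = 0.5 ** (level + 1)
        return 1.0 + err + err * rng.standard_normal(size)

    def mlmc(n_per_level):
        """Telescoping estimate: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]."""
        estimate = sample_level(0, n_per_level[0]).mean()
        for level, n in enumerate(n_per_level[1:], start=1):
            estimate += (sample_level(level, n) - sample_level(level - 1, n)).mean()
        return estimate

    # Many cheap coarse samples, few expensive fine ones.
    print(mlmc([10_000, 1_000, 100]))     # close to the level-2 mean 1.125
    ```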

  18. Status of Monte Carlo at Los Alamos

    International Nuclear Information System (INIS)

    At Los Alamos the early work of Fermi, von Neumann, and Ulam has been developed and supplemented by many followers, notably Cashwell and Everett, and the main product today is the continuous-energy, general-purpose, generalized-geometry, time-dependent, coupled neutron-photon transport code called MCNP. The Los Alamos Monte Carlo research and development effort is concentrated in Group X-6. MCNP treats an arbitrary three-dimensional configuration of arbitrary materials in geometric cells bounded by first- and second-degree surfaces and some fourth-degree surfaces (elliptical tori). Monte Carlo has evolved into perhaps the main method for radiation transport calculations at Los Alamos. MCNP is used in every technical division at the Laboratory by over 130 users about 600 times a month accounting for nearly 200 hours of CDC-7600 time

  19. Monte Carlo study of real time dynamics

    CERN Document Server

    Alexandru, Andrei; Bedaque, Paulo F; Vartak, Sohan; Warrington, Neill C

    2016-01-01

    Monte Carlo studies involving real time dynamics are severely restricted by the sign problem that emerges from the highly oscillatory phase of the path integral. In this letter, we present a new method to compute real time quantities on the lattice using the Schwinger-Keldysh formalism via Monte Carlo simulations. The key idea is to deform the path integration domain to a complex manifold where the phase oscillations are mild and the sign problem is manageable. We use the previously introduced "contraction algorithm" to create a Markov chain on this alternative manifold. We substantiate our approach by analyzing the quantum mechanical anharmonic oscillator. Our results are in agreement with the exact ones obtained by diagonalization of the Hamiltonian. The method we introduce is generic and in principle applicable to quantum field theory, albeit very slow. We discuss some possible improvements that should speed up the algorithm.

  20. The Lund Monte Carlo for jet fragmentation

    International Nuclear Information System (INIS)

    We present a Monte Carlo program based on the Lund model for jet fragmentation. Quark, gluon, diquark and hadron jets are considered. Special emphasis is put on the fragmentation of colour singlet jet systems, for which energy, momentum and flavour are conserved explicitly. The model for decays of unstable particles, in particular the weak decay of heavy hadrons, is described. The central part of the paper is a detailed description on how to use the FORTRAN 77 program. (Author)

  1. Autocorrelations in hybrid Monte Carlo simulations

    International Nuclear Information System (INIS)

    Simulations of QCD suffer from severe critical slowing down towards the continuum limit. This problem is known to be prominent in the topological charge, however, all observables are affected to various degree by these slow modes in the Monte Carlo evolution. We investigate the slowing down in high statistics simulations and propose a new error analysis method, which gives a realistic estimate of the contribution of the slow modes to the errors. (orig.)
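
    A standard way to quantify the contribution of slow modes is the integrated autocorrelation time; below is a textbook estimator with a fixed summation window (our sketch, not the new error analysis method proposed in the paper):

    ```python
    import numpy as np

    def tau_int(series, window=100):
        """Integrated autocorrelation time of a Monte Carlo time series."""
        x = np.asarray(series) - np.mean(series)
        n = x.size
        acf = np.array([x[:n - t] @ x[t:] / (n - t) for t in range(window)])
        acf /= acf[0]                    # normalized autocorrelation function
        return 0.5 + acf[1:].sum()

    # Toy AR(1) chain with known tau_int = 0.5 + rho / (1 - rho).
    rng = np.random.default_rng(8)
    rho, n = 0.9, 200_000
    x = np.empty(n)
    x[0] = 0.0
    for i in range(1, n):
        x[i] = rho * x[i - 1] + rng.standard_normal()
    print(tau_int(x), 0.5 + rho / (1.0 - rho))   # estimate vs exact 9.5
    ```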

  2. Simulated Annealing using Hybrid Monte Carlo

    OpenAIRE

    Salazar, Rafael; Toral, Raúl

    1997-01-01

    We propose a variant of the simulated annealing method for optimization in the multivariate analysis of differentiable functions. The method uses global updates via the hybrid Monte Carlo algorithm, in its generalized version, for the proposal of new configurations. We show how this choice can improve upon the performance of simulated annealing methods (mainly when the number of variables is large) by allowing a more effective searching scheme and a faster annealing schedule.

  3. A Monte Carlo for BFKL Physics

    OpenAIRE

    Orr, Lynne H.; Stirling, W. J.

    2000-01-01

    Virtual photon scattering in e^+e^- collisions can result in events with the electron-positron pair at large rapidity separation with hadronic activity in between. The BFKL equation resums large logarithms that dominate the cross section for this process. We report here on a Monte Carlo method for solving the BFKL equation that allows kinematic constraints to be taken into account. The application to e^+e^- collisions is in progress.

  4. Monte Carlo Simulations of Star Clusters

    CERN Document Server

    Giersz, M

    2000-01-01

    A revision of Stodółkiewicz's Monte Carlo code is used to simulate the evolution of large star clusters. The survey on the evolution of multi-mass N-body systems influenced by the tidal field of a parent galaxy and by stellar evolution is discussed. For the first time, a simulation on a "star-by-star" basis of the evolution of a 1,000,000-body star cluster is presented.

  5. A Ballistic Monte Carlo Approximation of π

    CERN Document Server

    Dumoulin, Vincent

    2014-01-01

    We compute a Monte Carlo approximation of π using importance sampling with shots coming out of a Mossberg 500 pump-action shotgun as the proposal distribution. An approximate value of 3.136 is obtained, corresponding to a 0.17% error on the exact value of π. To our knowledge, this represents the first attempt at estimating π using such a method, thus opening up new perspectives towards computing mathematical constants using everyday tools.
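
    For comparison, the conventional (firearm-free) dartboard estimator of π takes a few lines:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n = 1_000_000
    xy = rng.random((n, 2))                      # uniform points in the unit square
    inside = (xy ** 2).sum(axis=1) < 1.0         # quarter-circle hit test
    pi_hat = 4.0 * inside.mean()
    print(pi_hat, abs(pi_hat - np.pi) / np.pi)   # estimate and relative error
    ```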

  6. Lookahead Strategies for Sequential Monte Carlo

    OpenAIRE

    Lin, Ming; Chen, Rong; Liu, Jun

    2013-01-01

    Based on the principles of importance sampling and resampling, sequential Monte Carlo (SMC) encompasses a large set of powerful techniques dealing with complex stochastic dynamic systems. Many of these systems possess strong memory, with which future information can help sharpen the inference about the current state. By providing theoretical justification of several existing algorithms and introducing several new ones, we study systematically how to construct efficient SMC algorithms to take ...

  7. Handbook of Markov chain Monte Carlo

    CERN Document Server

    Brooks, Steve

    2011-01-01

    ""Handbook of Markov Chain Monte Carlo"" brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.

  8. Monte Carlo methods for preference learning

    DEFF Research Database (Denmark)

    Viappiani, P.

    2012-01-01

    Utility elicitation is an important component of many applications, such as decision support systems and recommender systems. Such systems query the users about their preferences and give recommendations based on the system's belief about the utility function. Critical to these applications is the acquisition of a prior distribution about the utility parameters and the possibility of real-time Bayesian inference. In this paper we consider Monte Carlo methods for these problems.

  9. The Moment Guided Monte Carlo Method

    OpenAIRE

    Degond, Pierre; Dimarco, Giacomo; Pareschi, Lorenzo

    2009-01-01

    In this work we propose a new approach for the numerical simulation of kinetic equations through Monte Carlo schemes. We introduce a new technique which permits the variance of particle methods to be reduced through a matching with a set of suitable macroscopic moment equations. In order to guarantee that the moment equations provide the correct solutions, they are coupled to the kinetic equation through a non-equilibrium term. The basic idea, on which the method relies, consists in guiding the p...

  10. Quantum Monte Carlo for vibrating molecules

    International Nuclear Information System (INIS)

    Quantum Monte Carlo (QMC) has successfully computed the total electronic energies of atoms and molecules. The main goal of this work is to use correlation function quantum Monte Carlo (CFQMC) to compute the vibrational state energies of molecules given a potential energy surface (PES). In CFQMC, an ensemble of random walkers simulates the diffusion and branching processes of the imaginary-time time-dependent Schroedinger equation in order to evaluate the matrix elements. The program QMCVIB was written to perform multi-state VMC and CFQMC calculations and was employed for several calculations of the H2O and C3 vibrational states, using 7 PESs, 3 trial wavefunction forms, two methods of non-linear basis function parameter optimization, and both serial and parallel computers. Different trial wavefunction forms were required to achieve accuracy for H2O and C3. For C3, the non-linear parameters were optimized with respect to the sum of the energies of several low-lying vibrational states, and the Monte Carlo data were collected into blocks in order to stabilize the statistical error estimates. Accurate vibrational state energies were computed using both the serial and parallel QMCVIB programs. Comparison of the vibrational state energies computed from the three C3 PESs suggests that a non-linear equilibrium geometry PES is the most accurate and that discrete potential representations may be used to conveniently determine vibrational state energies.

  11. Carlos Castillo-Chavez: a century ahead.

    Science.gov (United States)

    Schatz, James

    2013-01-01

    When the opportunity to contribute a short essay about Dr. Carlos Castillo-Chavez presented itself in the context of this wonderful birthday celebration, my immediate reaction was por supuesto que sí! Sixteen years ago, I travelled to Cornell University with my colleague at the National Security Agency (NSA) Barbara Deuink to meet Carlos and hear about his vision to expand the talent pool of mathematicians in our country. Our motivation was very simple. First of all, the Agency relies heavily on mathematicians to carry out its mission. If the U.S. mathematics community is not healthy, NSA is not healthy. Keeping our country safe requires a team of the sharpest minds in the nation to tackle amazing intellectual challenges on a daily basis. Second, the Agency cares deeply about diversity. Within the mathematical sciences, students with advanced degrees from the Chicano, Latino, Native American, and African-American communities are underrepresented. It was clear that addressing this issue would require visionary leadership and a long-term commitment. Carlos had the vision for a program that would provide promising undergraduates from minority communities with an opportunity to gain confidence and expertise through meaningful research experiences while sharing in the excitement of mathematical and scientific discovery. His commitment to the venture was unquestionable, and that commitment has not wavered since the inception of the Mathematics and Theoretical Biology Institute (MTBI) in 1996.

  12. Use of MOSFET dosimeters to validate Monte Carlo radiation treatment calculation in an anthropomorphic phantom

    Science.gov (United States)

    Juste, Belén; Miró, R.; Abella, V.; Santos, A.; Verdú, Gumersindo

    2015-11-01

    Radiation therapy treatment planning based on Monte Carlo simulation provides very accurate dose calculations compared to deterministic systems. Nowadays, Metal-Oxide-Semiconductor Field Effect Transistor (MOSFET) dosimeters are increasingly utilized in radiation therapy to verify the dose received by patients. In the present work, we have used MCNP6 (the Monte Carlo N-Particle transport code) to simulate the irradiation of an anthropomorphic phantom (RANDO) with a medical linear accelerator. The detailed model of the Elekta Precise multileaf collimator using a 6 MeV photon beam was designed and validated by means of different beam sizes and shapes in previous works. To include the RANDO phantom geometry in the simulation, a set of computed tomography images of the phantom was obtained and formatted. The slices are input into the PLUNC software, which performs the segmentation by defining anatomical structures, and a Matlab algorithm writes the phantom information in MCNP6 input deck format. The simulation was verified, and the phantom model and irradiation were thereby validated, through the comparison of High-Sensitivity MOSFET dosimeter (Best Medical Canada) measurements at different points inside the phantom with simulation results. On-line wireless MOSFETs provide dose estimation in an extremely thin sensitive volume, so a meticulous and accurate validation has been performed. The comparison shows good agreement between the MOSFET measurements and the Monte Carlo calculations, confirming the validity of the developed procedure for including patient CT data in simulations and supporting the use of Monte Carlo simulation as an accurate therapy treatment planning tool.

  13. Cost effective distributed computing for Monte Carlo radiation dosimetry

    International Nuclear Information System (INIS)

    Full text: An inexpensive computing facility has been established for performing repetitive Monte Carlo simulations with the BEAM and EGS4/EGSnrc codes of linear accelerator beams, for calculating effective dose from diagnostic imaging procedures, and of ion chambers and phantoms used for the Australian high energy absorbed dose standards. The facility currently consists of 3 dual-processor 450 MHz PCs linked by a high speed LAN. The 3 PCs can be accessed either locally from a single keyboard/monitor/mouse combination using a SwitchView controller or remotely via a computer network from PCs with suitable communications software (e.g. Telnet, Kermit etc). All 3 PCs are identically configured with the Red Hat Linux 6.0 operating system. A Fortran compiler and the BEAM and EGS4/EGSnrc codes are available on the 3 PCs. The preparation of sequences of jobs utilising the Monte Carlo codes is simplified using load-distributing software (enFuzion 6.0, marketed by TurboLinux Inc, formerly Cluster from Active Tools) which efficiently distributes the computing load amongst all 6 processors. We describe 3 applications of the system: (a) energy spectra from radiotherapy sources, (b) mean mass-energy absorption coefficients and stopping powers for absolute absorbed dose standards, and (c) dosimetry for diagnostic procedures; (a) and (b) are based on the transport codes BEAM and FLURZnrc while (c) is a Fortran/EGS code developed at ARPANSA. Efficiency gains ranged from 3 for (c) to close to the theoretical maximum of 6 for (a) and (b), with the gain depending on the amount of 'bookkeeping' to begin each task and the time taken to complete a single task. We have found the use of a load-balancing batch processing system with many PCs to be an economical way of achieving greater productivity for Monte Carlo calculations or for any computer-intensive task requiring many runs with different parameters. Copyright (2000) Australasian College of Physical Scientists and

  14. Monte Carlo simulation for dual head gamma camera

    International Nuclear Information System (INIS)

    The Monte Carlo (MC) simulation technique is widely used in medical physics applications. In nuclear medicine, MC has been used to design new medical imaging devices such as positron emission tomography (PET), gamma cameras and single photon emission computed tomography (SPECT); it can also be used to study the factors affecting image quality and internal dosimetry. GATE is one of the Monte Carlo codes with a number of advantages for the simulation of SPECT and PET. Access to the machines used in clinics is limited because of their workload, which makes it hard to evaluate factors affecting machine performance that must be assessed routinely; together with the difficulties of carrying out scientific research and of training students on clinical machines, this makes an MC model an attractive solution. The aim of this study was to use the GATE Monte Carlo code to model the Nucline Spirit (Mediso) dual head gamma camera hosted in the Radiation and Isotopes Centre of Khartoum, which is equipped with low energy general purpose (LEGP) collimators. The model was used to evaluate spatial resolution and sensitivity, important factors affecting image quality, and to demonstrate the validity of GATE by comparing experimental results with simulation results on spatial resolution. The GATE model of the gamma camera was developed by applying the manufacturer's specifications, and the simulation was then run. In the evaluation of spatial resolution, the FWHM was calculated from the image profile of a line source of the 140 keV gamma emitter {sup 99m}Tc at distances of 5, 10, 15, 20, 22, 27, 32 and 37 cm from the modelled camera head; the spatial resolution was found to be 5.76, 7.73, 10.7, 13.8, 14.01, 16.91, 19.75 and 21.9 mm, respectively. These results show a linear degradation of spatial resolution with increasing distance between the object (line source) and the collimator. The FWHM calculated at 10 cm was compared with experimental results. The
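
    The FWHM extraction behind such spatial-resolution figures can be sketched as a Gaussian fit to a line-source image profile (the profile below is synthetic; the clinical data are not reproduced):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(x, a, mu, sigma):
        return a * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

    # Synthetic line-source profile: position (mm) vs detected counts.
    x = np.linspace(-30.0, 30.0, 121)
    rng = np.random.default_rng(10)
    counts = gaussian(x, 1000.0, 0.0, 3.3) + rng.poisson(5.0, x.size)

    (a, mu, sigma), _ = curve_fit(gaussian, x, counts, p0=(counts.max(), 0.0, 5.0))
    fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(sigma)
    print(f"FWHM = {fwhm:.2f} mm")   # ~7.8 mm, comparable to the 10 cm result
    ```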

  15. Monte-Carlo simulation for determining SNR and DQE of linear array plastic scintillating fiber

    Institute of Scientific and Technical Information of China (English)

    Mohammad Mehdi NASSERI; MA Qing-Li; YIN Ze-Jie; WU Xiao-Yi

    2004-01-01

    Fundamental characteristics of the plastic scintillating fiber (PSF) over a wide energy range of electromagnetic radiation (X and γ) have been studied to evaluate the possibility of using the PSF as an imaging detector for industrial purposes. The Monte Carlo simulation toolkit GEANT4 (version 4.5.1, 2003) was used to generate the data. In order to evaluate the image quality of the detector, the fiber array was irradiated at various energies and fluxes. The signal-to-noise ratio (SNR) as well as the detective quantum efficiency (DQE) were obtained.
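
    For reference, the textbook definitions behind these figures of merit are easy to compute from repeated detector readings. A small Python sketch using the standard definitions (mean over standard deviation for SNR, and DQE as the squared ratio of output to input SNR), which may differ in detail from the estimators used in the paper:

        import numpy as np

        def snr(readings):
            """Signal-to-noise ratio of repeated readings of a uniform exposure."""
            readings = np.asarray(readings, dtype=float)
            return readings.mean() / readings.std(ddof=1)

        def dqe(snr_out, snr_in):
            """Detective quantum efficiency: the fraction of input SNR^2 preserved."""
            return (snr_out / snr_in) ** 2

        # For an ideal Poisson input of N photons per pixel, SNR_in = sqrt(N).
        rng = np.random.default_rng(0)
        counts = rng.poisson(1000, size=10_000)    # a hypothetical perfect counter
        print(dqe(snr(counts), np.sqrt(1000)))     # ~1.0 for this ideal detector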

  16. Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method

    CERN Document Server

    2002-01-01

    This report condenses basic theories and advanced applications of neutron/gamma ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and the cross-section libraries used in continuous energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, design of ITER, experiment analyses of fast critical assembly, core analyses of JMTR, simulation of pulsed neutron experiment, core analyses of HTTR, duct streaming calculations, bulk shielding calculations, neutron/gamma ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes, and the parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.

  17. Fission Matrix Capability for MCNP Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Carney, Sean E. [Los Alamos National Laboratory; Brown, Forrest B. [Los Alamos National Laboratory; Kiedrowski, Brian C. [Los Alamos National Laboratory; Martin, William R. [Los Alamos National Laboratory

    2012-09-05

    In a Monte Carlo criticality calculation, before the tallying of quantities can begin, a converged fission source (the fundamental eigenvector of the fission kernel) is required. Tallies of interest may include powers, absorption rates, leakage rates, or the multiplication factor (the fundamental eigenvalue of the fission kernel, k{sub eff}). Just as in the power iteration method of linear algebra, if the dominance ratio (the ratio of the first and zeroth eigenvalues) is high, many iterations of neutron history simulations are required to isolate the fundamental mode of the problem. Optically large systems have large dominance ratios, and systems with poor neutron communication between regions are also slow to converge. The fission matrix method, implemented in MCNP [1], addresses these problems. When the Monte Carlo random walk from a source is executed, the fission kernel is stochastically applied to the source. Random numbers are used for distances to collision, reaction types, scattering physics, fission reactions, etc. This method is used because the fission kernel is a complex, 7-dimensional operator that is not explicitly known. Deterministic methods apply approximations and discretization in energy, space, and direction to the kernel; consequently, they are faster. Monte Carlo directly simulates the physics, which necessitates the use of random sampling; because of this statistical noise, the convergence acceleration methods common in deterministic methods do not work. In the fission matrix method, we use the random walk information not only to build the next-iteration fission source, but also a spatially-averaged fission kernel. Just as in deterministic methods, this involves approximation and discretization. The approximation is the tallying of the spatially-discretized fission kernel with an incorrect fission source. We address this by making the spatial mesh fine enough that this error is negligible. As a consequence of discretization we get a
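
    Once a spatially discretized fission matrix F has been tallied, the fundamental eigenpair can be extracted deterministically. A minimal Python sketch of the power iteration on such a matrix (F itself is assumed to come from the tallies described above; this is not the MCNP implementation):

        import numpy as np

        def fundamental_mode(F, tol=1e-10, max_iter=10_000):
            """Power iteration on a spatially discretized fission matrix F.
            Returns (k, s): the fundamental eigenvalue and fission source shape."""
            s = np.full(F.shape[0], 1.0 / F.shape[0])
            for _ in range(max_iter):
                s_new = F @ s
                k = s_new.sum()          # next-generation / current source ratio
                s_new /= k               # renormalize the source to sum to 1
                if np.abs(s_new - s).max() < tol:
                    break
                s = s_new
            return k, s_new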

  18. Quantum Monte Carlo for vibrating molecules

    Energy Technology Data Exchange (ETDEWEB)

    Brown, W.R. [Univ. of California, Berkeley, CA (United States). Chemistry Dept.]|[Lawrence Berkeley National Lab., CA (United States). Chemical Sciences Div.

    1996-08-01

    Quantum Monte Carlo (QMC) has successfully computed the total electronic energies of atoms and molecules. The main goal of this work is to use correlation function quantum Monte Carlo (CFQMC) to compute the vibrational state energies of molecules given a potential energy surface (PES). In CFQMC, an ensemble of random walkers simulates the diffusion and branching processes of the imaginary-time time-dependent Schroedinger equation in order to evaluate the matrix elements. The program QMCVIB was written to perform multi-state VMC and CFQMC calculations and was employed for several calculations of the H{sub 2}O and C{sub 3} vibrational states, using 7 PESs, 3 trial wavefunction forms, and two methods of non-linear basis function parameter optimization, on both serial and parallel computers. Different wavefunction forms were required to construct accurate trial wavefunctions for H{sub 2}O and C{sub 3}; for C{sub 3}, the non-linear parameters were optimized with respect to the sum of the energies of several low-lying vibrational states. To stabilize the statistical error estimates for C{sub 3}, the Monte Carlo data were collected into blocks. Accurate vibrational state energies were computed using both serial and parallel QMCVIB programs. Comparison of vibrational state energies computed from the three C{sub 3} PESs suggested that a non-linear equilibrium geometry PES is the most accurate and that discrete potential representations may be used to conveniently determine vibrational state energies.
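
    Blocking is a standard remedy for serially correlated Monte Carlo data: averages over sufficiently long blocks are nearly independent, so the naive error formula applies to the block means. A generic Python sketch of the idea (not the QMCVIB implementation):

        import numpy as np

        def blocked_error(samples, block_size):
            """Standard error of the mean of serially correlated MC data,
            estimated from the scatter of block averages."""
            samples = np.asarray(samples, dtype=float)
            n_blocks = samples.size // block_size
            means = samples[:n_blocks * block_size].reshape(
                n_blocks, block_size).mean(axis=1)
            return means.std(ddof=1) / np.sqrt(n_blocks)

    Increasing block_size until the error estimate plateaus gives the decorrelated error bar.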

  19. A Monte Carlo approach to water management

    Science.gov (United States)

    Koutsoyiannis, D.

    2012-04-01

    Common methods for making optimal decisions in water management problems are insufficient. Linear programming methods are inappropriate because hydrosystems are nonlinear with respect to their dynamics, operation constraints and objectives. Dynamic programming methods are inappropriate because water management problems cannot be divided into sequential stages. Also, these deterministic methods cannot properly deal with the uncertainty of future conditions (inflows, demands, etc.). Even stochastic extensions of these methods (e.g. linear-quadratic-Gaussian control) necessitate such drastic oversimplifications of hydrosystems that the obtained results may be irrelevant to the real-world problems. However, a Monte Carlo approach is feasible and can form a general methodology applicable to any type of hydrosystem. This methodology uses stochastic simulation to generate system inputs, either unconditional or conditioned on a prediction, if available, and represents the operation of the entire system through a simulation model as faithful as possible, without demanding a specific mathematical form that would imply oversimplifications. Such a representation fully respects the physical constraints, while at the same time it evaluates the system operation constraints and objectives in probabilistic terms, and derives their distribution functions and statistics through Monte Carlo simulation. As the performance criteria of a hydrosystem operation will generally be highly nonlinear and highly nonconvex functions of the control variables, a second Monte Carlo procedure, implementing stochastic optimization, is necessary to optimize system performance and evaluate the control variables of the system. The latter is facilitated if the entire representation is parsimonious, i.e. if the number of control variables is kept to a minimum by a suitable system parameterization. The approach is illustrated through three examples for (a) a hypothetical system of two reservoirs
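
    The core of the approach is easy to state in code: generate many synthetic input sequences, push each through a faithful simulation of the system, and read performance criteria off the resulting distributions. A deliberately tiny Python sketch for a single reservoir; the log-normal inflow model and all numbers are illustrative assumptions, not from the paper:

        import numpy as np

        rng = np.random.default_rng(1)

        def reliability(capacity, release, n_years=100_000):
            """Fraction of years in which the full demand 'release' is met,
            estimated by stochastic simulation of a single reservoir."""
            storage, met = capacity / 2.0, 0
            for inflow in rng.lognormal(mean=3.0, sigma=0.5, size=n_years):
                storage = min(storage + inflow, capacity)   # spill above capacity
                supplied = min(release, storage)
                storage -= supplied
                met += supplied >= release
            return met / n_years

        print(reliability(capacity=100.0, release=18.0))

    A second, outer Monte Carlo loop over the control variables (here, the release policy) would then implement the stochastic optimization step.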

  20. Modulated pulse bathymetric lidar Monte Carlo simulation

    Science.gov (United States)

    Luo, Tao; Wang, Yabo; Wang, Rong; Du, Peng; Min, Xia

    2015-10-01

    A typical modulated pulse bathymetric lidar system is investigated using a modulated pulse lidar simulation system. In the simulation, the return signal is generated by the Monte Carlo method with a modulated pulse propagation model and processed by mathematical tools such as cross-correlation and digital filtering. Computer simulation results incorporating the modulation detection scheme reveal a significant suppression of the water backscattering signal and a corresponding target contrast enhancement. Further simulation experiments are performed with various modulation and reception variables to investigate their effect on bathymetric system performance.

  1. Monte Carlo Simulation of an American Option

    Directory of Open Access Journals (Sweden)

    Gikiri Thuo

    2007-04-01

    We implement gradient estimation techniques for sensitivity analysis of option pricing which can be efficiently employed in Monte Carlo simulation. Using these techniques we can simultaneously obtain an estimate of the option value together with the estimates of sensitivities of the option value to various parameters of the model. After deriving the gradient estimates we incorporate them in an iterative stochastic approximation algorithm for pricing an option with early exercise features. We illustrate the procedure using an example of an American call option with a single dividend that is analytically tractable. In particular we incorporate estimates for the gradient with respect to the early exercise threshold level.
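
    The building block beneath the gradient and early-exercise machinery is the plain Monte Carlo pricer. For orientation, a minimal Python sketch for a European call under Black-Scholes dynamics (the American-option features and dividend handling of the paper are deliberately omitted):

        import numpy as np

        def mc_call_price(S0, K, r, sigma, T, n_paths=100_000, seed=0):
            """Plain Monte Carlo price of a European call under Black-Scholes
            dynamics: simulate terminal prices, average discounted payoffs."""
            rng = np.random.default_rng(seed)
            z = rng.standard_normal(n_paths)
            ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
            return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

        print(mc_call_price(S0=100, K=100, r=0.05, sigma=0.2, T=1.0))  # ~10.45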

  2. Discovering correlated fermions using quantum Monte Carlo.

    Science.gov (United States)

    Wagner, Lucas K; Ceperley, David M

    2016-09-01

    It has become increasingly feasible to use quantum Monte Carlo (QMC) methods to study correlated fermion systems for realistic Hamiltonians. We give a summary of these techniques targeted at researchers in the field of correlated electrons, focusing on the fundamentals, capabilities, and current status of this technique. The QMC methods often offer the highest accuracy solutions available for systems in the continuum, and, since they address the many-body problem directly, the simulations can be analyzed to obtain insight into the nature of correlated quantum behavior. PMID:27518859

  3. A Monte Carlo algorithm for degenerate plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Turrell, A.E., E-mail: a.turrell09@imperial.ac.uk; Sherlock, M.; Rose, S.J.

    2013-09-15

    A procedure for performing Monte Carlo calculations of plasmas with an arbitrary level of degeneracy is outlined. It has possible applications in inertial confinement fusion and astrophysics. Degenerate particles are initialised according to the Fermi–Dirac distribution function, and scattering is via a Pauli blocked binary collision approximation. The algorithm is tested against degenerate electron–ion equilibration, and the degenerate resistivity transport coefficient from unmagnetised first order transport theory. The code is applied to the cold fuel shell and alpha particle equilibration problem of inertial confinement fusion.

  4. Monte Carlo method in radiation transport problems

    International Nuclear Information System (INIS)

    In neutral radiation transport problems (neutrons, photons), two quantities are important: the flux in phase space and the density of particles. Solving the problem with the Monte Carlo method involves, among other things, building a statistical process (called the play) and assigning a numerical value to a variable x (this assignment is called the score). Sampling techniques are presented, and the necessity of biasing the play is proved. A biased simulation is carried out. Finally, current developments (the rewriting of programs, for instance) are presented, motivated by several factors, two of them being the advent of vector computation and photon and neutron transport in media containing voids.

  5. Introduction to Monte-Carlo method

    International Nuclear Information System (INIS)

    We first recall some well-known facts about random variables and sampling. Then we define the Monte-Carlo method in the case where one wants to compute a given integral. Afterwards, we turn to discrete Markov chains, for which we define random walks, and apply them to finite difference approximations of diffusion equations. Finally we consider Markov chains with continuous state (but discrete time), transition probabilities and random walks, which are the main piece of this work. The applications are: diffusion and advection equations, and the linear transport equation with scattering.
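
    The integral-computation case mentioned first is the canonical entry point: draw uniform points, average the integrand, and attach a statistical error that shrinks as 1/sqrt(n). A self-contained Python sketch of the sample-mean estimator:

        import numpy as np

        def mc_integral(f, a, b, n=100_000, seed=0):
            """Sample-mean Monte Carlo estimate of the integral of f over [a, b],
            with a 1-sigma statistical error bar."""
            rng = np.random.default_rng(seed)
            fx = f(rng.uniform(a, b, n))
            est = (b - a) * fx.mean()
            err = (b - a) * fx.std(ddof=1) / np.sqrt(n)
            return est, err

        print(mc_integral(np.sin, 0.0, np.pi))   # exact value: 2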

  6. IN MEMORIAM CARLOS RESTREPO. A TRUE TEACHER

    OpenAIRE

    Pelayo Correa

    2009-01-01

    Carlos Restrepo was the first professor of Pathology and an illustrious member of the group of pioneers who founded the Faculty of Medicine of the Universidad del Valle. These pioneers converged on Cali in the 1950s, possessed of a renewing and creative spirit that undertook, with great success, the task of changing the academic culture of the Valle del Cauca. They found a peaceful society that enjoyed the generosity of its surroundings, with no desire to break with centuries-old traditions ...

  7. Hybrid Monte Carlo simulation of polymer chains

    CERN Document Server

    Irbäck, A

    1993-01-01

    We develop the hybrid Monte Carlo method for simulations of single off-lattice polymer chains. We discuss implementation and choice of simulation parameters in some detail. The performance of the algorithm is tested on models for homopolymers with short- or long-range self-repulsion, using chains with $16\\le N\\le 512$ monomers. Without excessive fine tuning, we find that the computational cost grows as $N^{2+z^\\prime}$ with $0.64

  8. Carlos Pereda and the culture of argumentation

    OpenAIRE

    Eduardo Harada O.

    2010-01-01

    This article discusses Carlos Pereda's phenomenology of argumentative attention. It aims to show that this phenomenology takes into account all aspects of argumentation, chiefly the epistemic rules and virtues that serve to control this activity internally and to avoid argumentative vertigos; moreover, it studies not only determinate or deductive arguments and supports but also underdetermined ones, since it holds that these are an imp...

  9. The Moment Guided Monte Carlo Method

    CERN Document Server

    Degond, Pierre; Pareschi, Lorenzo

    2009-01-01

    In this work we propose a new approach for the numerical simulation of kinetic equations through Monte Carlo schemes. We introduce a new technique which reduces the variance of particle methods through a matching with a set of suitable macroscopic moment equations. In order to guarantee that the moment equations provide the correct solutions, they are coupled to the kinetic equation through a non-equilibrium term. The basic idea on which the method relies consists in guiding the particle positions and velocities through the moment equations, so that the concurrent solution of the moment and kinetic models furnishes the same macroscopic quantities.

  10. by means of FLUKA Monte Carlo method

    Directory of Open Access Journals (Sweden)

    Ermis Elif Ebru

    2015-01-01

    Calculations of the gamma-ray mass attenuation coefficients of various detector materials (crystals) were carried out by means of the FLUKA Monte Carlo (MC) method at different gamma-ray energies. NaI, PVT, GSO, GaAs and CdWO4 detector materials were chosen for the calculations. The calculated coefficients were also compared with the National Institute of Standards and Technology (NIST) values. The results obtained with this method were in close agreement with the NIST values. It was concluded from the study that the FLUKA MC method can be an alternative way to calculate the gamma-ray mass attenuation coefficients of detector materials.

  11. Exascale Monte Carlo R&D

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Laboratory

    2012-07-24

    This presentation covers: (1) exascale computing - different technologies and how to get there; (2) the high-performance proof-of-concept MCMini - features and results; and (3) the OpenCL toolkit Oatmeal (OpenCL Automatic Memory Allocation Library) - purpose and features. Despite driver issues, OpenCL seems like a good, hardware-agnostic tool. MCMini demonstrates the possibility of GPGPU-based Monte Carlo methods - it shows great scaling for HPC applications and algorithmic equivalence. Oatmeal provides a flexible framework to aid in the development of scientific OpenCL codes.

  12. Discovering correlated fermions using quantum Monte Carlo

    Science.gov (United States)

    Wagner, Lucas K.; Ceperley, David M.

    2016-09-01

    It has become increasingly feasible to use quantum Monte Carlo (QMC) methods to study correlated fermion systems for realistic Hamiltonians. We give a summary of these techniques targeted at researchers in the field of correlated electrons, focusing on the fundamentals, capabilities, and current status of this technique. The QMC methods often offer the highest accuracy solutions available for systems in the continuum, and, since they address the many-body problem directly, the simulations can be analyzed to obtain insight into the nature of correlated quantum behavior.

  13. Monte Carlo simulations for heavy ion dosimetry

    OpenAIRE

    Geithner, Oksana

    2006-01-01

    Water-to-air stopping power ratio calculations for the ionization chamber dosimetry of clinically relevant ion beams with initial energies from 50 to 450 MeV/u have been performed using the Monte Carlo technique. To simulate the transport of a particle in water, the computer code SHIELD-HIT v2 was used, which is a substantially modified version of its predecessor SHIELD-HIT v1. The code was partially rewritten, replacing formerly used single precision variables with double precision variabl...

  14. Markov chains analytic and Monte Carlo computations

    CERN Document Server

    Graham, Carl

    2014-01-01

    Markov Chains: Analytic and Monte Carlo Computations introduces the main notions related to Markov chains and provides explanations on how to characterize, simulate, and recognize them. Starting with basic notions, this book leads progressively to advanced and recent topics in the field, allowing the reader to master the main aspects of the classical theory. This book also features: numerous exercises with solutions as well as extended case studies; a detailed and rigorous presentation of Markov chains with discrete time and state space; an appendix presenting probabilistic notions that are nec

  15. Modeling weight variability in a pan coating process using Monte Carlo simulations.

    Science.gov (United States)

    Pandey, Preetanshu; Katakdaunde, Manoj; Turton, Richard

    2006-10-06

    The primary objective of the current study was to investigate process variables affecting weight gain mass coating variability (CV(m)) in pan coating devices, using novel video-imaging techniques and Monte Carlo simulations. Experimental information such as the tablet location, circulation time distribution, velocity distribution, projected surface area, and spray dynamics was the main input to the simulations; the data on the dynamics of tablet movement were obtained using novel video-imaging methods. The effects of pan speed, pan loading, tablet size, coating time, spray flux distribution, and spray area and shape were investigated. CV(m) was found to be inversely proportional to the square root of coating time. The spray shape was not found to affect the CV(m) of the process significantly, but an increase in the spray area led to lower CV(m) values. Coating experiments were conducted to verify the predictions from the Monte Carlo simulations, and the trends predicted from the model were in good agreement, although the Monte Carlo simulations underpredicted CV(m) in comparison to the experiments. The model developed can provide a basis for adjustments in process parameters required during scale-up operations and can be useful in predicting the process changes that are needed to achieve the same CV(m) when a variable is altered.
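
    The inverse-square-root law reported above falls out of any model in which each tablet accumulates coating over many statistically similar passes through the spray zone. A toy Python sketch; the exponential per-pass deposition is an arbitrary assumption, not the paper's measured spray dynamics:

        import numpy as np

        rng = np.random.default_rng(42)

        def coating_cv(n_tablets=2000, n_passes=100):
            """CV of per-tablet coating mass when each pass through the spray
            zone deposits an independent random amount."""
            totals = rng.exponential(1.0, size=(n_tablets, n_passes)).sum(axis=1)
            return totals.std(ddof=1) / totals.mean()

        for n in (25, 100, 400):      # 4x the passes gives roughly half the CV
            print(n, round(coating_cv(n_passes=n), 3))

    Doubling the number of passes (i.e. the coating time) cuts the CV by about a factor of sqrt(2), matching the reported trend.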

  16. Construction of the Jacobian matrix for fluorescence diffuse optical tomography using a perturbation Monte Carlo method

    Science.gov (United States)

    Zhang, Xiaofeng

    2012-03-01

    Image formation in fluorescence diffuse optical tomography is critically dependent on construction of the Jacobian matrix. For clinical and preclinical applications, because of the highly heterogeneous characteristics of the medium, Monte Carlo methods are frequently adopted to construct the Jacobian. Conventional adjoint Monte Carlo methods typically compute the Jacobian by multiplying the photon density fields radiated from the source at the excitation wavelength and from the detector at the emission wavelength. Nonetheless, this approach assumes that the source and the detector in Green's function are reciprocal, which is invalid in general. This assumption is particularly questionable in small animal imaging, where the mean free path length of photons is typically only one order of magnitude smaller than the representative dimension of the medium. We propose a new method that does not rely on the reciprocity of the source and the detector, by tracing photon propagation entirely from the source to the detector. This method relies on perturbation Monte Carlo theory to account for the differences in optical properties of the medium at the excitation and the emission wavelengths. Compared to the adjoint methods, the proposed method is more valid in reflecting the physical process of photon transport in diffusive media and is more efficient in constructing the Jacobian matrix for densely sampled configurations.

  17. Synchrotron stereotactic radiotherapy: dosimetry by Fricke gel and Monte Carlo simulations.

    Science.gov (United States)

    Boudou, Caroline; Biston, Marie-Claude; Corde, Stéphanie; Adam, Jean-François; Ferrero, Claudio; Estève, François; Elleaume, Hélène

    2004-11-21

    Synchrotron stereotactic radiotherapy (SSR) consists in loading the tumour with a high atomic number (Z) element and exposing it to monochromatic x-rays from a synchrotron source (50-100 keV) in stereotactic conditions. The dose distribution results from both the stereotactic monochromatic x-ray irradiation and the presence of the high-Z element. The purpose of this preliminary study was to evaluate the two-dimensional dose distribution resulting solely from the irradiation geometry, using Monte Carlo simulations and a Fricke gel dosimeter. The verification of a Monte Carlo-based dosimetry was first assessed by depth dose measurements in a water tank. We thereafter used a Fricke dosimeter to compare Monte Carlo simulations with dose measurements. The Fricke dosimeter is a solution containing ferrous ions which are oxidized to ferric ions under ionizing radiation, proportionally to the absorbed dose. A cylindrical phantom filled with Fricke gel was irradiated in stereotactic conditions over several slices with a continuous beam (beam section = 0.1 x 1 cm2). The phantom and calibration vessels were then imaged by nuclear magnetic resonance. The measured doses were fairly consistent with those predicted by Monte Carlo simulations; however, the measured maximum absolute dose was about 10% lower than the calculated value. The loss of information in the high-dose region is explained by the diffusion of ferric ions. Monte Carlo simulation is the most accurate tool for dosimetry in complex geometries made of heterogeneous materials. Although the technique requires improvements, gel dosimetry remains an essential tool for the experimental verification of dose distributions in SSR with millimetre precision.

  18. State-of-the-art Monte Carlo 1988

    Energy Technology Data Exchange (ETDEWEB)

    Soran, P.D.

    1988-06-28

    Particle transport calculations in highly dimensional and physically complex geometries, such as detector calibration, radiation shielding, space reactors, and oil-well logging, generally require Monte Carlo transport techniques. Monte Carlo particle transport can be performed on a variety of computers ranging from APOLLOs to VAXs. Some of the hardware and software developments, which now permit Monte Carlo methods to be routinely used, are reviewed in this paper. The development of inexpensive, large, fast computer memory, coupled with fast central processing units, permits Monte Carlo calculations to be performed on workstations, minicomputers, and supercomputers. The Monte Carlo renaissance is further aided by innovations in computer architecture and software development. Advances in vectorization and parallelization architecture have resulted in the development of new algorithms which have greatly reduced processing times. Finally, the renewed interest in Monte Carlo has spawned new variance reduction techniques which are being implemented in large computer codes. 45 refs.

  19. Monte Carlo simulations in theoretical physics; Simulations Monte Carlo en physique theorique

    Energy Technology Data Exchange (ETDEWEB)

    Billoire, A.

    1991-12-31

    After a presentation of the principle of the Monte Carlo method, the method is applied first to critical exponent calculations in the three-dimensional Ising model, and secondly to discrete quantum chromodynamics, with computation times given as a function of computer power. 28 refs., 4 tabs.
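
    For concreteness, the Ising part of such a calculation reduces to repeated Metropolis updates of a spin lattice near the critical coupling; critical exponents are then extracted from how observables scale with lattice size. A minimal Python sketch of one sweep of a 3-D lattice (the coupling value is the well-known 3-D critical point; everything else is illustrative):

        import numpy as np

        rng = np.random.default_rng(0)

        def metropolis_sweep(spins, beta):
            """One Metropolis sweep of a 3-D Ising lattice (J = 1, periodic)."""
            L = spins.shape[0]
            for _ in range(spins.size):
                i, j, k = rng.integers(0, L, size=3)
                nn = (spins[(i + 1) % L, j, k] + spins[(i - 1) % L, j, k]
                      + spins[i, (j + 1) % L, k] + spins[i, (j - 1) % L, k]
                      + spins[i, j, (k + 1) % L] + spins[i, j, (k - 1) % L])
                dE = 2.0 * spins[i, j, k] * nn       # energy cost of flipping
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    spins[i, j, k] *= -1

        spins = rng.choice([-1, 1], size=(8, 8, 8))
        for _ in range(100):
            metropolis_sweep(spins, beta=0.2216)     # near the 3-D critical point
        print(abs(spins.mean()))                     # magnetization per spin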

  20. Simulation of Cone Beam CT System Based on Monte Carlo Method

    CERN Document Server

    Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing

    2014-01-01

    Adaptive Radiation Therapy (ART) was developed based on Image-guided Radiation Therapy (IGRT) and is the trend in photon radiation therapy. To make better use of Cone Beam CT (CBCT) images for ART, a CBCT system model was established with a Monte Carlo program and validated against measurement. The BEAMnrc program was used to model the kV x-ray tube, and both ISOURCE-13 and ISOURCE-24 were chosen to simulate the path of the beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles under 1 cm of water were compared with the dose calculated by the DOSXYZnrc program. The calculated PDD agreed to better than 1% within a depth of 10 cm, and more than 85% of the points on the calculated lateral dose profiles agreed within 2%. The validated CBCT system model helps to improve CBCT image quality for dose verification in ART and to assess the concomitant dose risk of CBCT imaging.

  1. Challenges and prospects for whole-core Monte Carlo analysis

    International Nuclear Information System (INIS)

    The advantages of using Monte Carlo methods to analyze full-core reactor configurations include essentially exact representation of the geometry and of the physical phenomena that are important for reactor analysis. But this substantial advantage comes at a substantial cost because of the computational burden, both in terms of memory demand and computational time. This paper focuses on the challenges facing full-core Monte Carlo for keff calculations and the prospects for Monte Carlo becoming a routine tool for reactor analysis.

  2. Temperature variance study in Monte-Carlo photon transport theory

    International Nuclear Information System (INIS)

    We study different Monte-Carlo methods for solving radiative transfer problems, and particularly Fleck's Monte-Carlo method. We first give the different time-discretization schemes and the corresponding stability criteria. Then we write the temperature variance as a function of the variances of temperature and absorbed energy at the previous time step. Finally we obtain some stability criteria for the Monte-Carlo method in the stationary case

  3. Discrete diffusion Monte Carlo for frequency-dependent radiative transfer

    Energy Technology Data Exchange (ETDEWEB)

    Densmore, Jeffrey D [Los Alamos National Laboratory; Thompson, Kelly G [Los Alamos National Laboratory; Urbatsch, Todd J [Los Alamos National Laboratory

    2010-11-17

    Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations. In this paper, we develop an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold. Above this threshold we employ standard Monte Carlo. With a frequency-dependent test problem, we confirm the increased efficiency of our new DDMC technique.

  4. Unbiased combinations of nonanalog Monte Carlo techniques and fair games

    International Nuclear Information System (INIS)

    Historically, Monte Carlo variance reduction techniques have developed one at a time in response to calculational needs. This paper provides the theoretical basis for obtaining unbiased Monte Carlo estimates from all possible combinations of variance reduction techniques. Hitherto, the techniques have not been proven to be unbiased in arbitrary combinations. The authors are unaware of any Monte Carlo techniques (in any linear process) that are not treated by the theorem herein. (author)

  5. Alternative Monte Carlo Approach for General Global Illumination

    Institute of Scientific and Technical Information of China (English)

    徐庆; 李朋; 徐源; 孙济洲

    2004-01-01

    An alternative Monte Carlo strategy for the computation of the global illumination problem is presented. The proposed approach provides a new and optimal way of solving Monte Carlo global illumination based on the zero-variance importance sampling procedure. A new importance-driven Monte Carlo global illumination algorithm was developed and implemented in the framework of the new computing scheme. Results obtained by rendering test scenes show that this new framework and the newly derived algorithm are effective and promising.
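
    Zero-variance importance sampling rests on one fact: if samples are drawn from a density proportional to the integrand itself, every weighted sample returns the exact answer. A one-dimensional Python sketch, unrelated to rendering, purely to show the mechanism:

        import numpy as np

        rng = np.random.default_rng(0)

        def estimates(n=100_000):
            """Estimate the integral of x^4 over [0, 1] (exact value 1/5) two
            ways: plain uniform sampling, and importance sampling from
            p(x) = 5x^4, the density proportional to the integrand."""
            x = rng.random(n)
            plain = (x**4).mean()
            y = rng.random(n) ** 0.2                  # inverse CDF of p(x) = 5x^4
            weighted = (y**4 / (5.0 * y**4)).mean()   # integrand/density = 1/5
            return plain, weighted

        print(estimates())   # the second estimate is exactly 0.2, zero variance

    In rendering, the integrand is not known in closed form, so practical algorithms can only approximate this ideal density; the closer the approximation, the lower the variance.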

  6. MontePython: Implementing Quantum Monte Carlo using Python

    OpenAIRE

    J.K. Nilsen

    2006-01-01

    We present a cross-language C++/Python program for simulations of quantum mechanical systems with the use of Quantum Monte Carlo (QMC) methods. We describe a system to which to apply QMC and the algorithms of variational Monte Carlo and diffusion Monte Carlo, and we describe how to implement these methods in pure C++ and in C++/Python. Furthermore, we check the efficiency of the implementations in serial and parallel cases to show that the overhead of using Python can be negligible.

  7. Combinatorial nuclear level density by a Monte Carlo method

    OpenAIRE

    Cerf, N.

    1993-01-01

    We present a new combinatorial method for the calculation of the nuclear level density. It is based on a Monte Carlo technique, in order to avoid a direct counting procedure which is generally impracticable for high-A nuclei. The Monte Carlo simulation, making use of the Metropolis sampling scheme, allows a computationally fast estimate of the level density for many fermion systems in large shell model spaces. We emphasize the advantages of this Monte Carlo approach, particularly concerning t...

  8. THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE

    Energy Technology Data Exchange (ETDEWEB)

    WATERS, LAURIE S. [Los Alamos National Laboratory; MCKINNEY, GREGG W. [Los Alamos National Laboratory; DURKEE, JOE W. [Los Alamos National Laboratory; FENSIN, MICHAEL L. [Los Alamos National Laboratory; JAMES, MICHAEL R. [Los Alamos National Laboratory; JOHNS, RUSSELL C. [Los Alamos National Laboratory; PELOWITZ, DENISE B. [Los Alamos National Laboratory

    2007-01-10

    MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B, and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code tracking neutrons, photons and electrons, and using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and the excellent neutronics capabilities allow new opportunities to simulate devices of interest to experimental particle physics; particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C, and also discusses ongoing code development.

  9. Quantum Monte Carlo for atoms and molecules

    International Nuclear Information System (INIS)

    The diffusion quantum Monte Carlo with fixed nodes (QMC) approach has been employed in studying energy-eigenstates for 1--4 electron systems. Previous work employing the diffusion QMC technique yielded energies of high quality for H2, LiH, Li2, and H2O. Here, the range of calculations with this new approach has been extended to include additional first-row atoms and molecules. In addition, improvements in the previously computed fixed-node energies of LiH, Li2, and H2O have been obtained using more accurate trial functions. All computations were performed within, but are not limited to, the Born-Oppenheimer approximation. In our computations, the effects of variation of Monte Carlo parameters on the QMC solution of the Schroedinger equation were studied extensively. These parameters include the time step, renormalization time and nodal structure. These studies have been very useful in determining which choices of such parameters will yield accurate QMC energies most efficiently. Generally, very accurate energies (90--100% of the correlation energy is obtained) have been computed with single-determinant trial functions multiplied by simple correlation functions. Improvements in accuracy should be readily obtained using more complex trial functions.

  10. Monte Carlo generators in ATLAS software

    International Nuclear Information System (INIS)

    This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++ using Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are written in general by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance of a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation can also access the truth information from StoreGate. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general purpose generators, including Pythia6, Pythia8, Herwig, Herwig++ and Sherpa, are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exists for the so-called Matrix Element generators that only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.

  11. Information Geometry and Sequential Monte Carlo

    CERN Document Server

    Sim, Aaron; Stumpf, Michael P H

    2012-01-01

    This paper explores the application of methods from information geometry to the sequential Monte Carlo (SMC) sampler. In particular the Riemannian manifold Metropolis-adjusted Langevin algorithm (mMALA) is adapted for the transition kernels in SMC. Similar to its function in Markov chain Monte Carlo methods, the mMALA is a fully adaptable kernel which allows for efficient sampling of high-dimensional and highly correlated parameter spaces. We set up the theoretical framework for its use in SMC with a focus on the application to the problem of sequential Bayesian inference for dynamical systems as modelled by sets of ordinary differential equations. In addition, we argue that defining the sequence of distributions on geodesics optimises the effective sample sizes in the SMC run. We illustrate the application of the methodology by inferring the parameters of simulated Lotka-Volterra and Fitzhugh-Nagumo models. In particular we demonstrate that compared to employing a standard adaptive random walk kernel, the SM...

  12. Quantum Monte Carlo Calculations of Neutron Matter

    CERN Document Server

    Carlson, J; Ravenhall, D G

    2003-01-01

    Uniform neutron matter is approximated by a cubic box containing a finite number of neutrons, with periodic boundary conditions. We report variational and Green's function Monte Carlo calculations of the ground state of fourteen neutrons in a periodic box using the Argonne v8' two-nucleon interaction at densities up to one and a half times the nuclear matter density. The effects of the finite box size are estimated using variational wave functions together with cluster expansion and chain summation techniques. They are small at subnuclear densities. We discuss the expansion of the energy of low-density neutron gas in powers of its Fermi momentum. This expansion is strongly modified by the large nn scattering length, and does not begin with the Fermi-gas kinetic energy as assumed in both Skyrme and relativistic mean field theories. The leading term of the neutron gas energy is about half the Fermi-gas kinetic energy. The quantum Monte Carlo results are also used to calibrate the accuracy of variational calculations ...

  13. Reactor perturbation calculations by Monte Carlo methods

    International Nuclear Information System (INIS)

    Whilst Monte Carlo methods are useful for reactor calculations involving complicated geometry, it is difficult to apply them to the calculation of perturbation worths because of the large amount of computing time needed to obtain good accuracy. Various ways of overcoming these difficulties are investigated in this report, with the problem of estimating absorbing control rod worths particularly in mind. As a basis for discussion a method of carrying out multigroup reactor calculations by Monte Carlo methods is described. Two methods of estimating a perturbation worth directly, without differencing two quantities of like magnitude, are examined closely but are passed over in favour of a third method based on a correlation technique. This correlation method is described, and demonstrated by a limited range of calculations for absorbing control rods in a fast reactor. In these calculations control rod worths of between 1% and 7% in reactivity are estimated to an accuracy better than 10% (3 standard errors) in about one hour's computing time on the English Electric KDF.9 digital computer. (author)

  14. FastDIRC: a fast Monte Carlo and reconstruction algorithm for DIRC detectors

    CERN Document Server

    Hardin, John

    2016-01-01

    FastDIRC is a novel fast Monte Carlo and reconstruction algorithm for DIRC detectors. A DIRC employs rectangular fused-silica bars both as Cherenkov radiators and as light guides. Cherenkov-photon imaging and time-of-propagation information are utilized by a DIRC to identify charged particles. GEANT-based DIRC Monte Carlo simulations are extremely CPU intensive. The FastDIRC algorithm permits fully simulating a DIRC detector more than 10000 times faster than using GEANT. This facilitates designing a DIRC-reconstruction algorithm that improves the Cherenkov-angle resolution of a DIRC detector by about 30% compared to existing algorithms. FastDIRC also greatly reduces the time required to study competing DIRC-detector designs.

  15. Three Dimension Monte Carlo Simulation of Austenite Grain Growth in CGHAZ of an Ultrafine Grain Steel

    Institute of Scientific and Technical Information of China (English)

    Dong CHEN; Yongping LEI; Xiaoyan LI; Yaowu SHI; Zhiling TIAN

    2003-01-01

    In the present research, the Monte Carlo technique was used to simulate grain growth in the heat-affected zone (HAZ) of an ultrafine grain steel. An experimental-data-based (EBD) model proposed by Gao was used to establish the relation between tMCS and the real time-temperature kinetics in our simulation. The simulations give the evolution of the grain structure and the grain size distribution in the HAZ of the ultrafine grain steel. A Microsoft Windows-based computer program for the simulation of grain growth in the HAZ of weldments in three dimensions has been developed using the Monte Carlo technique. Given the temperature field data and material properties as input, the program produces the evolution of the grain structure, both as images of the simulated grain structure and as numerical data reflecting the grain size distribution. The system was applied to welding of the ultrafine grain steel, and the simulated results show that the ultrafine grain steel has a strong tendency toward grain growth.
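
    Grain growth is conventionally simulated with a Q-state Potts model: each site carries a grain orientation, and sites re-orient so as not to increase the grain-boundary energy. A minimal 2-D, zero-temperature Python sketch of one Monte Carlo step (the paper's model is 3-D and couples the MC step tMCS to the weld thermal cycle, which is omitted here):

        import numpy as np

        rng = np.random.default_rng(3)

        def grain_growth_step(grid):
            """One MC step of zero-temperature Potts grain growth: each site
            visit tries a neighbour's orientation and keeps it if the number
            of unlike neighbours (the boundary energy) does not increase."""
            ny, nx = grid.shape
            for _ in range(grid.size):
                i, j = rng.integers(ny), rng.integers(nx)
                nbrs = [grid[(i + 1) % ny, j], grid[(i - 1) % ny, j],
                        grid[i, (j + 1) % nx], grid[i, (j - 1) % nx]]
                new = rng.choice(nbrs)
                if sum(n != new for n in nbrs) <= sum(n != grid[i, j] for n in nbrs):
                    grid[i, j] = new

        grid = rng.integers(0, 50, size=(64, 64))   # 50 random starting orientations
        for _ in range(20):                         # grains coarsen step by step
            grain_growth_step(grid)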

  16. Comparison of internal dose estimates obtained using organ-level, voxel S value, and Monte Carlo techniques

    Energy Technology Data Exchange (ETDEWEB)

    Grimes, Joshua, E-mail: grimes.joshua@mayo.edu [Department of Physics and Astronomy, University of British Columbia, Vancouver V5Z 1L8 (Canada); Celler, Anna [Department of Radiology, University of British Columbia, Vancouver V5Z 1L8 (Canada)

    2014-09-15

    Purpose: The authors’ objective was to compare internal dose estimates obtained using the Organ Level Dose Assessment with Exponential Modeling (OLINDA/EXM) software, the voxel S value technique, and Monte Carlo simulation. Monte Carlo dose estimates were used as the reference standard to assess the impact of patient-specific anatomy on the final dose estimate. Methods: Six patients injected with {sup 99m}Tc-hydrazinonicotinamide-Tyr{sup 3}-octreotide were included in this study. A hybrid planar/SPECT imaging protocol was used to estimate {sup 99m}Tc time-integrated activity coefficients (TIACs) for kidneys, liver, spleen, and tumors. Additionally, TIACs were predicted for {sup 131}I, {sup 177}Lu, and {sup 90}Y assuming the same biological half-lives as the {sup 99m}Tc labeled tracer. The TIACs were used as input for OLINDA/EXM for organ-level dose calculation, and voxel level dosimetry was performed using the voxel S value method and Monte Carlo simulation. Dose estimates for {sup 99m}Tc, {sup 131}I, {sup 177}Lu, and {sup 90}Y distributions were evaluated by comparing (i) organ-level S values corresponding to each method, (ii) total tumor and organ doses, (iii) differences in right and left kidney doses, and (iv) voxelized dose distributions calculated by Monte Carlo and the voxel S value technique. Results: The S values for all investigated radionuclides used by OLINDA/EXM and the corresponding patient-specific S values calculated by Monte Carlo agreed within 2.3% on average for self-irradiation, and differed by as much as 105% for cross-organ irradiation. Total organ doses calculated by OLINDA/EXM and the voxel S value technique agreed with Monte Carlo results within approximately ±7%. Differences between right and left kidney doses determined by Monte Carlo were as high as 73%. Comparison of the Monte Carlo and voxel S value dose distributions showed that each method produced similar dose volume histograms with a minimum dose covering 90% of the volume (D90
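
    The voxel S value technique referred to above amounts to a 3-D convolution: each voxel's time-integrated activity is spread onto its neighbours with a radionuclide-specific dose-per-decay kernel. A short Python sketch; the array shapes, units, and the uniform kernel are assumptions for illustration, not tabulated S values:

        import numpy as np
        from scipy.signal import fftconvolve

        def voxel_dose(tia_map, s_kernel):
            """Absorbed dose map (Gy) from a voxelized time-integrated activity
            map (decays per voxel) and a voxel S value kernel (Gy per decay)."""
            return fftconvolve(tia_map, s_kernel, mode="same")

        # Hypothetical example: a 64^3 activity map and a 9^3 S value kernel.
        tia = np.zeros((64, 64, 64))
        tia[32, 32, 32] = 1e9                      # a point-like uptake
        kernel = np.ones((9, 9, 9)) / 9**3         # stand-in for tabulated S values
        dose = voxel_dose(tia, kernel)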

  17. Acceleration of GATE Monte Carlo simulations

    OpenAIRE

    De Beenhouwer, Jan

    2008-01-01

    Positron Emission Tomography (PET) and Single Photon Emission Computed Tomography (SPECT) are forms of medical imaging that produce functional images reflecting biological processes. They are based on the tracer principle: a biologically active substance, a pharmaceutical, is selected so that its spatial and temporal distribution in the body reflects a certain body function or metabolism. In order to form images of the distribution, the pharmaceutical is labeled with gamma-ray-emitting or positron-...

  18. A Monte Carlo paradigm for capillarity in porous media

    Science.gov (United States)

    Lu, Ning; Zeidman, Benjamin D.; Lusk, Mark T.; Willson, Clinton S.; Wu, David T.

    2010-12-01

    Wet porous media are ubiquitous in nature as soils, rocks, plants, and bones, and in engineering settings such as oil production, ground stability, filtration and composites. Their physical and chemical behavior is governed by the distribution of liquid and interfaces between phases. Characterization of the interfacial distribution is mostly based on macroscopic experiments, aided by empirical formulae. We present an alternative computational paradigm utilizing a Monte Carlo algorithm to simulate interfaces in complex realistic pore geometries. The method agrees with analytical solutions available only for idealized pore geometries, and is in quantitative agreement with Micro X-ray Computed Tomography (microXCT), capillary pressure, and interfacial area measurements for natural soils. We demonstrate that this methodology predicts macroscopic properties such as the capillary pressure and air-liquid interface area versus liquid saturation based only on the pore size information from microXCT images and interfacial interaction energies. The generality of this method should allow simulation of capillarity in many porous materials.

  19. A Monte Carlo paradigm for capillarity in porous media

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Ning; Zeidman, Benjamin D.; Lusk, Mark T.; Willson, Clinton S.; Wu, David T. (CSM); (LSU)

    2011-08-09

    Wet porous media are ubiquitous in nature as soils, rocks, plants, and bones, and in engineering settings such as oil production, ground stability, filtration and composites. Their physical and chemical behavior is governed by the distribution of liquid and interfaces between phases. Characterization of the interfacial distribution is mostly based on macroscopic experiments, aided by empirical formulae. We present an alternative computational paradigm utilizing a Monte Carlo algorithm to simulate interfaces in complex realistic pore geometries. The method agrees with analytical solutions available only for idealized pore geometries, and is in quantitative agreement with Micro X-ray Computed Tomography (microXCT), capillary pressure, and interfacial area measurements for natural soils. We demonstrate that this methodology predicts macroscopic properties such as the capillary pressure and air-liquid interface area versus liquid saturation based only on the pore size information from microXCT images and interfacial interaction energies. The generality of this method should allow simulation of capillarity in many porous materials.

  20. Bayesian segmentation of hyperspectral images

    CERN Document Server

    Mohammadpour, Adel; Mohammad-Djafari, Ali

    2007-01-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
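
    The workhorse inside such an MCMC scheme is a Gibbs sweep over the label image: each pixel's class is redrawn from its conditional given the Potts prior on its neighbours and the pixel's likelihood. A single-band Python sketch (the paper's hyperspectral likelihood and HMM details are simplified away):

        import numpy as np

        rng = np.random.default_rng(0)

        def gibbs_sweep(labels, data, means, sigma, beta, K):
            """One Gibbs sweep: redraw each pixel label from its conditional
            under a Potts prior (strength beta) and per-class Gaussian noise."""
            ny, nx = labels.shape
            for i in range(ny):
                for j in range(nx):
                    nbrs = (labels[(i + 1) % ny, j], labels[(i - 1) % ny, j],
                            labels[i, (j + 1) % nx], labels[i, (j - 1) % nx])
                    logp = np.array([beta * sum(n == k for n in nbrs)
                                     - 0.5 * ((data[i, j] - means[k]) / sigma) ** 2
                                     for k in range(K)])
                    p = np.exp(logp - logp.max())
                    labels[i, j] = rng.choice(K, p=p / p.sum())

        # Toy usage: two classes on a small noisy image (values are illustrative).
        data = np.r_[np.zeros((8, 16)), np.ones((8, 16))] + rng.normal(0, 0.3, (16, 16))
        labels = rng.integers(0, 2, size=(16, 16))
        for _ in range(20):
            gibbs_sweep(labels, data, means=np.array([0.0, 1.0]),
                        sigma=0.3, beta=1.0, K=2)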

  1. Bayesian segmentation of hyperspectral images

    Science.gov (United States)

    Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali

    2004-11-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.

  2. Morse Monte Carlo Radiation Transport Code System

    Energy Technology Data Exchange (ETDEWEB)

    Emmett, M.B.

    1975-02-01

    The report contains sections with descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  3. Monte Carlo Implementation of Polarized Hadronization

    CERN Document Server

    Matevosyan, Hrayr H; Thomas, Anthony W

    2016-01-01

    We study polarized quark hadronization in a Monte Carlo (MC) framework based on the recent extension of the quark-jet framework, where a self-consistent treatment of the quark polarization transfer in a sequential hadronization picture has been presented. Here, we first adopt this approach for MC simulations of the hadronization process with a finite number of produced hadrons, expressing the relevant probabilities in terms of the eight leading-twist quark-to-quark transverse momentum dependent (TMD) splitting functions (SFs) for the elementary $q \to q'+h$ transition. We present explicit expressions for the unpolarized and Collins fragmentation functions (FFs) of unpolarized hadrons emitted at rank two. Further, we demonstrate that all the current spectator-type model calculations of the leading-twist quark-to-quark TMD SFs violate the positivity constraints, and propose a quark-model-based ansatz for these input functions that circumvents the problem. We validate our MC framework by explicitly proving the absence o...

  4. Commensurabilities between ETNOs: a Monte Carlo survey

    CERN Document Server

    de la Fuente Marcos, C

    2016-01-01

    Many asteroids in the main and trans-Neptunian belts are trapped in mean motion resonances with Jupiter and Neptune, respectively. As a side effect, they experience accidental commensurabilities among themselves. These commensurabilities define characteristic patterns that can be used to trace the source of the observed resonant behaviour. Here, we explore systematically the existence of commensurabilities between the known extreme trans-Neptunian objects (ETNOs) using their heliocentric and barycentric semimajor axes, their uncertainties, and Monte Carlo techniques. We find that the commensurability patterns present in the known ETNO population resemble those found in the main and trans-Neptunian belts. Although based on small number statistics, such patterns can only be properly explained if most, if not all, of the known ETNOs are subjected to the resonant gravitational perturbations of yet undetected trans-Plutonian planets. We show explicitly that some of the statistically significant commensurabilities are compatible with the Planet Nin...

  5. Hybrid algorithms in quantum Monte Carlo

    International Nuclear Information System (INIS)

    With advances in algorithms and growing computing powers, quantum Monte Carlo (QMC) methods have become a leading contender for high accuracy calculations for the electronic structure of realistic systems. The performance gain on recent HPC systems is largely driven by increasing parallelism: the number of compute cores of a SMP and the number of SMPs have been going up, as the Top500 list attests. However, the available memory as well as the communication and memory bandwidth per element has not kept pace with the increasing parallelism. This severely limits the applicability of QMC and the problem size it can handle. OpenMP/MPI hybrid programming provides applications with simple but effective solutions to overcome efficiency and scalability bottlenecks on large-scale clusters based on multi/many-core SMPs. We discuss the design and implementation of hybrid methods in QMCPACK and analyze its performance on current HPC platforms characterized by various memory and communication hierarchies.

  6. Nuclear reactions in Monte Carlo codes

    CERN Document Server

    Ferrari, Alfredo

    2002-01-01

    The physics foundations of hadronic interactions as implemented in most Monte Carlo codes are presented together with a few practical examples. The description of the relevant physics is presented schematically split into the major steps in order to stress the different approaches required for the full understanding of nuclear reactions at intermediate and high energies. Due to the complexity of the problem, only a few semi-qualitative arguments are developed in this paper. The description will be necessarily schematic and somewhat incomplete, but hopefully it will be useful for a first introduction into this topic. Examples are shown mostly for the high energy regime, where all mechanisms mentioned in the paper are at work and to which perhaps most of the readers are less accustomed. Examples for lower energies can be found in the references. (43 refs) .

  7. San Carlos Apache Tribe - Energy Organizational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, James; Albert, Steve

    2012-04-01

    The San Carlos Apache Tribe (SCAT) was awarded $164,000 in late 2011 by the U.S. Department of Energy (U.S. DOE) Tribal Energy Program's "First Steps Toward Developing Renewable Energy and Energy Efficiency on Tribal Lands" Grant Program. This grant funded: the analysis and selection of preferred form(s) of tribal energy organization (this Energy Organization Analysis, hereinafter referred to as "EOA"); start-up staffing and other costs associated with the Phase 1 SCAT energy organization; an intern program; staff training; and tribal outreach and workshops regarding the new organization and SCAT energy programs and projects, including two annual tribal energy summits (2011 and 2012). This report documents the analysis and selection of preferred form(s) of a tribal energy organization.

  8. Monte Carlo modeling and meteor showers

    International Nuclear Information System (INIS)

    Prediction of short lived increases in the cosmic dust influx, the concentration in lower thermosphere of atoms and ions of meteor origin and the determination of the frequency of micrometeor impacts on spacecraft are all of scientific and practical interest and all require adequate models of meteor showers at an early stage of their existence. A Monte Carlo model of meteor matter ejection from a parent body at any point of space was worked out by other researchers. This scheme is described. According to the scheme, the formation of ten well known meteor streams was simulated and the possibility of genetic affinity of each of them with the most probable parent comet was analyzed. Some of the results are presented

  9. Monte Carlo modeling and meteor showers

    Science.gov (United States)

    Kulikova, N. V.

    1987-08-01

    Prediction of short lived increases in the cosmic dust influx, the concentration in lower thermosphere of atoms and ions of meteor origin and the determination of the frequency of micrometeor impacts on spacecraft are all of scientific and practical interest and all require adequate models of meteor showers at an early stage of their existence. A Monte Carlo model of meteor matter ejection from a parent body at any point of space was worked out by other researchers. This scheme is described. According to the scheme, the formation of ten well known meteor streams was simulated and the possibility of genetic affinity of each of them with the most probable parent comet was analyzed. Some of the results are presented.

  10. Monte Carlo Exploration of Warped Higgsless Models

    CERN Document Server

    Hewett, J L; Rizzo, T G

    2004-01-01

    We have performed a detailed Monte Carlo exploration of the parameter space for a warped Higgsless model of electroweak symmetry breaking in 5 dimensions. This model is based on the $SU(2)_L\\times SU(2)_R\\times U(1)_{B-L}$ gauge group in an AdS$_5$ bulk with arbitrary gauge kinetic terms on both the Planck and TeV branes. Constraints arising from precision electroweak measurements and collider data are found to be relatively easy to satisfy. We show, however, that the additional requirement of perturbative unitarity up to the cut-off, $\\simeq 10$ TeV, in $W_L^+W_L^-$ elastic scattering in the absence of dangerous tachyons eliminates all models. If successful models of this class exist, they must be highly fine-tuned.

  11. Variable length trajectory compressible hybrid Monte Carlo

    CERN Document Server

    Nishimura, Akihiko

    2016-01-01

    Hybrid Monte Carlo (HMC) generates samples from a prescribed probability distribution in a configuration space by simulating Hamiltonian dynamics, followed by the Metropolis (-Hastings) acceptance/rejection step. Compressible HMC (CHMC) generalizes HMC to a situation in which the dynamics is reversible but not necessarily Hamiltonian. This article presents a framework to further extend the algorithm. Within the existing framework, each trajectory of the dynamics must be integrated for the same amount of (random) time to generate a valid Metropolis proposal. Our generalized acceptance/rejection mechanism allows a more deliberate choice of the integration time for each trajectory. The proposed algorithm in particular enables an effective application of variable step size integrators to HMC-type sampling algorithms based on reversible dynamics. The potential of our framework is further demonstrated by another extension of HMC which reduces the wasted computations due to unstable numerical approximations and corr...
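    To make the baseline concrete, here is a minimal sketch of a standard HMC transition in Python (all names and the Gaussian example target are ours, not the paper's; a fixed-length leapfrog integrator stands in for the variable integration times discussed above):

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, eps=0.1, n_steps=20,
             rng=np.random.default_rng()):
    """One HMC transition: simulate Hamiltonian dynamics, then accept/reject."""
    p = rng.standard_normal(x.shape)          # resample momentum
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics
    p_new += 0.5 * eps * grad_log_prob(x_new)
    for _ in range(n_steps - 1):
        x_new += eps * p_new
        p_new += eps * grad_log_prob(x_new)
    x_new += eps * p_new
    p_new += 0.5 * eps * grad_log_prob(x_new)
    # Metropolis(-Hastings) acceptance on the total-energy difference
    h_old = -log_prob(x) + 0.5 * p @ p
    h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.random()) < h_old - h_new else x

# Example: sample a 2D standard normal
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x
x, samples = np.zeros(2), []
for _ in range(5000):
    x = hmc_step(x, log_prob, grad_log_prob)
    samples.append(x)
print(np.mean(samples, axis=0), np.std(samples, axis=0))  # ~[0 0], ~[1 1]
```

    The extension discussed above amounts to letting the integration time vary from trajectory to trajectory while modifying the acceptance step so that the target distribution is still preserved.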

  12. Monte Carlo modelling of TRIGA research reactor

    Science.gov (United States)

    El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.

    2010-10-01

    The Moroccan 2 MW TRIGA MARK II research reactor at the Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively support various fields of basic nuclear research, manpower training, and the production of radioisotopes for use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and the validation of the results by comparison with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous-energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core with essentially no physical approximation. Continuous-energy cross-section data from the most recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3), as well as the S(α,β) thermal neutron scattering functions distributed with the MCNP code, were used. The cross-section libraries were generated using the NJOY99 system updated to its most recent patch file, "up259". The consistency and accuracy of both the Monte Carlo simulation and the neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rod worths, as well as power peaking factors were used in the validation process. The results of the calculations are analysed and discussed.

  13. Criticality benchmarking of ANET Monte Carlo code

    International Nuclear Information System (INIS)

    In this work the new Monte Carlo code ANET is tested on criticality calculations. ANET is developed on the basis of the high-energy physics code GEANT of CERN and aims at progressively satisfying several requirements regarding simulations of both GEN II/III reactors and innovative nuclear reactor designs such as Accelerator Driven Systems (ADSs). Here ANET is applied to three different nuclear configurations: a subcritical assembly, a Material Testing Reactor, and the conceptual configuration of an ADS. In the first case, calculations of the effective multiplication factor (keff) are performed for the Training Nuclear Reactor of the Aristotle University of Thessaloniki, while in the second case keff is computed for the freshly fueled core of the Portuguese research reactor (RPI) just after its conversion to Low Enriched Uranium, considering the control rods at the position that renders the reactor critical. In both cases ANET computations are compared with corresponding results obtained by three well-established codes, both deterministic (XSDRNPM/CITATION) and Monte Carlo (TRIPOLI, MCNP). In the RPI case, keff computations are also compared with observations made during the reactor core commissioning, since the control rods are considered at the criticality position. These verification studies show ANET to produce reasonable results, comparing satisfactorily with other models as well as with observations. For the third case (ADS), preliminary ANET computations of keff for various intensities of the proton beam are presented, showing a reasonable code performance concerning both the order of magnitude and the relative variation of the computed parameter. (author)
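    For orientation, the generation-based keff estimate that such codes report can be illustrated with a deliberately crude toy model (ours, not ANET's): mono-energetic neutrons in a bare homogeneous sphere, absorbed at their first collision, with a fission/capture split and no scattering:

```python
import numpy as np

rng = np.random.default_rng(1)

R = 8.0          # sphere radius (cm)           -- illustrative numbers only
SIG_T = 0.3      # total macroscopic cross section (1/cm)
SIG_F = 0.1      # fission cross section (1/cm); the rest of absorption is capture
NU = 2.5         # mean neutrons per fission

def run_generation(sites):
    """Transport one fission generation; return the new fission sites."""
    bank = []
    for r in sites:
        # isotropic direction, exponential free flight
        w = rng.standard_normal(3); w /= np.linalg.norm(w)
        r_new = r + w * rng.exponential(1.0 / SIG_T)
        if np.linalg.norm(r_new) > R:
            continue                       # leakage
        if rng.random() < SIG_F / SIG_T:   # fission (capture otherwise)
            n = int(NU) + (rng.random() < NU - int(NU))
            bank.extend([r_new] * n)
    return bank

sites = [np.zeros(3)] * 10_000
k_hist = []
for gen in range(60):
    bank = run_generation(sites)
    k_hist.append(len(bank) / len(sites))          # fission neutrons per source neutron
    # resample the bank to keep the population size stable between generations
    sites = [bank[rng.integers(len(bank))] for _ in range(10_000)]
print("k_eff ~", np.mean(k_hist[20:]))             # skip early generations (source convergence)
```

    Early generations are discarded for the same reason production codes do it: the fission source needs several generations to converge before keff estimates are meaningful.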

  14. QWalk: A Quantum Monte Carlo Program for Electronic Structure

    CERN Document Server

    Wagner, Lucas K; Mitas, Lubos

    2007-01-01

    We describe QWalk, a new computational package capable of performing Quantum Monte Carlo electronic structure calculations for molecules and solids with many electrons. We describe the structure of the program and its implementation of Quantum Monte Carlo methods. It is open-source, licensed under the GPL, and available at the web site http://www.qwalk.org

  15. Recent Developments in Quantum Monte Carlo: Methods and Applications

    Science.gov (United States)

    Aspuru-Guzik, Alan; Austin, Brian; Domin, Dominik; Galek, Peter T. A.; Handy, Nicholas; Prasad, Rajendra; Salomon-Ferrer, Romelia; Umezawa, Naoto; Lester, William A.

    2007-12-01

    The quantum Monte Carlo method in the diffusion Monte Carlo form has become recognized for its capability of describing the electronic structure of atomic, molecular and condensed matter systems to high accuracy. This talk will briefly outline the method with emphasis on recent developments connected with trial function construction, linear scaling, and applications to selected systems.

  16. Adjoint electron-photon transport Monte Carlo calculations with ITS

    International Nuclear Information System (INIS)

    A general adjoint coupled electron-photon Monte Carlo code for solving the Boltzmann-Fokker-Planck equation has recently been created. It is a modified version of ITS 3.0, a coupled electron-photon Monte Carlo code that has world-wide distribution. The applicability of the new code to radiation-interaction problems of the type found in space environments is demonstrated.

  17. Neutron point-flux calculation by Monte Carlo

    International Nuclear Information System (INIS)

    A survey of the usual methods for estimating flux at a point is given. The associated variance-reducing techniques in direct Monte Carlo games are explained. The multigroup Monte Carlo codes MC, for critical systems, and PUNKT, for point source-point detector systems, are presented, and problems in applying the codes to practical tasks are discussed. (author)
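    One of the standard variance-reducing devices surveyed for point fluxes is the next-event (point-detector) estimator: at the source and at every scattering event, a deterministic contribution for reaching the detector without a further collision is scored. A minimal sketch for an infinite homogeneous medium with isotropic scattering (a toy setup of ours, not the MC or PUNKT codes):

```python
import numpy as np

rng = np.random.default_rng(0)
SIG_T, SIG_S = 0.5, 0.3                 # total / scattering cross sections (1/cm)
DETECTOR = np.array([5.0, 0.0, 0.0])    # detector location (cm)

def point_flux(n_histories=100_000):
    score = 0.0
    for _ in range(n_histories):
        pos, weight = np.zeros(3), 1.0
        while weight > 1e-3:            # Russian roulette would be the proper cutoff
            # next-event estimate from the current emission/scattering point:
            # isotropic emission (1/4pi), attenuated over the distance r
            r = np.linalg.norm(DETECTOR - pos)
            score += weight * np.exp(-SIG_T * r) / (4 * np.pi * r**2)
            # sample the actual next flight and collision
            d = rng.standard_normal(3); d /= np.linalg.norm(d)
            pos = pos + d * rng.exponential(1.0 / SIG_T)
            weight *= SIG_S / SIG_T     # implicit capture; isotropic re-emission
    return score / n_histories

print("flux at detector ~", point_flux())
```

    The 1/r² factor makes individual scores unbounded when a collision lands near the detector, which is exactly the kind of practical difficulty such surveys discuss.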

  18. CERN Summer Student Report 2016 Monte Carlo Data Base Improvement

    CERN Document Server

    Caciulescu, Alexandru Razvan

    2016-01-01

    During my Summer Student project I worked on improving the Monte Carlo Data Base and MonALISA services for the ALICE Collaboration. The project included learning the infrastructure for tracking and monitoring of the Monte Carlo productions as well as developing a new RESTful API for seamless integration with the JIRA issue tracking framework.

  19. Quantum Monte Carlo Simulations : Algorithms, Limitations and Applications

    NARCIS (Netherlands)

    Raedt, H. De

    1992-01-01

    A survey is given of Quantum Monte Carlo methods currently used to simulate quantum lattice models. The formalisms employed to construct the simulation algorithms are sketched. The origin of fundamental (minus sign) problems which limit the applicability of the Quantum Monte Carlo approach is shown

  20. Practical schemes for accurate forces in quantum Monte Carlo

    NARCIS (Netherlands)

    Moroni, S.; Saccani, S.; Filippi, C.

    2014-01-01

    While the computation of interatomic forces has become a well-established practice within variational Monte Carlo (VMC), the use of the more accurate Fixed-Node Diffusion Monte Carlo (DMC) method is still largely limited to the computation of total energies on structures obtained at a lower level of

  1. Managing the Knowledge Commons: Interview with Carlo Vercellone

    OpenAIRE

    Vercellone, Carlo

    2015-01-01

    Interview with Dr. Carlo Vercellone, one of the leading theorists of cognitive capitalism and an economist at the CNRS Lab of the Sorbonne Economics Centre (Centre d'Economie de la Sorbonne, CES).

  2. IN MEMORIAM CARLOS RESTREPO. UN VERDADERO MAESTRO

    Directory of Open Access Journals (Sweden)

    Pelayo Correa

    2009-06-01

    Full Text Available Carlos Restrepo was the first professor of Pathology and an illustrious member of the group of pioneers who founded the Faculty of Medicine of the Universidad del Valle. These pioneers converged on Cali in the 1950s, possessed of a renewing and creative spirit that undertook, with great success, the task of changing the academic culture of the Valle del Cauca. They found a peaceful society that enjoyed the generosity of its surroundings, with no desire to break with century-old traditions of a simple and contented lifestyle. When the children had the desire and the ability to pursue university studies, especially in medicine, the family would send them to cooler climates, which supposedly favored brain function and the accumulation of knowledge. The pioneers of medical education in the Valle del Cauca, largely recruited from national and foreign universities, knew very well that the local environment was no obstacle to a first-class university education. Carlos Restrepo was a prototype of this spirit of change and of the intellectual formation of the new generations. He showed it in many ways, in good part through his cheerful, extroverted and optimistic character and his easy, contagious laugh. But this amiable side of his personality did not conceal his formative mission: he demanded dedication and hard work from his students, faithfully recorded in memorable caricatures that exaggerated his occasionally explosive temper. The group of pioneers devoted itself fully (full time and exclusive dedication) and organized the new Faculty into well-defined and structured departments: Anatomy, Biochemistry, Physiology, Pharmacology, Pathology, Internal Medicine, Surgery, Obstetrics and Gynecology, Psychiatry and Preventive Medicine. The departments integrated their primary functions of teaching, research and service to the community. The center

  3. Application of the measurement-based Monte Carlo method in nasopharyngeal cancer patients for intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    This study aims to utilize a measurement-based Monte Carlo (MBMC) method to evaluate the accuracy of dose distributions calculated by the Eclipse radiotherapy treatment planning system (TPS) based on the anisotropic analytical algorithm. Dose distributions were calculated for nasopharyngeal carcinoma (NPC) patients treated with intensity-modulated radiotherapy (IMRT). Ten NPC IMRT plans were evaluated by comparing their dose distributions with those obtained from the in-house MBMC programs for the same CT images and beam geometry. To reconstruct the fluence distribution of the IMRT field, an efficiency map was obtained by dividing the energy fluence of the intensity-modulated field by that of the open field, both acquired with an aS1000 electronic portal imaging device. The integrated image of the non-gated mode was used to acquire the full dose distribution delivered during the IMRT treatment. This efficiency map redistributed the particle weightings of the open-field phase-space file for IMRT applications. Dose differences were observed at the tumor/air cavity boundary. The mean difference between MBMC and TPS in terms of planning target volume coverage was 0.6% (range: 0.0-2.3%). The mean difference for the conformity index was 0.01 (range: 0.0-0.01). In conclusion, the MBMC method serves as an independent IMRT dose verification tool in a clinical setting. Highlights:
    - The patient-based Monte Carlo method serves as a reference standard to verify IMRT doses.
    - 3D dose distributions for NPC patients have been verified by the Monte Carlo method.
    - Doses predicted by the Monte Carlo method matched closely with those by the TPS.
    - The Monte Carlo method predicted a higher mean dose to the middle ears than the TPS.
    - Critical organ doses should be confirmed to avoid overdose to normal organs.
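    The efficiency-map reweighting described above can be sketched in a few lines (a schematic of the idea only; the file names, array layout and centering convention are our assumptions, not details of the in-house MBMC programs):

```python
import numpy as np

# Energy-fluence images acquired with the EPID (2D arrays, same geometry);
# the file names are hypothetical placeholders
fluence_imrt = np.load("fluence_imrt.npy")
fluence_open = np.load("fluence_open.npy")

# Efficiency map: ratio of modulated to open-field energy fluence
eff_map = np.divide(fluence_imrt, fluence_open,
                    out=np.zeros_like(fluence_imrt), where=fluence_open > 0)

def reweight(particles, eff_map, pixel_size_cm):
    """Redistribute open-field phase-space weights for an IMRT field.

    `particles` is assumed to be a structured array with fields x, y (cm)
    and wgt, recorded where they cross the plane of the efficiency map."""
    ny, nx = eff_map.shape
    ix = np.clip((particles["x"] / pixel_size_cm + nx / 2).astype(int), 0, nx - 1)
    iy = np.clip((particles["y"] / pixel_size_cm + ny / 2).astype(int), 0, ny - 1)
    particles["wgt"] *= eff_map[iy, ix]     # weight scaled by local efficiency
    return particles
```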

  4. Characterization of Siemens Bio graph 6 PET by Monte Carlo simulation; Caracterizacion del escaner PET Biograph 6 de Siemens mediante simulacion Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Gallego Franco, P.; Garcia Marcos, R.

    2015-07-01

    The GAMOS simulation code, based on Geant4, is a very powerful tool for design and modeling optimization of Positron Emission Tomography (PET) systems. In order to obtain proper image quality, it is extremely important to determine the optimal activity to be delivered. For this reason, a study of the internal system parameters that affect image quality, such as the scatter fraction (SF) and the noise equivalent count rate (NEC), has been carried out. The study involves the comparison of experimental measurements of both parameters with those obtained by Monte Carlo simulation of the Siemens Biograph 6 True Point PET scanner with the True V option. Based on the simulation results, a paralyzable dead-time model was derived that adjusts, depending on the delivered activity, the proper dead time for the scanner detectors. A study of the variation of this dead time with activity has also been carried out. (Author)
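    For reference, a paralyzable dead-time model of the kind fitted here relates the observed count rate m to the true rate n as m = n·exp(-n·τ); a small sketch (the value of τ is purely illustrative, not the fitted one):

```python
import numpy as np

def observed_rate(true_rate, tau):
    """Paralyzable dead-time model: m = n * exp(-n * tau)."""
    return true_rate * np.exp(-true_rate * tau)

tau = 2.0e-6  # effective dead time in seconds (illustrative value)
for n in (1e4, 1e5, 5e5, 1e6):
    print(f"true {n:8.0f} cps -> observed {observed_rate(n, tau):8.0f} cps")
```

    The observed rate peaks at n = 1/τ and then decreases; it is this characteristic activity dependence that allows a dead-time parameter to be fitted against the delivered activity.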

  5. Monte Carlo 2000 Conference : Advanced Monte Carlo for Radiation Physics, Particle Transport Simulation and Applications

    CERN Document Server

    Baräo, Fernando; Nakagawa, Masayuki; Távora, Luis; Vaz, Pedro

    2001-01-01

    This book focuses on the state of the art of Monte Carlo methods in radiation physics and particle transport simulation and applications, the latter involving, in particular, the use and development of electron-gamma, neutron-gamma and hadronic codes. Besides the basic theory and the methods employed, special attention is paid to algorithm development for modeling, and to the analysis of experiments and measurements in a variety of fields ranging from particle physics to medical physics.

  6. Hybrid SN/Monte Carlo research and results

    International Nuclear Information System (INIS)

    The neutral particle transport equation is solved by a hybrid method that iteratively couples regions where deterministic (SN) and stochastic (Monte Carlo) methods are applied. The Monte Carlo and SN regions are fully coupled in the sense that no assumption is made about geometrical separation or decoupling. The hybrid method has been successfully applied to realistic shielding problems. The vectorized Monte Carlo algorithm in the hybrid method has been ported to the massively parallel architecture of the Connection Machine. Comparisons of performance on a vector machine (Cray Y-MP) and the Connection Machine (CM-2) show that significant speedups are obtainable for vectorized Monte Carlo algorithms on massively parallel machines, even when realistic problems requiring variance reduction are considered. However, the architecture of the Connection Machine does place some limitations on the regime in which the Monte Carlo algorithm may be expected to perform well. (author)

  7. Reconstruction of Monte Carlo replicas from Hessian parton distributions

    CERN Document Server

    Hou, Tie-Jiun; Huston, Joey; Nadolsky, Pavel; Schmidt, Carl; Stump, Daniel; Wang, Bo-Ting; Xie, Ke-Ping; Dulat, Sayipjamal; Pumplin, Jon; Yuan, C -P

    2016-01-01

    We explore connections between two common methods for quantifying the uncertainty in parton distribution functions (PDFs), based on the Hessian error matrix and on Monte Carlo sampling. CT14 parton distributions in the Hessian representation are converted into Monte Carlo replicas by a numerical method that reproduces important properties of the CT14 Hessian PDFs: the asymmetry of the CT14 uncertainties and the positivity of individual parton distributions. The ensembles of CT14 Monte Carlo replicas constructed this way at NNLO and NLO are suitable for various collider applications, such as cross-section reweighting. Master formulas for the computation of asymmetric standard deviations in the Monte Carlo representation are derived. A numerical program is made available for the conversion of Hessian PDFs into Monte Carlo replicas according to normal, log-normal, and Watt-Thorne sampling procedures.
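    One common prescription for such a conversion, sketched below, displaces the central PDF along each Hessian eigenvector direction, choosing the plus or minus error set according to the sign of a Gaussian random number so that the asymmetry of the uncertainties is retained (an illustration of the general idea, not the paper's exact master formulas):

```python
import numpy as np

def hessian_to_replicas(f0, f_plus, f_minus, n_rep=1000,
                        rng=np.random.default_rng()):
    """Build Monte Carlo replicas from Hessian PDF error sets.

    f0      : central PDF values, shape (n_x,)
    f_plus  : plus-direction error sets,  shape (n_eig, n_x)
    f_minus : minus-direction error sets, shape (n_eig, n_x)
    The + or - displacement is picked by the sign of a Gaussian random
    number, preserving the asymmetry of the Hessian uncertainties."""
    n_eig = f_plus.shape[0]
    replicas = []
    for _ in range(n_rep):
        r = rng.standard_normal(n_eig)
        shift = np.where(r[:, None] > 0, f_plus - f0, f_minus - f0)
        replicas.append(f0 + (np.abs(r)[:, None] * shift).sum(axis=0))
    return np.array(replicas)
```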

  8. Problems in radiation shielding calculations with Monte Carlo methods

    International Nuclear Information System (INIS)

    The Monte Carlo method is a very useful tool for solving a large class of radiation transport problems. In contrast with deterministic methods, geometric complexity is a much less significant problem for Monte Carlo calculations. However, the accuracy of Monte Carlo calculations is, of course, limited by the statistical error of the quantities to be estimated. In this report, we point out some typical problems in solving large shielding systems that include radiation streaming. The Monte Carlo coupling technique was developed to treat such shielding problems accurately. However, the variance of Monte Carlo results obtained using the coupling technique for detectors located outside the radiation streaming was still not small enough. To produce more accurate results both for detectors located outside the streaming and for a multi-legged-duct streaming problem, a practicable "Prism Scattering technique" is proposed in this study. (author)

  9. Quantum Monte Carlo methods algorithms for lattice models

    CERN Document Server

    Gubernatis, James; Werner, Philipp

    2016-01-01

    Featuring detailed explanations of the major algorithms used in quantum Monte Carlo simulations, this is the first textbook of its kind to provide a pedagogical overview of the field and its applications. The book provides a comprehensive introduction to the Monte Carlo method, its use, and its foundations, and examines algorithms for the simulation of quantum many-body lattice problems at finite and zero temperature. These algorithms include continuous-time loop and cluster algorithms for quantum spins, determinant methods for simulating fermions, power methods for computing ground and excited states, and the variational Monte Carlo method. Also discussed are continuous-time algorithms for quantum impurity models and their use within dynamical mean-field theory, along with algorithms for analytically continuing imaginary-time quantum Monte Carlo data. The parallelization of Monte Carlo simulations is also addressed. This is an essential resource for graduate students, teachers, and researchers interested in ...

  10. A hybrid Monte Carlo and response matrix Monte Carlo method in criticality calculation

    International Nuclear Information System (INIS)

    Full-core calculations are very useful and important in reactor physics analysis, especially for computing full-core power distributions, optimizing refueling strategies and analyzing fuel depletion. To reduce the computing time and accelerate convergence, a method named the Response Matrix Monte Carlo (RMMC) method, based on analog Monte Carlo simulation, was used to calculate fixed-source neutron transport problems in repeated structures. To make the calculations more accurate, we put forward an RMMC method based on non-analog Monte Carlo simulation and investigate how to use the RMMC method in criticality calculations. A new hybrid RMMC and MC (RMMC+MC) method is then put forward to solve criticality problems with combined repeated and flexible geometries. This new RMMC+MC method, having the advantages of both the MC method and the RMMC method, can not only increase the efficiency of the calculations but also simulate more complex geometries than repeated structures alone. Several 1-D numerical problems are constructed to test the new RMMC and RMMC+MC methods. The results show that the RMMC method and the RMMC+MC method can efficiently reduce the computing time and the variances of the calculations. Finally, future research directions are mentioned and discussed at the end of this paper to make the RMMC and RMMC+MC methods more powerful. (authors)

  11. Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Lai, Chao-Jen, E-mail: cjlai3711@gmail.com; Zhong, Yuncheng; Yi, Ying; Wang, Tianpeng; Shaw, Chris C. [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030-4009 (United States)

    2015-06-15

    Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region's visibility. In this study, the authors performed Monte Carlo simulations to estimate the average breast dose and the average glandular dose (AGD) for the VOI breast CT technique. Methods: Electron-Gamma-Shower system code-based Monte Carlo codes were used to simulate breast CT. The Monte Carlo codes were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full-cone x-ray source was then collimated to simulate half-cone-beam x rays to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. Hemi-ellipsoidal homogeneous phantoms 13 cm in diameter and 10 cm long were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fraction (VGF) were used to investigate the influence on breast dose. The simulated half-cone-beam x rays were then collimated to a narrow x-ray beam with a 2.5 × 2.5 cm² field of view at the isocenter plane to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique. Results: The ratios of air kerma ratios and dose measurement results from the Monte Carlo simulation to those from the physical

  12. Commissioning and First Observations with Wide FastCam at the Telescopio Carlos Sánchez

    CERN Document Server

    Velasco, Sergio; Oscoz, Alejandro; López, Roberto L; Puga, Marta; Murga, Gaizka; Pérez-Garrido, Antonio; Pallé, Enric; Ricci, Davide; Ayuso, Ismael; Hernández-Sánchez, Mónica; Truant, Nicola

    2016-01-01

    The FastCam instrument platform, jointly developed by the IAC and the UPCT, allows, in real time, the acquisition, selection and storage of images with a resolution that reaches the diffraction limit of medium-sized telescopes. FastCam incorporates a specially designed software package to analyse series of tens of thousands of images in parallel with the data acquisition at the telescope. Wide FastCam is a new instrument that, using the same software for data acquisition, does not aim at lucky imaging but at fast observations over a much larger field of view. Here we describe the commissioning process and first observations with Wide FastCam at the Telescopio Carlos Sánchez (TCS) in the Observatorio del Teide.

  13. Contrast to Noise Ratio and Contrast Detail Analysis in Mammography:A Monte Carlo Study

    Science.gov (United States)

    Metaxas, V.; Delis, H.; Kalogeropoulou, C.; Zampakis, P.; Panayiotakis, G.

    2015-09-01

    The mammographic spectrum is one of the major factors affecting image quality in mammography. In this study, a Monte Carlo (MC) simulation model was used to evaluate the image quality characteristics of various mammographic spectra. The anode/filter combinations evaluated were those traditionally used in mammography, for tube voltages between 26 and 30 kVp. The imaging performance was investigated in terms of Contrast-to-Noise Ratio (CNR) and Contrast Detail (CD) analysis, involving human observers and utilizing a mathematical CD phantom. Soft spectra provided the best characteristics in terms of both CNR and CD scores, while tube voltage had a limited effect. W-anode spectra filtered with k-edge filters demonstrated an improved performance that was sometimes better than that of the softer x-ray spectra produced by Mo or Rh anodes. Regarding the filter material, k-edge filters showed superior performance compared to Al filters.
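    For reference, the CNR figure of merit used in such studies is typically computed from a signal and a background region of interest as follows (a generic definition; the study may use a variant):

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio from two regions of interest (2D pixel arrays)."""
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std(ddof=1)

# Synthetic illustration: an 8%-contrast detail on a noisy background
rng = np.random.default_rng(3)
bg = rng.normal(100.0, 5.0, size=(64, 64))
sig = rng.normal(108.0, 5.0, size=(16, 16))
print(f"CNR = {cnr(sig, bg):.2f}")   # ~1.6 for these numbers
```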

  14. PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation

    Energy Technology Data Exchange (ETDEWEB)

    Espana, S; Herraiz, J L; Vicente, E; Udias, J M [Grupo de Fisica Nuclear, Departmento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, Madrid (Spain); Vaquero, J J; Desco, M [Unidad de Medicina y CirugIa Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es

    2009-03-21

    Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.

  15. Development and evaluation of attenuation and scatter correction techniques for SPECT using the Monte Carlo method

    International Nuclear Information System (INIS)

    Quantitative scintigraphic images, obtained with NaI(Tl) scintillation cameras, are limited by photon attenuation and by the contribution from scattered photons. A Monte Carlo program was developed in order to evaluate these effects. Simple source-phantom geometries and more complex nonhomogeneous cases can be simulated. Comparisons with experimental data for both homogeneous and nonhomogeneous regions and with published results have shown good agreement. The usefulness for simulating parameters of scintillation camera systems, stationary as well as SPECT systems, has also been demonstrated. An attenuation correction method based on density maps and build-up functions has been developed. The maps were obtained from a transmission measurement using an external 57Co flood source, and the build-up was simulated by the Monte Carlo code. Two scatter correction methods, the dual-window method and the convolution-subtraction method, have been compared using the Monte Carlo method. The aim was to compare the estimated scatter with the true scatter in the photo-peak window. It was concluded that accurate depth-dependent scatter functions are essential for a proper scatter correction. A new scatter and attenuation correction method has been developed based on scatter line-spread functions (SLSF) obtained for different depths and lateral positions in the phantom. An emission image is used to determine the source location in order to estimate the scatter in the photo-peak window. Simulation studies of a clinically realistic source in different positions in cylindrical water phantoms were made for three photon energies. The SLSF correction method was also evaluated by simulation studies for (1) a myocardial source, (2) a uniform source in the lungs and (3) a tumour located in the lungs in a realistic, nonhomogeneous computer phantom. The results showed that quantitative images could be obtained in nonhomogeneous regions. (67 refs.)
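    The dual-window method compared here estimates the scatter present in the photopeak window from the counts in an adjacent lower-energy window; a minimal sketch (the scaling factor k ≈ 0.5 is the value commonly quoted in the literature, not necessarily the one used in this work):

```python
import numpy as np

def dual_window_correction(peak_image, scatter_image, k=0.5):
    """Dual-energy-window scatter correction for SPECT projections.

    peak_image    : counts acquired in the photopeak window
    scatter_image : counts acquired in the adjacent lower scatter window
    k             : scatter-window scaling factor (commonly ~0.5)"""
    corrected = peak_image - k * scatter_image
    return np.clip(corrected, 0, None)   # avoid negative counts
```

    The thesis' point about depth-dependent scatter functions is visible even here: a single global k cannot capture how the scatter fraction varies with source depth, which motivates the SLSF approach.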

  16. Comparing analytical and Monte Carlo optical diffusion models in phosphor-based X-ray detectors

    Science.gov (United States)

    Kalyvas, N.; Liaparinos, P.

    2014-03-01

    Luminescent materials are employed as radiation-to-light converters in the detectors of medical imaging systems, often referred to as phosphor screens. Several processes affect the light transfer properties of phosphors; among the most important are the optical interactions. Light attenuation (absorption and scattering) can be described either through "diffusion" theory in theoretical models or through "quantum" theory in Monte Carlo methods. Although analytical methods based on photon diffusion equations have preferentially been employed to investigate optical diffusion in the past, Monte Carlo simulation models can overcome several of the assumptions of analytical modelling. The present study aimed to compare the two methodologies and to investigate the dependence of the analytical-model optical parameters on particle size. It was found that the optical photon attenuation coefficients calculated by analytical modeling decrease with particle size (in the region 1-12 μm). In addition, for particle sizes smaller than 6 μm there is no simultaneous agreement between the theoretical modulation transfer function and the light escape values with respect to the Monte Carlo data.

  17. Perspectives & advanced projects for small satellite missions at Carlo Gavazzi Space

    Science.gov (United States)

    Morea, G.; Sabatini, P.

    2004-11-01

    This paper presents the planned and on-going programmes at Carlo Gavazzi Space (CGS) for the next five years. Thanks to the success of the first MITA platform mission, CGS has acquired consolidated experience in satellite system design and as prime contractor in satellite programmes. In the four years since the launch of the first MITA platform from Plesetsk (CIS), several mission concepts and satellite programmes have been started and are under development. The elements common to these programmes are low mission cost and a short development schedule. The first ASI scientific small mission using the MITA platform, AGILE, is a gamma-ray detector aimed at identifying gamma-ray bursts. The payload has been developed with the contribution of a large group of Italian research centres and institutes; Carlo Gavazzi Space is also responsible for the overall mission as leader of an Italian consortium. In the frame of ASI's Earth observation programmes, Carlo Gavazzi Space has also successfully concluded Phase B/C of the HypSEO (HyperSpectral Earth Observer) mission. The DesertSat satellite, devoted to the study of sand dune movements and to the assessment of the desertification process, is a joint collaboration between ASI and Egypt; DesertSat is equipped with a multispectral imager. PALAMEDE has two peculiar characteristics: the first is that it uses components and technologies that are not space qualified and are therefore far cheaper than those normally used for space systems; the second is that it is entirely realised by the students of the Politecnico.

  18. Molecular Dynamics and Monte Carlo simulations resolve apparent diffusion rate differences for proteins confined in nanochannels

    Energy Technology Data Exchange (ETDEWEB)

    Tringe, J.W., E-mail: tringe2@llnl.gov [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Ileri, N. [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Department of Chemical Engineering & Materials Science, University of California, Davis, CA (United States); Levie, H.W. [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Stroeve, P.; Ustach, V.; Faller, R. [Department of Chemical Engineering & Materials Science, University of California, Davis, CA (United States); Renaud, P. [Swiss Federal Institute of Technology, Lausanne, (EPFL) (Switzerland)

    2015-08-18

    Highlights:
    - WGA proteins in nanochannels modeled by Molecular Dynamics and Monte Carlo.
    - Protein surface coverage characterized by atomic force microscopy.
    - Models indicate transport characteristics depend strongly on surface coverage.
    - Results resolve a four-orders-of-magnitude difference in diffusion coefficient values.
    Abstract: We use Molecular Dynamics and Monte Carlo simulations to examine molecular transport phenomena in nanochannels, explaining a four-orders-of-magnitude difference in wheat germ agglutinin (WGA) protein diffusion rates observed by fluorescence correlation spectroscopy (FCS) and by direct imaging of fluorescently-labeled proteins. We first use the ESPResSo Molecular Dynamics code to estimate the surface transport distance for neutral and charged proteins. We then employ a Monte Carlo model to calculate the paths of protein molecules on surfaces and in the bulk liquid transport medium. Our results show that the transport characteristics depend strongly on the degree of molecular surface coverage. Atomic force microscope characterization of surfaces exposed to WGA proteins for 1000 s shows large protein aggregates, consistent with the predicted coverage. These calculations and experiments provide useful insight into the details of molecular motion in confined geometries.
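    The strong dependence of apparent transport on surface interactions can be caricatured with a two-state random walk in which a molecule alternates between slow surface diffusion and fast bulk diffusion (a toy model of ours, far simpler than the ESPResSo and Monte Carlo machinery used in the paper):

```python
import numpy as np

rng = np.random.default_rng(7)
D_BULK, D_SURF = 1.0, 1e-4       # diffusion coefficients (arbitrary units)
P_ADSORB, P_DESORB = 0.05, 0.01  # per-step exchange probabilities (illustrative)

def apparent_diffusion(n_steps=100_000, dt=1.0):
    """Estimate the apparent 1D diffusion coefficient of one trajectory."""
    x, on_surface = 0.0, False
    for _ in range(n_steps):
        if on_surface:
            x += np.sqrt(2 * D_SURF * dt) * rng.standard_normal()
            on_surface = rng.random() >= P_DESORB    # stay adsorbed or desorb
        else:
            x += np.sqrt(2 * D_BULK * dt) * rng.standard_normal()
            on_surface = rng.random() < P_ADSORB     # adsorb onto the wall
    return x * x / (2 * n_steps * dt)                # Einstein-relation estimate

print("apparent D ~", np.mean([apparent_diffusion() for _ in range(20)]))
```

    Because the walker spends most of its time adsorbed, the apparent coefficient falls far below D_BULK, which is the qualitative mechanism behind order-of-magnitude discrepancies between measurement techniques that probe different populations.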

  19. Hamiltonian Monte Carlo algorithm for the characterization of hydraulic conductivity from the heat tracing data

    Science.gov (United States)

    Djibrilla Saley, A.; Jardani, A.; Soueid Ahmed, A.; Raphael, A.; Dupont, J. P.

    2016-11-01

    Estimating the spatial distribution of hydraulic conductivity in heterogeneous aquifers has always been an important and challenging task in hydrology. Generally, the hydraulic conductivity field is determined from hydraulic head or pressure measurements. In the present study, we propose to use temperature data as a source of information for characterizing the spatial distribution of the hydraulic conductivity field. To this end, we performed a laboratory sandbox experiment with the aim of imaging the heterogeneities of the hydraulic conductivity field from thermal monitoring. During the laboratory experiment, we injected a hot water pulse, which induced the motion of a heat plume in the sandbox. The induced plume was followed by a set of thermocouples placed in the sandbox. After the temperature data acquisition, we performed a hydraulic tomography using the stochastic Hybrid Monte Carlo approach, also called the Hamiltonian Monte Carlo (HMC) algorithm, to invert the temperature data. This algorithm is based on a combination of the Metropolis Monte Carlo method and the Hamiltonian dynamics approach. The parameterization of the inverse problem was done with the Karhunen-Loève (KL) expansion to reduce the dimensionality of the unknown parameters. Our approach provided a successful reconstruction of the hydraulic conductivity field with low computational effort.
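    The Karhunen-Loève parameterization used to reduce the dimension of the unknown field can be sketched as follows (a 1D grid and an exponential covariance are chosen purely for illustration):

```python
import numpy as np

def kl_expansion(x, sigma2=1.0, corr_len=0.2, n_modes=10,
                 rng=np.random.default_rng()):
    """Draw one realization of a log-conductivity field via a truncated KL expansion.

    x : 1D array of grid coordinates in [0, 1]."""
    # Exponential covariance matrix on the grid
    C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    # Eigen-decomposition (eigh returns ascending order; keep the largest modes)
    lam, phi = np.linalg.eigh(C)
    lam, phi = lam[::-1][:n_modes], phi[:, ::-1][:, :n_modes]
    xi = rng.standard_normal(n_modes)   # the low-dimensional coefficients
    return phi @ (np.sqrt(np.maximum(lam, 0)) * xi)

x = np.linspace(0, 1, 200)
log_K = kl_expansion(x)   # an MCMC inversion then targets the ~10 coefficients xi
```

    Truncating after a handful of dominant modes is what makes the HMC inversion tractable: the sampler explores ten or so KL coefficients instead of one conductivity value per grid cell.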

  20. Monte Carlo simulation for the micellar behavior of amphiphilic comb-like copolymers

    Institute of Scientific and Technical Information of China (English)

    冯莺; 隋家贤; 赵季若; 陈欣方

    2000-01-01

    Micellar behaviors in 2D and 3D lattice models of amphiphilic comb-like copolymers in a water phase and in water/oil mixtures were simulated. A dynamical algorithm together with chain-reptation movements was used in the simulation. A three-dimensional display program was written, and the free energy was estimated by the Monte Carlo technique. The results demonstrate that the reduced interaction energy greatly influences the morphological structures of micelle and emulsion systems; 3D visualization gives a more direct picture of the morphological structures; and amphiphilic comb-like polymers with a hydrophobic main chain and hydrophilic side chains have lower energy in water than in oil.

  1. Monte Carlo design for a new neutron collimator at the ENEA Casaccia TRIGA reactor.

    Science.gov (United States)

    Burgio, N; Rosa, R

    2004-10-01

    The TRIGA RC-1 1 MW reactor operating at the ENEA Casaccia Centre is currently being equipped with a second neutron imaging facility, devoted to computed tomography as well as neutron tomography. In order to reduce the gamma-ray content of the neutron beam, the reactor's tangential piercing channel was selected. A set of Monte Carlo simulations was used to design the neutron collimator and to make a preliminary choice of the materials to be employed in its construction.

  2. Rapid Monte Carlo simulation of detector DQE(f)

    Energy Technology Data Exchange (ETDEWEB)

    Star-Lack, Josh, E-mail: josh.starlack@varian.com; Sun, Mingshan; Abel, Eric [Varian Medical Systems, Palo Alto, California 94304-1030 (United States); Meyer, Andre; Morf, Daniel [Varian Medical Systems, CH-5405, Baden-Dattwil (Switzerland); Constantin, Dragos; Fahrig, Rebecca [Department of Radiology, Stanford University, Stanford, California 94305 (United States)

    2014-03-15

    Purpose: Performance optimization of indirect x-ray detectors requires proper characterization of both ionizing (gamma) and optical photon transport in a heterogeneous medium. As the tool of choice for modeling detector physics, Monte Carlo methods have failed to gain traction as a design utility, due mostly to excessive simulation times and a lack of convenient simulation packages. The most important figure-of-merit in assessing detector performance is the detective quantum efficiency (DQE), for which most of the computational burden has traditionally been associated with the determination of the noise power spectrum (NPS) from an ensemble of flood images, each conventionally having 10⁷-10⁹ detected gamma photons. In this work, the authors show that the idealized conditions inherent in a numerical simulation allow for a dramatic reduction in the number of gamma and optical photons required to accurately predict the NPS. Methods: The authors derived an expression for the mean squared error (MSE) of a simulated NPS when computed using the International Electrotechnical Commission-recommended technique based on taking the 2D Fourier transform of flood images. It is shown that the MSE is inversely proportional to the number of flood images, and is independent of the input fluence provided that the input fluence is above a minimal value that avoids biasing the estimate. The authors then propose to further lower the input fluence so that each event creates a point-spread function rather than a flood field. The authors use this finding as the foundation for a novel algorithm in which the characteristic MTF(f), NPS(f), and DQE(f) curves are simultaneously generated from the results of a single run. The authors also investigate lowering the number of optical photons used in a scintillator simulation to further increase efficiency. Simulation results are compared with measurements performed on a Varian AS1000 portal imager, and with a previously published
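    The flood-image NPS computation at the heart of this cost can be sketched as follows (the standard 2D-Fourier-transform estimator; ROI partitioning and detrending are simplified relative to the full IEC recipe):

```python
import numpy as np

def nps_2d(flood_images, pixel_pitch_mm):
    """Estimate the 2D noise power spectrum from an ensemble of flood images.

    flood_images : array of shape (n_images, ny, nx), uniform exposures
    Returns the NPS in mm^2 (any signal normalization is omitted here)."""
    n, ny, nx = flood_images.shape
    nps = np.zeros((ny, nx))
    for img in flood_images:
        noise = img - img.mean()              # simple detrend (IEC uses a 2D fit)
        nps += np.abs(np.fft.fft2(noise)) ** 2
    nps *= pixel_pitch_mm**2 / (n * nx * ny)  # standard FFT normalization
    return np.fft.fftshift(nps)

# The paper's point: the estimator's MSE falls as 1/n_images, so a simulation
# can use many low-fluence floods (down to single point-spread events) instead
# of a few floods with 1e7-1e9 detected photons each.
```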

  3. Carlos Gardel, el patrimonio que sonrie

    Directory of Open Access Journals (Sweden)

    María Julia Carozzi

    2003-10-01

    Full Text Available Analysing the ways in which the porteños remembered Carlos Gardel in the month of the 68th anniversary of his death, the article attempts to account for one of the ways in which the inhabitants of the city of Buenos Aires conceive what is memorable, identify that in which they recognize themselves as porteños, and single out that before which they experience feelings of collective belonging. The paper points to the central role that miracle, mimesis and direct contact with his body play in the preservation of the memory of Gardel, who embodies both the tango and its success in the world. The case of Gardel is presented as an example of the organization of the memory and identity of the porteños in particular, and of Argentines in general, around real persons to whom an extraordinary value is assigned. Because this memory is deeply rooted in concrete human bodies, it makes the local adoption of the globally accepted concepts of historical and cultural heritage problematic.

  4. SU-E-J-144: Low Activity Studies of Carbon 11 Activation Via GATE Monte Carlo

    International Nuclear Information System (INIS)

    Purpose: To investigate the behavior of a Monte Carlo simulation code at low levels of activity (~1,000 Bq). Such activity levels are expected from phantoms and patients activated by a proton therapy beam. Methods: Three different ranges for a therapeutic proton radiation beam were examined in a Monte Carlo simulation code: 13.5, 17.0 and 21.0 cm. For each range, the decay of an equivalent-length 11C source, and of additional sources one cm longer and shorter, was studied in a benchmark PET simulation for activities of 1000, 2000 and 3000 Bq. The ranges were chosen to coincide with a previous activation study, and the activities were chosen to coincide with the approximate level of isotope creation expected in a phantom or patient irradiated by a therapeutic proton beam. The GATE 7.0 simulation was run on a cluster node running Scientific Linux 6 "Carbon" (Red Hat). The resulting Monte Carlo data were investigated with the ROOT (CERN) analysis tool. The half-life of 11C was extracted via a histogram fit to the number of simulated PET events vs. time. Results: The average slope of the deviation of the extracted carbon half-life from the expected/nominal value vs. activity was generally positive. This was unexpected, as the deviation should, in principle, decrease with increased activity and lower statistical uncertainty. Conclusion: For activity levels on the order of 1,000 Bq, the behavior of a benchmark PET test was somewhat unexpected. It is important to be aware of the limitations of low-activity PET images and low-activity Monte Carlo simulations. This work was funded in part by the Philips corporation.
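    The half-life extraction described above amounts to fitting an exponential decay to the histogram of simulated PET events versus time; a minimal sketch with SciPy, using synthetic data in place of the GATE output:

```python
import numpy as np
from scipy.optimize import curve_fit

T_HALF_C11 = 1221.8   # nominal 11C half-life in seconds
rng = np.random.default_rng(5)

# Synthetic "PET events vs. time" histogram for a low-activity 11C source
t_edges = np.linspace(0, 3 * T_HALF_C11, 61)
t_mid = 0.5 * (t_edges[1:] + t_edges[:-1])
expected = 1000.0 * np.exp(-np.log(2) * t_mid / T_HALF_C11)
counts = rng.poisson(expected)                 # Poisson counting noise

def decay(t, n0, t_half):
    return n0 * np.exp(-np.log(2) * t / t_half)

popt, pcov = curve_fit(decay, t_mid, counts, p0=(800.0, 1000.0))
t_half_fit, t_half_err = popt[1], np.sqrt(pcov[1, 1])
print(f"extracted T1/2 = {t_half_fit:.0f} +/- {t_half_err:.0f} s")
# At ~1000 Bq the Poisson scatter dominates, so the extracted half-life can
# deviate noticeably from the nominal value -- the effect the study probes.
```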

  5. Recent advances and future prospects for Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B [Los Alamos National Laboratory

    2010-01-01

    The history of Monte Carlo methods is closely linked to that of computers: the first known Monte Carlo program was written in 1947 for the ENIAC; a pre-release of the first Fortran compiler was used for Monte Carlo in 1957; Monte Carlo codes were adapted to vector computers in the 1980s, to clusters and parallel computers in the 1990s, and to teraflop systems in the 2000s. Recent advances include hierarchical parallelism, combining threaded calculations on multicore processors with message-passing among different nodes. With the advances in computing, Monte Carlo codes have evolved with new capabilities and new ways of use. Production codes such as MCNP, MVP, MONK, TRIPOLI and SCALE are now 20-30 years old (or more) and are very rich in advanced features. The former "method of last resort" has now become the first choice for many applications. Calculations are now routinely performed on office computers, not just on supercomputers. Current research and development efforts are investigating the use of Monte Carlo methods on FPGAs, GPUs, and many-core processors. Other far-reaching research is exploring ways to adapt Monte Carlo methods to future exaflop systems that may have 1M or more concurrent computational processes.

  6. Iterative acceleration methods for Monte Carlo and deterministic criticality calculations

    International Nuclear Information System (INIS)

    If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors

  7. Iterative acceleration methods for Monte Carlo and deterministic criticality calculations

    Energy Technology Data Exchange (ETDEWEB)

    Urbatsch, T.J.

    1995-11-01

    If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors.

  8. The Monte Carlo method the method of statistical trials

    CERN Document Server

    Shreider, YuA

    1966-01-01

    The Monte Carlo Method: The Method of Statistical Trials is a systematic account of the fundamental concepts and techniques of the Monte Carlo method, together with its range of applications. Some of these applications include the computation of definite integrals, neutron physics, and in the investigation of servicing processes. This volume is comprised of seven chapters and begins with an overview of the basic features of the Monte Carlo method and typical examples of its application to simple problems in computational mathematics. The next chapter examines the computation of multi-dimensio

  9. Finding Planet Nine: a Monte Carlo approach

    CERN Document Server

    Marcos, C de la Fuente

    2016-01-01

    Planet Nine is a hypothetical planet located well beyond Pluto that has been proposed in an attempt to explain the observed clustering in physical space of the perihelia of six extreme trans-Neptunian objects or ETNOs. The predicted approximate values of its orbital elements include a semimajor axis of 700 au, an eccentricity of 0.6, an inclination of 30 degrees, and an argument of perihelion of 150 degrees. Searching for this putative planet is already under way. Here, we use a Monte Carlo approach to create a synthetic population of Planet Nine orbits and study its visibility statistically in terms of various parameters and focusing on the aphelion configuration. Our analysis shows that, if Planet Nine exists and is at aphelion, it might be found projected against one out of four specific areas in the sky. Each area is linked to a particular value of the longitude of the ascending node and two of them are compatible with an apsidal antialignment scenario. In addition and after studying the current statistic...

  10. Monte Carlo Production Management at CMS

    CERN Document Server

    Boudoul, G.; Pol, A; Srimanobhas, P; Vlimant, J R; Franzoni, Giovanni

    2015-01-01

    The analysis of the LHC data at the Compact Muon Solenoid (CMS) experiment requires the production of a large number of simulated events. During Run I of the LHC (2010-2012), CMS produced over 12 billion simulated events, organized in approximately sixty different campaigns, each emulating specific detector conditions and LHC running conditions (pile-up). In order to aggregate the information needed for the configuration and prioritization of the event production, to assure the book-keeping of all the processing requests placed by the physics analysis groups, and to interface with the CMS production infrastructure, the web-based service Monte Carlo Management (McM) was developed and put in production in 2012. McM is based on recent server infrastructure technology (CherryPy + Java) and relies on a CouchDB database back-end. This contribution will cover the one and a half years of operational experience managing samples of simulated events for CMS, the evolution of its functionalities and the extension of its capabi...

  11. Monte Carlo simulations for focusing elliptical guides

    Energy Technology Data Exchange (ETDEWEB)

    Valicu, Roxana [FRM2 Garching, Muenchen (Germany); Boeni, Peter [E20, TU Muenchen (Germany)

    2009-07-01

    The aim of the Monte Carlo simulations, performed with the McStas program, was to improve the focusing of the neutron beam existing at PGAA (FRM II) by prolonging the existing elliptic guide (currently coated with m = 3 supermirrors) with a new section. We first tried an initial length of 7.5 cm for the additional guide and supermirror coatings with m = 4, 5 and 6. The gain (calculated by dividing the intensity at the focal point after adding the guide by the intensity at the focal point with the initial guide) obtained for these coatings indicated that a coating with m = 5 would be appropriate for a first trial. The next step was to vary the length of the additional guide for this m value, choosing the length that maximizes the gain. With the m value and the length of the guide fixed, we introduced an aperture 1 cm before the focal point and varied its radius in order to obtain a focused beam. We observed a dramatic decrease in the size of the beam at the focal point after introducing this aperture. The simulation results, the gains obtained and the evolution of the beam size will be presented.

  12. Linear Scaling Quantum Monte Carlo Calculations

    Science.gov (United States)

    Williamson, Andrew

    2002-03-01

    New developments to the quantum Monte Carlo approach are presented that improve the scaling of the time required to calculate the total energy of a configuration of electronic coordinates from N^3 to nearly linear[1]. The first factor of N is achieved by applying a unitary transform to the set of single particle orbitals used to construct the Slater determinant, creating a set of maximally localized Wannier orbitals. These localized functions are then truncated beyond a given cutoff radius to introduce sparsity into the Slater determinant. The second factor of N is achieved by evaluating the maximally localized Wannier orbitals on a cubic spline grid, which removes the size dependence of the basis set (e.g. plane waves, Gaussians) typically used to expand the orbitals. Application of this method to the calculation of the binding energy of carbon fullerenes and silicon nanostructures will be presented. An extension of the approach to deal with excited states of systems will also be presented in the context of the calculation of the excitonic gap of a variety of systems. This work was performed under the auspices of the U.S. Dept. of Energy at the University of California/LLNL under contract no. W-7405-Eng-48. [1] A.J. Williamson, R.Q. Hood and J.C. Grossman, Phys. Rev. Lett. 87 246406 (2001)

  13. Diffusion Monte Carlo in internal coordinates.

    Science.gov (United States)

    Petit, Andrew S; McCoy, Anne B

    2013-08-15

    An internal coordinate extension of diffusion Monte Carlo (DMC) is described as a first step toward a generalized reduced-dimensional DMC approach. The method places no constraints on the choice of internal coordinates other than the requirement that they all be independent. Using H₃⁺ and its isotopologues as model systems, the methodology is shown to be capable of successfully describing the ground state properties of molecules that undergo large amplitude, zero-point vibrational motions. Combining the approach developed here with the fixed-node approximation allows vibrationally excited states to be treated. Analysis of the ground state probability distribution is shown to provide important insights into the set of internal coordinates that are less strongly coupled and therefore more suitable for use as the nodal coordinates for the fixed-node DMC calculations. In particular, the curvilinear normal mode coordinates are found to provide reasonable nodal surfaces for the fundamentals of H₂D⁺ and D₂H⁺ despite both molecules being highly fluxional.
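    The essentials of the DMC machinery being extended here can be conveyed by an unguided Cartesian toy implementation for a one-dimensional harmonic oscillator (internal coordinates, importance sampling and the fixed-node constraint are all omitted):

```python
import numpy as np

rng = np.random.default_rng(11)
DT, N_TARGET, N_STEPS = 0.01, 2000, 5000

def dmc_harmonic():
    """Ground-state energy of V(x) = x^2/2 by diffusion Monte Carlo (hbar = m = 1)."""
    walkers = rng.standard_normal(N_TARGET)
    e_ref, energies = 0.5, []
    for step in range(N_STEPS):
        # diffusion step (imaginary-time kinetic propagator)
        walkers = walkers + np.sqrt(DT) * rng.standard_normal(walkers.size)
        v = 0.5 * walkers**2
        # birth/death branching against the reference energy
        n_copies = (np.exp(-(v - e_ref) * DT) + rng.random(walkers.size)).astype(int)
        walkers = np.repeat(walkers, n_copies)
        # population feedback steers the ensemble back toward N_TARGET
        e_ref = v.mean() + 0.1 * np.log(N_TARGET / walkers.size)
        if step > 1000:
            energies.append(e_ref)
    return np.mean(energies)

print("E0 ~", dmc_harmonic())   # exact ground-state energy is 0.5
```

    The paper's contribution is, in effect, to carry out the diffusion step in independent internal coordinates rather than in Cartesians, and to choose the nodal coordinates for fixed-node calculations from the weakly coupled ones.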

  14. Commensurabilities between ETNOs: a Monte Carlo survey

    Science.gov (United States)

    de la Fuente Marcos, C.; de la Fuente Marcos, R.

    2016-07-01

    Many asteroids in the main and trans-Neptunian belts are trapped in mean motion resonances with Jupiter and Neptune, respectively. As a side effect, they experience accidental commensurabilities among themselves. These commensurabilities define characteristic patterns that can be used to trace the source of the observed resonant behaviour. Here, we explore systematically the existence of commensurabilities between the known extreme trans-Neptunian objects (ETNOs) using their heliocentric and barycentric semimajor axes, their uncertainties, and Monte Carlo techniques. We find that the commensurability patterns present in the known ETNO population resemble those found in the main and trans-Neptunian belts. Although based on small number statistics, such patterns can only be properly explained if most, if not all, of the known ETNOs are subjected to the resonant gravitational perturbations of yet undetected trans-Plutonian planets. We show explicitly that some of the statistically significant commensurabilities are compatible with the Planet Nine hypothesis; in particular, a number of objects may be trapped in the 5:3 and 3:1 mean motion resonances with a putative Planet Nine with semimajor axis ~700 au.
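
    A hedged Python sketch of the kind of Monte Carlo test described (object parameters and tolerances are hypothetical) draws semimajor axes from their quoted uncertainties and counts how often the implied period ratio falls near a resonant value:

        import numpy as np

        # Draw semimajor axes within their uncertainties and count near-resonant draws.
        # All numbers are hypothetical placeholders, not values from the paper.
        rng = np.random.default_rng(1)
        a_obj, sigma_obj = 336.0, 8.0      # hypothetical ETNO semimajor axis, 1-sigma (au)
        a_planet = 700.0                   # putative Planet Nine semimajor axis (au)
        ratios = {"3:1": 3.0, "5:3": 5.0 / 3.0}

        a_samples = rng.normal(a_obj, sigma_obj, 100_000)
        period_ratio = (a_planet / a_samples) ** 1.5   # Kepler's third law: P ~ a^(3/2)
        for name, q in ratios.items():
            frac = np.mean(np.abs(period_ratio - q) / q < 0.01)
            print(name, "within 1%:", frac)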

  15. Monte Carlo simulations for heavy ion dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Geithner, O.

    2006-07-26

    Water-to-air stopping power ratio (s_w,air) calculations for the ionization chamber dosimetry of clinically relevant ion beams with initial energies from 50 to 450 MeV/u have been performed using the Monte Carlo technique. To simulate the transport of a particle in water, the computer code SHIELD-HIT v2 was used, a substantially modified version of its predecessor SHIELD-HIT v1. The code was partially rewritten, replacing formerly used single precision variables with double precision variables. The lowest particle transport energy was decreased from 1 MeV/u down to 10 keV/u by modifying the Bethe-Bloch formula, thus widening the code's range for medical dosimetry applications. Optional MSTAR and ICRU-73 stopping power data were included. The fragmentation model was verified using all available experimental data and some parameters were adjusted. The present code version shows excellent agreement with experimental data. In addition to the calculations of stopping power ratios, s_w,air, the influence of fragments and I-values on s_w,air for carbon ion beams was investigated. The value of s_w,air deviates by as much as 2.3% at the Bragg peak from the constant value of 1.130 recommended by TRS-398, for an energy of 50 MeV/u. (orig.)

  16. A continuation multilevel Monte Carlo algorithm

    KAUST Repository

    Collier, Nathan

    2014-09-05

    We propose a novel Continuation Multi Level Monte Carlo (CMLMC) algorithm for weak approximation of stochastic models. The CMLMC algorithm solves the given approximation problem for a sequence of decreasing tolerances, ending when the required error tolerance is satisfied. CMLMC assumes discretization hierarchies that are defined a priori for each level and are geometrically refined across levels. The actual choice of computational work across levels is based on parametric models for the average cost per sample and the corresponding variance and weak error. These parameters are calibrated using Bayesian estimation, taking particular notice of the deepest levels of the discretization hierarchy, where only few realizations are available to produce the estimates. The resulting CMLMC estimator exhibits a non-trivial splitting between bias and statistical contributions. We also show the asymptotic normality of the statistical error in the MLMC estimator and justify in this way our error estimate that allows prescribing both required accuracy and confidence in the final result. Numerical results substantiate the above results and illustrate the corresponding computational savings in examples that are described in terms of differential equations either driven by random measures or with random coefficients. © 2014, Springer Science+Business Media Dordrecht.
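
    For orientation, a plain (non-continuation) multilevel Monte Carlo loop for a geometric Brownian motion looks as follows in Python; CMLMC adds the tolerance continuation and the Bayesian calibration of the per-level cost and variance models on top of this structure (parameters are illustrative, not from the paper):

        import numpy as np

        # Plain MLMC sketch for E[X_T] of dX = mu*X dt + sig*X dW (Euler scheme).
        rng = np.random.default_rng(2)
        mu, sig, T, x0 = 0.05, 0.2, 1.0, 1.0

        def level_correction(l, n, m0=2):
            """Samples of the level-l correction P_l - P_{l-1}, coupled on one path."""
            nf = m0 ** l                      # number of fine time steps
            dt = T / nf
            dw = rng.normal(0.0, np.sqrt(dt), size=(n, nf))
            xf = x0 * np.prod(1.0 + mu * dt + sig * dw, axis=1)
            if l == 0:
                return xf
            dwc = dw.reshape(n, nf // m0, m0).sum(axis=2)   # coarse increments
            xc = x0 * np.prod(1.0 + mu * m0 * dt + sig * dwc, axis=1)
            return xf - xc

        est = sum(level_correction(l, 20_000).mean() for l in range(5))
        print("MLMC estimate:", est, " exact:", x0 * np.exp(mu * T))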

  17. Monte Carlo study of nanowire magnetic properties

    Institute of Scientific and Technical Information of China (English)

    R.Masrour; L.Bahmad; A.Benyoussef

    2013-01-01

    In this work, we use Monte Carlo simulations to study the magnetic properties of a nanowire system based on a honeycomb lattice, in the absence as well as in the presence of both an external magnetic field and a crystal field. The system is formed of N_L layers with spins that can take the values σ = ±1/2 and S = ±1, 0. The blocking temperature is deduced, for each spin configuration, depending on the crystal field A. The effect of the exchange interaction coupling Jp between the spin configurations σ and S is studied for different values of temperature at fixed crystal field. The established ground-state phase diagram, in the (Jp, A) plane, shows that the only stable configurations are (1/2, 0), (1/2, +1), and (1/2, -1). The thermal magnetization and susceptibility are investigated for the two spin configurations, in the absence as well as in the presence of a crystal field. Finally, we establish the hysteresis cycle for different temperature values, showing that there is almost no remanent magnetization in the absence of the external magnetic field, and that the studied system exhibits super-paramagnetic behavior.
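
    A schematic Metropolis update for such a mixed-spin model is sketched below in Python; the geometry is reduced to a 1D alternating chain purely for brevity, whereas the paper uses a honeycomb-lattice nanowire (all parameter values are hypothetical):

        import numpy as np

        # Mixed-spin chain: sigma = +/-1/2 on even sites, S = 0, +/-1 on odd sites,
        # exchange Jp, crystal field A (acting on S sites), external field H.
        rng = np.random.default_rng(3)
        N, Jp, A, H, T = 100, -1.0, 0.5, 0.0, 1.0
        spins = np.where(np.arange(N) % 2 == 0,
                         rng.choice([-0.5, 0.5], N),
                         rng.choice([-1.0, 0.0, 1.0], N))

        def local_energy(i, s):
            nb = spins[(i - 1) % N] + spins[(i + 1) % N]
            crystal = -A * s * s if i % 2 == 1 else 0.0   # crystal field on S sites
            return -Jp * s * nb - H * s + crystal

        for _ in range(1000 * N):                         # Metropolis sweeps
            i = rng.integers(N)
            choices = [-0.5, 0.5] if i % 2 == 0 else [-1.0, 0.0, 1.0]
            s_new = rng.choice(choices)
            dE = local_energy(i, s_new) - local_energy(i, spins[i])
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i] = s_new
        print("magnetization per site:", spins.mean())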

  18. Monte Carlo Simulation of River Meander Modelling

    Science.gov (United States)

    Posner, A. J.; Duan, J. G.

    2010-12-01

    This study first compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johannesson and Parker (1989b). Ikeda et al.'s (1981) linear bank erosion model was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g. cohesiveness, stratigraphy, vegetation density). The developed model was used to predict the evolution of meandering planforms, and the modeling results were analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotation, several measures are formulated in order to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolution. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model. The model couples the quasi-2D Ikeda (1989) flow solution with a Monte Carlo simulation of the bank erosion coefficient.

  19. Markov Chain Monte Carlo and Irreversibility

    Science.gov (United States)

    Ottobre, Michela

    2016-06-01

    Markov Chain Monte Carlo (MCMC) methods are statistical methods designed to sample from a given measure π by constructing a Markov chain that has π as invariant measure and that converges to π. Most MCMC algorithms make use of chains that satisfy the detailed balance condition with respect to π; such chains are therefore reversible. On the other hand, recent work [18, 21, 28, 29] has stressed several advantages of using irreversible processes for sampling. Roughly speaking, irreversible diffusions converge to equilibrium faster (and lead to smaller asymptotic variance as well). In this paper we discuss some of the recent progress in the study of nonreversible MCMC methods. In particular: i) we explain some of the difficulties that arise in the analysis of nonreversible processes and we discuss some analytical methods to approach the study of continuous-time irreversible diffusions; ii) most of the rigorous results on irreversible diffusions are available for continuous-time processes; however, for computational purposes one needs to discretize such dynamics. It is well known that the resulting discretized chain will not, in general, retain all the good properties of the process that it is obtained from. In particular, if we want to preserve the invariance of the target measure, the chain might no longer be reversible. Therefore iii) we conclude by presenting an MCMC algorithm, the SOL-HMC algorithm [23], which results from a nonreversible discretization of a nonreversible dynamics.
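
    For reference, the detailed balance condition invoked above can be stated (standard notation, not quoted from the paper) as

        \pi(x)\, p(x, y) = \pi(y)\, p(y, x) \qquad \text{for all states } x, y .

    Nonreversible samplers give up this pairwise balance while still preserving \pi as the invariant measure, satisfying only global balance.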

  20. Monte Carlo Production Management at CMS

    Science.gov (United States)

    Boudoul, G.; Franzoni, G.; Norkus, A.; Pol, A.; Srimanobhas, P.; Vlimant, J.-R.

    2015-12-01

    The analysis of the LHC data at the Compact Muon Solenoid (CMS) experiment requires the production of a large number of simulated events. During Run I of the LHC (2010-2012), CMS produced over 12 billion simulated events, organized in approximately sixty different campaigns, each emulating specific detector conditions and LHC running conditions (pile-up). In order to aggregate the information needed for the configuration and prioritization of the event production, to assure the book-keeping of all the processing requests placed by the physics analysis groups, and to interface with the CMS production infrastructure, the web-based service Monte Carlo Management (McM) was developed and put in production in 2013. McM is based on recent server infrastructure technology (CherryPy + AngularJS) and relies on a CouchDB database back-end. This contribution covers the one and a half years of operational experience managing samples of simulated events for CMS, the evolution of McM's functionalities, and the extension of its capability to monitor the status and advancement of the event production.

  1. Measuring Berry curvature with quantum Monte Carlo

    CERN Document Server

    Kolodrubetz, Michael

    2014-01-01

    The Berry curvature and its descendant, the Berry phase, play an important role in quantum mechanics. They can be used to understand the Aharonov-Bohm effect, define topological Chern numbers, and generally to investigate the geometric properties of a quantum ground state manifold. While Berry curvature has been well-studied in the regimes of few-body physics and non-interacting particles, its use in the regime of strong interactions is hindered by the lack of numerical methods to solve it. In this paper we fill this gap by implementing a quantum Monte Carlo method to solve for the Berry curvature, based on interpreting Berry curvature as a leading correction to imaginary time ramps. We demonstrate our algorithm using the transverse-field Ising model in one and two dimensions, the latter of which is non-integrable. Despite the fact that the Berry curvature gives information about the phase of the wave function, we show that our algorithm has no sign or phase problem for standard sign-problem-free Hamiltonians...
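
    For orientation, the quantity being computed is the standard Berry curvature of the ground state |\psi_0(\lambda)\rangle over a parameter manifold (textbook definition, not quoted from the paper):

        F_{\mu\nu} = i \left( \langle \partial_{\mu} \psi_0 | \partial_{\nu} \psi_0 \rangle - \langle \partial_{\nu} \psi_0 | \partial_{\mu} \psi_0 \rangle \right).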

  2. Monte Carlo Simulations of the Photospheric Process

    CERN Document Server

    Santana, Rodolfo; Hernandez, Roberto A; Kumar, Pawan

    2015-01-01

    We present a Monte Carlo (MC) code we wrote to simulate the photospheric process and to study the photospheric spectrum above the peak energy. Our simulations were performed with a photon to electron ratio $N_{\gamma}/N_{e} = 10^{5}$, as determined by observations of the GRB prompt emission. We searched an exhaustive parameter space to determine if the photospheric process can match the observed high-energy spectrum of the prompt emission. If we do not consider electron re-heating, we determined that the best conditions to produce the observed high-energy spectrum are low photon temperatures and high optical depths. However, for these simulations, the spectrum peaks at an energy below 300 keV by a factor $\sim 10$. For the cases we consider with higher photon temperatures and lower optical depths, we demonstrate that additional energy in the electrons is required to produce a power-law spectrum above the peak-energy. By considering electron re-heating near the photosphere, the spectrum for these simulations h...

  3. Monte Carlo method application to shielding calculations

    International Nuclear Information System (INIS)

    CANDU spent fuel discharged from the reactor core contains Pu, so two concerns must be addressed: tracking the fuel reactivity in order to prevent critical mass formation, and protecting personnel during spent fuel manipulation. The basic tasks accomplished by the shielding calculations in a nuclear safety analysis consist of dose rate calculations, in order to prevent any risk both to personnel and to the environment during spent fuel manipulation, transport, and storage. To perform the photon dose rate calculations, the Monte Carlo MORSE-SGC code incorporated in the SAS4 sequence of the SCALE system was used. The objective of the paper was to obtain the photon dose rates at the spent fuel transport cask wall, in both the radial and axial directions. One spent CANDU fuel bundle was used as the radiation source. All the geometrical and material data related to the transport cask were taken according to the shipping cask type B model, whose prototype has been realized and tested in the Institute for Nuclear Research Pitesti. (authors)

  4. Rare event simulation using Monte Carlo methods

    CERN Document Server

    Rubino, Gerardo

    2009-01-01

    In a probabilistic model, a rare event is an event with a very small probability of occurrence. The forecasting of rare events is a formidable task but is important in many areas: for instance, a catastrophic failure in a transport system or in a nuclear power plant, or the failure of an information processing system in a bank, or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, the simulation of corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented along with an exposition of how to apply these tools to a variety of fields ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...

  5. Parallel Monte Carlo simulation of aerosol dynamics

    KAUST Repository

    Zhou, K.

    2014-01-01

    A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (the Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts of the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI), and the parallel computing efficiency is investigated through numerical examples: near 60% parallel efficiency is achieved for the largest test case, with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified by simulating various test cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only a small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and so forth, that is, the low-order moments of the particle size distribution (PSD) function. Accurately predicting the high-order moments of the PSD requires a dramatic increase in the number of MC particles. © 2014 Kun Zhou et al.
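
    A hedged Python sketch of the operator-splitting idea (constant coagulation kernel and placeholder rates, not the paper's model) alternates a deterministic growth step with a stochastic coagulation event:

        import numpy as np

        # Per time step: deterministic surface growth on every MC particle, then at
        # most one stochastic coagulation event (constant kernel; rates are toy values).
        rng = np.random.default_rng(4)
        v = rng.uniform(1.0, 2.0, 500)                  # particle volumes (arb. units)
        growth_rate, kern_const, dt = 0.01, 1e-4, 0.1

        for step in range(100):
            v += growth_rate * dt                       # deterministic growth step
            n = v.size
            total_rate = kern_const * n * (n - 1) / 2   # constant coagulation kernel
            if rng.random() < 1.0 - np.exp(-total_rate * dt):
                i, j = rng.choice(n, size=2, replace=False)
                v[i] += v[j]                            # merge particle j into i
                v = np.delete(v, j)
        print("particles left:", v.size, " mean volume:", v.mean())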

  6. Qualitative Simulation of Photon Transport in Free Space Based on Monte Carlo Method and Its Parallel Implementation

    Directory of Open Access Journals (Sweden)

    Xueli Chen

    2010-01-01

    During the past decade, the Monte Carlo method has been widely applied in optical imaging to simulate the photon transport process inside tissues. However, the method has not yet been effectively extended to the simulation of free-space photon transport. In this paper, a uniform framework for noncontact optical imaging is proposed based on the Monte Carlo method, consisting of the simulation of photon transport both in tissues and in free space. Specifically, the simplification theory of lens systems is utilized to model the camera lens in the optical imaging system, and the Monte Carlo method is employed to describe the energy transfer from the tissue surface to the CCD camera. The focusing effect of the camera lens is also considered to establish the correspondence between points on the tissue surface and on the CCD camera. Furthermore, a parallel version of the framework is realized, making the simulation much more convenient and efficient. The feasibility of the uniform framework and the effectiveness of the parallel version are demonstrated with a cylindrical phantom based on real experimental results.

  7. Monte-Carlo scatter correction for cone-beam computed tomography with limited scan field-of-view

    Science.gov (United States)

    Bertram, Matthias; Sattel, Timo; Hohmann, Steffen; Wiegert, Jens

    2008-03-01

    In flat-detector cone-beam computed tomography (CBCT), scattered radiation is a major source of image degradation, making accurate a posteriori scatter correction indispensable. A potential solution to this problem is computerized scatter correction based on Monte Carlo simulations, in which the detected distributions of X-ray scatter are estimated for various viewing directions from Monte Carlo simulations of an intermediate reconstruction. However, as a major drawback, for standard CBCT geometries and with standard-size flat detectors such as those mounted on interventional C-arms, the scan field of view is too small to accommodate the human body without lateral truncation, and thus this technique cannot be readily applied. In this work, we present a novel method for constructing a model of the object in a laterally, and possibly also axially, extended field of view, which enables meaningful application of Monte Carlo based scatter correction even in the case of heavy truncation. The evaluation is based on simulations of a clinical CT data set of a human abdomen, which strongly exceeds the field of view of the simulated C-arm based CBCT imaging geometry. Using the proposed methodology, almost complete removal of scatter-caused inhomogeneities is demonstrated in the reconstructed images.
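
    The correction step itself can be pictured with a short Python sketch (stand-in arrays and a hypothetical smoothing step, not the authors' implementation): the smoothed Monte Carlo scatter estimate is subtracted from each measured projection before reconstruction:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        # Stand-in projection and Monte Carlo scatter estimate; the Gaussian filter
        # suppresses MC noise in the scatter estimate before subtraction.
        rng = np.random.default_rng(5)
        measured = rng.uniform(0.4, 1.0, (256, 256))        # measured projection (I/I0)
        mc_scatter = 0.08 + 0.02 * rng.random((256, 256))   # noisy MC scatter estimate
        scatter_smooth = gaussian_filter(mc_scatter, sigma=8)
        corrected = np.clip(measured - scatter_smooth, 1e-6, None)
        line_integrals = -np.log(corrected)                 # input to reconstruction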

  8. Penumbral imaging and numerical evaluation of large area source neutron imaging system

    Institute of Scientific and Technical Information of China (English)

    WU YueLei; HU HuaSi; ZHANG BoPing; LI LinBo; CHEN Da; SHAN Qing; ZHU Jie

    2009-01-01

    A Monte Carlo model of the fusion neutron penumbral imaging system was established. The transfer functions of two discrete units in the neutron source were obtained in two situations: imaging in the geometrical near-optical approximation and in the real situation. The spatial resolutions of the imaging system in the two situations were evaluated and compared. The penumbral images of four units in the source were obtained by means of two-dimensional (2D) convolution and by Monte Carlo simulation, and were reconstructed with the same filtering method; the two approaches confirmed the same results, revealing the encoding essence of penumbral imaging. With an MCNP (Monte Carlo N-Particle) simulation, the neutron penumbral images of a large area source (200 μm×200 μm) on a scintillation fiber array were obtained. The improved Wiener filter method was used to reconstruct the penumbral image and obtain the source image. The results agree with the preset neutron source image, verifying the feasibility of the neutron imaging system.

  10. Monte Carlo computations of the hadronic mass spectrum

    International Nuclear Information System (INIS)

    This paper summarizes two talks presented at the Orbis Scientiae Meeting, 1982. Monte Carlo results on the mass gap (or glueball mass) and on the masses of the lightest quark-model hadrons are illustrated

  11. Monte Carlo techniques for analyzing deep penetration problems

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, S.N.; Gonnord, J.; Hendricks, J.S.

    1985-01-01

    A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications. 29 refs.

  12. Carlo Ginzburg: the anomaly points to the norm / interviewed by Marek Tamm

    Index Scriptorium Estoniae

    Ginzburg, Carlo, 1939-

    2014-01-01

    An interview with the Italian historian Carlo Ginzburg on the occasion of the Estonian publication of his book "Ükski saar pole saar : neli pilguheitu inglise kirjandusele globaalsest vaatenurgast" (No Island Is an Island: Four Glances at English Literature in a World Perspective). The book was published by Tallinn University Press.

  13. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature on phylogeny analysis. However, conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes: among the three methods, SAMC produces the consensus trees with the highest similarity to the true trees and the model parameter estimates with the smallest mean square errors, while costing the least CPU time. © 2009 Elsevier Inc. All rights reserved.

  14. Suppression of the initial transient in Monte Carlo criticality simulations

    International Nuclear Information System (INIS)

    Criticality Monte Carlo calculations aim at estimating the effective multiplication factor (k-effective) of a fissile system through iterations simulating neutron propagation (forming a Markov chain). Arbitrary initialization of the neutron population can deeply bias the k-effective estimate, defined as the mean of the k-effective values computed at each iteration. A simplified model of this cycle k-effective sequence is built, based on the characteristics of industrial criticality Monte Carlo calculations. Statistical tests, inspired by Brownian bridge properties, are designed to detect stationarity of the cycle k-effective sequence. The detected initial transient is then suppressed in order to improve the estimate of the system k-effective. The different versions of this methodology are detailed and compared, first on a set of numerical tests fitted to criticality Monte Carlo calculations, and second on real criticality calculations. Finally, the best methodologies observed in these tests are selected and shown to improve industrial Monte Carlo criticality calculations. (author)

  15. On the Markov Chain Monte Carlo (MCMC) method

    Indian Academy of Sciences (India)

    Rajeeva L Karandikar

    2006-04-01

    Markov Chain Monte Carlo (MCMC) is a popular method used to generate samples from arbitrary distributions, which may be specified indirectly. In this article, we give an introduction to this method along with some examples.
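
    As one concrete example of the method, a minimal random-walk Metropolis-Hastings sampler for a standard normal target can be written in a few lines of Python (illustrative, not from the article):

        import numpy as np

        # Minimal Metropolis-Hastings sampler for an unnormalized target density.
        rng = np.random.default_rng(6)

        def log_target(x):
            return -0.5 * x * x            # standard normal, up to a constant

        x, samples = 0.0, []
        for _ in range(50_000):
            prop = x + rng.normal(0.0, 1.0)            # symmetric random-walk proposal
            if np.log(rng.random()) < log_target(prop) - log_target(x):
                x = prop                               # accept; otherwise keep x
            samples.append(x)
        print("mean ~ 0:", np.mean(samples), " var ~ 1:", np.var(samples))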

  16. An Introduction to Multilevel Monte Carlo for Option Valuation

    CERN Document Server

    Higham, Desmond J

    2015-01-01

    Monte Carlo is a simple and flexible tool that is widely used in computational finance. In this context, it is common for the quantity of interest to be the expected value of a random variable defined via a stochastic differential equation. In 2008, Giles proposed a remarkable improvement to the approach of discretizing with a numerical method and applying standard Monte Carlo. His multilevel Monte Carlo method offers a speed-up of order 1/epsilon, where epsilon is the required accuracy, so computations can run 100 times more quickly when two digits of accuracy are required. The multilevel philosophy has since been adopted by a range of researchers and a wealth of practically significant results has arisen, most of which have yet to make their way into the expository literature. In this work, we give a brief, accessible introduction to multilevel Monte Carlo and summarize recent results applicable to the task of option valuation.
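
    The identity underlying the multilevel method is the telescoping sum over discretization levels (standard MLMC notation, assumed here rather than taken from the paper):

        \mathbb{E}[P_L] = \mathbb{E}[P_0] + \sum_{\ell=1}^{L} \mathbb{E}[P_\ell - P_{\ell-1}].

    Because the corrections P_\ell - P_{\ell-1} have small variance when consecutive levels are driven by the same Brownian path, most samples can be taken on the cheap coarse levels, which is the source of the quoted speed-up.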

  18. Monte Carlo variance reduction approaches for non-Boltzmann tallies

    International Nuclear Information System (INIS)

    Quantities that depend on the collective effects of groups of particles cannot be obtained from the standard Boltzmann transport equation. Monte Carlo estimates of these quantities are called non-Boltzmann tallies and have become increasingly important recently. Standard Monte Carlo variance reduction techniques were designed for tallies based on individual particles rather than groups of particles. Experience with non-Boltzmann tallies and analog Monte Carlo has demonstrated the severe limitations of analog Monte Carlo for many non-Boltzmann tallies. In fact, many calculations absolutely require variance reduction methods to achieve practical computation times. Three different approaches to variance reduction for non-Boltzmann tallies are described and shown to be unbiased. The advantages and disadvantages of each of the approaches are discussed

  19. Proton therapy Monte Carlo SRNA-VOX code

    OpenAIRE

    Ilić Radovan D.

    2012-01-01

    The most powerful feature of the Monte Carlo method is the possibility of simulating all individual particle interactions in three dimensions and performing numerical experiments with a preset error. These facts were the motivation behind the development of a general-purpose Monte Carlo SRNA program for proton transport simulation in technical systems described by standard geometrical forms (plane, sphere, cone, cylinder, cube). Some of the possible applications of the SRNA program are:...

  20. Measuring the reliability of MCMC inference with bidirectional Monte Carlo

    OpenAIRE

    Grosse, Roger B.; Ancha, Siddharth; Roy, Daniel M.

    2016-01-01

    Markov chain Monte Carlo (MCMC) is one of the main workhorses of probabilistic inference, but it is notoriously hard to measure the quality of approximate posterior samples. This challenge is particularly salient in black box inference methods, which can hide details and obscure inference failures. In this work, we extend the recently introduced bidirectional Monte Carlo technique to evaluate MCMC-based posterior inference algorithms. By running annealed importance sampling (AIS) chains both ...

  1. Confidence and efficiency scaling in Variational Quantum Monte Carlo calculations

    CERN Document Server

    Delyon, François; Holzmann, Markus

    2016-01-01

    Based on the central limit theorem, we discuss the problem of evaluating the statistical error of Monte Carlo calculations that use a time-discretized diffusion process. We present a robust and practical method to determine the effective variance of general observables and show how to verify the equilibrium hypothesis with the Kolmogorov-Smirnov test. We then derive scaling laws for the efficiency, illustrated by variational Monte Carlo calculations on the two-dimensional electron gas.

  2. HISTORY AND TERRITORY HEURISTICS FOR MONTE CARLO GO

    OpenAIRE

    BRUNO BOUZY

    2006-01-01

    Recently, the Monte Carlo approach has been applied to computer Go with promising success. INDIGO uses such an approach, which can be enhanced with specific heuristics. This paper assesses two heuristics within the 19 × 19 Monte Carlo Go framework of INDIGO: the territory heuristic and the history heuristic, both in their internal and external versions. The external territory heuristic is more effective, leading to a 40-point improvement on 19 × 19 boards. The external history heuristic brings...

  3. Identification of Logical Errors through Monte-Carlo Simulation

    CERN Document Server

    Emmett, Hilary L

    2010-01-01

    The primary focus of Monte Carlo simulation is to identify and quantify risk related to uncertainty and variability in spreadsheet model inputs. The stress of Monte Carlo simulation often reveals logical errors in the underlying spreadsheet model that might be overlooked during day-to-day use or traditional "what-if" testing. This secondary benefit of simulation requires a trained eye to recognize warning signs of poor model construction.

  4. Computing Greeks with Multilevel Monte Carlo Methods using Importance Sampling

    OpenAIRE

    Euget, Thomas

    2012-01-01

    This paper presents a new efficient way to reduce the variance of an estimator of popular payoffs and Greeks encountered in financial mathematics. The idea is to apply importance sampling with the multilevel Monte Carlo method recently introduced by M.B. Giles. So far, importance sampling has proved successful in combination with the standard Monte Carlo method. We will show the efficiency of our approach on the estimation of financial derivative prices and then on the estimation of Greeks (i.e. sensitivitie...

  5. The computation of Greeks with multilevel Monte Carlo

    OpenAIRE

    Burgos, Sylvestre Jean-Baptiste Louis; Michael B. Giles

    2014-01-01

    In mathematical finance, the sensitivities of option prices to various market parameters, also known as the “Greeks”, reflect the exposure to different sources of risk. Computing these is essential to predict the impact of market moves on portfolios and to hedge them adequately. This is commonly done using Monte Carlo simulations. However, obtaining accurate estimates of the Greeks can be computationally costly. Multilevel Monte Carlo offers complexity improvements over standard Monte Carl...

  6. On the inner workings of Monte Carlo codes

    OpenAIRE

    Dubbeldam, D.; Torres Knoop, A.; Walton, K.S.

    2013-01-01

    We review state-of-the-art Monte Carlo (MC) techniques for computing fluid coexistence properties (Gibbs simulations) and adsorption simulations in nanoporous materials such as zeolites and metal-organic frameworks. Conventional MC is discussed and compared to advanced techniques such as reactive MC, configurational-bias Monte Carlo and continuous fractional MC. The latter technique overcomes the problem of low insertion probabilities in open systems. Other modern methods are (hyper-)parallel...

  7. Public Infrastructure for Monte Carlo Simulation: publicMC@BATAN

    CERN Document Server

    Waskita, A A; Akbar, Z; Handoko, L T; 10.1063/1.3462759

    2010-01-01

    The first cluster-based public computing facility for Monte Carlo simulation in Indonesia is introduced. The system has been developed to enable the public to perform Monte Carlo simulations on a parallel computer through an integrated and user-friendly dynamic web interface. The beta version, called publicMC@BATAN, has been released and implemented for internal users at the National Nuclear Energy Agency (BATAN). In this paper the concept and architecture of publicMC@BATAN are presented.

  8. Monte Carlo method for solving a parabolic problem

    Directory of Open Access Journals (Sweden)

    Tian Yi

    2016-01-01

    In this paper, we present a numerical method based on random sampling for a parabolic problem. The method combines the Crank-Nicolson method with the Monte Carlo method: we first discretize the governing equations by the Crank-Nicolson method, obtaining a large sparse system of linear algebraic equations, and then use the Monte Carlo method to solve the linear algebraic equations. To illustrate the usefulness of this technique, we apply it to some test problems.
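
    One standard Monte Carlo approach to such a linear system (an Ulam-von Neumann-type estimator; the paper's exact estimator is not specified in the abstract) rewrites Ax = b as x = Hx + c and estimates components of x by random walks, as in this Python sketch on a toy tridiagonal system:

        import numpy as np

        rng = np.random.default_rng(7)
        # Toy diagonally dominant system (stand-in for a Crank-Nicolson matrix).
        A = np.array([[ 4.0, -1.0,  0.0],
                      [-1.0,  4.0, -1.0],
                      [ 0.0, -1.0,  4.0]])
        b = np.array([1.0, 2.0, 3.0])

        D = np.diag(A)
        H = np.eye(3) - A / D[:, None]      # Jacobi iteration matrix, spectral radius < 1
        c = b / D

        def mc_component(i, n_walks=5_000, n_steps=30):
            """Estimate x_i from x = Hx + c via truncated Neumann-series random walks."""
            total = 0.0
            for _ in range(n_walks):
                state, weight, acc = i, 1.0, 0.0
                for _ in range(n_steps):
                    acc += weight * c[state]
                    p = np.abs(H[state]) / np.abs(H[state]).sum()  # transition probs
                    nxt = rng.choice(c.size, p=p)
                    weight *= H[state, nxt] / p[nxt]
                    state = nxt
                total += acc
            return total / n_walks

        print([mc_component(i) for i in range(3)], " exact:", np.linalg.solve(A, b))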

  9. Monte Carlo Simulation of Optical Properties of Wake Bubbles

    Institute of Scientific and Technical Information of China (English)

    CAO Jing; WANG Jiang-An; JIANG Xing-Zhou; SHI Sheng-Wei

    2007-01-01

    Based on Mie scattering theory and the theory of multiple light scattering, the light scattering properties of air bubbles in a wake are analysed by Monte Carlo simulation. The results show that backscattering is enhanced obviously due to the existence of bubbles, especially with the increase of bubble density, and that it is feasible to use the Monte Carlo method to study the properties of light scattering by air bubbles.

  10. Monte Carlo methods and applications in nuclear physics

    International Nuclear Information System (INIS)

    Monte Carlo methods for studying few- and many-body quantum systems are introduced, with special emphasis given to their applications in nuclear physics. Variational and Green's function Monte Carlo methods are presented in some detail. The status of calculations of light nuclei is reviewed, including discussions of the three-nucleon interaction, charge and magnetic form factors, the Coulomb sum rule, and studies of low-energy radiative transitions. 58 refs., 12 figs

  11. A Particle Population Control Method for Dynamic Monte Carlo

    Science.gov (United States)

    Sweezy, Jeremy; Nolen, Steve; Adams, Terry; Zukaitis, Anthony

    2014-06-01

    A general particle population control method has been derived from splitting and Russian roulette for dynamic Monte Carlo particle transport. A well-known particle population control method, the particle population comb, is shown to be a special case of this general method. The general method has been incorporated in Los Alamos National Laboratory's Monte Carlo Application Toolkit (MCATK), and examples of its use are shown for both super-critical and sub-critical systems.
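
    A weight-preserving splitting/Russian-roulette step of the kind generalized here can be sketched in Python (a generic textbook variant, not the MCATK implementation):

        import numpy as np

        # Resize the population to n_target while preserving total weight in expectation.
        rng = np.random.default_rng(8)

        def population_control(weights, n_target):
            w_avg = weights.sum() / n_target            # weight per surviving particle
            out = []
            for w in weights:
                n = w / w_avg
                copies = int(n) + (rng.random() < n - int(n))  # split high, roulette low
                out.extend([w_avg] * copies)
            return np.array(out)

        weights = rng.exponential(1.0, 10_000)
        new = population_control(weights, 1_000)
        print(new.size, weights.sum(), new.sum())       # size ~ 1000; weight preserved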

  12. Monte Carlo simulation of large electron fields

    Science.gov (United States)

    Faddegon, Bruce A.; Perl, Joseph; Asai, Makoto

    2008-03-01

    Two Monte Carlo systems, EGSnrc and Geant4, the latter with two different 'physics lists,' were used to calculate dose distributions in large electron fields used in radiotherapy. Source and geometry parameters were adjusted to match calculated results to measurement. Both codes were capable of accurately reproducing the measured dose distributions of the six electron beams available on the accelerator. Depth penetration matched the average measured with a diode and parallel-plate chamber to 0.04 cm or better. Calculated depth dose curves agreed to 2% with diode measurements in the build-up region, although for the lower beam energies there was a discrepancy of up to 5% in this region when calculated results are compared to parallel-plate measurements. Dose profiles at the depth of maximum dose matched to 2-3% in the central 25 cm of the field, corresponding to the field size of the largest applicator. A 4% match was obtained outside the central region. The discrepancy observed in the bremsstrahlung tail in published results that used EGS4 is no longer evident. Simulations with the different codes and physics lists used different source energies, incident beam angles, thicknesses of the primary foils, and distance between the primary and secondary foil. The true source and geometry parameters were not known with sufficient accuracy to determine which parameter set, including the energy of the source, was closest to the truth. These results underscore the requirement for experimental benchmarks of depth penetration and electron scatter for beam energies and foils relevant to radiotherapy.

  13. Lattice Monte Carlo simulations of polymer melts

    Science.gov (United States)

    Hsu, Hsiao-Ping

    2014-12-01

    We use Monte Carlo simulations to study polymer melts consisting of fully flexible and moderately stiff chains in the bond fluctuation model at a volume fraction of 0.5. In order to reduce the local density fluctuations, we test a pre-packing process for the preparation of the initial configurations of the polymer melts, before the excluded volume interaction is switched on completely. This process leads to a significantly faster decrease of the number of overlapping monomers on the lattice. This is useful for simulating very large systems, where the statistical properties of the model with a marginally incomplete elimination of excluded volume violations are the same as those of the model with strictly enforced excluded volume. We find that the internal mean square end-to-end distance for moderately stiff chains in a melt can be described very well by a freely rotating chain model with a precise estimate of the bond-bond orientational correlation between two successive bond vectors in equilibrium. The probability distributions of the reduced end-to-end distance for chains of different stiffness also show excellent data collapse and are described very well by the Gaussian distribution for ideal chains. However, while our results confirm the systematic deviations from Gaussian statistics in the chain structure factor Sc(q) [a minimum in the Kratky plot] found by Wittmer et al. [EPL 77, 56003 (2007)] for fully flexible chains in a melt, we show that for the available chain lengths these deviations are no longer visible when chain stiffness is included. The mean square bond length and the compressibility estimated from the collective structure factor depend slightly on the stiffness of the chains.

  14. Monte Carlo Volcano Seismic Moment Tensors

    Science.gov (United States)

    Waite, G. P.; Brill, K. A.; Lanza, F.

    2015-12-01

    Inverse modeling of volcano seismic sources can provide insight into the geometry and dynamics of volcanic conduits. But given the logistical challenges of working on an active volcano, seismic networks are typically deficient in spatial and temporal coverage; this potentially leads to large errors in source models. In addition, uncertainties in the centroid location and moment-tensor components, including volumetric components, are difficult to constrain from linear inversion results, which leads to a poor understanding of the model space. In this study, we employ a nonlinear inversion using a Monte Carlo scheme with the objective of defining robustly resolved elements of the model space. The model space is randomized over centroid location and moment-tensor eigenvectors. Point sources densely sample the summit area, and each moment tensor is constrained to a randomly chosen geometry within the inversion; Green's functions for the random moment tensors are all calculated from modeled single forces, making the nonlinear inversion computationally reasonable. We apply this method to very-long-period (VLP) seismic events that accompany minor eruptions at Fuego volcano, Guatemala. The library of single-force Green's functions is computed with a 3D finite-difference modeling algorithm through a homogeneous velocity-density model that includes topography, for a 3D grid of nodes spaced 40 m apart within the summit region. The homogeneous velocity and density model is justified by the long wavelengths of the VLP data. The nonlinear inversion reveals well-resolved model features and informs the interpretation through a better understanding of the possible models. This approach can also be used to evaluate possible station geometries in order to optimize networks prior to deployment.

  15. An X-ray scatter system for material identification in cluttered objects: A Monte Carlo simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Lakshmanan, Manu N. [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Kapadia, Anuj J., E-mail: anuj.kapadia@duke.edu [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Sahbaee, Pooyan [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Dept. of Physics, NC State University, Raleigh, NC (United States); Wolter, Scott D. [Dept. of Physics, Elon University, Elon, NC (United States); Harrawood, Brian P. [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Brady, David [Dept. of Electrical and Computer Engineering, Duke University, Durham, NC (United States); Samei, Ehsan [Ravin Advanced Imaging Laboratories, Dept. of Radiology, Duke University Medical Center, Durham, NC (United States); Dept. of Electrical and Computer Engineering, Duke University, Durham, NC (United States)

    2014-09-15

    The analysis of X-ray scatter patterns has been demonstrated as an effective method of identifying specific materials in mixed-object environments, for both biological and non-biological applications. Here we describe an X-ray scatter imaging system for material identification in cluttered objects and investigate its performance using a large-scale Monte Carlo simulation study of one thousand objects containing a broad array of materials. The GEANT4 Monte Carlo source code for Rayleigh scatter physics was modified to model coherent scatter diffraction in bulk materials based on experimentally measured form factors for 33 materials. The simulation was then used to model coherent scatter signals from a variety of target and clutter (background) materials in one thousand randomized objects. The resulting scatter images were used to characterize four parameters of the imaging system that affected its ability to identify target materials: (a) the arrangement of materials in the object, (b) the clutter attenuation, (c) the type of target material, and (d) the X-ray tube current. We found that the positioning of target materials within the object did not significantly affect their detectability; however, a strong negative correlation was observed between target detectability and the clutter attenuation of the object. The imaging signal was also found to be relatively invariant to increases in X-ray tube current above 1 mAs for most materials considered in the study. This is, to our knowledge, the first Monte Carlo study of an X-ray scatter imaging system for material identification over a large population of cluttered objects, and it lays the foundation for large-scale studies of the effectiveness of X-ray scatter imaging systems for material identification in complex samples.

  16. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Pevey, Ronald E.

    2005-09-15

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.

  17. Validation of MTF measurement for CBCT system using Monte Carlo simulations

    Science.gov (United States)

    Hao, Ting; Gao, Feng; Zhao, Huijuan; Zhou, Zhongxing

    2016-03-01

    To evaluate the spatial resolution performance of a cone beam computed tomography (CBCT) system, accurate measurement of the modulation transfer function (MTF) is required. This accuracy depends on the MTF measurement method and on the CBCT reconstruction algorithm. In this work, the accuracy of MTF measurement of a CBCT system using a wire phantom is validated by Monte Carlo simulation. The Monte Carlo simulation software tool BEAMnrc/EGSnrc was employed to model the X-ray radiation beams and their transport. Tungsten wires were simulated with different diameters and radial distances from the axis of rotation. We adopted the filtered back projection technique to reconstruct images from a 360° acquisition. The MTFs for four reconstruction kernels were measured from the corresponding reconstructed wire images; the Ram-Lak kernel yielded a higher MTF than the cosine, Hamming, and Hann kernels. The results demonstrated that the MTF degrades radially away from the axis of rotation. This study suggests that an increase in the MTF of a CBCT system is possible by optimizing the scanning settings and reconstruction parameters.
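
    The MTF measurement itself can be pictured with a short Python sketch (a stand-in Gaussian wire profile and hypothetical pixel size): the line spread function extracted from the reconstructed wire image is Fourier transformed and normalized:

        import numpy as np

        # Estimate the MTF as the normalized magnitude of the Fourier transform of
        # the line spread function (LSF) extracted from a reconstructed wire image.
        def mtf_from_lsf(lsf, pixel_mm):
            lsf = lsf - lsf.min()
            lsf = lsf / lsf.sum()                          # normalize area to 1
            mtf = np.abs(np.fft.rfft(lsf))
            freq = np.fft.rfftfreq(lsf.size, d=pixel_mm)   # cycles/mm
            return freq, mtf / mtf[0]

        profile = np.exp(-0.5 * (np.arange(-32, 32) / 2.0) ** 2)  # stand-in wire profile
        freq, mtf = mtf_from_lsf(profile, pixel_mm=0.2)
        f50 = freq[np.argmax(mtf < 0.5)]                   # 50% MTF frequency
        print("MTF50 ~", f50, "cycles/mm")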

  18. Voxel classification methodology for rapid Monte Carlo simulation of light propagation in complex media

    Institute of Scientific and Technical Information of China (English)

    Nunu Ren; Heng Zhao; Shouping Zhu; Xiaochao Qu; Hongliang Liu; Zhenhua Hu; Jimin Liang; Jie Tian

    2011-01-01

    The Monte Carlo (MC) method is a statistical method for simulating photon propagation in media in the optical molecular imaging field. However, obtaining an accurate result with the method is quite time-consuming, especially when the boundary of the media is complex. A voxel classification method is proposed to reduce the computation cost. All the voxels generated by dividing the media are classified into three types (outside, boundary, and inside) according to the position of the voxel. The classification information is used to determine the relative position of a photon and the intersection between the photon path and the media boundary in the MC method. The influencing factors and the effectiveness of the proposed method are analyzed and validated by simulation experiments.
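
    A minimal Python sketch of the outside/boundary/inside labeling (a toy cubic medium; the paper's media are more complex) marks as boundary any inside voxel whose 6-neighborhood touches the outside:

        import numpy as np

        # Label voxels: 0 = outside, 1 = boundary, 2 = inside.
        def classify_voxels(mask):
            """mask: boolean array, True where the voxel lies inside the medium."""
            labels = np.zeros(mask.shape, dtype=np.uint8)   # 0 = outside
            labels[mask] = 2                                # 2 = inside (provisional)
            for axis in range(mask.ndim):
                for shift in (-1, 1):
                    neighbor_outside = ~np.roll(mask, shift, axis=axis)
                    labels[mask & neighbor_outside] = 1     # 1 = boundary
            return labels

        mask = np.zeros((32, 32, 32), dtype=bool)
        mask[8:24, 8:24, 8:24] = True                       # toy cubic medium
        labels = classify_voxels(mask)
        print([(labels == k).sum() for k in (0, 1, 2)])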

  19. Neutron penumbral imaging simulation and reconstruction for Inertial Confinement Fusion Experiments

    OpenAIRE

    Wang, Xian-You; Fang, Zhen-yun; Tang, Yun-Qing; Tang, Zhi-Cheng; Xiao, Hong; Xu, Ming

    2012-01-01

    The neutron penumbral imaging technique has been successfully used as a diagnostic method in inertial confinement fusion. To support the design of future imaging systems in China, we constructed a Monte Carlo model of the imaging system in Geant4. Using the point spread function from the simulation and a decoding algorithm (the Richardson-Lucy algorithm), we obtained the recovered source image.
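
    The decoding step can be sketched in Python with a generic Richardson-Lucy iteration (the names penumbral_image and simulated_psf in the usage comment are hypothetical placeholders):

        import numpy as np
        from scipy.signal import fftconvolve

        # Generic Richardson-Lucy deconvolution given a simulated point spread function.
        def richardson_lucy(image, psf, n_iter=50):
            psf = psf / psf.sum()
            psf_t = psf[::-1, ::-1]                      # flipped PSF
            est = np.full_like(image, image.mean())      # flat initial estimate
            for _ in range(n_iter):
                blurred = fftconvolve(est, psf, mode="same")
                ratio = image / np.maximum(blurred, 1e-12)
                est *= fftconvolve(ratio, psf_t, mode="same")
            return est

        # usage (hypothetical array names):
        # source = richardson_lucy(penumbral_image, simulated_psf)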

  20. Monte Carlo Studies of medium-size telescope designs for the Cherenkov Telescope Array

    CERN Document Server

    Wood, M; Dumm, J; Funk, S

    2015-01-01

    We present studies for optimizing the next generation of ground-based imaging atmospheric Cherenkov telescopes (IACTs). Results focus on mid-sized telescopes (MSTs) for CTA, detecting very high energy gamma rays in the energy range from a few hundred GeV to a few tens of TeV. We describe a novel, flexible detector Monte Carlo package, FAST (FAst Simulation for imaging air cherenkov Telescopes), that we use to simulate different array and telescope designs. The simulation is somewhat simplified to allow for efficient exploration over a large telescope design parameter space. We investigate a wide range of telescope performance parameters including optical resolution, camera pixel size, and light collection area. In order to ensure a comparison of the arrays at their maximum sensitivity, we analyze the simulations with the most sensitive techniques used in the field, such as maximum likelihood template reconstruction and boosted decision trees for background rejection. Choosing telescope design parameters repre...

  1. Monte Carlo Studies of the GCT Telescope for the Cherenkov Telescope Array

    CERN Document Server

    Armstrong, Thomas; Rulten, Cameron; Stamatescu, Victor; Zech, Andreas

    2015-01-01

    The GCT is an innovative dual-mirror solution proposed for the small-size telescopes for CTA, capable of imaging primary cosmic gamma-rays from below a TeV to hundreds of TeV. The reduced plate scale resulting from the secondary optics allows the use of compact photosensors, including multi-anode photomultiplier tubes or silicon photomultipliers. We show preliminary results of Monte Carlo simulations using the packages CORSIKA and Sim_telarray, comparing the relative performance of each photosensor type. We also investigate the effect of the secondary optics in terms of optical performance, image resolution and camera response. With the ongoing commissioning of the prototype structure and camera, we present the preliminary expected performance of GCT.

  2. Prediction of beam hardening artefacts in computed tomography using Monte Carlo simulations

    Science.gov (United States)

    Thomsen, M.; Knudsen, E. B.; Willendrup, P. K.; Bech, M.; Willner, M.; Pfeiffer, F.; Poulsen, M.; Lefmann, K.; Feidenhans'l, R.

    2015-01-01

    We show how radiological images of both single- and multi-material samples can be simulated using the Monte Carlo simulation tool McXtrace, and how these images can be used to make a three-dimensional reconstruction. Good numerical agreement between the X-ray attenuation coefficients in experimental and simulated data can be obtained, which allows us to use simulated projections in the linearisation procedure for single-material samples and in that way reduce beam hardening artefacts. The simulations can be used to predict beam hardening artefacts in multi-material samples with complex geometry, as illustrated with an example. Linearisation requires knowledge of the X-ray transmission at varying sample thickness, but in some cases homogeneous calibration phantoms are hard to manufacture, which affects the accuracy of the calibration. Using simulated data overcomes the manufacturing problems and in that way improves the calibration.

  3. Monte Carlo-based Noise Compensation in Coil Intensity Corrected Endorectal MRI

    CERN Document Server

    Lui, Dorothy; Haider, Masoom; Wong, Alexander

    2015-01-01

    Background: Prostate cancer is one of the most common forms of cancer found in males, making early diagnosis important. Magnetic resonance imaging (MRI) has been useful in visualizing and localizing tumor candidates, and with the use of endorectal coils (ERC), the signal-to-noise ratio (SNR) can be improved. The coils introduce intensity inhomogeneities, and the surface coil intensity correction built into MRI scanners is used to reduce these inhomogeneities. However, the correction typically performed at the MRI scanner level leads to noise amplification and noise level variations. Methods: In this study, we introduce a new Monte Carlo-based noise compensation approach for coil intensity corrected endorectal MRI which allows for effective noise compensation and preservation of details within the prostate. The approach accounts for the ERC SNR profile via a spatially-adaptive noise model for correcting non-stationary noise variations. Such a method is useful particularly for improving the image quality of coil i...

  4. Dosimetry for synchrotron stereotactic radiotherapy: Monte Carlo simulations and radiosensitive gels; Dosimetrie pour la radiotherapie stereotaxique en rayonnement synchrotron: calculs Monte-Carlo et gels radiosensibles

    Energy Technology Data Exchange (ETDEWEB)

    Boudou, C

    2006-09-15

    High-grade gliomas are extremely aggressive brain tumours. Specific techniques have been proposed that combine the presence of high-atomic-number elements within the tumour with irradiation by a low-energy X-ray beam (below 100 keV) from a synchrotron source. With clinical trials in view, the use of a treatment planning system has to be foreseen, as well as tailored dosimetry protocols. The objectives of this thesis work were (1) the development of a dose calculation tool based on a Monte Carlo particle transport code and (2) the implementation of an experimental method for the three-dimensional verification of the delivered dose. The dosimetric tool is an interface between tomography images of the patient or sample and the MCNPX general-purpose code. In addition, dose distributions were measured with a radiosensitive polymer gel, giving acceptable results compared to the calculations.

  5. Time-gated optical imaging through turbid media using stimulated Raman scattering: Studies on image contrast

    Indian Academy of Sciences (India)

    K Divakar Rao; H S Patel; B Jain; P K Gupta

    2005-02-01

    In this paper, we report the development of an experimental set-up for time-gated optical imaging through turbid media using stimulated Raman scattering. Our studies on the contrast of time-gated images show that, for a given optical thickness, the image contrast is better for samples with a lower scattering coefficient and a larger physical thickness, and that the contrast improves with decreasing values of the anisotropy parameter of the scatterers. These results are consistent with time-resolved Monte Carlo simulations.

  6. Monte-Carlo Application for Nondestructive Nuclear Waste Analysis

    Science.gov (United States)

    Carasco, C.; Engels, R.; Frank, M.; Furletov, S.; Furletova, J.; Genreith, C.; Havenith, A.; Kemmerling, G.; Kettler, J.; Krings, T.; Ma, J.-L.; Mauerhofer, E.; Neike, D.; Payan, E.; Perot, B.; Rossbach, M.; Schitthelm, O.; Schumann, M.; Vasquez, R.

    2014-06-01

    … neutron flux distribution. The validation of the measurement simulations with Monte Carlo transport codes for the design, optimization and data analysis of further P&DGNAA facilities is performed in collaboration with LMN CEA Cadarache. The performance of prompt gamma neutron activation analysis (PGNAA) for the nondestructive determination of actinides in small samples is investigated. The quantitative determination of actinides relies on precise knowledge of partial neutron capture cross sections, which to date are not accurate enough for analytical purposes. The goal of the TANDEM (Trans-uranium Actinides' Nuclear Data - Evaluation and Measurement) Collaboration is the evaluation of these cross sections. Cross sections are measured using prompt gamma activation analysis facilities in Budapest and Munich. Geant4 is used to optimally design the detection system with Compton suppression. Furthermore, for the evaluation of the cross sections the results must be corrected for the self-attenuation of the prompt gammas within the sample. In the framework of a cooperation, RWTH Aachen University, Forschungszentrum Jülich and Siemens AG will study the feasibility of a compact Neutron Imaging System for Radioactive waste Analysis (NISRA). The system is based on a 14 MeV neutron source and an advanced detector system (a-Si flat panel) coupled to a dedicated converter/scintillator for fast neutrons. For shielding and radioprotection studies the codes MCNPX and Geant4 were used. The two codes were benchmarked in terms of processing time and of the accuracy of the computed neutron and gamma fluxes. The detector response was also simulated with Geant4 to optimize components of the system.

  7. GATE as a GEANT4-based Monte Carlo platform for the evaluation of proton pencil beam scanning treatment plans

    Science.gov (United States)

    Grevillot, L.; Bertrand, D.; Dessy, F.; Freud, N.; Sarrut, D.

    2012-07-01

    Active scanning delivery systems take full advantage of ion beams to best conform to the tumor and to spare surrounding healthy tissues; however, it is also a challenging technique for quality assurance. In this perspective, we upgraded the GATE/GEANT4 Monte Carlo platform in order to recalculate the treatment planning system (TPS) dose distributions for active scanning systems. A method that allows evaluating the TPS dose distributions with the GATE Monte Carlo platform has been developed and applied to the XiO TPS (Elekta), for the IBA proton pencil beam scanning (PBS) system. First, we evaluated the specificities of each dose engine. A dose-conversion scheme that allows one to convert dose to medium into dose to water was implemented within GATE. Specific test cases in homogeneous and heterogeneous configurations allowed for the estimation of the differences between the beam models implemented in XiO and GATE. Finally, dose distributions of a prostate treatment plan were compared. In homogeneous media, a satisfactory agreement was generally obtained between XiO and GATE. The maximum stopping power difference of 3% occurred in a human tissue of 0.9 g cm⁻³ density and led to a significant range shift. Comparisons in heterogeneous configurations pointed out the limits of the TPS dose calculation accuracy and the superiority of Monte Carlo simulations. The necessity of computing dose to water in our Monte Carlo code for comparisons with TPSs is also presented. Finally, the new capabilities of the platform are applied to a prostate treatment plan and dose differences between both dose engines are analyzed in detail. This work presents a generic method to compare TPS dose distributions with the GATE Monte Carlo platform. It is noteworthy that GATE is also a convenient tool for imaging applications, therefore opening new research possibilities for the PBS modality.
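
    The dose-conversion scheme mentioned above (dose to medium into dose to water) amounts, in its simplest form, to a voxel-wise multiplication by a stopping-power ratio. A minimal sketch, assuming made-up ratio values; real values depend on beam energy and tissue composition (e.g. ICRU tables):

    ```python
    import numpy as np

    # Hypothetical water-to-medium mass stopping-power ratios for protons;
    # real values depend on proton energy and tissue composition.
    SPR_WATER_TO_MEDIUM = {"soft_tissue": 1.01, "bone": 1.11, "lung": 1.00}

    def dose_to_water(dose_to_medium, medium_map):
        """Convert a dose-to-medium grid to dose-to-water voxel by voxel."""
        ratios = np.vectorize(SPR_WATER_TO_MEDIUM.get)(medium_map)
        return dose_to_medium * ratios

    dose_m = np.array([[1.0, 2.0], [1.5, 0.5]])          # Gy, dose to medium
    media = np.array([["soft_tissue", "bone"], ["lung", "bone"]])
    print(dose_to_water(dose_m, media))
    ```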

  8. Monte Carlo treatment planning for photon and electron beams

    Science.gov (United States)

    Reynaert, N.; van der Marck, S. C.; Schaart, D. R.; Van der Zee, W.; Van Vliet-Vroegindeweij, C.; Tomsej, M.; Jansen, J.; Heijmen, B.; Coghe, M.; De Wagter, C.

    2007-04-01

    During the last few decades, accuracy in photon and electron radiotherapy has increased substantially. This is partly due to enhanced linear accelerator technology, providing more flexibility in field definition (e.g. the usage of computer-controlled dynamic multileaf collimators), which led to intensity modulated radiotherapy (IMRT). Important improvements have also been made in the treatment planning process, more specifically in the dose calculations. Originally, dose calculations relied heavily on analytic, semi-analytic and empirical algorithms. The more accurate convolution/superposition codes use pre-calculated Monte Carlo dose "kernels" partly accounting for tissue density heterogeneities. It is generally recognized that the Monte Carlo method is able to increase accuracy even further. Since the second half of the 1990s, several Monte Carlo dose engines for radiotherapy treatment planning have been introduced. To enable the use of a Monte Carlo treatment planning (MCTP) dose engine in clinical circumstances, approximations have been introduced to limit the calculation time. In this paper, the literature on MCTP is reviewed, focussing on patient modeling, approximations in linear accelerator modeling and variance reduction techniques. An overview of published comparisons between MC dose engines and conventional dose calculations is provided for phantom studies and clinical examples, evaluating the added value of MCTP in the clinic. An overview of existing Monte Carlo dose engines and commercial MCTP systems is presented and some specific issues concerning the commissioning of a MCTP system are discussed.

  9. An unbiased Hessian representation for Monte Carlo PDFs

    Energy Technology Data Exchange (ETDEWEB)

    Carrazza, Stefano; Forte, Stefano [Universita di Milano, TIF Lab, Dipartimento di Fisica, Milan (Italy); INFN, Sezione di Milano (Italy); Kassabov, Zahari [Universita di Milano, TIF Lab, Dipartimento di Fisica, Milan (Italy); Universita di Torino, Dipartimento di Fisica, Turin (Italy); INFN, Sezione di Torino (Italy); Latorre, Jose Ignacio [Universitat de Barcelona, Departament d'Estructura i Constituents de la Materia, Barcelona (Spain); Rojo, Juan [University of Oxford, Rudolf Peierls Centre for Theoretical Physics, Oxford (United Kingdom)]

    2015-08-15

    We develop a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis. We validate the methodology by first showing that it faithfully reproduces a native Monte Carlo PDF set (NNPDF3.0), and then that, when applied to a Hessian PDF set (MMHT14) transformed into a Monte Carlo set, it gives back the starting PDFs with minimal information loss. We then show that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (MC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea and a Hessian version of the recently suggested PDF compression algorithm (CMC-PDFs). The mc2hessian conversion code is made publicly available, together with (through LHAPDF6) Hessian representations of the NNPDF3.0 set and the MC-H PDF set. (orig.)
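
    The core of the methodology — using a subset of MC replicas as a linear basis and diagonalising the resulting coefficient covariance — can be sketched on toy data as follows. This is not the mc2hessian code itself: the "replicas" below are random curves, and the basis is simply the first 20 replicas rather than a genetic-algorithm selection:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_rep, n_x = 100, 50                     # toy: 100 MC replicas on a 50-point grid
    replicas = rng.normal(size=(n_rep, n_x)).cumsum(axis=1)   # stand-in for PDF replicas
    central = replicas.mean(axis=0)

    # Use a subset of replicas as an unbiased linear basis around the central value.
    basis = replicas[:20] - central                           # shape (n_basis, n_x)

    # Express every replica in this basis by least squares.
    coeff, *_ = np.linalg.lstsq(basis.T, (replicas - central).T, rcond=None)

    # Diagonalising the coefficient covariance gives orthogonal Hessian directions.
    cov = np.cov(coeff)
    eigval, eigvec = np.linalg.eigh(cov)
    eigval = np.clip(eigval, 0.0, None)       # guard against tiny negative rounding
    hessian_members = central + (basis.T @ (eigvec * np.sqrt(eigval))).T
    print(hessian_members.shape)              # one eigenvector member per basis vector
    ```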

  10. Frequency domain optical tomography using a Monte Carlo perturbation method

    Science.gov (United States)

    Yamamoto, Toshihiro; Sakamoto, Hiroki

    2016-04-01

    A frequency domain Monte Carlo method is applied to near-infrared optical tomography, where an intensity-modulated light source with a given modulation frequency is used to reconstruct optical properties. The frequency domain reconstruction technique allows for better separation between the scattering and absorption properties of inclusions, even for ill-posed inverse problems in which cross-talk between the scattering and absorption reconstructions would otherwise occur. The frequency domain Monte Carlo calculation of light transport in an absorbing and scattering medium has thus far been analyzed mostly for the reconstruction of optical properties in simple layered tissues. This study applies a Monte Carlo calculation algorithm, which can handle complex-valued particle weights for solving a frequency domain transport equation, to optical tomography in two-dimensional heterogeneous tissues. The Jacobian matrix needed to reconstruct the optical properties is obtained by a first-order "differential operator" technique, which involves less variance than the conventional "correlated sampling" technique. The numerical examples in this paper indicate that the newly proposed Monte Carlo method provides reconstructed results for the scattering and absorption coefficients that compare favorably with the results obtained from conventional deterministic or Monte Carlo methods.
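
    The complex-valued particle weights mentioned above arise from replacing the absorption coefficient mu_a with mu_a + i*omega/c, which turns the time-dependent transport problem into a stationary one for the weight. A toy sketch for an infinite homogeneous medium, with all optical parameters invented for illustration:

    ```python
    import numpy as np

    mu_a, mu_s = 0.01, 10.0          # absorption / scattering coefficients (1/mm)
    c = 0.214                        # speed of light in tissue (mm/ps)
    omega = 2 * np.pi * 0.1e-3       # 100 MHz modulation, in rad/ps

    rng = np.random.default_rng(1)

    def propagate(n_photons=2000, n_steps=100):
        """Return mean complex weight; its phase encodes the modulation delay."""
        total = 0.0 + 0.0j
        for _ in range(n_photons):
            weight = 1.0 + 0.0j
            for _ in range(n_steps):
                step = rng.exponential(1.0 / mu_s)   # free path between scatters
                # Complex absorption: the imaginary part i*omega/c carries the
                # frequency-domain information along each photon path.
                weight *= np.exp(-(mu_a + 1j * omega / c) * step)
            total += weight
        return total / n_photons

    w = propagate()
    print(abs(w), np.angle(w))       # amplitude attenuation and phase shift
    ```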

  11. Parallel MCNP Monte Carlo transport calculations with MPI

    International Nuclear Information System (INIS)

    The steady increase in computational performance has made Monte Carlo calculations for large/complex systems possible. However, in order to make these calculations practical, order-of-magnitude increases in performance are necessary. The Monte Carlo method is inherently parallel (particles are simulated independently) and thus has the potential for near-linear speedup with respect to the number of processors. Further, the ever-increasing accessibility of parallel computers, such as workstation clusters, facilitates the practical use of parallel Monte Carlo. Recognizing the nature of the Monte Carlo method and the trends in available computing, the code developers at Los Alamos National Laboratory implemented a message-passing version of the general-purpose Monte Carlo radiation transport code MCNP (version 4A). The PVM package was chosen by the MCNP code developers because it supports a variety of communication networks, several UNIX platforms, and heterogeneous computer systems. This PVM version of MCNP has been shown to produce speedups that approach the number of processors and thus is a very useful tool for transport analysis. Due to software incompatibilities on the local IBM SP2, PVM has not been available, and thus it is not possible to take advantage of this useful tool. Hence, it became necessary to implement an alternative message-passing library package into MCNP. The message-passing interface (MPI) was selected because it is supported on the local system, takes advantage of the high-speed communication switches in the SP2, and is considered to be the emerging standard.
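
    The "inherently parallel" structure described above — independent particle histories with a single reduction at the end — is easy to illustrate with MPI. The sketch below uses mpi4py and a trivial pi-estimation integrand rather than MCNP itself, and its per-rank seeding is a simplification of the reproducible random-number trees a production code would use:

    ```python
    # Run with: mpiexec -n 4 python mc_pi_mpi.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_local = 1_000_000                       # histories per processor
    rng = np.random.default_rng(seed=rank)    # independent stream per rank
    x, y = rng.random(n_local), rng.random(n_local)
    hits_local = np.count_nonzero(x * x + y * y < 1.0)

    # Near-linear speedup: the only communication is this final reduction.
    hits_total = comm.reduce(hits_local, op=MPI.SUM, root=0)
    if rank == 0:
        print("pi ~", 4.0 * hits_total / (n_local * size))
    ```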

  12. Monte Carlo Simulation of Damage Depth in Focused Ion Beam Milling Si3N4 Thin Film

    Institute of Scientific and Technical Information of China (English)

    TAN Yong-wen; XIE Xue-bing; Jack Zhou; XU Tian-wei; YANG Wei-guo; YANG Hai

    2007-01-01

    The damage properties of Focused Ion Beam (FIB) milling of Si3N4 thin films are investigated by detailed analysis of nanohole images and by Monte Carlo simulation. The damage depth in the Si3N4 thin film for two different ion species (gallium and arsenic) under various parameters (ion energy, angle of incidence) is investigated by the Monte Carlo method. The simulations show that the damage depth increases with increasing ion energy and depends on the angle of incidence of the ion; the damage-depth curves for Ga and As ions at 30 keV nearly coincide, while the damage depth for Ga ions at 90 keV is greater than that for As ions at the same energy.

  13. Monte Carlo investigation of the dosimetric effect of the Autoscan ultrasound probe for guidance in radiotherapy

    Science.gov (United States)

    Martyn, Michael; O'Shea, Tuathan; Harris, Emma; Bamber, Jeffrey; Gilroy, Stephen; Foley, Mark J.

    2016-04-01

    The aim of this study was to quantify the dosimetric effect of the Autoscan™ ultrasound probe, a 3D transperineal probe used for real-time tissue tracking during the delivery of radiotherapy. CT images of an anthropomorphic phantom, with and without the probe placed in contact with its surface, were obtained (0.75 mm slice width, 140 kVp). The CT datasets were used for relative dose calculation in Monte Carlo simulations of a 7-field plan delivered to the phantom. The Monte Carlo software packages BEAMnrc and DOSXYZnrc were used for this purpose. A number of simulations were performed, varying the distance of the radiation field edge from the probe face (0 mm to 5 mm). Perineal surface doses as a function of distance from the radiation field edge, with and without the probe in place, were compared. The presence of the probe was found to increase the perineal surface dose relative to the maximum dose. The maximum increase in surface dose was 18.15%, at a probe-face-to-field-edge distance of 0 mm. However, the increase in surface dose falls off rapidly as this distance increases, with doses agreeing within Monte Carlo simulation uncertainty at distances ≥5 mm. Using data from three patient volunteers, a typical probe-face-to-field-edge distance was calculated to be ≈20 mm. Our results therefore indicate that the presence of the probe is unlikely to adversely affect a typical patient treatment, since its dosimetric effect is minimal at such distances.

  14. MMCTP: a radiotherapy research environment for Monte Carlo and patient-specific treatment planning

    International Nuclear Information System (INIS)

    Radiotherapy research lacks a flexible computational research environment for Monte Carlo (MC) and patient-specific treatment planning. The purpose of this study was to develop a flexible software package on low-cost hardware with the aim of integrating new patient-specific treatment planning with MC dose calculations suitable for large-scale prospective and retrospective treatment planning studies. We designed the software package 'McGill Monte Carlo treatment planning' (MMCTP) for the research development of MC and patient-specific treatment planning. The MMCTP design consists of a graphical user interface (GUI), which runs on a simple workstation connected through the standard secure-shell protocol to a cluster for lengthy MC calculations. Treatment planning information (e.g., images, structures, beam geometry properties and dose distributions) is converted into a convenient MMCTP local file storage format, designated the McGill RT format. MMCTP features include (a) DICOM-RT, RTOG and CADPlan CART format imports; (b) 2D and 3D visualization views for images, structure contours, and dose distributions; (c) contouring tools; (d) DVH analysis and dose matrix comparison tools; (e) external beam editing; (f) MC transport calculation from beam source to patient geometry for photon and electron beams. The MC input files, which are prepared from the beam geometry properties and patient information (e.g., images and structure contours), are uploaded and run on a cluster using shell commands controlled from the MMCTP GUI. The visualization, dose matrix operation and DVH tools offer extensive options for plan analysis and for comparison between MC plans and plans imported from commercial treatment planning systems. The MMCTP GUI provides a flexible research platform for the development of patient-specific MC treatment planning for photon and electron external beam radiation therapy. The impact of this tool lies in the fact that it allows for systematic, platform-independent, large-scale treatment planning studies.

  15. Analysis of a proposed Compton backscatter imaging technique

    Science.gov (United States)

    Hall, James M.; Jacoby, Barry A.

    1994-03-01

    One-sided imaging techniques are currently being used in nondestructive evaluation of surfaces and shallow subsurface structures. In this work we present both analytical calculations and detailed Monte Carlo simulations aimed at assessing the capability of a proposed Compton backscattering imaging technique designed to detect and characterize voids located several centimeters below the surface of a solid.

  16. Optimum and efficient sampling for variational quantum Monte Carlo

    CERN Document Server

    Trail, John Robert; 10.1063/1.3488651

    2010-01-01

    Quantum mechanics for many-body systems may be reduced to the evaluation of integrals in 3N dimensions using Monte Carlo methods, providing the quantum Monte Carlo ab initio methods. Here we limit ourselves to expectation values for trial wavefunctions, that is, to variational quantum Monte Carlo. Almost all previous implementations employ samples distributed as the physical probability density of the trial wavefunction, and assume the Central Limit Theorem to be valid. In this paper we provide an analysis of random error in estimation and optimisation that leads naturally to new sampling strategies with improved computational and statistical properties. A rigorous lower limit to the random error is derived, and an efficient sampling strategy is presented that significantly increases computational efficiency. In addition, the infinite-variance, heavy-tailed random errors of optimum parameters in conventional methods are replaced with a Normal random error, strengthening the theoretical basis of optimisation. The method is ...

  17. Efficiency of Monte Carlo sampling in chaotic systems.

    Science.gov (United States)

    Leitão, Jorge C; Lopes, J M Viana Parente; Altmann, Eduardo G

    2014-11-01

    In this paper we investigate how the complexity of chaotic phase spaces affects the efficiency of importance-sampling Monte Carlo simulations. We focus on flat-histogram simulations of the distribution of finite-time Lyapunov exponents in a simple chaotic system and obtain analytically that the computational effort (i) scales polynomially with the finite time, a tremendous improvement over the exponential scaling obtained in uniform-sampling simulations, and (ii) that this polynomial scaling is suboptimal, a phenomenon known as critical slowing down. We show that critical slowing down appears because of the limited possibilities to issue a local proposal in the Monte Carlo procedure when it is applied to chaotic systems. These results show how generic properties of chaotic systems limit the efficiency of Monte Carlo simulations.

  18. Application of biasing techniques to the contributon Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Dubi, A.; Gerstl, S.A.W.

    1980-01-01

    Recently, a new Monte Carlo method called the contributon Monte Carlo method was developed. The method is based on the theory of contributons, and uses a new recipe for estimating target responses by a volume integral over the contributon current. The analog features of the new method were discussed in previous publications. The application of some biasing methods to the new contributon scheme is examined here. A theoretical model is developed that enables an analytic prediction of the benefit to be expected when these biasing schemes are applied to both the contributon method and regular Monte Carlo. This model is verified by a variety of numerical experiments and is shown to yield satisfactory results, especially for deep-penetration problems. Other considerations regarding the efficient use of the new method are also discussed, and remarks are made on the application of other biasing methods. 14 figures, 1 table.

  19. Monte Carlo Simulation in Statistical Physics An Introduction

    CERN Document Server

    Binder, Kurt

    2010-01-01

    Monte Carlo Simulation in Statistical Physics deals with the computer simulation of many-body systems in condensed-matter physics and related fields of physics and chemistry, and beyond (traffic flows, stock market fluctuations, etc.). Using random numbers generated by a computer, probability distributions are calculated, allowing the estimation of the thermodynamic properties of various systems. This book describes the theoretical background to several variants of these Monte Carlo methods and gives a systematic presentation from which newcomers can learn to perform such simulations and to analyze their results. The fifth edition covers classical as well as quantum Monte Carlo methods. Furthermore, a new chapter on the sampling of free-energy landscapes has been added. To help students in their work a special web server has been installed to host programs and discussion groups (http://wwwcp.tphys.uni-heidelberg.de). Prof. Binder was awarded the Berni J. Alder CECAM Award for Computational Physics 2001 as well ...

  20. Sequential Monte Carlo on large binary sampling spaces

    CERN Document Server

    Schäfer, Christian

    2011-01-01

    A Monte Carlo algorithm is said to be adaptive if it automatically calibrates its current proposal distribution using past simulations. The choice of the parametric family that defines the set of proposal distributions is critical for good performance. In this paper, we present such a parametric family for adaptive sampling on high-dimensional binary spaces. A practical motivation for this problem is variable selection in a linear regression context. We want to sample from a Bayesian posterior distribution on the model space using an appropriate version of Sequential Monte Carlo. Raw versions of Sequential Monte Carlo are easily implemented using binary vectors with independent components. For high-dimensional problems, however, these simple proposals do not yield satisfactory results. The key to an efficient adaptive algorithm is a binary parametric family that takes correlations into account, analogously to the multivariate normal distribution on continuous spaces. We provide a review of models for binar...

  1. Calibration and Monte Carlo modelling of neutron long counters

    CERN Document Server

    Tagziria, H

    2000-01-01

    The Monte Carlo technique has become a very powerful tool in radiation transport as full advantage is taken of enhanced cross-section data, more powerful computers and statistical techniques, together with better characterisation of neutron and photon source spectra. At the National Physical Laboratory, calculations using the Monte Carlo radiation transport code MCNP-4B have been combined with accurate measurements to characterise two long counters routinely used to standardise monoenergetic neutron fields. New and more accurate response function curves have been produced for both long counters. A novel approach using Monte Carlo methods has been developed, validated and used to model the response function of the counters and determine more accurately their effective centres, which have always been difficult to establish experimentally. Calculations and measurements agree well, especially for the De Pangher long counter for which details of the design and constructional material are well known. The sensitivit...

  2. VARIATIONAL MONTE-CARLO APPROACH FOR ARTICULATED OBJECT TRACKING

    Directory of Open Access Journals (Sweden)

    Kartik Dwivedi

    2013-12-01

    In this paper, we describe a novel variational Monte Carlo approach for modeling and tracking body parts of articulated objects. An articulated object (human target) is represented as a dynamic Markov network of its constituent parts. The proposed approach combines local information from individual body parts with spatial constraints imposed by neighboring parts. The movement of the parts of the articulated body is modeled using local displacement information from the Markov network together with global information from neighboring parts. We explore the effect of certain model parameters (including the number of parts tracked, the number of Monte Carlo cycles, etc.) on system accuracy and show that our variational Monte Carlo approach achieves better efficiency and effectiveness than other methods on a number of real-time video datasets containing single targets.

  3. Monte Carlo Methods for Tempo Tracking and Rhythm Quantization

    CERN Document Server

    Cemgil, A T; 10.1613/jair.1121

    2011-01-01

    We present a probabilistic generative model for timing deviations in expressive music performance. The structure of the proposed model is equivalent to a switching state space model. The switch variables correspond to discrete note locations, as in a musical score, and the continuous hidden variables denote the tempo. We formulate two well-known music recognition problems, namely tempo tracking and automatic transcription (rhythm quantization), as filtering and maximum a posteriori (MAP) state estimation tasks. Exact computation of posterior features such as the MAP state is intractable in this model class, so we introduce Monte Carlo methods for integration and optimization. We compare Markov Chain Monte Carlo (MCMC) methods (such as Gibbs sampling, simulated annealing and iterative improvement) and sequential Monte Carlo methods (particle filters). Our simulation results suggest better results with sequential methods. The methods can be applied in both online and batch scenarios such as tempo tracking and transcr...

  4. Overview of the MCU Monte Carlo software package

    International Nuclear Information System (INIS)

    Highlights: • MCU is the Monte Carlo code for particle transport in 3D systems with depletion. • Criticality and fixed source problems are solved using pure point-wise approximation. • MCU is parallelized with MPI in three different modes. • MCU has coolant, fuel and xenon feedback for VVER calculations. • MCU is verified for reactors with thermal, intermediate and fast neutron spectrum. - Abstract: MCU (Monte Carlo Universal) is a project on development and practical use of a universal computer code for simulation of particle transport (neutrons, photons, electrons, positrons) in three-dimensional systems by means of the Monte Carlo method. This paper provides the information on the current state of the project. The developed libraries of constants are briefly described, and the potentialities of the MCU-5 package modules and the executable codes compiled from them are characterized. Examples of important problems of reactor physics solved with the code are presented

  5. Estimation of population variance in contributon Monte Carlo

    International Nuclear Information System (INIS)

    Based on the theory of contributons, a new Monte Carlo method known as the contributon Monte Carlo method has recently been developed. The method has found applications in several practical shielding problems. The authors analyze theoretically the variance and efficiency of the new method by taking moments around the score. In order to compare the contributon game with a game of simple geometrical splitting, and also to find the optimal placement of the contributon volume, the moment equations were solved numerically for a one-dimensional, one-group problem using a 10-mfp-thick homogeneous slab. It is found that the optimal placement of the contributon volume is adjacent to the detector; yet even at its optimum, contributon Monte Carlo is less efficient than geometrical splitting.

  6. Vectorizing and macrotasking Monte Carlo neutral particle algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Heifetz, D.B.

    1987-04-01

    Monte Carlo algorithms for computing neutral particle transport in plasmas have been vectorized and macrotasked. The techniques used are directly applicable to Monte Carlo calculations of neutron and photon transport, and to Monte Carlo integration schemes in general. A highly vectorized code was achieved by calculating test flight trajectories in loops over arrays of flight data, isolating the conditional branches in as few loops as possible. A number of solutions are discussed to the problem of gaps appearing in the arrays due to completed flights, which impede vectorization. A simple and effective implementation of macrotasking is achieved by dividing the calculation of the test flight profile among several processors. A tree of random numbers is used to ensure reproducible results. The additional memory required for each task may preclude using a larger number of tasks. In future machines, macrotasking may be taken to its limit, with each test flight, and each split test flight, being a separate task.

  7. Monte Carlo tests of the ELIPGRID-PC algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, J.R.

    1995-04-01

    The standard tool for calculating the probability of detecting pockets of contamination called hot spots has been the ELIPGRID computer code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM® PC. However, no known independent validation of the ELIPGRID algorithm exists. This document describes a Monte Carlo simulation-based validation of a modified version of the ELIPGRID-PC code. The modified ELIPGRID-PC code is shown to match Monte Carlo-calculated hot-spot detection probabilities to within ±0.5% for 319 out of 320 test cases. The one exception, a very thin elliptical hot spot located within a rectangular sampling grid, differed from the Monte Carlo-calculated probability by about 1%. These results provide confidence in the ability of the modified ELIPGRID-PC code to accurately predict hot-spot detection probabilities within an acceptable range of error.
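
    The kind of Monte Carlo check performed in this validation can be sketched directly: place an elliptical hot spot at a random position and orientation relative to a square sampling grid and count how often a grid node falls inside it. The parameters below (semi-axes, grid spacing, trial count) are illustrative, not the report's actual test cases:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def detection_probability(semi_major, shape, grid, n_trials=100_000):
        """Monte Carlo probability that a square sampling grid hits an elliptical
        hot spot (semi-minor = shape * semi_major) at random position/orientation."""
        semi_minor = shape * semi_major
        hits = 0
        for _ in range(n_trials):
            cx, cy = rng.uniform(0, grid, size=2)     # center within one grid cell
            theta = rng.uniform(0, np.pi)             # random orientation
            # Checking the four nearest grid nodes suffices when the semi-axes
            # are shorter than the grid spacing.
            nodes = np.array([[0, 0], [grid, 0], [0, grid], [grid, grid]], float)
            dx, dy = nodes[:, 0] - cx, nodes[:, 1] - cy
            u = dx * np.cos(theta) + dy * np.sin(theta)
            v = -dx * np.sin(theta) + dy * np.cos(theta)
            if np.any((u / semi_major) ** 2 + (v / semi_minor) ** 2 <= 1.0):
                hits += 1
        return hits / n_trials

    print(detection_probability(semi_major=0.6, shape=0.5, grid=1.0))
    ```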

  8. Strategies for improving the efficiency of quantum Monte Carlo calculations

    CERN Document Server

    Lee, R M; Nemec, N; Rios, P Lopez; Drummond, N D

    2010-01-01

    We describe a number of strategies for optimizing the efficiency of quantum Monte Carlo (QMC) calculations. We investigate the dependence of the efficiency of the variational Monte Carlo method on the sampling algorithm. Within a unified framework, we compare several commonly used variants of diffusion Monte Carlo (DMC). We then investigate the behavior of DMC calculations on parallel computers and the details of parallel implementations, before proposing a technique to optimize the efficiency of the extrapolation of DMC results to zero time step, finding that a relative time step ratio of 1:4 is optimal. Finally, we discuss the removal of serial correlation from data sets by reblocking, setting out criteria for the choice of block length and quantifying the effects of the uncertainty in the estimated correlation length and the presence of divergences in the local energy on estimated error bars on QMC energies.

  9. Construction of Monte Carlo operators in collisional transport theory

    International Nuclear Information System (INIS)

    A Monte Carlo approach for investigating the dynamics of quiescent collisional magnetoplasmas is presented, based on the discretization of the gyrokinetic equation. The theory applies to a strongly rotating multispecies plasma in a toroidally axisymmetric configuration. Expressions for the Monte Carlo collision operators are obtained for general v-space nonorthogonal coordinate systems, in terms of approximate solutions of the discretized gyrokinetic equation. Basic features of the Monte Carlo operators are that they fulfill all the required conservation laws, i.e., linear momentum and kinetic energy conservation, and that they also correctly take into account off-diagonal diffusion coefficients. The present operators are thus potentially useful for describing the dynamics of a multispecies toroidal magnetoplasma. In particular, strict ambipolarity of particle fluxes is ensured automatically in the limit of small departures of the unperturbed particle trajectories from some initial axisymmetric toroidal magnetic surfaces.

  10. Meaningful timescales from Monte Carlo simulations of molecular systems

    CERN Document Server

    Costa, Liborio I

    2016-01-01

    A new Markov Chain Monte Carlo method for simulating the dynamics of molecular systems with atomistic detail is introduced. In contrast to traditional Kinetic Monte Carlo approaches, where the state of the system is associated with minima in the energy landscape, in the proposed method, the state of the system is associated with the set of paths traveled by the atoms and the transition probabilities for an atom to be displaced are proportional to the corresponding velocities. In this way, the number of possible state-to-state transitions is reduced to a discrete set, and a direct link between the Monte Carlo time step and true physical time is naturally established. The resulting rejection-free algorithm is validated against event-driven molecular dynamics: the equilibrium and non-equilibrium dynamics of hard disks converge to the exact results with decreasing displacement size.

  11. Properties of Reactive Oxygen Species by Quantum Monte Carlo

    CERN Document Server

    Zen, Andrea; Guidoni, Leonardo

    2014-01-01

    The electronic properties of the oxygen molecule, in its singlet and triplet states, and of many small oxygen-containing radicals and anions play important roles in different fields of chemistry, biology and atmospheric science. Nevertheless, the electronic structure of such species is a challenge for ab initio computational approaches because of the difficulty of correctly describing the static and dynamical correlation effects in the presence of one or more unpaired electrons. Only the highest-level quantum chemical approaches can yield reliable characterizations of their molecular properties, such as binding energies, equilibrium structures, molecular vibrations, charge distribution and polarizabilities. In this work we use the variational Monte Carlo (VMC) and the lattice-regularized Monte Carlo (LRDMC) methods to investigate the equilibrium geometries and molecular properties of oxygen and oxygen reactive species. Quantum Monte Carlo methods are used in combination with the Jastrow Antisymmetrized Geminal ...

  12. The Monte Carlo method in quantum field theory

    CERN Document Server

    Morningstar, C

    2007-01-01

    This series of six lectures is an introduction to using the Monte Carlo method to carry out nonperturbative studies in quantum field theories. Path integrals in quantum field theory are reviewed, and their evaluation by the Monte Carlo method with Markov-chain based importance sampling is presented. Properties of Markov chains are discussed in detail and several proofs are presented, culminating in the fundamental limit theorem for irreducible Markov chains. The example of a real scalar field theory is used to illustrate the Metropolis-Hastings method and to demonstrate the effectiveness of an action-preserving (microcanonical) local updating algorithm in reducing autocorrelations. The goal of these lectures is to provide the beginner with the basic skills needed to start carrying out Monte Carlo studies in quantum field theories, as well as to present the underlying theoretical foundations of the method.
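
    The Metropolis-Hastings updating of a real scalar field mentioned above condenses into a short sketch. The following local-update sweep for a 1D periodic lattice with a phi^4 action uses invented parameter values and is far simpler than the lecture examples, but the accept/reject logic is the standard one:

    ```python
    import numpy as np

    # Metropolis updates for a 1D lattice real scalar field with action
    #   S = sum_x [ 0.5*(phi[x+1]-phi[x])^2 + 0.5*m2*phi[x]^2 + lam*phi[x]^4 ]
    # on a periodic lattice.  Parameters are illustrative only.
    rng = np.random.default_rng(7)
    N, m2, lam, delta = 64, 0.25, 0.1, 0.5
    phi = np.zeros(N)

    def local_action(phi, x):
        """Part of the action that depends on phi[x]: two links plus potential."""
        left, right = phi[(x - 1) % N], phi[(x + 1) % N]
        return (0.5 * ((right - phi[x]) ** 2 + (phi[x] - left) ** 2)
                + 0.5 * m2 * phi[x] ** 2 + lam * phi[x] ** 4)

    accepted = 0
    for sweep in range(2000):
        for x in range(N):
            old, s_old = phi[x], local_action(phi, x)
            phi[x] = old + rng.uniform(-delta, delta)   # propose a local change
            if rng.random() >= np.exp(-(local_action(phi, x) - s_old)):
                phi[x] = old                             # reject: restore old value
            else:
                accepted += 1

    print("acceptance:", accepted / (2000 * N), " <phi^2>:", np.mean(phi ** 2))
    ```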

  13. FitSKIRT: genetic algorithms to automatically fit dusty galaxies with a Monte Carlo radiative transfer code

    CERN Document Server

    De Geyter, Gert; Fritz, Jacopo; Camps, Peter

    2012-01-01

    We present FitSKIRT, a method to efficiently fit radiative transfer models to UV/optical images of dusty galaxies. These images have the advantage that they have better spatial resolution compared to FIR/submm data. FitSKIRT uses the GAlib genetic algorithm library to optimize the output of the SKIRT Monte Carlo radiative transfer code. Genetic algorithms prove to be a valuable tool in handling the multi-dimensional search space as well as the noise induced by the random nature of the Monte Carlo radiative transfer code. FitSKIRT is tested on artificial images of a simulated edge-on spiral galaxy, where we gradually increase the number of fitted parameters. We find that we can recover all model parameters, even if all 11 model parameters are left unconstrained. Finally, we apply the FitSKIRT code to a V-band image of the edge-on spiral galaxy NGC4013. This galaxy has been modeled previously by other authors using different combinations of radiative transfer codes and optimization methods. Given the different...

  14. MLE [Maximum Likelihood Estimator] reconstruction of a brain phantom using a Monte Carlo transition matrix and a statistical stopping rule

    International Nuclear Information System (INIS)

    In order to study properties of the Maximum Likelihood Estimator (MLE) algorithm for image reconstruction in Positron Emission Tomography (PET), the algorithm is applied to data obtained by the ECAT-III tomograph from a brain phantom. The procedure for subtracting accidental coincidences from the data stream generated by this physical phantom is such that the resultant data are not Poisson distributed. This makes the present investigation different from other investigations based on computer-simulated phantoms. It is shown that the MLE algorithm is robust enough to yield comparatively good images, especially when the phantom is in the periphery of the field of view, even though the underlying assumption of the algorithm is violated. Two transition matrices are utilized. The first uses geometric considerations only. The second is derived by a Monte Carlo simulation which takes into account Compton scattering in the detectors, positron range, etc. It is demonstrated that the images obtained from the Monte Carlo matrix are superior in some specific ways. A stopping rule derived earlier, allowing the user to stop the iterative process before the images begin to deteriorate, is tested. Since the rule is based on the Poisson assumption, it does not work well with the presently available data, although it is successful with computer-simulated Poisson data.
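
    For reference, the MLE iteration underlying this study is the familiar EM update in which the current image estimate is corrected by back-projected ratios of measured to predicted counts. A minimal sketch with a made-up 3x2 system matrix (a real transition matrix, whether geometric or Monte Carlo-derived, would have millions of elements):

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50):
        """Basic MLE-EM iteration for emission tomography:
        x <- x / (A^T 1) * A^T (y / (A x)), where A is the transition matrix."""
        x = np.ones(A.shape[1])
        sens = A.T @ np.ones(A.shape[0])          # sensitivity image
        for _ in range(n_iter):
            proj = A @ x
            proj[proj == 0] = 1e-12               # guard against division by zero
            x *= (A.T @ (y / proj)) / sens
        return x

    # Tiny toy system: 3 detector bins, 2 image pixels (made-up numbers).
    A = np.array([[0.8, 0.1], [0.1, 0.8], [0.1, 0.1]])
    x_true = np.array([10.0, 5.0])
    y = np.random.default_rng(3).poisson(A @ x_true)   # Poisson-distributed counts
    print(mlem(A, y))
    ```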

  15. Fully 3D tomographic reconstruction by Monte Carlo simulation of the system matrix in preclinical PET with iodine 124

    International Nuclear Information System (INIS)

    Immuno-PET imaging can be used to assess pharmacokinetics in radioimmunotherapy. When using iodine-124, quantitative PET imaging is limited by physics-based degrading factors within the detection system and the object, such as the long positron range in water and the complex spectrum of gamma photons. The objective of this thesis was to develop a fully 3D tomographic reconstruction method (S(MC)2PET) using Monte Carlo simulations for estimating the system matrix, in the context of preclinical imaging with iodine-124. The Monte Carlo simulation platform GATE was used for that purpose. System matrices of several levels of complexity were calculated, each including at least a model of the PET system response function. Physics processes in the object were either neglected or taken into account using a precise or a simplified object description. The impact of modelling refinement and of the statistical variance of the system-matrix elements was evaluated on the final reconstructed images. These studies showed that a high level of complexity did not always improve qualitative and quantitative results, owing to the high variance of the associated system matrices. (author)

  16. Monte Carlo simulation of source-excited in vivo x-ray fluorescence measurements of heavy metals

    Science.gov (United States)

    O'Meara, J. M.; Chettle, D. R.; McNeill, F. E.; Prestwich, W. V.; Svensson, C. E.

    1998-06-01

    This paper reports on the Monte Carlo simulation of in vivo x-ray fluorescence (XRF) measurements. Our model improves on previously reported simulations in that it relies on a theoretical basis for modelling Compton momentum broadening as well as detector efficiency. Furthermore, the model accurately reproduces experimentally detected spectra when comparisons are made in absolute counts; preceding models have generally only achieved agreement with spectra normalized to unit area. Our code is sufficiently flexible to be applied to the investigation of numerous source-excited in vivo XRF systems. Thus far the simulation has been applied to the modelling of two different systems. The first application was the investigation of various aspects of a new in vivo XRF system for the measurement of uranium in bone in a backscatter geometry. The Monte Carlo simulation was critical in assessing the potential of applying XRF to the measurement of uranium in bone. Currently the Monte Carlo code is being used to evaluate a potential means of simplifying an established in vivo XRF system for the measurement of lead in bone. The results from these simulations may demonstrate that calibration procedures can be significantly simplified and subject dose may be reduced. As well as providing an excellent tool for optimizing designs of new systems and improving existing techniques, this model can be used to investigate the dosimetry of various XRF systems. Our simulation allows a detailed understanding of the numerous processes involved when heavy metal concentrations are measured in vivo with XRF.

  17. A standard event class for Monte Carlo generators

    International Nuclear Information System (INIS)

    StdHepC++ is a CLHEP Monte Carlo event class library which provides a common interface to Monte Carlo event generators. This work is an extensive redesign of the StdHep Fortran interface to use the full power of object oriented design. A generated event maps naturally onto the Directed Acyclic Graph concept and we have used the HepMC classes to implement this. The full implementation allows the user to combine events to simulate beam pileup and access them transparently as though they were a single event

  18. Applications of quantum Monte Carlo methods in condensed systems

    CERN Document Server

    Kolorenc, Jindrich

    2010-01-01

    The quantum Monte Carlo methods represent a powerful and broadly applicable computational tool for finding very accurate solutions of the stationary Schroedinger equation for atoms, molecules, solids and a variety of model systems. The algorithms are intrinsically parallel and are able to take full advantage of the present-day high-performance computing systems. This review article concentrates on the fixed-node/fixed-phase diffusion Monte Carlo method with emphasis on its applications to electronic structure of solids and other extended many-particle systems.

  19. Monte Carlo methods and models in finance and insurance

    CERN Document Server

    Korn, Ralf

    2010-01-01

    Offering a unique balance between applications and calculations, this book incorporates the application background of finance and insurance with the theory and applications of Monte Carlo methods. It presents recent methods and algorithms, including the multilevel Monte Carlo method, the statistical Romberg method, and the Heath-Platen estimator, as well as recent financial and actuarial models, such as the Cheyette and dynamic mortality models. The book enables readers to find the right algorithm for a desired application and illustrates complicated methods and algorithms with simple applicat

  20. Culture, Constructivism, and Media: Designing a Module on Carlos Slim

    OpenAIRE

    Agudo, Roberto Rey

    2012-01-01

    Mexican tycoon Carlos Slim Helú has been a fixture on Forbes’s list of billionaires since 1991, and for the past three years, he has topped the magazine’s list of the world’s richest men. Although he is exceptionally well-known in his native Mexico, the majority of American college students have never heard of Carlos Slim. This article presents a curricular module built around this charismatic and controversial figure. The module requires students to navigate Internet-supported news media in ...

  1. Parton distribution functions in Monte Carlo factorisation scheme

    CERN Document Server

    Jadach, S; Sapeta, S; Siodmok, A; Skrzypek, M

    2016-01-01

    A next step in the development of the KrkNLO method for including complete NLO QCD corrections to hard processes in a LO parton-shower Monte Carlo (PSMC) is presented. It consists of a generalisation of the method, previously used for the Drell-Yan process, to Higgs-boson production. This extension is accompanied by a complete description of parton distribution functions (PDFs) in a dedicated Monte Carlo (MC) factorisation scheme, applicable to any process producing one or more colour-neutral particles in hadron-hadron collisions.

  2. Monte Carlo studies of domain growth in two dimensions

    International Nuclear Information System (INIS)

    Monte Carlo simulations have been carried out to study the effect of temperature on the kinetics of domain growth. The concept of "spatial entropy" is introduced, and it is shown that the "spatial entropy" of the domain can be used as a measure of the roughening of the domain. Most of the roughening is achieved during the initial time (t ≲ 10 Monte Carlo cycles), the rate of roughening being greater at higher temperatures. At later times the roughening of the domain proceeds at essentially the same rate for different temperatures. (author)

  3. Utilising Monte Carlo Simulation for the Valuation of Mining Concessions

    Directory of Open Access Journals (Sweden)

    Rosli Said

    2005-12-01

    Valuation involves the analysis of various input data to produce an estimated value. Since each input is itself often an estimate, there is an element of uncertainty in the input, which leads to uncertainty in the resultant output value. It is argued that a valuation must also convey information on this uncertainty, so as to be more meaningful and informative to the user. The Monte Carlo simulation technique can generate such information on uncertainty and is therefore potentially useful in valuation. This paper reports on an investigation into applying the Monte Carlo simulation technique in mineral valuation, more specifically in the valuation of a quarry concession.
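
    The way Monte Carlo simulation conveys valuation uncertainty can be shown in a few lines: sample each uncertain input from an assumed distribution, propagate it through the valuation formula, and report the spread of the output. All figures below are hypothetical and not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(2024)
    n_sims = 100_000

    # Hypothetical quarry-concession inputs, each an uncertain estimate.
    annual_output = rng.triangular(80_000, 100_000, 130_000, n_sims)  # tonnes/yr
    price = rng.normal(12.0, 1.5, n_sims)                             # $/tonne
    unit_cost = rng.normal(7.0, 1.0, n_sims)                          # $/tonne
    years, discount = 10, 0.08

    cash_flow = annual_output * (price - unit_cost)                   # $/yr
    annuity = (1 - (1 + discount) ** -years) / discount               # PV factor
    value = cash_flow * annuity

    print(f"mean value : ${value.mean():,.0f}")
    print(f"5%-95% band: ${np.percentile(value, 5):,.0f} - ${np.percentile(value, 95):,.0f}")
    ```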

  4. Monte Carlo simulations of phosphate polyhedron connectivity in glasses

    Energy Technology Data Exchange (ETDEWEB)

    ALAM,TODD M.

    2000-01-01

    Monte Carlo simulations of phosphate tetrahedron connectivity distributions in alkali and alkaline-earth phosphate glasses are reported. Utilizing a discrete bond model, the distribution of next-nearest-neighbor connectivities between phosphate polyhedra for random, alternating and clustering bonding scenarios was evaluated as a function of the relative bond energy difference. The simulated distributions are compared to the connectivities observed experimentally in solid-state two-dimensional exchange and double-quantum NMR experiments on phosphate glasses. These Monte Carlo simulations demonstrate that the polyhedron connectivity is best described by a random distribution in lithium phosphate and calcium phosphate glasses.

  5. Further experience in Bayesian analysis using Monte Carlo Integration

    OpenAIRE

    Dijk, Herman; Kloek, Teun

    1980-01-01

    An earlier paper [Kloek and Van Dijk (1978)] is extended in three ways. First, Monte Carlo integration is performed in a nine-dimensional parameter space of Klein's model I [Klein (1950)]. Second, Monte Carlo is used as a tool for the elicitation of a uniform prior on a finite region by making use of several types of prior information. Third, special attention is given to procedures for the construction of importance functions which make use of nonlinear optimization methods.

  6. Monte Carlo Form-Finding Method for Tensegrity Structures

    Science.gov (United States)

    Li, Yue; Feng, Xi-Qiao; Cao, Yan-Ping

    2010-05-01

    In this paper, we propose a Monte Carlo-based approach to solve tensegrity form-finding problems. It uses a stochastic procedure to find the deterministic equilibrium configuration of a tensegrity structure. The suggested Monte Carlo form-finding (MCFF) method is highly efficient because it does not involve complicated matrix operations and symmetry analysis and it works for arbitrary initial configurations. Both regular and non-regular tensegrity problems of large scale can be solved. Some representative examples are presented to demonstrate the efficiency and accuracy of this versatile method.
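
    The spirit of the stochastic search — random nodal moves that drive a structure toward equilibrium without complicated matrix operations — can be illustrated on a toy spring network. This is only an analogue of MCFF: the member list, stiffness and rest length below are invented, and a real tensegrity would distinguish tension-only cables from compression struts:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    nodes = rng.uniform(-1, 1, size=(5, 2))          # 5 free nodes in 2D
    members = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2), (1, 3)]
    k, rest = 1.0, 0.8                               # spring stiffness, rest length

    def energy(x):
        """Total elastic energy of the member springs."""
        return sum(0.5 * k * (np.linalg.norm(x[i] - x[j]) - rest) ** 2
                   for i, j in members)

    e, step = energy(nodes), 0.1
    for it in range(20000):
        trial = nodes.copy()
        n = rng.integers(len(nodes))
        trial[n] += rng.normal(scale=step, size=2)   # random move of one node
        e_trial = energy(trial)
        if e_trial < e:                              # keep only downhill moves
            nodes, e = trial, e_trial

    print("residual energy:", e)
    ```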

  7. Diffusion Monte Carlo: Exponentially inefficient for large systems?

    CERN Document Server

    Nemec, Norbert

    2009-01-01

    The computational cost of a Monte Carlo algorithm can only be meaningfully discussed when taking into account the magnitude of the resulting statistical error. Aiming for a fixed error per particle, we study the scaling behavior of the diffusion Monte Carlo method for large quantum systems. We identify the correlation within the population of walkers as the dominant scaling factor for large systems. While this factor is negligible for small and medium sized systems that are typically studied, it ultimately shows exponential scaling beyond system sizes that can be estimated straightforwardly for each specific system.

  8. Novel Quantum Monte Carlo Approaches for Quantum Liquids

    Science.gov (United States)

    Rubenstein, Brenda M.

    Quantum Monte Carlo methods are a powerful suite of techniques for solving the quantum many-body problem. By using random numbers to stochastically sample quantum properties, QMC methods are capable of studying low-temperature quantum systems well beyond the reach of conventional deterministic techniques. QMC techniques have likewise been indispensable tools for augmenting our current knowledge of superfluidity and superconductivity. In this thesis, I present two new quantum Monte Carlo techniques, the Monte Carlo Power Method and Bose-Fermi Auxiliary-Field Quantum Monte Carlo, and apply previously developed Path Integral Monte Carlo methods to explore two new phases of quantum hard spheres and hydrogen. I lay the foundation for a subsequent description of my research by first reviewing the physics of quantum liquids in Chapter One and the mathematics behind Quantum Monte Carlo algorithms in Chapter Two. I then discuss the Monte Carlo Power Method, a stochastic way of computing the first several extremal eigenvalues of a matrix too memory-intensive to be stored and therefore diagonalized. As an illustration of the technique, I demonstrate how it can be used to determine the second eigenvalues of the transition matrices of several popular Monte Carlo algorithms. This information may be used to quantify how rapidly a Monte Carlo algorithm is converging to the equilibrium probability distribution it is sampling. I next present the Bose-Fermi Auxiliary-Field Quantum Monte Carlo algorithm. This algorithm generalizes the well-known Auxiliary-Field Quantum Monte Carlo algorithm for fermions to bosons and Bose-Fermi mixtures. Despite some shortcomings, the Bose-Fermi Auxiliary-Field Quantum Monte Carlo algorithm represents the first exact technique capable of studying Bose-Fermi mixtures of any size in any dimension. In Chapter Six, I describe a new Constant Stress Path Integral Monte Carlo algorithm for the study of quantum mechanical systems under high pressures. While
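
    The Monte Carlo Power Method described above replaces exact matrix-vector products with sampled estimates, so the matrix never has to be applied (or stored) in full. A toy sketch of this idea — not the thesis implementation — using a small dense matrix purely so the stochastic estimate can be checked against exact diagonalisation:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    n = 200
    A = rng.random((n, n)) / n + np.eye(n)           # made-up test matrix

    def sampled_matvec(A, v, n_samples=50):
        """Unbiased estimate of A @ v using importance-sampled columns."""
        p = np.abs(v) / np.abs(v).sum()
        cols = rng.choice(len(v), size=n_samples, p=p)
        return (A[:, cols] * (v[cols] / p[cols])).mean(axis=1)

    # Power iteration with stochastic matrix-vector products.
    v = rng.random(n)
    for _ in range(200):
        w = sampled_matvec(A, v)
        lam = np.linalg.norm(w) / np.linalg.norm(v)
        v = w / np.linalg.norm(w)

    print("stochastic estimate:", lam)
    print("exact eigenvalue   :", np.max(np.abs(np.linalg.eigvals(A))))
    ```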

  9. A standard Event Class for Monte Carlo Generators

    Institute of Scientific and Technical Information of China (English)

    L.A.Gerren; M.Fischler

    2001-01-01

    StdHepC++ [1] is a CLHEP [2] Monte Carlo event class library which provides a common interface to Monte Carlo event generators. This work is an extensive redesign of the StdHep Fortran interface to use the full power of object-oriented design. A generated event maps naturally onto the Directed Acyclic Graph concept and we have used the HepMC classes to implement this. The full implementation allows the user to combine events to simulate beam pileup and access them transparently as though they were a single event.

  10. Parallelization of Monte Carlo codes MVP/GMVP

    Energy Technology Data Exchange (ETDEWEB)

    Nagaya, Yasunobu; Mori, Takamasa; Nakagawa, Masayuki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Sasaki, Makoto

    1998-03-01

    The general-purpose Monte Carlo codes MVP/GMVP are well vectorized and thus enable us to perform high-speed Monte Carlo calculations. In order to achieve further speedups, we parallelized the codes on different types of parallel processing platforms. The platforms reported are a distributed-memory vector-parallel computer Fujitsu VPP500, a distributed-memory massively parallel computer Intel Paragon and a distributed-memory scalar-parallel computer Hitachi SR2201. As is generally observed, ideal speedup could be obtained for large-scale problems, but parallelization efficiency worsened as the batch size per processing element (PE) became smaller. (author)

  11. Enhancements for Multi-Player Monte-Carlo Tree Search

    Science.gov (United States)

    Nijssen, J. (Pim) A. M.; Winands, Mark H. M.

    Monte-Carlo Tree Search (MCTS) is becoming increasingly popular for playing multi-player games. In this paper we propose two enhancements for MCTS in multi-player games: (1) Progressive History and (2) Multi-Player Monte-Carlo Tree Search Solver (MP-MCTS-Solver). We analyze the performance of these enhancements in two different multi-player games: Focus and Chinese Checkers. Based on the experimental results we conclude that Progressive History is a considerable improvement in both games and MP-MCTS-Solver, using the standard update rule, is a genuine improvement in Focus.

  12. Monte Carlo simulation of electrons in dense gases

    Science.gov (United States)

    Tattersall, Wade; Boyle, Greg; Cocks, Daniel; Buckman, Stephen; White, Ron

    2014-10-01

    We implement a Monte Carlo simulation modelling the transport of electrons and positrons in dense gases and liquids, using a dynamic structure factor that allows us to construct structure-modified effective cross sections. These account for the coherent effects caused by interactions with the relatively dense medium. The dynamic structure factor also allows us to model thermal gases in the same manner, without needing to directly sample the velocities of the neutral particles. We present the results of a series of Monte Carlo simulations that verify and apply this new technique, and make comparisons with macroscopic predictions and Boltzmann equation solutions. This work was financially supported by the Australian Research Council.

  13. Cosmological Markov Chain Monte Carlo simulation with Cmbeasy

    CERN Document Server

    Müller, C M

    2004-01-01

    We introduce a Markov Chain Monte Carlo simulation and data analysis package for the cosmological computation package Cmbeasy. We have taken special care to implement an adaptive step algorithm for the Markov Chain Monte Carlo in order to improve convergence. Data analysis routines are provided which allow one to test models of the Universe against up-to-date measurements of the Cosmic Microwave Background, Supernovae Ia and Large Scale Structure. The observational data are provided with the software for convenient usage. The package is publicly available as part of the Cmbeasy software at www.cmbeasy.org.
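
    An adaptive step scheme of the kind mentioned above typically tunes the proposal scale toward a target acceptance rate during burn-in and then freezes it, so that the subsequent chain satisfies detailed balance. A minimal sketch with a stand-in Gaussian posterior (a real run would call the package's CMB/SN/LSS likelihoods, and the target rate of ~25% is a common rule of thumb rather than Cmbeasy's documented choice):

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    def log_post(theta):
        """Stand-in log-posterior (anisotropic 2D Gaussian)."""
        return -0.5 * (theta[0] ** 2 + (theta[1] / 0.3) ** 2)

    theta, lp, step = np.zeros(2), 0.0, 1.0
    lp = log_post(theta)
    chain, accepts = [], 0

    for i in range(1, 20001):
        prop = theta + step * rng.normal(size=2)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:      # Metropolis accept/reject
            theta, lp = prop, lp_prop
            accepts += 1
        if i <= 5000 and i % 100 == 0:
            # Adaptive step during burn-in only: steer acceptance toward ~25%.
            step *= 1.1 if accepts / i > 0.25 else 0.9
        if i > 5000:
            chain.append(theta.copy())               # keep post-burn-in samples

    chain = np.array(chain)
    print("acceptance:", accepts / 20000, " posterior std:", chain.std(axis=0))
    ```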

  14. Monte Carlo simulation of breast tomosynthesis: visibility of microcalcifications at different acquisition schemes

    Science.gov (United States)

    Petersson, Hannie; Dustler, Magnus; Tingberg, Anders; Timberg, Pontus

    2015-03-01

    Microcalcifications are one feature of interest in mammography and breast tomosynthesis (BT). To achieve optimal conditions for detection of microcalcifications in BT imaging, different acquisition geometries should be evaluated. The purpose of this work was to investigate the influence of acquisition schemes with different angular ranges, projection distributions and dose distributions on the visibility of microcalcifications in reconstructed BT volumes. Microcalcifications were inserted randomly in a high-resolution software phantom, and a simulation procedure was used to model a MAMMOMAT Inspiration BT system. The simulation procedure was based on analytical ray tracing to produce primary images, Monte Carlo simulation of scatter contributions, and flat-field image acquisitions to model system characteristics. Image volumes were reconstructed using a novel method, super-resolution reconstruction with statistical artifact reduction (SRSAR). For comparison purposes, the volume of the standard acquisition scheme (50° angular range with uniform projection and dose distributions) was also reconstructed using standard filtered backprojection (FBP). To compare the visibility and depth resolution of the microcalcifications, the signal difference to noise ratio (SDNR) and the artifact spread function width (ASFW) were calculated. The acquisition schemes with very high central dose yielded significantly lower SDNR than the schemes with more uniform dose distributions. The ASFW was found to decrease (meaning an increase in depth resolution) with wider angular range. In conclusion, none of the evaluated acquisition schemes were found to yield higher SDNR or depth resolution for the simulated microcalcifications than the standard acquisition scheme.
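
    For reference, the two figures of merit can be computed along these lines; the array layout, ROI choices and test data below are illustrative assumptions, not the authors' implementation.

        # Sketch of the two figures of merit on a reconstructed volume of
        # shape (slices, rows, cols); ROIs are tuples of slices.
        import numpy as np

        def sdnr(image, signal_roi, background_roi):
            """Signal difference to noise ratio between two ROIs."""
            diff = image[signal_roi].mean() - image[background_roi].mean()
            return abs(diff) / image[background_roi].std()

        def asf_width(volume, row, col):
            """Artifact spread function width: number of slices over
            which a point object stays above half its peak (smaller =
            sharper depth resolution)."""
            profile = volume[:, row, col].astype(float)
            profile -= profile.min()
            profile /= profile.max()
            return int((profile >= 0.5).sum())

        vol = np.random.rand(40, 64, 64) * 0.1
        vol[18:23, 32, 32] += 1.0              # simulated calcification
        print(asf_width(vol, 32, 32))          # ~5 slices
        plane = vol[20]
        print(sdnr(plane, (slice(30, 35), slice(30, 35)),
                   (slice(0, 10), slice(0, 10))))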

  15. Modeling High Energy (I-131) Pinhole Collimator for Small Animal Gamma Camera by Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Young Jun; Kim, Kyeong Min; Kim, Jin Su; Park, Ji Ae; Lee, Young Sub; Yoo, A-ram; Kim, Jong Guk [Korea Institute of Radiologic and Medical Sciences, Seoul (Korea, Republic of); Lee, Hak Jae; Lee, Ki Sung [Korea University, Seoul (Korea, Republic of)

    2011-05-15

    In medical nuclear imaging, I-131 plays an important role not only in diagnostic imaging but also in quantitative evaluation for nuclear medicine therapy. However, due to the relatively high-energy emissions of I-131 [364 keV (82%), 326 keV (0.27%), 503 keV (0.36%), 637 keV (7.18%), 643 keV (0.22%), 723 keV (1.77%)], it is difficult to construct a high-resolution, high-sensitivity preclinical gamma camera. In particular, at 637 keV and 723 keV, penetration and scattering occur with relatively high probability, and penetration and scattering of high-energy gamma rays in the collimator severely degrade image quality. Given these characteristics, it is essential to design a collimator that minimizes these degrading factors while preserving the gamma rays used for imaging. In this study, we designed and simulated the structure of a pinhole collimator for a small-animal high-energy gamma camera by Monte Carlo simulation (GATE 6.0). In this model, the pinhole diameter, the channel length of the pinhole and the thickness of the collimator are the main factors determining the system sensitivity. Thus, we observed the differences in the number of photons reaching the scintillator through the collimator as determined by these three factors.
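
    As context for that design trade-off, a standard first-order expression for pinhole geometric sensitivity can be sketched as follows; this back-of-envelope formula is not the GATE model used in the study, and the example distances are invented.

        # First-order pinhole geometric efficiency
        #   g ~ d^2 * sin^3(theta) / (16 h^2)
        # for a point source at distance h from a pinhole of diameter d,
        # viewed at incidence angle theta. At 637-723 keV a larger d
        # raises sensitivity, but edge penetration grows as well, which
        # is why the full GATE simulation is needed.
        import math

        def pinhole_sensitivity(d_mm, h_mm, theta_deg=90.0):
            theta = math.radians(theta_deg)
            return d_mm**2 * math.sin(theta)**3 / (16.0 * h_mm**2)

        for d in (0.5, 1.0, 2.0):             # pinhole diameters in mm
            print(d, pinhole_sensitivity(d, h_mm=30.0))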

  16. Carlos Chagas Discoveries as a Drop Back to Scientific Construction of Chronic Chagas Heart Disease

    Directory of Open Access Journals (Sweden)

    Reinaldo B. Bestetti

    2016-01-01

    The scientific construction of chronic Chagas heart disease (CCHD) started in 1910, when Carlos Chagas highlighted the presence of cardiac arrhythmia during physical examination of patients with chronic Chagas disease and described a case of heart failure associated with myocardial inflammation and nests of parasites at autopsy. He described sudden cardiac death associated with arrhythmias in 1911, and reported its association with complete AV block, detected by Jacquet's polygraph, in 1912. Chagas showed the presence of myocardial fibrosis underlying the clinical picture of CCHD in 1916, and he presented a full characterization of the clinical aspects of CCHD in 1922. In 1928, Chagas detected fibrosis of the conductive system and pointed out the presence of marked cardiomegaly on the chest X-ray associated with minimal symptomatology. The use of serological reactions to diagnose CCHD was put into clinical practice in 1936, after Chagas' death, which, along with the 12-lead ECG, revealed the epidemiological importance of CCHD in 1945. In 1953, the long period between initial infection and the appearance of CCHD was established, whereas the annual incidence of CCHD arising from patients with the indeterminate form of the disease was established in 1956. The use of heart catheterization in 1965, exercise stress testing in 1973, Holter monitoring in 1975, electrophysiologic testing in 1973, echocardiography in 1975, endomyocardial biopsy in 1981, and magnetic resonance imaging in 1995 added to the fundamental clinical aspects of CCHD as described by Carlos Chagas.

  17. Radiosteoplasty study in animal bone and radiodosimetric evaluation using Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Silveira, Marcia Flavia; Campos, Tarcisio Passos Ribeiro [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Engenharia Nuclear]. E-mail: marciaflaviafisio@gmail.com; campos@nuclear.ufmg.br

    2007-07-01

    Radiosteoplasty is a procedure that consists of injecting a radioactive biomaterial, incorporated into bone cement, into an osseous structure affected by cancer. This technique has been developed with the main objectives of controlling the tumor or regional bone metastasis (in situ), reducing pain and increasing structural resistance. In the present study, radiosteoplasty was applied to bovine and swine bones in vitro using non-radioactive cement. The objective was to determine the spatial distribution of the cold (non-radioactive) compound in pig and ox bones after implantation. A 2 mm needle was introduced into the cortical bone, which had previously been perforated. The distribution of the biomaterial was observed through radiological images obtained just after application of the compound. Recent dosimetric studies using the Monte Carlo N-Particle code (MCNP-5) concluded that the spatial dose distribution is suitable for the radiosteoplasty protocol applied to treat bone tumors of the upper and lower limbs. The Monte Carlo method simulates the process at hand and is a particularly useful tool for solving the complex photon and electron transport problems that cannot be modeled by codes based on deterministic methods. These radiodosimetric studies are presented and discussed. (author)

  18. Sensitivity of UVER enhancement to broken liquid water clouds: A Monte Carlo approach

    Science.gov (United States)

    Núñez, M.; Marín, M. J.; Serrano, D.; Utrillas, M. P.; Fienberg, K.; Martínez-Lozano, J. A.

    2016-01-01

    The study uses a Monte Carlo radiative transfer model to examine the sensitivity of the UV erythemal radiation (UVER) enhancement to broken liquid water clouds of the cumulus and stratocumulus type. The model uses monochromatic radiation at 310 nm, corresponding approximately to the peak of the product between irradiance and the erythemal curve. All scattering, absorption and extinction coefficients, and spectral albedos, are tuned to this wavelength. In order of importance, fractional cloud cover, the area of individual cloud patches, and cloud thickness exert a strong influence on the enhancement, with smaller contributions from cloud optical depth, cloud base height, and solar zenith angle. In order to produce realistic enhancements for our study area, located in the Valencia region of Spain (39°30'N, 0°25'W), measurements were obtained from a Landsat image of the region in combination with a spectral Fourier transform model. The Monte Carlo model, as applied to the Fourier transform cloud distribution, produced satisfactory results compared to 1 year of measured UVER enhancement for the study region, provided that fractional cloud cover was equal to or greater than 3/10. At smaller cloud fractions, the model's neglect of cloud patches smaller than 50 m × 50 m in area created significant discrepancies.
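
    To make the enhancement mechanism concrete, here is a deliberately crude toy model; all geometry and optical parameters are invented, and the paper's model is a full radiative transfer code, not this sketch.

        # Toy broken-cloud model: when the solar disc is unobstructed,
        # light scattered from nearby bright cloud sides adds to the
        # direct beam, so instantaneous UVER can exceed the clear-sky
        # value; when a patch blocks the sun, UVER drops sharply.
        import random

        def simulate(cloud_fraction, n=100_000, boost=0.6, transmitted=0.25):
            values = []
            for _ in range(n):
                if random.random() < cloud_fraction:   # sun blocked
                    values.append(transmitted)
                else:                                  # sun visible
                    side = boost * cloud_fraction * random.random()
                    values.append(1.0 + side)          # enhancement > 1
            mean = sum(values) / n
            enhanced = sum(v > 1.0 for v in values) / n
            return mean, enhanced

        for cf in (0.1, 0.3, 0.5):
            mean, enhanced = simulate(cf)
            print(f"cover={cf:.1f}  mean/clear-sky={mean:.3f}  "
                  f"time enhanced={enhanced:.0%}")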