Monte Carlo radiation transport in external beam radiotherapy
Çeçen, Yiğit
2013-01-01
The use of Monte Carlo in radiation transport is an effective way to predict absorbed dose distributions. Monte Carlo modeling has contributed to a better understanding of photon and electron transport by radiotherapy physicists. The aim of this review is to introduce Monte Carlo as a powerful radiation transport tool. In this review, photon and electron transport algorithms for Monte Carlo techniques are investigated and a clinical linear accelerator model is studied for external beam radiot...
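The photon-transport idea running through these abstracts — tracking individual particles by sampling free-path lengths and interaction types — can be illustrated with a toy 1D slab simulation. All coefficients and the scatter model below are illustrative placeholders, not physical data or any code described in these records:

```python
import math
import random

def simulate_photons(n_photons, mu_total, n_bins, bin_width, absorb_prob, energy=6.0):
    """Toy 1D Monte Carlo photon transport in a slab (depth along +z).

    mu_total    -- total attenuation coefficient in 1/cm (illustrative value)
    absorb_prob -- chance an interaction absorbs the photon locally; otherwise
                   it scatters forward, depositing half its energy (toy model)
    Returns energy deposited per depth bin (a crude depth-dose curve).
    """
    dose = [0.0] * n_bins
    rng = random.Random(42)
    for _ in range(n_photons):
        z, e = 0.0, energy
        while e > 0.05:  # stop tracking below a cutoff energy
            # Sample the free path from the exponential attenuation law
            z += -math.log(1.0 - rng.random()) / mu_total
            b = int(z / bin_width)
            if b >= n_bins:
                break  # photon escaped the slab
            if rng.random() < absorb_prob:
                dose[b] += e  # absorption-like event: deposit all remaining energy
                break
            dose[b] += 0.5 * e  # toy forward scatter: deposit half, keep half
            e *= 0.5
    return dose

depth_dose = simulate_photons(20000, 0.07, 40, 1.0, 0.3)
```

Real codes such as EGSnrc, GEANT4, FLUKA and MCNP differ from this sketch in every physical detail (3D geometry, angular scattering, coupled electron transport, cross-section tables), but the sampling loop has the same shape.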
A Monte Carlo code for ion beam therapy
Anaïs Schaeffer
2012-01-01
Initially developed for applications in detector and accelerator physics, the modern Fluka Monte Carlo code is now used in many different areas of nuclear science. Over the last 25 years, the code has evolved to include new features, such as ion beam simulations. Given the growing use of these beams in cancer treatment, Fluka simulations are being used to design treatment plans in several hadron-therapy centres in Europe. (Figure: dose distribution calculated by Fluka for a patient treated at CNAO with proton beams; the colour bar displays the normalized dose values.) Fluka is a Monte Carlo code that very accurately simulates electromagnetic and nuclear interactions in matter. In the 1990s, in collaboration with NASA, the code was developed to predict the potential radiation hazards to space crews during possible future trips to Mars. Over the years, it has become the standard tool to investigate beam-machine interactions, radiation damage and radioprotection issues in the CERN accelerator com...
Monte Carlo simulations of nanoscale focused neon ion beam sputtering.
Timilsina, Rajendra; Rack, Philip D
2013-12-13
A Monte Carlo simulation is developed to model the physical sputtering of aluminum and tungsten emulating nanoscale focused helium and neon ion beam etching from the gas field ion microscope. Neon beams with different beam energies (0.5-30 keV) and a constant beam diameter (Gaussian with full-width-at-half-maximum of 1 nm) were simulated to elucidate the nanostructure evolution during the physical sputtering of nanoscale high aspect ratio features. The aspect ratio and sputter yield vary with the ion species and beam energy for a constant beam diameter and are related to the distribution of the nuclear energy loss. Neon ions have a larger sputter yield than the helium ions due to their larger mass and consequently larger nuclear energy loss relative to helium. Quantitative information such as the sputtering yields, the energy-dependent aspect ratios and resolution-limiting effects are discussed.
An Investigation of Radiotherapy Electron Beams Using Monte Carlo Techniques
Ding, George X.
1995-01-01
Radiotherapy electron beams are more complicated than photon beams due to variations in beam production, the scattering of low-energy electrons, and the presence of contaminant photons. Detailed knowledge of a radiotherapy beam is essential for an accurate calculation of the dose distribution by a treatment planning system. This investigation aims to enhance our understanding of radiotherapy beams, focusing on electron beams. It starts with a description of the Monte Carlo simulation code, BEAM, and a detailed simulation of an accelerator head to obtain realistic radiotherapy beams. The simulation covers electron beams from various accelerators, including the NRC research accelerator, the NPL (UK) accelerator, a Varian Clinac 2100C, a Philips SL75-20, a Siemens KD2, an AECL Therac 20, and a Scanditronix MM50. The beam energies range from 4 to 50 MeV. The EGS4 user code, BEAM, is extensively benchmarked against experiment by comparing calculated dose distributions with dose distributions measured in water. The simulated beams are analyzed to obtain the characteristics of electron beams from a variety of accelerators. The simulated beams are also used as inputs to calculate the following parameters: the mean electron energy, the most probable energy, the energy-range relationships, the depth-scaling factor to convert depths in plastic to water-equivalent depths, the water-to-air stopping-power ratios, and the electron fluence correction factors used to convert dose measured in plastic to dose in water. These parameters are essential for electron beam dosimetry. The results of this study can be applied in cancer clinics to improve the accuracy of absolute dosimetry. The simulation also provides information about backscatter into the beam monitor chamber and predicts its influence on beam output factors. This investigation presents comprehensive data on clinical electron beams, and answers many questions which could
Monte Carlo simulation of electron beam air plasma characteristics
Institute of Scientific and Technical Information of China (English)
Deng Yong-Feng; Han Xian-Wei; Tan Chang
2009-01-01
A high-energy electron beam generator is used to generate a plasma in the atmosphere. Based on a Monte Carlo toolkit named GEANT4, a model including the complete physics processes is established to simulate the passage of the electron beam through air. Based on this model, the characteristics of the electron beam air plasma are calculated. The energy distribution of beam electrons (BEs) indicates that high-energy electrons mostly reside in the central region of the beam, while low-energy electrons are concentrated in the fringe area. The energy deposition is calculated in two cases, i.e., with and without secondary electrons (SEs). Analysis indicates that the energy deposition of SEs accounts for a large part of the total energy deposition. The energy spectrum results show that the electrons in the inlet layer of the low-pressure chamber (LPC) are monoenergetic, but the energy spectrum of the electrons in the outlet layer is not pure. The SEs are largely generated at the outlet of the LPC. Moreover, both the energy distribution of BEs and the magnitude of the SE density are closely related to the LPC pressure. Thus, a conclusion is drawn that a low LPC pressure helps to reduce the energy loss in the LPC and is also useful for greatly increasing the secondary electron density in dense air.
Monte Carlo physical dosimetry for small photon beams
Energy Technology Data Exchange (ETDEWEB)
Perucha, M.; Rincon, M.; Leal, A.; Carrasco, E. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica; Sanchez-Doblado, F. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica]|[Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Nunez, L. [Clinica Puerta de Hierro, Madrid (Spain). Servicio de Radiofisica; Arrans, R.; Sanchez-Calzado, J.A.; Errazquin, L. [Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica
2001-07-01
Small-field dosimetry is complicated by the lack of electronic equilibrium and by steep dose gradients. This work compares PDD curves, profiles and output factors measured with conventional detectors (film, diode, TLD and ionisation chamber) and calculated with Monte Carlo. The 6 MV nominal energy beam from a Philips SL-18 linac has been simulated using the OMEGA code. MC calculation reveals itself as a convenient method to validate output factors and profiles in special conditions, such as small fields. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Zychor, I. [Soltan Inst. for Nuclear Studies, Otwock-Swierk (Poland)
1994-12-31
The application of a Monte Carlo method to the study of electron and photon beam transport in matter is presented, especially for electrons with energies up to 18 MeV. The SHOWME Monte Carlo code, a modified version of the GEANT3 code, was used on the CONVEX C3210 computer at Swierk. It was assumed that the electron beam is monodirectional and monoenergetic. Arbitrary user-defined, complex geometries made of any element or material can be used in the calculations. All principal phenomena occurring when an electron beam penetrates matter are taken into account. The use of the calculations for therapeutic electron beam collimation is presented. (author). 20 refs, 29 figs.
Novel imaging and quality assurance techniques for ion beam therapy a Monte Carlo study
Rinaldi, I; Jäkel, O; Mairani, A; Parodi, K
2010-01-01
Ion beams exhibit a finite and well defined range in matter together with an “inverted” depth-dose profile, the so-called Bragg peak. These favourable physical properties may enable superior tumour-dose conformality for high precision radiation therapy. On the other hand, they introduce the issue of sensitivity to range uncertainties in ion beam therapy. Although these uncertainties are typically taken into account when planning the treatment, correct delivery of the intended ion beam range has to be assured to prevent undesired underdosage of the tumour or overdosage of critical structures outside the target volume. Therefore, it is necessary to define dedicated Quality Assurance procedures to enable in-vivo range verification before or during therapeutic irradiation. For these purposes, Monte Carlo transport codes are very useful tools to support the development of novel imaging modalities for ion beam therapy. In the present work, we present calculations performed with the FLUKA Monte Carlo code and pr...
Patient-dependent beam-modifier physics in Monte Carlo photon dose calculations.
Schach von Wittenau, A E; Bergstrom, P M; Cox, L J
2000-05-01
Model pencil-beam-on-slab calculations, together with a series of detailed calculations of photon and electron output from commercial accelerators, are used to quantify the level(s) of physics required for the Monte Carlo transport of photons and electrons through treatment-dependent beam modifiers, such as jaws, wedges, blocks, and multileaf collimators, in photon teletherapy dose calculations. The physics approximations investigated comprise (1) not tracking particles below a given kinetic energy, (2) continuing to track particles but performing simplified collision physics, particularly in the handling of secondary particle production, and (3) not tracking particles in specific spatial regions. Figures of merit needed to estimate the effects of these approximations are developed, and the estimates are compared with full-physics Monte Carlo calculations of the contribution of the collimating jaws to the on-axis depth-dose curve in a water phantom. These figures of merit are then used to evaluate various approximations in coupled photon/electron physics in beam modifiers. Approximations for tracking electrons in air are also evaluated. It is found that knowledge of the materials used for beam modifiers, of the energies of the photon beams used, and of the length scales typical of photon teletherapy plans allows a number of simplifying approximations to be made in the Monte Carlo transport of secondary particles from the accelerator head and beam modifiers to the isocenter plane.
Monte Carlo Simulation of a Linear Accelerator and Electron Beam Parameters Used in Radiotherapy
Directory of Open Access Journals (Sweden)
Mohammad Taghi Bahreyni Toossi
2009-06-01
Introduction: In recent decades, several Monte Carlo codes have been introduced for research and medical applications. These methods provide accurate and detailed calculation of particle transport from linear accelerators. The main drawback of Monte Carlo techniques is the extremely long computing time required to obtain a dose distribution with good statistical accuracy. Material and Methods: In this study, the MCNP-4C Monte Carlo code was used to simulate the electron beams generated by a Neptun 10 PC linear accelerator. Depth dose curves, beam profiles, and the parameters derived from them were calculated for 6, 8 and 10 MeV electron beams with different field sizes, and these data were compared with the corresponding measured values. The measurements were performed with a Welhofer-Scanditronix dose scanning system, semiconductor detectors and ionization chambers. Results: The results showed good agreement (better than 2%) between calculated and measured depth doses and lateral dose profiles for all energies and field sizes. Good agreement was also achieved between calculated and measured electron beam parameters such as E0, Rq, Rp and R50. Conclusion: The linac model developed in this study is capable of computing electron beam data in a water phantom for different field sizes, and the resulting data can be used to predict dose distributions in other, more complex geometries.
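The beam-quality parameters this abstract compares (Rp, R50) are simple functionals of a measured depth-dose curve. A minimal sketch of one way to extract them is below; the linear-interpolation scheme and the synthetic curve are assumptions for illustration, not the paper's method or data:

```python
def r50_rp(depths, pdd):
    """Estimate R50 (depth of 50% dose) and Rp (practical range) from a
    percent-depth-dose curve normalized to 100 at the maximum.
    R50: linear interpolation where the falling curve crosses 50%.
    Rp: intersection of the steepest-descent tangent with the tail,
        here approximated by the final pdd value (a simplification)."""
    imax = pdd.index(max(pdd))
    r50 = None
    for i in range(imax, len(pdd) - 1):
        if pdd[i] >= 50.0 >= pdd[i + 1]:
            f = (pdd[i] - 50.0) / (pdd[i] - pdd[i + 1])
            r50 = depths[i] + f * (depths[i + 1] - depths[i])
            break
    slopes = [(pdd[i + 1] - pdd[i]) / (depths[i + 1] - depths[i])
              for i in range(len(pdd) - 1)]
    k = min(range(len(slopes)), key=lambda i: slopes[i])  # most negative slope
    rp = depths[k] + (pdd[-1] - pdd[k]) / slopes[k]
    return r50, rp

depths = [0, 1, 2, 3, 4, 5]                    # cm, synthetic example
pdd = [85.0, 100.0, 90.0, 50.0, 10.0, 2.0]     # illustrative values only
r50, rp = r50_rp(depths, pdd)                  # → r50 = 3.0 cm, rp = 4.2 cm
```

Production dosimetry protocols define these quantities more carefully (e.g. against the bremsstrahlung background), but the geometric construction is the same.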
Effects of physics change in Monte Carlo code on electron pencil beam dose distributions
Energy Technology Data Exchange (ETDEWEB)
Toutaoui, Abdelkader, E-mail: toutaoui.aek@gmail.com [Departement de Physique Medicale, Centre de Recherche Nucleaire d' Alger, 2 Bd Frantz Fanon BP399 Alger RP, Algiers (Algeria); Khelassi-Toutaoui, Nadia, E-mail: nadiakhelassi@yahoo.fr [Departement de Physique Medicale, Centre de Recherche Nucleaire d' Alger, 2 Bd Frantz Fanon BP399 Alger RP, Algiers (Algeria); Brahimi, Zakia, E-mail: zsbrahimi@yahoo.fr [Departement de Physique Medicale, Centre de Recherche Nucleaire d' Alger, 2 Bd Frantz Fanon BP399 Alger RP, Algiers (Algeria); Chami, Ahmed Chafik, E-mail: chafik_chami@yahoo.fr [Laboratoire de Sciences Nucleaires, Faculte de Physique, Universite des Sciences et de la Technologie Houari Boumedienne, BP 32 El Alia, Bab Ezzouar, Algiers (Algeria)
2012-01-15
Pencil beam algorithms used in computerized electron beam dose planning are usually described using small-angle multiple scattering theory. Alternatively, the pencil beams can be generated by Monte Carlo simulation of electron transport. In a previous work, the 4th version of the Electron Gamma Shower (EGS) Monte Carlo code was used to obtain dose distributions from monoenergetic electron pencil beams, with incident energies between 1 MeV and 50 MeV, interacting at the surface of a large cylindrical homogeneous water phantom. In 2000, a new version of this Monte Carlo code was made available by the National Research Council of Canada (NRC), which includes various improvements in its electron-transport algorithms. In the present work, we were interested in whether the new physics in this version produces pencil beam dose distributions very different from those calculated with the older one. The purpose of this study is to quantify as well as to understand these differences. We have compared a series of pencil beam dose distributions scored in cylindrical geometry, for electron energies between 1 MeV and 50 MeV, calculated with the two versions of the Electron Gamma Shower Monte Carlo code. The data calculated and compared include isodose distributions, radial dose distributions and fractions of energy deposition. Our results for radial dose distributions show agreement within 10% between doses calculated by the two codes for voxels close to the pencil beam central axis, while the differences are up to 30% at larger distances. For fractions of energy deposition, the results of EGS4 are in good agreement (within 2%) with those calculated by EGSnrc at shallow depths for all energies, whereas somewhat worse agreement (15%) is observed at greater depths. These differences may be mainly attributed to the different multiple scattering theories for electron transport adopted in the two codes and the inclusion of the spin effect, which produces an increase of the effective range of
Commissioning of a Varian Clinac iX 6 MV photon beam using Monte Carlo simulation
Dirgayussa, I. Gde Eka; Yani, Sitti; Rhani, M. Fahdillah; Haryanto, Freddy
2015-09-01
Monte Carlo modelling of a linear accelerator is the first and most important step in Monte Carlo dose calculations in radiotherapy. Monte Carlo is considered today to be the most accurate and detailed calculation method in different fields of medical physics. In this research, we developed a photon beam model of a Varian Clinac iX 6 MV equipped with a Millennium MLC 120 for dose calculation purposes, using the BEAMnrc/DOSXYZnrc Monte Carlo system based on the underlying EGSnrc particle transport code. The Monte Carlo simulation for commissioning this linac head was divided into stages: designing the head model using BEAMnrc, characterizing the model using BEAMDP, and analyzing the differences between simulation and measurement data using DOSXYZnrc. In the first step, to reduce simulation time, the virtual treatment head was built in two parts (a patient-dependent component and a patient-independent component). The incident electron energy was varied over 6.1 MeV, 6.2 MeV, 6.3 MeV, 6.4 MeV, and 6.6 MeV, and the FWHM (full width at half maximum) of the source was 1 mm. The phase-space file from the virtual model was characterized using BEAMDP. The results of the MC calculations in a water phantom using DOSXYZnrc, percent depth doses (PDDs) and beam profiles at a depth of 10 cm, were compared with measurements. The process is considered complete if the difference between measured and calculated relative depth-dose data along the central axis and dose profiles at a depth of 10 cm is ≤ 5%. The effect of beam width on percentage depth doses and beam profiles was also studied. Results of the virtual model were in close agreement with measurements at an incident electron energy of 6.4 MeV. Our results showed that the photon beam width could be tuned using the large-field beam profile at the depth of maximum dose. The Monte Carlo model developed in this study accurately represents the Varian Clinac iX with the Millennium MLC 120 and can be used for reliable patient dose calculations. In this commissioning process, the good criteria of dose
Energy Technology Data Exchange (ETDEWEB)
Jabbari, Keyvan; Sarfehnia, Arman; Podgorsak, Ervin B; Seuntjens, Jan P [Medical Physics Unit, McGill University, Montreal General Hospital, 1650 avenue Cedar, Montreal, Quebec H3G 1A4 (Canada)
2007-02-21
The basic characteristics of orthogonal bremsstrahlung beams are studied and the feasibility of improved contrast imaging with such a beam is evaluated. In the context of this work, orthogonal bremsstrahlung beams represent the component of the bremsstrahlung distribution perpendicular to the electron beam impinging on an accelerator target. The BEAMnrc Monte Carlo code was used to study target characteristics, energy spectra and relative fluences of orthogonal beams to optimize target design. The reliability of the simulations was verified by comparing our results with benchmark experiments. Using the results of the Monte Carlo optimization, the targets with various materials and a collimator were designed and built. The primary pencil electron beam from the research port of a Varian Clinac-18 accelerator striking on Al, Pb and C targets was used to create orthogonal beams. For these beams, diagnostic image contrast was tested by placing simple Lucite objects in the path of the beams and comparing image contrast obtained in the orthogonal direction to the one obtained in the forward direction. The simulations for various target materials and various primary electron energies showed that a width of 80% of the continuous-slowing-down approximation range (R{sub CSDA}) is sufficient to remove electron contamination in the orthogonal direction. The photon fluence of the orthogonal beam for high Z targets is larger compared to low Z targets, i.e. by a factor of 20 for W compared to Be. For a 6 MeV electron beam, the mean energy for low Z targets is calculated to be 320 keV for Al and 150 keV for Be, and for a high Z target like Pb to be 980 keV. For irradiation times of 1.2 s in an electron mode of the linac, the contrast of diagnostic images created with orthogonal beams from the Al target is superior to that in the forward direction. The image contrast and the beam profile of the bremsstrahlung beams were also studied. Both the Monte Carlo study and experiment showed
Monte Carlo modeling of ion beam induced secondary electrons
Energy Technology Data Exchange (ETDEWEB)
Huh, U., E-mail: uhuh@vols.utk.edu [Biochemistry & Cellular & Molecular Biology, University of Tennessee, Knoxville, TN 37996-0840 (United States); Cho, W. [Electrical and Computer Engineering, University of Tennessee, Knoxville, TN 37996-2100 (United States); Joy, D.C. [Biochemistry & Cellular & Molecular Biology, University of Tennessee, Knoxville, TN 37996-0840 (United States); Center for Nanophase Materials Science, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States)
2016-09-15
Ion induced secondary electrons (iSE) can produce high-resolution images over a wide range of materials at incident beam energies from a few eV to 100 keV. The interpretation of such images requires knowledge of the secondary electron yield (iSE δ) for each of the elements and materials present, as a function of the incident beam energy. Experimental data for helium ions are currently limited to 40 elements and six compounds, while other ions are not well represented. To overcome this limitation, we propose a simple procedure based on the comprehensive work of Berger et al. Here we show that, in the energy range 10–100 keV, the Berger et al. data for elements and compounds can be accurately represented by a single universal curve. The agreement between the limited experimental data available and the predictive model is good, and the model has been found to provide reliable yield data for a wide range of elements and compounds. - Highlights: • The Universal ASTAR Yield Curve was derived from data recently published by NIST. • IONiSE incorporated with the Curve will predict iSE yield for elements and compounds. • This approach can also handle other ion beams by changing the basic scattering profile.
Monte Carlo study of secondary electron production from gold nanoparticle in proton beam irradiation
Directory of Open Access Journals (Sweden)
Jeff Gao
2014-03-01
Purpose: In this study, we examined some characteristics of the secondary electrons produced by a gold nanoparticle (NP) during proton beam irradiation. Method: Using the Geant4 Monte Carlo simulation toolkit, we simulated NPs with radii (r) of 17.5 nm, 25 nm, 35 nm and 50 nm. The proton beam energies used were 20 MeV, 50 MeV, and 100 MeV. Findings on secondary electron production and average kinetic energy are presented in this paper. Results: Firstly, for an NP of finite size, secondary electron production increases with decreasing incident proton beam energy, and a secondary buildup exists outside the NP. Secondly, the average kinetic energy of the secondary electrons produced by a gold NP increases with incident proton beam energy. Thirdly, the larger the NP, the greater the secondary electron production. Conclusion: Collectively, our results suggest that, apart from biological uptake efficiency, the secondary electron production effect should be taken into account when considering the potential use of NPs in proton beam irradiation. Cite this article as: Gao J, Zheng Y. Monte Carlo study of secondary electron production from gold nanoparticle in proton beam irradiation. Int J Cancer Ther Oncol 2014; 2(2):02025. DOI: http://dx.doi.org/10.14319/ijcto.0202.5
Monte Carlo simulation of spectrum changes in a photon beam due to a brass compensator
Energy Technology Data Exchange (ETDEWEB)
Custidiano, E.R., E-mail: ernesto7661@gmail.com [Department of Physics, FaCENA, UNNE, Av., Libertad 5470, C.P.3400, Corrientes (Argentina); Valenzuela, M.R., E-mail: meraqval@gmail.com [Department of Physics, FaCENA, UNNE, Av., Libertad 5470, C.P.3400, Corrientes (Argentina); Dumont, J.L., E-mail: Joseluis.Dumont@elekta.com [Elekta CMS Software, St.Louis, MO (United States); McDonnell, J., E-mail: josemc@express.com.ar [Cumbres Institute, Riobamba 1745, C.P.2000, Rosario, Santa Fe (Argentina); Rene, L, E-mail: luismrene@gmail.com [Radiotherapy Center, Crespo 953, C.P.2000, Rosario, Santa Fe (Argentina); Rodriguez Aguirre, J.M., E-mail: juakcho@gmail.com [Department of Physics, FaCENA, UNNE, Av., Libertad 5470, C.P.3400, Corrientes (Argentina)
2011-06-15
Monte Carlo simulations were used to study the changes in the incident spectrum when a polyenergetic photon beam passes through a static brass compensator. The transmitted photon spectrum was evaluated by comparing it with the incident spectrum. We also discriminated the changes in the transmitted spectrum produced by each of the microscopic processes (i.e., Rayleigh scattering, photoelectric effect, Compton scattering, and pair production). The results show that the dominant process in the energy range considered is the Compton effect, as expected for composite materials of intermediate atomic number.
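Discriminating per-process contributions, as this abstract describes, rests on sampling which interaction each photon undergoes in proportion to the per-process attenuation coefficients. A minimal sketch of that sampling step follows; the coefficient values are made-up placeholders, not brass data:

```python
import random

def sample_process(mu, rng):
    """Pick an interaction type with probability proportional to its
    attenuation coefficient (inverse-transform sampling on a discrete pmf)."""
    total = sum(mu.values())
    u = rng.random() * total
    acc = 0.0
    for name, m in mu.items():
        acc += m
        if u < acc:
            return name
    return name  # guard against floating-point round-off

# Illustrative coefficients (1/cm) for a ~1 MeV photon in a mid-Z material:
mu = {"rayleigh": 0.002, "photoelectric": 0.001, "compton": 0.060, "pair": 0.0}
rng = random.Random(1)
counts = {k: 0 for k in mu}
for _ in range(10000):
    counts[sample_process(mu, rng)] += 1
```

With these placeholder values Compton scattering accounts for roughly 95% of interactions, mirroring the paper's conclusion that the Compton effect dominates at intermediate Z and MV energies; pair production never fires because its coefficient is zero below the 1.022 MeV threshold.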
Energy Technology Data Exchange (ETDEWEB)
Manchado de Sola, F.; Vilches Pacheco, M.; Lallena Rojo, A. M.; Prezado, Y.
2013-07-01
Radiation therapy with minibeams, still in the testing phase, is a promising form of treatment. The irradiation is delivered as a set of parallel strips of radiation and shadow (peaks and valleys), each with a width of the order of microns. Using Monte Carlo simulation, we studied the effect of pulsed, heartbeat-induced motion of the brain on the peak-to-valley dose ratio in cranial minibeam radiotherapy, as a function of peak width and irradiation rate. (Author)
Iba, Yukito
2000-01-01
``Extended Ensemble Monte Carlo'' is a generic term that indicates a set of algorithms which are now popular in a variety of fields in physics and statistical information processing. Exchange Monte Carlo (Metropolis-Coupled Chain, Parallel Tempering), Simulated Tempering (Expanded Ensemble Monte Carlo), and Multicanonical Monte Carlo (Adaptive Umbrella Sampling) are typical members of this family. Here we give a cross-disciplinary survey of these algorithms with special emphasis on the great f...
Chetty, Indrin J.; Moran, Jean M.; Nurushev, Teamor S.; McShan, Daniel L.; Fraass, Benedick A.; Wilderman, Scott J.; Bielajew, Alex F.
2002-06-01
A comprehensive set of measurements and calculations has been conducted to investigate the accuracy of the Dose Planning Method (DPM) Monte Carlo code for electron beam dose calculations in heterogeneous media. Measurements were made using 10 MeV and 50 MeV minimally scattered, uncollimated electron beams from a racetrack microtron. Source distributions for the Monte Carlo calculations were reconstructed from in-air ion chamber scans and then benchmarked against measurements in a homogeneous water phantom. The in-air spatial distributions were found to have FWHM of 4.7 cm and 1.3 cm, at 100 cm from the source, for the 10 MeV and 50 MeV beams respectively. Energy spectra for the electron beams were determined by simulating the components of the microtron treatment head using the code MCNP4B. Profile measurements were made using an ion chamber in a water phantom with slabs of lung or bone-equivalent materials submerged at various depths. DPM calculations are, on average, within 2% agreement with measurement for all geometries except for the 50 MeV incident on a 6 cm lung-equivalent slab. Measurements using approximately monoenergetic, 50 MeV, 'pencil-beam'-type electrons in heterogeneous media provide conditions for maximum electronic disequilibrium and hence present a stringent test of the code's electron transport physics; the agreement noted between calculation and measurement illustrates that the DPM code is capable of accurate dose calculation even under such conditions.
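The "within 2% agreement" quoted here is typically a pointwise dose difference against a reference value. One plausible convention (an assumption for illustration, not the paper's exact definition, and the profile values below are synthetic) is normalization to the measured maximum:

```python
def percent_diff(calc, meas):
    """Pointwise percent difference of calculated vs measured dose,
    normalized to the measured maximum (local conventions vary)."""
    ref = max(meas)
    return [100.0 * (c - m) / ref for c, m in zip(calc, meas)]

# Synthetic three-point profile, illustration only:
diffs = percent_diff([0.99, 1.96, 3.05], [1.0, 2.0, 3.0])
within_2pct = all(abs(d) <= 2.0 for d in diffs)
```

Normalizing to the local measured value instead of the maximum makes the criterion far stricter in low-dose regions, which is why published comparisons usually state which convention they use.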
Commissioning of a medical accelerator photon beam Monte Carlo simulation using wide-field profiles
Pena, J.; Franco, L.; Gómez, F.; Iglesias, A.; Lobato, R.; Mosquera, J.; Pazos, A.; Pardo, J.; Pombar, M.; Rodríguez, A.; Sendón, J.
2004-11-01
A method for commissioning an EGSnrc Monte Carlo simulation of medical linac photon beams through wide-field lateral profiles at moderate depth in a water phantom is presented. Although depth-dose profiles are commonly used for nominal energy determination, our study shows that they are quite insensitive to energy changes below 0.3 MeV (0.6 MeV) for a 6 MV (15 MV) photon beam. Also, the depth-dose profile dependence on beam radius adds an additional uncertainty in their use for tuning nominal energy. Simulated 40 cm × 40 cm lateral profiles at 5 cm depth in a water phantom show greater sensitivity to both nominal energy and radius. Beam parameters could be determined by comparing only these curves with measured data.
Commissioning of a medical accelerator photon beam Monte Carlo simulation using wide-field profiles
Energy Technology Data Exchange (ETDEWEB)
Pena, J [Departamento de Física de Partículas, Facultade de Física, 15782 Santiago de Compostela (Spain)]; Franco, L [Departamento de Física de Partículas, Facultade de Física, 15782 Santiago de Compostela (Spain)]; Gómez, F [Departamento de Física de Partículas, Facultade de Física, 15782 Santiago de Compostela (Spain)]; Iglesias, A [Departamento de Física de Partículas, Facultade de Física, 15782 Santiago de Compostela (Spain)]; Lobato, R [Hospital Clínico Universitario de Santiago, Santiago de Compostela (Spain)]; Mosquera, J [Hospital Clínico Universitario de Santiago, Santiago de Compostela (Spain)]; Pazos, A [Departamento de Física de Partículas, Facultade de Física, 15782 Santiago de Compostela (Spain)]; Pardo, J [Departamento de Física de Partículas, Facultade de Física, 15782 Santiago de Compostela (Spain)]; Pombar, M [Hospital Clínico Universitario de Santiago, Santiago de Compostela (Spain)]; Rodríguez, A [Departamento de Física de Partículas, Facultade de Física, 15782 Santiago de Compostela (Spain)]; Sendón, J [Hospital Clínico Universitario de Santiago, Santiago de Compostela (Spain)]
2004-11-07
A method for commissioning an EGSnrc Monte Carlo simulation of medical linac photon beams through wide-field lateral profiles at moderate depth in a water phantom is presented. Although depth-dose profiles are commonly used for nominal energy determination, our study shows that they are quite insensitive to energy changes below 0.3 MeV (0.6 MeV) for a 6 MV (15 MV) photon beam. Also, the depth-dose profile dependence on beam radius adds an additional uncertainty in their use for tuning nominal energy. Simulated 40 cm × 40 cm lateral profiles at 5 cm depth in a water phantom show greater sensitivity to both nominal energy and radius. Beam parameters could be determined by comparing only these curves with measured data.
A Monte Carlo-based treatment-planning tool for ion beam therapy
Böhlen, T T; Dosanjh, M; Ferrari, A; Haberer, T; Parodi, K; Patera, V; Mairani, A
2013-01-01
Ion beam therapy, as an emerging radiation therapy modality, requires continuous efforts to develop and improve tools for patient treatment planning (TP) and research applications. Dose and fluence computation algorithms using the Monte Carlo (MC) technique have served for decades as reference tools for accurate dose computations for radiotherapy. In this work, a novel MC-based treatment-planning (MCTP) tool for ion beam therapy using the pencil beam scanning technique is presented. It allows single-field and simultaneous multiple-field optimization for realistic patient treatment conditions and for dosimetric quality assurance for irradiation conditions at state-of-the-art ion beam therapy facilities. It employs iterative procedures that allow for the optimization of absorbed dose and relative biological effectiveness (RBE)-weighted dose using radiobiological input tables generated by external RBE models. Using a re-implementation of the local effect model (LEM), the MCTP tool is able to perform TP studies u...
Monte Carlo Commissioning of Low Energy Electron Radiotherapy Beams using NXEGS Software
Directory of Open Access Journals (Sweden)
2004-06-01
This work is a report on the commissioning of low energy electron beams of a medical linear accelerator for Monte Carlo dose calculation using NXEGS software (NXEGS version 1.0.10.0, NX Medical Software, LLC). A unique feature of NXEGS is automated commissioning, a process whereby a combination of analytic and Monte Carlo methods generates beam models from dosimetric data collected in a water phantom. This study uses NXEGS to commission the 6, 9, and 12 MeV electron beams of a Varian Clinac 2100C using three applicators with standard inserts. Central axis depth-dose, primary axis and diagonal beam profiles, and output factors are the measurements necessary for commissioning of the code. We present a comparison of measured dose distributions with the distributions generated by NXEGS, using confidence limits on seven measures of error. We find that confidence limits are typically less than 3% or 3 mm, but increase with increasing source-to-surface distance (SSD) and depth at or beyond R50. We also investigate the dependence of NXEGS' performance on the size and composition of the data used to commission the program, finding a weak dependence on the number of dose profiles in the data set, but also that commissioning data need only be measured at two SSDs.
Figueroa, R G; Valente, M
2015-09-21
The main purpose of this work is to determine the feasibility and physical characteristics of a new teletherapy device based on a convergent x-ray beam, at energies like those used in radiotherapy, that delivers a highly concentrated dose to the target. We have named this approach Convergent Beam Radio Therapy (CBRT). Analytical methods are first developed to determine the dosimetric characteristics of an ideal convergent photon beam in a hypothetical water phantom. Then, using the PENELOPE Monte Carlo code, a similar convergent beam applied to the water phantom is compared with the analytical method. The CBRT device (Converay(®)) is designed to adapt to the head of LINACs. The converging photon beam is achieved through the perpendicular impact of LINAC electrons on a large, thin, spherical-cap target in which bremsstrahlung (high-energy x-rays) is generated. In this way, the electrons impact upon various points of the cap (the CBRT condition), aimed at the focal point. With the bremsstrahlung radiation directed forward, a system of movable collimators shapes the output into a virtually convergent beam. Further Monte Carlo simulations are performed under realistic conditions: a thin spherical-cap target with a radius of around 10-30 cm and a curvature radius of approximately 70-100 cm, and a cubic water phantom centred on the focal point of the cap. All interaction mechanisms of the bremsstrahlung radiation with the phantom are taken into account for different energies and cap thicknesses. In addition, the magnitudes of the electric and/or magnetic fields necessary to deflect clinical-use electron beams (0.1 to 20 MeV) are determined using the equations of electromagnetism with relativistic corrections. In this way the beam is manipulated and guided for its perpendicular impact
Characterizing a Proton Beam Scanning System for Monte Carlo Dose Calculation in Patients
Grassberger, C; Lomax, Tony; Paganetti, H
2015-01-01
The presented work has two goals. First, to demonstrate the feasibility of accurately characterizing a proton radiation field at treatment head exit for Monte Carlo dose calculation of active scanning patient treatments. Second, to show that this characterization can be done based on measured depth dose curves and spot size alone, without consideration of the exact treatment head delivery system. This is demonstrated through calibration of a Monte Carlo code to the specific beam lines of two institutions, Massachusetts General Hospital (MGH) and Paul Scherrer Institute (PSI). Comparison of simulations modeling the full treatment head at MGH to ones employing a parameterized phase space of protons at treatment head exit reveals the adequacy of the method for patient simulations. The secondary particle production in the treatment head is typically below 0.2% of primary fluence, except for low-energy electrons (protons), whose contribution to skin dose is negligible. However, there is a significant difference between the two methods in the low-dose penumbra, making full treatment head simulations necessary to study out-of-field effects such as secondary cancer induction. To calibrate the Monte Carlo code to measurements in a water phantom, we use an analytical Bragg peak model to extract the range-dependent energy spread at the two institutions, as this quantity is usually not available through measurements. Comparison of the measured with the simulated depth dose curves demonstrates agreement within 0.5 mm over the entire energy range. Subsequently, we simulate three patient treatments with varying anatomical complexity (liver, head and neck, and lung) to give an example of how this approach can be employed to investigate site-specific discrepancies between the treatment planning system and Monte Carlo simulations. PMID:25549079
Study of an extrapolation chamber in a standard diagnostic radiology beam by Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Vedovato, Uly Pita; Silva, Rayre Janaina Vieira; Neves, Lucio Pereira; Santos, William S.; Perini, Ana Paula, E-mail: anapaula.perini@ufu.br [Universidade Federal de Uberlândia (INFIS/UFU), MG (Brazil). Instituto de Física]; Caldas, Linda V.E. [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)]; Belinato, Walmir [Instituto Federal de Educação, Ciência e Tecnologia da Bahia (IFBA), Vitória da Conquista, BA (Brazil)]
2016-07-01
In this work, we studied the influence of the components of an extrapolation ionization chamber on its response. This study was undertaken using the MCNP-5 Monte Carlo code and the standard diagnostic radiology quality for direct beams (RQR5). Using tally F6 and 2.1 × 10^9 simulated histories, the results showed that the chamber design and materials do not significantly alter the energy deposited in its sensitive volume. The collecting electrode and support board were the components with the greatest influence on the chamber response. (author)
Lutsyshyn, Y.; Halley, J. W.
2011-01-01
We present the results of diffusion Monte Carlo calculations of the elastic transmission of a low-energy beam of helium atoms through a suspended slab of superfluid helium. These calculations represent a significant improvement on variational Monte Carlo methods which were previously used to study this problem. The results are consistent with the existence of a condensate-mediated transmission mechanism, which would result in very fast transmission of pulses through a slab.
The FLUKA code for application of Monte Carlo methods to promote high precision ion beam therapy
Parodi, K; Cerutti, F; Ferrari, A; Mairani, A; Paganetti, H; Sommerer, F
2010-01-01
Monte Carlo (MC) methods are increasingly being utilized to support several aspects of commissioning and clinical operation of ion beam therapy facilities. In this contribution two emerging areas of MC applications are outlined. The value of MC modeling to promote accurate treatment planning is addressed via examples of application of the FLUKA code to proton and carbon ion therapy at the Heidelberg Ion Beam Therapy Center in Heidelberg, Germany, and at the Proton Therapy Center of Massachusetts General Hospital (MGH), Boston, USA. These include generation of basic data for input into the treatment planning system (TPS) and validation of the TPS analytical pencil-beam dose computations. Moreover, we review the implementation of PET/CT (Positron Emission Tomography / Computed Tomography) imaging for in-vivo verification of proton therapy at MGH. Here, MC is used to calculate irradiation-induced positron-emitter production in tissue for comparison with the β+-activity measurement in order to infer indirect infor...
Validation of Monte Carlo calculated surface doses for megavoltage photon beams.
Abdel-Rahman, Wamied; Seuntjens, Jan P; Verhaegen, Frank; Deblois, François; Podgorsak, Ervin B
2005-01-01
Recent work has shown that there is significant uncertainty in measuring build-up doses in megavoltage photon beams, especially at high energies. In the present investigation we used a phantom-embedded extrapolation chamber (PEEC) made of Solid Water to validate Monte Carlo (MC)-calculated doses in the dose build-up region for 6 and 18 MV x-ray beams. The study showed that the percentage depth ionizations (PDIs) obtained from measurements are higher than the percentage depth doses (PDDs) obtained with Monte Carlo techniques. To validate the MC-calculated PDDs, the design of the PEEC was incorporated into the simulations. While the MC-calculated and measured PDIs in the dose build-up region agree with one another for the 6 MV beam, a non-negligible difference is observed for the 18 MV x-ray beam. A number of experiments and theoretical studies of various possible effects that could be the source of this discrepancy were performed. The contribution of contaminating neutrons and protons to the build-up dose region in the 18 MV x-ray beam is negligible. Moreover, the MC calculations using the XCOM photon cross-section database and the NIST bremsstrahlung differential cross section do not explain the discrepancy between the MC calculations and measurement in the dose build-up region for the 18 MV beam. A simple incorporation of triplet production events into the MC dose calculation increases the calculated doses in the build-up region but does not fully account for the discrepancy between measurement and calculations for the 18 MV x-ray beam.
Monte Carlo investigation into feasibility and dosimetry of flat Flattening Filter Free beams
Zavgorodni, Sergei
2013-01-01
Flattening filter free (FFF) beams, due to their non-uniformity, are sub-optimal for larger field sizes. The purpose of this study was to investigate the incident electron beam distributions that would produce flat beams without the use of a flattening filter. Monte Carlo (MC) simulations with the BEAMnrc and DOSXYZnrc codes have been performed to evaluate the feasibility of this approach. The dose distributions in water for open 6 MV beams were simulated using a Varian 21EX linac head model, which will be called the flattening filter (FF) model. The flattening filter was then removed from the FF model, and MC simulations were performed using (1) 6 MeV electrons incident on the target, and (2) a 6 MeV electron beam with electron angular distributions optimized to provide dose profiles as flat as possible. Configuration (1) represents an FFF beam while configuration (2) allowed producing a flat FFF (F4) beam. Optimizations have also been performed to produce the flattest profiles for a set of dose rates (DRs) in the range from 1.25 to ...
Modeling the Biophysical Effects in a Carbon Beam Delivery Line using Monte Carlo Simulation
Cho, Ilsung; Cho, Sungho; Kim, Eun Ho; Song, Yongkeun; Shin, Jae-ik; Jung, Won-Gyun
2016-01-01
Relative biological effectiveness (RBE) plays an important role in designing a uniform dose response for ion beam therapy. In this study the biological effectiveness of a carbon ion beam delivery system was investigated using Monte Carlo simulation. A carbon ion beam delivery line was designed for the Korea Heavy Ion Medical Accelerator (KHIMA) project. The GEANT4 simulation tool kit was used to simulate carbon beam transport into media. Incident carbon ion beam energies in the range between 220 MeV/u and 290 MeV/u were chosen to generate secondary particles. The microdosimetric-kinetic (MK) model was applied to describe the RBE of 10% survival in human salivary gland (HSG) cells. The RBE-weighted dose was estimated as a function of the penetration depth in the water phantom along the incident beam direction. A biologically photon-equivalent Spread Out Bragg Peak (SOBP) was designed using the RBE-weighted absorbed dose. Finally, the RBE of mixed beams was predicted as a function of the water phantom depth.
Modeling the biophysical effects in a carbon beam delivery line by using Monte Carlo simulations
Cho, Ilsung; Yoo, SeungHoon; Cho, Sungho; Kim, Eun Ho; Song, Yongkeun; Shin, Jae-ik; Jung, Won-Gyun
2016-09-01
The relative biological effectiveness (RBE) plays an important role in designing a uniform dose response for ion-beam therapy. In this study, the biological effectiveness of a carbon-ion beam delivery system was investigated using Monte Carlo simulations. A carbon-ion beam delivery line was designed for the Korea Heavy Ion Medical Accelerator (KHIMA) project. The GEANT4 simulation tool kit was used to simulate carbon-ion beam transport into media. An incident carbon-ion beam with energy in the range between 220 MeV/u and 290 MeV/u was chosen to generate secondary particles. The microdosimetric-kinetic (MK) model was applied to describe the RBE of 10% survival in human salivary-gland (HSG) cells. The RBE-weighted dose was estimated as a function of the penetration depth in the water phantom along the incident beam's direction. A biologically photon-equivalent Spread Out Bragg Peak (SOBP) was designed using the RBE-weighted absorbed dose. Finally, the RBE of mixed beams was predicted as a function of the depth in the water phantom.
Energy Technology Data Exchange (ETDEWEB)
Chetty, Indrin J. [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States)]. E-mail: indrin@med.umich.edu; Moran, Jean M.; Nurushev, Teamor S.; McShan, Daniel L.; Fraass, Benedick A. [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States); Wilderman, Scott J.; Bielajew, Alex F. [Department of Nuclear Engineering, University of Michigan, Ann Arbor, MI (United States)
2002-06-07
A comprehensive set of measurements and calculations has been conducted to investigate the accuracy of the Dose Planning Method (DPM) Monte Carlo code for electron beam dose calculations in heterogeneous media. Measurements were made using 10 MeV and 50 MeV minimally scattered, uncollimated electron beams from a racetrack microtron. Source distributions for the Monte Carlo calculations were reconstructed from in-air ion chamber scans and then benchmarked against measurements in a homogeneous water phantom. The in-air spatial distributions were found to have FWHM of 4.7 cm and 1.3 cm, at 100 cm from the source, for the 10 MeV and 50 MeV beams, respectively. Energy spectra for the electron beams were determined by simulating the components of the microtron treatment head using the code MCNP4B. Profile measurements were made using an ion chamber in a water phantom with slabs of lung or bone-equivalent materials submerged at various depths. DPM calculations are, on average, within 2% agreement with measurement for all geometries except for the 50 MeV beam incident on a 6 cm lung-equivalent slab. Measurements using approximately monoenergetic, 50 MeV, 'pencil-beam'-type electrons in heterogeneous media provide conditions for maximum electronic disequilibrium and hence present a stringent test of the code's electron transport physics; the agreement noted between calculation and measurement illustrates that the DPM code is capable of accurate dose calculation even under such conditions. (author)
Neutron contamination of Varian Clinac iX 10 MV photon beam using Monte Carlo simulation
Yani, S.; Tursinah, R.; Rhani, M. F.; Soh, R. C. X.; Haryanto, F.; Arif, I.
2016-03-01
High-energy medical accelerators are commonly used in radiotherapy to increase the effectiveness of treatments. Neutrons can be emitted from a medical accelerator when X-rays strike any of its materials, an issue that has drawn the attention of many researchers. Neutron contamination causes problems such as degraded image resolution and radiation-protection concerns for patients and radiation oncologists. This study concerns the simulation of the neutron contamination emitted from a Varian Clinac iX 10 MV using a Monte Carlo code system. As the neutron production process is very complex, Monte Carlo simulation with the MCNPX code system was carried out to study this contamination. The medical accelerator was modelled based on its actual materials and geometry. The maximum energies of photons and neutrons in the scoring plane were 10.5 and 2.239 MeV, respectively. The number and energy of the particles produced depend on the depth and the distance from the beam axis. These results point out that the neutron contamination produced by a 10 MV linac photon beam in a typical treatment is not negligible.
Directory of Open Access Journals (Sweden)
Seif F
2015-03-01
Background: Megavoltage beams used in radiotherapy are contaminated with secondary electrons. Different parts of the linac head and the air above the patient act as sources of this contamination, which can increase damage to the skin and subcutaneous tissue during radiotherapy. Monte Carlo simulation is an accurate method for dose calculation in medical dosimetry and has an important role in the optimization of linac head materials. The aim of this study was to calculate the electron contamination of a Varian linac. Materials and Method: The 6 MV photon beam of a Varian (2100 C/D) linac was simulated with the Monte Carlo code MCNPX, based on the manufacturer's specifications. The validation was done by comparing the calculated depth doses and profiles with dosimetry measurements in a water phantom (error less than 2%). Percentage Depth Doses (PDDs), profiles and the contamination-electron energy spectrum were calculated for different therapeutic field sizes (5×5 to 40×40 cm2). Results: The electron-contamination dose was observed to rise with increasing field size. The contribution of the secondary contamination electrons to the surface dose ranged from 6% for a 5×5 cm2 field to 27% for a 40×40 cm2 field. Conclusion: Based on these results, the effect of electron contamination on the patient surface dose cannot be ignored, so knowledge of the electron contamination is important in clinical dosimetry. It must be calculated for each machine and considered in Treatment Planning Systems.
Energy Technology Data Exchange (ETDEWEB)
Brown, F.B.; Sutton, T.M.
1996-02-01
This report is composed of the lecture notes from the first half of a 32-hour graduate-level course on Monte Carlo methods offered at KAPL. These notes, prepared by two of the principal developers of KAPL's RACER Monte Carlo code, cover the fundamental theory, concepts, and practices for Monte Carlo analysis. In particular, a thorough grounding in the basic fundamentals of Monte Carlo methods is presented, including random number generation, random sampling, the Monte Carlo approach to solving transport problems, computational geometry, collision physics, tallies, and eigenvalue calculations. Furthermore, modern computational algorithms for vector and parallel approaches to Monte Carlo calculations are covered in detail, including fundamental parallel and vector concepts, the event-based algorithm, master/slave schemes, parallel scaling laws, and portability issues.
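The random-sampling and tally machinery such notes cover can be shown in a few lines. The sketch below (my own illustration, not from the lecture notes; the attenuation coefficient and slab thickness are arbitrary) estimates uncollided photon transmission through a slab by inverse-transform sampling of exponential free paths and tallying first flights that exceed the slab thickness.

```python
import math
import random

def transmission_mc(mu, thickness, n_histories, seed=1):
    """Analog Monte Carlo estimate of uncollided transmission through a slab.

    Free path lengths are sampled from the exponential distribution via
    inverse-transform sampling, s = -ln(U)/mu. A history scores 1 if its
    first flight exceeds the slab thickness; the tally's expectation is
    exp(-mu * thickness), the narrow-beam attenuation law.
    """
    rng = random.Random(seed)
    tally = 0
    for _ in range(n_histories):
        s = -math.log(rng.random() + 1e-12) / mu  # sampled free path [cm]
        if s > thickness:
            tally += 1
    return tally / n_histories

mu = 0.2  # total attenuation coefficient, 1/cm (illustrative value)
t = 5.0   # slab thickness, cm
est = transmission_mc(mu, t, 200000)
print(abs(est - math.exp(-mu * t)) < 0.01)  # estimate near exp(-1) ~ 0.368
```

With 200 000 histories the statistical uncertainty of this binomial tally is about 0.001, so the estimate lands well within 0.01 of the analytic answer.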
Bardenet, R.
2012-01-01
ISBN:978-2-7598-1032-1; Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow one to compute these integrals numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which rejection sampling, importance sampling and Markov chain Monte Carlo (MCMC) methods. We give intuition on the theoretic...
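Of the algorithms listed, rejection sampling is the simplest to sketch. The toy example below is my own illustration, not code from the review: it draws from a Gaussian truncated to [-1, 1] using a uniform proposal and the standard accept/reject test.

```python
import math
import random

def rejection_sample(n, seed=2):
    """Rejection sampling sketch: draw from p(x) proportional to exp(-x^2/2)
    on [-1, 1] using a uniform proposal q(x) = 1/2.

    With envelope constant M = 2, M*q(x) = 1 >= exp(-x^2/2) everywhere, so
    accepting when u < exp(-x^2/2) yields exact samples from the target.
    """
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.uniform(-1.0, 1.0)  # proposal draw
        u = rng.random()
        if u < math.exp(-x * x / 2.0):  # accept with prob p~(x) / (M q(x))
            out.append(x)
    return out

xs = rejection_sample(100000)
mean = sum(xs) / len(xs)
print(abs(mean) < 0.02)  # target is symmetric about 0
```

The acceptance rate here is the ratio of the area under the unnormalized target to the area under the envelope; importance sampling and MCMC trade this wasted work for weighted or correlated samples.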
A Monte Carlo approach for simulating the propagation of partially coherent x-ray beams
DEFF Research Database (Denmark)
Prodi, A.; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær
2011-01-01
by sampling Huygens-Fresnel waves with Monte Carlo methods and is used to propagate each source realization to the detector plane. The sampling is implemented with a modified Monte Carlo ray tracing scheme where the optical path of each generated ray is stored. Such information is then used in the summation...
Simulation of Cone Beam CT System Based on Monte Carlo Method
Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing
2014-01-01
Adaptive Radiation Therapy (ART) was developed based on Image-guided Radiation Therapy (IGRT) and is the trend of photon radiation therapy. To make better use of Cone Beam CT (CBCT) images for ART, a CBCT system model was established based on a Monte Carlo program and validated against measurement. The BEAMnrc program was used to model the kV x-ray tube. Both ISOURCE-13 and ISOURCE-24 were chosen to simulate the path of beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles under 1 cm of water were compared with the dose calculated by the DOSXYZnrc program. The calculated PDD agreed to better than 1% within a depth of 10 cm, and more than 85% of the points of the calculated lateral dose profiles were within 2%. A correct CBCT system model helps to improve CBCT image quality for dose verification in ART and to assess the concomitant dose risk of CBCT imaging.
Dunn, William L
2012-01-01
Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous Buffon's needle problem...
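The Buffon's needle problem mentioned in that abstract is itself a compact Monte Carlo exercise: for a needle of length L dropped on lines spaced d apart (L ≤ d), the crossing probability is 2L/(πd), so counting crossings estimates π. The sketch below is my own illustration with illustrative lengths, not code from the book.

```python
import math
import random

def buffon_pi(n_throws, needle=1.0, spacing=1.0, seed=3):
    """Estimate pi via Buffon's needle: P(cross) = 2L / (pi * d) for L <= d."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_throws):
        y = rng.uniform(0.0, spacing / 2.0)      # centre distance to nearest line
        theta = rng.uniform(0.0, math.pi / 2.0)  # needle angle to the lines
        if y <= (needle / 2.0) * math.sin(theta):
            hits += 1
    # Invert P(cross) ~ hits/n to solve for pi.
    return 2.0 * needle * n_throws / (spacing * hits)

est = buffon_pi(1000000)
print(abs(est - math.pi) < 0.05)
```

Note the irony often pointed out with this example: the simulation uses π to draw the random angle, so it is a pedagogical device rather than a practical way to compute π.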
A geometrical model for the Monte Carlo simulation of the TrueBeam linac
Rodriguez, Miguel; Fogliata, Antonella; Cozzi, Luca; Sauerwein, Wolfgang; Brualla, Lorenzo
2015-01-01
Monte Carlo (MC) simulation of linacs depends on an accurate geometrical description of the head. The geometry of the Varian TrueBeam linac is not available to researchers. Instead, the company distributes phase-space files (PSFs) of the flattening-filter-free (FFF) beams tallied upstream of the jaws. Yet, MC simulations based on third-party tallied PSFs are subject to limitations. We present an experimentally based geometry developed for the simulation of the FFF beams of the TrueBeam linac. The upper part of the TrueBeam linac was modeled by modifying the Clinac 2100 geometry. The most important modification is the replacement of the standard flattening filters by ad hoc thin filters, which were modeled by comparing dose measurements and simulations. The experimental dose profiles for the 6 MV and 10 MV FFF beams were obtained from the Varian Golden Data Set and from in-house measurements for radiation fields ranging from 3×3 to 40×40 cm2. The same comparisons were done for dose profiles ob...
Monte Carlo Simulation of Electron Beams for Radiotherapy - EGS4, MCNP4b and GEANT3 Intercomparison
Trindade, A; Alves, C M; Chaves, A; Lopes, C; Oliveira, C; Peralta, L
2000-01-01
In medical radiation physics, an increasing number of Monte Carlo codes are being used, which requires intercomparisons between them to evaluate the accuracy of the simulated results against benchmark experiments. The Monte Carlo code EGS4, commonly used to simulate electron beams from medical linear accelerators, was compared with GEANT3 and MCNP4b. Intercomparisons of electron energy spectra and of angular and spatial distributions were carried out for the Siemens KD2 linear accelerator, at beam energies of 10 and 15 MeV for a field size of 10×10 cm2. Indirect validation was performed against electron depth-dose curves and beam profiles measured in a MP3-PTW water phantom using a Markus planar chamber. Monte Carlo isodose lines were reconstructed and compared to those from commercial treatment planning systems (TPSs) and with experimental data.
Monte Carlo Simulation of Electron Beams for Radiotherapy - EGS4, MCNP4b and GEANT3 Intercomparison
Trindade, A.; Rodrigues, P.; Alves, C.; Chaves, A.; Lopes, M. C.; Oliveira, C.; Peralta, L.
In medical radiation physics, an increasing number of Monte Carlo codes are being used, which requires intercomparisons between them to evaluate the accuracy of the simulated results against benchmark experiments. The Monte Carlo code EGS4, commonly used to simulate electron beams from medical linear accelerators, was compared with GEANT3 and MCNP4b. Intercomparisons of electron energy spectra and of angular and spatial distributions were carried out for the Siemens KD2 linear accelerator, at beam energies of 10 and 15 MeV for a field size of 10×10 cm2. Indirect validation was performed against electron depth-dose curves and beam profiles measured in a MP3-PTW water phantom using a Markus planar chamber. Monte Carlo isodose lines were reconstructed and compared to those from commercial treatment planning systems (TPSs) and with experimental data.
Energy Technology Data Exchange (ETDEWEB)
Manchado, F.; Vilches, M.; Guiraldo, D.; Lallena, A. M.
2011-07-01
In this paper we have studied, using Monte Carlo simulation, the properties of such beams: their degradation with the depth traversed, the influence of target motion during irradiation, how to reduce the absorbed dose between bands, and how to reduce simulation times.
Proton microbeam radiotherapy with scanned pencil-beams--Monte Carlo simulations.
Kłodowska, M; Olko, P; Waligórski, M P R
2015-09-01
Irradiation, delivered by a synchrotron facility, using a set of highly collimated, narrow and parallel photon beams spaced by 1 mm or less has been termed Microbeam Radiation Therapy (MRT). The tolerance of healthy tissue after MRT was found to be better than after standard broad X-ray beams, together with a more pronounced response of malignant tissue. The microbeam spacing and the transverse peak-to-valley dose ratio (PVDR) are considered to be the relevant biological MRT parameters. We investigated the MRT concept for proton microbeams, where we expected different depth-dose profiles and PVDR dependences, resulting in skin sparing and homogeneous dose distributions at larger beam depths, due to differences between the interactions of proton and photon beams in tissue. Using the FLUKA Monte Carlo code, we simulated PVDR distributions for differently spaced 0.1 mm (sigma) pencil beams with entrance energies of 60, 80, 100 and 120 MeV irradiating a cylindrical water phantom, with and without a bone layer, representing a human head. We calculated PVDR distributions and evaluated the uniformity of target irradiation at the distal ranges of the 60-120 MeV microbeams. We also calculated PVDR distributions for a 60 MeV spread-out Bragg peak microbeam configuration. Application of optimised proton MRT in terms of spot size, pencil-beam distribution, entrance beam energy and multiport irradiation, combined with relevant radiobiological investigations, could pave the way for hypofractionation scenarios where tissue sparing at the entrance, better malignant tissue response and better dose conformity of target volume irradiation could be achieved, compared with present proton beam radiotherapy configurations.
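The PVDR figure of merit is simply the ratio of the dose on a beam axis (peak) to the dose midway between neighbouring beams (valley). A toy computation illustrates the geometry dependence discussed above; the Gaussian single-beam profile and the beam count here are my own hypothetical stand-ins, not the FLUKA-computed lateral profiles.

```python
import math

def pvdr(profile_fn, spacing, n_beams, x_peak=0.0):
    """Peak-to-valley dose ratio for an array of identical pencil beams.

    The lateral dose is modelled as the sum of single-beam profiles centred
    at integer multiples of `spacing`; the valley point lies midway between
    the central beam and its neighbour.
    """
    centres = [(i - n_beams // 2) * spacing for i in range(n_beams)]
    dose = lambda x: sum(profile_fn(x - c) for c in centres)
    peak = dose(x_peak)                  # on the central beam axis
    valley = dose(x_peak + spacing / 2)  # midway between neighbouring beams
    return peak / valley

# Hypothetical Gaussian pencil-beam lateral profile, sigma = 0.1 mm
# (matching the spot sigma quoted in the abstract).
sigma = 0.1  # mm
gauss = lambda x: math.exp(-x * x / (2 * sigma * sigma))
for spacing in (0.5, 1.0):  # centre-to-centre spacing, mm
    print(spacing, pvdr(gauss, spacing, n_beams=21))
```

Tighter spacing raises the valley dose far faster than the peak, so the PVDR drops sharply as the spacing shrinks toward a few sigma; in tissue, scattering broadens the effective profile with depth and has the same qualitative effect.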
Doucet, R.; Olivares, M.; DeBlois, F.; Podgorsak, E. B.; Kawrakow, I.; Seuntjens, J.
2003-08-01
Calculations of dose distributions in heterogeneous phantoms in clinical electron beams, carried out using the fast voxel Monte Carlo (MC) system XVMC and the conventional MC code EGSnrc, were compared with measurements. Irradiations were performed using the 9 MeV and 15 MeV beams from a Varian Clinac-18 accelerator with a 10 × 10 cm2 applicator and an SSD of 100 cm. Depth doses were measured with thermoluminescent dosimetry techniques (TLD 700) in phantoms consisting of slabs of Solid WaterTM (SW) and bone and slabs of SW and lung tissue-equivalent materials. Lateral profiles in water were measured using an electron diode at different depths behind one and two immersed aluminium rods. The accelerator was modelled using the EGS4/BEAM system and optimized phase-space files were used as input to the EGSnrc and the XVMC calculations. Also, for the XVMC, an experiment-based beam model was used. All measurements were corrected by the EGSnrc-calculated stopping power ratios. Overall, there is excellent agreement between the corrected experimental and the two MC dose distributions. Small remaining discrepancies may be due to the non-equivalence between physical and simulated tissue-equivalent materials and to detector fluence perturbation effect correction factors that were calculated for the 9 MeV beam at selected depths in the heterogeneous phantoms.
A comparison of neon versus helium ion beam induced deposition via Monte Carlo simulations.
Timilsina, Rajendra; Smith, Daryl A; Rack, Philip D
2013-03-22
The ion beam induced nanoscale synthesis of PtCx (where x ∼ 5) using the trimethyl(methylcyclopentadienyl)platinum(IV) (MeCpPt(IV)Me3) precursor is investigated by performing Monte Carlo simulations of helium and neon ions. The helium beam leads to more lateral growth relative to the neon beam because of its larger interaction volume. The lateral growth of the nanopillars is dominated by molecules deposited via secondary electrons in both simulations. Notably, the helium pillars are dominated by SE-I electrons whereas the neon pillars are dominated by SE-II electrons. Using a low precursor residence time of 70 μs, resulting in an equilibrium coverage of ∼4%, the neon simulation has a lower deposition efficiency (3.5%) than the helium simulation (6.5%). At a larger residence time (10 ms), and consequently a larger equilibrium coverage (85%), the deposition efficiencies of helium and neon increased to 49% and 21%, respectively, dominated by increased lateral growth rates leading to broader pillars. The nanoscale growth is further studied by varying the ion beam diameter at 10 ms precursor residence time. The study shows that the total SE yield decreases with increasing beam diameter for both ion types. However, helium has the larger SE yield compared to neon at both the low and the high precursor residence time, and thus pillars are wider in all the simulations studied.
Validation of Monte-Carlo simulations with measurements at the ICON beam-line at SINQ
Energy Technology Data Exchange (ETDEWEB)
Giller, L. [LRS, Physics Department, Ecole Polytechnique Federal de Lausanne, CH-1015 Lausanne (Switzerland); Filges, U. [LDM, NUM Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland)], E-mail: uwe.filges@psi.ch; Kuehne, G. [ASQ, NUM Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Wohlmuther, M. [ABE, GFA Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Zanini, L. [ASQ, NUM Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland)
2008-02-11
ICON is the new cold neutron imaging facility at the neutron spallation source SINQ. The ICON facility is placed at beam-line S52 with a direct view of the cold liquid D2 moderator. The beam-line includes a 4.4 m long collimation section followed by an 11 m long flight path to the imaging system. The essential part of the collimation section is composed of six revolving drums and a variable aperture wheel. Depending on the investigated object, different apertures are used. Measurements have shown that each setup has a different spatial neutron flux distribution and specific beam profiles. Measured beam profiles have been used to validate results of simulations coupling the Monte-Carlo program MCNPX with the neutron ray-tracing program McStas. In a first step, MCNPX was used to calculate neutron spectra close to the SINQ target, at the entrance of the collimation section. These results served as input for McStas, where the beam-line itself was simulated. In the present paper, experimental and theoretical results are compared and discussed.
Monte Carlo model of the Studsvik BNCT clinical beam: description and validation.
Giusti, Valerio; Munck af Rosenschöld, Per M; Sköld, Kurt; Montagnini, Bruno; Capala, Jacek
2003-12-01
The neutron beam at the Studsvik facility for boron neutron capture therapy (BNCT) and the validation of the related computational model developed for the MCNP-4B Monte Carlo code are presented. Several measurements performed at the epithermal neutron port used for clinical trials have been made in order to validate the Monte Carlo computational model. The good general agreement between the MCNP calculations and the experimental results has provided an adequate check of the calculation procedure. In particular, at the nominal reactor power of 1 MW, the calculated in-air epithermal neutron flux in the energy interval 0.4 eV-10 keV is 3.24 x 10(9) n cm(-2) s(-1) (+/- 1.2% 1 std. dev.) while the measured value is 3.30 x 10(9) n cm(-2) s(-1) (+/- 5.0% 1 std. dev.). Furthermore, the calculated in-phantom thermal neutron flux, equal to 6.43 x 10(9) n cm(-2) s(-1) (+/- 1.0% 1 std. dev.), and the corresponding measured value of 6.33 x 10(9) n cm(-2) s(-1) (+/- 5.3% 1 std. dev.) agree within their respective uncertainties. The only statistically significant disagreement is a discrepancy of 39% between the MCNP calculations of the in-air photon kerma and the corresponding experimental value. Despite this, a quite acceptable overall in-phantom beam performance was obtained, with a maximum value of the therapeutic ratio (the ratio between the local tumor dose and the maximum healthy tissue dose) equal to 6.7. The described MCNP model of the Studsvik facility has been deemed adequate to evaluate further improvements in the beam design as well as to plan experimental work.
SU-E-T-578: MCEBRT, A Monte Carlo Code for External Beam Treatment Plan Verifications
Energy Technology Data Exchange (ETDEWEB)
Chibani, O; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States); Eldib, A [Fox Chase Cancer Center, Philadelphia, PA (United States); Al-Azhar University, Cairo (Egypt)
2014-06-01
Purpose: To present a new Monte Carlo code (MCEBRT) for patient-specific dose calculations in external beam radiotherapy. The code's MLC model is benchmarked and real patient plans are re-calculated using MCEBRT and compared with a commercial TPS. Methods: MCEBRT is based on the GEPTS system (Med. Phys. 29 (2002) 835–846). Phase space data generated for Varian linac photon beams (6-15 MV) are used as the source term. MCEBRT uses a realistic MLC model (tongue and groove, rounded leaf ends). Patient CT and DICOM RT files are used to generate a 3D patient phantom and simulate the treatment configuration (gantry, collimator and couch angles; jaw positions; MLC sequences; MUs). MCEBRT dose distributions and DVHs are compared with those from the TPS in absolute terms (Gy). Results: Calculations based on the developed MLC model closely match transmission measurements (pin-point ionization chamber at selected positions and film for lateral dose profiles). See Fig. 1. Dose calculations for two clinical cases (whole brain irradiation with opposed beams and a lung case with eight fields) are carried out and outcomes are compared with the Eclipse AAA algorithm. Good agreement is observed for the brain case (Figs 2-3) except at the surface, where the MCEBRT dose can be higher by 20%. This is due to better modeling of electron contamination by MCEBRT. For the lung case an overall good agreement (91% gamma index passing rate with 3%/3 mm DTA criterion) is observed (Fig. 4), but dose in lung can be over-estimated by up to 10% by AAA (Fig. 5). CTV and PTV DVHs from the TPS and MCEBRT are nevertheless close (Fig. 6). Conclusion: A new Monte Carlo code is developed for plan verification. Contrary to phantom-based QA measurements, MCEBRT simulates the exact patient geometry and tissue composition. MCEBRT can be used as an extra verification layer for plans where surface dose and tissue heterogeneity are an issue.
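The 3%/3 mm gamma passing rate quoted for the lung case can be illustrated with a toy one-dimensional global gamma analysis; clinical tools evaluate full 3D dose grids, and both profiles below are invented for illustration.

```python
import numpy as np

def gamma_1d(x, ref, ev, dose_tol=0.03, dist_tol_mm=3.0):
    """Global 1-D gamma: for each reference point, minimize the combined
    dose-difference / distance-to-agreement metric over the evaluated curve."""
    norm = dose_tol * ref.max()              # global dose normalization
    gam = np.empty_like(ref)
    for i, (xi, di) in enumerate(zip(x, ref)):
        dd = (ev - di) / norm                # dose-difference term
        dx = (x - xi) / dist_tol_mm          # distance term
        gam[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gam

x = np.linspace(0.0, 100.0, 201)                    # position, mm
ref = np.exp(-(((x - 50.0) / 20.0) ** 2))           # "measured" profile
ev = 1.02 * np.exp(-(((x - 51.0) / 20.0) ** 2))     # 2% scaled, 1 mm shifted
gam = gamma_1d(x, ref, ev)
pass_rate = np.mean(gam <= 1.0)
print(f"gamma pass rate: {100 * pass_rate:.1f}%")
```

A point passes when some nearby evaluated point agrees within the combined dose and distance tolerance, which is why a small shift plus a small scaling still yields a high pass rate.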
Monte Carlo based verification of a beam model used in a treatment planning system
Wieslander, E.; Knöös, T.
2008-02-01
Modern treatment planning systems (TPSs) usually separate dose modelling into a beam modelling phase, describing the beam exiting the accelerator, followed by a subsequent dose calculation in the patient. The aim of this work is to use the Monte Carlo code system EGSnrc to study the modelling of head scatter as well as the transmission through the multi-leaf collimator (MLC) and diaphragms in the beam model used in a commercial TPS (MasterPlan, Nucletron B.V.). An Elekta Precise linear accelerator equipped with an MLC has been modelled in BEAMnrc, based on information available from the vendor regarding the material and geometry of the treatment head. The collimation in the MLC direction consists of leaves complemented by a backup diaphragm. The characteristics of the electron beam, i.e., energy and spot size, impinging on the target have been tuned to match measured data. Phase spaces from simulations of the treatment head are used to extract the scatter from, e.g., the flattening filter and the collimating structures. Similar data for the source models used in the TPS are extracted from the treatment planning system, so a comprehensive analysis is possible. Simulations in a water phantom, with DOSXYZnrc, are also used to study the modelling of the MLC and the diaphragms by the TPS. The results from this study will be helpful for understanding the limitations of the model in the TPS and provide knowledge for further improvements of the TPS source modelling.
ALICE EMCal Reconstructable Energy Non-Linearity From Test Beam Monte Carlo
Carter, Thomas Michael
2017-01-01
Calorimeters play many important roles in modern high energy physics detectors, such as event selection, triggering, and precision energy measurements. The EMCal, in the case of the ALICE experiment, provides triggering on high energy jets, improves jet quenching study measurement bias and jet energy resolution, and improves electron and photon measurements [3]. With the EMCal detector taking on so many important roles in the ALICE experiment, it is important to fully understand, characterize and model its interactions with particles. In 2010, SPS and PS electron test beam measurements were performed on an EMCal mini-module [2]. Alongside this, the test beam setup and geometry were recreated in Geant4 by Nico [1]. Figure 1 shows the reconstructable energy linearity for the SPS test beam data and that obtained from the test beam Monte Carlo, indicating the amount of energy deposited as hits in the EMCal module. It can be seen that for energies above ∼100 GeV there is a significant drop in the reconstructable energy m...
Gomà, Carles; Andreo, Pedro; Sempau, Josep
2016-03-01
This work calculates beam quality correction factors (k_Q) in monoenergetic proton beams using detailed Monte Carlo simulation of ionization chambers. It uses the Monte Carlo code PENH and the electronic stopping powers resulting from the adoption of two different sets of mean excitation energy values for water and graphite: (i) the currently recommended ICRU 37 and ICRU 49 values, I_w = 75 eV and I_g = 78 eV, and (ii) the recently proposed I_w = 78 eV and I_g = 81.1 eV. Twelve different ionization chambers were studied. The k_Q factors calculated using the two different sets of I-values were found to agree with each other within 1.6% or better. k_Q factors calculated using the current ICRU I-values were found to agree within 2.3% or better with the k_Q factors tabulated in IAEA TRS-398, and within 1% or better with experimental values published in the literature. k_Q factors calculated using the new I-values were also found to agree within 1.1% or better with the experimental values. This work concludes that perturbation correction factors in proton beams, currently assumed to be equal to unity, are in fact significantly different from unity for some of the ionization chambers studied.
Monte Carlo simulation of MLC-shaped TrueBeam electron fields benchmarked against measurement
Lloyd, Samantha AM; Zavgorodni, Sergei
2014-01-01
Modulated electron radiotherapy (MERT) and combined modulated photon/electron radiotherapy (MPERT) have received increased research attention, having shown the capacity for reduced low-dose exposure to healthy tissue and comparable, if not improved, target coverage for a number of treatment sites. Accurate dose calculation tools are necessary for clinical treatment planning, and Monte Carlo (MC) is the gold standard for electron field simulation. With many clinics replacing older accelerators, MC source models of the new machines are needed for continued development; however, Varian has kept the internal schematics of the TrueBeam confidential and electron phase-space sources have not been made available. TrueBeam electron fields are not substantially different from those generated by the Clinac 21EX, so we have modified the internal schematics of the Clinac 21EX to simulate TrueBeam electrons. BEAMnrc/DOSXYZnrc were used to simulate 5 × 5 and 20 × 20 cm2 electron fields with MLC-shaped apertures. Secondary collimati...
Monte Carlo simulation of electron beams from an accelerator head using PENELOPE
Sempau, J.; Sánchez-Reyes, A.; Salvat, F.; Oulad ben Tahar, H.; Jiang, S. B.; Fernández-Varea, J. M.
2001-04-01
The Monte Carlo code PENELOPE has been used to simulate electron beams from a Siemens Mevatron KDS linac with nominal energies of 6, 12 and 18 MeV. Owing to its accuracy, which stems from that of the underlying physical interaction models, PENELOPE is suitable for simulating problems of interest to the medical physics community. It includes a geometry package that allows the definition of complex quadric geometries, such as those of irradiation instruments, in a straightforward manner. Dose distributions in water simulated with PENELOPE agree well with experimental measurements using a silicon detector and a monitoring ionization chamber. Insertion of a lead slab in the incident beam at the surface of the water phantom produces sharp variations in the dose distributions, which are correctly reproduced by the simulation code. Results from PENELOPE are also compared with those of equivalent simulations with the EGS4-based user codes BEAM and DOSXYZ. Angular and energy distributions of electrons and photons in the phase-space plane (at the downstream end of the applicator) obtained from both simulation codes are similar, although significant differences do appear in some cases. These differences, however, are shown to have a negligible effect on the calculated dose distributions. Various practical aspects of the simulations, such as the calculation of statistical uncertainties and the effect of the 'latent' variance in the phase-space file, are discussed in detail.
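On the practical side, the statistical uncertainties the abstract discusses are commonly estimated with the batch method; a hedged sketch with invented per-history scores (not PENELOPE output):

```python
import numpy as np

rng = np.random.default_rng(3)

def batch_uncertainty(scores, n_batches=10):
    """Group per-history scores into batches; the spread of the batch
    means gives the standard error of the overall mean."""
    batches = np.array_split(scores, n_batches)
    means = np.array([b.mean() for b in batches])
    return means.mean(), means.std(ddof=1) / np.sqrt(n_batches)

scores = rng.exponential(1.0, 100_000)   # stand-in per-history deposits
dose_mean, dose_err = batch_uncertainty(scores)
print(f"score = {dose_mean:.4f} +/- {dose_err:.4f}")
```

The batch method is attractive because it needs no per-history bookkeeping beyond the batch sums, though correlations between histories (as with a shared phase-space file and its 'latent' variance) can bias the estimate.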
Energy Technology Data Exchange (ETDEWEB)
Cramer, S.N.
1984-01-01
The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.
Clement, S D; Choi, J R; Zamenhof, R G; Yanch, J C; Harling, O K
1990-01-01
Monte Carlo methods of coupled neutron/photon transport are being used in the design of filtered beams for Neutron Capture Therapy (NCT). This method of beam analysis provides segregation of each individual dose component, and thereby facilitates beam optimization. The Monte Carlo method is discussed in some detail in relation to NCT epithermal beam design. Ideal neutron beams (i.e., plane-wave monoenergetic neutron beams with no primary gamma-ray contamination) have been modeled both for comparison and to establish target conditions for a practical NCT epithermal beam design. Detailed models of the 5 MWt Massachusetts Institute of Technology Research Reactor (MITR-II) together with a polyethylene head phantom have been used to characterize approximately 100 beam filter and moderator configurations. Using the Monte Carlo methodology of beam design, and benchmarking/calibrating our computations against measurements, has resulted in an epithermal beam design which is useful for therapy of deep-seated brain tumors. This beam is predicted to be capable of delivering a dose of 2000 RBE-cGy (cJ/kg) to a therapeutic advantage depth of 5.7 cm in polyethylene, assuming 30 micrograms/g 10B in tumor with a ten-to-one tumor-to-blood ratio and a beam diameter of 18.4 cm. The advantage ratio (AR) is predicted to be 2.2, with a total irradiation time of approximately 80 minutes. Further optimization work on the MITR-II epithermal beams is expected to improve the available beams.
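The "advantage depth" figure of merit used above, i.e. the depth at which the tumor dose falls to the maximum healthy-tissue dose, can be read off depth-dose curves directly. The exponential curves and the factor-of-3 boron boost below are invented toy values, not MITR-II data:

```python
import numpy as np

# Toy depth-dose curves (invented): healthy tissue, and tumor boosted by
# the 10B capture dose; both decay with depth in the phantom.
depth = np.linspace(0.0, 10.0, 501)       # depth, cm
healthy = np.exp(-depth / 6.0)            # normalized healthy-tissue dose
tumour = 3.0 * np.exp(-depth / 6.0)       # tumor dose with boron boost

# Advantage depth: deepest point where the tumor dose still exceeds the
# maximum healthy-tissue dose anywhere along the beam.
max_healthy = healthy.max()
advantage_depth = depth[tumour >= max_healthy].max()
print(f"advantage depth = {advantage_depth:.2f} cm")
```

With these toy curves the advantage depth is simply 6 ln 3 ≈ 6.6 cm; real NCT curves have thermal-neutron build-up near the surface, so the calculation is done numerically as here.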
Abdel-Rahman, Wamied; Seuntjens, Jan P; Verhaegen, Frank; Podgorsak, Ervin B
2006-09-01
Polarity effects in ionization chambers are caused by a radiation-induced current, also known as the Compton current, which arises as a charge imbalance due to charge deposition in the electrodes of ionization chambers. We used a phantom-embedded extrapolation chamber (PEEC) for measurements of the Compton current in megavoltage photon and electron beams. Electron contamination of photon beams and photon contamination of electron beams have a negligible effect on the measured Compton current. To allow a theoretical understanding of the Compton current produced in the PEEC, we carried out Monte Carlo calculations with a modified user code, COMPTON/EGSnrc. The Monte Carlo calculated Compton currents agree well with measured data for both photon and electron beams; the calculated polarity correction factors, on the other hand, do not agree with measurement results. The conclusions reached for the PEEC can be extended to parallel-plate ionization chambers in general.
Quantum Monte Carlo simulation
Wang, Yazhen
2011-01-01
Contemporary scientific studies often rely on the understanding of complex quantum systems via computer simulation. This paper initiates the statistical study of quantum simulation and proposes a Monte Carlo method for estimating analytically intractable quantities. We derive the bias and variance for the proposed Monte Carlo quantum simulation estimator and establish the asymptotic theory for the estimator. The theory is used to design a computational scheme for minimizing the mean square er...
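The bias and variance discussion can be grounded with the simplest classical case; this sketch is a plain Monte Carlo mean with its standard error, not the paper's quantum-simulation estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_estimate(g, sampler, n):
    """Plain Monte Carlo: sample mean of g(X) plus its standard error."""
    vals = g(sampler(n))
    mean = vals.mean()
    stderr = vals.std(ddof=1) / np.sqrt(n)   # error shrinks as O(n^-1/2)
    return mean, stderr

# Example: estimate E[X^2] = 1 for X ~ N(0, 1).
mean, stderr = mc_estimate(lambda x: x ** 2, rng.standard_normal, 100_000)
print(f"estimate {mean:.4f} +/- {stderr:.4f}")
```

The mean-square error the abstract aims to minimize decomposes into squared bias plus variance; for this unbiased sample-mean estimator only the variance term, reported here as the standard error, remains.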
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Energy Technology Data Exchange (ETDEWEB)
Gomes B, W. O., E-mail: wilsonottobatista@gmail.com [Instituto Federal da Bahia, Rua Emidio dos Santos s/n, Barbalho 40301-015, Salvador de Bahia (Brazil)
2016-10-15
This study aimed to develop a geometry of irradiation applicable to the software PCXMC and the consequent calculation of effective dose in applications of the Computed Tomography Cone Beam (CBCT). We evaluated two different CBCT equipment s for dental applications: Care stream Cs 9000 3-dimensional tomograph; i-CAT and GENDEX GXCB-500. Initially characterize each protocol measuring the surface kerma input and the product kerma air-area, P{sub KA}, with solid state detectors RADCAL and PTW transmission chamber. Then we introduce the technical parameters of each preset protocols and geometric conditions in the PCXMC software to obtain the values of effective dose. The calculated effective dose is within the range of 9.0 to 15.7 μSv for 3-dimensional computer 9000 Cs; within the range 44.5 to 89 μSv for GXCB-500 equipment and in the range of 62-111 μSv for equipment Classical i-CAT. These values were compared with results obtained dosimetry using TLD implanted in anthropomorphic phantom and are considered consistent. Os effective dose results are very sensitive to the geometry of radiation (beam position in mathematical phantom). This factor translates to a factor of fragility software usage. But it is very useful to get quick answers to regarding process optimization tool conclusions protocols. We conclude that use software PCXMC Monte Carlo simulation is useful assessment protocols for CBCT tests in dental applications. (Author)
Denoising of electron beam Monte Carlo dose distributions using digital filtering techniques
Deasy, Joseph O.
2000-07-01
The Monte Carlo (MC) method has long been viewed as the ultimate dose distribution computational technique. The inherent stochastic dose fluctuations (i.e. noise), however, have several important disadvantages: noise will affect estimates of all the relevant dosimetric and radiobiological indices, and noise will degrade the resulting dose contour visualizations. We suggest the use of a post-processing denoising step to reduce statistical fluctuations and also improve dose contour visualization. We report the results of applying four different two-dimensional digital smoothing filters to two-dimensional dose images. The Integrated Tiger Series MC code was used to generate 10 MeV electron beam dose distributions at various depths in two different phantoms. The observed qualitative effects of filtering include: (a) the suppression of voxel-to-voxel (high-frequency) noise and (b) the resulting contour plots are visually more comprehensible. Drawbacks include, in some cases, slight blurring of penumbra near the surface and slight blurring of other very sharp real dosimetric features. Of the four digital filters considered here, one, a filter based on a local least-squares principle, appears to suppress noise with negligible degradation of real dosimetric features. We conclude that denoising of electron beam MC dose distributions is feasible and will yield improved dosimetric reliability and improved visualization of dose distributions.
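Of the four filters compared, the best performer is described as based on a local least-squares principle. A one-dimensional sketch of that idea, a Savitzky-Golay-type sliding polynomial fit applied to an invented noisy depth-dose curve (the paper's filters are two-dimensional):

```python
import numpy as np

def local_least_squares(y, half=5, order=2):
    """Fit a low-order polynomial by least squares in a sliding window
    and keep its value at the window centre (Savitzky-Golay-type)."""
    out = np.empty_like(y)
    for i in range(len(y)):
        lo, hi = max(0, i - half), min(len(y), i + half + 1)
        t = np.arange(lo, hi, dtype=float)
        coef = np.polyfit(t, y[lo:hi], order)   # local least-squares fit
        out[i] = np.polyval(coef, float(i))
    return out

rng = np.random.default_rng(1)
depth = np.linspace(0.0, 6.0, 120)                   # depth, cm
dose = np.exp(-((depth - 2.3) / 1.5) ** 2)           # smooth "true" dose
noisy = dose + rng.normal(0.0, 0.03, depth.size)     # MC-like voxel noise
smooth = local_least_squares(noisy)

rms_before = np.sqrt(np.mean((noisy - dose) ** 2))
rms_after = np.sqrt(np.mean((smooth - dose) ** 2))
print(f"rms error: {rms_before:.4f} -> {rms_after:.4f}")
```

Because a low-order polynomial reproduces slowly varying dose exactly, the smoother suppresses voxel-to-voxel noise with little bias, which mirrors the paper's finding of negligible degradation of real dosimetric features away from sharp penumbrae.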
Monte Carlo simulation of MOSFET detectors for high-energy photon beams using the PENELOPE code
Energy Technology Data Exchange (ETDEWEB)
Panettieri, Vanessa [Institut de Tècniques Energètiques, Universitat Politècnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Duch, Maria Amor [Institut de Tècniques Energètiques, Universitat Politècnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Jornet, Núria [Servei de Radiofísica i Radioprotecció, Hospital de la Santa Creu i Sant Pau, Sant Antoni Maria Claret 167, 08025 Barcelona (Spain); Ginjaume, Mercè [Institut de Tècniques Energètiques, Universitat Politècnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Carrasco, Pablo [Servei de Radiofísica i Radioprotecció, Hospital de la Santa Creu i Sant Pau, Sant Antoni Maria Claret 167, 08025 Barcelona (Spain); Badal, Andreu [Institut de Tècniques Energètiques, Universitat Politècnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Ortega, Xavier [Institut de Tècniques Energètiques, Universitat Politècnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Ribas, Montserrat [Servei de Radiofísica i Radioprotecció, Hospital de la Santa Creu i Sant Pau, Sant Antoni Maria Claret 167, 08025 Barcelona (Spain)
2007-01-07
The aim of this work was the Monte Carlo (MC) simulation of the response of commercially available dosimeters based on metal oxide semiconductor field effect transistors (MOSFETs) for radiotherapeutic photon beams using the PENELOPE code. The studied Thomson and Nielsen TN-502-RD MOSFETs have a very small sensitive area of 0.04 mm2 and a thickness of 0.5 μm, which is placed on a flat kapton base and covered by a rounded layer of black epoxy resin. The influence of different metallic and Plastic water(TM) build-up caps, together with the orientation of the detector, has been investigated for the specific application of MOSFET detectors to entrance in vivo dosimetry. Additionally, the energy dependence of MOSFET detectors for different high-energy photon beams (with energy >1.25 MeV) has been calculated. Calculations were carried out for simulated 6 MV and 18 MV x-ray beams generated by a Varian Clinac 1800 linear accelerator, a Co-60 photon beam from a Theratron 780 unit, and monoenergetic photon beams ranging from 2 MeV to 10 MeV. The results of the validation of the simulated photon beams show that the average difference between MC results and reference data is negligible, within 0.3%. MC simulated results of the effect of the build-up caps on the MOSFET response are in good agreement with experimental measurements, within the uncertainties. In particular, for the 18 MV photon beam the response of the detectors under a tungsten cap is 48% higher than for a 2 cm Plastic water(TM) cap, and approximately 26% higher when a brass cap is used. This effect is demonstrated to be caused by positron production in the build-up caps of higher atomic number. This work also shows that the MOSFET detectors produce a higher signal when their rounded side is facing the beam (up to 6%) and that there is a significant variation (up to 50%) in the response of the MOSFET for photon energies in the studied energy range. All the results have shown that the PENELOPE code system
Energy Technology Data Exchange (ETDEWEB)
Albright, N; Bergstrom, P M; Daly, T P; Descalle, M; Garrett, D; House, R K; Knapp, D K; May, S; Patterson, R W; Siantar, C L; Verhey, L; Walling, R S; Welczorek, D
1999-07-01
PEREGRINE is a 3D Monte Carlo dose calculation system designed to serve as a dose calculation engine for clinical radiation therapy treatment planning systems. Taking advantage of recent advances in low-cost computer hardware, modern multiprocessor architectures and optimized Monte Carlo transport algorithms, PEREGRINE performs mm-resolution Monte Carlo calculations in times that are reasonable for clinical use. PEREGRINE has been developed to simulate radiation therapy for several source types, including photons, electrons, neutrons and protons, for both teletherapy and brachytherapy. However, the work described in this paper is limited to linear accelerator-based megavoltage photon therapy. Here we assess the accuracy, reliability, and added value of 3D Monte Carlo transport for photon therapy treatment planning. Comparisons with clinical measurements in homogeneous and heterogeneous phantoms demonstrate PEREGRINE's accuracy. Studies with variable tissue composition demonstrate the importance of material assignment on the overall dose distribution. Detailed analysis of Monte Carlo results provides new information for radiation research by expanding the set of observables.
Breast tomosynthesis with monochromatic beams: a feasibility study using Monte Carlo simulations
Malliori, A.; Bliznakova, K.; Sechopoulos, I.; Kamarianakis, Z.; Fei, B.; Pallikarakis, N.
2014-08-01
The aim of this study is to investigate the impact on image quality of using monochromatic beams for lower-dose breast tomosynthesis (BT). For this purpose, modeling and simulation of BT and mammography imaging processes have been performed using two x-ray beams: a polychromatic one at 28 kVp and a monochromatic one at 19 keV, at different entrance surface air kerma values ranging between 0.16 and 5.5 mGy. Two 4 cm thick computational breast models, in a compressed state, were used: one simple homogeneous and one heterogeneous based on CT breast images, with compositions of 50% glandular-50% adipose and 40% glandular-60% adipose tissue by weight, respectively. Modeled lesions, representing masses and calcifications, were inserted within these breast phantoms. X-ray transport in the breast models was simulated with a previously developed and validated Monte Carlo application. Results showed that, for the same incident photon fluence, the monochromatic beam in BT resulted in higher image quality than polychromatic acquisition, especially in terms of contrast. For the homogeneous phantom, the improvement ranged between 15% and 22% for calcifications and masses, respectively, while for the heterogeneous one this improvement was on the order of 33% for the masses and 17% for the calcifications. For different exposures, comparable image quality in terms of signal-difference-to-noise ratio and higher contrast for all features were obtained when using a monochromatic 19 keV beam at a lower mean glandular dose, compared to the polychromatic one. Monochromatic images also provide better detail and, in combination with BT, can lead to substantial improvement in the visualization of features, and particularly better edge detection of low-contrast masses.
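The contrast and signal-difference-to-noise figures of merit used in such comparisons reduce to simple region-of-interest statistics; the pixel values below are invented, not the study's simulated images:

```python
import numpy as np

def contrast(feature, background):
    """Relative mean signal difference between feature and background ROIs."""
    return abs(feature.mean() - background.mean()) / background.mean()

def sdnr(feature, background):
    """Signal-difference-to-noise ratio using background pixel noise."""
    return abs(feature.mean() - background.mean()) / background.std(ddof=1)

rng = np.random.default_rng(2)
bg = rng.normal(100.0, 5.0, 10_000)     # background ROI pixel values
feat = rng.normal(110.0, 5.0, 10_000)   # mass-like feature ROI
print(f"contrast = {contrast(feat, bg):.3f}, SDNR = {sdnr(feat, bg):.2f}")
```

Contrast captures the signal difference alone, while the SDNR also penalizes noise, which is why the two metrics can rank beams differently at matched dose.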
Monte Carlo dose calculation improvements for low energy electron beams using eMC.
Fix, Michael K; Frei, Daniel; Volken, Werner; Neuenschwander, Hans; Born, Ernst J; Manser, Peter
2010-08-21
The electron Monte Carlo (eMC) dose calculation algorithm in Eclipse (Varian Medical Systems) is based on the macro MC method and is able to predict dose distributions for high energy electron beams with high accuracy. However, there are limitations for low energy electron beams. This work aims to improve the accuracy of the dose calculation using eMC for 4 and 6 MeV electron beams of Varian linear accelerators. Improvements implemented into the eMC include (1) improved determination of the initial electron energy spectrum by increased resolution of mono-energetic depth dose curves used during beam configuration; (2) inclusion of all the scrapers of the applicator in the beam model; (3) reduction of the maximum size of the sphere to be selected within the macro MC transport when the energy of the incident electron is below certain thresholds. The impact of these changes in eMC is investigated by comparing calculated dose distributions for 4 and 6 MeV electron beams at source to surface distance (SSD) of 100 and 110 cm with applicators ranging from 6 x 6 to 25 x 25 cm(2) of a Varian Clinac 2300C/D with the corresponding measurements. Dose differences between calculated and measured absolute depth dose curves are reduced from 6% to less than 1.5% for both energies and all applicators considered at SSD of 100 cm. Using the original eMC implementation, absolute dose profiles at depths of 1 cm, d(max) and R50 in water lead to dose differences of up to 8% for applicators larger than 15 x 15 cm(2) at SSD 100 cm. Those differences are now reduced to less than 2% for all dose profiles investigated when the improved version of eMC is used. At SSD of 110 cm the dose difference for the original eMC version is even more pronounced and can be larger than 10%. Those differences are reduced to within 2% or 2 mm with the improved version of eMC. In this work several enhancements were made in the eMC algorithm leading to significant improvements in the accuracy of the dose
Monte Carlo techniques in radiation therapy
Verhaegen, Frank
2013-01-01
Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book-the first of its kind-addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...
Ultrafast cone-beam CT scatter correction with GPU-based Monte Carlo simulation
Directory of Open Access Journals (Sweden)
Yuan Xu
2014-03-01
Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to finish the whole process, including both scatter correction and reconstruction, within 30 seconds. Methods: The method consists of six steps: (1) FDK reconstruction using raw projection data; (2) rigid registration of the planning CT to the FDK result; (3) MC scatter calculation at sparse view angles using the planning CT; (4) interpolation of the calculated scatter signals to the other angles; (5) removal of scatter from the raw projections; (6) FDK reconstruction using the scatter-corrected projections. In addition to using a GPU to accelerate the MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce computation time. A novel denoising algorithm is used to eliminate the MC noise, caused by the low photon numbers, from the simulated scatter images. The method is validated on one simulated head-and-neck case with 364 projection angles. Results: We have examined the variation of the scatter signal among projection angles using Fourier analysis. It is found that scatter images at 31 angles are sufficient to restore those at all angles with <0.1% error. For the simulated patient case with a resolution of 512 × 512 × 100, we simulated 5 × 10^6 photons per angle. The total computation time is 20.52 seconds on an Nvidia GTX Titan GPU, and the time at each step is 2.53, 0.64, 14.78, 0.13, 0.19, and 2.25 seconds, respectively. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region of interest is reduced from 75.9 to 19.0 HU. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. It accomplishes the whole procedure of scatter correction and reconstruction within 30 seconds.
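The key observation behind steps 3 and 4, that a slowly varying scatter signal sampled at about 31 view angles suffices to restore all 364, can be mimicked with a smooth angular model. The cosine form below is an illustrative stand-in for the MC-computed scatter, not the paper's data:

```python
import numpy as np

n_views = 364
angles = np.linspace(0.0, 2 * np.pi, n_views, endpoint=False)

# Stand-in for the MC-computed scatter signal: smooth and low-frequency
# in view angle, which is what the paper's Fourier analysis exploits.
def scatter_signal(theta):
    return 100.0 + 20.0 * np.cos(theta) + 5.0 * np.cos(2 * theta)

# Step 3: "compute" scatter only at sparse view angles.
sparse_idx = np.linspace(0, n_views, 31, endpoint=False).astype(int)
sparse_angles = angles[sparse_idx]
sparse_scatter = scatter_signal(sparse_angles)

# Step 4: periodic interpolation to all 364 view angles.
full_estimate = np.interp(angles, sparse_angles, sparse_scatter,
                          period=2 * np.pi)
truth = scatter_signal(angles)
max_rel_err = np.max(np.abs(full_estimate - truth) / truth)
```

With a signal this smooth, the interpolation error stays well below 1%, consistent with the <0.1% restoration error reported for the real scatter images.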
Koger, B; Kirkby, C
2016-12-02
As a recent area of development in radiation therapy, gold nanoparticle (GNP) enhanced radiation therapy has shown potential to increase tumour dose while maintaining acceptable levels of healthy tissue toxicity. In this study, the effect of varying photon beam energy in GNP enhanced arc radiation therapy (GEART) is quantified through the introduction of a dose scoring metric, and GEART is compared to a conventional radiotherapy treatment. The PENELOPE Monte Carlo code was used to model several simple phantoms consisting of a spherical tumour containing GNPs (concentration: 15 mg Au g(-1) tumour, 0.8 mg Au g(-1) normal tissue) in a cylinder of tissue. Several monoenergetic photon beams, with energies ranging from 20 keV to 6 MeV, as well as 100, 200, and 300 kVp spectral beams, were used to irradiate the tumour in a 360° arc treatment. A dose metric was then used to compare tumour and tissue doses from GEART treatments to a similar treatment from a 6 MV spectrum. This was also performed on a simulated brain tumour using patient computed tomography data. GEART treatments showed potential over the 6 MV treatment for many of the simulated geometries, delivering up to 88% higher mean dose to the tumour for a constant tissue dose, with the effect greatest near a source energy of 50 keV. This effect is also seen with the inclusion of bone in a brain treatment, with a 14% increase in mean tumour dose over 6 MV, while still maintaining acceptable levels of dose to the bone and brain.
Energy Technology Data Exchange (ETDEWEB)
Jin, L; Wang, L; Li, J; Luo, W; Feigenberg, S J; Ma, C-M [Department of Radiation Oncology, Fox Chase Cancer Center, Philadelphia, PA 19111 (United States)
2007-07-21
This work investigated the selection of beam margins in lung-cancer stereotactic body radiotherapy (SBRT) with 6 MV photon beams. Monte Carlo dose calculations were used to systematically and quantitatively study the dosimetric effects of beam margins for different lung densities (0.1, 0.15, 0.25, 0.35 and 0.5 g cm{sup -3}), planning target volumes (PTVs) (14.4, 22.1 and 55.3 cm{sup 3}) and numbers of beam angles (three, six and seven) in lung-cancer SBRT in order to search for optimal beam margins for various clinical situations. First, a large number of treatment plans were generated in a commercial treatment planning system, and then recalculated using Monte Carlo simulations. All the plans were normalized to ensure that 95% of the PTV at least receives the prescription dose and compared quantitatively. Based on these plans, the relationships between the beam margin and quantities such as the lung toxicity (quantified by V{sub 20}, the percentage volume of the two lungs receiving at least 20 Gy) and the maximum target (PTV) dose were established for different PTVs and lung densities. The impact of the number of beam angles on the relationship between V{sub 20} and the beam margin was assessed. Quantitative information about optimal beam margins for lung-cancer SBRT was obtained for clinical applications.
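As a minimal sketch of the lung-toxicity metric used above, V{sub 20} can be computed from a dose grid and an organ mask as the percentage of masked voxels receiving at least 20 Gy. The dose values and mask below are synthetic, not from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 3D dose grid (Gy) and a boolean lung mask; values are illustrative.
dose = rng.uniform(0.0, 60.0, size=(20, 20, 20))
lung_mask = np.zeros_like(dose, dtype=bool)
lung_mask[:, :, :10] = True            # hypothetical lung region

# V20: percentage of the lung volume receiving at least 20 Gy
lung_dose = dose[lung_mask]
v20 = 100.0 * np.count_nonzero(lung_dose >= 20.0) / lung_dose.size
```

In the study, curves of V{sub 20} versus beam margin (for each PTV size, lung density, and beam-angle count) were built from exactly this kind of voxel tally over the Monte Carlo dose grids.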
Nievaart, V.A.; Legrady, D.; Moss, R.L.; Kloosterman, J.L.; Van der Hagen, T.H.; Van Dam, H.
2007-01-01
This paper deals with the application of the adjoint transport theory in order to optimize Monte Carlo based radiotherapy treatment planning. The technique is applied to Boron Neutron Capture Therapy where most often mixed beams of neutrons and gammas are involved. In normal forward Monte Carlo simu
Energy Technology Data Exchange (ETDEWEB)
Treutwein, M.; Bogner, L. [Universitaetsklinikum Regensburg (Germany). Klinik und Poliklinik fuer Strahlentherapie
2007-08-15
Background and Purpose: For several years, three-dimensional treatment-planning systems have used pencil beam algorithms in the calculation of electron fields. Nowadays, exact Monte Carlo methods are commercially available, showing good correspondence to experimental results. Clinical examples are investigated to find differences in the dose distributions of treatment plans calculated with both the pencil beam and the Monte Carlo algorithm. Material and Methods: Two different clinical applications are regarded: (1) an irradiation of the chest wall, and (2) an electron field to the vertebral column. The dose distributions are calculated by Oncentra™ MasterPlan, using the Monte Carlo code VMC++, on the one hand, and by Helax™ TMS on the other hand (both Nucletron B.V., Veenendaal, The Netherlands). Profiles and depth dose curves are evaluated by the Verisoft™ program of PTW (Freiburg, Germany). Results: In the case of chest wall irradiation, the depth dose curves for the three investigated energies, 9, 15 and 21 MeV, agree rather well, also in lung tissue. The mean value for the lung differs by only 4% relative to the dose maximum. In the case of vertebral column irradiation, however, the dose difference is more pronounced and, in the prevertebral region, is 56% lower for the VMC++ plan than in the pencil beam calculation. Conclusion: For irradiations of the chest wall, dose distribution calculations by means of a pencil beam algorithm may be applied. When calculating electron dose distributions in cases of larger bone inhomogeneities, the more exact Monte Carlo algorithm should be preferred. (orig.)
Alvarenga, A V; Silva, C E R; Costa-Félix, R P B
2016-07-01
The uncertainty of ultrasonic beam parameters of non-destructive testing immersion probes was evaluated using both the Guide to the Expression of Uncertainty in Measurement (GUM) uncertainty framework and Monte Carlo Method simulation. Parameters such as focal distance, focal length, focal widths and beam divergence were calculated according to EN 12668-2. The typical system configuration used during the mapping acquisition comprises a personal computer connected to an oscilloscope, a signal generator, axis movement controllers, and a water bath. The positioning system moves the transducer (or hydrophone) in the water bath. To integrate all system components, a program was developed to control all the axes, acquire waterborne signals, and calculate the essential parameters to assess and calibrate US transducers. All parameters except beam divergence were calculated directly from raster scans of the axial and transversal beam profiles. Hence, the positioning system resolution and the step size are the principal sources of uncertainty. The Monte Carlo Method simulations were performed by another program that generates pseudo-random samples from the distributions of the quantities involved. In all cases, statistically significant differences were found between the Monte Carlo and GUM methods.
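The two uncertainty evaluations being compared can be sketched on a toy measurand, here a beam-divergence half-angle arctan(r/d) with assumed input values and uncertainties (not the paper's data). The GUM framework propagates first-order sensitivity coefficients, while the Monte Carlo Method propagates whole distributions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical measurand: divergence half-angle theta = arctan(r / d),
# with beam radius r and distance d as uncertain inputs (illustrative).
r, u_r = 5.0e-3, 0.1e-3     # m, standard uncertainty
d, u_d = 100.0e-3, 1.0e-3   # m, standard uncertainty

# GUM framework: combine via first-order sensitivity coefficients.
c_r = d / (r**2 + d**2)                 # d(theta)/dr
c_d = -r / (r**2 + d**2)                # d(theta)/dd
u_gum = np.hypot(c_r * u_r, c_d * u_d)

# Monte Carlo Method: sample the input distributions and take the
# standard deviation of the resulting output sample.
n = 200_000
theta = np.arctan(rng.normal(r, u_r, n) / rng.normal(d, u_d, n))
u_mcm = theta.std(ddof=1)
```

For a nearly linear measurand like this one the two results agree closely; the statistically significant differences reported above arise when the measurement model or input distributions depart from the GUM's linear-Gaussian assumptions.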
Energy Technology Data Exchange (ETDEWEB)
Mazrou, Hakim, E-mail: mazrou_h@crna.d [Centre de Recherche Nucleaire d' Alger (CRNA), 02 Boulevard Frantz, Fanon, B.P. 399, Alger-RP 16000 (Algeria); Sidahmed, Tassadit [Centre de Recherche Nucleaire d' Alger (CRNA), 02 Boulevard Frantz, Fanon, B.P. 399, Alger-RP 16000 (Algeria); Allab, Malika [Faculte de Physique, Universite des Sciences et de la Technologie de Houari-Boumediene (USTHB), 16111, Alger (Algeria)
2010-10-15
An irradiation system has been acquired by the Nuclear Research Center of Algiers (CRNA) to provide neutron references for metrology and dosimetry purposes. It consists of an {sup 241}Am-Be radionuclide source of 185 GBq (5 Ci) activity inside a cylindrical steel-enveloped polyethylene container with a radially positioned beam channel. Because the container is filled with hydrogenous material, a composition not recommended by ISO standards, large changes in the source's physical quantities of primary importance are expected compared to a free-field situation. Thus, the main goal of the present work is to fully characterize the neutron field of this special delivered set-up. This was conducted through both extensive Monte Carlo calculations and experimental measurements using BF{sub 3} and {sup 3}He based neutron area dosimeters. The effect of each component present in the bunker facility of the Algerian Secondary Standard Dosimetry Laboratory (SSDL) on the neutron energy spectrum was investigated by simulating four irradiation configurations, and a comparison to the ISO spectrum was performed. The ambient dose equivalent rate was determined from a correct estimate of the mean fluence-to-ambient-dose-equivalent conversion factors at different irradiation positions by means of the 3-D transport code MCNP5. Finally, according to the practical requirements established for calibration purposes, an optimal irradiation position has been suggested to the SSDL staff to perform their routine calibrations in an appropriate manner.
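The fluence-folding step described above, converting an energy-resolved fluence into an ambient dose equivalent rate, can be sketched as a group-wise sum. The spectrum and h*(10) coefficients below are illustrative placeholders, not the CRNA facility data:

```python
import numpy as np

# Hypothetical group-wise neutron fluence rate spectrum (cm^-2 s^-1) and
# fluence-to-ambient-dose-equivalent conversion coefficients h*(10)
# (pSv cm^2); all numbers are illustrative assumptions.
energy_groups = np.array([1e-8, 1e-6, 1e-3, 1.0, 4.5])    # MeV
fluence_rate  = np.array([120., 80., 40., 60., 30.])       # per group
h_star_10     = np.array([10.6, 12.3, 9.0, 320., 410.])    # pSv cm^2

# Folding: H*(10) rate = sum over groups of phi_E * h*(10)_E
h_rate_psv_s = np.sum(fluence_rate * h_star_10)
h_rate_usv_h = h_rate_psv_s * 3600 * 1e-6                  # uSv/h
```

In the study, the group fluences come from the MCNP5 tallies at each candidate irradiation position, so the accuracy of H*(10) hinges on how well the simulated spectrum matches the real one.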
Beam neutron energy optimization for boron neutron capture therapy using Monte Carlo method
Directory of Open Access Journals (Sweden)
Ali Pazirandeh
2006-06-01
In the last two decades, the optimal neutron energy for the treatment of deep-seated tumors in boron neutron capture therapy has been under thorough study, in view of both neutron physics and the chemical compounds used as boron carriers. Although the neutron absorption cross section of boron is high (3836 b), the treatment of deep-seated tumors such as glioblastoma multiforme (GBM) requires a beam of neutrons of higher energy that can penetrate deeply into the brain and thermalize in the proximity of the tumor. The dose from recoil protons associated with fast neutrons, however, poses some constraints on the maximum neutron energy that can be used in the treatment. For this reason, neutrons in the epithermal energy range of 10 eV-10 keV are generally considered the most appropriate. The simulations were carried out by Monte Carlo methods using the MCBNCT and MCNP4C codes along with a cross-section library in 290 groups extracted from the ENDF/B6 main library. The optimal neutron energy for deep-seated tumors depends on the size and depth of the tumor. Our estimated optimal energy for a tumor 5 cm wide and 1-2 cm thick at a depth of 5 cm is in the range of 3-5 keV.
Tisseur, David; Andrieux, Alexan; Costin, Marius; Vabre, Alexandre
2014-06-01
CEA-LIST develops the CIVA software for non-destructive testing simulation. Monte Carlo simulation of the scattered beam in radiography can be quite long (several hours), even with a multi-threaded CPU implementation. In order to reduce this computation time, we have modified and adapted for CIVA a GPU open-source code named MCGPU. This paper presents our work and the results of a cross-comparison between CIVA and the modified MCGPU code in an NDT context.
Monte Carlo Treatment Planning for Advanced Radiotherapy
DEFF Research Database (Denmark)
Cronholm, Rickard
and validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii) and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc) to a Monte Carlo input file (iii). A protocol...... previous algorithms since it uses delineations of structures in order to include and/or exclude certain media in various anatomical regions. This method has the potential to reduce anatomically irrelevant media assignment. In house MATLAB scripts translating the treatment plan parameters to Monte Carlo...
Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas
2003-01-01
The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.
Monte Carlo Simulations of Beam Losses in the Test Beam Line of CTF3
Nebot Del Busto, E; Branger, E; Holzer, E B; Doebert, S; Lillestol, R L; Welsch, C P
2013-01-01
The Test Beam Line (TBL) of the CLIC Test Facility 3 (CTF3) aims to validate the drive beam deceleration concept of CLIC, in which the RF power requested to boost particles to multi-TeV energies is obtained via deceleration of a high current and low energy drive beam (DB). Despite a TBL beam energy (150-80 MeV) significantly lower than the minimum nominal energy of the CLIC DB (250 MeV), the pulse time structure of the TBL provides the opportunity to measure beam losses with CLIC-like DB timing conditions. In this contribution, a simulation study on the detection of beam losses along the TBL for the commissioning of the recently installed beam loss monitoring system is presented. The most likely loss locations during stable beam conditions are studied by considering the beam envelope defined by the FODO lattice as well as the emittance growth due to the deceleration process. Moreover, the optimization of potential detector locations is discussed. Several factors are considered, namely: the distance to the bea...
Agyingi, Ephraim O; Mobit, Paul N; Sandison, George A
2006-01-01
A Monte Carlo study of the energy response of an aluminium oxide (Al(2)O(3)) detector in kilovoltage and megavoltage photon beams relative to (60)Co gamma rays has been performed using EGSnrc Monte Carlo simulations. The sensitive volume of the Al(2)O(3) detector was simulated as a disc of diameter 2.85 mm and thickness 1 mm. The phantom material was water and the irradiation depth chosen was 2.0 cm in kilovoltage photon beams and 5.0 cm in megavoltage photon beams. The results show that the energy response of the Al(2)O(3) detector is constant within 3% for photon beam energies in the energy range of (60)Co gamma rays to 25 MV X rays. However, the Al(2)O(3) detector shows an enhanced energy response for kilovoltage photon beams, which in the case of 50 kV X rays is 3.2 times higher than that for (60)Co gamma rays. There is essentially no difference in the energy responses of LiF and Al(2)O(3) detectors irradiated in megavoltage photon beams when these Al(2)O(3) results are compared with literature data for LiF thermoluminescence detectors. However, the Al(2)O(3) detector has a much higher enhanced response compared with LiF detectors in kilovoltage X-ray beams, more than twice as much for the case of 50 kV X rays.
Monte Carlo validation of the TrueBeam 10XFFF phase-space files for applications in lung SABR
Energy Technology Data Exchange (ETDEWEB)
Teke, Tony, E-mail: tteke2@bccancer.bc.ca [Medical Physics, BC Cancer Agency—Centre for the Southern Interior, Kelowna, British Columbia V1Y 5L3 (Canada); Duzenli, Cheryl; Bergman, Alanah; Viel, Francis; Atwal, Parmveer; Gete, Ermias [Medical Physics, BC Cancer Agency—Vancouver Centre, Vancouver, British Columbia V5Z 4E6 (Canada)
2015-12-15
Purpose: To establish the clinical acceptability of universal Monte Carlo phase-space data for the 10XFFF (flattening filter free) photon beam on the Varian TrueBeam Linac, including previously unreported data for small fields, output factors, and inhomogeneous media. The study was particularly aimed at confirming the suitability for use in simulations of lung stereotactic ablative radiotherapy treatment plans. Methods: Monte Carlo calculated percent depth doses (PDDs), transverse profiles, and output factors for the TrueBeam 10 MV FFF beam using generic phase-space data that have been released by the Varian MC research team were compared with in-house measurements and published data from multiple institutions (ten Linacs from eight different institutions). BEAMnrc was used to create field-size-specific phase-spaces located underneath the jaws. Doses were calculated with DOSXYZnrc in a water phantom for fields ranging from 1 × 1 to 40 × 40 cm{sup 2}. Particular attention was paid to small fields (down to 1 × 1 cm{sup 2}) and dose-per-pulse effects on dosimeter response for high dose rate 10XFFF beams. Ion chamber measurements were corrected for changes in ion collection efficiency (P{sub ion}) with increasing dose per pulse. MC and Eclipse Anisotropic Analytical Algorithm (AAA) calculated PDDs were compared to Gafchromic film measurement in inhomogeneous media (water, bone, lung). Results: Measured data from all machines agreed with Monte Carlo simulations within 1.0% and 1.5% for PDDs and in-field transverse profiles, respectively, for field sizes >1 × 1 cm{sup 2} in a homogeneous water phantom. Agreements in the 80%-20% penumbra widths were better than 2 mm for all the fields that were compared. For all the field sizes considered, the agreement between their measured and calculated output factors was within 1.1%. Monte Carlo results for dose to water at water/bone, bone/lung, and lung/water interfaces as well as within lung agree with film
Lillhök, J E; Grindborg, J-E; Lindborg, L; Gudowska, I; Carlsson, G Alm; Söderberg, J; Kopeć, M; Medin, J
2007-08-21
Nanodosimetric single-event distributions or their mean values may contribute to a better understanding of how radiation induced biological damages are produced. They may also provide means for radiation quality characterization in therapy beams. Experimental nanodosimetry is however technically challenging and Monte Carlo simulations are valuable as a complementary tool for such investigations. The dose-mean lineal energy was determined in a therapeutic p(65)+Be neutron beam and in a (60)Co gamma beam using low-pressure gas detectors and the variance-covariance method. The neutron beam was simulated using the condensed history Monte Carlo codes MCNPX and SHIELD-HIT. The dose-mean lineal energy was calculated using the simulated dose and fluence spectra together with published data from track-structure simulations. A comparison between simulated and measured results revealed some systematic differences and different dependencies on the simulated object size. The results show that both experimental and theoretical approaches are needed for an accurate dosimetry in the nanometer region. In line with previously reported results, the dose-mean lineal energy determined at 10 nm was shown to be related to clinical RBE values in the neutron beam and in a simulated 175 MeV proton beam as well.
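For reference, the frequency-mean and dose-mean lineal energies follow from the single-event distribution f(y) as y{sub F} = ⟨y⟩ and y{sub D} = ⟨y²⟩/⟨y⟩. The sketch below evaluates both on samples from a toy lognormal single-event distribution; the distribution and its parameters are illustrative only, not the measured neutron-beam spectra:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sampled single-event lineal-energy values y (keV/um) from a toy
# lognormal single-event distribution f(y); illustrative assumption.
y = rng.lognormal(mean=0.0, sigma=0.8, size=100_000)

# Frequency-mean and dose-mean lineal energies:
#   y_F = <y>,   y_D = <y^2> / <y>
y_f = y.mean()
y_d = (y**2).mean() / y.mean()
```

Because y{sub D} weights events by the energy they deposit, it always exceeds y{sub F} for a non-degenerate distribution; it is this dose-weighted mean, determined at the 10 nm scale, that the study relates to clinical RBE values.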
Experimental and Monte Carlo evaluation of an ionization chamber in a 60Co beam
Perini, A. P.; Neves, L. P.; Santos, W. S.; Caldas, L. V. E.
2016-07-01
Recently a special parallel-plate ionization chamber was developed and characterized at the Instituto de Pesquisas Energeticas e Nucleares. The operational tests presented results within the recommended limits. In order to determine the influence of some components of the ionization chamber on its response, Monte Carlo simulations were carried out. The experimental and simulation results pointed out that the dosimeter evaluated in the present work has favorable properties to be applied to 60Co dosimetry at calibration laboratories.
SU-D-19A-04: Parameter Characterization of Electron Beam Monte Carlo Phase Space of TrueBeam Linacs
Energy Technology Data Exchange (ETDEWEB)
Rodrigues, A; Yin, F; Wu, Q [Duke University Medical Center, Durham, NC (United States); Medical Physics Graduate Program, Duke University Medical Center, Durham, NC (United States); Sawkey, D [Varian Medical Systems, Palo Alto, CA (United States)
2014-06-01
Purpose: For TrueBeam Monte Carlo simulations, Varian does not distribute the linac head geometry and material compositions, instead providing a phase space file (PSF) for the users. The PSF has a finite number of particle histories and can have a very large file size, yet still contains inherent statistical noise. The purpose of this study is to characterize the electron beam PSF with parameters. Methods: The PSF is a snapshot of all particles' information at a given plane above the jaws, including type, energy, position, and direction. This study utilized a preliminary TrueBeam PSF, whose validation against measurement is presented in another study. To characterize the PSF, the distributions of energy, position, and direction of all particles are analyzed as piece-wise parameterized functions of radius and polar angle. Subsequently, a pseudo PSF was generated based on this characterization. Validation was assessed by directly comparing the true and pseudo PSFs, and by using both PSFs in the downstream MC simulations (BEAMnrc/DOSXYZnrc) and comparing dose distributions for 3 applicators at 15 MeV. A statistical uncertainty of 4% was limited by the number of histories in the original PSF. Percent depth dose (PDD) and orthogonal (PRF) profiles at various depths were evaluated. Results: Preliminary results showed that this PSF parameterization was accurate, with no visible differences between the original and pseudo PSFs except at the edge (6 cm off axis), which did not impact dose distributions in the phantom. PDD differences were within 1 mm for R{sub 70}, R{sub 50}, R{sub 30}, and R{sub 10}, and PRF field sizes and penumbras were within 2 mm. Conclusion: A PSF can be successfully characterized by distributions of energy, position, and direction as parameterized functions of radius and polar angle; this facilitates generating sufficient particles at any statistical precision. Analyses for all other electron energies are under way and results will be
Monte Carlo Simulation of Damage Depth in Focused Ion Beam Milling Si3N4 Thin Film
Institute of Scientific and Technical Information of China (English)
TAN Yong-wen; XIE Xue-bing; Jack Zhou; XU Tian-wei; YANG Wei-guo; YANG Hai
2007-01-01
The damage properties of Focused Ion Beam (FIB) milling of Si3N4 thin film are investigated by detailed analysis of nanohole images and by Monte Carlo simulation. The damage depth in the Si3N4 thin film for two different ion species (gallium and arsenic) under various parameters (ion energy, angle of incidence) is investigated by the Monte Carlo method. The simulations show that the damage depth increases with increasing ion energy and depends on the angle of the incident ion; the damage-depth curves for Ga and As ions at 30 keV nearly superpose, while the damage depth for Ga ions at 90 keV is greater than that for As ions at the same energy.
Indian Academy of Sciences (India)
V C Petwal; J N Rao; Jishnu Dwivedi; V K Senecha; K V Subbaiah
2010-03-01
A prototype pulsed electron beam irradiation facility for radiation processing of food and medical products is being commissioned at our centre in Indore, India. Analysis of surface dose and uniformity for a pulsed beam facility is of crucial importance because it is influenced by various operating parameters such as beam current, pulse repetition rate (PRR), scanning current profile and frequency, scanning width and product conveying speed. A large number of experiments are required to determine the harmonized settings of these operating parameters for achieving a uniform dose. Since there is no readily available tool to set these parameters, the use of Monte Carlo methods and computational tools can prove to be the most viable and time-saving technique to support the assessment of the dose distribution. In the present study, the Monte Carlo code MCNP is used to simulate the transport of a 10 MeV electron beam through the various media in the beam path and to generate an equivalent dose profile in a polystyrene phantom for the stationary state. These results have been verified against the experimentally measured dose profile and are in good agreement within 4%. The Monte Carlo simulation has further been used to optimize the overlap between successive pulses of a scan to achieve ±5% dose uniformity along the scanning direction. A mathematical model, which uses the stationary-state data, is developed to include the effect of conveyor speed. The algorithm of the model is discussed and the results are compared with the experimentally measured values, which show that the agreement is better than 15%. Finally, harmonized settings of the accelerator operating parameters are derived to deliver a uniform surface dose in the range of 1-13 kGy/pass.
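The pulse-overlap optimization can be illustrated with a one-dimensional toy model in which each pulse deposits a Gaussian dose footprint along the scan direction and the summed profile flattens as the pulse spacing shrinks. The Gaussian width and spacings below are assumptions, not the facility's measured values:

```python
import numpy as np

# Assume one scanned pulse deposits an approximately Gaussian dose
# footprint along the scan direction (sigma in cm; illustrative value).
sigma = 1.0

def ripple(spacing, n_pulses=21):
    """Peak-to-valley dose variation (%) relative to the mean, in the
    central region, for equally spaced overlapping Gaussian pulses."""
    x = np.linspace(-2.0, 2.0, 401)                  # central region
    centers = (np.arange(n_pulses) - n_pulses // 2) * spacing
    dose = sum(np.exp(-0.5 * ((x - c) / sigma) ** 2) for c in centers)
    return 100.0 * (dose.max() - dose.min()) / dose.mean()

# Tighter overlap (smaller spacing) flattens the summed profile.
loose = ripple(spacing=2.5 * sigma)
tight = ripple(spacing=1.0 * sigma)
```

Scanning the spacing until `ripple` drops below the tolerance mirrors how the overlap between successive pulses was tuned to reach the ±5% uniformity target.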
Chetty, Indrin J; Moran, Jean M; McShan, Daniel L; Fraass, Benedick A; Wilderman, Scott J; Bielajew, Alex F
2002-06-01
A comprehensive set of measurements and calculations has been conducted to investigate the accuracy of the Dose Planning Method (DPM) Monte Carlo code for dose calculations from 10 and 50 MeV scanned electron beams produced from a racetrack microtron. Central axis depth dose measurements and a series of profile scans at various depths were acquired in a water phantom using a Scanditronix type RK ion chamber. Source spatial distributions for the Monte Carlo calculations were reconstructed from in-air ion chamber measurements carried out across the two-dimensional beam profile at 100 cm downstream from the source. The in-air spatial distributions were found to have full width at half maximum of 4.7 and 1.3 cm, at 100 cm from the source, for the 10 and 50 MeV beams, respectively. Energy spectra for the 10 and 50 MeV beams were determined by simulating the components of the microtron treatment head using the code MCNP4B. DPM calculations are on average within +/- 2% agreement with measurement for all depth dose and profile comparisons conducted in this study. The accuracy of the DPM code illustrated in this work suggests that DPM may be used as a valuable tool for electron beam dose calculations.
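The source-reconstruction step, recovering a Gaussian width from an in-air profile, can be sketched by estimating the second moment of a sampled profile and converting it to a full width at half maximum via FWHM = 2√(2 ln 2)·σ. The profile below is synthetic, built from the 4.7 cm FWHM reported for the 10 MeV beam with an assumed noise level:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic in-air profile at 100 cm: Gaussian with 4.7 cm FWHM,
# sampled at discrete off-axis positions with small measurement noise.
true_fwhm = 4.7
sigma = true_fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
x = np.linspace(-8.0, 8.0, 81)
profile = np.exp(-0.5 * (x / sigma) ** 2) + rng.normal(0.0, 0.002, x.size)

# Estimate sigma from the second moment of the (non-negative, normalized)
# profile, then convert back to FWHM.
weights = np.clip(profile, 0.0, None)
sigma_est = np.sqrt(np.sum(weights * x**2) / np.sum(weights))
fwhm_est = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma_est
```

A moment estimate like this is one simple way to turn the two-dimensional ion chamber scans into the source spatial distribution fed to the Monte Carlo calculation; a least-squares Gaussian fit is a common alternative.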
Directory of Open Access Journals (Sweden)
Cecilia Maya
2004-12-01
Full Text Available The Monte Carlo method is applied to several cases of financial option valuation. The method yields a good approximation when its accuracy is compared with that of other numerical methods. The estimate produced by the crude version of Monte Carlo can be made even more precise by resorting to variance-reduction methodologies, among which the antithetic variable and the control variable are suggested. However, these methodologies require a greater computational effort, so they must be evaluated not only in terms of their accuracy but also of their efficiency.
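The antithetic-variable technique mentioned above can be sketched for a European call option; the Black-Scholes parameters below are illustrative, not taken from the article:

```python
import numpy as np

def mc_call_price(s0, k, r, sigma, t, n, antithetic=False, seed=0):
    """Crude Monte Carlo price of a European call under Black-Scholes dynamics.

    With antithetic=True, each normal draw z is paired with -z; averaging the
    two payoffs cancels much of the sampling noise at no extra sampling cost.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    if antithetic:
        z = np.concatenate([z, -z])  # antithetic pairs
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.exp(-r * t) * np.maximum(st - k, 0.0)
    if antithetic:
        payoff = 0.5 * (payoff[:n] + payoff[n:])  # average each pair
    return payoff.mean(), payoff.std(ddof=1) / np.sqrt(len(payoff))

price, se = mc_call_price(100, 100, 0.05, 0.2, 1.0, 100_000)
price_a, se_a = mc_call_price(100, 100, 0.05, 0.2, 1.0, 100_000, antithetic=True)
```

Because the call payoff is monotone in z, the antithetic estimator's standard error `se_a` comes out smaller than the crude `se` for the same number of draws, which is exactly the accuracy-versus-effort trade-off the abstract raises.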
Monte Carlo and nonlinearities
Dauchet, Jérémi; Blanco, Stéphane; Caliot, Cyril; Charon, Julien; Coustet, Christophe; Hafi, Mouna El; Eymet, Vincent; Farges, Olivier; Forest, Vincent; Fournier, Richard; Galtier, Mathieu; Gautrais, Jacques; Khuong, Anaïs; Pelissier, Lionel; Piaud, Benjamin; Roger, Maxime; Terrée, Guillaume; Weitz, Sebastian
2016-01-01
The Monte Carlo method is widely used to numerically predict systems behaviour. However, its powerful incremental design assumes a strong premise which has severely limited application so far: the estimation process must combine linearly over dimensions. Here we show that this premise can be alleviated by projecting nonlinearities on a polynomial basis and increasing the configuration-space dimension. Considering phytoplankton growth in light-limited environments, radiative transfer in planetary atmospheres, electromagnetic scattering by particles and concentrated-solar-power-plant productions, we prove the real world usability of this advance on four test-cases that were so far regarded as impracticable by Monte Carlo approaches. We also illustrate an outstanding feature of our method when applied to sharp problems with interacting particles: handling rare events is now straightforward. Overall, our extension preserves the features that made the method popular: addressing nonlinearities does not compromise o...
Energy Technology Data Exchange (ETDEWEB)
Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-16
This is a PowerPoint presentation that serves as lecture material for the Parallel Computing summer school. It covers the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background; a simple example: estimating π), Why does this even work? (the Law of Large Numbers, the Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
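Two of the outline items above (estimating π and inverse transform sampling) can be sketched in a few lines; this is a generic illustration, not the lecture's own code:

```python
import math
import random

def estimate_pi(n, seed=42):
    """Hit-or-miss estimate of pi: sample the unit square uniformly and
    count the fraction of points inside the quarter circle x^2 + y^2 <= 1."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n

def sample_exponential(lam, n, seed=42):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -ln(1 - U) / lam is Exponential(lam) distributed."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

pi_hat = estimate_pi(200_000)
mean_exp = sum(sample_exponential(2.0, 100_000)) / 100_000  # expect ~1/lam = 0.5
```

By the Law of Large Numbers both estimates converge, and by the Central Limit Theorem their errors shrink as 1/√n, which is the point the lecture outline makes.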
Nakayama, Hiroshi; Furuichi, Akihisa; Kita, Takashi; Nishino, Taneo
1997-04-01
The structural phase transition of an epitaxially growing layer is quite important for understanding the atomic-scale mechanism of molecular beam epitaxy (MBE). GaAs and related alloy semiconductors are typical systems which show a variety of such structural transitions during MBE. The structural evolution of surface reconstruction phases and an order-disorder transition in III-V alloy semiconductors are typical cases where such phase transitions appear during epitaxial processes. In this work, a stochastic theory and a Monte-Carlo simulation are presented to describe the structural evolution of epitaxial growth in a binary system. This method, called here the 'Monte-Carlo master equation (MCME) method', couples a master equation for epitaxial growth kinetics with an Ising Hamiltonian of the growing surface. The Monte-Carlo (MC) simulation of a binary growing surface with atom-correlation effects has successfully revealed the evolution of atomic structure and the formation of short-range ordering (SRO) during epitaxy. This demonstrates the usefulness of the MCME method in describing atomic-structural dynamics, as compared with conventional theories of epitaxy based on a diffusion equation and standard nucleation theory.
Monte-Carlo scatter correction for cone-beam computed tomography with limited scan field-of-view
Bertram, Matthias; Sattel, Timo; Hohmann, Steffen; Wiegert, Jens
2008-03-01
In flat detector cone-beam computed tomography (CBCT), scattered radiation is a major source of image degradation, making accurate a posteriori scatter correction indispensable. A potential solution to this problem is provided by computerized scatter correction based on Monte-Carlo simulations. Using this technique, the detected distributions of X-ray scatter are estimated for various viewing directions using Monte-Carlo simulations of an intermediate reconstruction. However, as a major drawback, for standard CBCT geometries and with standard-size flat detectors such as those mounted on interventional C-arms, the scan field of view is too small to accommodate the human body without lateral truncations, and thus this technique cannot be readily applied. In this work, we present a novel method for constructing a model of the object in a laterally and possibly also axially extended field of view, which enables meaningful application of Monte-Carlo based scatter correction even in the case of heavy truncations. Evaluation is based on simulations of a clinical CT data set of a human abdomen, which strongly exceeds the field of view of the simulated C-arm based CBCT imaging geometry. By using the proposed methodology, almost complete removal of scatter-caused inhomogeneities is demonstrated in reconstructed images.
Prediction of beam hardening artefacts in computed tomography using Monte Carlo simulations
DEFF Research Database (Denmark)
Thomsen, M.; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær
2015-01-01
We show how radiological images of both single and multi material samples can be simulated using the Monte Carlo simulation tool McXtrace and how these images can be used to make a three dimensional reconstruction. Good numerical agreement between the X-ray attenuation coefficient in experimental......, illustrated with an example. Linearisation requires knowledge about the X-ray transmission at varying sample thickness, but in some cases homogeneous calibration phantoms are hard to manufacture, which affects the accuracy of the calibration. Using simulated data overcomes the manufacturing problems...
Energy Technology Data Exchange (ETDEWEB)
Bourhaleb, F; Givehchi, N; Iliescu, S; Rosa, A La; Pecka, A; Peroni, C [Dipartimento di Fisica Sperimentale, Universita' di Torino, Via P. Giuria 1, Torino 10125 (Italy); Attili, A; Cirio, R; Marchetto, F; Donetti, M; Garella, M A; Giordanengo, S; Pardo, J [INFN, Sezione di Torino, Via P. Giuria 1, Torino 10125 (Italy); Cirrone, P [INFN, Laboratori Nazionali del Sud, Via S.Sofia 62, Catania 95125 (Italy)], E-mail: bourhaleb@to.infn.it
2008-02-01
Proton and carbon ion beams have a very sharp Bragg peak. For proton beams with energies below 100 MeV, fitting the region around the maximum of the Bragg peak with a Gaussian gives a sigma along the beam direction smaller than 1 mm, while for carbon ion beams the sigma derived with the same technique is smaller than 1 mm for energies up to 360 MeV. In order to use low-energy proton and carbon ion beams in hadrontherapy and to achieve an acceptable homogeneity of the spread-out Bragg peak (SOBP), either the peak positions along the beam have to be quite close to each other, or the longitudinal peak shape needs to be broadened by at least a few millimeters by means of a properly designed ripple filter. With a synchrotron accelerator used in conjunction with active scanning techniques, a ripple filter is necessary to reduce the number of energy switches needed to obtain a smooth SOBP, which also leads to shorter overall irradiation times. We studied the impact of the ripple filter design on the dose uniformity in the SOBP region by means of Monte Carlo simulations implemented with the Geant4 package. We simulated the beam delivery line supporting both proton and carbon ion beams at different beam energies, and compared the effects and advantages of different kinds of ripple filters.
McMillan, Kyle; McNitt-Gray, Michael; Ruan, Dan
2013-01-01
Purpose: The purpose of this study is to adapt an equivalent source model originally developed for conventional CT Monte Carlo dose quantification to the radiation oncology context and validate its application for evaluating concomitant dose incurred by a kilovoltage (kV) cone-beam CT (CBCT) system integrated into a linear accelerator. Methods: In order to properly characterize beams from the integrated kV CBCT system, the authors have adapted a previously developed equivalent source model consisting of an equivalent spectrum module that takes into account intrinsic filtration and an equivalent filter module characterizing the added bowtie filtration. An equivalent spectrum was generated for an 80, 100, and 125 kVp beam with beam energy characterized by half-value layer measurements. An equivalent filter description was generated from bowtie profile measurements for both the full- and half-bowtie. Equivalent source models for each combination of equivalent spectrum and filter were incorporated into the Monte Carlo software package MCNPX. Monte Carlo simulations were then validated against in-phantom measurements for both the radiographic and CBCT mode of operation of the kV CBCT system. Radiographic and CBCT imaging dose was measured for a variety of protocols at various locations within a body (32 cm in diameter) and head (16 cm in diameter) CTDI phantom. The in-phantom radiographic and CBCT dose was simulated at all measurement locations and converted to absolute dose using normalization factors calculated from air scan measurements and corresponding simulations. The simulated results were compared with the physical measurements and their discrepancies were assessed quantitatively. Results: Strong agreement was observed between in-phantom simulations and measurements. For the radiographic protocols, simulations uniformly underestimated measurements by 0.54%–5.14% (mean difference = −3.07%, SD = 1.60%). For the CBCT protocols, simulations uniformly
Institute of Scientific and Technical Information of China (English)
Min YUN; Yang LIU; Lian-zhong DENG; Qi ZHOU; Jian-ping YIN
2008-01-01
A new kind of continuous-wave (CW) cold molecular beam, a methyl cyanide (CH3CN) beam, is generated by bent electrostatic quadrupole guiding. The Stark shift of the rotational energy levels of the CH3CN molecule and its population distribution are calculated, and the dynamic processes of electrostatic guiding and energy filtering of CH3CN molecules from a room-temperature (300 K) gas source are simulated by the Monte Carlo method. The study showed that the longitudinal and transverse temperatures of the output cold CH3CN beam could be about 2 K and 420 mK, respectively, and the corresponding guiding efficiency was about 10^-5 for a guiding voltage of 3 kV. Furthermore, the temperature of the guided molecules and the guiding efficiency can be controlled by adjusting the guiding voltages applied to the electrodes.
Park, Dong-wook; Lee, Jai-ki
2016-08-01
For high energy photon beams, solid phantom to water dose conversion factors were calculated using a Monte Carlo method, and the results were compared with measurements and published data. Based on the absorbed-dose-to-water dosimetry protocol, the conversion factor was theoretically divided into stopping power ratios, perturbation factors and ratios of absorbed dose to water to absorbed dose in the solid phantom. Data for a Farmer-type chamber and a solid phantom based on polystyrene, one of the most common materials, were applied to calculate the conversion factors for 6 MV and 15 MV photon beams. All measurements were conducted after 10 Gy pre-irradiation and after thermal equilibrium had been established with the solid slabs in a treatment room. The calculated and the measured conversion factors were in good agreement and could be used to confirm the feasibility of the solid phantom as a substitute for water for high energy photon beams.
Evaluation of a 50-MV photon therapy beam from a racetrack microtron using MCNP4B Monte Carlo code
Energy Technology Data Exchange (ETDEWEB)
Gudowska, I.; Svensson, R. [Karolinska Inst. (Sweden). Dept. of Medical Radiation Physics]|[Huddinge Univ. Hospital, Stockholm (Sweden). Dept. of Medical Physics; Sorcini, B. [Karolinska Inst. (Sweden). Dept. of Medical Radiation Physics]|[Stockholm Univ. (Sweden)
2001-07-01
The high-energy photon therapy beam from the 50 MV racetrack microtron has been evaluated using the Monte Carlo code MCNP4B. The spatial and energy distributions of photons, and the radial and depth dose distributions in the phantom, are calculated for the stationary and scanned photon beams from different targets. The calculated dose distributions are compared with experimental data obtained using a silicon diode detector. Measured and calculated depth-dose distributions are in fairly good agreement, within 2-3% for positions in the range 2-30 cm in the phantom, whereas larger discrepancies of up to 10% are observed in the dose build-up region. For the stationary beams the differences between the calculated and measured radial dose distributions are about 2-10%. (orig.)
LMC: Logarithmantic Monte Carlo
Mantz, Adam B.
2017-06-01
LMC is a Markov Chain Monte Carlo engine in Python that implements adaptive Metropolis-Hastings and slice sampling, as well as the affine-invariant method of Goodman & Weare, in a flexible framework. It can be used for simple problems, but the main use case is problems where expensive likelihood evaluations are provided by less flexible third-party software, which benefit from parallelization across many nodes at the sampling level. The parallel/adaptive methods use communication through MPI, or alternatively by writing/reading files, and mostly follow the approaches pioneered by CosmoMC (ascl:1106.025).
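The random-walk Metropolis-Hastings update that samplers like LMC build on can be sketched in a few lines; this is a toy 1-D standard-normal target, not LMC's actual API:

```python
import math
import random

def metropolis(logpdf, x0, n_steps, step=1.0, seed=1):
    """Random-walk Metropolis-Hastings: propose x' = x + step * N(0, 1)
    and accept with probability min(1, p(x') / p(x))."""
    rng = random.Random(seed)
    x = x0
    chain = []
    for _ in range(n_steps):
        xp = x + step * rng.gauss(0.0, 1.0)
        if rng.random() < math.exp(min(0.0, logpdf(xp) - logpdf(x))):
            x = xp  # accept the proposal; otherwise keep the current state
        chain.append(x)
    return chain

# Target: standard normal, log p(x) = -x^2/2 up to an additive constant
chain = metropolis(lambda x: -0.5 * x * x, 0.0, 50_000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

The adaptive and parallel machinery described in the abstract sits on top of this kernel: tuning `step` from the chain history, or running many such walkers that communicate via MPI or files.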
Duan, Zhe; Barber, Desmond P; Qin, Qing
2015-01-01
With the recently emerging global interest in building a next generation of circular electron-positron colliders to study the properties of the Higgs boson, and other important topics in particle physics at ultra-high beam energies, it is also important to pursue the possibility of implementing polarized beams at this energy scale. It is therefore necessary to set up simulation tools to evaluate the beam polarization at these ultra-high beam energies. In this paper, a Monte-Carlo simulation of the equilibrium beam polarization based on the Polymorphic Tracking Code (PTC) is described. The simulations are for a model storage ring with parameters similar to those of proposed circular colliders in this energy range, and they are compared with the suggestion that there are different regimes for the spin dynamics underlying the polarization of a beam in the presence of synchrotron radiation at ultra-high beam energies. In particular, it has been suggested that the so-called "correlated" crossing of spin resonances ...
Energy Technology Data Exchange (ETDEWEB)
Zucca Aparicio, D.; Perez Moreno, J. M.; Fernandez Leton, P.; Garcia Ruiz-Zorrilla, J.; Minambres Moro, A.
2011-07-01
Today it is common to find commercial planning systems that incorporate Monte Carlo-based dose calculation algorithms for photon beams. This paper summarizes the processes involved in the evaluation of a Monte Carlo dose calculation algorithm for 6 MV and 15 MV photon beams from a Siemens linear accelerator equipped with a 160-leaf collimating system.
Energy Technology Data Exchange (ETDEWEB)
Garcia-Pareja, S.; Galan, P.; Manzano, F.; Brualla, L.; Lallena, A. M. [Servicio de Radiofisica Hospitalaria, Hospital Regional Universitario ' ' Carlos Haya' ' , Avda. Carlos Haya s/n, E-29010 Malaga (Spain); Unidad de Radiofisica Hospitalaria, Hospital Xanit Internacional, Avda. de los Argonautas s/n, E-29630 Benalmadena (Malaga) (Spain); NCTeam, Strahlenklinik, Universitaetsklinikum Essen, Hufelandstr. 55, D-45122 Essen (Germany); Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Granada, E-18071 Granada (Spain)
2010-07-15
Purpose: In this work, the authors describe an approach which has been developed to drive the application of different variance-reduction techniques to the Monte Carlo simulation of photon and electron transport in clinical accelerators. Methods: The new approach considers the following techniques: Russian roulette, splitting, a modified version of directional bremsstrahlung splitting, and azimuthal particle redistribution. Their application is controlled by an ant colony algorithm based on an importance map. Results: The procedure has been applied to radiosurgery beams. Specifically, the authors have calculated depth-dose profiles, off-axis ratios, and output factors, quantities usually considered in the commissioning of these beams. The agreement between Monte Carlo results and the corresponding measurements is within approximately 3%/0.3 mm for the central axis percentage depth dose and the dose profiles. The importance map generated in the calculation can be used to discuss simulation details in the different parts of the geometry in a simple way. The simulation CPU times are comparable to those needed within other approaches common in this field. Conclusions: The new approach is competitive with those previously used in this kind of problem (PSF generation or source models) and has some practical advantages that make it a good tool to simulate radiation transport in problems where the quantities of interest are difficult to obtain because of low statistics.
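The Russian roulette and splitting techniques named above can be illustrated with a minimal weight-based scheme; the thresholds are toy values, and this sketch does not reproduce the authors' ant-colony controller:

```python
import random

def russian_roulette(weight, rng, w_min=0.1, p_survive=0.5):
    """Terminate low-weight particles probabilistically without bias:
    below w_min, the particle survives with probability p_survive and,
    if it does, its weight is divided by p_survive."""
    if weight >= w_min:
        return weight
    if rng.random() < p_survive:
        return weight / p_survive  # boosted weight keeps the estimator unbiased
    return 0.0  # particle killed

def split(weight, n_split=4):
    """Splitting: replace one particle in an important region by n_split
    copies, each carrying weight / n_split (also unbiased)."""
    return [weight / n_split] * n_split

rng = random.Random(7)
# Unbiasedness check: the expected post-roulette weight equals the input weight
mean_w = sum(russian_roulette(0.05, rng) for _ in range(100_000)) / 100_000
```

Both tricks preserve the expected value of the tally while moving simulation effort toward the regions an importance map flags as relevant, which is the role they play in the scheme described above.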
Monte Carlo Simulation of 6MV Elekta Synergy Platform Linac photon beam using Gate/Geant4
Tayalati, Yahya; Zerfaoui, Mustafa; Moussaa, Abdellilah
2013-01-01
The present work is devoted to developing a computational model, using the Gate Monte Carlo software, for the simulation of the 6 MV photon beam delivered by the treatment head of an Elekta Synergy Platform medical linear accelerator. The model includes the major components of the multileaf accelerator head and a homogeneous water phantom. Calculations were performed for a photon beam with several treatment field sizes ranging from 5×5 cm2 to 30×30 cm2 at 100 cm distance from the source. The simulation is successfully validated by comparison with experimental distributions measured at the Regional Hassan II Oncology Center. Good agreement between simulations and measurements was observed, with dose differences of about 1.6% and 1.8% for depth doses and lateral dose profiles, respectively. Gamma index comparisons were also performed, with more than 98% of the points for all simulations passing the standard quality assurance criterion of 3 mm/3%.
Monte Carlo simulation for calculation of fragments produced by 400 MeV/u carbon ion beam in water
Ou, Hai-Feng; Zhang, Bin; Zhao, Shu-Jun
2017-04-01
Monte Carlo simulation is an important approach for obtaining accurate characteristics of radiotherapy. In this work, a 400 MeV/u carbon ion beam incident on a water phantom was simulated with Gate/Geant4 tools. The authors obtained the dose distributions of H, He, Li, Be, B, C and their isotopes in the water phantom, and concluded that the dose from 11C is the main cause of the bump in the total dose curve around 252 mm depth. The authors also studied in detail the dose contribution distributions, yield distributions and average energy distributions of all kinds of fragments. These four distributions are very meaningful for understanding the effect of fragments in carbon ion beam radiotherapy. The simulation method is easy to extend: to obtain a particular result, one may change the particle energy, particle type, target material, target geometry, physics processes, detector, etc.
Morone, M Cristina; Calabretta, Luciano; Cuttone, Giacomo; Fiorini, Francesca
2008-11-07
Proton and carbon ion beams for hadron therapy can be delivered by cyclotrons with a fixed energy. In order to treat patients, an energy degrader along the beam line is used to match the particle range with the target depth. Fragmentation reactions of carbon ions inside the degrader material could introduce a small amount of unwanted contaminants to the beam, giving additional dose to the patient outside the target volume. A simulation study using the FLUKA Monte Carlo code has been carried out considering three different materials as the degrader. Two situations have been studied: a realistic one, lowering the carbon beam energy from 300 MeV/n to 220 MeV/n, corresponding to a range of 10 cm in water, and the worst possible case, lowering the carbon energy to 50 MeV/n, corresponding to the millimeter range. The main component of the contaminants is represented by alpha particles and protons, with a typical momentum after the degrader greater than that of the primary beam; these can be eliminated by the action of a momentum analyzing system and slits, and by a second thin absorber. The residual component of fragments reaching the patient is negligible with respect to the quantity of fragments generated by the primary beam inside the patient before arriving at the end of the target volume.
Energy Technology Data Exchange (ETDEWEB)
Yoriyaz, Helio; Siqueira, Paulo T.D.; Zevallos-Chavez, Juan Y. [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil). Centro de Engenharia Nuclear]. E-mail: hyoriyaz@ipen.br; Furnari, Laura; Poli, Maria Esmeralda R. [Sao Paulo Univ., SP (Brazil). Faculdade de Medicina. Hospital das Clinicas
2005-07-01
Radial dose distributions have been obtained for several electron beam field sizes through Monte Carlo simulation. Measurements were performed with an ionization chamber in a 50×50×50 cm³ water phantom which is routinely used for calibration. Calculated and measured values were compared to adjust the input energy spectra used for the Monte Carlo simulation. The methodology presented here is part of the 'tuning procedure' for the construction of electron beam sources typically used for radiotherapy. (author)
Energy Technology Data Exchange (ETDEWEB)
Palmans, H. [Ghent Univ. (Belgium). Dept. of Biomedical Physics; Verhaegen, F.
1995-12-01
In the last decade, several clinical proton beam therapy facilities have been developed. To satisfy the demand for uniformity in clinical (routine) proton beam dosimetry two dosimetry protocols (ECHED and AAPM) have been published. Both protocols neglect the influence of ion chamber dependent parameters on dose determination in proton beams because of the scatter properties of these beams, although the problem has not been studied thoroughly yet. A comparison between water calorimetry and ionisation chamber dosimetry showed a discrepancy of 2.6% between the former method and ionometry following the ECHED protocol. Possibly, a small part of this difference can be attributed to chamber dependent correction factors. Indications for this possibility are found in ionometry measurements. To allow the simulation of complex geometries with different media necessary for the study of those corrections, an existing proton Monte Carlo code (PTRAN, Berger) has been modified. The original code, which applies Molière's multiple scattering theory and Vavilov's energy straggling theory, calculates depth dose profiles, energy distributions and radial distributions for pencil beams in water. Comparisons with measurements and calculations reported in the literature are done to test the program's accuracy. Preliminary results of the influence of chamber design and chamber materials on dose-to-water determination are presented.
Energy Technology Data Exchange (ETDEWEB)
Marcus, Ryan C. [Los Alamos National Laboratory
2012-07-25
MCMini is a proof of concept that demonstrates the possibility for Monte Carlo neutron transport using OpenCL with a focus on performance. This implementation, written in C, shows that tracing particles and calculating reactions on a 3D mesh can be done in a highly scalable fashion. These results demonstrate a potential path forward for MCNP or other Monte Carlo codes.
Monte Carlo Calculations of Dose to Medium and Dose to Water for Carbon Ion Beams in Various Media
DEFF Research Database (Denmark)
Herrmann, Rochus; Petersen, Jørgen B.B.; Jäkel, Oliver
. The dose to medium (Dm) may however differ from Dw, due to the different particle spectrum and stopping power found therein. Monte Carlo particle transport codes are capable of directly calculating dose to medium (Dm), as was for instance recently investigated by Paganetti 2009 for various proton...... treatment plans. Here, we quantify the effect of dose to water vs. dose to medium for a series of typical target materials found in medical physics. 2 Material and Methods The Monte Carlo code FLUKA [Battistioni et al. 2007] is used to simulate the particle fluence spectrum in a series of target...... the PSTAR and ASTAR stopping power routines available at NIST, and MSTAR provided by H. Paul et al. 3 Results For a pristine carbon ion beam we encountered a maximum deviation between Dw and Dm of up to 8% for bone. In addition we investigate spread-out Bragg peak configurations, which dilute the effect...
Diffenderfer, Eric S; Dolney, Derek; Schaettler, Maximilian; Sanzari, Jenine K; McDonough, James; Cengel, Keith A
2014-03-01
The space radiation environment imposes increased dangers of exposure to ionizing radiation, particularly during a solar particle event (SPE). These events consist primarily of low energy protons that produce a highly inhomogeneous dose distribution. Due to this inherent dose heterogeneity, experiments designed to investigate the radiobiological effects of SPE radiation present difficulties in evaluating and interpreting dose to sensitive organs. To address this challenge, we used the Geant4 Monte Carlo simulation framework to develop dosimetry software that uses computed tomography (CT) images and provides radiation transport simulations incorporating all relevant physical interaction processes. We found that this simulation accurately predicts measured data in phantoms and can be applied to model dose in radiobiological experiments with animal models exposed to charged particle (electron and proton) beams. This study clearly demonstrates the value of Monte Carlo radiation transport methods for two critically interrelated uses: (i) determining the overall dose distribution and dose levels to specific organ systems for animal experiments with SPE-like radiation, and (ii) interpreting the effect of random and systematic variations in experimental variables (e.g. animal movement during long exposures) on the dose distributions and consequent biological effects from SPE-like radiation exposure. The software developed and validated in this study represents a critically important new tool that allows integration of computational and biological modeling for evaluating the biological outcomes of exposures to inhomogeneous SPE-like radiation dose distributions, and has potential applications for other environmental and therapeutic exposure simulations.
Energy Technology Data Exchange (ETDEWEB)
Schach von Wittenau, A.E.; Cox, L.J.; Bergstrom, P.M. Jr.; Hornstein, S.M. [Lawrence Livermore National Lab., CA (United States); Mohan, R.; Libby, B.; Wu, Q. [Medical Coll. of Virginia, Richmond, VA (United States); Lovelock, D.M.J. [Memorial Sloan-Kettering Cancer Center, New York, NY (United States)
1997-03-01
The goal of the PEREGRINE Monte Carlo Dose Calculation Project is to deliver a Monte Carlo package that is both accurate and sufficiently fast for routine clinical use. One of the operational requirements for photon-treatment plans is a fast, accurate method of describing the photon phase-space distribution at the surface of the patient. The open-field case is computationally the most tractable; we know, a priori, for a given machine and energy, the locations and compositions of the relevant accelerator components (i.e., target, primary collimator, flattening filter, and monitor chamber). Therefore, we can precalculate and store the expected photon distributions. For any open-field treatment plan, we then evaluate these existing photon phase-space distributions at the patient's surface, and pass the obtained photons to the dose calculation routines within PEREGRINE. We neglect any effect of the intervening air column, including attenuation of the photons and production of contaminant electrons. In principle, for treatment plans requiring jaws, blocks, and wedges, we could precalculate and store photon phase-space distributions for various combinations of field sizes and wedges. This has the disadvantage that we would have to anticipate those combinations and that subsequently PEREGRINE would not be able to treat other plans. Therefore, PEREGRINE tracks photons through the patient-dependent beam modifiers. The geometric and physics methods used to do this are described here. 4 refs., 8 figs.
Monte Carlo methods for electromagnetics
Sadiku, Matthew NO
2009-01-01
Until now, novices had to painstakingly dig through the literature to discover how to use Monte Carlo techniques for solving electromagnetic problems. Written by one of the foremost researchers in the field, Monte Carlo Methods for Electromagnetics provides a solid understanding of these methods and their applications in electromagnetic computation. Including much of his own work, the author brings together essential information from several different publications. Using a simple, clear writing style, the author begins with a historical background and review of electromagnetic theory. After addressing probability and statistics, he introduces the finite difference method as well as the fixed and floating random walk Monte Carlo methods. The text then applies the Exodus method to Laplace's and Poisson's equations and presents Monte Carlo techniques for handling Neumann problems. It also deals with whole field computation using the Markov chain, applies Monte Carlo methods to time-varying diffusion problems, and ...
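The fixed random walk method described above can be sketched for Laplace's equation on a square grid; the grid size and boundary values below are illustrative, not from the book:

```python
import random

def laplace_walk(x, y, nx, ny, boundary, n_walks=2000, seed=3):
    """Fixed random walk estimate of the solution of Laplace's equation at
    interior grid node (x, y): each walk steps to a random nearest neighbour
    until it hits the boundary; the estimate is the average boundary value."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        i, j = x, y
        while 0 < i < nx and 0 < j < ny:
            di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            i, j = i + di, j + dj
        total += boundary(i, j)
    return total / n_walks

# Square region: top edge held at 100, the other edges at 0
def bc(i, j, ny=10):
    return 100.0 if j == ny else 0.0

u_mid = laplace_walk(5, 5, 10, 10, bc)  # at the centre, symmetry gives 25
```

The method's appeal, as the book emphasizes, is that it yields the potential at a single point without solving the whole field, at the cost of statistical error that shrinks as 1/√n_walks.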
Directory of Open Access Journals (Sweden)
Zahra Anjomani
2011-03-01
Full Text Available Introduction: Nowadays new radiochromic films have an essential role in radiotherapy dosimetry. Properties such as high sensitivity, good reproducibility, high spatial resolution, easy readout and portability have made them attractive for dosimetry, especially in high-dose-gradient regions. Material and Methods: In this study, electron-beam dose distributions in homogeneous and heterogeneous phantoms were calculated using the MCNPX Monte Carlo code and compared with experimental measurements obtained by GAFCHROMIC® EBT film and p-type silicon diode dosimetry. Irradiation was carried out using an Elekta linear accelerator at two different electron energies (8 and 15 MeV), with a 10×10 cm2 applicator and at 100 cm source-to-surface distance. Results: The results show good agreement (within 2%) between radiochromic film measurements and MCNP results. Conclusions: The results show that the new radiochromic films can be used in electron dosimetry and that they are also reliable in the presence of heterogeneous media.
Metropolis Methods for Quantum Monte Carlo Simulations
Ceperley, D. M.
2003-01-01
Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of different ways for the simulation of continuum quantum many-body systems. This paper will consider some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e. diffusion Monte Carlo with rejection), multilevel sampling in path-integral Monte Carlo, the sampling of permutations, ...
Mizutani, Shohei; Takada, Yoshihisa; Kohno, Ryosuke; Hotta, Kenji; Tansho, Ryohei; Akimoto, Tetsuo
2016-03-01
Full Monte Carlo (FMC) calculation of dose distribution has been recognized to have superior accuracy compared with the pencil beam algorithm (PBA). However, since FMC methods require long calculation times, it is difficult to apply them to routine treatment planning at present. In order to improve the situation, a simplified Monte Carlo (SMC) method has been introduced to the dose kernel calculation applicable to the dose optimization procedure for proton pencil beam scanning. We have evaluated the accuracy of the SMC calculation by comparing a result of the dose kernel calculation using the SMC method with that using the FMC method in an inhomogeneous phantom. The dose distribution obtained by the SMC method was in good agreement with that obtained by the FMC method. To assess the usefulness of SMC calculation in clinical situations, we have compared results of the dose calculation using the SMC with those using the PBA method for three clinical cases of tumor treatment. The dose distributions calculated with the PBA dose kernels appear to be homogeneous in the planning target volumes (PTVs). In practice, the dose distributions calculated with the SMC dose kernels with the spot weights optimized with the PBA method show largely inhomogeneous dose distributions in the PTVs, while those with the spot weights optimized with the SMC method have moderately homogeneous distributions in the PTVs. Calculation using the SMC method is faster than that using GEANT4 by three orders of magnitude. In addition, the graphics processing unit (GPU) boosts the calculation speed by 13 times for treatment planning using the SMC method. Hence, the SMC method will be applicable to routine clinical treatment planning for reproduction of the complex dose distribution more accurately than the PBA method in a reasonably short time by use of the GPU-based calculation engine. PACS number(s): 87.55.Gh.
Graves, Yan Jiang; Jia, Xun; Jiang, Steve B
2013-03-21
The γ-index test is commonly adopted to quantify the degree of agreement between a reference dose distribution and an evaluation dose distribution. Monte Carlo (MC) simulation is widely used for radiotherapy dose calculation for both clinical and research purposes. The goal of this work is to investigate, both theoretically and experimentally, the impact of MC statistical fluctuation on the γ-index test when the fluctuation exists in the reference, the evaluation, or both dose distributions. To first-order approximation, we demonstrated theoretically in a simplified model that statistical fluctuation tends to overestimate γ-index values when it exists in the reference dose distribution and to underestimate them when it exists in the evaluation dose distribution, provided the original γ-index is relatively large compared with the statistical fluctuation. Our numerical experiments using realistic clinical photon radiation therapy cases show that (1) when performing a γ-index test between an MC reference dose and a non-MC evaluation dose, the average γ-index is overestimated and the gamma passing rate decreases as the statistical noise level in the reference dose increases; (2) when performing a γ-index test between a non-MC reference dose and an MC evaluation dose, the average γ-index is underestimated within the clinically relevant range and the gamma passing rate increases as the statistical noise level in the evaluation dose increases; (3) when performing a γ-index test between an MC reference dose and an MC evaluation dose, the gamma passing rate is overestimated due to the statistical noise in the evaluation dose and underestimated due to the statistical noise in the reference dose. We conclude that the γ-index test should be used with caution when comparing dose distributions computed with MC simulation.
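The γ-index combines a dose-difference and a distance-to-agreement criterion by taking, for each reference point, the minimum of a combined metric over the evaluation distribution. A minimal 1D sketch follows; the function name and the exhaustive-search implementation are illustrative, not from the paper, and real implementations interpolate the evaluation dose between grid points:

```python
import numpy as np

def gamma_index_1d(x, d_ref, d_eval, dta_mm=3.0, dd_frac=0.03):
    """Simplified global gamma index on a common 1D grid.

    x       : positions in mm
    d_ref   : reference dose at each position
    d_eval  : evaluation dose at each position
    dta_mm  : distance-to-agreement criterion (mm)
    dd_frac : dose-difference criterion, fraction of the reference maximum
    """
    x = np.asarray(x, dtype=float)
    d_eval = np.asarray(d_eval, dtype=float)
    dd_abs = dd_frac * np.max(d_ref)
    gammas = np.empty(len(x))
    for i, (xr, dr) in enumerate(zip(x, d_ref)):
        # Exhaustive minimum of the combined distance/dose metric
        dist2 = ((x - xr) / dta_mm) ** 2
        dose2 = ((d_eval - dr) / dd_abs) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas
```

With identical inputs the index is zero everywhere; statistical noise added to either distribution perturbs it asymmetrically, which is exactly the effect the paper quantifies.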
Tessonnier, T.; Mairani, A.; Brons, S.; Sala, P.; Cerutti, F.; Ferrari, A.; Haberer, T.; Debus, J.; Parodi, K.
2017-08-01
In the field of particle therapy helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water and a spread-out Bragg peak were investigated. After experimentally-driven tuning of the less known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement with range differences below 0.1 mm and dose-weighted average dose-differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation of clinical establishment at HIT. Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo
Fix, Michael K; Cygler, Joanna; Frei, Daniel; Volken, Werner; Neuenschwander, Hans; Born, Ernst J; Manser, Peter
2013-05-07
The electron Monte Carlo (eMC) dose calculation algorithm available in the Eclipse treatment planning system (Varian Medical Systems) is based on the macro MC method and uses a beam model applicable to Varian linear accelerators. This leads to limitations in accuracy if eMC is applied to non-Varian machines. In this work eMC is generalized to also allow accurate dose calculations for electron beams from Elekta and Siemens accelerators. First, changes made in the previous study to use eMC for low electron beam energies of Varian accelerators are applied. Then, a generalized beam model is developed using a main electron source and a main photon source representing electrons and photons from the scattering foil, respectively, an edge source of electrons, a transmission source of photons and a line source of electrons and photons representing the particles from the scrapers or inserts and head scatter radiation. Regarding the macro MC dose calculation algorithm, the transport code of the secondary particles is improved. The macro MC dose calculations are validated with corresponding dose calculations using EGSnrc in homogeneous and inhomogeneous phantoms. The validation of the generalized eMC is carried out by comparing calculated and measured dose distributions in water for Varian, Elekta and Siemens machines for a variety of beam energies, applicator sizes and SSDs. The comparisons are performed in units of cGy per MU. Overall, a general agreement between calculated and measured dose distributions for all machine types and all combinations of parameters investigated is found to be within 2% or 2 mm. The results of the dose comparisons suggest that the generalized eMC is now suitable to calculate dose distributions for Varian, Elekta and Siemens linear accelerators with sufficient accuracy in the range of the investigated combinations of beam energies, applicator sizes and SSDs.
A Monte Carlo pencil beam scanning model for proton treatment plan simulation using GATE/GEANT4
Energy Technology Data Exchange (ETDEWEB)
Grevillot, L; Freud, N; Sarrut, D [Universite de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Universite Lyon 1, Centre Leon Berard, Lyon (France); Bertrand, D; Dessy, F, E-mail: loic.grevillot@creatis.insa-lyon.fr [IBA, B-1348, Louvain-la-Neuve (Belgium)
2011-08-21
This work proposes a generic method for modeling scanned ion beam delivery systems, without simulation of the treatment nozzle and based exclusively on beam data library (BDL) measurements required for treatment planning systems (TPS). To this aim, new tools dedicated to treatment plan simulation were implemented in the Gate Monte Carlo platform. The method was applied to a dedicated nozzle from IBA for proton pencil beam scanning delivery. Optical and energy parameters of the system were modeled using a set of proton depth-dose profiles and spot sizes measured at 27 therapeutic energies. For further validation of the beam model, specific 2D and 3D plans were produced and then measured with appropriate dosimetric tools. Dose contributions from secondary particles produced by nuclear interactions were also investigated using field size factor experiments. Pristine Bragg peaks were reproduced with 0.7 mm range and 0.2 mm spot size accuracy. A 32 cm range spread-out Bragg peak with 10 cm modulation was reproduced with 0.8 mm range accuracy and a maximum point-to-point dose difference of less than 2%. A 2D test pattern consisting of a combination of homogeneous and high-gradient dose regions passed a 2%/2 mm gamma index comparison for 97% of the points. In conclusion, the generic modeling method proposed for scanned ion beam delivery systems was applicable to an IBA proton therapy system. The key advantage of the method is that it only requires BDL measurements of the system. The validation tests performed so far demonstrated that the beam model achieves clinical performance, paving the way for further studies toward TPS benchmarking. The method involves new sources that are available in the new Gate release V6.1 and could be further applied to other particle therapy systems delivering protons or other types of ions like carbon.
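The BDL-only modeling idea can be caricatured as follows: each scanned spot's lateral dose is a Gaussian whose FWHM comes from measured spot sizes at the given energy, and a field is the superposition of weighted spots. This is a hypothetical sketch, not the Gate/GEANT4 source model, which also handles energy spread, depth-dose profiles, and nuclear interaction products:

```python
import numpy as np

# Conversion between FWHM and Gaussian sigma: FWHM = 2*sqrt(2*ln 2)*sigma
FWHM_TO_SIGMA = 1.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))

def spot_lateral_dose(x_mm, y_mm, spot_fwhm_mm, weight=1.0):
    """Lateral dose of one scanned spot, modeled as a normalized 2D
    Gaussian whose FWHM would come from beam-data-library measurements."""
    sigma = spot_fwhm_mm * FWHM_TO_SIGMA
    r2 = x_mm ** 2 + y_mm ** 2
    return weight * np.exp(-r2 / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2)

def field_dose(x_mm, y_mm, spot_positions, spot_fwhm_mm):
    """Superpose equally weighted spots on a scan grid to form a field."""
    return sum(spot_lateral_dose(x_mm - sx, y_mm - sy, spot_fwhm_mm)
               for sx, sy in spot_positions)
```

Because each spot integrates to its weight, optimizing the weights directly shapes the field, which is why spot-size accuracy (0.2 mm in the paper) matters for the lateral dose.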
Erazo, F.; Brualla, L.; Lallena, A. M.
2014-11-01
In this work we calculate the beam quality correction factor k_{Q,Q0} for various plane-parallel ionization chambers. A set of Monte Carlo calculations using the code penelope/penEasy has been carried out to compute the overall correction factor f_{c,Q} for eight electron beams corresponding to a Varian Clinac 2100 C/D, with nominal energies ranging between 6 MeV and 22 MeV; for a 60Co beam, which has been used as the reference quality Q0; and also for eight monoenergetic electron beams reproducing the quality index R50 of the Clinac beams. Two field sizes, 10 × 10 cm² and 20 × 20 cm², have been considered. The k_{Q,Q0} factors have been calculated as the ratio between f_{c,Q} and f_{c,Q0}. Values for the Exradin A10, A11, A11TW, P11, P11TW, T11 and T11TW ionization chambers, manufactured by Standard Imaging, as well as for the NACP-02, have been obtained. The results found with the Clinac beams for the two field sizes analyzed show differences below 0.6%, even in the case of the higher-energy electron beams. The k_{Q,Q0} values obtained with the Clinac beams are 1% larger than those found with the monoenergetic beams for the higher energies, above 12 MeV. This difference can be ascribed to secondary photons produced in the linac head and the air path toward the phantom. Contrary to what was quoted in a previous work (Sempau et al 2004 Phys. Med. Biol. 49 4427-44), the beam quality correction factors obtained with the complete Clinac geometries and with the monoenergetic beams differ significantly for energies above 12 MeV. Material differences existing between chambers that have the same geometry produce non-negligible modifications in the value of these correction factors.
Lectures on Monte Carlo methods
Madras, Neal
2001-01-01
Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati
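The dimension-insensitivity mentioned above follows from the Monte Carlo standard error scaling as 1/√N independently of the number of variables. A small self-contained illustration (the function and integrand are our own, chosen so the exact answer is 0.5 in every dimension):

```python
import numpy as np

def mc_integrate(f, dim, n_samples, rng):
    """Plain Monte Carlo estimate of the integral of f over [0,1]^dim.

    The standard error shrinks as 1/sqrt(n_samples) whatever the
    dimension, which is why the method escapes the curse of
    dimensionality afflicting grid-based quadrature."""
    x = rng.random((n_samples, dim))
    vals = f(x)
    return vals.mean(), vals.std(ddof=1) / np.sqrt(n_samples)

# Example integrand: the mean coordinate, whose exact integral is 0.5
# in every dimension.
rng = np.random.default_rng(0)
mean_coord = lambda x: x.mean(axis=1)
est_2d, err_2d = mc_integrate(mean_coord, 2, 100_000, rng)
est_50d, err_50d = mc_integrate(mean_coord, 50, 100_000, rng)
```

With the same sample budget, the 50-dimensional estimate is no worse than the 2-dimensional one; a tensor-product quadrature rule at comparable cost would be hopeless in 50 dimensions.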
Energy Technology Data Exchange (ETDEWEB)
Lymperopoulou, G. [Nuclear and Particle Physics Section, Physics Department, University of Athens, Panepistimioupolis, Ilisia, 157 71 Athens (Greece)], E-mail: glymper@phys.uoa.gr; Petrokokkinos, L.; Papagiannis, P. [Nuclear and Particle Physics Section, Physics Department, University of Athens, Panepistimioupolis, Ilisia, 157 71 Athens (Greece); Steiner, M.; Spevacek, V.; Semnicka, J.; Dvorak, P. [Czech Technical University, Faculty of Nuclear Sciences and Physical Engineering, Department of Dosimetry and Application of Ionizing Radiation, Brehova 7 115 19, Prague 1 (Czech Republic); Seimenis, I. [Nuclear and Particle Physics Section, Physics Department, University of Athens, Panepistimioupolis, Ilisia, 157 71 Athens (Greece)
2007-09-21
The Leksell Gamma Knife is a stereotactic radiosurgery unit for the treatment of small volumes (on the order of 25 mm³) that employs a hemispherical configuration of 201 60Co sources and appropriate collimation configurations to form beams of 4, 8, 14 and 18 mm nominal diameter at the Unit Center Point (UCP). Although Monte Carlo (MC) simulation is well suited for narrow-beam dosimetry, experimental dosimetry is required at least for acceptance testing and quality assurance purposes. Besides other drawbacks of conventional point dosimeters, the main problems associated with narrow-beam dosimetry in stereotactic applications are accurate positioning and volume averaging. In this work, MCNPX and EGSnrc MC simulation dosimetry results for a Gamma Knife unit are benchmarked through comparison with treatment planning software calculations based on radiochromic film measurements. MC dosimetry results are then used to optimize the only three-dimensional experimental dosimetry method available: the polymer gel-magnetic resonance imaging (MRI) method. MC results are used to select the spatial resolution in the imaging session of the irradiated gels and to validate a mathematical tool for localizing the UCP in the acquired three-dimensional experimental dosimetry data. Experimental results are compared with corresponding MC calculations and shown to be capable of providing accurate dosimetry, free of volume averaging and positioning uncertainties.
Rivera de Mena, Antonio; Crespillo Almenara, Miguel; Olivares Roza, Jimena; García, G.; Argullo Lopez, Fernando
2010-01-01
We present a Monte Carlo approach to the non-radiative exciton-decay model recently proposed to describe ion-beam damage in LiNbO3 produced in the electronic excitation regime. It takes into account the statistical (random) spatial distribution of ion impacts on the crystal surface. The Monte Carlo approach is necessary to simulate the evolution of the damage morphology with irradiation fluence from the single-track regime to the overlapping-track regime. A detailed comparison between the morph...
Comparative Dosimetric Estimates of a 25 keV Electron Micro-beam with three Monte Carlo Codes
Energy Technology Data Exchange (ETDEWEB)
Mainardi, Enrico; Donahue, Richard J.; Blakely, Eleanor A.
2002-09-11
The calculations presented compare the performances of three Monte Carlo codes, PENELOPE-1999, MCNP-4C and PITS, for the evaluation of dose profiles from a 25 keV electron micro-beam traversing individual cells. The overall model of a cell is a water cylinder, equivalent for the three codes, but with a different internal scoring geometry: hollow cylinders for PENELOPE and MCNP, whereas spheres are used for the PITS code. A cylindrical cell geometry with hollow-cylinder scoring volumes was initially selected for PENELOPE and MCNP because it better represents the actual shape and dimensions of a cell and improves computer-time efficiency compared with spherical internal volumes. Some of the energy-transfer points that constitute a radiation track may actually fall in the space between spheres, outside the spherical scoring volume. This internal geometry, along with the PENELOPE algorithm, drastically reduced the computer time for this code compared with event-by-event Monte Carlo codes like PITS. This preliminary work has been important for addressing dosimetric estimates at low electron energies. It demonstrates that codes like PENELOPE can be used for dose evaluation even with such small geometries and low energies, far below the normal use for which the code was created. Further work (initiated in Summer 2002) is still needed, however, to create a user-code for PENELOPE that allows uniform comparison of exact cell geometries, integral volumes and also microdosimetric scoring quantities, a field where track-structure codes like PITS, written for this purpose, are believed to be superior.
Energy Technology Data Exchange (ETDEWEB)
Pavon, Ester Carrasco; Sanchez-Doblado, Francisco; Leal, Antonio; Capote, Roberto; Lagares, Juan Ignacio; Perucha, Maria; Arrans, Rafael [Dpto Fisiologia Medica y Biofisica, Facultad de Medicina, Universidad de Sevilla, Avda Sanchez Pizjuan, 4, E-41009, Sevilla (Spain)]
2003-09-07
Total skin electron therapy (TSET) is a complex technique which requires non-standard measurements and dosimetric procedures. This paper investigates an essential first step towards TSET Monte Carlo (MC) verification. The non-standard 6 MeV 40 × 40 cm² electron beam at a source-to-surface distance (SSD) of 100 cm, as well as its horizontal projection behind a polymethylmethacrylate (PMMA) screen to SSD = 380 cm, were evaluated. The EGS4 OMEGA-BEAM code package running on a home-made 47-PC Linux cluster was used for the MC simulations. Percentage depth-dose curves and profiles were calculated and measured experimentally for the 40 × 40 cm² field at both SSD = 100 cm and the patient surface SSD = 380 cm. The output factor (OF) between the reference 40 × 40 cm² open field and its horizontal projection as a TSET beam at SSD = 380 cm was also measured for comparison with MC results. The accuracy of the simulated beam was validated by the good agreement, to within 2%, between measured relative dose distributions, including the beam characteristic parameters (R50, R80, R100, Rp, E0), and the MC calculated results. The energy spectrum, fluence and angular distribution at different stages of the beam (at SSD = 100 cm, at SSD = 364.2 cm, behind the PMMA beam spoiler screen and at the treatment surface SSD = 380 cm) were derived from MC simulations. Results showed a final decrease in mean energy of almost 56% from the exit window to the treatment surface. A broader angular distribution (the FWHM of the angular distribution increased from 13° at SSD = 100 cm to more than 30° at the treatment surface) was fully attributable to the PMMA beam spoiler screen. OF calculations and measurements agreed to less than 1%. The effect of changing the electron energy cut-off from 0.7 MeV to 0.521 MeV and of air density fluctuations in the bunker, which could affect the MC results, was shown to have a negligible impact on the beam fluence distributions. Results
Monte Carlo integration on GPU
Kanzaki, J.
2010-01-01
We use a graphics processing unit (GPU) for fast computation of Monte Carlo integrations. Two widely used Monte Carlo integration programs, VEGAS and BASES, are parallelized on the GPU. Using $W^{+}$ plus multi-gluon production processes at the LHC, we test integrated cross sections and execution times for programs in FORTRAN and C on the CPU and those on the GPU. Integrated results agree with each other within statistical errors. Programs on the GPU run about 50 times faster than those in C...
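The batched formulation that makes Monte Carlo integration GPU-friendly can be shown on the CPU with array operations. This toy π estimate (our example, not VEGAS/BASES code) evaluates all samples in a single vectorized pass, exactly the pattern that maps onto GPU kernels:

```python
import numpy as np

def estimate_pi(n_samples, rng):
    """Monte Carlo estimate of pi from the quarter-circle area.

    All samples are generated and tested simultaneously as arrays;
    on a GPU each sample would become one thread of the same kernel."""
    pts = rng.random((n_samples, 2))          # uniform points in the unit square
    inside = (pts ** 2).sum(axis=1) <= 1.0    # hit test against the quarter circle
    return 4.0 * inside.mean()
```

The statistical error of such an estimate falls as 1/√N, so the roughly 50× hardware speedup reported in the paper translates into about a 7× smaller error at equal wall-clock time.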
Energy Technology Data Exchange (ETDEWEB)
Belosi, Maria F.; Fogliata, Antonella, E-mail: antonella.fogliata-cozzi@eoc.ch, E-mail: afc@iosi.ch; Cozzi, Luca; Clivio, Alessandro; Nicolini, Giorgia; Vanetti, Eugenio [Oncology Institute of Southern Switzerland, Medical Physics Unit, Bellinzona CH-6500 (Switzerland); Rodriguez, Miguel [Institut de Tècniques Energètiques, Universitat Politècnica de Catalunya, Barcelona E-08028 (Spain); Sempau, Josep [Institut de Tècniques Energètiques, Universitat Politècnica de Catalunya, Barcelona E-08028, Spain and Spanish Networking Research Center CIBER-BBN, Barcelona E-08028 (Spain); Krauss, Harald [Kaiser-Franz-Josef-Spital, Institut für Radioonkologie, Vienna A-1100 (Austria); Khamphan, Catherine [Institut Sainte Catherine, Medical Physics Unit, Avignon F-84000 (France); Fenoglietto, Pascal [Départment de Cancérologie Radiothérapie, CRLC Val d’Aurelle-Paul Lamarque, Montpellier F-34090 (France); Puxeu, Josep [Medical Physics Department, Institut Català d’Oncologia, Barcelona E-08028 (Spain); Fedele, David [Radio-Oncology Department, Casa di Cura San Rossore, Pisa I-56100 (Italy); Mancosu, Pietro [Radiation Oncology Department, Humanitas Clinical and Research Center, Rozzano-Milan I-20089 (Italy); Brualla, Lorenzo [NCTeam, Strahlenklinik, Universitätsklinikum Essen, Essen D-45122 (Germany)
2014-05-15
Purpose: Phase-space files for Monte Carlo simulation of the Varian TrueBeam beams have been made available by Varian. The aim of this study is to evaluate the accuracy of the distributed phase-space files for flattening filter free (FFF) beams against experimental measurements from ten TrueBeam linacs. Methods: The phase-space files have been used as input in PRIMO, a recently released Monte Carlo program based on the PENELOPE code. Simulations of 6 and 10 MV FFF beams were computed in a virtual water phantom for field sizes 3 × 3, 6 × 6, and 10 × 10 cm² using 1 × 1 × 1 mm³ voxels, and for 20 × 20 and 40 × 40 cm² with 2 × 2 × 2 mm³ voxels. The particles contained in the initial phase-space files were transported downstream to a plane just above the phantom surface, where a subsequent phase-space file was tallied. Particles were then transported downstream from this second phase-space file to the water phantom. Experimental data consisted of depth doses and profiles at five different depths acquired at SSD = 100 cm (seven datasets) and SSD = 90 cm (three datasets). Simulations and experimental data were compared in terms of dose difference. Gamma analysis was also performed using 1%, 1 mm and 2%, 2 mm criteria of dose difference and distance to agreement, respectively. Additionally, the parameters characterizing the dose profiles of unflattened beams were evaluated for both measurements and simulations. Results: Analysis of depth-dose curves showed that dose differences increased with increasing field size and depth; this effect might be partly explained by an underestimation of the primary beam energy used to compute the phase-space files. Average dose differences reached 1% for the largest field size. Lateral profiles presented dose differences well within 1% for fields up to 20 × 20 cm², while the discrepancy increased toward 2% in the 40 × 40 cm² cases. Gamma analysis resulted in an agreement of 100% when a 2%, 2 mm criterion
Çatli, Serap
2015-09-08
The high atomic number and density of dental implants lead to major problems in providing an accurate dose distribution in radiotherapy, and the artifacts they produce hamper the contouring of tumors and organs in head and neck cases. The limits and deficiencies of the algorithms used in treatment planning systems can lead to large errors in dose calculation, which may adversely affect the patient's treatment. In the present study, four commercial dental implants were used: pure titanium, titanium alloy (Ti-6Al-4V), amalgam, and crown. The effects of dental implants on dose distribution were determined with two methods: the pencil beam convolution (PBC) algorithm and a Monte Carlo code, for a 6 MV photon beam. The central-axis depth doses were calculated on the phantom for a source-skin distance (SSD) of 100 cm and a 10 × 10 cm² field using both algorithms. The results of the Monte Carlo method and the Eclipse TPS were compared to each other and to those previously reported. In the present study, dose increases were seen with the Monte Carlo method in tissue within 2 mm in front of the dental implants at 6 MV, due to the backscatter of electrons. The Eclipse treatment planning system (TPS) could not precisely account for the backscatter radiation caused by the dental prostheses: the TPS underestimated the backscatter dose and overestimated the dose beyond the dental implants. The large errors found for the TPS in this study are due to the limits and deficiencies of its algorithms. The accuracy of the PBC algorithm of the Eclipse TPS was evaluated against Monte Carlo calculations following the recommendations of the American Association of Physicists in Medicine Radiation Therapy Committee Task Group 65. From the comparisons of the TPS and Monte Carlo calculations, it is verified that Monte Carlo simulation is a good approach to derive the dose distribution in heterogeneous media.
Yamaguchi, Mitsutaka; Nagao, Yuto; Satoh, Takahiro; Sugai, Hiroyuki; Sakai, Makoto; Arakawa, Kazuo; Kawachi, Naoki
2017-01-01
The purpose of this study is to determine whether the main component of the low-energy (63-68 keV) particles emitted perpendicularly to the 12C beam from the 12C-irradiated region in a water phantom is secondary electron bremsstrahlung (SEB). Monte Carlo simulations of a 12C-beam (290 MeV/u) irradiated on a water phantom were performed. A detector was placed beside the water phantom with a lead collimator between the phantom and the detector. To move the Bragg-peak position, a binary filter was placed in an upper stream of the phantom. The energy distributions of the particles incident on the detector and those deposited in the detector were analyzed. The simulation was also performed with suppressed delta-ray and/or bremsstrahlung generation to identify the SEB components. It was found that the particles incident on the detector were predominantly photons and neutrons. The yields of the photons and energy deposition decreased with the suppression of SEB generation. It is concluded that one of the predominant components of the yields in the regions shallower than the Bragg-peak position is due to SEB generation, and these components become significantly smaller in regions deeper than the Bragg-peak position.
Institute of Scientific and Technical Information of China (English)
ZHAO Hong-bin; KONG Xiao-xiao; LI Quan-feng; LIN Xiao-qi; BAO Shang-lian
2009-01-01
Objective: To establish an initial electron beam model for the single 6 MV X-ray accelerating waveguide of the BJ-6 medical linac by combining Monte Carlo simulation with particle dynamics calculation (TRSV). Methods and Materials: (1) The treatment head configuration of the BJ-6 medical linac made by the Beijing Medical Equipment Institute (BMEI) was adopted as the radiation system for this study. (2) The particle dynamics code TRSV was used to derive the initial electron beam parameters: the energy spectrum, the spatial intensity distribution, and the beam incidence angle. (3) The 6 MV X-ray beam characteristics, percentage depth doses (PDDc) and off-axis ratios (OARc) in a water phantom, were analyzed by Monte Carlo simulation (BEAMnrc, DOSXYZnrc) for the initial electron beam parameters determined by TRSV; these were compared with measured results (PDDm, OARm) in a real water phantom, and the deviations were used to adjust the initial electron beam model iteratively until they fell below 2%. Results: The deviations between the Monte Carlo simulation results (PDDc, OARc) and the measured results (PDDm, OARm) in a water phantom were within 2%. Conclusion: When using Monte Carlo simulation to determine the parameters of an initial electron beam for a particular medical linac such as the BJ-6, adjusting some parameters based on a particle dynamics calculation yields more reasonable and acceptable results.
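The stopping rule of the tuning loop described above, deviations between calculated and measured curves below 2%, might be expressed as follows; the normalization to the measured maximum and the function names are our assumptions, since the abstract does not define the deviation precisely:

```python
import numpy as np

def max_percent_deviation(pdd_calc, pdd_meas):
    """Largest deviation between calculated and measured percentage
    depth-dose curves, in percent of the measured maximum
    (hypothetical normalization)."""
    calc = np.asarray(pdd_calc, dtype=float)
    meas = np.asarray(pdd_meas, dtype=float)
    return 100.0 * np.max(np.abs(calc - meas)) / np.max(meas)

def beam_model_accepted(pdd_calc, pdd_meas, tol_percent=2.0):
    """Stop the iterative adjustment of the initial electron-beam
    parameters once the deviation falls below the tolerance."""
    return max_percent_deviation(pdd_calc, pdd_meas) < tol_percent
```

In the paper's workflow, each iteration would rerun BEAMnrc/DOSXYZnrc with adjusted source parameters and reapply this acceptance test to the resulting PDD and OAR curves.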
Schiapparelli, P; Zefiro, D; Taccini, G
2009-05-01
The aim of this work was to evaluate the performance of the voxel-based Monte Carlo algorithm implemented in the commercial treatment planning system ONCENTRA MASTERPLAN for a 9 MeV electron beam produced by a Varian Clinac 2100 C/D linear accelerator. To verify the computed data experimentally, three different groups of tests were planned. The first set was performed in a water phantom to investigate standard fields, custom inserts, and extended treatment distances. The second concerned a standard field, an irregular entrance surface, and oblique incidence in a homogeneous PMMA phantom. The last group involved the introduction of inhomogeneities in a PMMA phantom to simulate high- and low-density materials such as bone and lung. Measurements in water were performed by means of cylindrical and plane-parallel ionization chambers, whereas measurements in PMMA were carried out with radiochromic films. Point dose values were compared in terms of percentage difference, whereas the gamma index tool was used to compare computed and measured dose profiles, considering different tolerances according to test complexity. In the case of transverse scans, agreement was evaluated in the plane formed by the intersection of the beam axis and the profile (2D analysis), while for percentage depth-dose curves only the beam axis was explored (1D analysis). Excellent agreement was found for point dose evaluation in water (discrepancies smaller than 2%). The comparison between planned and measured dose profiles in homogeneous water and PMMA phantoms also showed good results (agreement within 2%/2 mm). Profile evaluation in phantoms with internal inhomogeneities showed good agreement in the case of the "lung" insert, while in tests concerning a small "bone" inhomogeneity a discrepancy was particularly evident in dose values on the beam axis. This is due to the inaccurate geometrical description of the phantom that is linked
Tessonnier, T; Böhlen, T T; Ceruti, F; Ferrari, A; Sala, P; Brons, S; Haberer, T; Debus, J; Parodi, K; Mairani, A
2017-07-31
The introduction of 'new' ion species in particle therapy needs to be supported by a thorough assessment of their dosimetric properties and by treatment planning comparisons with clinically used proton and carbon ion beams. In addition to the latter two ions, helium and oxygen ion beams are foreseen at the Heidelberg Ion Beam Therapy Center (HIT) as potential assets for improving clinical outcomes in the near future. We present in this study a dosimetric validation of a FLUKA-based Monte Carlo treatment planning tool (MCTP) for protons, helium, carbon and oxygen ions for spread-out Bragg peaks in water. The comparisons between the ions show the dosimetric advantages of helium and heavier ion beams in terms of their distal and lateral fall-offs with respect to protons, reducing the lateral size of the region receiving 50% of the planned dose up to 12 mm. However, carbon and oxygen ions showed significant doses beyond the target due to the higher fragmentation tail compared to lighter ions (p and He), up to 25%. The Monte Carlo predictions were found to be in excellent geometrical agreement with the measurements, with deviations below 1 mm for all parameters investigated such as target and lateral size as well as distal fall-offs. Measured and simulated absolute dose values agreed within about 2.5% on the overall dose distributions. The MCTP tool, which supports the usage of multiple state-of-the-art relative biological effectiveness models, will provide a solid engine for treatment planning comparisons at HIT.
Lloyd, Samantha A. M.; Gagne, Isabelle M.; Bazalova-Carter, Magdalena; Zavgorodni, Sergei
2016-12-01
To accurately simulate therapeutic electron beams using Monte Carlo methods, backscatter from the jaws into the monitor chamber must be accounted for via the backscatter factor, Sb. Measured and simulated values of Sb for the TrueBeam are investigated. Two approaches for measuring Sb are presented. Both require service-mode operation with the dose and pulse-forming-network servos turned off in order to assess changes in dose rate with field size. The first approach samples an instantaneous dose rate, while the second times the delivery of a fixed number of monitor units to assess dose rate. Dose rates were measured for 6, 12 and 20 MeV electrons for jaw- or MLC-shaped apertures between 1 × 1 and 40 × 40 cm2. The measurement techniques resulted in values of Sb that agreed within 0.21% for square and asymmetric fields collimated by the jaws. Measured values of Sb were used to calculate the forward dose component in a virtual monitor chamber using BEAMnrc. Based on this forward component, simulated values of Sb were calculated and compared to measurement and to Varian's VirtuaLinac simulations. BEAMnrc results for jaw-shaped fields agreed with measurements and with VirtuaLinac simulations within 0.2%. For MLC-shaped fields, the respective measurement techniques differed by as much as 0.41% and BEAMnrc results differed from measurement by as much as 0.4%; however, all measured and simulated values agreed within experimental uncertainty. Measurement sensitivity was not sufficient to capture the small backscatter effect due to the MLC, and Monte Carlo predicted backscatter from the MLC to be no more than 0.3%. Backscatter from the jaws changed the electron dose rate by up to 2.6%. This reinforces the importance of including a backscatter factor in simulations of electron fields shaped with secondary collimating jaws, but presents the option of ignoring it when the jaws are retracted and collimation is done with the MLC.
Energy Technology Data Exchange (ETDEWEB)
Linares R, H. M.; Laguardia, R. A. [Instituto Superior de Tecnologias y Ciencias Aplicadas, Av. Salvador Allende Esq. Luaces, Quinta de los Molinos, Plaza de la Revolucion, 10600 La Habana (Cuba); Lara M, E., E-mail: elier@inor.sld.cu [Instituto Nacional de Oncologia y Radioterapia, Av. 29 y E. Vedado, 10400 La Habana (Cuba)
2014-08-15
In simulating the accelerator head, determining the parameters that characterize the primary electron beam incident on the target plays a fundamental role in the accuracy of Monte Carlo calculations. Applying the methodology proposed by Pena et al. [2007], this work commissioned the photon beams (6 MV and 15 MV) of an Elekta Precise accelerator using the Monte Carlo code EGSnrc. The influence of the primary electron beam characteristics on the absorbed dose distribution was studied for both energies of this machine. Using different combinations of mean energy and FWHM of the primary electron beam, the dose deposited in a segmented water phantom with its surface at 100 cm from the source was calculated. From the deposited dose, depth-dose curves and dose profiles at different depths were built. These curves were compared with values measured in an experimental arrangement matching the simulation, applying acceptability criteria based on confidence intervals [Venselaar et al. 2001]. As expected, the dose profiles for small fields were strongly influenced by the radial distribution (FWHM). The energy/FWHM combinations that best reproduce the experimental curves of each photon beam were determined. Once the best combinations were found (5.75 MeV/2 mm and 11.25 MeV/2 mm, respectively), they were used for the generation of the phase spaces and the calculation of field factors. Good agreement was obtained between simulations and measurements for a wide range of field sizes, as well as for different types of detectors, with all results within the tolerance margins. (author)
SU-E-T-505: BrainLab Plan Comparisons: Brain Scan Pencil Beam versus IPlan Monte Carlo.
Kowski, M; Edwards, J; Bauer, L; DuBose, R; Powell, H
2012-06-01
Monte Carlo (MC) dose modeling techniques are available in the newest version of BrainLab's iPlan treatment planning system (TPS). Prior to the upgrade, BrainLab's BrainScan was the treatment planning system available at our facility; BrainScan employs pencil beam (PB) modeling. As published in the literature, MC calculations, compared to the PB algorithm, can generate differences in coverage of as much as 20%. With the introduction of the new treatment planning system, treatment parameter comparisons were made with quantitative assessments. Differences due to changes in the dose calculation that could impact patient treatments and outcomes were investigated. Beam data were collected for the new BrainLab TPS iPlan under the conditions outlined in the manufacturer's Version 1.3 data collection, commissioning and acceptance guidelines. Utilizing BrainLab's treatment planning systems, treatment plan comparisons were made. First, plans calculated with pencil beam modeling were compared between the BrainScan and iPlan TPSs. Treatment plans with MC modeling were then compared to the PB plans. Differences in the dose distribution, DVH values, and monitor units were evaluated between the older software (BrainScan) and the newer treatment planning system (iPlan). As predicted by the literature, the differences between MC and PB modeling were significant depending upon the anatomy (tumor site). Modeling comparisons for the treatment plans will be presented for SRS (stereotactic radiosurgery) and stereotactic body radiation therapy (SBRT). Clinical implementation of a new treatment planning system must be approached with caution and with adherence to AAPM recommendations and guidelines. Whenever a new TPS calculation model is introduced, a thorough comparison between former and new models should be obtained. An additional recommended test would be to perform an independent, end-to-end check of the overall system utilizing
Zavgorodni, Sergei; Alhakeem, Eyad; Townson, Reid
2014-02-21
Linac backscattered radiation (BSR) into the monitor chamber affects the chamber's signal and has to be accounted for in radiotherapy dose calculations. In Monte Carlo (MC) calculations, the BSR can be modelled explicitly and accounted for in absolute dose. However, explicit modelling of the BSR becomes impossible if treatment head geometry is not available. In this study, monitor backscatter factors (MBSFs), defined as the ratio of the charge collected in the monitor chamber for a reference field to that of a given field, have been evaluated experimentally and incorporated into MC modelling of linacs with either known or unknown treatment head geometry. A telescopic technique similar to that by Kubo (1989 Med. Phys. 16 295-98) was used. However, instead of lead slits, a 1.8 mm diameter collimator and a small (2 mm diameter) detector positioned at extended source to detector distance were used. This setup provided a field of view to the source of less than 3.1 mm and allowed for MBSF measurements of open fields from 1 × 1 to 40 × 40 cm(2). For the fields with both X and Y dimensions exceeding 15 cm, a diode detector was used. A pinpoint ionization chamber was used for smaller fields. MBSFs were also explicitly modelled in MC calculations using BEAMnrc and DOSXYZnrc codes for 6 and 18 MV beams of a Varian 21EX linac. A method for deriving the D(ch)(forward) values that are used in MC absolute dose calculations was demonstrated. These values were derived from measured MBSFs for two 21EX and four TrueBeam energies. MBSFs were measured for 6 and 18 MV beams from Varian 21EX, and for 6 MV, 10 MV-FFF, 10 MV, and 15 MV beams from Varian TrueBeam linacs. For the open field sizes modelled in this study for the 21EX, the measured MBSFs agreed with MC calculated values within combined statistical (0.4%) and experimental (0.2%) uncertainties. Variation of MBSFs across field sizes was about a factor of two smaller for the TrueBeam compared to 21EX Varian linacs. Measured
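The MBSF definition used in this abstract (reference-field monitor charge over given-field monitor charge) lends itself to a small numeric sketch. The charge values and the servo model below are hypothetical illustrations, not measurements or the authors' method:

```python
def mbsf(charge_ref, charge_field):
    # MBSF as defined in the abstract: monitor-chamber charge for the
    # reference field divided by that for the given field, per equal beam-on.
    return charge_ref / charge_field

def particles_per_mu(particles_per_mu_ref, mbsf_value):
    # Toy servo model: extra jaw backscatter raises the chamber signal per
    # incident particle, so the beam terminates after proportionally fewer
    # particles per monitor unit than for the reference field.
    return particles_per_mu_ref * mbsf_value

# Hypothetical charges: a small field sees ~0.5% more backscatter signal
q_ref, q_small = 1.000, 1.005
f = mbsf(q_ref, q_small)            # < 1 for the small field
n_ref = 1.0e14                      # assumed particles/MU at reference
n_small = particles_per_mu(n_ref, f)
assert n_small < n_ref
```

The direction of the correction (fewer particles per MU when backscatter is larger) follows from the servo terminating beam-on at a fixed integrated chamber signal.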
Multilevel sequential Monte Carlo samplers
Beskos, Alexandros
2016-08-29
In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level hL. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h0 > h1 > ⋯ > hL. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context. That is, relative to exact sampling and Monte Carlo for the distribution at the finest level hL. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
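The telescoping identity at the heart of MLMC can be sketched with a toy target, E[U^2] for U ~ Uniform(0,1), where the level-l "solver" is simply a grid rounding of spacing 2^-l. Everything here is illustrative and unrelated to the PDE setting of the paper:

```python
import random

def P(u, level):
    # Level-l approximation: evaluate u**2 on a grid of spacing 2**-level
    h = 2.0 ** -level
    return (round(u / h) * h) ** 2

def mlmc(L, n_per_level, seed=0):
    """Telescoping MLMC estimator of E[u^2], u ~ U(0,1):
    E[P_L] = E[P_0] + sum_{l=1..L} E[P_l - P_{l-1}],
    with P_l and P_{l-1} coupled through the same uniform draw."""
    rng = random.Random(seed)
    est = sum(P(rng.random(), 0) for _ in range(n_per_level)) / n_per_level
    for l in range(1, L + 1):
        s = 0.0
        for _ in range(n_per_level):
            u = rng.random()               # one draw shared by both levels
            s += P(u, l) - P(u, l - 1)
        est += s / n_per_level
    return est

# Converges toward E[u^2] = 1/3 as L and n_per_level grow
estimate = mlmc(L=6, n_per_level=20000)
```

Each correction term E[P_l - P_{l-1}] has small variance because both approximations share the same draw, which is what makes the telescoping sum cheaper than sampling the finest level directly.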
Righi, Sergio; Karaj, Evis; Felici, Giuseppe; Di Martino, Fabio
2013-01-07
The Novac7 and Liac are linear accelerators (linacs) dedicated to intraoperative radiation therapy (IORT), which produce high-energy, very high dose-per-pulse electron beams. The accelerator heads of the Novac7 and Liac differ from those of conventional electron accelerators. The aim of this work was to investigate the specific characteristics of the Novac7 and Liac electron beams using the Monte Carlo method. The Monte Carlo code BEAMnrc has been employed to model the heads and simulate the electron beams. The Monte Carlo simulation was first validated by comparing the simulated dose distributions with those measured by means of EBT radiochromic film. Then, the energy spectra, mean energy profiles, fluence profiles, photon contamination, and angular distributions were obtained from the Monte Carlo simulation. The Spencer-Attix water-to-air mass restricted collision stopping power ratios (sw,air) were also calculated. Moreover, the modifications of the percentage depth dose in water (backscatter effect) due to the presence of an attenuator plate composed of a sandwich of a 2 mm aluminum foil and a 4 mm lead foil, commonly used for breast treatments, were evaluated. The calculated sw,air values agree with those tabulated in the IAEA TRS-398 dosimetric code of practice within 0.2% and 0.4% at zref (reference depth in water) for the Novac7 and Liac, respectively. These differences are negligible for practical dosimetry. The attenuator plate is sufficient to completely absorb the electron beam at every energy of the Novac7 and Liac; moreover, the shape of the dose distribution in water changes strongly with the introduction of the attenuator plate. This variation depends on the energy of the beam, and it can give rise to an increase in the maximum dose in the range of 3%-9%.
Chakarova, Roumiana; Krantz, Marcus
2014-05-08
The aim is to study beam characteristics at large distances when focusing on the electron component. In particular, to investigate the utility of spoilers with various thicknesses as an electron source, as well as the effect of different spoiler-to-surface distances (STSD) on the beam characteristics and, consequently, on the dose in the superficial region. A MC model of a 15 MV Varian accelerator, validated earlier by experimental data at isocenter and extended distances used in large-field total body irradiation, is applied to evaluate beam characteristics at distances larger than 400 cm. Calculations are carried out using BEAMnrc/DOSXYZnrc code packages and phase space data are analyzed by the beam data processor BEAMdp. The electron component of the beam is analyzed at isocenter and extended distances, with and without spoilers as beam modifiers, assuming vacuum or air surrounding the accelerator head. Spoiler thickness of 1.6 cm is found to be optimal compared to thicknesses of 0.8 cm and 2.4 cm. The STSD variations should be taken into account when treating patients, in particular when the treatment protocols are based on a fixed distance to the patient central sagittal plane, and also, in order to maintain high dose in the superficial region.
Kim, Sung Jin; Kim, Sung Kyu
2015-01-01
Treatment planning system calculations in inhomogeneous regions may present significant inaccuracies due to loss of electronic equilibrium. In this study, three different dose calculation algorithms provided by our planning systems, pencil beam (PB), collapsed cone (CC), and Monte Carlo (MC), were compared to assess their impact on the three-dimensional planning of lung and breast cases. A total of five breast and five lung cases were calculated using the PB, CC, and MC algorithms. Planning target volume and organ-at-risk (OAR) delineation was performed according to our institution's protocols on the Oncentra MasterPlan image registration module, on 0.3 to 0.5 cm computed tomography slices taken under normal respiration conditions. Four intensity-modulated radiation therapy plans were calculated according to each algorithm for each patient. The plans were produced on the Oncentra MasterPlan and CMS Monaco treatment planning systems, for 6 MV. The plans were compared in terms of the dose distribution in target, OAR volumes, and...
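Plan comparisons of this kind typically reduce to DVH values. A minimal cumulative-DVH computation, with made-up voxel doses rather than data from the study, looks like:

```python
def cumulative_dvh(doses, bin_width=0.5):
    """Cumulative dose-volume histogram from per-voxel doses (Gy):
    the fraction of the structure volume receiving at least each dose
    level. Purely illustrative; the voxel doses below are invented."""
    max_dose = max(doses)
    levels, volumes = [], []
    d = 0.0
    while d <= max_dose:
        levels.append(d)
        volumes.append(sum(1 for v in doses if v >= d) / len(doses))
        d += bin_width
    return levels, volumes

# Hypothetical per-voxel target doses (Gy) from one plan
voxel_doses = [58.1, 59.4, 60.2, 60.8, 61.0, 59.9, 60.5, 57.8]
levels, volumes = cumulative_dvh(voxel_doses)
```

Comparing two algorithms then amounts to comparing `volumes` at clinically relevant `levels` (e.g. the dose covering 95% of the volume) between the PB, CC, and MC plans.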
Zavgorodni, Sergei; Townson, Reid
2013-01-01
Objectives: Linac backscattered radiation (BSR) into the monitor chamber affects the chamber signal and has to be accounted for in radiotherapy dose calculations. In Monte Carlo (MC) calculations BSR can be modeled explicitly and incorporated into absolute dose. However, explicit modeling of BSR becomes impossible if treatment head geometry is not available. In this study, monitor backscatter factors (MBSFs), defined as the ratio of the charge collected in the monitor chamber for a reference field to that of a given field, have been evaluated experimentally and incorporated into MC modeling. Materials and methods: A telescopic technique similar to that by Kubo (1989) was used. However, instead of lead slits, a 1.8 mm diameter collimator and a PTW pinpoint ionization chamber positioned at extended SDD were used. These provided a field of view to the source of less than 3.1 mm. MBSFs were also explicitly modeled in MC calculations using BEAMnrc and DOSXYZnrc codes for 6MV and 18MV beams of a Varian 21EX linac, ...
Comparative Dosimetric Estimates of a 25 keV Electron Micro-beam with three Monte Carlo Codes
Mainardi, E; Donahue, R J
2002-01-01
The calculations presented compare the performance of three Monte Carlo codes, PENELOPE-1999, MCNP-4C and PITS, for the evaluation of dose profiles from a 25 keV electron micro-beam traversing individual cells. The overall model of a cell is an equivalent water cylinder for the three codes, but with different internal scoring geometries: hollow cylinders for PENELOPE and MCNP, whereas spheres are used for the PITS code. A cylindrical cell geometry with scoring volumes shaped as hollow cylinders was initially selected for PENELOPE and MCNP because it better reproduces the actual shape and dimensions of a cell and improves computer-time efficiency compared to spherical internal volumes. Some of the transfer points and energy transfers that constitute a radiation track may actually fall in the space between spheres, outside the spherical scoring volumes. This internal geometry, along with the PENELOPE algorithm, drastically reduced the computer time when using ...
Kohno, R; Hotta, K; Nishioka, S; Matsubara, K; Tansho, R; Suzuki, T
2011-11-21
We implemented the simplified Monte Carlo (SMC) method on a graphics processing unit (GPU) architecture under the Compute Unified Device Architecture (CUDA) platform developed by NVIDIA. The GPU-based SMC was clinically applied for four patients with head and neck, lung, or prostate cancer. The results were compared to those obtained by a traditional CPU-based SMC with respect to computation time and discrepancy. In the CPU- and GPU-based SMC calculations, the estimated mean statistical errors of the calculated doses in the planning target volume region were within 0.5% rms. The dose distributions calculated by the GPU- and CPU-based SMCs were similar, within statistical errors. The GPU-based SMC was 12.30-16.00 times faster than the CPU-based SMC. The computation time per beam arrangement using the GPU-based SMC for the clinical cases ranged from 9 to 67 s. The results demonstrate the successful application of the GPU-based SMC to clinical proton treatment planning.
Czarnecki, Damian; Poppe, Björn; Zink, Klemens
2017-06-01
The impact of removing the flattening filter in clinical linear accelerators on the relationship between dosimetric quantities such as beam quality specifiers and the mean photon and electron energies of the photon radiation field was investigated by Monte Carlo simulations. The purpose of this work was to determine the uncertainties when using the well-known beam quality specifiers, or energy-based beam specifiers, as predictors of dosimetric photon field properties when removing the flattening filter. Monte Carlo simulations applying eight different linear accelerator head models with and without flattening filter were performed in order to generate realistic radiation sources and calculate field properties such as restricted water-to-air mass collision stopping power ratios (L̄/ρ)water,air and mean photon and secondary electron energies. To study the impact of removing the flattening filter on the beam quality correction factor kQ, this factor was calculated by Monte Carlo simulations for detailed ionization chamber models. Stopping power ratios and kQ values for different ionization chambers were calculated as functions of TPR20,10 and %dd(10)x. Moreover, mean photon energies in air and at the point of measurement in water, as well as mean secondary electron energies at the point of measurement, were calculated. The results revealed that removing the flattening filter led to a change within 0.3% in the relationship between %dd(10)x and (L̄/ρ)water,air, whereas the relationship between TPR20,10 and (L̄/ρ)water,air changed by up to 0.8% for high-energy photon beams. However, TPR20,10 was a good predictor of (L̄/ρ)water,air for both types of linear accelerator. The mean photon energy below the linear accelerator head, as well as at the point of measurement, may not be suitable as a predictor of (L̄/ρ)water,air and kQ to merge the dosimetry of both linear accelerator types. It was possible to derive (L̄/ρ)water,air using the mean secondary electron energy
Equilibrium Statistics: Monte Carlo Methods
Kröger, Martin
Monte Carlo methods use random numbers, or 'random' sequences, to sample from a distribution of known shape, or to extract a distribution by other means, and, in the context of this book, to (i) generate representative equilibrated samples prior to being subjected to external fields, or (ii) evaluate high-dimensional integrals. Recipes for both topics, and some more general methods, are summarized in this chapter. It is important to realize that Monte Carlo should be as artificial as possible to be efficient and elegant. Advanced Monte Carlo 'moves', required to optimize the speed of algorithms for a particular problem at hand, are outside the scope of this brief introduction. One particular modern example is the wavelet-accelerated MC sampling of polymer chains [406].
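As a concrete instance of use (ii), a few lines suffice to estimate a high-dimensional integral, here the volume of the unit ball by uniform sampling of its enclosing cube. This sketch is illustrative and not taken from the chapter:

```python
import random

def mc_volume_ball(dim, n, seed=1):
    """Estimate the volume of the unit ball in `dim` dimensions by
    uniform sampling of the enclosing cube [-1, 1]^dim -- the kind of
    high-dimensional integral Monte Carlo handles gracefully."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # Accept the point if it lies inside the unit ball
        if sum(rng.uniform(-1.0, 1.0) ** 2 for _ in range(dim)) <= 1.0:
            hits += 1
    return (2.0 ** dim) * hits / n   # cube volume times hit fraction

# In 2 dimensions the estimate approaches pi
area = mc_volume_ball(2, 100_000)
```

The statistical error shrinks as 1/sqrt(n) regardless of `dim`, whereas a deterministic quadrature grid of comparable cost degrades rapidly with dimension.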
Mahady, Kyle; Tan, Shida; Greenzweig, Yuval; Livengood, Richard; Raveh, Amir; Rack, Philip
2017-01-01
We present an updated version of our Monte-Carlo based code for the simulation of ion beam sputtering. This code simulates the interaction of energetic ions with a target, and tracks the cumulative damage, enabling it to simulate the dynamic evolution of nanostructures as material is removed. The updated code described in this paper is significantly faster, permitting the inclusion of new features, namely routines to handle interstitial atoms, and to reduce the surface energy as the structure would otherwise develop energetically unfavorable surface porosity. We validate our code against the popular Monte-Carlo code SRIM-TRIM, and study the development of nanostructures from Ne+ ion beam milling in a copper target.
Monte Carlo Hamiltonian: Linear Potentials
Institute of Scientific and Technical Information of China (English)
LUO Xiang-Qian; LIU Jin-Jiang; HUANG Chun-Qing; JIANG Jun-Qin; Helmut KROGER
2002-01-01
We further study the validity of the Monte Carlo Hamiltonian method. The advantage of the method, in comparison with the standard Monte Carlo Lagrangian approach, is its capability to study the excited states. We consider two quantum mechanical models: a symmetric one, V(x) = |x|/2; and an asymmetric one, V(x) = ∞ for x < 0 and V(x) = x for x ≥ 0. The results for the spectrum, wave functions and thermodynamical observables are in agreement with the analytical or Runge-Kutta calculations.
Proton Upset Monte Carlo Simulation
O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.
2009-01-01
The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment, based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (>200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled, and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.
Schiavi, A.; Senzacqua, M.; Pioli, S.; Mairani, A.; Magro, G.; Molinelli, S.; Ciocca, M.; Battistoni, G.; Patera, V.
2017-09-01
Ion beam therapy is a rapidly growing technique for tumor radiation therapy. Ions allow for a high dose deposition in the tumor region, while sparing the surrounding healthy tissue. For this reason, the highest possible accuracy in the calculation of dose and its spatial distribution is required in treatment planning. On one hand, commonly used treatment planning software solutions adopt a simplified beam–body interaction model by remapping pre-calculated dose distributions into a 3D water-equivalent representation of the patient morphology. On the other hand, Monte Carlo (MC) simulations, which explicitly take into account all the details in the interaction of particles with human tissues, are considered to be the most reliable tool to address the complexity of mixed field irradiation in a heterogeneous environment. However, full MC calculations are not routinely used in clinical practice because they typically demand substantial computational resources. Therefore MC simulations are usually only used to check treatment plans for a restricted number of difficult cases. The advent of general-purpose programming GPU cards prompted the development of trimmed-down MC-based dose engines which can significantly reduce the time needed to recalculate a treatment plan with respect to standard MC codes in CPU hardware. In this work, we report on the development of fred, a new MC simulation platform for treatment planning in ion beam therapy. The code can transport particles through a 3D voxel grid using a class II MC algorithm. Both primary and secondary particles are tracked and their energy deposition is scored along the trajectory. Effective models for particle–medium interaction have been implemented, balancing accuracy in dose deposition with computational cost. Currently, the most refined module is the transport of proton beams in water: single pencil beam dose–depth distributions obtained with fred agree with those produced by standard MC codes within 1–2% of
Elmekawy, Ahmed Farouk
The distal edge of therapeutic proton radiation beams was investigated by different methods. Proton beams produced at the Hampton University Proton Therapy Institute (HUPTI) were used to irradiate a polymethylmethacrylate (PMMA) phantom for three different ranges (13.5, 17.0 and 21.0 cm) to investigate the distal slope dependence of the Bragg peak. The activation of 11C was studied by scanning the phantom less than 10 minutes post-irradiation with a Philips Big Bore Gemini PET/CT. The DICOM images were imported into the Varian Eclipse Treatment Planning System (TPS) and then analyzed with ImageJ. The distal slope ranged from -0.1671 +/- 0.0036 to -0.1986 +/- 0.0052 (pixel intensity/slice number) for ranges of 13.5 to 21.0 cm, respectively. A realistic description of the setup was modeled using the GATE 7.0 Monte Carlo simulation tool and compared to the experimental data. The results show the distal slope ranged from -0.1158 +/- 0.0133 to -0.0787 +/- 0.002 (Gy/mm). Additionally, low-activity 11C sources were simulated to study the dependence of the reconstructed 11C half-life on the initial activity for six ranges chosen around the previous activation study. The results of the expected/nominal half-life vs. activity ranged from -5 x 10^-4 +/- 2.8104 x 10^-4 to 1.6 x 10^-3 +/- 9.44 x 10^-4 (%diff./Bq). The comparison between two experiments with proton beams on a PMMA phantom and a multi-layer ionization chamber, and two GATE simulations of a proton beam incident on a water phantom and an 11C PET study, shows that: (i) the variation in the steepness of the distal fall-off slopes is similar, thus validating the sensitivity of the PET technique to the range degradation, and (ii) the average of the super-ratio differences observed between all studies is primarily due to the difference in the dose deposited in the media.
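The distal-slope figures quoted above come down to a straight-line fit of intensity versus depth over the fall-off region. A minimal least-squares sketch, with synthetic data rather than the measured PET profiles, is:

```python
def distal_slope(xs, ys):
    """Ordinary least-squares slope of intensity vs. depth over the
    distal fall-off region (illustrative; the data below are invented)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic fall-off: intensity drops 0.17 per slice
slices = [100, 101, 102, 103, 104]
intensity = [1.00, 0.83, 0.66, 0.49, 0.32]
slope = distal_slope(slices, intensity)
```

In practice one would restrict the fit to the slices between, say, 80% and 20% of the peak intensity before extracting the slope.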
Beam neutron energy optimization for boron neutron capture therapy using Monte Carlo method
Ali Pazirandeh; Elham Shekarian
2006-01-01
In the last two decades the optimal neutron energy for the treatment of deep-seated tumors in boron neutron capture therapy, in view of neutron physics and the chemical compounds of the boron carrier, has been under thorough study. Although the neutron absorption cross section of boron is high (3836 b), the treatment of deep-seated tumors such as glioblastoma multiforme (GBM) requires a beam of neutrons of higher energy that can penetrate deeply into the brain and thermalize in the proximity of the tumor. Dosage...
Monte Carlo Particle Lists: MCPL
Kittelmann, Thomas; Knudsen, Erik B; Willendrup, Peter; Cai, Xiao Xiao; Kanaki, Kalliopi
2016-01-01
A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
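The idea, fixed-width binary records of particle state plus small portable readers, can be mimicked in a few lines. The field layout below is purely illustrative and is not the actual MCPL record format:

```python
import struct
import io

# One fixed-width record per particle: PDG code, position (cm),
# direction cosines, kinetic energy (MeV), statistical weight.
RECORD = struct.Struct("<i3d3ddd")  # hypothetical layout, little-endian

def write_particles(stream, particles):
    for p in particles:
        stream.write(RECORD.pack(p["pdg"], *p["pos"], *p["dir"],
                                 p["ekin"], p["weight"]))

def read_particles(stream):
    out = []
    while True:
        chunk = stream.read(RECORD.size)
        if not chunk:
            break
        f = RECORD.unpack(chunk)
        out.append({"pdg": f[0], "pos": f[1:4], "dir": f[4:7],
                    "ekin": f[7], "weight": f[8]})
    return out

# Round-trip a single 150 MeV proton through an in-memory "file"
buf = io.BytesIO()
proton = {"pdg": 2212, "pos": (0.0, 0.0, 0.0),
          "dir": (0.0, 0.0, 1.0), "ekin": 150.0, "weight": 1.0}
write_particles(buf, [proton])
buf.seek(0)
assert read_particles(buf)[0]["pdg"] == 2212
```

The real MCPL format additionally carries a file header and more compact packing of the direction vector; the point here is only the portable fixed-record design that lets different simulation codes exchange particle lists.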
A Monte Carlo code to optimize the production of Radioactive Ion Beams by the ISOL technique
Santana-Leitner, M
2005-01-01
Currently the nuclear chart includes around 3000 nuclides, distributed as β+, β- and α emitters, stable and spontaneously fissioning isotopes. A similar number of unknown nuclei belongs to the so-called terra incognita, the uncertain region contained within the proton, neutron and (fast) fission driplines and thereby stable against nucleon emission. The exploration of this zone is to be assisted by the use of radioactive ion beams (RIB) and could provide a new understanding of several nuclear properties. Moreover, besides pointing at crucial questions such as the validity of the shell model, dilute matter and the halo structure, challenging experiments outside nuclear physics are also addressed, e.g., explanations of the nucleosynthesis processes that may justify why the matter in the universe has evolved to the present proportions of elements, which represents a major challenge to nuclear physics. These, together with other fascinating research lines in particle physi...
Łukomska, Sandra; Kukołowicz, Paweł; Zawadzka, Anna; Gruda, Mariusz; Giżyńska, Marta; Jankowska, Anna; Piziorska, Maria
2016-09-01
The aim of the study was to verify the accuracy of dose distribution calculations for electron beams performed using the electron Monte Carlo (eMC) v.10.0.28 algorithm implemented in the Eclipse treatment planning system (Varian Medical Systems). The study was carried out in two stages. In the first stage, the influence of several user-defined parameters on the calculation accuracy was assessed. After selecting the set of parameters for which the best results were obtained, a series of tests was carried out in accordance with the recommendations of the Polish Society of Medical Physics (PSMP). Calculated and measured dose rates under reference conditions were compared for semi quadratic fields and for fields shaped by individual cut-outs. We compared the calculated and measured percent depth doses, profiles and output factors for beams with energies of 6, 9, 12, 15 and 18 MeV, for semi quadratic fields and for three SSDs of 100, 110, and 120 cm. All tests were carried out for beams generated by a Varian Clinac 2300CD linear accelerator. The results of the first stage demonstrated that the highest agreement between calculations and measurements was obtained for a mean statistical uncertainty equal to 1 and with the statistical-noise smoothing parameter set to medium. The comparisons showed similar agreement between calculations and measurements for calculation grids of 0.1 cm and 0.25 cm, and therefore the remainder of the study was carried out for these two grids. In stage 2 it was demonstrated that the 0.1 cm calculation grid yields better agreement between calculations and measurements. For energies of 12, 15 and 18 MeV, discrepancies between calculations and measurements in most cases did not exceed the PSMP action levels. The biggest differences between measurements and calculations were obtained for the 6 MeV energy, for
Energy Technology Data Exchange (ETDEWEB)
Wang, Z [Reading Hospital, West Reading, PA (United States); Gao, M [ProCure Treatment Centers, Warrenville, IL (United States)
2014-06-01
Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to the few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4-based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single-spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2–100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm², 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and the worker nodes as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.
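The job-splitting arithmetic described above (millions of histories divided into independent, separately seeded jobs, then pooled to drive down the statistical uncertainty) can be sketched with a toy tally. The scoring function here is a hypothetical stand-in for a real GATE/GEANT4 dose score, and the event counts are scaled down for illustration:

```python
import math
import random

def simulate_job(n_events, seed):
    """One cluster job: a toy tally standing in for a GATE/GEANT4 dose score.
    Each job gets its own seed so jobs are statistically independent."""
    rng = random.Random(seed)
    total = sum(rng.expovariate(1.0) for _ in range(n_events))
    return total, n_events

# Split the workload into independent jobs with distinct seeds, as on the
# 40-node cluster in the abstract (event counts scaled down here).
jobs = [simulate_job(5_000, seed) for seed in range(20)]
pooled_sum = sum(t for t, _ in jobs)
n_total = sum(m for _, m in jobs)
mean_score = pooled_sum / n_total
# For a pooled tally, the relative statistical uncertainty shrinks like 1/sqrt(N),
# which is why pooling many cheap jobs reaches the < 2% target.
rel_uncertainty = 1.0 / math.sqrt(n_total)
```

Because the jobs are independent, pooling is just summation; the same pattern applies whether the workers are local cores or EC2 spot instances.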
Applications of Monte Carlo Methods in Calculus.
Gordon, Sheldon P.; Gordon, Florence S.
1990-01-01
Discusses the application of probabilistic ideas, especially Monte Carlo simulation, to calculus. Describes some applications using the Monte Carlo method: Riemann sums; maximizing and minimizing a function; mean value theorems; and testing conjectures. (YP)
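The Riemann-sum application mentioned above can be sketched as mean-value Monte Carlo integration: sample the integrand at uniform random points and scale the average by the interval length. The function name and parameters are illustrative, not taken from the article:

```python
import random

def mc_integral(f, a, b, n=100_000, seed=1):
    """Mean-value Monte Carlo estimate of the integral of f over [a, b]:
    (b - a) times the average of f at n uniformly sampled points."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# The exact integral of x^2 on [0, 1] is 1/3; the estimate converges as 1/sqrt(n).
estimate = mc_integral(lambda x: x * x, 0.0, 1.0)
```

The same sampling loop, with max() in place of the running sum, gives the article's maximization application.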
Tissue classifications in Monte Carlo simulations of patient dose for photon beam tumor treatments
Lin, Mu-Han [Department of Biomedical Engineering and Environmental Sciences, National Tsing Hua University, 101 Sec. 2, Kung Fu Road, Hsinchu 30013, Taiwan (China); Chao, Tsi-Chian [Department of Medical Imaging and Radiological Sciences, Chang Gung University, 259 Wen-Hwa 1st Road, Kwei-Shan, Tao-Yuan 333, Taiwan (China); Lee, Chung-Chi [Department of Medical Imaging and Radiological Sciences, Chang Gung University, 259 Wen-Hwa 1st Road, Kwei-Shan, Tao-Yuan 333, Taiwan (China); Department of Radiation Oncology, Chang Gung Memorial Hospital, 5 Fu-Hsin Street, Kwei-Shan, Tao-Yuan 333, Taiwan (China); Tung-Chieh Chang, Joseph [Department of Radiation Oncology, Chang Gung Memorial Hospital, 5 Fu-Hsin Street, Kwei-Shan, Tao-Yuan 333, Taiwan (China); Tung, Chuan-Jong, E-mail: cjtung@mail.cgu.edu.t [Department of Medical Imaging and Radiological Sciences, Chang Gung University, 259 Wen-Hwa 1st Road, Kwei-Shan, Tao-Yuan 333, Taiwan (China)
2010-07-21
The purpose of this work was to study the calculated dose uncertainties induced by the material classification that determined the interaction cross-sections and the water-to-material stopping-power ratios. Calculations were made for a head- and neck-cancer patient treated with five intensity-modulated radiotherapy fields using 6 MV photon beams. The patient's CT images were reconstructed into two voxelized patient phantoms based on different CT-to-material classification schemes. Comparisons of the depth-dose curve of the anterior-to-posterior field and the dose-volume-histogram of the treatment plan were used to evaluate the dose uncertainties from such schemes. The results indicated that any misassignment of tissue materials could lead to a substantial dose difference, which would affect the treatment outcome. To assure an appropriate material assignment, it is desirable to have different conversion tables for various parts of the body. The assignment of stopping-power ratio should be based on the chemical composition and the density of the material.
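A CT-to-material classification scheme of the kind the study compares can be sketched as a lookup over Hounsfield-number intervals. The thresholds and material names below are illustrative assumptions, not the conversion tables evaluated in the paper:

```python
# Hypothetical CT-number-to-material conversion table (half-open HU intervals).
# The thresholds are illustrative only, not the scheme used in the study.
MATERIAL_TABLE = [
    (-1000, -950, "air"),
    (-950, -120, "lung"),
    (-120, 50, "soft tissue"),
    (50, 300, "spongy bone"),
    (300, 3000, "cortical bone"),
]

def classify_voxel(hu):
    """Map a voxel's Hounsfield number to a material name for the MC phantom."""
    for lo, hi, name in MATERIAL_TABLE:
        if lo <= hu < hi:
            return name
    return "undefined"
```

The paper's point is that two such tables can assign different materials (hence different cross-sections and stopping-power ratios) to the same voxel, which is where the dose differences arise.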
Kakade, Nitin Ramesh; Sharma, Sunil Dutt
2015-01-01
Gold nanoparticle (GNP)-aided radiation therapy (RT) is useful to make the tumor more sensitive to radiation damage because of the enhancement in the dose inside the tumor region. Polymer gel dosimeter (PGD) can be a good choice for the physical measurement of dose enhancement produced by GNP inside the gel. The present study uses EGSnrc Monte Carlo code to estimate dose enhancement factor (DEF) due to the introduction of GNPs inside the PGD at different concentrations (7 and 18 mg Au/g of gel) when irradiated by therapeutic X-rays of energy 100 kVp, 150 kVp, 6 MV, and 15 MV. The simulation was also carried out to quantify the dose enhancement in PAGAT gel and tumor for 100 kVp X-rays. For 100 kVp X-rays, average DEF of 1.86 and 2.91 is observed in the PAGAT gel dosimeter with 7 and 18 mg Au/g of gel, respectively. Average DEF of 1.69 and 2.61 is recorded for 150 kVp X-rays with 7 and 18 mg Au/g of gel, respectively. No clinically meaningful DEF was observed for 6 and 15 MV photon beams. Furthermore, the dose enhancement within the PAGAT gel dosimeter and tumor closely matches with each other. The polymer gel dosimetry can be a suitable method of dose estimation and verification for clinical implementation of GNP-aided RT. GNP-aided RT has the potential of delivering high localized tumoricidal dose with significant sparing of normal structures when the treatment is delivered with low energy X-rays.
Kang, Sei-Kwon; Yoon, Jai-Woong; Hwang, Taejin; Park, Soah; Cheong, Kwang-Ho; Jin Han, Tae; Kim, Haeyoung; Lee, Me-Yeon; Ju Kim, Kyoung, E-mail: kjkim@hallym.or.kr; Bae, Hoonsik
2015-10-01
A metallic contact eye shield has sometimes been used for eyelid treatment, but dose distribution has never been reported for a patient case. This study aimed to show the shield-incorporated CT-based dose distribution using the Pinnacle system and Monte Carlo (MC) calculation for 3 patient cases. For the artifact-free CT scan, an acrylic shield machined as the same size as that of the tungsten shield was used. For the MC calculation, BEAMnrc and DOSXYZnrc were used for the 6-MeV electron beam of the Varian 21EX, in which information for the tungsten, stainless steel, and aluminum material for the eye shield was used. The same plan was generated on the Pinnacle system and both were compared. The use of the acrylic shield produced clear CT images, enabling delineation of the regions of interest, and yielded CT-based dose calculation for the metallic shield. Both the MC and the Pinnacle systems showed a similar dose distribution downstream of the eye shield, reflecting the blocking effect of the metallic eye shield. The major difference between the MC and the Pinnacle results was the target eyelid dose upstream of the shield such that the Pinnacle system underestimated the dose by 19 to 28% and 11 to 18% for the maximum and the mean doses, respectively. The pattern of dose difference between the MC and the Pinnacle systems was similar to that in the previous phantom study. In conclusion, the metallic eye shield was successfully incorporated into the CT-based planning, and the accurate dose calculation requires MC simulation.
Monte Carlo simulation for the sputtering yield of Si3N4 thin film milled by focused ion beams
Institute of Scientific and Technical Information of China (English)
TAN Yong-wen; SONG Yu-min; ZHOU Peng; WANG Cheng-yu; YANG Hai
2008-01-01
The sputtering yield of the Si3N4 thin film is calculated by the Monte Carlo method with different parameters. The dependences of the sputtering yield on the incident ion energy, the incident angle and the number of gallium (Ga) and arsenic (As) ions are predicted. The abnormal sputtering yield for As at 90 keV occurs when the incident angle reaches the range between 82° and 84°.
Bandura, L., E-mail: bandura@msu.ed [Argonne National Laboratory, Argonne, IL 60439 (United States); Erdelyi, B. [Argonne National Laboratory, Argonne, IL 60439 (United States); Northern Illinois University, DeKalb, IL 60115 (United States); Nolen, J. [Argonne National Laboratory, Argonne, IL 60439 (United States)
2010-12-01
An integrated beam optics-nuclear processes framework is essential for accurate simulation of fragment separator beam dynamics. The code COSY INFINITY provides powerful differential algebraic methods for modeling and beam dynamics simulations in the absence of beam-material interactions. However, these interactions are key to accurately simulating the dynamics of heavy-ion fragmentation and fission. We have developed an extended version of the code that includes these interactions, and a set of new tools that allow efficient and accurate particle transport: by transfer map in vacuum and by Monte Carlo methods in materials. The new framework is presented, along with several examples from a preliminary layout of a fragment separator for a facility for rare isotope beams.
(U) Introduction to Monte Carlo Methods
Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-03-20
Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
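The "cook book" mechanics described above reduce, in the simplest setting, to two sampling steps per collision: draw a free-path length from an exponential distribution with the total interaction coefficient, then draw the collision outcome. A minimal 1D slab sketch, with illustrative coefficients (not from the report):

```python
import random

def transport_history(mu_t, mu_a, thickness, rng):
    """Follow one particle through a 1D slab, 'cook book' style: sample the
    distance to the next collision from an exponential with total interaction
    coefficient mu_t, then sample the collision outcome (absorb or scatter)."""
    x, direction = 0.0, 1.0
    while True:
        x += direction * rng.expovariate(mu_t)   # free path to next collision
        if x >= thickness:
            return "transmitted"
        if x < 0.0:
            return "reflected"
        if rng.random() < mu_a / mu_t:           # absorption probability per collision
            return "absorbed"
        direction = rng.choice((-1.0, 1.0))      # crude isotropic scatter in 1D

rng = random.Random(0)
histories = [transport_history(1.0, 0.5, 2.0, rng) for _ in range(10_000)]
transmission = histories.count("transmitted") / len(histories)
```

Tallying the fraction of histories ending in each outcome is the Monte Carlo estimate of the corresponding transport-equation quantity; production codes differ mainly in geometry, energy dependence, and variance reduction.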
Wang, Yi; Antonuk, Larry E; El-Mohri, Youcef; Zhao, Qihua; Sawant, Amit; Du, Hong
2008-01-01
Megavoltage cone-beam computed tomography (MV CBCT) is a highly promising technique for providing volumetric patient position information in the radiation treatment room. Such information has the potential to greatly assist in registering the patient to the planned treatment position, helping to ensure accurate delivery of the high energy therapy beam to the tumor volume while sparing the surrounding normal tissues. Presently, CBCT systems using conventional MV active matrix flat-panel imagers (AMFPIs), which are commonly used in portal imaging, require a relatively large amount of dose to create images that are clinically useful. This is due to the fact that the phosphor screen detector employed in conventional MV AMFPIs utilizes only approximately 2% of the incident radiation (for a 6 MV x-ray spectrum). Fortunately, thick segmented scintillating detectors can overcome this limitation, and the first prototype imager has demonstrated highly promising performance for projection imaging at low doses. It is therefore of definite interest to examine the potential performance of such thick, segmented scintillating detectors for MV CBCT. In this study, Monte Carlo simulations of radiation energy deposition were used to examine reconstructed images of cylindrical CT contrast phantoms, embedded with tissue-equivalent objects. The phantoms were scanned at 6 MV using segmented detectors having various design parameters (i.e., detector thickness as well as scintillator and septal wall materials). Due to constraints imposed by the nature of this study, the size of the phantoms was limited to approximately 6 cm. For such phantoms, the simulation results suggest that a 40 mm thick, segmented CsI detector with low density septal walls can delineate electron density differences of approximately 2.3% and 1.3% at doses of 1.54 and 3.08 cGy, respectively. In addition, it was found that segmented detectors with greater thickness, higher density scintillator material, or lower density
Romano, F; Cirrone, G A P; Cuttone, G; Rosa, F Di; Mazzaglia, S E; Petrovic, I; Fira, A Ristic; Varisano, A
2014-06-21
Fluence, depth absorbed dose and linear energy transfer (LET) distributions of proton and carbon ion beams have been investigated using the Monte Carlo code Geant4 (GEometry ANd Tracking). An open source application was developed with the aim of simulating two typical transport beam lines, one used for ocular therapy and cell irradiations with protons and the other for cell irradiations with carbon ions. This tool allows evaluation of the primary and total dose-averaged LET and prediction of their spatial distribution in voxelized or sliced geometries. In order to reproduce the LET distributions in a realistic way, the contributions of secondary particles produced in nuclear interactions were also considered in the computations. Pristine and spread-out Bragg peaks were taken into account both for proton and carbon ion beams, with a maximum energy of 62 MeV/n. Depth dose distributions were compared with experimental data, showing good agreement. Primary and total LET distributions were analysed in order to study the influence of secondary-particle contributions in regions at different depths. A non-negligible influence of high-LET components was found in the entrance channel for proton beams, making the total dose-averaged LET a factor of 3 higher than the primary one. A completely different situation was obtained for carbon ions: in this case, secondary particles mainly contributed in the tail after the peak. The results showed how the weight of light and heavy secondary ions can considerably influence the computation of LET depth distributions. This has an important role in the interpretation of results from radiobiological experiments and, therefore, in hadron treatment planning procedures.
Cheng, Jason Y; Ning, Holly; Arora, Barbara C; Zhuge, Ying; Miller, Robert W
2016-05-08
The dose measurement of small field sizes, such as the conical collimators used in stereotactic radiosurgery (SRS), is a significant challenge due to many factors including source occlusion, detector size limitation, and lack of lateral electronic equilibrium. One useful tool for dealing with the small-field effect is Monte Carlo (MC) simulation. In this study, we report a comparison of Monte Carlo simulations and measurements of output factors for the Varian SRS system with conical collimators for energies of 6 MV flattening filter-free (6 MV) and 10 MV flattening filter-free (10 MV) on the TrueBeam accelerator. Monte Carlo simulations of Varian's SRS system for 6 MV and 10 MV photon energies with cone sizes of 17.5 mm, 15.0 mm, 12.5 mm, 10.0 mm, 7.5 mm, 5.0 mm, and 4.0 mm were performed using EGSnrc (release V4 2.4.0) codes. Varian's version-2 phase-space files for 6 MV and 10 MV of the TrueBeam accelerator were utilized in the Monte Carlo simulations. Two small diode detectors, Edge (Sun Nuclear) and Small Field Detector (SFD) (IBA Dosimetry), were applied to measure the output factors. Significant errors may result if detector correction factors are not applied to small-field dosimetric measurements. Although machine-specific k_{Qclin,Qmsr}^{fclin,fmsr} correction factors for the diode detectors were not available in this study, correction factors were applied utilizing published studies conducted under similar conditions. For cone diameters greater than or equal to 12.5 mm, the differences between output factors for the Edge detector, SFD detector, and MC simulations are within 3.0% for both energies. For cone diameters below 12.5 mm, output factor differences exhibit greater variation.
Bazalova-Carter, Magdalena; Liu, Michael; Palma, Bianey; Koong, Albert C.; Maxim, Peter G., E-mail: Peter.Maxim@Stanford.edu, E-mail: BWLoo@Stanford.edu; Loo, Billy W., E-mail: Peter.Maxim@Stanford.edu, E-mail: BWLoo@Stanford.edu [Department of Radiation Oncology, Stanford University, Stanford, California 94305-5847 (United States); Dunning, Michael; McCormick, Doug; Hemsing, Erik; Nelson, Janice; Jobe, Keith; Colby, Eric; Tantawi, Sami; Dolgashev, Valery [SLAC National Accelerator Laboratory, Menlo Park, California 94025 (United States)
2015-04-15
Purpose: To measure radiation dose in a water-equivalent medium from very high-energy electron (VHEE) beams and make comparisons to Monte Carlo (MC) simulation results. Methods: Dose in a polystyrene phantom delivered by an experimental VHEE beam line was measured with Gafchromic films for three 50 MeV and two 70 MeV Gaussian beams of 4.0–6.9 mm FWHM and compared to corresponding MC-simulated dose distributions. MC dose in the polystyrene phantom was calculated with the EGSnrc/BEAMnrc and DOSXYZnrc codes based on the experimental setup. Additionally, the effect of 2% beam energy measurement uncertainty and possible non-zero beam angular spread on MC dose distributions was evaluated. Results: MC simulated percentage depth dose (PDD) curves agreed with measurements within 4% for all beam sizes at both 50 and 70 MeV VHEE beams. Central axis PDD at 8 cm depth ranged from 14% to 19% for the 5.4–6.9 mm 50 MeV beams and it ranged from 14% to 18% for the 4.0–4.5 mm 70 MeV beams. MC simulated relative beam profiles of regularly shaped Gaussian beams evaluated at depths of 0.64 to 7.46 cm agreed with measurements to within 5%. A 2% beam energy uncertainty and 0.286° beam angular spread corresponded to a maximum 3.0% and 3.8% difference in depth dose curves of the 50 and 70 MeV electron beams, respectively. Absolute dose differences between MC simulations and film measurements of regularly shaped Gaussian beams were between 10% and 42%. Conclusions: The authors demonstrate that relative dose distributions for VHEE beams of 50–70 MeV can be measured with Gafchromic films and modeled with Monte Carlo simulations to an accuracy of 5%. The reported absolute dose differences likely caused by imperfect beam steering and subsequent charge loss revealed the importance of accurate VHEE beam control and diagnostics.
Sangeetha, S.; Sureka, C. S.
2017-06-01
The present study compares the characteristics of Varian Clinac 600 C/D flattened and unflattened 6 MV photon beams for small-field dosimetry using EGSnrc Monte Carlo simulation, since small-field dosimetry is considered the most crucial and challenging task in radiation dosimetry. A 6 MV photon beam of a Varian Clinac 600 C/D medical linear accelerator operating in Flattening Filter (FF) and Flattening-Filter-Free (FFF) modes was simulated using the EGSnrc Monte Carlo user codes (BEAMnrc and DOSXYZnrc) to calculate the beam characteristics by an educated trial-and-error method. These include: percentage depth dose, lateral beam profile, dose rate delivery, photon energy spectra, photon beam uniformity, out-of-field dose, surface dose, penumbral dose and output factor for small fields (0.5×0.5 cm² to 4×4 cm²), which are compared with magna-field sizes (5×5 cm² to 40×40 cm²) at various depths. The results showed that the optimized beam energy and full-width-at-half-maximum value for both small-field and magna-field dosimetry were 5.7 MeV and 0.13 cm for both FF and FFF beams. The depth of dose maximum for small field sizes deviates minimally for both FF and FFF beams, similar to magna-fields. At depths greater than dmax, FFF beams show a steeper dose fall-off in the exponential region than FF beams, and this deviation increases with field size. The shape of the lateral beam profiles of FF and FFF beams remains similar for field sizes less than 4×4 cm², whereas it differs for magna-fields. Dose rate delivery for FFF beams shows a prominent two-fold increase for both small and magna-field sizes. Surface doses for FFF beams were higher than for FF beams at small field sizes but lower for magna-fields. The amount of out-of-field dose reduction gets
Density matrix quantum Monte Carlo
Blunt, N S; Spencer, J S; Foulkes, W M C
2013-01-01
This paper describes a quantum Monte Carlo method capable of sampling the full density matrix of a many-particle system, thus granting access to arbitrary reduced density matrices and allowing expectation values of complicated non-local operators to be evaluated easily. The direct sampling of the density matrix also raises the possibility of calculating previously inaccessible entanglement measures. The algorithm closely resembles the recently introduced full configuration interaction quantum Monte Carlo method, but works all the way from infinite to zero temperature. We explain the theory underlying the method, describe the algorithm, and introduce an importance-sampling procedure to improve the stochastic efficiency. To demonstrate the potential of our approach, the energy and staggered magnetization of the isotropic antiferromagnetic Heisenberg model on small lattices and the concurrence of one-dimensional spin rings are compared to exact or well-established results. Finally, the nature of the sign problem...
Efficient kinetic Monte Carlo simulation
Schulze, Tim P.
2008-02-01
This paper concerns kinetic Monte Carlo (KMC) algorithms that have a single-event execution time independent of the system size. Two methods are presented: one that combines the use of inverted-list data structures with rejection Monte Carlo, and a second that combines inverted lists with the Marsaglia-Norman-Cannon algorithm. The resulting algorithms apply to models with rates that are determined by the local environment but are otherwise arbitrary, time-dependent and spatially heterogeneous. While especially useful for crystal growth simulation, the algorithms are presented from the point of view that KMC is the numerical task of simulating a single realization of a Markov process, allowing application to a broad range of areas where heterogeneous random walks are the dominant simulation cost.
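The rejection idea underlying the first method can be sketched in a few lines: draw a site uniformly, accept it with probability rate/rate_max, and advance the clock per attempt so the thinned process has the correct event-time statistics. This is a generic rejection-KMC sketch, without the paper's inverted-list bookkeeping, and the rate values are invented for illustration:

```python
import random

def rejection_kmc_step(rates, rate_max, rng):
    """One KMC event chosen by rejection: draw a site uniformly and accept it
    with probability rate/rate_max, so the per-event cost does not depend on
    the system size. Returns the chosen site and the elapsed time."""
    n = len(rates)
    elapsed = 0.0
    while True:
        elapsed += rng.expovariate(n * rate_max)   # clock for the thinned process
        site = rng.randrange(n)
        if rng.random() < rates[site] / rate_max:
            return site, elapsed

rng = random.Random(42)
rates = [0.2, 1.0, 0.5, 1.0]            # per-site event rates (illustrative)
counts = [0] * len(rates)
for _ in range(10_000):
    site, _ = rejection_kmc_step(rates, rate_max=1.0, rng=rng)
    counts[site] += 1                    # sites are hit in proportion to their rates
```

The acceptance rate degrades when rates are very heterogeneous, which is exactly the regime where the paper's inverted-list grouping of sites by rate pays off.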
Bergman, Alanah M; Gete, Ermias; Duzenli, Cheryl; Teke, Tony
2014-05-08
A Monte Carlo (MC) validation of the vendor-supplied Varian TrueBeam 6 MV flattened (6X) phase-space file and the first implementation of the Siebers-Keall MC MLC model as applied to the HD120 MLC (for 6X flat and 6X flattening filter-free (6X FFF) beams) are described. The MC model is validated in the context of VMAT patient-specific quality assurance. The Monte Carlo commissioning process involves: 1) validating the calculated open-field percentage depth doses (PDDs), profiles, and output factors (OF), 2) adapting the Siebers-Keall MLC model to match the new HD120-MLC geometry and material composition, 3) determining the absolute dose conversion factor for the MC calculation, and 4) validating this entire linac/MLC in the context of dose calculation verification for clinical VMAT plans. MC PDDs for the 6X beams agree with the measured data to within 2.0% for field sizes ranging from 2 × 2 to 40 × 40 cm². Measured and MC profiles show agreement in the 50% field width and the 80%-20% penumbra region to within 1.3 mm for all square field sizes. MC OFs for the 2 to 40 cm² square fields agree with measurement to within 1.6%. Verification of VMAT SABR lung, liver, and vertebra plans demonstrates that measured and MC ion chamber doses agree within 0.6% for the 6X beam and within 2.0% for the 6X FFF beam. A 3D gamma factor analysis demonstrates that for the 6X beam, > 99% of voxels meet the pass criterion (3%/3 mm). For the 6X FFF beam, > 94% of voxels meet this criterion. The TrueBeam accelerator delivering 6X and 6X FFF beams with the HD120 MLC can be modeled in Monte Carlo to provide an independent 3D dose calculation for clinical VMAT plans. This quality assurance tool has been used clinically to verify over 140 6X and 16 6X FFF TrueBeam treatment plans.
Adaptive Multilevel Monte Carlo Simulation
Hoel, H
2011-08-23
This work generalizes a multilevel forward Euler Monte Carlo method introduced by Giles (Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (A. Dzougoutov et al., Adaptive Monte Carlo algorithms for stopped diffusion, in Multiscale Methods in Science and Engineering, Lect. Notes Comput. Sci. Eng. 44, Springer, Berlin, 2005, pp. 59–88; K.-S. Moon et al., Stoch. Anal. Appl. 23(3):511–558, 2005; K.-S. Moon et al., An adaptive algorithm for ordinary, stochastic and partial differential equations, in Recent Advances in Adaptive Computation, Contemp. Math. 383, Amer. Math. Soc., Providence, RI, 2005, pp. 325–343). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (A. Szepessy et al., Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL), from O(TOL⁻³) for a single-level version of the adaptive algorithm to O((TOL⁻¹ log(TOL))²).
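The multilevel construction being generalized can be sketched for a simple uniform-step case: the estimator is a telescoping sum of level corrections E[P_l - P_(l-1)], with coupled fine/coarse Euler paths and fewer samples on the expensive fine levels. The SDE, parameters, and sample allocation below are illustrative (geometric Brownian motion with uniform steps, not the paper's adaptive stopped diffusion):

```python
import math
import random

def level_estimator(l, n_paths, T=1.0, mu=0.05, sigma=0.2, x0=1.0, seed=0):
    """Level-l MLMC correction E[P_l - P_(l-1)] for P = X_T of
    dX = mu*X dt + sigma*X dW under forward Euler with uniform steps
    h_l = T / 2^l; level 0 returns the plain estimate E[P_0]. Fine and
    coarse paths are coupled by summing pairs of Brownian increments."""
    rng = random.Random(seed + l)
    nf = 2 ** l                      # fine steps on this level
    hf = T / nf
    acc = 0.0
    for _ in range(n_paths):
        xf = xc = x0
        if l == 0:
            dw = rng.gauss(0.0, math.sqrt(hf))
            acc += xf + mu * xf * hf + sigma * xf * dw
            continue
        for _ in range(nf // 2):
            dw1 = rng.gauss(0.0, math.sqrt(hf))
            xf += mu * xf * hf + sigma * xf * dw1
            dw2 = rng.gauss(0.0, math.sqrt(hf))
            xf += mu * xf * hf + sigma * xf * dw2
            xc += mu * xc * (2.0 * hf) + sigma * xc * (dw1 + dw2)
        acc += xf - xc
    return acc / n_paths

# Telescoping sum over levels, with fewer paths on the more expensive fine levels.
mlmc_estimate = sum(level_estimator(l, max(4000 >> (2 * l), 100)) for l in range(5))
# Exact reference value: E[X_T] = x0 * exp(mu * T).
```

The paper replaces the uniform hierarchy h_l = T/2^l with adaptively refined, path-dependent time steps, but the telescoping structure of the estimator is the same.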
TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma
Sisniega, A; Zbijewski, W; Stayman, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Yorkston, J [Carestream Health (United States); Aygun, N [Department of Radiology, Johns Hopkins University (United States); Koliatsos, V [Department of Neurology, Johns Hopkins University (United States); Siewerdsen, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Department of Radiology, Johns Hopkins University (United States)
2014-06-15
Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with a baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in the projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding a 3.5 minute run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate a uniformity improvement of 18 HU and a contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-to-noise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT dedicated to brain
WE-EF-207-05: Monte Carlo Dosimetry for a Dedicated Cone-Beam CT Head Scanner
Energy Technology Data Exchange (ETDEWEB)
Sisniega, A; Zbijewski, W; Xu, J; Dang, H; Stayman, J W; Aygun, N; Koliatsos, V E; Siewerdsen, J H [Johns Hopkins University, Baltimore, MD (United States); Wang, X; Foos, D H [Carestream Health, Rochester, NY (United States)
2015-06-15
Purpose: Cone-Beam CT (CBCT) is an attractive platform for point-of-care imaging of traumatic brain injury and intracranial hemorrhage. This work implements and evaluates a fast Monte Carlo (MC) dose estimation engine for development of a dedicated head CBCT scanner, optimization of acquisition protocols, geometry, bowtie filter designs, and patient-specific dosimetry. Methods: Dose scoring with a GPU-based MC CBCT simulator was validated on an imaging bench using a modified 16 cm CTDI phantom with 7 ion chamber shafts along the central ray for 80–100 kVp (+2 mm Al, +0.2 mm Cu). Dose distributions were computed in a segmented CBCT reconstruction of an anthropomorphic head phantom with 4×10⁵ tracked photons per scan (5 min runtime). Circular orbits with angular span ranging from short scan (180° + fan angle) to full rotation (360°) were considered for fixed total mAs per scan. Two aluminum filters were investigated: an aggressive bowtie and a moderate bowtie (matched to 16 cm and 32 cm water cylinders, respectively). Results: MC dose estimates showed strong agreement with measurements (RMSE<0.001 mGy/mAs). A moderate (aggressive) bowtie reduced the dose, per total mAs, by 20% (30%) at the center of the head, by 40% (50%) at the eye lens, and by 70% (80%) at the posterior skin entrance. For the no-bowtie configuration, a short scan reduced the eye lens dose by 62% (from 0.08 mGy/mAs to 0.03 mGy/mAs) compared to a full scan, although the dose to spinal bone marrow increased by 40%. For both bowties, the short scan resulted in a similar 40% increase in bone marrow dose, but the reduction at the eye lens was more pronounced: 70% (90%) for the moderate (aggressive) bowtie. Conclusions: Dose maps obtained with validated MC simulation demonstrated dose reduction in sensitive structures (eye lens and bone marrow) through a combination of short-scan trajectories and bowtie filters. Xiaohui Wang and David Foos are employees of Carestream Health.
Xu, Yuan; Bai, Ti; Yan, Hao; Ouyang, Luo; Pompos, Arnold; Wang, Jing; Zhou, Linghong; Jiang, Steve B; Jia, Xun
2015-05-07
Cone-beam CT (CBCT) has become the standard image guidance tool for patient setup in image-guided radiation therapy. However, due to its large illumination field, scattered photons severely degrade its image quality. While kernel-based scatter correction methods have been used routinely in the clinic, it is still desirable to develop Monte Carlo (MC) simulation-based methods due to their accuracy. However, the high computational burden of the MC method has prevented routine clinical application. This paper reports our recent development of a practical method of MC-based scatter estimation and removal for CBCT. In contrast with conventional MC approaches that estimate scatter signals using a scatter-contaminated CBCT image, our method used a planning CT image for MC simulation, which has the advantages of accurate image intensity and absence of image truncation. In our method, the planning CT was first rigidly registered with the CBCT. Scatter signals were then estimated via MC simulation. After scatter signals were removed from the raw CBCT projections, a corrected CBCT image was reconstructed. The entire workflow was implemented on a GPU platform for high computational efficiency. Strategies such as projection denoising, CT image downsampling, and interpolation along the angular direction were employed to further enhance the calculation speed. We studied the impact of key parameters in the workflow on the resulting accuracy and efficiency, based on which the optimal parameter values were determined. Our method was evaluated in numerical simulation, phantom, and real patient cases. In the simulation cases, our method reduced mean HU errors from 44 to 3 HU and from 78 to 9 HU in the full-fan and the half-fan cases, respectively. In both the phantom and the patient cases, image artifacts caused by scatter, such as ring artifacts around the bowtie area, were reduced. With all the techniques employed, we achieved computation time of less than 30 s including the
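The downsample-smooth-interpolate-subtract pattern shared by these scatter-correction abstracts can be sketched on synthetic data. Everything below (array shapes, noise level, kernel width, the synthetic primary and scatter signals) is an illustrative assumption, not the authors' implementation:

```python
# Toy sketch: MC scatter is "simulated" (here: known truth plus noise) only at
# every 4th gantry angle (angular downsampling), denoised by Gaussian kernel
# smoothing in the projection plane, interpolated to all angles, and then
# subtracted from the raw projections.
import numpy as np

n_angles, n_pix = 360, 128
u = np.linspace(-1.0, 1.0, n_pix)
primary = np.exp(-3.0 * u ** 2)[None, :] * np.ones((n_angles, 1))
scatter_true = 0.3 * np.exp(-u ** 2)[None, :] * np.ones((n_angles, 1))
raw = primary + scatter_true                        # scatter-contaminated data

rng = np.random.default_rng(1)
sim_angles = np.arange(0, n_angles, 4)              # 4-fold angular downsampling
mc_scatter = scatter_true[sim_angles] + rng.normal(0, 0.05, (len(sim_angles), n_pix))

# Kernel smoothing in the projection plane (moving Gaussian average per row).
kern = np.exp(-0.5 * (np.arange(-10, 11) / 4.0) ** 2)
kern /= kern.sum()
smooth = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, mc_scatter)

# Interpolate the smoothed estimates across gantry angles, then correct.
scatter_est = np.empty_like(raw)
for j in range(n_pix):
    scatter_est[:, j] = np.interp(np.arange(n_angles), sim_angles, smooth[:, j])
corrected = np.clip(raw - scatter_est, 0.0, None)
```

The design choice exploited here is that scatter varies slowly in both the detector plane and the gantry angle, so a sparse, noisy MC estimate can be smoothed and interpolated without losing accuracy.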
Krongkietlearts, K.; Tangboonduangjit, P.; Paisangittisakul, N.
2016-03-01
To improve cancer patients' quality of life, radiation techniques are constantly evolving. In particular, two modern techniques, intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), are quite promising. They comprise many small beams (beamlets) with various intensities to deliver the intended radiation dose to the tumor while minimizing the dose to nearby normal tissue. This study investigates whether the microDiamond detector (PTW), a synthetic single-crystal diamond detector, is suitable for small-field output factor measurement. The results were compared with those measured by the stereotactic field detector (SFD) and with Monte Carlo simulation (EGSnrc/BEAMnrc/DOSXYZ). The calibration of the Monte Carlo simulation was done using the percentage depth dose and dose profile measured by the photon field detector (PFD) for a 10×10 cm² field at 100 cm SSD. The calculated and measured values are consistent, differing by no more than 1%. The output factors obtained from the microDiamond detector were compared with those from the SFD and the Monte Carlo simulation; the results show a percentage difference of less than 2%.
Monte Carlo simulation and measurements of clinical photon beams using LiF:Mg,Cu,P+PTFE
Energy Technology Data Exchange (ETDEWEB)
Azorin-Vega, C. [Centro de Investigacion en Ciencia Aplicada y Tecnologia Avanzada del Instituto Politecnico Nacional, Legaria 694, Col. Irrigacion, 11500 Mexico, D. F. (Mexico)], E-mail: claudiaazorin@yahoo.com.mx; Rivera-Montalvo, T. [Centro de Investigacion en Ciencia Aplicada y Tecnologia Avanzada del Instituto Politecnico Nacional, Legaria 694, Col. Irrigacion, 11500 Mexico, D. F. (Mexico)], E-mail: trivera@ipn.mx; Azorin-Nieto, J. [Universidad Autonoma Metropolitana-Iztapalapa, San Rafael Atlixco 186, Col. Vicentina, 09340 Mexico, D. F. (Mexico)], E-mail: azorin@xanum.uam.mx; Villasenor-Navarro, L.; Lujan-Castilla, P. [Hospital General de Mexico, Dr. Balmis 148, Col. Doctores, 06726 Mexico, D. F. (Mexico); Vega-Carrillo, H. [Unidad Academica de Estudios Nucleares de la UAZ, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico)], E-mail: fermineutron@yahoo.com
2010-04-15
The thermoluminescent response of LiF:Mg,Cu,P+PTFE under clinical photon irradiation was obtained. Thermoluminescent dosimeters (TLDs) were irradiated to determine the entrance surface dose (ESD) in a solid water phantom using standard clinical adult treatment protocols. A Monte Carlo simulation of photon interaction with matter was performed and the absorbed dose determined. ESD values calculated with the MCNPX code were greater than those determined by direct measurement in the phantom. The results obtained open the possibility of using this material as a TLD in medical accelerators.
Energy Technology Data Exchange (ETDEWEB)
Gomes B, W. O., E-mail: wilsonottobatista@gmail.com [Instituto Federal da Bahia, Rua Emidio dos Santos s/n, Bardalho, 40301-015 Salvador, Bahia (Brazil)
2015-10-15
Full text: In this study, an irradiation geometry applicable to PCXMC and the consequent calculation of effective dose in cone beam computed tomography (CBCT) applications were developed. Two CBCT units for dental applications were evaluated: the Carestream CS 9000 3D and the Gendex GXCB-500 tomographs. Each protocol was initially characterized by measuring the entrance surface kerma and the air kerma-area product, P_KA. Then, the technical parameters of each of the predetermined protocols and the geometric conditions were introduced into the PCXMC software to obtain the values of effective dose. The calculated effective dose is within the range of 9.0 to 15.7 μSv for the CS 9000 3D and in the range of 44.5 to 89 mSv for the GXCB-500 equipment. These values were compared with dosimetric results obtained using thermoluminescent dosimeters implanted in an anthropomorphic phantom and were considered consistent. The effective dose results are very sensitive to the irradiation geometry (beam position); this represents a weakness in the use of the software but, on the other hand, makes it a very useful tool for quick conclusions regarding the optimization of protocols. We can conclude that the Monte Carlo simulation software PCXMC is useful in the evaluation of CBCT protocols in dental applications. (Author)
Energy Technology Data Exchange (ETDEWEB)
Ramirez Ros, J. C.; Jerez Sainz, M. I.; Jodar Lopez, C. A.; Lobato Munoz, M.; Ruiz Lopez, M. A.; Carrasco Rodriguez, J. L.; Pamos Urena, M.
2013-07-01
We evaluated the Monte Carlo-based Monaco treatment planning system v2.0.3, following the SEFM protocol [1], for the modeling of the 6 MV photon beam of an Elekta Synergy linear accelerator with the Beam Modulator MLC. We compared the Monte Carlo calculations with profiles measured in water at SSD = 100 cm, absorbed dose, and dose levels for rectangular and asymmetric fields and different SSDs. We compared the results with those obtained with the Collapsed Cone algorithm of the Pinnacle planning system v8.0m. (Author)
DEFF Research Database (Denmark)
Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto;
2013-01-01
Abstract Purpose. Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability of predicting the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from...... and pelvis scan were simulated within 2% statistical uncertainty in two hours per scan. Within the same time, the ray tracing algorithm provided the primary signal for each of the projections. Thus, all the data needed for MC-based scatter correction in clinical CBCT imaging was obtained within two hours per...
Institute of Scientific and Technical Information of China (English)
Wang Yang; Cheng Tianle; Xia Yuanming; Jiang Dazhi
2001-01-01
In this paper, the fracture process of a unidirectional CF/SiC single-edge-notched beam (SENB) under three-point bending (TPB) is studied by means of macro/micro-statistical Monte Carlo simulation. The simulated P-Δ curves agree with the experimental results up to the curve peaks, and the simulated micro-evolution patterns agree with the patterns of the crack surfaces, which verifies the method. It is preliminarily demonstrated that the second turning point in the compliance changing rate curve corresponds to fracture initiation in experiments on SENB specimens of unidirectional CF/SiC composites under TPB.
Monte Carlo approach to turbulence
Energy Technology Data Exchange (ETDEWEB)
Dueben, P.; Homeier, D.; Muenster, G. [Muenster Univ. (Germany). Inst. fuer Theoretische Physik; Jansen, K. [DESY, Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Mesterhazy, D. [Humboldt Univ., Berlin (Germany). Inst. fuer Physik
2009-11-15
The behavior of the one-dimensional random-force-driven Burgers equation is investigated in the path integral formalism on a discrete space-time lattice. We show that by means of Monte Carlo methods one may evaluate observables, such as structure functions, as ensemble averages over different field realizations. The regularization of shock solutions to the zero-viscosity limit (Hopf-equation) eventually leads to constraints on lattice parameters required for the stability of the simulations. Insight into the formation of localized structures (shocks) and their dynamics is obtained. (orig.)
Approaching Chemical Accuracy with Quantum Monte Carlo
Petruzielo, Frank R.; Toulouse, Julien; Umrigar, C. J.
2012-01-01
A quantum Monte Carlo study of the atomization energies for the G2 set of molecules is presented. Basis size dependence of diffusion Monte Carlo atomization energies is studied with a single determinant Slater-Jastrow trial wavefunction formed from Hartree-Fock orbitals. With the largest basis set, the mean absolute deviation from experimental atomization energies for the G2 set is 3.0 kcal/mol. Optimizing the orbitals within variational Monte Carlo improves the agreem...
Mean field simulation for Monte Carlo integration
Del Moral, Pierre
2013-01-01
In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Marko
Energy Technology Data Exchange (ETDEWEB)
Ishmael Parsai, E. [Department of Radiation Oncology, University of Toledo, 3000 Arlington Avenue, Toledo, OH 43614 (United States)]. E-mail: Ishmael.parsai@utoledo.edu; Pearson, David [Department of Radiation Oncology, University of Toledo, 3000 Arlington Avenue, Toledo, OH 43614 (United States); Department of Physics and Astronomy, University of Toledo, 3000 Arlington Avenue, Toledo, OH 43614 (United States); Kvale, Thomas [Department of Physics and Astronomy, University of Toledo, 3000 Arlington Avenue, Toledo, OH 43614 (United States)
2007-08-15
An Elekta SL-25 medical linear accelerator (Elekta Oncology Systems, Crawley, UK) has been modelled using Monte Carlo simulations with the photon flattening filter removed. It is hypothesized that intensity modulated radiation therapy (IMRT) treatments may be carried out after the removal of this component, despite its criticality to standard treatments. Measurements using a scanning water phantom were also performed after the flattening filter had been removed. Both simulated and measured beam profiles showed that dose on the central axis increased, with the Monte Carlo simulations showing an increase by a factor of 2.35 for 6 MV and 4.18 for 10 MV beams. A further consequence of removing the flattening filter was the softening of the photon energy spectrum, leading to a steeper reduction in dose at depths greater than the depth of maximum dose. A comparison of points at the field edge showed that dose was reduced there by as much as 5.8% for larger fields. In conclusion, the greater photon fluence is expected to result in shorter treatment times, while the reduction in dose outside of the treatment field is strongly suggestive of more accurate dose delivery to the target.
1-D EQUILIBRIUM DISCRETE DIFFUSION MONTE CARLO
Energy Technology Data Exchange (ETDEWEB)
T. Evans et al.
2000-08-01
We present a new hybrid Monte Carlo method for 1-D equilibrium diffusion problems in which the radiation field coexists with matter in local thermodynamic equilibrium. This method, the Equilibrium Discrete Diffusion Monte Carlo (EqDDMC) method, combines Monte Carlo particles with spatially discrete diffusion solutions. We verify the EqDDMC method with computational results from three slab problems. The EqDDMC method represents an incremental step toward applying this hybrid methodology to non-equilibrium diffusion, where it could be simultaneously coupled to Monte Carlo transport.
Energy Technology Data Exchange (ETDEWEB)
Kim, S [Advocate Lutheran General Hospital, Park Ridge, IL (United States)
2015-06-15
Purpose: To quantify the dosimetric variations of misaligned beams for a linear accelerator by using Monte Carlo (MC) simulations. Method and Materials: Misaligned beams of a Varian 21EX Clinac were simulated to estimate the dosimetric effects. All the linac head components for a 6 MV photon beam were implemented in the BEAMnrc/EGSnrc system. For the incident electron beam parameters, a 6 MeV Gaussian beam with 0.1 cm full-width-half-max was used. A phase space file was obtained below the jaws for each misalignment condition of the incident electron beam: (1) the incident electron beam was tilted by 0.5, 1.0 and 1.5 degrees about the x-axis from the central axis; (2) the center of the incident electron beam was moved off-axis toward the +x-axis by 0.1, 0.2, and 0.3 cm away from the central axis. Lateral profiles for each misaligned beam condition were acquired at dmax = 1.5 cm and 10 cm depth in a rectangular water phantom. Beam flatness and symmetry were calculated from the lateral profile data. Results: The lateral profiles were found to be skewed opposite to the angle of the incident beam for the tilted beams. For the displaced beams, similar skewed lateral profiles were obtained, with small shifts of the penumbra on the +x-axis. The variations of beam flatness were 3.89–11.18% and 4.12–42.57% for the tilted and the translated beams, respectively; the beam symmetry was 2.95–9.93% and 2.55–38.06%, respectively. The flatness and symmetry values were found to increase by approximately 2 to 3% per 0.5 degree of tilt or per 1 mm of displacement. Conclusion: This study quantified the dosimetric effects of misaligned beams using MC simulations. The results should be useful for understanding the magnitude of the dosimetric deviations for misaligned beams.
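Flatness and symmetry figures like those quoted above can be computed directly from a lateral profile. The sketch below uses one common pair of definitions (flatness as the max/min variation over the central 80% of the field, symmetry as the largest left/right dose difference relative to the central-axis dose); the abstract does not state which definitions were used, so treat this as an illustration only:

```python
# Compute beam flatness and symmetry from a 1D lateral dose profile.
import numpy as np

def flatness_symmetry(x, dose, field_width):
    """x: off-axis positions (cm, increasing), dose: profile, field_width: cm."""
    core = np.abs(x) <= 0.4 * field_width          # central 80% of the field
    d = dose[core]
    flatness = 100.0 * (d.max() - d.min()) / (d.max() + d.min())
    cax = dose[np.argmin(np.abs(x))]               # central-axis dose
    mirrored = np.interp(-x[core], x, dose)        # D(-x) by interpolation
    symmetry = 100.0 * np.max(np.abs(d - mirrored)) / cax
    return flatness, symmetry
```

A perfectly symmetric profile gives symmetry near zero, while a tilted profile (as produced by the misaligned beams in the abstract) gives a clearly nonzero value.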
Error in Monte Carlo, quasi-error in Quasi-Monte Carlo
Kleiss, R. H. P.; Lazopoulos, A.
2006-01-01
While the Quasi-Monte Carlo method of numerical integration achieves smaller integration error than standard Monte Carlo, its use in particle physics phenomenology has been hindered by the absence of a reliable way to estimate that error. The standard Monte Carlo error estimator relies on the assumption that the points are generated independently of each other and, therefore, fails to account for the error improvement advertised by the Quasi-Monte Carlo method. We advocate the construction o...
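The standard estimator the abstract refers to takes the sample standard deviation divided by √N as the one-sigma error, which is justified only for i.i.d. points. The integrand and point counts below are illustrative:

```python
# Standard Monte Carlo integration with its usual error estimate.
import numpy as np

def mc_estimate(f, points):
    """Return (integral estimate, one-sigma error estimate) over [0,1)^d."""
    vals = f(points)
    n = len(vals)
    return vals.mean(), vals.std(ddof=1) / np.sqrt(n)

rng = np.random.default_rng(0)
pts = rng.random((100000, 2))
# Integrand x^2 + y^2 over the unit square; exact value is 2/3.
est, err = mc_estimate(lambda p: (p ** 2).sum(axis=1), pts)
```

Fed a correlated quasi-random sequence instead of `pts`, the same formula would still return a ~N^(-1/2) error bar and so overstate the error of the quasi-Monte Carlo result, which is the problem the paper addresses.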
Lazos, Dimitrios; Pokhrel, Damodar; Su, Zhong; Lu, Jun; Williamson, Jeffrey F.
2008-03-01
Fast and accurate modeling of cone-beam CT (CBCT) x-ray projection data can improve CBCT image quality either by linearizing projection data for each patient prior to image reconstruction (thereby mitigating detector blur/lag, spectral hardening, and scatter artifacts) or indirectly by supporting rigorous comparative simulation studies of competing image reconstruction and processing algorithms. In this study, we compare Monte Carlo-computed x-ray projections with projections experimentally acquired from our Varian Trilogy CBCT imaging system for phantoms of known design. Our recently developed Monte Carlo photon-transport code, PTRAN, was used to compute primary and scatter projections for a cylindrical phantom of known diameter (NA model 76-410), with and without bow-tie filter and antiscatter grid, for both full- and half-fan geometries. These simulations were based upon measured 120 kVp spectra, beam profiles, and the flat-panel detector (4030CB) point-spread function. Compound Poisson-process noise was simulated based upon measured beam output. Computed projections were compared to flat- and dark-field corrected 4030CB images, where scatter profiles were estimated by subtracting narrow-axial from full-axial-width 4030CB profiles. In agreement with the literature, the difference between simulated and measured projection data is of the order of 6-8%. The measurement of the scatter profiles is affected by the long tails of the detector PSF. Higher accuracy can be achieved mainly by improving the beam modeling and correcting the nonlinearities induced by the detector PSF.
Energy Technology Data Exchange (ETDEWEB)
Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)
2014-06-01
Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) for the GATE and PHITS codes have not been reported; these are studied here for PDD and proton range, compared to the FLUKA code and to experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the general-purpose Monte Carlo codes GATE and PHITS on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physics model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health
Directory of Open Access Journals (Sweden)
Nasrollah Jabbari
2009-03-01
Full Text Available Introduction: In clinical electron beams, most of the bremsstrahlung radiation is produced by various linac head structures. This bremsstrahlung dose is influenced by the geometry and construction of every component of the linac treatment head. Thus, it can be expected that the contaminant photon dose due to bremsstrahlung radiation varies among different linacs, even for the same electron beam energy. The aims of this study were to simulate the NEPTUN 10PC linac electron beams and to calculate the photon contamination dose due to bremsstrahlung radiation in these beams using a Monte Carlo method. Materials and methods: A NEPTUN 10PC linac was simulated in its electron mode using the BEAMnrc code. This linac can provide three electron beam energies of 6, 8 and 10 MeV. Detailed information required for the simulation, including the geometry and materials of various components of the linac treatment head, was provided by the vendor. For all simulations, the cut-off energies for electron and photon transport were set at ECUT=0.521 MeV and PCUT=0.010 MeV, respectively. The KS statistical test was used for validation of the simulated models. Then, the bremsstrahlung radiation doses for the three electron beam energies of the linac were calculated for the reference field using the Monte Carlo method. Results: The KS test showed good agreement between the calculated values (resulting from the simulations) and the measured ones. The results showed that the contaminant photon dose due to bremsstrahlung radiation from various components of the simulated linac at the surface of the phantom was between 0.2% and 0.5% of the maximum dose for the three electron beam energies. Conclusion: Considering the good agreement between the measured and simulated data, it can be concluded that the simulations, as well as the calculated bremsstrahlung doses, are accurate and precise.
Energy Technology Data Exchange (ETDEWEB)
Parsons, C; Parsons, D [Dept of Physics and Atmospheric Science, Dalhousie University, Halifax, Nova Scotia (Canada); Robar, J; Kelly, R [Dept of Physics and Atmospheric Science, Dalhousie University, Halifax, Nova Scotia (Canada); Dept of Radiation Oncology, Dalhousie University, Halifax, Nova Scotia (Canada); Nova Scotia Cancer Centre, Halifax, NS (Canada)
2014-06-15
Purpose: The introduction of the TrueBeam linac platform provides access to an in-air target assembly, making it possible to apply novel treatments using multiple target designs. One such novel treatment uses multiple low-Z targets to enhance surface dose, replacing the use of synthetic tissue-equivalent material (bolus). This technique would reduce the dosimetric and setup errors common with physical treatment accessories such as bolus. The groundwork is presented herein for a novel treatment beam that enhances surface dose to within 80-100% of the dose at dmax by utilizing low-Z (carbon) targets of various percent-CSDA-range thicknesses, operated at 2.5–4 MeV and used in conjunction with a clinical 6 MV beam. Methods: A standard Monte Carlo model of a Varian Clinac accelerator was developed to the manufacturer's specifications. Simulations were performed using Be, C, and Al as potential low-Z targets, placed in the secondary target position. The results determined C to be the target material of choice. Simulations of 15, 30 and 60% CSDA range C beams were propagated through slab phantoms. The resulting PDDs were weighted and combined with a standard 6 MV treatment beam. Versions of the experimental targets were installed into a 2100C Clinac and the models were validated. Results: Carbon was shown to be the low-Z material of choice for this project. Using combinations of 15, 30, and 60% CSDA beams operated at 2.5 and 4 MeV in combination with a standard 6 MV treatment beam, the surface dose was shown to be enhanced to within 80–100% of the dose at dmax. Conclusion: The modeled low-Z beams were successfully validated using machined versions of the targets. Water phantom measurements and slab phantom simulations show excellent correlation. Patient simulations are now underway to compare the use of bolus with the proposed novel beams. NSERC.
Langevin Monte Carlo filtering for target tracking
Iglesias Garcia, Fernando; Bocquel, Melanie; Driessen, Hans
2015-01-01
This paper introduces the Langevin Monte Carlo Filter (LMCF), a particle filter with a Markov chain Monte Carlo algorithm which draws proposals by simulating Hamiltonian dynamics. This approach is well suited to non-linear filtering problems in high dimensional state spaces where the bootstrap filte
An introduction to Monte Carlo methods
Walter, J. -C.; Barkema, G. T.
2015-01-01
Monte Carlo simulations are methods for simulating statistical systems. The aim is to generate a representative ensemble of configurations to access thermodynamical quantities without the need to solve the system analytically or to perform an exact enumeration. The main principles of Monte Carlo sim
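A minimal sketch of the kind of sampling this abstract describes is a Metropolis Monte Carlo simulation of a small 1D Ising chain, whose mean energy has a known closed form to check against. All parameters (chain length, temperature, sweep counts) are illustrative:

```python
# Metropolis sampling of a 1D Ising chain with free ends, H = -sum s_i s_(i+1).
import numpy as np

def metropolis_ising_1d(n_spins=50, beta=0.5, sweeps=10000, burn_in=1000, seed=0):
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], n_spins)                # random initial configuration
    energies = []
    for t in range(sweeps + burn_in):
        for i in rng.integers(0, n_spins, n_spins):  # one sweep of attempted flips
            # Energy change of flipping spin i (free boundary conditions).
            h = (s[i - 1] if i > 0 else 0) + (s[i + 1] if i < n_spins - 1 else 0)
            dE = 2.0 * s[i] * h
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i] = -s[i]                         # accept the flip
        if t >= burn_in:
            energies.append(-np.sum(s[:-1] * s[1:]))
    return np.mean(energies)
```

For this chain the exact mean energy is -(N-1)·tanh(β), so the representative ensemble generated by the sampler can be validated directly, which is the point the abstract makes about accessing thermodynamical quantities without exact enumeration.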
Challenges of Monte Carlo Transport
Energy Technology Data Exchange (ETDEWEB)
Long, Alex Roberts [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-10
These are slides from a presentation for the Parallel Summer School at Los Alamos National Laboratory. Solving discretized partial differential equations (PDEs) of interest can require a large number of computations. We can identify concurrency to allow parallel solution of discrete PDEs. Simulated particle histories can be used to solve the Boltzmann transport equation. Particle histories are independent in neutral particle transport, making them amenable to parallel computation. Physical parameters and method type determine the data dependencies of particle histories. Data requirements shape parallel algorithms for Monte Carlo. Then, Parallel Computational Physics and Parallel Monte Carlo are discussed and, finally, the results are given. The mesh passing method greatly simplifies the IMC implementation and allows simple load balancing. Using MPI windows and passive, one-sided RMA further simplifies the implementation by removing target synchronization. The author is very interested in implementations of PGAS that may allow further optimization for one-sided, read-only memory access (e.g. OpenSHMEM). The MPICH_RMA_OVER_DMAPP option and library is required to make one-sided messaging scale on Trinitite (Moonlight scales poorly). Interconnect-specific libraries or functions are likely necessary to ensure performance. BRANSON has been used to directly compare the current standard method to a proposed method on idealized problems. The mesh passing algorithm performs well on problems that are designed to show the scalability of the particle passing method. BRANSON can now run load-imbalanced, dynamic problems. Potential avenues of improvement in the mesh passing algorithm will be implemented and explored. A suite of test problems that stress DD methods will elucidate a possible path forward for production codes.
Palmans, H; Al-Sulaiti, L; Andreo, P; Shipley, D; Lühr, A; Bassler, N; Martinkovič, J; Dobrovodský, J; Rossomme, S; Thomas, R A S; Kacperek, A
2013-05-21
The conversion of absorbed dose-to-graphite in a graphite phantom to absorbed dose-to-water in a water phantom is performed using water-to-graphite stopping power ratios. If, however, the charged particle fluence is not equal at equivalent depths in graphite and water, a fluence correction factor, k_fl, is required as well. This is particularly relevant to the derivation of absorbed dose-to-water, the quantity of interest in radiotherapy, from a measurement of absorbed dose-to-graphite obtained with a graphite calorimeter. In this work, fluence correction factors for the conversion from dose-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general the fluence correction factors are found to be close to unity, and the analytical and Monte Carlo codes give consistent values when the differences in secondary particle transport are taken into account. When considering only protons, the fluence correction factors are unity at the surface and increase with depth by 0.5% to 1.5% depending on the code. When the fluence of all charged particles is considered, the fluence correction factor is about 0.5% lower than unity at shallow depths, predominantly due to the contributions from alpha particles, and increases to values above unity near the Bragg peak. Fluence correction factors directly derived from the fluence distributions differential in energy at equivalent depths in water and graphite can be described by k_fl = 0.9964 + 0.0024·z_w-eq with a relative standard uncertainty of 0.2%. Fluence correction factors derived from a ratio of calculated doses at equivalent depths in water and graphite can be described by k_fl = 0.9947 + 0.0024·z_w-eq with a relative standard uncertainty of 0.3%. These results are of direct relevance to graphite calorimetry in low-energy protons but given that the fluence
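The two linear fits quoted above are simple to apply in practice. A short sketch, under the assumptions (not spelled out in the abstract) that z_w-eq is the water-equivalent depth in cm and that the water-to-graphite stopping-power ratio is supplied by the caller:

```python
def k_fl_fluence(z_w_eq_cm: float) -> float:
    """Fluence-based fit from the abstract: k_fl = 0.9964 + 0.0024 * z_w-eq."""
    return 0.9964 + 0.0024 * z_w_eq_cm

def k_fl_dose_ratio(z_w_eq_cm: float) -> float:
    """Dose-ratio-based fit from the abstract: k_fl = 0.9947 + 0.0024 * z_w-eq."""
    return 0.9947 + 0.0024 * z_w_eq_cm

def dose_to_water(dose_to_graphite: float, sw_ratio: float, z_w_eq_cm: float) -> float:
    """Convert dose-to-graphite to dose-to-water: stopping-power ratio times k_fl.
    sw_ratio (water-to-graphite stopping-power ratio) must be supplied by the user."""
    return dose_to_graphite * sw_ratio * k_fl_fluence(z_w_eq_cm)

print(k_fl_fluence(0.0))   # 0.9964 at the surface
print(k_fl_fluence(2.5))   # ≈ 1.0024 toward the Bragg peak region
```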
Modeling neutron guides using Monte Carlo simulations
Wang, D Q; Crow, M L; Wang, X L; Lee, W T; Hubbard, C R
2002-01-01
Four neutron guide geometries, straight, converging, diverging and curved, were characterized using Monte Carlo ray-tracing simulations. The main areas of interest are the transmission of the guides at various neutron energies and the intrinsic time-of-flight (TOF) peak broadening. Use of a delta-function time pulse from a uniform Lambert neutron source allows one to quantitatively simulate the effect of guides' geometry on the TOF peak broadening. With a converging guide, the intensity and the beam divergence increases while the TOF peak width decreases compared with that of a straight guide. By contrast, use of a diverging guide decreases the intensity and the beam divergence, and broadens the width (in TOF) of the transmitted neutron pulse.
The MC21 Monte Carlo Transport Code
Energy Technology Data Exchange (ETDEWEB)
Sutton TM, Donovan TJ, Trumbull TH, Dobreff PS, Caro E, Griesheimer DP, Tyburski LJ, Carpenter DC, Joo H
2007-01-09
MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or ''tool of last resort'' and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities.
Sampson, Andrew Joseph
This dissertation describes the application of two principled variance reduction strategies to increase efficiency in two applications within medical physics. The first, called correlated Monte Carlo (CMC), applies to patient-specific, permanent-seed brachytherapy (PSB) dose calculations. The second, called adjoint-biased forward Monte Carlo (ABFMC), is used to compute cone-beam computed tomography (CBCT) scatter projections. CMC was applied for two PSB cases: a clinical post-implant prostate, and a breast with a simulated lumpectomy cavity. CMC computes the dose difference, DeltaD, between the highly correlated doses computed in homogeneous and heterogeneous geometries. The particle transport in the heterogeneous geometry assumed a purely homogeneous environment, and altered particle weights accounted for the bias. Average gains of 37 to 60 are reported from using CMC, relative to un-correlated Monte Carlo (UMC) calculations, for the prostate and breast CTVs, respectively. To further increase the efficiency up to 1500-fold above UMC, an approximation called interpolated correlated Monte Carlo (ICMC) was applied. ICMC computes DeltaD using CMC on a low-resolution (LR) spatial grid, followed by interpolation to a high-resolution (HR) voxel grid. The interpolated, HR DeltaD is then summed with a HR, pre-computed, homogeneous dose map. ICMC computes an approximate, but accurate, HR heterogeneous dose distribution from LR MC calculations, achieving an average 2% standard deviation within the prostate and breast CTVs in 1.1 s and 0.39 s, respectively. Accuracy for 80% of the voxels using ICMC is within 3% for anatomically realistic geometries. Second, for CBCT scatter projections, ABFMC was implemented via weight windowing using a solution to the adjoint Boltzmann transport equation computed either via the discrete ordinates method (DOM), or a MC-implemented forward-adjoint importance generator (FAIG). ABFMC, implemented via DOM or FAIG, was tested for a
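The variance-reduction idea behind CMC, computing a difference of two highly correlated estimates using the same particle histories, can be illustrated on a toy integral. The kernels below are invented stand-ins for the homogeneous and heterogeneous dose calculations; only the sampling structure mirrors the CMC/UMC comparison:

```python
import random
import statistics

def kernel_hom(x: float) -> float:
    """Toy stand-in for the homogeneous-geometry dose kernel."""
    return x * x

def kernel_het(x: float) -> float:
    """Toy heterogeneous kernel: a small perturbation of the homogeneous one."""
    return x * x * (1.0 + 0.05 * x)

def delta_correlated(n: int, seed: int) -> float:
    """Score both geometries with the SAME sampled histories (CMC-style)."""
    rng = random.Random(seed)
    samples = [rng.random() for _ in range(n)]
    return statistics.fmean(kernel_het(u) - kernel_hom(u) for u in samples)

def delta_uncorrelated(n: int, seed: int) -> float:
    """Score each geometry with independent histories (UMC-style)."""
    r1, r2 = random.Random(seed), random.Random(seed + 999)
    het = statistics.fmean(kernel_het(r1.random()) for _ in range(n))
    hom = statistics.fmean(kernel_hom(r2.random()) for _ in range(n))
    return het - hom

corr = [delta_correlated(2_000, s) for s in range(30)]
unco = [delta_uncorrelated(2_000, s) for s in range(30)]
print(statistics.stdev(corr), statistics.stdev(unco))
```

Because the perturbation is small, the correlated difference has a far smaller spread than subtracting two independent runs, which is exactly the source of the reported efficiency gains.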
Lin, Yi-Chun; Huang, Tseng-Te; Liu, Yuan-Hao; Chen, Wei-Lin; Chen, Yen-Fu; Wu, Shu-Wei; Nievaart, Sander; Jiang, Shiang-Huei
2015-06-01
The paired ionization chambers (ICs) technique is commonly employed to determine neutron and photon doses in radiology or radiotherapy neutron beams, where the neutron dose depends strongly on the accuracy of the accompanying high-energy photon dose. During the dose derivation, it is an important issue to evaluate the photon and electron response functions of two commercially available ionization chambers, denoted as TE(TE) and Mg(Ar), used in our reactor-based epithermal neutron beam. Nowadays, most perturbation corrections for accurate dose determination and many treatment planning systems are based on the Monte Carlo technique. We used the general-purpose Monte Carlo codes MCNP5, EGSnrc, FLUKA and GEANT4 for benchmark verifications among them and against carefully measured values, for a precise estimation of the chamber current from the absorbed dose rate of the cavity gas. Also, energy-dependent response functions of the two chambers were calculated in a parallel beam with mono-energies from 20 keV to 20 MeV for photons and electrons by using the optimal simple spherical and detailed IC models. The measurements were performed in the well-defined (a) four primary M-80, M-100, M-120 and M-150 X-ray calibration fields, (b) primary 60Co calibration beam, (c) 6 MV and 10 MV photon, (d) 6 MeV and 18 MeV electron LINACs in hospital and (e) BNCT clinical trials neutron beam. For the TE(TE) chamber, all codes were almost identical over the whole photon energy range. For the Mg(Ar) chamber, MCNP5 showed a lower response than the other codes for the photon energy region below 0.1 MeV and presented a similar response above 0.2 MeV (agreeing within 5% in the simple spherical model). With increasing electron energy, the response difference between MCNP5 and the other codes became larger in both chambers. Compared with the measured currents, MCNP5 agreed with the measurement data within 5% for the 60Co, 6 MV, 10 MV, 6 MeV and 18 MeV LINAC beams. But for the Mg(Ar) chamber, the deviations reached 7
Betz, G
2002-01-01
To extend the time scale in molecular dynamics (MD) calculations of sputtering and ion-assisted deposition we have coupled our MD calculations to a kinetic Monte Carlo (KMC) calculation. In this way we have studied surface erosion of Cu(100) under 200-600 eV Cu ion bombardment and growth of Cu on Cu(100) for deposition at thermal energies up to energies of 100 eV per atom. Target temperatures were varied from 100 to 400 K. The coupling of the MD calculation to a KMC calculation allows us to extend our calculations from a few ps, a time scale typical for MD, to times of up to seconds until the next Cu particle will impinge/be deposited on the crystal surface of about 100 nm² in size. The latter value of 1 s is quite realistic for a typical experimental sputter erosion or deposition experiment. In such a calculation thermal diffusion processes at the surface and annealing of the surface after energetic ion bombardment can be taken into account. To achieve homo-epitaxial growth of a film the results cle...
Energy Technology Data Exchange (ETDEWEB)
Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)
2014-10-01
Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed such optimized lists, but those studies were performed with a simple system such as a water phantom alone. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a proton treatment nozzle computational model. The simulation was performed with the broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained from our optimized parameter lists showed different characteristics from the results obtained with the simple system. This leads to the conclusion that the physical model, particle transport mechanics and different geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.
Spike Inference from Calcium Imaging using Sequential Monte Carlo Methods
NeuroData; Paninski, L
2015-01-01
Vogelstein JT, Paninski L. Spike Inference from Calcium Imaging using Sequential Monte Carlo Methods. Statistical and Applied Mathematical Sciences Institute (SAMSI) Program on Sequential Monte Carlo Methods, 2008
Monte Carlo approaches to light nuclei
Energy Technology Data Exchange (ETDEWEB)
Carlson, J.
1990-01-01
Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, variational Monte Carlo studies of ¹⁶O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.
Monte Carlo simulation for soot dynamics
Zhou, Kun
2012-01-01
A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate the soot dynamics. Detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurement available in literature. The origin of the bimodal distribution of particle size distribution is revealed with quantitative proof.
Lattice gauge theories and Monte Carlo simulations
Rebbi, Claudio
1983-01-01
This volume is the most up-to-date review on Lattice Gauge Theories and Monte Carlo Simulations. It consists of two parts. Part one is an introductory lecture on the lattice gauge theories in general, Monte Carlo techniques and on the results to date. Part two consists of important original papers in this field. These selected reprints involve the following: Lattice Gauge Theories, General Formalism and Expansion Techniques, Monte Carlo Simulations. Phase Structures, Observables in Pure Gauge Theories, Systems with Bosonic Matter Fields, Simulation of Systems with Fermions.
Quantum Monte Carlo for minimum energy structures
Wagner, Lucas K
2010-01-01
We present an efficient method to find minimum energy structures using energy estimates from accurate quantum Monte Carlo calculations. This method involves a stochastic process formed from the stochastic energy estimates from Monte Carlo that can be averaged to find precise structural minima while using inexpensive calculations with moderate statistical uncertainty. We demonstrate the applicability of the algorithm by minimizing the energy of the H2O-OH- complex and showing that the structural minima from quantum Monte Carlo calculations affect the qualitative behavior of the potential energy surface substantially.
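The idea of averaging a stochastic process built from noisy energy estimates can be illustrated with a toy one-dimensional potential. Everything below (the potential, noise level and step schedule) is invented for illustration; only the structure, noisy gradient estimates plus averaging of the stochastic iterates, follows the approach described:

```python
import random

def noisy_energy(x: float, rng: random.Random, sigma: float = 0.05) -> float:
    """Toy potential energy surface with MC-style statistical noise.
    True minimum at x = 1; stands in for a stochastic QMC energy estimate."""
    return (x - 1.0) ** 2 + rng.gauss(0.0, sigma)

def minimize_noisy(n_steps: int = 5_000, seed: int = 0) -> float:
    rng = random.Random(seed)
    x, trace = 0.0, []
    for k in range(1, n_steps + 1):
        h = 0.2  # finite-difference half-width for the noisy gradient
        grad = (noisy_energy(x + h, rng) - noisy_energy(x - h, rng)) / (2 * h)
        x -= 0.5 / k ** 0.7 * grad  # Robbins-Monro decaying step
        trace.append(x)
    tail = trace[len(trace) // 2:]
    return sum(tail) / len(tail)  # average the stochastic iterates

x_min = minimize_noisy()
print(x_min)  # close to the true minimum at x = 1
```

Averaging the tail of the iterates gives a structural minimum far more precise than any single noisy evaluation, mirroring how moderate-uncertainty QMC energies can still locate precise minima.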
Fast quantum Monte Carlo on a GPU
Lutsyshyn, Y
2013-01-01
We present a scheme for the parallelization of quantum Monte Carlo on graphical processing units, focusing on bosonic systems and variational Monte Carlo. We use asynchronous execution schemes with shared memory persistence, and obtain an excellent acceleration. Compared with single-core execution, the GPU-accelerated code runs over 100× faster. The CUDA code is provided along with the package that is necessary to execute variational Monte Carlo for a system representing liquid helium-4. The program was benchmarked on several models of Nvidia GPU, including the Fermi GTX560 and M2090, and the latest Kepler-architecture K20 GPU. Kepler-specific optimization is discussed.
Energy Technology Data Exchange (ETDEWEB)
Santana Leitner, Mario; Fasso, Alberto; Fisher, Alan S.; Nuhn, Heinz D.; /SLAC; Dooling, Jeffrey C.; Berg, William; Yang, Bin X.; /Argonne
2010-09-14
In 2009 the Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory started free electron laser (FEL) operation. In order to continue to produce the bright and short-pulsed x-ray laser demanded by FEL scientists, this pioneering hard x-ray FEL requires a perfectly tailored magnetic field at the undulators, so that the photons generated along the electron wiggling path interact at the right phase with the electron beam. In such a precise system, small (>0.01%) radiation-induced alterations of the magnetic field in the permanent magnets could affect FEL performance. This paper describes the simulation studies of radiation fields in permanent magnets and the expected signal in the detectors. The transport of particles from the radiation sources (i.e. the diagnostic insert) to the undulator magnets and to the beam loss monitors (BLM) was simulated with the intranuclear cascade codes FLUKA and MARS15. In order to accurately reproduce the optics of LCLS, lattice capabilities and magnetic fields were enabled in FLUKA and betatron oscillations were validated against reference data. All electron events entering the BLMs were printed in data files. The paper also introduces the Radioactive Ion Beam Optimizer (RIBO) Monte Carlo 3-D code, which was used to read from the event files, to compute Cerenkov production and then to simulate the optical coupling of the BLM detectors, accounting for the transmission of light through the quartz.
Monte Carlo simulations for focusing elliptical guides
Energy Technology Data Exchange (ETDEWEB)
Valicu, Roxana [FRM2 Garching, Muenchen (Germany); Boeni, Peter [E20, TU Muenchen (Germany)
2009-07-01
The aim of the Monte Carlo simulations using the McStas programme was to improve the focusing of the neutron beam existing at PGAA (FRM II) by prolongation of the existing elliptic guide (now coated with supermirrors with m = 3) with a new part. First we tried an initial length of the additional guide of 7.5 cm and neutron-guide coatings of supermirrors with m = 4, 5 and 6. The gain (calculated by dividing the intensity in the focal point after adding the guide by the intensity at the focal point with the initial guide) obtained for these coatings indicated that a coating with m = 5 would be appropriate for a first trial. The next step was to vary the length of the additional guide for this m value, thereby choosing the appropriate length for the maximal gain. With the m value and the length of the guide fixed, we introduced an aperture 1 cm before the focal point and varied the radius of this aperture in order to obtain a focused beam. We observed a dramatic decrease in the size of the beam at the focal point after introducing this aperture. The simulation results, the gains obtained and the evolution of the beam size will be presented.
Monte Carlo systems used for treatment planning and dose verification
Energy Technology Data Exchange (ETDEWEB)
Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany); Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama); Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)
2017-04-15
General-purpose radiation transport Monte Carlo codes have been used for estimation of the absorbed dose distribution in external photon and electron beam radiotherapy patients for several decades. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation time required. To take advantage of radiation transport Monte Carlo codes applied to routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and the simulation time required to compute a set of benchmark calculations. (orig.)
Bauer, J.; Unholtz, D.; Kurz, C.; Parodi, K.
2013-08-01
We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation-induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. Therefore, five different materials have been irradiated by monoenergetic proton pencil beams at various energies and the induced β+ activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow modelling of the absolute level of measured β+ activity induced in the investigated
Tian, Zhen; Li, Yongbao; Hassan-Rezaeian, Nima; Jiang, Steve B; Jia, Xun
2017-03-01
We have previously developed a GPU-based Monte Carlo (MC) dose engine on the OpenCL platform, named goMC, with a built-in analytical linear accelerator (linac) beam model. In this paper, we report our recent improvement on goMC to move it toward clinical use. First, we have adapted a previously developed automatic beam commissioning approach to our beam model. The commissioning was conducted through an optimization process, minimizing the discrepancies between calculated dose and measurement. We successfully commissioned six beam models built for Varian TrueBeam linac photon beams, including four beams of different energies (6 MV, 10 MV, 15 MV, and 18 MV) and two flattening-filter-free (FFF) beams of 6 MV and 10 MV. Second, to facilitate the use of goMC for treatment plan dose calculations, we have developed an efficient source particle sampling strategy. It uses the pre-generated fluence maps (FMs) to bias the sampling of the control point for source particles already sampled from our beam model. It can effectively reduce the number of source particles required to reach a given statistical uncertainty level in the calculated dose, as compared to the conventional FM weighting method. For a head-and-neck patient treated with volumetric modulated arc therapy (VMAT), a reduction factor of ~2.8 was achieved, accelerating dose calculation from 150.9 s to 51.5 s. The overall accuracy of goMC was investigated on a VMAT prostate patient case treated with a 10 MV FFF beam. A 3D gamma index test was conducted to evaluate the discrepancy between our calculated dose and the dose calculated in the Varian Eclipse treatment planning system. The passing rate was 99.82% for the 2%/2 mm criterion and 95.71% for the 1%/1 mm criterion. Our studies have demonstrated the effectiveness and feasibility of our auto-commissioning approach and new source sampling strategy for fast and accurate MC dose calculations for treatment plans. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by
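The fluence-map-biased source sampling described above is a form of importance sampling: control points are drawn with probability proportional to their fluence, and the particle statistical weight is corrected so the dose estimate stays unbiased. A toy sketch with invented fluence and response values (not goMC data):

```python
import random

# Hypothetical per-control-point fluence weights: most of the dose is
# delivered by a few control points, so those are sampled more often.
fluence = [0.02, 0.03, 0.05, 0.40, 0.50]
response = [1.0, 2.0, 1.5, 3.0, 2.5]   # toy dose contribution per source particle

def biased_estimate(n: int, seed: int) -> float:
    """Importance-sample control points by fluence, correcting particle weights."""
    rng = random.Random(seed)
    total_fluence = sum(fluence)
    acc = 0.0
    for _ in range(n):
        cp = rng.choices(range(len(fluence)), weights=fluence)[0]
        prob = fluence[cp] / total_fluence     # sampling probability of this cp
        weight = fluence[cp] / (n * prob)      # statistical weight correction
        acc += weight * response[cp]
    return acc

true_dose = sum(f * r for f, r in zip(fluence, response))
print(biased_estimate(20_000, 1), true_dose)
```

The weight correction (fluence over sampling probability) is what keeps the estimator unbiased while concentrating particles where the fluence, and hence the dose contribution, is largest.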
11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing
Nuyens, Dirk
2016-01-01
This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.
Simulation and the Monte Carlo method
Rubinstein, Reuven Y
2016-01-01
Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic first edition more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...
Monte Carlo simulations for plasma physics
Energy Technology Data Exchange (ETDEWEB)
Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X. [National Inst. for Fusion Science, Toki, Gifu (Japan)
2000-07-01
Plasma behaviours are very complicated and their analysis is generally difficult. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral beam injection heating (NBI heating), electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique, and good agreement with experimental results can be obtained. Recently, the Monte Carlo method has been developed to study fast-particle transport associated with heating and the generation of the radial electric field. Further, it is applied to investigating the neoclassical transport in plasmas with steep gradients of density and temperature, which is beyond the conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)
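A minimal sketch of the collision operators mentioned above: deterministic slowing down plus diffusive pitch-angle scattering of a fast test particle. The collision frequencies and time step below are arbitrary illustrative values, not from the report:

```python
import math
import random

def collide(v: float, mu: float, dt: float, nu_s: float, nu_d: float,
            rng: random.Random) -> tuple:
    """One Monte Carlo Coulomb-collision step for a fast test particle.

    v is the speed and mu the pitch cosine.  nu_s slows the particle down
    deterministically; nu_d scatters the pitch angle diffusively, with a
    random kick whose variance follows (1 - mu^2) * nu_d * dt.
    """
    v_new = v * (1.0 - nu_s * dt)                         # slowing down
    spread = math.sqrt(max((1.0 - mu * mu) * nu_d * dt, 0.0))
    mu_new = mu * (1.0 - nu_d * dt) + rng.gauss(0.0, spread)
    return v_new, max(-1.0, min(1.0, mu_new))             # keep |mu| <= 1

rng = random.Random(42)
v, mu = 1.0, 1.0               # fast particle injected along the field
for _ in range(2_000):
    v, mu = collide(v, mu, dt=1e-3, nu_s=0.5, nu_d=2.0, rng=rng)
print(v, mu)   # speed has decayed; pitch angle has scattered away from mu = 1
```

Averaging many such test-particle trajectories is how Monte Carlo codes recover slowing-down spectra and pitch-angle distributions for NBI and alpha heating studies.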
Quantum Monte Carlo Calculations of Light Nuclei
Pieper, Steven C
2007-01-01
During the last 15 years, there has been much progress in defining the nuclear Hamiltonian and applying quantum Monte Carlo methods to the calculation of light nuclei. I describe both aspects of this work and some recent results.
Improved Monte Carlo Renormalization Group Method
Gupta, R.; Wilson, K. G.; Umrigar, C.
1985-01-01
An extensive program to analyze critical systems using an Improved Monte Carlo Renormalization Group Method (IMCRG) being undertaken at LANL and Cornell is described. Here we first briefly review the method and then list some of the topics being investigated.
Monte Carlo methods for particle transport
Haghighat, Alireza
2015-01-01
The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...
Smart detectors for Monte Carlo radiative transfer
Baes, Maarten
2008-01-01
Many optimization techniques have been invented to reduce the noise that is inherent in Monte Carlo radiative transfer simulations. As the typical detectors used in Monte Carlo simulations do not take into account all the information contained in the impacting photon packages, there is still room to optimize this detection process and the corresponding estimate of the surface brightness distributions. We want to investigate how all the information contained in the distribution of impacting photon packages can be optimally used to decrease the noise in the surface brightness distributions and hence to increase the efficiency of Monte Carlo radiative transfer simulations. We demonstrate that the estimate of the surface brightness distribution in a Monte Carlo radiative transfer simulation is similar to the estimate of the density distribution in an SPH simulation. Based on this similarity, a recipe is constructed for smart detectors that take full advantage of the exact location of the impact of the photon pack...
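The analogy drawn above, estimating surface brightness from photon-package impacts the way SPH estimates density from particle positions, amounts to replacing pixel binning with a kernel estimate at the impact locations. A one-dimensional toy sketch (the Gaussian kernel, its width and the synthetic impacts are illustrative assumptions):

```python
import math
import random

def kernel(r: float, h: float) -> float:
    """Gaussian smoothing kernel: the SPH-style weight of an impact at distance r."""
    return math.exp(-0.5 * (r / h) ** 2) / (math.sqrt(2.0 * math.pi) * h)

def smart_brightness(impacts, x: float, h: float = 0.05) -> float:
    """Kernel estimate of a 1-D surface brightness profile from photon-package
    impact positions, instead of binning the packages into fixed detector pixels."""
    return sum(kernel(x - xi, h) for xi in impacts) / len(impacts)

rng = random.Random(0)
impacts = [rng.gauss(0.5, 0.1) for _ in range(20_000)]   # toy photon packages
print(smart_brightness(impacts, 0.5))   # brightness at the profile peak
```

Using the exact impact locations rather than a pixel histogram removes the binning noise at the cost of a controllable kernel smoothing, which is the trade-off the smart-detector recipe exploits.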
Quantum Monte Carlo approaches for correlated systems
Becca, Federico
2017-01-01
Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It provides a clear overview of variational wave functions and a detailed presentation of stochastic samplings, including Markov chains and Langevin dynamics, which are developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...
Yani, Sitti; Dirgayussa, I. Gde E.; Rhani, Moh. Fadhillah; Haryanto, Freddy; Arif, Idam
2015-09-01
Recently, the Monte Carlo (MC) calculation method has been reported as the most accurate method of predicting dose distributions in radiotherapy. The MC code system (especially DOSXYZnrc) has been used to investigate the effect of different voxel (volume element) sizes on the accuracy of dose distributions. To investigate this effect on dosimetry parameters, dose distribution calculations were made with three voxel sizes: 1 × 1 × 0.1 cm³, 1 × 1 × 0.5 cm³, and 1 × 1 × 0.8 cm³. A total of 1 × 10⁹ histories were simulated in order to reach statistical uncertainties of 2%. Each simulation took about 9-10 hours to complete. Measurements were made with a 10 × 10 cm² field size for 6 MV photon beams with a Gaussian intensity distribution of FWHM 0.1 cm and an SSD of 100.1 cm. Dose distributions were simulated with MC and measured in a water phantom. The outputs of the simulations, i.e. the percent depth dose and the dose profile at dmax from the three sets of calculations, are presented, and comparisons are made with experimental data from TTSH (Tan Tock Seng Hospital, Singapore) at 0-5 cm depth. The dose scored in a voxel is a volume-averaged estimate of the dose at the center of that voxel. The results show that the difference between Monte Carlo simulation and experimental data depends on the voxel size for both the percent depth dose (PDD) and the dose profile. For the PDD scan along the Z axis (depth) of the water phantom, the largest difference, about 17%, was obtained with the 1 × 1 × 0.8 cm³ voxel size. The dose profiles were analysed in the high-dose-gradient region; for the profile scan along the Y axis, the largest difference, about 12%, was obtained with the 1 × 1 × 0.1 cm³ voxel size. This study demonstrates that the choice of voxel size in Monte Carlo simulation is important.
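The volume-averaging effect that drives the voxel-size dependence can be sketched with a hypothetical analytic depth-dose curve (an assumption for illustration, not the measured TTSH data): averaging over a thicker voxel distorts the value most where the dose gradient is steep.

```python
import numpy as np

def pdd(z):
    # Hypothetical analytic depth-dose curve (depth z in cm): fast
    # build-up, then slow exponential fall-off.
    return (1.0 - np.exp(-z / 0.4)) * np.exp(-0.05 * z)

def voxel_average(z_center, dz, n=101):
    # Dose scored in a voxel is a volume average over its thickness dz.
    zs = np.linspace(z_center - dz / 2, z_center + dz / 2, n)
    return float(pdd(np.clip(zs, 0.0, None)).mean())

z = 0.3                      # a depth in the steep build-up region (cm)
point_dose = float(pdd(z))
for dz in (0.1, 0.5, 0.8):   # the three voxel thicknesses in the abstract
    print(f"dz = {dz} cm: point {point_dose:.3f}, "
          f"voxel-averaged {voxel_average(z, dz):.3f}")
```

The thicker the voxel, the further the scored (averaged) value drifts from the point dose in the build-up region, which is the qualitative behaviour the study quantifies.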
Energy Technology Data Exchange (ETDEWEB)
Wang, Y; Mazur, T; Green, O; Hu, Y; Wooten, H; Yang, D; Zhao, T; Mutic, S; Li, H [Washington University School of Medicine, St. Louis, MO (United States)
2015-06-15
Purpose: To build a fast, accurate and easily-deployable research platform for Monte Carlo dose calculations. We port the dose calculation engine PENELOPE to C++ and accelerate calculations on a GPU. Simulations of a Co-60 beam model provided by ViewRay demonstrate the capabilities of the platform. Methods: We built software that incorporates a beam model interface, a CT-phantom model, a GPU-accelerated PENELOPE engine, and a GUI front-end. We rewrote the PENELOPE kernel in C++ (from Fortran) and accelerated the code on a GPU. We seamlessly integrated a Co-60 beam model (obtained from ViewRay) into our platform. Simulations of various field sizes and SSDs using a homogeneous water phantom generated PDDs, dose profiles, and output factors that were compared to experimental data. Results: With GPU acceleration using a dated graphics card (Nvidia Tesla C2050), a highly accurate simulation – including a 100 × 100 × 100 grid, 3 × 3 × 3 mm³ voxels, <1% uncertainty, and a 4.2 × 4.2 cm² field size – runs 24 times faster (20 minutes versus 8 hours) than when parallelizing on 8 threads across a new CPU (Intel i7-4770). Simulated PDDs, profiles and output ratios for the commercial system agree well with experimental data measured using radiographic film or an ionization chamber. Based on our analysis, this beam model is precise enough for general applications. Conclusions: Using a beam model for a Co-60 system provided by ViewRay, we evaluate a dose calculation platform that we developed. Comparison to measurements demonstrates the promise of our software for use as a research platform for dose calculations, with applications including quality assurance and treatment plan verification.
Edimo, P; Clermont, C; Kwato, M G; Vynckier, S
2009-09-01
In the present work, Monte Carlo (MC) models of electron beams (energies 4, 12 and 18 MeV) from an Elekta SL25 medical linear accelerator were simulated using the EGSnrc/BEAMnrc user code. The calculated dose distributions were benchmarked by comparison with measurements made in a water phantom for a wide range of open field sizes and insert combinations, at a single source-to-surface distance (SSD) of 100 cm. These BEAMnrc models were used to evaluate the accuracy of a commercial MC dose calculation engine for electron beam treatment planning (Oncentra MasterPlan Treatment Planning System (OMTPS) version 1.4, Nucletron) for two energies, 4 and 12 MeV. Output factors were furthermore measured in the water phantom and compared to BEAMnrc and OMTPS. The overall agreement between predicted and measured output factors was comparable for BEAMnrc and OMTPS, except for a few asymmetric and/or small insert cutouts, where larger deviations between measurements and the values predicted by both BEAMnrc and OMTPS were recorded. In the heterogeneous phantom, differences between BEAMnrc and measurements ranged from 0.5 to 2.0% between two ribs and 0.6-1.0% below the ribs, whereas the difference between OMTPS and measurements was 0.5-4.0% in both areas. With respect to output factors, the overall agreement between BEAMnrc and measurements was usually within 1.0%, whereas differences up to nearly 3.0% were observed for OMTPS. This paper focuses on a comparison for clinical cases, including the effects of electron beam attenuation in a heterogeneous phantom. It therefore complements previously reported data (based only on measurements) in one other paper on commissioning of the VMC++ dose calculation engine. These results demonstrate that the VMC++ algorithm is more robust in predicting dose distributions than pencil-beam-based algorithms for the electron beams investigated.
Bartalini, P.; Kryukov, A.; Selyuzhenkov, Ilya V.; Sherstnev, A.; Vologdin, A.
2004-01-01
We present the Monte-Carlo events Data Base (MCDB) project and its development plans. MCDB facilitates communication between authors of Monte Carlo generators and experimental users. It also provides convenient book-keeping and easy access to generator-level samples. The first release of MCDB is now operational for the CMS collaboration. In this paper we review the main ideas behind MCDB and discuss future plans to develop this database further within the CERN LCG framework.
Monte Carlo Algorithms for Linear Problems
DIMOV, Ivan
2000-01-01
MSC Subject Classification: 65C05, 65U05. Monte Carlo methods are a powerful tool in many fields of mathematics, physics and engineering. It is known that these methods give statistical estimates for a functional of the solution by performing random sampling of a certain chance variable whose mathematical expectation is the desired functional. Monte Carlo methods are methods for solving problems using random variables. In the book [16] edited by Yu. A. Shreider one can find the followin...
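The quoted definition, a statistical estimate of a functional obtained as the expectation of a sampled random variable, can be sketched in a few lines; the integrand here is an arbitrary example, not one from the text.

```python
import math
import random

def mc_integrate(f, n=100_000, seed=1):
    # Estimate I = integral of f over [0, 1] as the sample mean of f(U),
    # U ~ Uniform(0, 1), with the standard 1/sqrt(n) error estimate.
    rng = random.Random(seed)
    vals = [f(rng.random()) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return mean, math.sqrt(var / n)

est, err = mc_integrate(lambda x: x * x)
print(f"integral of x^2 on [0,1] ~ {est:.4f} +/- {err:.4f} (exact 1/3)")
```

The returned error is the usual independent-sampling estimate; it shrinks like n^(-1/2) regardless of dimension, which is the source of the method's power.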
The Feynman Path Goes Monte Carlo
Sauer, Tilman
2001-01-01
Path integral Monte Carlo (PIMC) simulations have become an important tool for the investigation of the statistical mechanics of quantum systems. I discuss some of the history of applying the Monte Carlo method to non-relativistic quantum systems in path-integral representation. The principal feasibility of the method was well established by the early eighties; a number of algorithmic improvements have been introduced in the last two decades.
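A minimal PIMC sketch, assuming the standard primitive discretization of the Euclidean action for a 1D harmonic oscillator (an illustrative toy, not any specific code from the review):

```python
import math
import random

def pimc_harmonic(beta=8.0, M=64, sweeps=4000, burn=1000, step=0.5, seed=2):
    # Metropolis sampling of the primitively discretized Euclidean action
    # for a 1D harmonic oscillator (m = omega = hbar = 1); returns <x^2>.
    rng = random.Random(seed)
    dt = beta / M
    x = [0.0] * M
    def local_action(i, xi):
        xl, xr = x[(i - 1) % M], x[(i + 1) % M]  # periodic in imaginary time
        return ((xi - xl) ** 2 + (xr - xi) ** 2) / (2 * dt) + dt * 0.5 * xi * xi
    samples = []
    for sweep in range(sweeps):
        for i in range(M):
            trial = x[i] + rng.uniform(-step, step)
            dS = local_action(i, trial) - local_action(i, x[i])
            if dS < 0 or rng.random() < math.exp(-dS):
                x[i] = trial
        if sweep >= burn:
            samples.append(sum(v * v for v in x) / M)
    return sum(samples) / len(samples)

x2 = pimc_harmonic()
print(x2)   # exact <x^2> = coth(beta/2)/2, about 0.50 at beta = 8
```

The local-update scheme shown here is exactly the kind of algorithm whose feasibility the review traces; the later algorithmic improvements it mentions (e.g. collective moves) attack the slow decorrelation of such local updates.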
Monte Carlo Hamiltonian: Inverse Potential
Institute of Scientific and Technical Information of China (English)
LUO Xiang-Qian; CHENG Xiao-Ni; Helmut KRÖGER
2004-01-01
The Monte Carlo Hamiltonian method developed recently allows one to investigate the ground state and low-lying excited states of a quantum system, using a Monte Carlo (MC) algorithm with importance sampling. However, the conventional MC algorithm has some difficulties when applied to inverse potentials. We propose to use an effective potential and an extrapolation method to solve the problem. We present examples from the hydrogen system.
Self-consistent kinetic lattice Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Horsfield, A.; Dunham, S.; Fujitani, Hideaki
1999-07-01
The authors present a brief description of a formalism for modeling point defect diffusion in crystalline systems using a Monte Carlo technique. The main approximations required to construct a practical scheme are briefly discussed, with special emphasis on the proper treatment of charged dopants and defects. This is followed by tight binding calculations of the diffusion barrier heights for charged vacancies. Finally, an application of the kinetic lattice Monte Carlo method to vacancy diffusion is presented.
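The kinetic Monte Carlo loop described here, choosing an event with probability proportional to its rate and advancing the clock by an exponentially distributed residence time, can be sketched for the simplest possible case: a single vacancy hopping on a 1D lattice (an illustrative toy, not the authors' formalism).

```python
import math
import random

def kmc_vacancy(steps=10000, rate=1.0, seed=3):
    # Residence-time (BKL) kinetic Monte Carlo for one vacancy hopping
    # left/right on a 1D lattice with equal rates.
    rng = random.Random(seed)
    pos, t = 0, 0.0
    rates = [rate, rate]                       # [hop left, hop right]
    total = sum(rates)
    for _ in range(steps):
        r = rng.random() * total               # pick an event with prob ~ rate
        pos += -1 if r < rates[0] else 1
        t += -math.log(rng.random()) / total   # exponential residence time
    return pos, t

pos, t = kmc_vacancy()
print(pos, t)
```

In a real simulation the rate table would come from barrier heights via Arrhenius factors (as computed with tight binding in the paper); the event-selection and clock-advance steps are unchanged.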
Error in Monte Carlo, quasi-error in Quasi-Monte Carlo
Kleiss, R H
2006-01-01
While the Quasi-Monte Carlo method of numerical integration achieves smaller integration error than standard Monte Carlo, its use in particle physics phenomenology has been hindered by the absence of a reliable way to estimate that error. The standard Monte Carlo error estimator relies on the assumption that the points are generated independently of each other and therefore fails to account for the error improvement advertised by the Quasi-Monte Carlo method. We advocate the construction of an estimator of stochastic nature, based on the ensemble of point sets with a particular discrepancy value. We investigate the consequences of this choice and give some first empirical results on the suggested estimators.
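The contrast between the two error behaviours can be demonstrated with a one-dimensional toy: plain Monte Carlo points versus a van der Corput low-discrepancy sequence (a standard quasi-random construction; the integrand is an arbitrary example).

```python
import random

def van_der_corput(n, base=2):
    # First n points of the base-2 van der Corput low-discrepancy sequence.
    seq = []
    for i in range(1, n + 1):
        x, f = 0.0, 1.0 / base
        while i > 0:
            i, d = divmod(i, base)
            x += d * f
            f /= base
        seq.append(x)
    return seq

def sample_mean(f, xs):
    return sum(f(x) for x in xs) / len(xs)

f = lambda t: t * t              # arbitrary integrand; exact integral is 1/3
n = 4096
rng = random.Random(4)
mc_est = sample_mean(f, [rng.random() for _ in range(n)])
qmc_est = sample_mean(f, van_der_corput(n))
# Plain MC error typically scales like n^(-1/2); QMC error like log(n)/n.
# The usual independent-points error estimator, however, does not apply
# to the correlated QMC points -- the problem the paper addresses.
```
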
Implementation of Monte Carlo Simulations for the Gamma Knife System
Energy Technology Data Exchange (ETDEWEB)
Xiong, W [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Huang, D [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Lee, L [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Feng, J [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Morris, K [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Calugaru, E [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Burman, C [Memorial Sloan-Kettering Cancer Center/Mercy Medical Center, 1000 N Village Ave., Rockville Centre, NY 11570 (United States); Li, J [Fox Chase Cancer Center, 333 Cottman Ave., Philadelphia, PA 17111 (United States); Ma, C-M [Fox Chase Cancer Center, 333 Cottman Ave., Philadelphia, PA 17111 (United States)
2007-06-15
Currently the Gamma Knife system is accompanied by a treatment planning system, Leksell GammaPlan (LGP), which is a standard, computer-based treatment planning system for Gamma Knife radiosurgery. In LGP, the dose calculation algorithm does not consider scatter dose contributions or the inhomogeneity effect due to the skull and air cavities. To improve the dose calculation accuracy, Monte Carlo simulations have been implemented for the Gamma Knife planning system. In this work, the 201 Cobalt-60 sources in the Gamma Knife unit are considered to have the same activity. Each Cobalt-60 source is contained in a cylindrical stainless steel capsule. The particle phase-space information is stored in four beam data files, which are collected on the inner sides of the four treatment helmets, after the cobalt beam passes through the stationary and helmet collimators. Patient geometries are rebuilt from patient CT data. Twenty-two patients are included in the Monte Carlo simulation for this study. The dose is calculated using Monte Carlo in both homogeneous and inhomogeneous geometries with identical beam parameters. To investigate the attenuation effect of the skull bone, the dose in a 16 cm diameter spherical QA phantom was measured with and without a 1.5 mm lead covering and also simulated using Monte Carlo. The dose ratios with and without the 1.5 mm lead covering are 89.8% based on measurements and 89.2% according to Monte Carlo for an 18 mm collimator helmet. For patient geometries, the Monte Carlo results show that although the relative isodose lines remain almost the same with and without inhomogeneity corrections, the difference in the absolute dose is clinically significant. The average inhomogeneity correction is (3.9 ± 0.9)% for the 22 patients investigated. These results suggest that the inhomogeneity effect should be considered in the dose calculation for Gamma Knife treatment planning.
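The reported ~90% lead-attenuation ratio can be sanity-checked with simple exponential attenuation. The attenuation coefficient below is an assumed textbook-order value for ~1.25 MeV photons in lead, not a number from the study.

```python
import math

# Narrow-beam transmission of ~1.25 MeV (Co-60 average) photons through
# 1.5 mm of lead. mu/rho ~ 0.059 cm^2/g is an assumed textbook-order value.
mu = 0.059 * 11.35      # linear attenuation coefficient of Pb, 1/cm
t = 0.15                # lead thickness, cm
transmission = math.exp(-mu * t)
print(f"transmission ~ {transmission:.3f}")   # close to the reported ~90%
```

Scatter conditions in the actual phantom differ from the narrow-beam idealisation, so agreement at the percent level is as much as this estimate can claim.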
Parodi, K; Kraemer, M; Sommerer, F; Naumann, J; Mairani, A; Brons, S
2010-01-01
Scanned ion beam delivery promises superior flexibility and accuracy for highly conformal tumour therapy in comparison to the usage of passive beam shaping systems. The attainable precision demands correct overlapping of the pencil-like beams which build up the entire dose distribution in the treatment field. In particular, improper dose application due to deviations of the lateral beam profiles from the nominal planning conditions must be prevented via appropriate beam monitoring in the beamline, prior to the entrance in the patient. To assess the necessary tolerance thresholds of the beam monitoring system at the Heidelberg Ion Beam Therapy Center, Germany, this study has investigated several worst-case scenarios for a sensitive treatment plan, namely scanned proton and carbon ion delivery to a small target volume at a shallow depth. Deviations from the nominal lateral beam profiles were simulated, which may occur because of misaligned elements or changes of the beam optic in the beamline. Data have been an...
Orfanelli, Styliani; Gazis, E
The Compact Linear Collider (CLIC) study is a feasibility study aiming at the development of an electron/positron linear collider with a centre of mass energy in the multi-TeV range. Each linac will have a length of 21 km, which means that very high accelerating gradients (>100 MV/m) are required. To achieve these gradients, a novel two-beam acceleration scheme has been designed, in which RF power is transferred from a high-current, low-energy drive beam to the low-current, high-energy main accelerating beam. A Beam Loss Monitoring (BLM) system will be designed for CLIC to meet the requirements of the accelerator complex. Its main role as part of the machine protection scheme will be to detect potentially dangerous beam instabilities and prevent subsequent injection into the main beam or drive beam decelerators. The first part of this work describes the GEANT4 Monte Carlo simulations performed to estimate the damage potential of high energy electron beams impacting a copper target. The second...
Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.
2015-09-01
This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases but very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.
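The field-to-spot size ratio effect mentioned above can be illustrated with a toy superposition of Gaussian pencil beams; the spot sigma and spacing below are assumed values, not the beam-line's.

```python
import numpy as np

def central_dose(field_mm, sigma_mm=6.0, spacing_mm=3.0):
    # Sum, at the field centre, of Gaussian pencil-beam spots laid out
    # on a square grid covering the field (assumed sigma and spacing).
    g = np.arange(-field_mm / 2, field_mm / 2 + 1e-9, spacing_mm)
    xs, ys = np.meshgrid(g, g)
    return float(np.exp(-(xs**2 + ys**2) / (2 * sigma_mm**2)).sum())

# A small field collects less lateral spread at its centre than a large
# one, so the output drops as the field side approaches the spot size.
ratio = central_dose(10.0) / central_dose(60.0)
print(f"central dose, 10 mm field relative to 60 mm: {ratio:.2f}")
```

Any analytical model that gets the lateral spread slightly wrong therefore mispredicts small fields far more than large ones, which is the trend the benchmark observed.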
A standard Event Class for Monte Carlo Generators
Institute of Scientific and Technical Information of China (English)
L.A. Gerren; M. Fischler
2001-01-01
StdHepC++ [1] is a CLHEP [2] Monte Carlo event class library which provides a common interface to Monte Carlo event generators. This work is an extensive redesign of the StdHep Fortran interface to use the full power of object-oriented design. A generated event maps naturally onto the directed acyclic graph concept, and we have used the HepMC classes to implement this. The full implementation allows the user to combine events to simulate beam pileup and access them transparently as though they were a single event.
Sánchez-Doblado, F; Andreo, P; Capote, R; Leal, A; Perucha, M; Arráns, R; Núñez, L; Mainegra, E; Lagares, J I; Carrasco, E
2003-07-21
Absolute dosimetry with ionization chambers of the narrow photon fields used in stereotactic techniques and IMRT beamlets is constrained by the lack of electron equilibrium in the radiation field. It is questionable whether stopping-power ratios in dosimetry protocols, obtained for broad photon beams and quasi-electron-equilibrium conditions, can be used in the dosimetry of narrow fields while keeping the uncertainty at the same level as for the broad beams used in accelerator calibrations. Monte Carlo simulations have been performed for two 6 MV clinical accelerators (Elekta SL-18 and Siemens Mevatron Primus), equipped with radiosurgery applicators and MLC. Narrow circular and Z-shaped on-axis and off-axis fields, as well as broad IMRT-configured beams, have been simulated together with reference 10 × 10 cm² beams. Phase-space data have been used to generate 3D dose distributions which have been compared satisfactorily with experimental profiles (ion chamber, diodes and film). Photon and electron spectra at various depths in water have been calculated, followed by Spencer-Attix (Δ = 10 keV) stopping-power ratio calculations which have been compared to those used in the IAEA TRS-398 code of practice. For water/air and PMMA/air stopping-power ratios, agreement within 0.1% has been obtained for the 10 × 10 cm² fields. For radiosurgery applicators and narrow MLC beams, the calculated s(w,air) values agree with the reference within ±0.3%, well within the estimated standard uncertainty of the reference stopping-power ratios (0.5%). Ionization chamber dosimetry of narrow beams at the photon qualities used in this work (6 MV) can therefore be based on stopping-power ratio data in dosimetry protocols. For a modulated 6 MV broad beam used in clinical IMRT, s(w,air) agrees within 0.1% with the value for 10 × 10 cm², confirming that at low energies IMRT absolute dosimetry can also be based on data for open reference fields. At higher energies (24 MV) the difference in s
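For reference, the Spencer-Attix water-to-air stopping-power ratio evaluated in this kind of study has the standard protocol form (notation as in dosimetry codes of practice; Φ_E is the electron fluence spectrum in water, L_Δ/ρ the restricted mass collision stopping power, and the second terms are the track-end contributions):

```latex
s_{w,\mathrm{air}} =
\frac{\displaystyle\int_{\Delta}^{E_{\max}}
      \left(\Phi_{E}\right)_{w}\left(L_{\Delta}/\rho\right)_{w}\,\mathrm{d}E
      \;+\;\left(\Phi_{E}(\Delta)\right)_{w}\left(S(\Delta)/\rho\right)_{w}\,\Delta}
     {\displaystyle\int_{\Delta}^{E_{\max}}
      \left(\Phi_{E}\right)_{w}\left(L_{\Delta}/\rho\right)_{\mathrm{air}}\,\mathrm{d}E
      \;+\;\left(\Phi_{E}(\Delta)\right)_{w}\left(S(\Delta)/\rho\right)_{\mathrm{air}}\,\Delta}
```

The fluence spectrum in both numerator and denominator is the one in water, which is why the Monte-Carlo-calculated spectra at depth are the key ingredient of the comparison.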
Directory of Open Access Journals (Sweden)
Joshi Chandra
2010-01-01
Underdosing of treatment targets can occur in radiation therapy due to electronic disequilibrium around air-tissue interfaces when tumors are situated near natural air cavities. These effects have been shown to increase with the beam energy and decrease with the field size. Intensity modulated radiation therapy (IMRT) and tomotherapy techniques employ combinations of multiple small radiation beamlets of varying intensities to deliver highly conformal radiation therapy. The use of small beamlets in these techniques may therefore result in underdosing of the treatment target in the air-tissue interface region surrounding an air cavity. This work was undertaken to investigate dose reductions near the air-water interfaces of 1 × 1 × 1 and 3 × 3 × 3 cm³ air cavities, typically encountered in the treatment of head and neck cancer, utilizing radiation therapy techniques such as IMRT and tomotherapy using small fields of Co-60, 6 MV and 15 MV photons. Additional investigations were performed for larger photon field sizes encompassing the entire air cavity, such as encountered in conventional three-dimensional conformal radiation therapy (3DCRT) techniques. The EGSnrc/DOSXYZnrc Monte Carlo code was used to calculate the dose reductions (in water) in the air-water interface region for single, parallel-opposed and four-field irradiations with 2 × 2 cm² (beamlet), 10 × 2 cm² (fan beam), 5 × 5 and 7 × 7 cm² field sizes. The magnitude of the dose reduction in water near the air-water interface increases with photon energy, and decreases both with distance from the interface and as the number of beams is increased. No dose reductions were observed for large field sizes encompassing the air cavities. The results demonstrate that Co-60 beams may provide significantly smaller interface dose reductions than 6 MV and 15 MV irradiations for small-field irradiations such as those used in IMRT and tomotherapy.
Cros, Maria; Joemai, Raoul M. S.; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal
2017-08-01
This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on a Monte Carlo (MC) simulation code, which was developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). The MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow-tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses from acquisitions using other available beam widths in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. The software provides a suitable tool for dose assessment in standard adult patients undergoing CT
Energy Technology Data Exchange (ETDEWEB)
Jones, Bernard L; Cho, Sang Hyun, E-mail: scho@gatech.edu [Nuclear/Radiological Engineering and Medical Physics Programs, Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332-0405 (United States)
2011-06-21
A recent study investigated the feasibility of developing a bench-top x-ray fluorescence computed tomography (XFCT) system capable of determining the spatial distribution and concentration of gold nanoparticles (GNPs) in vivo using a diagnostic-energy-range polychromatic (i.e. 110 kVp) pencil-beam source. In this follow-up study, we examined the feasibility of a polychromatic cone-beam implementation of XFCT by Monte Carlo (MC) simulations using the MCNP5 code. In the current MC model, cylindrical columns of various sizes (5-10 mm in diameter) containing water loaded with GNPs (0.1-2% gold by weight) were inserted into a 5 cm diameter cylindrical polymethyl methacrylate phantom. The phantom was then irradiated by a lead-filtered 110 kVp x-ray source, and the resulting gold fluorescence and Compton-scattered photons were collected by a series of energy-sensitive tallies after passing through lead parallel-hole collimators. A maximum-likelihood iterative reconstruction algorithm was implemented to reconstruct the image of GNP-loaded objects within the phantom. The effects of attenuation, both of the primary beam through the phantom and of the gold fluorescence photons en route to the detector, were corrected during the image reconstruction. Accurate images of the GNP-containing phantom were successfully reconstructed for three different phantom configurations, with both the spatial distribution and the relative concentration of GNPs well identified. The pixel intensity of regions containing GNPs was linearly proportional to the gold concentration. The current MC study strongly suggests the possibility of developing a bench-top, polychromatic, cone-beam XFCT system for in vivo imaging.
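The maximum-likelihood iterative reconstruction step can be sketched with the classic MLEM multiplicative update on a toy system (the system matrix and activity vector below are hypothetical, not the phantom geometry of the study):

```python
import numpy as np

# Classic MLEM multiplicative update on a toy nonnegative system:
#   x_j <- x_j / sum_i A_ij * sum_i A_ij * y_i / (A x)_i
rng = np.random.default_rng(5)
A = rng.random((40, 8))                    # toy system matrix (bins x voxels)
x_true = np.array([0.0, 2.0, 0.0, 5.0, 1.0, 0.0, 3.0, 0.0])
y = A @ x_true                             # noiseless measurements for clarity

x = np.ones(8)                             # uniform nonnegative start
sensitivity = A.sum(axis=0)
for _ in range(500):
    ratio = y / np.clip(A @ x, 1e-12, None)
    x *= (A.T @ ratio) / sensitivity       # update preserves nonnegativity
print(np.round(x, 2))
```

The multiplicative form keeps every voxel nonnegative, which is why MLEM-style updates suit count data such as fluorescence photons; attenuation corrections enter through the elements of A.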
DEFF Research Database (Denmark)
Ottosson, R O; Hauer, Anna Karlsson; Behrens, C.F.
2010-01-01
The pencil beam dose calculation method is frequently used in modern radiation therapy treatment planning despite being documented as inaccurate for cases involving large density variations. The inaccuracies are larger for higher beam energies. As a result, low energy beams...
Energy Technology Data Exchange (ETDEWEB)
Lin, Yi-Chun [Health Physics Division, Institute of Nuclear Energy Research, Taoyuan County, Taiwan (China); Huang, Tseng-Te, E-mail: huangtt@iner.gov.tw [Health Physics Division, Institute of Nuclear Energy Research, Taoyuan County, Taiwan (China); Liu, Yuan-Hao [Nuclear Science and Technology Development Center, National Tsing Hua University, Hsinchu City, Taiwan (China); Chen, Wei-Lin [Institute of Nuclear Engineering and Science, National Tsing Hua University, Hsinchu City, Taiwan (China); Chen, Yen-Fu [Atomic Energy Council, New Taipei City, Taiwan (China); Wu, Shu-Wei [Dept. of Biomedical Engineering and Environmental Sciences, National Tsing Hua University, Hsinchu, Taiwan (China); Nievaart, Sander [Institute for Energy, Joint Research Centre, European Commission, Petten (Netherlands); Jiang, Shiang-Huei [Dept. of Engineering and System Science, National Tsing Hua University, Hsinchu, Taiwan (China)
2015-06-01
The paired ionization chambers (ICs) technique is commonly employed to determine neutron and photon doses in radiology or radiotherapy neutron beams, where the derived neutron dose depends strongly on the accuracy of the accompanying high-energy photon dose. An important issue in the dose derivation is evaluating the photon and electron response functions of two commercially available ionization chambers, denoted TE(TE) and Mg(Ar), used in our reactor-based epithermal neutron beam. Nowadays, most perturbation corrections for accurate dose determination and many treatment planning systems are based on the Monte Carlo technique. We used the general-purpose Monte Carlo codes MCNP5, EGSnrc, FLUKA and GEANT4 for benchmark verification among them and against carefully measured values, for a precise estimation of chamber current from the absorbed dose rate of the cavity gas. Energy-dependent response functions of the two chambers were also calculated in a parallel beam with mono-energetic photons and electrons from 20 keV to 20 MeV, using both an optimal simple spherical IC model and a detailed one. The measurements were performed in the well-defined (a) four primary M-80, M-100, M-120 and M-150 X-ray calibration fields, (b) primary {sup 60}Co calibration beam, (c) 6 MV and 10 MV photon and (d) 6 MeV and 18 MeV electron LINACs in hospital and (e) BNCT clinical trials neutron beam. For the TE(TE) chamber, all codes were almost identical over the whole photon energy range. For the Mg(Ar) chamber, MCNP5 showed a lower response than the other codes for photon energies below 0.1 MeV and a similar response above 0.2 MeV (agreement within 5% in the simple spherical model). With increasing electron energy, the response difference between MCNP5 and the other codes became larger in both chambers. Compared with the measured currents, MCNP5 agreed with the measurement data within 5% for the {sup 60}Co, 6 MV, 10 MV, 6 MeV and 18 MeV LINAC beams. But for the Mg(Ar) chamber, the derivations
Cortés-Giraldo, M A; Carabe, A
2015-04-07
We compare unrestricted dose average linear energy transfer (LET) maps calculated with three different Monte Carlo scoring methods in voxelized geometries irradiated with proton therapy beams. Simulations were done with the Geant4 (Geometry ANd Tracking) toolkit. The first method corresponds to a step-by-step computation of LET which has been reported previously in the literature. We found that this scoring strategy is influenced by spurious high-LET components, whose relative contribution to the dose average LET calculations increases significantly as the voxel size becomes smaller. Dose average LET values calculated for primary protons in water with a voxel size of 0.2 mm were a factor ~1.8 higher than those obtained with a size of 2.0 mm at the plateau region for a 160 MeV beam. Such high-LET components are a consequence of proton steps in which the condensed-history algorithm determines an energy transfer to an electron of the material close to the maximum value, while the step length remains limited due to voxel boundary crossing. Two alternative methods were derived to overcome this problem. The second scores LET along the entire path described by each proton within the voxel. The third followed the same approach as the first method, but the LET was evaluated at each step from stopping power tables according to the proton kinetic energy value. We carried out microdosimetry calculations with the aim of deriving reference dose average LET values from microdosimetric quantities. Significant differences between the methods were reported with either pristine or spread-out Bragg peaks (SOBPs). The first method reported values systematically higher than the other two at depths proximal to the SOBP, by about 15% for a 5.9 cm wide SOBP and about 30% for an 11.0 cm one. At the distal SOBP, the second method gave values about 15% lower than the others. Overall, we found that the third method gave the most consistent
Ferretti, A; Martignano, A; Simonato, F; Paiusco, M
2014-02-01
The aim of the present work was the validation of the VMC(++) Monte Carlo (MC) engine implemented in Oncentra Masterplan (OMTPS) and used to calculate the dose distribution produced by the electron beams (energy 5-12 MeV) generated by the linear accelerator (linac) Primus (Siemens), shaped by a digital variable applicator (DEVA). The BEAMnrc/DOSXYZnrc (EGSnrc package) MC model of the linac head was used as a benchmark. Commissioning results for both MC codes were evaluated by means of 1D gamma analysis (2%, 2 mm), calculated with a home-made Matlab (The MathWorks) program, comparing the calculations with the measured profiles. The results of the commissioning of OMTPS were good [average gamma index (γ) > 97%]; some mismatches were found with large beams (size ≥ 15 cm). The optimization of the BEAMnrc model required increasing the beam exit window to match the calculated and measured profiles (final average γ > 98%). Then OMTPS dose distribution maps were compared with DOSXYZnrc with a 2D gamma analysis (3%, 3 mm) in 3 virtual water phantoms: (a) with an air step, (b) with an air insert, and (c) with a bone insert. The OMTPS and EGSnrc dose distributions with the air-water step phantom were in very high agreement (γ ∼ 99%), while for heterogeneous phantoms there were differences of about 9% in the air insert and of about 10-15% in the bone region. This is due to the Masterplan implementation of VMC(++), which reports the dose as "dose to water" instead of "dose to medium".
Monte Carlo simulations for heavy ion dosimetry
Energy Technology Data Exchange (ETDEWEB)
Geithner, O.
2006-07-26
Water-to-air stopping power ratio (s{sub w,air}) calculations for the ionization chamber dosimetry of clinically relevant ion beams with initial energies from 50 to 450 MeV/u have been performed using the Monte Carlo technique. To simulate the transport of a particle in water the computer code SHIELD-HIT v2 was used, which is a substantially modified version of its predecessor SHIELD-HIT v1. The code was partially rewritten, replacing formerly used single precision variables with double precision variables. The lowest particle transport specific energy was decreased from 1 MeV/u down to 10 keV/u by modifying the Bethe-Bloch formula, thus widening its range for medical dosimetry applications. Optional MSTAR and ICRU-73 stopping power data were included. The fragmentation model was verified using all available experimental data and some parameters were adjusted. The present code version shows excellent agreement with experimental data. In addition to the calculations of stopping power ratios, s{sub w,air}, the influence of fragments and I-values on s{sub w,air} for carbon ion beams was investigated. The value of s{sub w,air} deviates by as much as 2.3% at the Bragg peak from the constant value of 1.130 recommended by TRS-398 for an energy of 50 MeV/u. (orig.)
Björk, Peter; Knöös, Tommy; Nilsson, Per
2004-10-07
The aim of the present study was to investigate three different detector types (a parallel-plate ionization chamber, a p-type silicon diode and a diamond detector) with regard to output factor measurements in degraded electron beams, such as those encountered in small-electron-field radiotherapy and intraoperative radiation therapy (IORT). The Monte Carlo method was used to calculate mass collision stopping-power ratios between water and the different detector materials for these complex electron beams (nominal energies of 6, 12 and 20 MeV). The diamond detector was shown to exhibit excellent properties for output factor measurements in degraded beams and was therefore used as a reference. The diode detector was found to be well suited for practical measurements of output factors, although the water-to-silicon stopping-power ratio was shown to vary slightly with treatment set-up and irradiation depth (especially for lower electron energies). Application of ionization-chamber-based dosimetry, according to international dosimetry protocols, will introduce uncertainties smaller than 0.3% into the output factor determination for conventional IORT beams if the variation of the water-to-air stopping-power ratio is not taken into account. The IORT system at our department includes a 0.3 cm thin plastic scatterer inside the therapeutic beam, which furthermore increases the energy degradation of the electrons. By ignoring the change in the water-to-air stopping-power ratio due to this scatterer, the output factor could be underestimated by up to 1.3%. This was verified by the measurements. In small-electron-beam dosimetry, the water-to-air stopping-power ratio variation with field size could mostly be ignored. For fields with flat lateral dose profiles (>3 x 3 cm2), output factors determined with the ionization chamber were found to be in close agreement with the results of the diamond detector. For smaller field sizes the lateral extension of the ionization chamber hampers
Approaching Chemical Accuracy with Quantum Monte Carlo
Petruzielo, F R; Umrigar, C J
2012-01-01
A quantum Monte Carlo study of the atomization energies for the G2 set of molecules is presented. Basis size dependence of diffusion Monte Carlo atomization energies is studied with a single determinant Slater-Jastrow trial wavefunction formed from Hartree-Fock orbitals. With the largest basis set, the mean absolute deviation from experimental atomization energies for the G2 set is 3.0 kcal/mol. Optimizing the orbitals within variational Monte Carlo improves the agreement between diffusion Monte Carlo and experiment, reducing the mean absolute deviation to 2.1 kcal/mol. Moving beyond a single determinant Slater-Jastrow trial wavefunction, diffusion Monte Carlo with a small complete active space Slater-Jastrow trial wavefunction results in near chemical accuracy. In this case, the mean absolute deviation from experimental atomization energies is 1.2 kcal/mol. It is shown from calculations on systems containing phosphorus that the accuracy can be further improved by employing a larger active space.
Monte Carlo simulation of large electron fields
Faddegon, Bruce A.; Perl, Joseph; Asai, Makoto
2008-03-01
Two Monte Carlo systems, EGSnrc and Geant4, the latter with two different 'physics lists,' were used to calculate dose distributions in large electron fields used in radiotherapy. Source and geometry parameters were adjusted to match calculated results to measurement. Both codes were capable of accurately reproducing the measured dose distributions of the six electron beams available on the accelerator. Depth penetration matched the average measured with a diode and parallel-plate chamber to 0.04 cm or better. Calculated depth dose curves agreed to 2% with diode measurements in the build-up region, although for the lower beam energies there was a discrepancy of up to 5% in this region when calculated results are compared to parallel-plate measurements. Dose profiles at the depth of maximum dose matched to 2-3% in the central 25 cm of the field, corresponding to the field size of the largest applicator. A 4% match was obtained outside the central region. The discrepancy observed in the bremsstrahlung tail in published results that used EGS4 is no longer evident. Simulations with the different codes and physics lists used different source energies, incident beam angles, thicknesses of the primary foils, and distance between the primary and secondary foil. The true source and geometry parameters were not known with sufficient accuracy to determine which parameter set, including the energy of the source, was closest to the truth. These results underscore the requirement for experimental benchmarks of depth penetration and electron scatter for beam energies and foils relevant to radiotherapy.
Acceleration of the Monte Carlo EM Algorithm
Institute of Scientific and Technical Information of China (English)
罗季
2008-01-01
The EM algorithm is a widely used data augmentation algorithm for estimating the posterior mode, but a closed-form expression for the integral in its E-step is sometimes difficult or even impossible to obtain, which limits its applicability. The Monte Carlo EM algorithm solves this problem well by evaluating the E-step integral with Monte Carlo simulation, greatly broadening the method's applicability. However, both the EM algorithm and the Monte Carlo EM algorithm converge only linearly, at a rate governed by the fraction of missing information; when the proportion of missing data is high, convergence is very slow. The Newton-Raphson algorithm, by contrast, converges quadratically in a neighborhood of the posterior mode. This paper proposes an accelerated Monte Carlo EM algorithm that combines the Monte Carlo EM algorithm with the Newton-Raphson algorithm: the E-step is still realized by Monte Carlo simulation, and the resulting algorithm is proved to converge quadratically near the posterior mode. It thus retains the advantages of the Monte Carlo EM algorithm while improving its convergence rate. Numerical examples comparing the accelerated algorithm with the EM and Monte Carlo EM algorithms further demonstrate its merits.
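The Monte Carlo E-step idea can be illustrated with a standard toy problem: estimating the rate of an exponential distribution when some observations are right-censored. The data, censoring point, and iteration counts below are illustrative choices, not taken from the paper:

```python
import numpy as np

# Toy Monte Carlo EM: exponential rate estimation with right-censoring.
# The E-step expectation of a censored lifetime is replaced by a Monte
# Carlo average over draws from the truncated (shifted) exponential.
rng = np.random.default_rng(1)
lam_true, c = 2.0, 0.5
t = rng.exponential(1 / lam_true, 2000)
obs = np.minimum(t, c)                 # what we actually observe
cens = t > c                           # censoring indicators

lam = 1.0                              # initial guess
for _ in range(50):
    # MC E-step: by memorylessness, t | t > c  ~  c + Exp(lam)
    fill = c + rng.exponential(1 / lam, (200, int(cens.sum()))).mean(axis=0)
    total = obs[~cens].sum() + fill.sum()
    lam = len(obs) / total             # M-step: complete-data MLE
print(f"estimated rate: {lam:.2f}")
```

The fixed point of this iteration is the censored-data maximum likelihood estimate, so the printed rate should land near the true value of 2 up to sampling error.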
Energy Technology Data Exchange (ETDEWEB)
Son, Myung-Sik; Rhee, Jin-Koo [Dongguk University, Seoul (Korea, Republic of); Lee, Jun-Ha [Sangmyung University, Chonan (Korea, Republic of); Hwang, Ho-Jung [Chung-Ang University, Seoul (Korea, Republic of)
2004-08-15
A computationally efficient and accurate Monte Carlo (MC) simulator for electron beam lithography has been developed and applied for the sub-0.1-mum T-shaped gate (T-gate) process in HEMT devices for the millimeter-wave applications. The enhanced MC simulator for the electron trajectory includes elastic scattering and inelastic scatterings, which include inner-shell ionizations, outer-shell (free) excitations, and plasmon excitations in multi-layer resists and heterogeneous substrates. Our model has been applied to the structure of PMMA/P(MMA-MAA)/PMMA on a GaAs substrate to form the T-gate shape in resist layers. We considered and modeled a real fabrication process, such as the electron-beam double-exposure method, to obtain better reproducibility and controllability in the fabrication of high electron mobility transistor (HEMT) devices. To model an accurate T-gate process by using electron beam lithography, we have modeled three different developers using a string algorithm such as MCB, Methanol : IPA (1 : 1), and MIBK : IPA (1 : 3). Our simulations for the T-gate electron beam lithography have been verified by comparing them with the SEM measurements at a 50-keV electron beam exposure system. In this paper, we show and discuss the differences of exposure profiles and developed pattern shapes for the sub-0.1-mum T-gate formation process in trilayer resists using 50-kV and 100-kV electron beam exposure systems.
Molinelli, S.; Mairani, A.; Mirandola, A.; Vilches Freixas, G.; Tessonnier, T.; Giordanengo, S.; Parodi, K.; Ciocca, M.; Orecchia, R.
2013-06-01
During one year of clinical activity at the Italian National Center for Oncological Hadron Therapy 31 patients were treated with actively scanned proton beams. Results of patient-specific quality assurance procedures are presented here which assess the accuracy of a three-dimensional dose verification technique with the simultaneous use of multiple small-volume ionization chambers. To investigate critical cases of major deviations between treatment planning system (TPS) calculated and measured data points, a Monte Carlo (MC) simulation tool was implemented for plan verification in water. Starting from MC results, the impact of dose calculation, dose delivery and measurement set-up uncertainties on plan verification results was analyzed. All resulting patient-specific quality checks were within the acceptance threshold, which was set at 5% for both mean deviation between measured and calculated doses and standard deviation. The mean deviation between TPS dose calculation and measurement was less than ±3% in 86% of the cases. When all three sources of uncertainty were accounted for, simulated data sets showed a high level of agreement, with mean and maximum absolute deviation lower than 2.5% and 5%, respectively.
Montanari, Davide; Silvestri, Chiara; Graves, Yan J; Yan, Hao; Cervino, Laura; Rice, Roger; Jiang, Steve B; Jia, Xun
2013-01-01
Cone beam CT (CBCT) has been widely used for patient setup in image guided radiation therapy (IGRT). Radiation dose from CBCT scans has become a clinical concern. The purposes of this study are 1) to commission a GPU-based Monte Carlo (MC) dose calculation package gCTD for Varian On-Board Imaging (OBI) system and test the calculation accuracy, and 2) to quantitatively evaluate CBCT dose from the OBI system in typical IGRT scan protocols. We first conducted dose measurements in a water phantom. X-ray source model parameters used in gCTD are obtained through a commissioning process. gCTD accuracy is demonstrated by comparing calculations with measurements in water and in CTDI phantoms. 25 brain cancer patients are used to study dose in a standard-dose head protocol, and 25 prostate cancer patients are used to study dose in pelvis protocol and pelvis spotlight protocol. Mean dose to each organ is calculated. Mean dose to 2% voxels that have the highest dose is also computed to quantify the maximum dose. It is fo...
Random Numbers and Monte Carlo Methods
Scherer, Philipp O. J.
Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages Monte Carlo methods are very useful which sample the integration volume at randomly chosen points. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with given probability distribution which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
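The Metropolis algorithm summarized above fits in a few lines. This sketch samples a one-dimensional standard normal; the proposal width and chain length are illustrative choices, not values from the book:

```python
import numpy as np

# Minimal Metropolis sampler for a 1D standard normal target.
rng = np.random.default_rng(2)

def log_p(x):
    return -0.5 * x * x                 # unnormalized log-density of N(0, 1)

x, chain = 0.0, []
for _ in range(50_000):
    prop = x + rng.uniform(-1.0, 1.0)   # symmetric random-walk proposal
    if np.log(rng.random()) < log_p(prop) - log_p(x):
        x = prop                         # accept; otherwise keep current x
    chain.append(x)

samples = np.array(chain[5_000:])        # discard burn-in
print(f"sample mean: {samples.mean():.2f}, sample variance: {samples.var():.2f}")
```

Because only the ratio of target densities enters the acceptance test, the normalizing constant is never needed, which is exactly what makes the method useful for thermodynamic averages.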
SMCTC: Sequential Monte Carlo in C++
Directory of Open Access Journals (Sweden)
Adam M. Johansen
2009-04-01
Sequential Monte Carlo methods are a very general class of Monte Carlo methods for sampling from sequences of distributions. Simple examples of these algorithms are used very widely in the tracking and signal processing literature. Recent developments illustrate that these techniques have much more general applicability, and can be applied very effectively to statistical inference problems. Unfortunately, these methods are often perceived as being computationally expensive and difficult to implement. This article seeks to address both of these problems. A C++ template class library for the efficient and convenient implementation of very general Sequential Monte Carlo algorithms is presented. Two example applications are provided: a simple particle filter for illustrative purposes and a state-of-the-art algorithm for rare event estimation.
Shell model the Monte Carlo way
Energy Technology Data Exchange (ETDEWEB)
Ormand, W.E.
1995-03-01
The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined.
Quantum Monte Carlo with variable spins.
Melton, Cody A; Bennett, M Chandler; Mitas, Lubos
2016-06-28
We investigate the inclusion of variable spins in electronic structure quantum Monte Carlo, with a focus on diffusion Monte Carlo with Hamiltonians that include spin-orbit interactions. Following our previous introduction of fixed-phase spin-orbit diffusion Monte Carlo, we thoroughly discuss the details of the method and elaborate upon its technicalities. We present a proof for an upper-bound property for complex nonlocal operators, which allows for the implementation of T-moves to ensure the variational property. We discuss the time step biases associated with our particular choice of spin representation. Applications of the method are also presented for atomic and molecular systems. We calculate the binding energies and geometry of the PbH and Sn2 molecules, as well as the electron affinities of the 6p row elements in close agreement with experiments.
A brief introduction to Monte Carlo simulation.
Bonate, P L
2001-01-01
Simulation affects our life every day through our interactions with the automobile, airline and entertainment industries, just to name a few. The use of simulation in drug development is relatively new, but its use is increasing as modern computers become faster. One well-known example of simulation in drug development is molecular modelling. Another use of simulation seen recently in drug development is Monte Carlo simulation of clinical trials. Monte Carlo simulation differs from traditional simulation in that the model parameters are treated as stochastic or random variables, rather than as fixed values. The purpose of this paper is to provide a brief introduction to Monte Carlo simulation methods.
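The distinction drawn above, treating model parameters as random variables rather than fixed values, can be sketched with a toy one-compartment decay model. All population values (dose, half-life distribution, threshold) are hypothetical:

```python
import numpy as np

# Monte Carlo simulation with stochastic parameters: the half-life varies
# across simulated subjects instead of being a single fixed number.
rng = np.random.default_rng(3)
n = 10_000
dose, t = 100.0, 12.0                            # mg and hours (assumed values)
half_life = rng.lognormal(np.log(6.0), 0.3, n)   # hypothetical population spread
k = np.log(2) / half_life                        # per-subject elimination rate
amount = dose * np.exp(-k * t)                   # remaining amount at time t
frac = float(np.mean(amount > 20.0))
print(f"fraction of subjects above 20 mg at {t} h: {frac:.2f}")
```

With fixed parameters the model returns one number; with stochastic parameters it returns a distribution, from which trial-level quantities such as the fraction above a threshold can be read off.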
Quantum Monte Carlo with Variable Spins
Melton, Cody A; Mitas, Lubos
2016-01-01
We investigate the inclusion of variable spins in electronic structure quantum Monte Carlo, with a focus on diffusion Monte Carlo with Hamiltonians that include spin-orbit interactions. Following our previous introduction of fixed-phase spin-orbit diffusion Monte Carlo (FPSODMC), we thoroughly discuss the details of the method and elaborate upon its technicalities. We present a proof for an upper-bound property for complex nonlocal operators, which allows for the implementation of T-moves to ensure the variational property. We discuss the time step biases associated with our particular choice of spin representation. Applications of the method are also presented for atomic and molecular systems. We calculate the binding energies and geometry of the PbH and Sn$_2$ molecules, as well as the electron affinities of the 6$p$ row elements in close agreement with experiments.
CosmoPMC: Cosmology Population Monte Carlo
Kilbinger, Martin; Cappe, Olivier; Cardoso, Jean-Francois; Fort, Gersende; Prunet, Simon; Robert, Christian P; Wraith, Darren
2011-01-01
We present the public release of the Bayesian sampling algorithm for cosmology, CosmoPMC (Cosmology Population Monte Carlo). CosmoPMC explores the parameter space of various cosmological probes, and also provides a robust estimate of the Bayesian evidence. CosmoPMC is based on an adaptive importance sampling method called Population Monte Carlo (PMC). Various cosmology likelihood modules are implemented, and new modules can be added easily. The importance-sampling algorithm is written in C, and fully parallelised using the Message Passing Interface (MPI). Due to very little overhead, the wall-clock time required for sampling scales approximately with the number of CPUs. The CosmoPMC package contains post-processing and plotting programs, and in addition a Monte-Carlo Markov chain (MCMC) algorithm. The sampling engine is implemented in the library pmclib, and can be used independently. The software is available for download at http://www.cosmopmc.info.
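The adaptive importance-sampling idea behind PMC can be shown in miniature: draw from a proposal, weight by target over proposal, and re-fit the proposal to the weighted sample. This one-dimensional Gaussian toy is only a sketch of the scheme, not CosmoPMC's multi-component implementation:

```python
import numpy as np

# Minimal Population Monte Carlo: a Gaussian proposal is iteratively
# re-fitted to the importance-weighted sample. Target: N(3, 1).
rng = np.random.default_rng(4)

def log_target(x):
    return -0.5 * (x - 3.0) ** 2        # unnormalized log-density

mu, sig = 0.0, 5.0                      # deliberately poor starting proposal
for _ in range(10):
    x = rng.normal(mu, sig, 5000)
    log_q = -0.5 * ((x - mu) / sig) ** 2 - np.log(sig)
    w = np.exp(log_target(x) - log_q)
    w /= w.sum()                        # normalized importance weights
    mu = float(np.sum(w * x))           # re-fit proposal mean...
    sig = float(np.sqrt(np.sum(w * (x - mu) ** 2)))  # ...and width
print(f"fitted proposal: mu={mu:.2f}, sigma={sig:.2f}")
```

As the proposal approaches the target, the weights become nearly uniform, which is also why PMC yields the Bayesian evidence essentially for free from the same weighted samples.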
Quantum speedup of Monte Carlo methods.
Montanaro, Ashley
2015-09-08
Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently.
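The classical baseline that the quantum algorithm accelerates is plain sample averaging, whose error shrinks like 1/sqrt(N). A standard toy instance (estimating pi, chosen here purely for illustration):

```python
import numpy as np

# Classical Monte Carlo estimation: average random samples of a bounded-
# variance quantity; here the indicator of landing in a quarter circle.
rng = np.random.default_rng(5)
n = 1_000_000
pts = rng.random((n, 2))                  # uniform points in the unit square
inside = (pts ** 2).sum(axis=1) < 1.0     # inside the quarter circle?
pi_est = 4.0 * float(inside.mean())
print(f"pi estimate from {n} samples: {pi_est:.3f}")
```

The near-quadratic quantum speedup described above refers to reducing the number of such samples needed for a given accuracy from O(1/eps^2) to roughly O(1/eps).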
Adiabatic optimization versus diffusion Monte Carlo methods
Jarret, Michael; Jordan, Stephen P.; Lackey, Brad
2016-10-01
Most experimental and theoretical studies of adiabatic optimization use stoquastic Hamiltonians, whose ground states are expressible using only real nonnegative amplitudes. This raises a question as to whether classical Monte Carlo methods can simulate stoquastic adiabatic algorithms with polynomial overhead. Here we analyze diffusion Monte Carlo algorithms. We argue that, based on differences between L1 and L2 normalized states, these algorithms suffer from certain obstructions preventing them from efficiently simulating stoquastic adiabatic evolution in generality. In practice, however, we obtain good performance by introducing a method that we call Substochastic Monte Carlo. In fact, our simulations are good classical optimization algorithms in their own right, competitive with the best previously known heuristic solvers for MAX-k-SAT at k = 2, 3, 4.
Self-learning Monte Carlo method
Liu, Junwei; Qi, Yang; Meng, Zi Yang; Fu, Liang
2017-01-01
Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large size systems close to the phase transition, for which local updates perform badly. In this Rapid Communication, we propose a general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from the training data generated in trial simulations and then used to speed up the actual simulation. We demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup.
Monte Carlo strategies in scientific computing
Liu, Jun S
2008-01-01
This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters' or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at the Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...
Zheng, Dandan; Zhang, Qinghui; Liang, Xiaoying; Zhu, Xiaofeng; Verma, Vivek; Wang, Shuo; Zhou, Sumin
2016-07-08
In lung stereotactic body radiotherapy (SBRT) cases, the pencil beam (PB) dose calculation algorithm is known to overestimate target dose as compared to the more accurate Monte Carlo (MC) algorithm. We investigated whether changing the normalized prescription isodose line affected the magnitude of MC vs. PB target dose differences. Forty-eight patient plans and twenty virtual-tumor phantom plans were studied. For patient plans, four alternative plans prescribed to 60%, 70%, 80%, and 90% isodose lines were each created for 12 patients who previously received lung SBRT treatments. Using 6 MV dynamic conformal arcs, the plans were individually optimized to achieve similar dose coverage and conformity for all plans of the same patient, albeit at the different prescription levels. These plans, having used a PB algorithm, were all recalculated with MC to compare the target dose differences. The relative MC vs. PB target dose variations were investigated by comparing PTV D95, Dmean, and D5 loss at the four prescription levels. The MC-to-PB ratio of the plan heterogeneity index (HI) was also evaluated and compared among different isodose levels. To definitively demonstrate the cause of the isodose line dependence, a simulated phantom study was conducted using simple, spherical virtual tumors planned with uniform block margins. The tumor size and beam energy were also altered in the phantom study to investigate the interplay between these confounding factors and the isodose line effect. The magnitude of the target dose overestimation by PB was greater for higher prescription isodose levels. The MC vs. PB reduction in the target dose coverage indices, D95 and V100 of PTV, were found to monotonically increase with increasing isodose lines from 60% to 90%, resulting in more pronounced target dose coverage deficiency at higher isodose prescription levels. No isodose level-dependent trend was observed for the dose errors in the target mean or high dose indices, Dmean or D5. The
Shrestha, Suman; Vedantham, Srinivasan; Karellas, Andrew
2017-03-01
In digital breast tomosynthesis and digital mammography, the x-ray beam filter material and thickness vary between systems. Replacing K-edge filters with Al was investigated with the intent to reduce exposure duration and to simplify system design. Tungsten target x-ray spectra were simulated with K-edge filters (50 µm Rh; 50 µm Ag) and Al filters of varying thickness. Monte Carlo simulations were conducted to quantify the x-ray scatter from various filters alone, scatter-to-primary ratio (SPR) with compressed breasts, and to determine the radiation dose to the breast. These data were used to analytically compute the signal-difference-to-noise ratio (SDNR) at unit (1 mGy) mean glandular dose (MGD) for W/Rh and W/Ag spectra. At SDNR matched between K-edge and Al filtered spectra, the reductions in exposure duration and MGD were quantified for three strategies: (i) fixed Al thickness and matched tube potential in kilovolts (kV); (ii) fixed Al thickness and varying the kV to match the half-value layer (HVL) between Al and K-edge filtered spectra; and, (iii) matched kV and varying the Al thickness to match the HVL between Al and K-edge filtered spectra. Monte Carlo simulations indicate that the SPR with and without the breast were not different between Al and K-edge filters. Modelling for fixed Al thickness (700 µm) and kV matched to K-edge filtered spectra, identical SDNR was achieved with 37-57% reduction in exposure duration and with 2-20% reduction in MGD, depending on breast thickness. Modelling for fixed Al thickness (700 µm) and HVL matched by increasing the kV over (0,4) range, identical SDNR was achieved with 62-65% decrease in exposure duration and with 2-24% reduction in MGD, depending on breast thickness. For kV and HVL matched to K-edge filtered spectra by varying Al filter thickness over (700, 880) µm range, identical SDNR was achieved with 23-56% reduction in exposure duration and 2-20% reduction in MGD, depending on breast thickness. These
Parallel Markov chain Monte Carlo simulations.
Ren, Ruichao; Orkoulas, G
2007-06-07
With strict detailed balance, parallel Monte Carlo simulation through domain decomposition cannot be validated with conventional Markov chain theory, which describes an intrinsically serial stochastic process. In this work, the parallel version of Markov chain theory and its role in accelerating Monte Carlo simulations via cluster computing is explored. It is shown that sequential updating is the key to improving efficiency in parallel simulations through domain decomposition. A parallel scheme is proposed to reduce the interprocessor communication and synchronization that slow down parallel simulation as the number of processors increases. Parallel simulation results for the two-dimensional lattice gas model show a substantial reduction of simulation time for systems of moderate and large size.
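The domain-decomposition idea above can be illustrated with a minimal checkerboard scheme for the two-dimensional lattice gas. This is an illustrative sketch, not the authors' code (lattice size, coupling and chemical potential are arbitrary choices): sites of one sublattice are never nearest neighbours, so each half-sweep could be distributed across processors without conflicting updates, while the fixed black-then-white order provides the sequential updating the paper emphasises.

```python
import math
import random

def half_sweep(lattice, beta, mu, color, rng):
    """Metropolis updates for all sites whose (x+y) parity equals `color`.
    Sites of one parity are never nearest neighbours, so this half-sweep
    could be split across processors without conflicting updates."""
    n = len(lattice)
    for x in range(n):
        for y in range(n):
            if (x + y) % 2 != color:
                continue
            occ = lattice[x][y]
            nn = (lattice[(x + 1) % n][y] + lattice[(x - 1) % n][y] +
                  lattice[x][(y + 1) % n] + lattice[x][(y - 1) % n])
            # energy change for flipping the occupancy at (x, y) in a
            # lattice gas with nearest-neighbour attraction J = 1
            d_e = (1 - 2 * occ) * (-nn - mu)
            if d_e <= 0.0 or rng.random() < math.exp(-beta * d_e):
                lattice[x][y] = 1 - occ

def simulate(n=16, beta=1.0, mu=-2.0, sweeps=200, seed=1):
    rng = random.Random(seed)
    lattice = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps):
        half_sweep(lattice, beta, mu, 0, rng)  # "black" sublattice first,
        half_sweep(lattice, beta, mu, 1, rng)  # then "white": a fixed order
    return sum(map(sum, lattice)) / n ** 2     # mean occupancy

density = simulate()
```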
Monte Carlo Hamiltonian: Linear Potentials
Institute of Scientific and Technical Information of China (English)
LUO Xiang-Qian; Helmut KROEGER; et al.
2002-01-01
We further study the validity of the Monte Carlo Hamiltonian method. The advantage of the method, in comparison with the standard Monte Carlo Lagrangian approach, is its capability to study the excited states. We consider two quantum mechanical models: a symmetric one, V(x)=|x|/2; and an asymmetric one, V(x)=∞ for x<0 and V(x)=x for x≥0. The results for the spectrum, wave functions and thermodynamical observables are in agreement with the analytical or Runge-Kutta calculations.
Monte Carlo dose distributions for radiosurgery
Energy Technology Data Exchange (ETDEWEB)
Perucha, M.; Leal, A.; Rincon, M.; Carrasco, E. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica; Sanchez-Doblado, F. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica]|[Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Nunez, L. [Clinica Puerta de Hierro, Madrid (Spain). Servicio de Radiofisica; Arrans, R.; Sanchez-Calzado, J.A.; Errazquin, L. [Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Sanchez-Nieto, B. [Royal Marsden NHS Trust (United Kingdom). Joint Dept. of Physics]|[Inst. of Cancer Research, Sutton, Surrey (United Kingdom)
2001-07-01
The precision of radiosurgery treatment planning systems is limited by the approximations of their algorithms and by their dosimetric input data. This is especially important in small fields. The Monte Carlo method, however, is an accurate alternative, as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distributions of a planning system and of Monte Carlo. Relative shifts have been measured and, furthermore, dose-volume histograms have been calculated for the target and adjacent organs at risk. (orig.)
Monte carlo simulations of organic photovoltaics.
Groves, Chris; Greenham, Neil C
2014-01-01
Monte Carlo simulations are a valuable tool to model the generation, separation, and collection of charges in organic photovoltaics where charges move by hopping in a complex nanostructure and Coulomb interactions between charge carriers are important. We review the Monte Carlo techniques that have been applied to this problem, and describe the results of simulations of the various recombination processes that limit device performance. We show how these processes are influenced by the local physical and energetic structure of the material, providing information that is useful for design of efficient photovoltaic systems.
Monte Carlo simulation of neutron scattering instruments
Energy Technology Data Exchange (ETDEWEB)
Seeger, P.A.
1995-12-31
A library of Monte Carlo subroutines has been developed for the purpose of design of neutron scattering instruments. Using small-angle scattering as an example, the philosophy and structure of the library are described and the programs are used to compare instruments at continuous wave (CW) and long-pulse spallation source (LPSS) neutron facilities. The Monte Carlo results give a count-rate gain of a factor between 2 and 4 using time-of-flight analysis. This is comparable to scaling arguments based on the ratio of wavelength bandwidth to resolution width.
The Rational Hybrid Monte Carlo Algorithm
Clark, M A
2006-01-01
The past few years have seen considerable progress in algorithmic development for the generation of gauge fields including the effects of dynamical fermions. The Rational Hybrid Monte Carlo (RHMC) algorithm, where Hybrid Monte Carlo is performed using a rational approximation in place of the usual inverse quark matrix kernel, is one of these developments. This algorithm has been found to be extremely beneficial in many areas of lattice QCD (chiral fermions, finite temperature, Wilson fermions, etc.). We review the algorithm and some of these benefits, and we compare it against other recent algorithmic developments. We conclude with an update of the Berlin wall plot comparing the costs of all popular fermion formulations.
Morant, J J; Salvadó, M; Hernández-Girón, I; Casanovas, R; Ortega, R; Calzado, A
2013-01-01
The aim of this study was to calculate organ and effective doses for a range of available protocols in a particular cone beam CT (CBCT) scanner dedicated to dentistry and to derive effective dose conversion factors. Monte Carlo simulations were used to calculate organ and effective doses using the International Commission on Radiological Protection voxel adult male and female reference phantoms (AM and AF) in an i-CAT CBCT. Nine different fields of view (FOVs) were simulated considering full- and half-rotation modes, and also a high-resolution acquisition for a particular protocol. Dose-area product (DAP) was measured. Dose to organs varied for the different FOVs, usually being higher in the AF phantom. For 360°, effective doses were in the range of 25-66 μSv, and 46 μSv for full head. Higher contributions to the effective dose corresponded to the remainder (31%; 27-36%), salivary glands (23%; 20-29%), thyroid (13%; 8-17%), red bone marrow (10%; 9-11%) and oesophagus (7%; 4-10%). The high-resolution protocol doubled the standard resolution doses. DAP values were between 181 mGy cm² and 556 mGy cm² for 360°. For 180° protocols, dose to organs, effective dose and DAP were approximately 40% lower. A conversion factor (DAP to effective dose) of 0.130 ± 0.006 μSv mGy⁻¹ cm⁻² was derived for all the protocols, excluding full head. A wide variation in dose to eye lens and thyroid was found when shifting the FOV in the AF phantom. Organ and effective doses varied according to field size, acquisition angle and positioning of the beam relative to radiosensitive organs. Good positive correlation between calculated effective dose and measured DAP was found.
Energy Technology Data Exchange (ETDEWEB)
Jin, L; Eldib, A; Li, J; Price, R; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States)
2015-06-15
Purpose: Uneven nose surfaces, air cavities underneath, and the use of bolus present complexity and dose uncertainty when a single electron energy beam is used to plan treatment of the nose skin with a pencil beam-based planning system. This work demonstrates more accurate dose calculation and more optimal planning using energy- and intensity-modulated electron radiotherapy (MERT) delivered with a pMLC. Methods: An in-house developed Monte Carlo (MC)-based dose calculation/optimization planning system was employed for treatment planning. Phase space data (6, 9, 12 and 15 MeV) were used as the input source for MC dose calculations for the linac. To reduce the scatter-caused penumbra, a short SSD (61 cm) was used. Our previous work demonstrated good agreement in percentage depth dose and off-axis dose between calculations and film measurements for various field sizes. A MERT plan was generated for treating the nose skin using a patient geometry, and a dose volume histogram (DVH) was obtained. The work also compares 2D dose distributions between a clinically used conventional single electron energy plan and the MERT plan. Results: The MERT plan resulted in improved target dose coverage as compared to the conventional plan, which demonstrated a target dose deficit at the field edge. The conventional plan delivered a higher dose to the normal tissue underneath the nose skin, while the MERT plan resulted in improved conformity and thus reduced the normal tissue dose. Conclusion: This preliminary work illustrates that MC-based MERT planning is a promising technique for treating nose skin, not only providing more accurate dose calculation, but also offering improved target dose coverage and conformity. In addition, this technique may eliminate the necessity of bolus, which often produces dose delivery uncertainty due to the air gaps that may exist between the bolus and the skin.
Fast sequential Monte Carlo methods for counting and optimization
Rubinstein, Reuven Y; Vaisman, Radislav
2013-01-01
A comprehensive account of the theory and application of Monte Carlo methods Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the
Energy Technology Data Exchange (ETDEWEB)
Kurosu, Keita [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiology, Osaka University Hospital, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, St. Jude Children’s Research Hospital, Memphis, TN 38105 (United States)
2016-01-15
Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the inherent choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA, as observed for uniform scanning proton beams, needs to be evaluated. This means that the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation
Monte Carlo methods in AB initio quantum chemistry quantum Monte Carlo for molecules
Lester, William A; Reynolds, PJ
1994-01-01
This book presents the basic theory and application of the Monte Carlo method to the electronic structure of atoms and molecules. It assumes no previous knowledge of the subject, only a knowledge of molecular quantum mechanics at the first-year graduate level. A working knowledge of traditional ab initio quantum chemistry is helpful, but not essential.Some distinguishing features of this book are: Clear exposition of the basic theory at a level to facilitate independent study. Discussion of the various versions of the theory: diffusion Monte Carlo, Green's function Monte Carlo, and release n
Use of Monte Carlo Methods in brachytherapy
Energy Technology Data Exchange (ETDEWEB)
Granero Cabanero, D.
2015-07-01
The Monte Carlo method has become a fundamental tool for brachytherapy dosimetry, mainly because it avoids the difficulties associated with experimental dosimetry. In brachytherapy, the main handicap of experimental dosimetry is the high dose gradient near the sources, where small uncertainties in the positioning of the detectors lead to large uncertainties in the dose. This presentation mainly reviews the procedure for calculating dose distributions around a source using the Monte Carlo method, showing the difficulties inherent in these calculations. In addition, we briefly review other applications of the Monte Carlo method in brachytherapy dosimetry, such as its use in advanced calculation algorithms, the calculation of shielding barriers, or obtaining dose distributions around applicators. (Author)
On the use of stochastic approximation Monte Carlo for Monte Carlo integration
Liang, Faming
2009-03-01
The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for the problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.
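For context, the conventional baseline the SAMC estimator is compared against fits in a few lines; a minimal sketch of a random-walk Metropolis-Hastings estimator in Python (the Gaussian target and the test function are arbitrary examples, not taken from the paper):

```python
import math
import random

def metropolis_expectation(log_density, h, x0=0.0, steps=50000,
                           step_size=1.0, seed=0):
    """Estimate E_pi[h(X)] for pi proportional to exp(log_density) with a
    random-walk Metropolis sampler, averaging h along the chain."""
    rng = random.Random(seed)
    x, logp = x0, log_density(x0)
    total = 0.0
    for _ in range(steps):
        y = x + rng.gauss(0.0, step_size)            # symmetric proposal
        logq = log_density(y)
        if rng.random() < math.exp(min(0.0, logq - logp)):
            x, logp = y, logq                        # accept the move
        total += h(x)
    return total / steps

# Example: the second moment of a standard normal (exact value 1).
est = metropolis_expectation(lambda x: -0.5 * x * x, lambda x: x * x)
```

On rugged landscapes this plain chain gets trapped in local modes, which is the failure mode SAMC's dynamically weighted samples are designed to avoid.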
Monte Carlo simulations of the NIMROD diffractometer
Energy Technology Data Exchange (ETDEWEB)
Botti, A. [University of Roma TRE, Rome (Italy)]. E-mail: botti@fis.uniroma3.it; Ricci, M.A. [University of Roma TRE, Rome (Italy); Bowron, D.T. [ISIS-Rutherford Appleton Laboratory, Chilton (United Kingdom); Soper, A.K. [ISIS-Rutherford Appleton Laboratory, Chilton (United Kingdom)
2006-11-15
The near and intermediate range order diffractometer (NIMROD) has been selected as a day one instrument on the second target station at ISIS. Uniquely, NIMROD will provide continuous access to particle separations ranging from the interatomic (<1 Å) to the mesoscopic (<300 Å). This instrument is mainly designed for structural investigations, although the possibility of putting a Fermi chopper (and corresponding NIMONIC chopper) in the incident beam line will potentially allow the performance of low resolution inelastic scattering measurements. The performance characteristics of the TOF diffractometer have been simulated by means of a series of Monte Carlo calculations. In particular, the flux as a function of the transferred momentum Q, as well as the resolution in Q and transferred energy, have been estimated. Moreover, the possibility of including a honeycomb collimator in order to achieve better resolution has been tested. Here, we want to present the design of this diffractometer that will bridge the gap between wide- and small-angle neutron scattering experiments.
Abuhaimed, Abdullah; Martin, Colin J.; Sankaralingam, Marimuthu; Gentle, David J.
2015-07-01
A function called Gx(L) was introduced by the International Commission on Radiation Units and Measurements (ICRU) Report-87 to facilitate measurement of cumulative dose for CT scans within long phantoms, as recommended by the American Association of Physicists in Medicine (AAPM) TG-111. The Gx(L) function is equal to the ratio of the cumulative dose at the middle of a CT scan to the volume weighted CTDI (CTDIvol), and was investigated for conventional multi-slice CT scanners operating with a moving table. As the stationary table mode, which is the basis for cone beam CT (CBCT) scans, differs from that used for conventional CT scans, the aim of this study was to investigate the extension of the Gx(L) function to CBCT scans. An On-Board Imager (OBI) system integrated with a TrueBeam linac was simulated with Monte Carlo EGSnrc/BEAMnrc, and the absorbed dose was calculated within PMMA, polyethylene (PE), and water head and body phantoms using EGSnrc/DOSXYZnrc, where the PE body phantom emulated the ICRU/AAPM phantom. Beams of width 40-500 mm and beam qualities at tube potentials of 80-140 kV were studied. Application of a modified function of beam width (W) termed Gx(W), for which the cumulative dose for CBCT scans f(0) is normalized to the weighted CTDI (CTDIw) for a reference beam of width 40 mm, was investigated as a possible option. However, differences were found in Gx(W) with tube potential, especially for body phantoms, and these were considered to be due to differences in geometry between the wide beams used for CBCT scans and those for conventional CT. Therefore, a modified function Gx(W)100 has been proposed, taking the form of values of f(0) at each position in a long phantom, normalized with respect to dose indices f100(150)x measured with a 100 mm pencil ionization chamber within standard 150 mm PMMA phantoms, using the same scanning parameters, beam widths and positions within the phantom. f100(150)x averages the dose resulting from
Energy Technology Data Exchange (ETDEWEB)
Ahmad, Syed Bilal, E-mail: ahmadsb@mcmaster.ca [TAB-104D, Medical Physics and Applied Radiation Sciences, McMaster University, Hamilton, Ontario, Canada L8S 4K1 (Canada); Thompson, Jeroen E., E-mail: Jeroen.thompson@gmail.com [Medical Physics and Applied Radiation Sciences, McMaster University, Hamilton, Ontario, Canada L8S 4K1 (Canada); McNeill, Fiona E., E-mail: fmcneill@mcmaster.ca [Medical Physics and Applied Radiation Sciences, McMaster University, Hamilton, Ontario, Canada L8S 4K1 (Canada); Byun, Soo Hyun, E-mail: soohyun@mcmaster.ca [Medical Physics and Applied Radiation Sciences, McMaster University, Hamilton, Ontario, Canada L8S 4K1 (Canada); Prestwich, William V., E-mail: prestwic@mcmaster.ca [Medical Physics and Applied Radiation Sciences, McMaster University, Hamilton, Ontario, Canada L8S 4K1 (Canada)
2013-01-15
The goal of a microbeam is to deliver a highly localized and small dose to the biological medium. This can be achieved by using a set of collimators that confine the charged particle beam to a very small spatial area of the order of microns in diameter. By using a system that combines an appropriate beam detection method that signals to a beam shut-down mechanism, a predetermined and counted number of energetic particles can be delivered to targeted biological cells. Since the shutter and the collimators block a significant proportion of the beam, there is a probability of the production of low energy X-rays and secondary electrons through interactions with the beam. There is little information in the biological microbeam literature on potential X-ray production. We therefore used Monte Carlo simulations to investigate the potential production of particle-induced X-rays and secondary electrons in the collimation system (which is predominantly made of tungsten) and the subsequent possible effects on the total absorbed dose delivered to the biological medium. We found, through the simulation, no evidence of the escape of X-rays or secondary electrons from the collimation system for proton energies up to 3 MeV as we found that the thickness of the collimators is sufficient to reabsorb all of the generated low energy X-rays and secondary electrons. However, if the proton energy exceeds 3 MeV our simulations suggest that 10 keV X-rays can escape the collimator and expose the overlying layer of cells and medium. If the proton energy is further increased to 4.5 MeV or beyond, the collimator can become a significant source of 10 keV and 59 keV X-rays. These additional radiation fields could have effects on cells and these results should be verified through experimental measurement. We suggest that researchers using biological microbeams at higher energies need to be aware that cells may be exposed to a mixed LET radiation field and be careful in their interpretation of
Parrish, R. V.; Dieudonne, J. E.; Filippas, T. A.
1971-01-01
An algorithm employing a modified sequential random perturbation, or creeping random search, was applied to the problem of optimizing the parameters of a high-energy beam transport system. The stochastic solution of the mathematical model for first-order magnetic-field expansion allows the inclusion of state-variable constraints, and the inclusion of parameter constraints allowed by the method of algorithm application eliminates the possibility of infeasible solutions. The mathematical model and the algorithm were programmed for a real-time simulation facility; thus, two important features are provided to the beam designer: (1) a strong degree of man-machine communication (even to the extent of bypassing the algorithm and applying analog-matching techniques), and (2) extensive graphics for displaying information concerning both algorithm operation and transport-system behavior. Chromatic aberration was also included in the mathematical model and in the optimization process. Results presented show this method yielding better solutions (in terms of resolutions) to the particular problem than those of a standard analog program, as well as demonstrating the flexibility, in terms of elements, constraints, and chromatic aberration, allowed by user interaction with both the algorithm and the stochastic model. Examples of slit usage and a limited comparison of predicted results and actual results obtained with a 600 MeV cyclotron are given.
A comparison of Monte Carlo generators
Golan, Tomasz
2014-01-01
A comparison of the GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: proton multiplicity, total visible energy, most energetic proton momentum, and the $\pi^+$ two-dimensional energy vs. cosine distribution.
Monte Carlo Tools for Jet Quenching
Zapp, Korinna
2011-01-01
A thorough understanding of jet quenching on the basis of multi-particle final states and jet observables requires new theoretical tools. This talk summarises the status and prospects of the theoretical description of jet quenching in terms of Monte Carlo generators.
An Introduction to Monte Carlo Methods
Raeside, D. E.
1974-01-01
Reviews the principles of Monte Carlo calculation and random number generation in an attempt to introduce the direct and the rejection method of sampling techniques as well as the variance-reduction procedures. Indicates that the increasing availability of computers makes it possible for a wider audience to learn about these powerful methods. (CC)
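The rejection method reviewed above can be sketched in a few lines of Python; this is an illustrative example (the triangular target density and uniform proposal are arbitrary choices, not from the reviewed text):

```python
import random

def rejection_sample(target_pdf, propose, proposal_pdf, c, rng):
    """The rejection method: draw x from the proposal and accept it with
    probability target_pdf(x) / (c * proposal_pdf(x)), where the constant
    c must bound the ratio target_pdf / proposal_pdf everywhere."""
    while True:
        x = propose(rng)
        if rng.random() * c * proposal_pdf(x) <= target_pdf(x):
            return x

# Example: the triangular density f(x) = 2x on [0, 1], with a uniform
# proposal (g = 1), so c = 2 bounds f/g.
rng = random.Random(42)
samples = [rejection_sample(lambda x: 2.0 * x, lambda r: r.random(),
                            lambda x: 1.0, 2.0, rng)
           for _ in range(20000)]
mean = sum(samples) / len(samples)  # exact mean of f is 2/3
```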
Variance Reduction Techniques in Monte Carlo Methods
Kleijnen, Jack P.C.; Ridder, A.A.N.; Rubinstein, R.Y.
2010-01-01
Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed, even though computer speed has been increasing dramatically, ever since the intr
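One classic VRT from this literature, antithetic variates, fits in a few lines; a minimal sketch under the assumption of a monotone integrand on [0, 1] (the exponential integrand is an arbitrary example):

```python
import math
import random

def mc_plain(f, n, rng):
    """Crude Monte Carlo estimate of the integral of f over [0, 1]."""
    return sum(f(rng.random()) for _ in range(n)) / n

def mc_antithetic(f, n, rng):
    """Antithetic variates: pair each uniform draw U with 1 - U.  For a
    monotone f the two evaluations are negatively correlated, so the
    paired estimator has lower variance at the same sample count."""
    total = 0.0
    for _ in range(n // 2):
        u = rng.random()
        total += f(u) + f(1.0 - u)
    return total / (2 * (n // 2))

rng = random.Random(0)
est = mc_antithetic(math.exp, 10000, rng)  # integral of e^x on [0,1] is e - 1
```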
Scalable Domain Decomposed Monte Carlo Particle Transport
Energy Technology Data Exchange (ETDEWEB)
O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)
2013-12-05
In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.
Monte Carlo methods beyond detailed balance
Schram, Raoul D.; Barkema, Gerard T.
2015-01-01
Monte Carlo algorithms are nearly always based on the concept of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed balance algorithms, starting from a conventional algorithm satisfying
An analysis of Monte Carlo tree search
CSIR Research Space (South Africa)
James, S
2017-02-01
Monte Carlo Tree Search (MCTS) is a family of directed search algorithms that has gained widespread attention in recent years. Despite the vast amount of research into MCTS, the effect of modifications on the algorithm, as well as the manner...
Monte Carlo Simulation of Counting Experiments.
Ogden, Philip M.
A computer program to perform a Monte Carlo simulation of counting experiments was written. The program was based on a mathematical derivation which started with counts in a time interval. The time interval was subdivided to form a binomial distribution with no two counts in the same subinterval. Then the number of subintervals was extended to…
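The derivation described above (a time interval subdivided so finely that no two counts share a subinterval, giving a binomial model) can be sketched as follows; the rate and interval are arbitrary illustrative values, not taken from the original program:

```python
import random

def simulate_counts(rate, t, subintervals=2000, rng=None):
    """Divide [0, t] into subintervals so short that at most one count
    falls in each; every subinterval then fires with probability
    rate * t / subintervals.  This binomial model tends to the Poisson
    distribution as the number of subintervals grows."""
    rng = rng or random.Random()
    p = rate * t / subintervals
    return sum(1 for _ in range(subintervals) if rng.random() < p)

rng = random.Random(7)
runs = [simulate_counts(5.0, 2.0, rng=rng) for _ in range(1000)]
mean_counts = sum(runs) / len(runs)  # expectation is rate * t = 10
```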
Monte Carlo simulation of NSE at reactor and spallation sources
Energy Technology Data Exchange (ETDEWEB)
Zsigmond, G.; Wechsler, D.; Mezei, F. [Hahn-Meitner-Institut Berlin, Berlin (Germany)
2001-03-01
A MC (Monte Carlo) computation study of NSE (Neutron Spin Echo) has been performed by means of VITESS, investigating the classic and TOF-NSE options at spallation sources. The use of white beams in TOF-NSE makes the flipper efficiency as a function of the neutron wavelength an important issue. The emphasis was put on the exact evaluation of flipper efficiencies for wide wavelength-band instruments. (author)
Monte Carlo simulations to replace film dosimetry in IMRT verification
Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig
2011-01-01
Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil beam based treatment planning system (TPS) using an appropriate quality assu...
Monte Carlo simulation of photon migration path in turbid media
Institute of Scientific and Technical Information of China (English)
Anonymous
2008-01-01
A new method of Monte Carlo simulation is developed to simulate the photon migration path in a scattering medium after an ultrashort-pulse laser beam enters the medium. The most probable trajectory of photons at an instant can be obtained with this method. How the photon migration paths are affected by the optical parameters of the scattering medium is analyzed. It is also concluded that the absorption coefficient has no effect on the most probable trajectory of photons.
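A minimal 2-D random-walk sketch illustrates the closing observation: when absorption is handled as a statistical weight, the sampled trajectory is independent of the absorption coefficient. This is illustrative code under that standard weighting assumption, not the paper's method (the coefficients are arbitrary values):

```python
import math
import random

def photon_path(mu_s, mu_a, n_steps, rng):
    """2-D random-walk sketch of photon migration.  Step lengths are
    drawn from the scattering mean free path 1/mu_s alone; absorption
    only attenuates a statistical weight, so the geometry of the path
    does not depend on the absorption coefficient mu_a."""
    x = y = 0.0
    weight = 1.0
    path = [(x, y, weight)]
    for _ in range(n_steps):
        step = -math.log(1.0 - rng.random()) / mu_s  # exponential free path
        theta = rng.uniform(0.0, 2.0 * math.pi)      # isotropic scattering
        x += step * math.cos(theta)
        y += step * math.sin(theta)
        weight *= math.exp(-mu_a * step)             # absorption as weight
        path.append((x, y, weight))
    return path

path = photon_path(mu_s=10.0, mu_a=0.1, n_steps=50, rng=random.Random(3))
```

Rerunning with a different mu_a but the same seed reproduces the identical trajectory with a different weight profile, which is exactly the absorption-independence the abstract states.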
The Monte Carlo Simulation of Tsinghua Homo-Source Dual-Beam Medical Accelerator
Institute of Scientific and Technical Information of China (English)
马悦; 吴朝霞; 王石; 刘亚强
2011-01-01
Objective: To apply the Monte Carlo method in simulating the Homo-Source Dual-Beam medical accelerator developed by Tsinghua University, and to lay the foundation for future research on the accelerator's kV-energy dose distribution in image-guided radiotherapy procedures. Methods: (1) Using the Monte Carlo BEAMnrc program to simulate the head of the Tsinghua Homo-Source Dual-Beam Medical Accelerator, we obtained the phase space file used as the source for the next step. (2) Using the Monte Carlo DOSXYZnrc program, we calculated the percent depth dose (PDD) and off-axis ratio (OAR), processed the dose data with a MATLAB program and displayed them in EXCEL. (3) The impact of the Monte Carlo simulation parameters on the final outcome was analyzed. (4) The calculated PDD and OAR were compared with measurements under identical conditions. Results: The PDD and OAR in a water phantom simulated using the Monte Carlo method matched those measured in the actual experiments well, and the accelerator's Monte Carlo model was obtained. Conclusions: The accelerator's Monte Carlo simulation parameters at kV energies are appreciably different from those at high energies. To obtain an accurate Monte Carlo model of the accelerator, it is important to choose the proper electron beam energy and electron spatial density distribution. The model can be used in future research on the imaging dose distribution.
Pietrzak, Robert; Konefał, Adam; Sokół, Maria; Orlef, Andrzej
2016-08-01
The success of proton therapy depends strongly on the precision of treatment planning. Dose distribution in biological tissue may be obtained from Monte Carlo simulations using various scientific codes that make very accurate calculations possible. However, there are many factors affecting the accuracy of modeling. One of them is the structure of the objects, called bins, that register the dose. In this work the influence of bin structure on the dose distributions was examined. The MCNPX code calculations of the Bragg curve for a 60 MeV proton beam were done in two ways: using simple logical detectors, i.e. volumes delimited in water, and using a precise model of the ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in a water phantom with a Marcus ionization chamber. The average local dose difference between the measured relative doses in the water phantom and those calculated by means of the logical detectors was 1.4% in the first 25 mm, whereas over the full depth range this difference was 1.6%, for a maximum uncertainty in the calculations of less than 2.4% and a maximum measuring error of 1%. In the case of the relative doses calculated with the ionization chamber model this average difference was somewhat greater, being 2.3% at depths up to 25 mm and 2.4% over the full range of depths, for a maximum uncertainty in the calculations of 3%. In the dose calculations the ionization chamber model does not offer any additional advantages over the logical detectors. The results provided by both models are similar and in good agreement with the measurements; however, the logical detector approach is a more time-effective method.
Kim, Sung Jin; Kim, Sung Kyu; Kim, Dong Ho
2015-07-01
Treatment planning system calculations in inhomogeneous regions may present significant inaccuracies due to loss of electronic equilibrium. In this study, three different dose calculation algorithms, pencil beam (PB), collapsed cone (CC), and Monte Carlo (MC), provided by our planning system were compared to assess their impact on the three-dimensional planning of lung and breast cases. A total of five breast and five lung cases were calculated by using the PB, CC, and MC algorithms. Planning target volume (PTV) and organs at risk (OARs) delineations were performed according to our institution's protocols on the Oncentra MasterPlan image registration module, on 0.3-0.5 cm computed tomography (CT) slices taken under normal respiration conditions. Intensity-modulated radiation therapy (IMRT) plans were calculated with the three algorithms for each patient. The plans were conducted on the Oncentra MasterPlan (PB and CC) and CMS Monaco (MC) treatment planning systems for 6 MV. The plans were compared in terms of the dose distribution in the target, the OAR volumes, and the monitor units (MUs). Furthermore, absolute dosimetry was measured using a three-dimensional diode array detector (ArcCHECK) to evaluate the dose differences in a homogeneous phantom. Comparing the dose distributions planned by using the PB, CC, and MC algorithms, the PB algorithm provided adequate coverage of the PTV. The MUs calculated using the PB algorithm were less than those calculated by using the other two algorithms. The MC algorithm showed the highest accuracy in terms of the absolute dosimetry. Differences were found when comparing the calculation algorithms. The PB algorithm estimated higher doses for the target than the CC and the MC algorithms, actually overestimating the dose compared with those calculated by using the CC and the MC algorithms. The MC algorithm showed better accuracy than the other algorithms.
Chabert, I.; Barat, E.; Dautremer, T.; Montagu, T.; Agelou, M.; Croc de Suray, A.; Garcia-Hernandez, J. C.; Gempp, S.; Benkreira, M.; de Carlan, L.; Lazaro, D.
2016-07-01
This work aims at developing a generic virtual source model (VSM) preserving all existing correlations between variables stored in a Monte Carlo pre-computed phase space (PS) file, for dose calculation and high-resolution portal image prediction. The reference PS file was calculated using the PENELOPE code, after the flattening filter (FF) of an Elekta Synergy 6 MV photon beam. Each particle was represented in a mobile coordinate system by its radial position (r s ) in the PS plane, its energy (E), and its polar and azimuthal angles (φ d and θ d ), describing the particle deviation compared to its initial direction after bremsstrahlung, and the deviation orientation. Three sub-sources were created by sorting out particles according to their last interaction location (target, primary collimator or FF). For each sub-source, 4D correlated-histograms were built by storing E, r s , φ d and θ d values. Five different adaptive binning schemes were studied to construct 4D histograms of the VSMs, to ensure histogram efficient handling as well as an accurate reproduction of E, r s , φ d and θ d distribution details. The five resulting VSMs were then implemented in PENELOPE. Their accuracy was first assessed in the PS plane, by comparing E, r s , φ d and θ d distributions with those obtained from the reference PS file. Second, dose distributions computed in water, using the VSMs and the reference PS file located below the FF, and also after collimation in both water and heterogeneous phantom, were compared using a 1.5%-0 mm and a 2%-0 mm global gamma index, respectively. Finally, portal images were calculated without and with phantoms in the beam. The model was then evaluated using a 1%-0 mm global gamma index. Performance of a mono-source VSM was also investigated and led, as with the multi-source model, to excellent results when combined with an adaptive binning scheme.
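The "1.5%-0 mm" and "2%-0 mm" criteria quoted above are global gamma-index tests. A minimal 1-D sketch of that metric is given below; the function and variable names are illustrative, not from the paper. With a 0 mm distance-to-agreement, the gamma value reduces to a pure dose-difference test at each point.

```python
import numpy as np

def global_gamma(dose_eval, dose_ref, positions, dose_crit, dta_mm):
    """1-D global gamma index (illustrative sketch).

    dose_crit is a fraction of the reference maximum (global
    normalisation); dta_mm = 0 reduces gamma to a point-by-point
    dose-difference test, as in the "x%-0 mm" criteria above.
    """
    d_norm = dose_crit * dose_ref.max()
    gammas = np.empty_like(dose_eval)
    for i, (de, xi) in enumerate(zip(dose_eval, positions)):
        dd = (de - dose_ref) / d_norm                 # dose-difference term
        if dta_mm > 0:
            dx = (positions - xi) / dta_mm            # distance-to-agreement term
            gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
        else:
            gammas[i] = abs(dd[i])                    # 0 mm: compare same point only
    return gammas

ref = np.array([1.0, 2.0, 4.0, 2.0, 1.0])             # toy depth-dose profile
ev = ref * 1.01                                        # uniform 1% error
g = global_gamma(ev, ref, np.arange(5.0), dose_crit=0.02, dta_mm=0.0)
assert np.all(g <= 1.0)                                # every point passes 2%-0 mm
```

A point is said to pass when its gamma value is at most 1; tightening `dose_crit` to 0.5% would make the same 1% error fail everywhere.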
Energy Technology Data Exchange (ETDEWEB)
Ahmad, I.; Back, B.B.; Betts, R.R.; and others
1995-08-01
An essential component in the assessment of the significance of the results from APEX is a demonstrated understanding of the acceptance and response of the apparatus. This requires detailed simulations which can be compared to the results of various source and in-beam measurements. These simulations were carried out using the computer codes EGS and GEANT, both specifically designed for this purpose. As far as is possible, all details of the geometry of APEX were included. We compared the results of these simulations with measurements using electron conversion sources, positron sources and pair sources. The overall agreement is quite acceptable and some of the details are still being worked on. The simulation codes were also used to compare the results of measurements of in-beam positron and conversion electrons with expectations based on known physics or other methods. Again, satisfactory agreement is achieved. We are currently working on the simulation of various pair-producing scenarios such as the decay of a neutral object in the mass range 1.5-2.0 MeV and also the emission of internal pairs from nuclear transitions in the colliding ions. These results are essential input to the final results from APEX on cross section limits for various, previously proposed, sharp-line producing scenarios.
Dosimetry applications in GATE Monte Carlo toolkit.
Papadimitroulas, Panagiotis
2017-02-21
Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Hybrid Monte Carlo with Chaotic Mixing
Kadakia, Nirag
2016-01-01
We propose a hybrid Monte Carlo (HMC) technique applicable to high-dimensional multivariate normal distributions that effectively samples along chaotic trajectories. The method is predicated on the freedom of choice of the HMC momentum distribution, and due to its mixing properties, exhibits sample-to-sample autocorrelations that decay far faster than those in the traditional hybrid Monte Carlo algorithm. We test the methods on distributions of varying correlation structure, finding that the proposed technique produces superior covariance estimates, is less reliant on step-size tuning, and can even function with sparse or no momentum re-sampling. The method presented here is promising for more general distributions, such as those that arise in Bayesian learning of artificial neural networks and in the state and parameter estimation of dynamical systems.
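For contrast with the chaotic-mixing variant proposed above, a textbook hybrid Monte Carlo step with a standard-normal momentum looks as follows. This is a generic sketch under that assumption, not the authors' method; step size and trajectory length are illustrative.

```python
import numpy as np

def hmc_step(q, log_prob, grad, eps, n_leap, rng):
    """One standard HMC step: Gaussian momentum, leapfrog integration,
    Metropolis accept/reject on the total energy."""
    p = rng.standard_normal(q.shape)
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * grad(q_new)          # initial half kick
    for _ in range(n_leap):
        q_new += eps * p_new                  # drift
        p_new += eps * grad(q_new)            # full kick
    p_new -= 0.5 * eps * grad(q_new)          # trim last kick to a half
    h_old = -log_prob(q) + 0.5 * (p @ p)
    h_new = -log_prob(q_new) + 0.5 * (p_new @ p_new)
    return q_new if np.log(rng.random()) < h_old - h_new else q

# Target: standard 2-D normal (the multivariate-normal setting above).
log_prob = lambda q: -0.5 * (q @ q)
grad = lambda q: -q

rng = np.random.default_rng(1)
q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q, log_prob, grad, eps=0.2, n_leap=10, rng=rng)
    samples.append(q)
samples = np.array(samples)
assert abs(samples.mean()) < 0.2              # sample mean near 0
assert abs(samples.var() - 1.0) < 0.3         # sample variance near 1
```

The freedom the abstract exploits is in the line drawing `p`: any momentum distribution that leaves the target invariant is admissible, and a chaotic choice can decorrelate successive samples faster than the Gaussian used here.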
Monte Carlo study of real time dynamics
Alexandru, Andrei; Bedaque, Paulo F; Vartak, Sohan; Warrington, Neill C
2016-01-01
Monte Carlo studies involving real time dynamics are severely restricted by the sign problem that emerges from highly oscillatory phase of the path integral. In this letter, we present a new method to compute real time quantities on the lattice using the Schwinger-Keldysh formalism via Monte Carlo simulations. The key idea is to deform the path integration domain to a complex manifold where the phase oscillations are mild and the sign problem is manageable. We use the previously introduced "contraction algorithm" to create a Markov chain on this alternative manifold. We substantiate our approach by analyzing the quantum mechanical anharmonic oscillator. Our results are in agreement with the exact ones obtained by diagonalization of the Hamiltonian. The method we introduce is generic and in principle applicable to quantum field theory albeit very slow. We discuss some possible improvements that should speed up the algorithm.
Multilevel sequential Monte-Carlo samplers
Jasra, Ajay
2016-01-05
Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.
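The plain multilevel estimator that the SMC variant above builds on can be sketched for a toy SDE. This is an illustration only: the Ornstein-Uhlenbeck drift, the level count, and the sample sizes are invented, and the coupling (coarse and fine paths sharing Brownian increments) is the generic one, not the SMC coupling of the paper.

```python
import numpy as np

def euler_pair(level, n_paths, rng, T=1.0):
    """Coupled coarse/fine Euler paths of dX = -X dt + dW, X0 = 1.
    The fine level uses 2**level steps; the coarse path reuses the same
    Brownian increments, which is the coupling the multilevel idea needs."""
    nf = 2 ** level
    dt = T / nf
    xf = np.ones(n_paths)
    xc = np.ones(n_paths)
    dw = rng.normal(0.0, np.sqrt(dt), size=(nf, n_paths))
    for i in range(nf):
        xf += -xf * dt + dw[i]                       # fine Euler step
    if level > 0:
        for i in range(0, nf, 2):                    # coarse step = two fine steps
            xc += -xc * (2 * dt) + dw[i] + dw[i + 1]
    return xf, xc

def mlmc_estimate(max_level, n_paths, rng):
    """Telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}], P = X_T."""
    est = 0.0
    for level in range(max_level + 1):
        xf, xc = euler_pair(level, n_paths, rng)
        est += xf.mean() if level == 0 else (xf - xc).mean()
    return est

rng = np.random.default_rng(0)
est = mlmc_estimate(max_level=5, n_paths=20000, rng=rng)
assert abs(est - np.exp(-1.0)) < 0.05          # E[X_T] = exp(-1) for this drift
```

Because coupled paths make the level differences small-variance, far fewer samples are needed at the expensive fine levels than a single-level estimator would require for the same accuracy.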
Monte Carlo Simulation for Particle Detectors
Pia, Maria Grazia
2012-01-01
Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...
An enhanced Monte Carlo outlier detection method.
Zhang, Liangxiao; Li, Peiwu; Mao, Jin; Ma, Fei; Ding, Xiaoxia; Zhang, Qi
2015-09-30
Outlier detection is crucial in building a highly predictive model. In this study, we proposed an enhanced Monte Carlo outlier detection method by establishing cross-prediction models based on determinate normal samples and analyzing the distribution of prediction errors individually for dubious samples. One simulated and three real datasets were used to illustrate and validate the performance of our method, and the results indicated that this method outperformed standard Monte Carlo outlier detection in outlier diagnosis. After these outliers were removed, in the validation by Kovats retention indices the root mean square error of prediction decreased from 3.195 to 1.655, and the average cross-validation prediction error decreased from 2.0341 to 1.2780. This method helps establish a good model by eliminating outliers. © 2015 Wiley Periodicals, Inc.
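The core Monte Carlo idea behind such methods is simple to sketch: repeatedly fit a model on random subsets and track each left-out sample's prediction error. The sketch below is a heavily simplified stand-in (ordinary least-squares on a toy 1-D dataset), not the authors' enhanced procedure; every name and parameter is invented.

```python
import random
import statistics

def mc_outlier_scores(xs, ys, n_rounds, rng):
    """Monte Carlo outlier screening: fit a least-squares line on a random
    half of the data, record each left-out sample's absolute prediction
    error, and average over many rounds. High average error flags a
    sample as an outlier candidate."""
    n = len(xs)
    errors = [[] for _ in range(n)]
    for _ in range(n_rounds):
        train = set(rng.sample(range(n), n // 2))
        tx = [xs[i] for i in train]
        ty = [ys[i] for i in train]
        mx, my = statistics.fmean(tx), statistics.fmean(ty)
        sxx = sum((x - mx) ** 2 for x in tx)
        slope = sum((x - mx) * (y - my) for x, y in zip(tx, ty)) / sxx
        intercept = my - slope * mx
        for i in range(n):
            if i not in train:                      # score only held-out samples
                errors[i].append(abs(ys[i] - (slope * xs[i] + intercept)))
    return [statistics.fmean(e) if e else 0.0 for e in errors]

rng = random.Random(0)
xs = list(range(20))
ys = [2.0 * x + 1.0 for x in xs]
ys[7] += 25.0                                       # plant one gross outlier
scores = mc_outlier_scores(xs, ys, n_rounds=50, rng=rng)
assert scores.index(max(scores)) == 7               # the planted outlier stands out
```

The enhancement the abstract describes goes further, building cross-prediction models from samples already judged normal and examining the error distribution of each dubious sample individually rather than just its mean.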
Composite biasing in Monte Carlo radiative transfer
Baes, Maarten; Lunttila, Tuomas; Bianchi, Simone; Camps, Peter; Juvela, Mika; Kuiper, Rolf
2016-01-01
Biasing or importance sampling is a powerful technique in Monte Carlo radiative transfer, and can be applied in different forms to increase the accuracy and efficiency of simulations. One of the drawbacks of the use of biasing is the potential introduction of large weight factors. We discuss a general strategy, composite biasing, to suppress the appearance of large weight factors. We use this composite biasing approach for two different problems faced by current state-of-the-art Monte Carlo radiative transfer codes: the generation of photon packages from multiple components, and the penetration of radiation through high optical depth barriers. In both cases, the implementation of the relevant algorithms is trivial and does not interfere with any other optimisation techniques. Through simple test models, we demonstrate the general applicability, accuracy and efficiency of the composite biasing approach. In particular, for the penetration of high optical depths, the gain in efficiency is spectacular for the spe...
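A generic way to see how a composite (mixture) bias suppresses large weights is the sketch below; it is not taken from any radiative-transfer code, and the densities and the tail event are invented for illustration. Sampling from a mixture m = (1-a)p + a q guarantees m ≥ (1-a)p, so the weight p/m can never exceed 1/(1-a).

```python
import math
import random

def estimate_tail(n, bias_frac, rng):
    """Estimate P(X > 3) for X ~ Exp(1) by composite biasing: draw from a
    mixture of the natural density p = Exp(1) and a biased density
    q = Exp(1/4) with a heavier tail. The mixture density m bounds the
    weight p/m by 1/(1 - bias_frac), suppressing large weight factors."""
    total = 0.0
    for _ in range(n):
        if rng.random() < bias_frac:
            x = rng.expovariate(0.25)                 # biased: heavier tail
        else:
            x = rng.expovariate(1.0)                  # natural density
        p = math.exp(-x)                              # Exp(1) pdf at x
        q = 0.25 * math.exp(-0.25 * x)                # Exp(1/4) pdf at x
        m = (1.0 - bias_frac) * p + bias_frac * q     # mixture density
        if x > 3.0:
            total += p / m                            # bounded importance weight
    return total / n

rng = random.Random(42)
est = estimate_tail(200_000, bias_frac=0.5, rng=rng)
assert abs(est - math.exp(-3.0)) < 0.005              # true value exp(-3) ~ 0.0498
```

Pure importance sampling from q alone would leave the weight p/q unbounded near x = 0; the composite form trades a little efficiency for that hard bound, which is the drawback-suppression the abstract targets.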
Multilevel Monte Carlo Approaches for Numerical Homogenization
Efendiev, Yalchin R.
2015-10-01
In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.
Monte Carlo simulations on SIMD computer architectures
Energy Technology Data Exchange (ETDEWEB)
Burmester, C.P.; Gronsky, R. [Lawrence Berkeley Lab., CA (United States); Wille, L.T. [Florida Atlantic Univ., Boca Raton, FL (United States). Dept. of Physics
1992-03-01
Algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique on single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next-nearest, and long-range screened Coulomb interactions on the SIMD-architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures.
Energy Technology Data Exchange (ETDEWEB)
Mazurier, J
1999-05-28
This thesis was performed in the framework of establishing the national reference for absorbed dose in water in high-energy photon beams provided by the SATURNE-43 medical accelerator of the BNM-LPRI (acronym for the National Bureau of Metrology and Primary Standard Laboratory of Ionising Radiation). The aim of this work was to develop and validate several user codes, based on the PENELOPE Monte Carlo code system, to determine the photon beam characteristics and to calculate the correction factors of reference dosimeters such as Fricke dosimeters and the graphite calorimeter. In the first step, the developed user codes made it possible to study the influence of the different components of the irradiation head. Variance reduction techniques were used to reduce the calculation time. The phase space was calculated at the output surface of the accelerator head for 6, 12 and 25 MV, and then used to calculate energy spectra and dose distributions in the reference water phantom. The results obtained were compared with experimental measurements. The second step was devoted to developing a user code for calculating the correction factors associated with both the BNM-LPRI graphite and Fricke dosimeters by means of a correlated sampling method starting from the energy spectra obtained in the first step. The calculated correction factors were then compared with experimental results and with calculations from the EGS4 Monte Carlo code system. The good agreement between experimental and calculated results validates the simulations performed with the PENELOPE code system. (author)
Inhomogeneous Monte Carlo simulations of dermoscopic spectroscopy
Gareau, Daniel S.; Li, Ting; Jacques, Steven; Krueger, James
2012-03-01
Clinical skin-lesion diagnosis uses dermoscopy: 10X epiluminescence microscopy. Skin appearance ranges from black to white with shades of blue, red, gray and orange. Color is an important diagnostic criterion for diseases including melanoma. Melanin and blood content and distribution impact the diffuse spectral remittance (300-1000 nm). Skin layers (immersion medium, stratum corneum, spinous epidermis, basal epidermis and dermis) as well as laterally asymmetric features (e.g. melanocytic invasion) were modeled in an inhomogeneous Monte Carlo model.
Handbook of Markov chain Monte Carlo
Brooks, Steve
2011-01-01
"Handbook of Markov Chain Monte Carlo" brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.
Accelerated Monte Carlo by Embedded Cluster Dynamics
Brower, R. C.; Gross, N. A.; Moriarty, K. J. M.
1991-07-01
We present an overview of the new methods for embedding Ising spins in continuous fields to achieve accelerated cluster Monte Carlo algorithms. The methods of Brower and Tamayo and of Wolff are summarized, and variations are suggested for the O(N) models based on multiple embedded Z2 spin components and/or correlated projections. Topological features are discussed for the XY model and numerical simulations are presented for d=2, d=3 and mean-field-theory lattices.
An introduction to Monte Carlo methods
Walter, J.-C.; Barkema, G. T.
2015-01-01
Monte Carlo simulations are methods for simulating statistical systems. The aim is to generate a representative ensemble of configurations to access thermodynamical quantities without the need to solve the system analytically or to perform an exact enumeration. The main principles of Monte Carlo simulations are ergodicity and detailed balance. The Ising model is a lattice spin system with nearest-neighbor interactions that is appropriate to illustrate different examples of Monte Carlo simulations. It displays a second-order phase transition between disordered (high temperature) and ordered (low temperature) phases, leading to different strategies of simulations. The Metropolis algorithm and the Glauber dynamics are efficient at high temperature. Close to the critical temperature, where the spins display long-range correlations, cluster algorithms are more efficient. We introduce the rejection-free (or continuous-time) algorithm and describe in detail an interesting alternative representation of the Ising model using graphs instead of spins, the so-called worm algorithm. We conclude with an important discussion of dynamical effects such as thermalization and correlation time.
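The Metropolis algorithm on the 2-D Ising model mentioned above fits in a few lines. This is a minimal textbook sketch (periodic boundaries, coupling J = 1, all-up start); the lattice size, temperature, and sweep count are chosen only for illustration.

```python
import math
import random

def metropolis_ising(L, beta, sweeps, rng):
    """Metropolis single-spin-flip dynamics for the 2-D Ising model with
    periodic boundaries. Each proposed flip is accepted with probability
    min(1, exp(-beta * dE)), which satisfies detailed balance."""
    spins = [[1] * L for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nb              # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] = -spins[i][j]
    return spins

rng = random.Random(0)
spins = metropolis_ising(L=16, beta=1.0, sweeps=200, rng=rng)
m = abs(sum(sum(row) for row in spins)) / 16 ** 2
assert m > 0.9        # beta = 1.0 is deep in the ordered (low-temperature) phase
```

At this temperature (well below the critical point, beta_c ≈ 0.44) single-spin flips suffice; near beta_c the long-range correlations the text describes make cluster updates the better choice.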
Energy Technology Data Exchange (ETDEWEB)
Setty, A.K.; Halley, J.W.; Campbell, C.E. [School of Physics and Astronomy, University of Minnesota, Minneapolis, Minnesota 55455 (United States)
1997-11-01
We report variational Monte Carlo calculations which give amplitudes and phases of the reflected and transmitted components of states representing scattering of helium atoms normally incident on a superfluid ⁴He slab. The wave function describes a previously postulated condensate-mediated process [J.W. Halley et al., Phys. Rev. Lett. 71, 2429 (1993)] and the results are consistent with uncertainty-principle arguments suggesting that the transmission time for thin (but macroscopic) samples will be independent of slab thickness. © 1997 The American Physical Society
Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method
2002-01-01
This report condenses basic theories and advanced applications of neutron/gamma ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and the cross-section libraries used in continuous-energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, design of ITER, experiment analyses of the fast critical assembly, core analyses of JMTR, simulation of a pulsed neutron experiment, core analyses of HTTR, duct streaming calculations, bulk shielding calculations, and neutron/gamma ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes, and parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.
Díez, A; Largo, J; Solana, J R
2006-08-21
Computer simulations have been performed for fluids with van der Waals potential, that is, hard spheres with attractive inverse power tails, to determine the equation of state and the excess energy. On the other hand, the first- and second-order perturbative contributions to the energy and the zero- and first-order perturbative contributions to the compressibility factor have been determined too from Monte Carlo simulations performed on the reference hard-sphere system. The aim was to test the reliability of this "exact" perturbation theory. It has been found that the results obtained from the Monte Carlo perturbation theory for these two thermodynamic properties agree well with the direct Monte Carlo simulations. Moreover, it has been found that results from the Barker-Henderson [J. Chem. Phys. 47, 2856 (1967)] perturbation theory are in good agreement with those from the exact perturbation theory.
Energy Technology Data Exchange (ETDEWEB)
Ali, Imad, E-mail: iali@ouhsc.edu [Department of Radiation Oncology, University of Oklahoma Health Sciences Center, Oklahoma City, OK (United States); Ahmad, Salahuddin [Department of Radiation Oncology, University of Oklahoma Health Sciences Center, Oklahoma City, OK (United States)
2013-10-01
To compare the doses calculated using the BrainLAB pencil beam (PB) and Monte Carlo (MC) algorithms for tumors located in various sites including the lung and evaluate quality assurance procedures required for the verification of the accuracy of dose calculation. The dose-calculation accuracy of PB and MC was also assessed quantitatively with measurement using an ionization chamber and Gafchromic films placed in solid water and heterogeneous phantoms. The dose was calculated using PB convolution and MC algorithms in the iPlan treatment planning system from BrainLAB. The dose calculation was performed on the patient's computed tomography images with lesions in various treatment sites including 5 lungs, 5 prostates, 4 brains, 2 head and necks, and 2 paraspinal tissues. A combination of conventional, conformal, and intensity-modulated radiation therapy plans was used in dose calculation. The leaf sequence from intensity-modulated radiation therapy plans or beam shapes from conformal plans and monitor units and other planning parameters calculated by the PB were identical for calculating dose with MC. Heterogeneity correction was considered in both PB and MC dose calculations. Dose-volume parameters such as V95 (volume covered by 95% of prescription dose), dose distributions, and gamma analysis were used to evaluate the calculated dose by PB and MC. The measured doses by ionization chamber and EBT GAFCHROMIC film in solid water and heterogeneous phantoms were used to quantitatively assess the accuracy of dose calculated by PB and MC. The dose-volume histograms and dose distributions calculated by PB and MC in the brain, prostate, paraspinal, and head and neck were in good agreement with one another (within 5%) and provided acceptable planning target volume coverage. However, dose distributions of the patients with lung cancer had large discrepancies. For a plan optimized with PB, the dose coverage was shown as clinically acceptable, whereas in reality, the MC showed a
Ali, Imad; Ahmad, Salahuddin
2013-01-01
To compare the doses calculated using the BrainLAB pencil beam (PB) and Monte Carlo (MC) algorithms for tumors located in various sites including the lung and evaluate quality assurance procedures required for the verification of the accuracy of dose calculation. The dose-calculation accuracy of PB and MC was also assessed quantitatively with measurement using ionization chamber and Gafchromic films placed in solid water and heterogeneous phantoms. The dose was calculated using PB convolution and MC algorithms in the iPlan treatment planning system from BrainLAB. The dose calculation was performed on the patient's computed tomography images with lesions in various treatment sites including 5 lungs, 5 prostates, 4 brains, 2 head and necks, and 2 paraspinal tissues. A combination of conventional, conformal, and intensity-modulated radiation therapy plans was used in dose calculation. The leaf sequence from intensity-modulated radiation therapy plans or beam shapes from conformal plans and monitor units and other planning parameters calculated by the PB were identical for calculating dose with MC. Heterogeneity correction was considered in both PB and MC dose calculations. Dose-volume parameters such as V95 (volume covered by 95% of prescription dose), dose distributions, and gamma analysis were used to evaluate the calculated dose by PB and MC. The measured doses by ionization chamber and EBT GAFCHROMIC film in solid water and heterogeneous phantoms were used to quantitatively assess the accuracy of dose calculated by PB and MC. The dose-volume histograms and dose distributions calculated by PB and MC in the brain, prostate, paraspinal, and head and neck were in good agreement with one another (within 5%) and provided acceptable planning target volume coverage. However, dose distributions of the patients with lung cancer had large discrepancies. For a plan optimized with PB, the dose coverage was shown as clinically acceptable, whereas in reality, the MC showed a
Status of Monte-Carlo Event Generators
Energy Technology Data Exchange (ETDEWEB)
Hoeche, Stefan; /SLAC
2011-08-11
Recent progress on general-purpose Monte-Carlo event generators is reviewed with emphasis on the simulation of hard QCD processes and subsequent parton cascades. Describing full final states of high-energy particle collisions in contemporary experiments is an intricate task. Hundreds of particles are typically produced, and the reactions involve both large and small momentum transfer. The high-dimensional phase space makes an exact solution of the problem impossible. Instead, one typically resorts to regarding events as factorized into different steps, ordered descending in the mass scales or invariant momentum transfers which are involved. In this picture, a hard interaction, described through fixed-order perturbation theory, is followed by multiple Bremsstrahlung emissions off the initial- and final-state partons and, finally, by the hadronization process, which binds QCD partons into color-neutral hadrons. Each of these steps can be treated independently, which is the basic concept inherent to general-purpose event generators. Their development is nowadays often focused on an improved description of radiative corrections to hard processes through perturbative QCD. In this context, the concept of jets is introduced, which allows one to relate sprays of hadronic particles in detectors to the partons in perturbation theory. In this talk, we briefly review recent progress on perturbative QCD in event generation. The main focus lies on the general-purpose Monte-Carlo programs HERWIG, PYTHIA and SHERPA, which will be the workhorses for LHC phenomenology. A detailed description of the physics models included in these generators can be found in [8]. We also discuss matrix-element generators, which provide the parton-level input for general-purpose Monte Carlo.
Quantum Monte Carlo for vibrating molecules
Energy Technology Data Exchange (ETDEWEB)
Brown, W.R. [Univ. of California, Berkeley, CA (United States). Chemistry Dept.]|[Lawrence Berkeley National Lab., CA (United States). Chemical Sciences Div.
1996-08-01
Quantum Monte Carlo (QMC) has successfully computed the total electronic energies of atoms and molecules. The main goal of this work is to use correlation function quantum Monte Carlo (CFQMC) to compute the vibrational state energies of molecules given a potential energy surface (PES). In CFQMC, an ensemble of random walkers simulates the diffusion and branching processes of the imaginary-time time-dependent Schroedinger equation in order to evaluate the matrix elements. The program QMCVIB was written to perform multi-state VMC and CFQMC calculations and was employed for several calculations of the H2O and C3 vibrational states, using 7 PESs, 3 trial wavefunction forms, two methods of non-linear basis-function parameter optimization, and both serial and parallel computers. Different wavefunction forms were required to construct accurate trial wavefunctions for H2O and C3. For C3, the non-linear parameters were optimized with respect to the sum of the energies of several low-lying vibrational states, and the Monte Carlo data were collected into blocks to stabilize the statistical error estimates. Accurate vibrational state energies were computed using both the serial and parallel QMCVIB programs. Comparison of the vibrational state energies computed from the three C3 PESs suggested that a non-linear equilibrium geometry PES is the most accurate and that discrete potential representations may be used to conveniently determine vibrational state energies.
A Monte Carlo algorithm for degenerate plasmas
Energy Technology Data Exchange (ETDEWEB)
Turrell, A.E., E-mail: a.turrell09@imperial.ac.uk; Sherlock, M.; Rose, S.J.
2013-09-15
A procedure for performing Monte Carlo calculations of plasmas with an arbitrary level of degeneracy is outlined. It has possible applications in inertial confinement fusion and astrophysics. Degenerate particles are initialised according to the Fermi–Dirac distribution function, and scattering is via a Pauli blocked binary collision approximation. The algorithm is tested against degenerate electron–ion equilibration, and the degenerate resistivity transport coefficient from unmagnetised first order transport theory. The code is applied to the cold fuel shell and alpha particle equilibration problem of inertial confinement fusion.
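The two ingredients named in the abstract, Fermi–Dirac initialisation and Pauli-blocked collisions, can be sketched in a few lines (a toy illustration, not the paper's algorithm; the chemical potential mu = 1 and temperature kT = 0.1 are arbitrary assumed values in natural units):

```python
import math
import random

def fermi_dirac(e, mu, kT):
    """Fermi-Dirac occupancy f(E), a number in (0, 1)."""
    return 1.0 / (math.exp((e - mu) / kT) + 1.0)

def sample_energies(n, rng, mu=1.0, kT=0.1, e_max=3.0):
    """Rejection-sample energies from a density proportional to sqrt(E)*f(E),
    i.e. a 3D free-particle density of states weighted by the occupancy."""
    target = lambda e: math.sqrt(e) * fermi_dirac(e, mu, kT)
    # crude envelope: the maximum of the target on a fine grid
    m = max(target(i * e_max / 1000.0) for i in range(1, 1001))
    out = []
    while len(out) < n:
        e = rng.uniform(0.0, e_max)
        if rng.random() * m <= target(e):
            out.append(e)
    return out

def collision_allowed(e_final, mu, kT, rng):
    """Pauli blocking: accept scattering into a final state with probability
    1 - f(E_final), i.e. proportional to the unoccupied fraction of that state."""
    return rng.random() < 1.0 - fermi_dirac(e_final, mu, kT)

rng = random.Random(42)
energies = sample_energies(5000, rng)
# with mu = 1 and kT = 0.1 the gas is strongly degenerate:
# nearly all particles sit below mu + 2*kT
frac_below = sum(e < 1.2 for e in energies) / len(energies)
# scattering into a deeply occupied state (E = 0.5) is almost always blocked
deep_allowed = sum(collision_allowed(0.5, 1.0, 0.1, rng)
                   for _ in range(2000)) / 2000
```

In the degenerate limit the blocking factor 1 − f suppresses almost all collisions into states below the chemical potential, which is what slows equilibration relative to a classical plasma.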
A note on simultaneous Monte Carlo tests
DEFF Research Database (Denmark)
Hahn, Ute
In this short note, Monte Carlo tests of goodness of fit for data of the form X(t), t ∈ I are considered that reject the null hypothesis if X(t) leaves an acceptance region bounded by an upper and a lower curve for some t in I. A construction of the acceptance region is proposed that complies with a given target level of rejection and yields exact p-values. The construction is based on pointwise quantiles, estimated from simulated realizations of X(t) under the null hypothesis.
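A minimal sketch of the rejection rule described above, using a plain pointwise-quantile region (note this is only the starting point: the paper's actual contribution is adjusting the pointwise level so the *global* rejection rate meets the target, which is omitted here; all data and parameters are my own illustrative assumptions):

```python
import random

def make_envelope(sims, alpha=0.05):
    """Pointwise acceptance region from simulated null curves.

    sims: list of curves, each a list of X(t) values on a common grid.
    At each grid point the lower/upper bounds cut off roughly alpha/2
    probability per tail.
    """
    n, m = len(sims), len(sims[0])
    k = max(1, int(alpha / 2 * n))
    lower, upper = [], []
    for j in range(m):
        col = sorted(s[j] for s in sims)
        lower.append(col[k - 1])
        upper.append(col[n - k])
    return lower, upper

def leaves_region(curve, lower, upper):
    """Reject the null if the observed curve exits the region for some t."""
    return any(x < lo or x > hi for x, lo, hi in zip(curve, lower, upper))

rng = random.Random(0)
# 999 simulated realizations of X(t) under a standard-normal null, 20 grid points
sims = [[rng.gauss(0.0, 1.0) for _ in range(20)] for _ in range(999)]
lower, upper = make_envelope(sims)
typical = [0.0] * 20     # a curve at the null mean: should stay inside
extreme = [5.0] * 20     # far outside the simulated range: should reject
```

Because the test rejects if the curve exits the band for *some* t, the pointwise level alpha understates the global type I error, which is exactly the multiplicity issue the note's construction addresses.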
Archimedes, the Free Monte Carlo simulator
Sellier, Jean Michel D
2012-01-01
Archimedes is the GNU package for Monte Carlo simulations of electron transport in semiconductor devices. The first release appeared in 2004 and since then it has been improved with many new features such as quantum corrections, magnetic fields, new materials, a GUI, etc. This document represents the first attempt at a complete manual. Many of the physics models implemented are described, and enough detail is presented to enable users to write their own input decks. Please feel free to contact the author if you want to contribute to the project.
Cluster hybrid Monte Carlo simulation algorithms
Plascak, J. A.; Ferrenberg, Alan M.; Landau, D. P.
2002-06-01
We show that addition of Metropolis single spin flips to the Wolff cluster-flipping Monte Carlo procedure leads to a dramatic increase in performance for the spin-1/2 Ising model. We also show that adding Wolff cluster flipping to the Metropolis or heat bath algorithms in systems where just cluster flipping is not immediately obvious (such as the spin-3/2 Ising model) can substantially reduce the statistical errors of the simulations. A further advantage of these methods is that systematic errors introduced by the use of imperfect random-number generation may be largely healed by hybridizing single spin flips with cluster flipping.
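A minimal sketch of the hybrid scheme for the spin-1/2 case, alternating one Wolff cluster flip with one Metropolis sweep (lattice size, seed, and beta = 0.5, which puts the 2D model in its ordered phase, are my own illustrative choices):

```python
import math
import random

def metropolis_sweep(spins, L, beta, rng):
    """One sweep of single-spin-flip Metropolis updates on an L x L lattice."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * spins[i][j] * nn          # energy change of flipping (i, j)
        if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] = -spins[i][j]

def wolff_step(spins, L, beta, rng):
    """One Wolff move: grow a same-spin cluster with bond probability
    1 - exp(-2*beta) and flip it as a whole."""
    p_add = 1.0 - math.exp(-2.0 * beta)
    i, j = rng.randrange(L), rng.randrange(L)
    s = spins[i][j]
    cluster = {(i, j)}
    stack = [(i, j)]
    while stack:
        x, y = stack.pop()
        for nx, ny in (((x + 1) % L, y), ((x - 1) % L, y),
                       (x, (y + 1) % L), (x, (y - 1) % L)):
            if (nx, ny) not in cluster and spins[nx][ny] == s \
                    and rng.random() < p_add:
                cluster.add((nx, ny))
                stack.append((nx, ny))
    for x, y in cluster:
        spins[x][y] = -s

rng = random.Random(1)
L, beta = 16, 0.5            # beta > beta_c ~ 0.4407: ordered phase
spins = [[1] * L for _ in range(L)]
for _ in range(50):          # the hybrid: alternate cluster and local updates
    wolff_step(spins, L, beta, rng)
    metropolis_sweep(spins, L, beta, rng)
mag = abs(sum(sum(row) for row in spins)) / (L * L)
```

The cluster move decorrelates large-scale order while the local sweeps relax short-range fluctuations, which is the complementarity the abstract exploits.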
Introduction to Cluster Monte Carlo Algorithms
Luijten, E.
This chapter provides an introduction to cluster Monte Carlo algorithms for classical statistical-mechanical systems. A brief review of the conventional Metropolis algorithm is given, followed by a detailed discussion of the lattice cluster algorithm developed by Swendsen and Wang and the single-cluster variant introduced by Wolff. For continuum systems, the geometric cluster algorithm of Dress and Krauth is described. It is shown how their geometric approach can be generalized to incorporate particle interactions beyond hardcore repulsions, thus forging a connection between the lattice and continuum approaches. Several illustrative examples are discussed.
Monte Carlo simulation for the transport beamline
Energy Technology Data Exchange (ETDEWEB)
Romano, F.; Cuttone, G.; Jia, S. B.; Varisano, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania (Italy); Attili, A.; Marchetto, F.; Russo, G. [INFN, Sezione di Torino, Via P.Giuria, 1 10125 Torino (Italy); Cirrone, G. A. P.; Schillaci, F.; Scuderi, V. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Institute of Physics Czech Academy of Science, ELI-Beamlines project, Na Slovance 2, Prague (Czech Republic); Carpinelli, M. [INFN Sezione di Cagliari, c/o Dipartimento di Fisica, Università di Cagliari, Cagliari (Italy); Tramontana, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Università di Catania, Dipartimento di Fisica e Astronomia, Via S. Sofia 64, Catania (Italy)
2013-07-26
In the framework of the ELIMED project, Monte Carlo (MC) simulations are widely used to study the physical transport of charged particles generated by laser-target interactions and to preliminarily evaluate fluence and dose distributions. An energy selection system and the experimental setup for the TARANIS laser facility in Belfast (UK) have been already simulated with the GEANT4 (GEometry ANd Tracking) MC toolkit. Preliminary results are reported here. Future developments are planned to implement a MC based 3D treatment planning in order to optimize shots number and dose delivery.
Mosaic crystal algorithm for Monte Carlo simulations
Seeger, P A
2002-01-01
An algorithm is presented for calculating reflectivity, absorption, and scattering of mosaic crystals in Monte Carlo simulations of neutron instruments. The algorithm uses multi-step transport through the crystal with an exact solution of the Darwin equations at each step. It relies on the kinematical model for Bragg reflection (with parameters adjusted to reproduce experimental data). For computation of thermal effects (the Debye-Waller factor and coherent inelastic scattering), an expansion of the Debye integral as a rapidly converging series of exponential terms is also presented. Any crystal geometry and plane orientation may be treated. The algorithm has been incorporated into the neutron instrument simulation package NISP. (orig.)
Diffusion quantum Monte Carlo for molecules
Energy Technology Data Exchange (ETDEWEB)
Lester, W.A. Jr.
1986-07-01
A quantum mechanical Monte Carlo method has been used for the treatment of molecular problems. The imaginary-time Schroedinger equation written with a shift in zero energy (E_T - V(R)) can be interpreted as a generalized diffusion equation with a position-dependent rate or branching term. Since diffusion is the continuum limit of a random walk, one may simulate the Schroedinger equation with a function psi (note, not psi^2) as a density of "walks." The walks undergo an exponential birth and death as given by the rate term. 16 refs., 2 tabs.
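The picture of walkers diffusing with birth/death governed by the rate term can be sketched for a one-dimensional harmonic potential (my own toy parameters, not from the source; the exact ground-state energy is 0.5 in natural units):

```python
import math
import random

def dmc_harmonic(n_walkers=500, n_steps=800, dt=0.01, seed=3):
    """Crude diffusion Monte Carlo for V(x) = x^2 / 2 (exact E0 = 0.5).

    Walkers diffuse (the kinetic term) and branch with weight
    exp(-(V(x) - E_T) * dt); E_T is steered to keep the population stable,
    and its running value estimates the ground-state energy.
    """
    rng = random.Random(seed)
    walkers = [rng.uniform(-1.0, 1.0) for _ in range(n_walkers)]
    e_t, e_sum, n_avg = 0.5, 0.0, 0
    sigma = math.sqrt(dt)
    for step in range(n_steps):
        new = []
        for x in walkers:
            x += rng.gauss(0.0, sigma)                 # diffusion step
            w = math.exp(-(0.5 * x * x - e_t) * dt)    # branching weight
            for _ in range(int(w + rng.random())):     # stochastic rounding
                new.append(x)
        walkers = new or [0.0]
        # population control: nudge E_T to hold the walker count steady
        e_t += 0.1 * math.log(n_walkers / len(walkers))
        if step >= n_steps // 2:                       # average after burn-in
            e_sum += e_t
            n_avg += 1
    return e_sum / n_avg

e0 = dmc_harmonic()
```

The equilibrium value of the trial energy E_T is the ground-state energy, up to a small time-step bias of order dt.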
Energy Technology Data Exchange (ETDEWEB)
Marcus, Ryan C. [Los Alamos National Laboratory
2012-07-24
This presentation covers (1) exascale computing - different technologies and how to get there; (2) MCMini, a high-performance proof of concept - features and results; and (3) the OpenCL toolkit Oatmeal (OpenCL Automatic Memory Allocation Library) - purpose and features. Despite driver issues, OpenCL seems like a good, hardware-agnostic tool. MCMini demonstrates the possibility of GPGPU-based Monte Carlo methods - it shows great scaling for HPC applications and algorithmic equivalence. Oatmeal provides a flexible framework to aid in the development of scientific OpenCL codes.
State-of-the-art Monte Carlo 1988
Energy Technology Data Exchange (ETDEWEB)
Soran, P.D.
1988-06-28
Particle transport calculations in highly dimensional and physically complex geometries, such as detector calibration, radiation shielding, space reactors, and oil-well logging, generally require Monte Carlo transport techniques. Monte Carlo particle transport can be performed on a variety of computers ranging from APOLLOs to VAXs. Some of the hardware and software developments, which now permit Monte Carlo methods to be routinely used, are reviewed in this paper. The development of inexpensive, large, fast computer memory, coupled with fast central processing units, permits Monte Carlo calculations to be performed on workstations, minicomputers, and supercomputers. The Monte Carlo renaissance is further aided by innovations in computer architecture and software development. Advances in vectorization and parallelization architecture have resulted in the development of new algorithms which have greatly reduced processing times. Finally, the renewed interest in Monte Carlo has spawned new variance reduction techniques which are being implemented in large computer codes. 45 refs.
Radiation shielding design for neutron diffractometers assisted by Monte Carlo methods
Osborn, John C.; Ersez, Tunay; Braoudakis, George
2006-11-01
Monte Carlo simulations may be used to model radiation shielding for neutron diffractometers. The use of the MCNP computer program to assess shielding for a diffractometer is discussed. A comparison is made of shielding requirements for radiation generated by several materials commonly used in neutron optical elements and beam stops, including lithium-6 based absorbers where the Monte Carlo method can model the effects of fast neutrons generated by this material.
Monte-Carlo Simulation on Neutron Instruments at CARR
Institute of Scientific and Technical Information of China (English)
2001-01-01
The design of the high resolution neutron powder diffractometer (HRPD) and two cold neutron guides (CNGs) to be built at the China Advanced Research Reactor (CARR) are studied by the Monte-Carlo simulation technique. The HRPD instrument is designed to have a minimum resolution of 0.2% and a neutron fluence rate greater than 10⁶ cm⁻²·s⁻¹ at the sample position. The resolution curves, neutron fluence rate and effective neutron beam size at the sample position are given. Differences in resolutions and intensity between the
Optical Monte Carlo modeling of a true portwine stain anatomy
Barton, Jennifer K.; Pfefer, T. Joshua; Welch, Ashley J.; Smithies, Derek J.; Nelson, Jerry; van Gemert, Martin J.
1998-04-01
A unique Monte Carlo program capable of accommodating an arbitrarily complex geometry was used to determine the energy deposition in a true port wine stain anatomy. Serial histologic sections taken from a biopsy of a dark red, laser therapy resistant stain were digitized and used to create the program input for simulation at wavelengths of 532 and 585 nm. At both wavelengths, the greatest energy deposition occurred in the superficial blood vessels, and subsequently decreased with depth as the laser beam was attenuated. However, more energy was deposited in the epidermis and superficial blood vessels at 532 nm than at 585 nm.
Optimization of Monte Carlo dose calculations: The interface problem
Soudentas, Edward
1998-05-01
High energy photon beams are widely used for radiation treatment of deep-seated tumors. The human body contains many types of interfaces between dissimilar materials that affect dose distribution in radiation therapy. Experimentally, significant radiation dose perturbations have been observed at such interfaces. The EGS4 Monte Carlo code was used to calculate dose perturbations at boundaries between dissimilar materials (such as bone/water) for 60Co and 6 MeV linear accelerator beams using a UNIX workstation. A simple test of the reliability of a random number generator was also developed. A systematic study of the adjustable parameters in EGS4 was performed in order to minimize calculational artifacts at boundaries. Calculations of dose perturbations at boundaries between different materials showed that there is a 12% increase in dose at the water/bone interface and a 44% increase in dose at the water/copper interface, with the increase mainly due to electrons produced in water and backscattered from the high atomic number material. The dependence of the dose increase on the atomic number was also investigated. The clinically important case of using two parallel opposed beams for radiation therapy was investigated, where increased doses at boundaries have been observed. The Monte Carlo calculations can provide accurate dosimetry data under conditions of electronic non-equilibrium at tissue interfaces.
Monte Carlo Simulations: Number of Iterations and Accuracy
2015-07-01
US Army Research Laboratory technical note ARL-TN-0684, July 2015. Monte Carlo (MC) methods are often used...
Discrete diffusion Monte Carlo for frequency-dependent radiative transfer
Energy Technology Data Exchange (ETDEWEB)
Densmore, Jeffrey D [Los Alamos National Laboratory; Kelly, Thompson G [Los Alamos National Laboratory; Urbatish, Todd J [Los Alamos National Laboratory
2010-11-17
Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations. In this paper, we develop an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold. Above this threshold we employ standard Monte Carlo. With a frequency-dependent test problem, we confirm the increased efficiency of our new DDMC technique.
Alternative Monte Carlo Approach for General Global Illumination
Institute of Scientific and Technical Information of China (English)
徐庆; 李朋; 徐源; 孙济洲
2004-01-01
An alternative Monte Carlo strategy for the computation of the global illumination problem is presented. The proposed approach provides a new and optimal way of solving Monte Carlo global illumination based on the zero-variance importance sampling procedure. A new importance-driven Monte Carlo global illumination algorithm in the framework of the new computing scheme was developed and implemented. Results obtained by rendering test scenes show that this new framework and the newly derived algorithm are effective and promising.
Validation of Compton Scattering Monte Carlo Simulation Models
Weidenspointner, Georg; Hauf, Steffen; Hoff, Gabriela; Kuster, Markus; Pia, Maria Grazia; Saracco, Paolo
2014-01-01
Several models for the Monte Carlo simulation of Compton scattering on electrons are quantitatively evaluated with respect to a large collection of experimental data retrieved from the literature. Some of these models are currently implemented in general purpose Monte Carlo systems; some have been implemented and evaluated for possible use in Monte Carlo particle transport for the first time in this study. Here we present first and preliminary results concerning total and differential Compton scattering cross sections.
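As an illustration of what such a scattering model does, a minimal rejection sampler for the Klein–Nishina kernel can be sketched (a generic textbook construction, not one of the evaluated models; production codes use more efficient composition methods such as Kahn's):

```python
import math
import random

def sample_klein_nishina(alpha, rng):
    """Sample eps = E_out / E_in for Compton scattering off a free electron
    at rest, by rejection against the Klein-Nishina kernel.

    alpha: incident photon energy in units of the electron rest energy.
    Kinematics bound eps to [1/(1 + 2*alpha), 1].
    """
    eps_min = 1.0 / (1.0 + 2.0 * alpha)
    bound = eps_min + 1.0 / eps_min          # eps + 1/eps is maximal at eps_min
    while True:
        eps = rng.uniform(eps_min, 1.0)
        cos_t = 1.0 - (1.0 / eps - 1.0) / alpha   # Compton relation
        sin2 = 1.0 - cos_t * cos_t
        # Klein-Nishina differential cross section in eps, up to normalization
        if rng.random() * bound <= eps + 1.0 / eps - sin2:
            return eps

rng = random.Random(9)
# photons at the electron rest energy (alpha = 1, i.e. 511 keV)
samples = [sample_klein_nishina(1.0, rng) for _ in range(3000)]
```

Validation work like that described above amounts to comparing the distributions produced by such samplers, and the cross sections behind them, against measured data.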
Multiple Monte Carlo Testing with Applications in Spatial Point Processes
DEFF Research Database (Denmark)
Mrkvička, Tomáš; Myllymäki, Mari; Hahn, Ute
The rank envelope test (Myllymäki et al., Global envelope tests for spatial processes, arXiv:1307.0239 [stat.ME]) is proposed as a solution to the multiple testing problem for Monte Carlo tests. Three different situations are recognized: 1) a few univariate Monte Carlo tests, 2) a Monte Carlo test with a function as the test statistic, 3) several Monte Carlo tests with functions as test statistics. The rank test has correct (global) type I error in each case and it is accompanied with a p-value and with a graphical interpretation which shows which subtest or which distances of the used test function...
Energy Technology Data Exchange (ETDEWEB)
Mayorga, P. A. [FISRAD S.A.S., CR 64 A No. 22 - 41, Bogotá D C (Colombia); Departamento de Física Atómica, Molecular y Nuclear, Universidad de Granada, E-18071 Granada (Spain); Brualla, L.; Sauerwein, W. [NCTeam, Strahlenklinik, Universitätsklinikum Essen, Hufelandstraße 55, D-45122 Essen (Germany); Lallena, A. M., E-mail: lallena@ugr.es [Departamento de Física Atómica, Molecular y Nuclear, Universidad de Granada, E-18071 Granada (Spain)
2014-01-15
Purpose: Retinoblastoma is the most common intraocular malignancy in early childhood. Patients treated with external beam radiotherapy respond very well to the treatment. However, owing to the genotype of children suffering from hereditary retinoblastoma, the risk of secondary radio-induced malignancies is high. The University Hospital of Essen has successfully treated these patients on a daily basis for nearly 30 years using a dedicated “D”-shaped collimator. This collimator delivers a highly conformed small radiation field and gives very good results in controlling the primary tumor as well as in preserving visual function, while avoiding the devastating side effect of deformation of the midface bones. The purpose of the present paper is to propose a modified version of the “D”-shaped collimator that reduces the irradiation field even further, with the aim of also reducing the risk of radio-induced secondary malignancies. Concurrently, the new dedicated “D”-shaped collimator must be easier to build while producing dose distributions that differ only in field size from those obtained with the collimator currently in use. The first requirement is intended to facilitate the adoption of the authors' irradiation technique both at their own and at other hospitals; fulfilling the second allows the authors to continue drawing on the clinical experience gained over more than 30 years. Methods: The Monte Carlo code PENELOPE was used to study the effect that the different structural elements of the dedicated “D”-shaped collimator have on the absorbed dose distribution. To perform this study, the radiation transport through a Varian Clinac 2100 C/D operating at 6 MV was simulated in order to tally phase-space files, which were then used as radiation sources to simulate the considered collimators and the subsequent dose distributions. With the knowledge gained in that study, a new
THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE
Energy Technology Data Exchange (ETDEWEB)
WATERS, LAURIE S. [Los Alamos National Laboratory; MCKINNEY, GREGG W. [Los Alamos National Laboratory; DURKEE, JOE W. [Los Alamos National Laboratory; FENSIN, MICHAEL L. [Los Alamos National Laboratory; JAMES, MICHAEL R. [Los Alamos National Laboratory; JOHNS, RUSSELL C. [Los Alamos National Laboratory; PELOWITZ, DENISE B. [Los Alamos National Laboratory
2007-01-10
MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B, and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code tracking neutrons, photons and electrons, and using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and the excellent neutronics capabilities allow new opportunities to simulate devices of interest to experimental particle physics, particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C, and also discusses ongoing code development.
Multi-Index Monte Carlo (MIMC)
Haji Ali, Abdul Lateef
2015-01-07
We propose and analyze a novel Multi-Index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles’s seminal work, instead of using first-order differences as in MLMC, we use in MIMC high-order mixed differences to reduce the variance of the hierarchical differences dramatically. Under standard assumptions on the convergence rates of the weak error, variance and work per sample, the optimal index set turns out to be of Total Degree (TD) type. When using such sets, MIMC yields new and improved complexity results, which are natural generalizations of Giles’s MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence.
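For orientation, the first-order MLMC telescoping sum that MIMC generalizes can be sketched on a toy problem (this is plain MLMC, not MIMC, and the example, estimating E[S_T] for a geometric Brownian motion with Euler paths, is my own; all parameter values are arbitrary):

```python
import math
import random

def mlmc_gbm(L=4, n0=4000, s0=1.0, mu=0.05, sigma=0.2, T=1.0, seed=7):
    """Multilevel Monte Carlo estimate of E[S_T] for a GBM via Euler paths.

    Level l uses 2^l time steps; the telescoping sum
      E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]
    is estimated with fewer samples on the finer (more expensive) levels,
    because coupling the fine and coarse paths makes the differences
    low-variance.
    """
    rng = random.Random(seed)
    est = 0.0
    for level in range(L + 1):
        n = max(200, n0 // 2 ** level)       # crude sample-size schedule
        steps = 2 ** level
        dt = T / steps
        acc = 0.0
        for _ in range(n):
            dws = [rng.gauss(0.0, math.sqrt(dt)) for _ in range(steps)]
            s_f = s0
            for dw in dws:                    # fine Euler path
                s_f += mu * s_f * dt + sigma * s_f * dw
            if level == 0:
                acc += s_f
            else:
                s_c, dtc = s0, 2.0 * dt       # coarse path, same Brownian draws
                for k in range(0, steps, 2):
                    s_c += mu * s_c * dtc + sigma * s_c * (dws[k] + dws[k + 1])
                acc += s_f - s_c              # low-variance level difference
        est += acc / n
    return est

estimate = mlmc_gbm()   # exact answer is s0 * exp(mu * T)
```

MIMC replaces the single level index with a multi-index and the first-order differences with high-order mixed differences over all indices, which is what yields the improved complexity.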
Chemical application of diffusion quantum Monte Carlo
Reynolds, P. J.; Lester, W. A., Jr.
1983-10-01
The diffusion quantum Monte Carlo (QMC) method gives a stochastic solution to the Schroedinger equation. As an example the singlet-triplet splitting of the energy of the methylene molecule CH2 is given. The QMC algorithm was implemented on the CYBER 205, first as a direct transcription of the algorithm running on our VAX 11/780, and second by explicitly writing vector code for all loops longer than a crossover length C. The speed of the codes relative to one another as a function of C, and relative to the VAX is discussed. Since CH2 has only eight electrons, most of the loops in this application are fairly short. The longest inner loops run over the set of atomic basis functions. The CPU time dependence obtained versus the number of basis functions is discussed and compared with that obtained from traditional quantum chemistry codes and that obtained from traditional computer architectures. Finally, preliminary work on restructuring the algorithm to compute the separate Monte Carlo realizations in parallel is discussed.
Multi-Index Monte Carlo (MIMC)
Haji Ali, Abdul Lateef
2016-01-06
We propose and analyze a novel Multi-Index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles's seminal work, instead of using first-order differences as in MLMC, we use in MIMC high-order mixed differences to reduce the variance of the hierarchical differences dramatically. Under standard assumptions on the convergence rates of the weak error, variance and work per sample, the optimal index set turns out to be of Total Degree (TD) type. When using such sets, MIMC yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence, O(TOL^-2).
Discrete range clustering using Monte Carlo methods
Chatterji, G. B.; Sridhar, B.
1993-01-01
For automatic obstacle avoidance guidance during rotorcraft low altitude flight, a reliable model of the nearby environment is needed. Such a model may be constructed by applying surface fitting techniques to the dense range map obtained by active sensing using radars. However, for covertness, passive sensing techniques using electro-optic sensors are desirable. As opposed to the dense range map obtained via active sensing, passive sensing algorithms produce reliable range at sparse locations, and therefore, surface fitting techniques to fill the gaps in the range measurement are not directly applicable. Both for automatic guidance and as a display for aiding the pilot, these discrete ranges need to be grouped into sets which correspond to objects in the nearby environment. The focus of this paper is on using Monte Carlo methods for clustering range points into meaningful groups. One of the aims of the paper is to explore whether simulated annealing methods offer significant advantage over the basic Monte Carlo method for this class of problems. We compare three different approaches and present application results of these algorithms to a laboratory image sequence and a helicopter flight sequence.
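The annealed-clustering idea can be sketched on scalar range values (a toy stand-in with assumed data, not the paper's algorithms; the cost function and cooling schedule are my own illustrative choices):

```python
import math
import random

def anneal_cluster(points, k=2, n_iter=4000, t0=1.0, cooling=0.998, seed=5):
    """Group scalar range values into k clusters by simulated annealing.

    The state is a label per point; the cost is the within-cluster sum of
    squared deviations; a random single-point relabeling is accepted by the
    Metropolis rule at a geometrically cooling temperature.
    """
    rng = random.Random(seed)

    def cost(labels):
        total = 0.0
        for c in range(k):
            vals = [p for p, l in zip(points, labels) if l == c]
            if vals:
                mean = sum(vals) / len(vals)
                total += sum((v - mean) ** 2 for v in vals)
        return total

    labels = [rng.randrange(k) for _ in points]
    cur, t = cost(labels), t0
    for _ in range(n_iter):
        i = rng.randrange(len(points))
        old = labels[i]
        labels[i] = rng.randrange(k)          # propose a relabeling
        new_cost = cost(labels)
        if new_cost <= cur or rng.random() < math.exp((cur - new_cost) / t):
            cur = new_cost                    # accept the move
        else:
            labels[i] = old                   # reject: restore the old label
        t *= cooling                          # cool toward greedy descent

    return labels

# two well-separated groups of simulated range measurements
ranges = [10.0 + 0.1 * i for i in range(10)] + [50.0 + 0.1 * i for i in range(10)]
labels = anneal_cluster(ranges)
```

The basic Monte Carlo variant the paper compares against corresponds to holding the temperature fixed instead of cooling it.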
Quantum Monte Carlo Calculations of Neutron Matter
Carlson, J; Ravenhall, D G
2003-01-01
Uniform neutron matter is approximated by a cubic box containing a finite number of neutrons, with periodic boundary conditions. We report variational and Green's function Monte Carlo calculations of the ground state of fourteen neutrons in a periodic box using the Argonne v8′ two-nucleon interaction at densities up to one and a half times the nuclear matter density. The effects of the finite box size are estimated using variational wave functions together with cluster expansion and chain summation techniques. They are small at subnuclear densities. We discuss the expansion of the energy of low-density neutron gas in powers of its Fermi momentum. This expansion is strongly modified by the large nn scattering length, and does not begin with the Fermi-gas kinetic energy as assumed in both Skyrme and relativistic mean field theories. The leading term of the neutron gas energy is approximately half the Fermi-gas kinetic energy. The quantum Monte Carlo results are also used to calibrate the accuracy of variational calculations ...
Information Geometry and Sequential Monte Carlo
Sim, Aaron; Stumpf, Michael P H
2012-01-01
This paper explores the application of methods from information geometry to the sequential Monte Carlo (SMC) sampler. In particular the Riemannian manifold Metropolis-adjusted Langevin algorithm (mMALA) is adapted for the transition kernels in SMC. Similar to its function in Markov chain Monte Carlo methods, the mMALA is a fully adaptable kernel which allows for efficient sampling of high-dimensional and highly correlated parameter spaces. We set up the theoretical framework for its use in SMC with a focus on the application to the problem of sequential Bayesian inference for dynamical systems as modelled by sets of ordinary differential equations. In addition, we argue that defining the sequence of distributions on geodesics optimises the effective sample sizes in the SMC run. We illustrate the application of the methodology by inferring the parameters of simulated Lotka-Volterra and Fitzhugh-Nagumo models. In particular we demonstrate that compared to employing a standard adaptive random walk kernel, the SM...
Quantum Monte Carlo Endstation for Petascale Computing
Energy Technology Data Exchange (ETDEWEB)
Lubos Mitas
2011-01-26
The NCSU research group has been focused on accomplishing the key goals of this initiative: establishing a new generation of quantum Monte Carlo (QMC) computational tools as part of the Endstation petaflop initiative for use at the DOE ORNL computational facilities and by the computational electronic structure community at large; carrying out high-accuracy quantum Monte Carlo demonstration projects applying these tools to forefront electronic structure problems in molecular and solid systems; and expanding, explaining and enhancing the impact of these advanced computational approaches. In particular, we have developed the quantum Monte Carlo code QWalk (www.qwalk.org), which was significantly expanded and optimized using funds from this support and has become an actively used tool in the petascale regime by ORNL researchers and beyond. These developments build upon efforts undertaken by the PI's group and collaborators over the last decade. The code was optimized and tested extensively on a number of parallel architectures, including the petaflop ORNL Jaguar machine. We have developed and redesigned a number of code modules, such as evaluation of wave functions and orbitals, calculation of pfaffians, and introduction of backflow coordinates, together with the overall organization of the code and random-walker distribution over multicore architectures. We have addressed several bottlenecks such as load balancing, and verified the efficiency and accuracy of the calculations with the other groups of the Endstation team. The QWalk package contains about 50,000 lines of high-quality object-oriented C++ and also includes interfaces to data files from other conventional electronic structure codes such as Gamess, Gaussian, Crystal and others. This grant supported the PI for one month during summers, a full-time postdoc and, partially, three graduate students over the period of the grant; it has resulted in 13
Energy Technology Data Exchange (ETDEWEB)
Ramirez Ros, J. C.; Jerez Sainz, M. I.; Lobato Munoz, M.; Jodar Lopez, C. A.; Ruiz Lopez, M. A.; Carrasco Rodriguez, J. L.; Pamos Urena, M.
2013-07-01
We evaluated the Monte Carlo Monaco Planner v2.0.3 for dose calculation in non-homogeneous low-density media (lung-equivalent), as a complement to the verification of the modeling in a homogeneous medium and prior to the introduction of the SBRT technique. We performed the same tests on Pinnacle v8.0m, with the same purpose. We compare the results obtained with the Monte Carlo algorithm of Monaco and the Collapsed Cone algorithm of Pinnacle. (Author)
Morse Monte Carlo Radiation Transport Code System
Energy Technology Data Exchange (ETDEWEB)
Emmett, M.B.
1975-02-01
The report contains descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
Variational Monte Carlo study of pentaquark states
Energy Technology Data Exchange (ETDEWEB)
Mark W. Paris
2005-07-01
Accurate numerical solution of the five-body Schrodinger equation is effected via variational Monte Carlo. The spectrum is assumed to exhibit a narrow resonance with strangeness S=+1. A fully antisymmetrized and pair-correlated five-quark wave function is obtained for the assumed non-relativistic Hamiltonian which has spin, isospin, and color dependent pair interactions and many-body confining terms which are fixed by the non-exotic spectra. Gauge field dynamics are modeled via flux tube exchange factors. The energy determined for the ground states with J=1/2 and negative (positive) parity is 2.22 GeV (2.50 GeV). A lower energy negative parity state is consistent with recent lattice results. The short-range structure of the state is analyzed via its diquark content.
Monte Carlo simulation of neutron scattering instruments
Energy Technology Data Exchange (ETDEWEB)
Seeger, P.A.; Daemen, L.L.; Hjelm, R.P. Jr.
1998-12-01
A code package consisting of the Monte Carlo library MCLIB, the executing code MC_RUN, the web application MC_Web, and various ancillary codes is proposed as an open standard for simulation of neutron scattering instruments. The architecture of the package includes structures to define surfaces, regions, and optical elements contained in regions. A particle is defined by its vector position and velocity, its time of flight, its mass and charge, and a polarization vector. The MC_RUN code handles neutron transport and bookkeeping, while the action on the neutron within any region is computed using algorithms that may be deterministic, probabilistic, or a combination. Complete versatility is possible because the existing library may be supplemented by any procedures a user is able to code. Some examples are shown.
Atomistic Monte Carlo simulation of lipid membranes
DEFF Research Database (Denmark)
Wüstner, Daniel; Sklenar, Heinz
2014-01-01
Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches …, as assessed by calculation of molecular energies and entropies. We also show transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential …
Experimental Monte Carlo Quantum Process Certification
Steffen, L; Fedorov, A; Baur, M; Wallraff, A
2012-01-01
Experimental implementations of quantum information processing have now reached a level of sophistication where quantum process tomography is impractical. The number of experimental settings as well as the computational cost of the data post-processing now translates to days of effort to characterize even experiments with as few as 8 qubits. Recently a more practical approach to determine the fidelity of an experimental quantum process has been proposed, where the experimental data is compared directly to an ideal process using Monte Carlo sampling. Here we present an experimental implementation of this scheme in a circuit quantum electrodynamics setup to determine the fidelity of two qubit gates, such as the cphase and the cnot gate, and three qubit gates, such as the Toffoli gate and two sequential cphase gates.
Gas discharges modeling by Monte Carlo technique
Directory of Open Access Journals (Sweden)
Savić Marija
2010-01-01
The basic assumption of the Townsend theory - that ions produce secondary electrons - is valid only in a very narrow range of the reduced electric field E/N. In accordance with the revised Townsend theory suggested by Phelps and Petrović, secondary electrons are produced in collisions of ions, fast neutrals, metastable atoms or photons with the cathode, or in gas-phase ionizations by fast neutrals. In this paper we build up a Monte Carlo code that can be used to calculate secondary electron yields for different types of particles. The obtained results are in good agreement with the analytical results of Phelps and Petrović [Plasma Sourc. Sci. Technol. 8 (1999) R1].
On nonlinear Markov chain Monte Carlo
Andrieu, Christophe; Doucet, Arnaud; Del Moral, Pierre; 10.3150/10-BEJ307
2011-01-01
Let $\mathscr{P}(E)$ be the space of probability measures on a measurable space $(E,\mathcal{E})$. In this paper we introduce a class of nonlinear Markov chain Monte Carlo (MCMC) methods for simulating from a probability measure $\pi\in\mathscr{P}(E)$. Nonlinear Markov kernels (see [Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications (2004) Springer]) $K:\mathscr{P}(E)\times E\rightarrow\mathscr{P}(E)$ can be constructed to, in some sense, improve over MCMC methods. However, such nonlinear kernels cannot be simulated exactly, so approximations of the nonlinear kernels are constructed using auxiliary or potentially self-interacting chains. Several nonlinear kernels are presented and it is demonstrated that, under some conditions, the associated approximations exhibit a strong law of large numbers; our proof technique is via the Poisson equation and Foster-Lyapunov conditions. We investigate the performance of our approximations with some simulations.
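For contrast with the nonlinear kernels discussed in this abstract, the standard linear MCMC building block they generalize, a random-walk Metropolis kernel, can be sketched in a few lines (an illustrative sketch only; the target density and step size are arbitrary choices, not from the paper):

```python
import math
import random

def metropolis(logpi, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis targeting the density proportional to exp(logpi)."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)          # symmetric Gaussian proposal
        # accept with probability min(1, pi(prop)/pi(x))
        if math.log(rng.random()) < logpi(prop) - logpi(x):
            x = prop
        samples.append(x)
    return samples

# Target: standard normal; the sample mean should drift toward 0
chain = metropolis(lambda x: -0.5 * x * x, 0.0, 50_000)
print(sum(chain) / len(chain))
```

A nonlinear kernel, by contrast, depends on the current distribution of the chain itself, which is why it must be approximated with auxiliary or self-interacting chains as the abstract describes.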
Monte Carlo exploration of warped Higgsless models
Energy Technology Data Exchange (ETDEWEB)
Hewett, JoAnne L.; Lillie, Benjamin; Rizzo, Thomas Gerard [Stanford Linear Accelerator Center, 2575 Sand Hill Rd., Menlo Park, CA, 94025 (United States)]. E-mail: rizzo@slac.stanford.edu
2004-10-01
We have performed a detailed Monte Carlo exploration of the parameter space for a warped Higgsless model of electroweak symmetry breaking in 5 dimensions. This model is based on the SU(2)_L x SU(2)_R x U(1)_{B-L} gauge group in an AdS_5 bulk with arbitrary gauge kinetic terms on both the Planck and TeV branes. Constraints arising from precision electroweak measurements and collider data are found to be relatively easy to satisfy. We show, however, that the additional requirement of perturbative unitarity up to the cut-off, ≈ 10 TeV, in W_L^+ W_L^- elastic scattering in the absence of dangerous tachyons eliminates all models. If successful models of this class exist, they must be highly fine-tuned. (author)
Monte Carlo Exploration of Warped Higgsless Models
Hewett, J L; Rizzo, T G
2004-01-01
We have performed a detailed Monte Carlo exploration of the parameter space for a warped Higgsless model of electroweak symmetry breaking in 5 dimensions. This model is based on the $SU(2)_L\times SU(2)_R\times U(1)_{B-L}$ gauge group in an AdS$_5$ bulk with arbitrary gauge kinetic terms on both the Planck and TeV branes. Constraints arising from precision electroweak measurements and collider data are found to be relatively easy to satisfy. We show, however, that the additional requirement of perturbative unitarity up to the cut-off, $\simeq 10$ TeV, in $W_L^+W_L^-$ elastic scattering in the absence of dangerous tachyons eliminates all models. If successful models of this class exist, they must be highly fine-tuned.
Monte Carlo Implementation of Polarized Hadronization
Matevosyan, Hrayr H; Thomas, Anthony W
2016-01-01
We study polarized quark hadronization in a Monte Carlo (MC) framework based on the recent extension of the quark-jet framework, where a self-consistent treatment of the quark polarization transfer in a sequential hadronization picture has been presented. Here, we first adopt this approach for MC simulations of the hadronization process with a finite number of produced hadrons, expressing the relevant probabilities in terms of the eight leading-twist quark-to-quark transverse momentum dependent (TMD) splitting functions (SFs) for the elementary $q \to q'+h$ transition. We present explicit expressions for the unpolarized and Collins fragmentation functions (FFs) of unpolarized hadrons emitted at rank two. Further, we demonstrate that all the current spectator-type model calculations of the leading-twist quark-to-quark TMD SFs violate the positivity constraints, and propose a quark-model-based ansatz for these input functions that circumvents the problem. We validate our MC framework by explicitly proving the absence o...
Commensurabilities between ETNOs: a Monte Carlo survey
Marcos, C de la Fuente
2016-01-01
Many asteroids in the main and trans-Neptunian belts are trapped in mean motion resonances with Jupiter and Neptune, respectively. As a side effect, they experience accidental commensurabilities among themselves. These commensurabilities define characteristic patterns that can be used to trace the source of the observed resonant behaviour. Here, we explore systematically the existence of commensurabilities between the known ETNOs using their heliocentric and barycentric semimajor axes, their uncertainties, and Monte Carlo techniques. We find that the commensurability patterns present in the known ETNO population resemble those found in the main and trans-Neptunian belts. Although based on small number statistics, such patterns can only be properly explained if most, if not all, of the known ETNOs are subjected to the resonant gravitational perturbations of yet undetected trans-Plutonian planets. We show explicitly that some of the statistically significant commensurabilities are compatible with the Planet Nin...
Variable length trajectory compressible hybrid Monte Carlo
Nishimura, Akihiko
2016-01-01
Hybrid Monte Carlo (HMC) generates samples from a prescribed probability distribution in a configuration space by simulating Hamiltonian dynamics, followed by the Metropolis (-Hastings) acceptance/rejection step. Compressible HMC (CHMC) generalizes HMC to a situation in which the dynamics is reversible but not necessarily Hamiltonian. This article presents a framework to further extend the algorithm. Within the existing framework, each trajectory of the dynamics must be integrated for the same amount of (random) time to generate a valid Metropolis proposal. Our generalized acceptance/rejection mechanism allows a more deliberate choice of the integration time for each trajectory. The proposed algorithm in particular enables an effective application of variable step size integrators to HMC-type sampling algorithms based on reversible dynamics. The potential of our framework is further demonstrated by another extension of HMC which reduces the wasted computations due to unstable numerical approximations and corr...
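The Metropolis-corrected Hamiltonian dynamics underlying HMC can be sketched compactly (an illustrative fixed-length leapfrog implementation for a one-dimensional standard normal target; the variable-length trajectories of the paper's generalized scheme are not shown):

```python
import math
import random

def hmc_step(logpi, grad, x, rng, eps=0.1, n_leap=20):
    """One HMC transition: leapfrog-integrate Hamiltonian dynamics,
    then apply the Metropolis accept/reject step on the total energy."""
    p = rng.gauss(0.0, 1.0)                      # resample momentum
    x_new, p_new = x, p
    p_new += 0.5 * eps * grad(x_new)             # leapfrog: half momentum step
    for _ in range(n_leap - 1):
        x_new += eps * p_new
        p_new += eps * grad(x_new)
    x_new += eps * p_new
    p_new += 0.5 * eps * grad(x_new)             # final half momentum step
    h_old = -logpi(x) + 0.5 * p * p              # Hamiltonian before/after
    h_new = -logpi(x_new) + 0.5 * p_new * p_new
    if math.log(rng.random()) < h_old - h_new:
        return x_new
    return x

# Standard normal target: logpi(z) = -z^2/2, gradient -z
rng = random.Random(1)
x, samples = 0.0, []
for _ in range(5000):
    x = hmc_step(lambda z: -0.5 * z * z, lambda z: -z, x, rng)
    samples.append(x)
```

CHMC and the framework above relax exactly the constraint visible here: every trajectory uses the same `eps` and `n_leap`, whereas the paper's acceptance mechanism permits a per-trajectory choice of integration time.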
Lunar Regolith Albedos Using Monte Carlos
Wilson, T. L.; Andersen, V.; Pinsky, L. S.
2003-01-01
The analysis of planetary regoliths for their backscatter albedos produced by cosmic rays (CRs) is important for space exploration and its potential contributions to science investigations in fundamental physics and astrophysics. Albedos affect all such experiments and the personnel that operate them. Groups have analyzed the production rates of various particles and elemental species by planetary surfaces when bombarded with Galactic CR fluxes, both theoretically and by means of various transport codes, some of which have emphasized neutrons. Here we report on the preliminary results of our current Monte Carlo investigation into the production of charged particles, neutrons, and neutrinos by the lunar surface using FLUKA. In contrast to previous work, the effects of charm are now included.
Nuclear reactions in Monte Carlo codes.
Ferrari, A; Sala, P R
2002-01-01
The physics foundations of hadronic interactions as implemented in most Monte Carlo codes are presented together with a few practical examples. The description of the relevant physics is presented schematically split into the major steps in order to stress the different approaches required for the full understanding of nuclear reactions at intermediate and high energies. Due to the complexity of the problem, only a few semi-qualitative arguments are developed in this paper. The description will be necessarily schematic and somewhat incomplete, but hopefully it will be useful for a first introduction into this topic. Examples are shown mostly for the high energy regime, where all mechanisms mentioned in the paper are at work and to which perhaps most of the readers are less accustomed. Examples for lower energies can be found in the references.
Atomistic Monte Carlo simulation of lipid membranes
DEFF Research Database (Denmark)
Wüstner, Daniel; Sklenar, Heinz
2014-01-01
Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction …, as assessed by calculation of molecular energies and entropies. We also show transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential … of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol.
Geometric Monte Carlo and Black Janus Geometries
Bak, Dongsu; Kim, Kyung Kiu; Min, Hyunsoo; Song, Jeong-Pil
2016-01-01
We describe an application of the Monte Carlo method to the Janus deformation of the black brane background. We present numerical results for three- and five-dimensional black Janus geometries with planar and spherical interfaces. In particular, we argue that the 5D geometry with a spherical interface has an application in understanding the finite-temperature bag-like QCD model via the AdS/CFT correspondence. The accuracy and convergence of the algorithm are evaluated with respect to the grid spacing. The systematic errors of the method are determined using an exact solution of 3D black Janus. This numerical approach for solving linear problems is unaffected by the initial guess of a trial solution and can handle an arbitrary geometry under various boundary conditions in the presence of source fields.
Accurate barrier heights using diffusion Monte Carlo
Krongchon, Kittithat; Wagner, Lucas K
2016-01-01
Fixed node diffusion Monte Carlo (DMC) has been performed on a test set of forward and reverse barrier heights for 19 non-hydrogen-transfer reactions, and the nodal error has been assessed. The DMC results are robust to changes in the nodal surface, as assessed by using different mean-field techniques to generate single determinant wave functions. Using these single determinant nodal surfaces, DMC results in errors of 1.5(5) kcal/mol on barrier heights. Using the large data set of DMC energies, we attempted to find good descriptors of the fixed node error. It does not correlate with a number of descriptors including change in density, but does correlate with the gap between the highest occupied and lowest unoccupied orbital energies in the mean-field calculation.
Recent Developments in Quantum Monte Carlo: Methods and Applications
Aspuru-Guzik, Alan; Austin, Brian; Domin, Dominik; Galek, Peter T. A.; Handy, Nicholas; Prasad, Rajendra; Salomon-Ferrer, Romelia; Umezawa, Naoto; Lester, William A.
2007-12-01
The quantum Monte Carlo method in the diffusion Monte Carlo form has become recognized for its capability of describing the electronic structure of atomic, molecular and condensed matter systems to high accuracy. This talk will briefly outline the method with emphasis on recent developments connected with trial function construction, linear scaling, and applications to selected systems.
QUANTUM MONTE-CARLO SIMULATIONS - ALGORITHMS, LIMITATIONS AND APPLICATIONS
DERAEDT, H
1992-01-01
A survey is given of Quantum Monte Carlo methods currently used to simulate quantum lattice models. The formalisms employed to construct the simulation algorithms are sketched. The origin of fundamental (minus sign) problems which limit the applicability of the Quantum Monte Carlo approach is shown.
QWalk: A Quantum Monte Carlo Program for Electronic Structure
Wagner, Lucas K; Mitas, Lubos
2007-01-01
We describe QWalk, a new computational package capable of performing Quantum Monte Carlo electronic structure calculations for molecules and solids with many electrons. We describe the structure of the program and its implementation of Quantum Monte Carlo methods. It is open-source, licensed under the GPL, and available at the web site http://www.qwalk.org
Quantum Monte Carlo Simulations : Algorithms, Limitations and Applications
Raedt, H. De
1992-01-01
A survey is given of Quantum Monte Carlo methods currently used to simulate quantum lattice models. The formalisms employed to construct the simulation algorithms are sketched. The origin of fundamental (minus sign) problems which limit the applicability of the Quantum Monte Carlo approach is shown.
Reporting Monte Carlo Studies in Structural Equation Modeling
Boomsma, Anne
2013-01-01
In structural equation modeling, Monte Carlo simulations have been used increasingly over the last two decades, as an inventory from the journal Structural Equation Modeling illustrates. Reaching out to a broad audience, this article provides guidelines for reporting Monte Carlo studies in that field.
Practical schemes for accurate forces in quantum Monte Carlo
Moroni, S.; Saccani, S.; Filippi, Claudia
2014-01-01
While the computation of interatomic forces has become a well-established practice within variational Monte Carlo (VMC), the use of the more accurate Fixed-Node Diffusion Monte Carlo (DMC) method is still largely limited to the computation of total energies on structures obtained at a lower level of
Efficiency and accuracy of Monte Carlo (importance) sampling
Waarts, P.H.
2003-01-01
Monte Carlo analysis is often regarded as the simplest and most accurate reliability method; it is also the most transparent method. The only problem is the trade-off between accuracy and efficiency: Monte Carlo becomes less efficient or less accurate when very low probabilities are to be computed.
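The low-probability difficulty named in this abstract is exactly what importance sampling addresses. A minimal sketch (illustrative only, not from the paper): estimating the small tail probability P(X > 4) for a standard normal by sampling from a proposal shifted into the tail and reweighting by the density ratio.

```python
import math
import random

def tail_prob_naive(t, n, seed=0):
    """Plain Monte Carlo: almost no samples ever land in the far tail."""
    rng = random.Random(seed)
    return sum(rng.gauss(0.0, 1.0) > t for _ in range(n)) / n

def tail_prob_importance(t, n, seed=0):
    """Importance sampling: draw from N(t, 1), centred on the tail,
    and reweight by phi(x) / phi(x - t) = exp(-t*x + t*t/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(t, 1.0)
        if x > t:
            total += math.exp(-t * x + t * t / 2.0)
    return total / n

# P(X > 4) is about 3.17e-5. With only 10,000 samples the naive estimator
# typically sees zero hits, while the reweighted one is already accurate.
print(tail_prob_naive(4.0, 10_000), tail_prob_importance(4.0, 10_000))
```

The design point: shifting the proposal concentrates samples where the rare event lives, and the exponential weight exactly corrects for the bias, drastically reducing the variance per sample.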
The Monte Carlo Method. Popular Lectures in Mathematics.
Sobol', I. M.
The Monte Carlo Method is a method of approximately solving mathematical and physical problems by the simulation of random quantities. The principal goal of this booklet is to suggest to specialists in all areas that they will encounter problems which can be solved by the Monte Carlo Method. Part I of the booklet discusses the simulation of random…
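The booklet's idea of "approximately solving problems by the simulation of random quantities" can be illustrated with a minimal, self-contained sketch (illustrative only, not taken from the booklet): estimating π from the fraction of uniform random points that fall inside a quarter circle.

```python
import random

def estimate_pi(n_samples, seed=0):
    """Estimate pi as 4 times the fraction of uniform points in the
    unit square that land inside the quarter unit circle."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4.0 * hits / n_samples

print(estimate_pi(100_000))
```

The statistical error of such an estimate shrinks like 1/sqrt(n), independent of dimension, which is what makes the method attractive for the high-dimensional problems the booklet surveys.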
Forest canopy BRDF simulation using Monte Carlo method
Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.
2006-01-01
The Monte Carlo method is a stochastic statistical method that has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopy in the field of visible remote sensing. The random process between photons and the forest canopy was designed using the Monte Carlo method.
Sensitivity of Monte Carlo simulations to input distributions
Energy Technology Data Exchange (ETDEWEB)
RamoRao, B. S.; Srikanta Mishra, S.; McNeish, J.; Andrews, R. W.
2001-07-01
The sensitivity of the results of a Monte Carlo simulation to the shapes and moments of the probability distributions of the input variables is studied. An economical computational scheme is presented as an alternative to the replicate Monte Carlo simulations and is explained with an illustrative example. (Author) 4 refs.
Quantum Monte Carlo using a Stochastic Poisson Solver
Energy Technology Data Exchange (ETDEWEB)
Das, D; Martin, R M; Kalos, M H
2005-05-06
Quantum Monte Carlo (QMC) is an extremely powerful method to treat many-body systems. Usually quantum Monte Carlo has been applied in cases where the interaction potential has a simple analytic form, like the 1/r Coulomb potential. However, in a complicated environment such as a semiconductor heterostructure, the evaluation of the interaction itself becomes a non-trivial problem. Obtaining the potential from any grid-based finite-difference method, for every walker and every step, is infeasible. We demonstrate an alternative approach of solving the Poisson equation by a classical Monte Carlo within the overall quantum Monte Carlo scheme. We have developed a modified 'Walk On Spheres' algorithm using Green's function techniques, which can efficiently account for the interaction energy of walker configurations typical of quantum Monte Carlo algorithms. This stochastically obtained potential can be easily incorporated within popular quantum Monte Carlo techniques like variational Monte Carlo (VMC) or diffusion Monte Carlo (DMC). We demonstrate the validity of this method by studying a simple problem, the polarization of a helium atom in the electric field of an infinite capacitor.
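The classical 'Walk On Spheres' idea that the modified algorithm builds on can be sketched for the simplest case, Laplace's equation in the unit disk (an illustrative sketch, not the authors' Green's-function variant): a walker repeatedly jumps to a uniform point on the largest circle centred at its position that fits inside the domain, until it comes within ε of the boundary, where the boundary value is recorded.

```python
import math
import random

def walk_on_spheres(x, y, g, eps=1e-3, n_walkers=20_000, seed=0):
    """Estimate the harmonic function with boundary values g on the
    unit circle, evaluated at the interior point (x, y)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walkers):
        px, py = x, y
        while True:
            r = 1.0 - math.hypot(px, py)          # distance to the boundary
            if r < eps:
                break
            theta = rng.uniform(0.0, 2.0 * math.pi)
            px += r * math.cos(theta)             # jump to the sphere surface
            py += r * math.sin(theta)
        norm = math.hypot(px, py)                 # project onto the boundary
        total += g(px / norm, py / norm)
    return total / n_walkers

# u(x, y) = x is harmonic, so the estimate at (0.5, 0.2) should be near 0.5
print(walk_on_spheres(0.5, 0.2, lambda bx, by: bx))
```

Because each walk needs only O(log(1/ε)) jumps, the method evaluates the potential at a point without any grid, which is the property the abstract exploits for per-walker, per-step potential evaluations.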
Further experience in Bayesian analysis using Monte Carlo Integration
H.K. van Dijk (Herman); T. Kloek (Teun)
1980-01-01
An earlier paper [Kloek and Van Dijk (1978)] is extended in three ways. First, Monte Carlo integration is performed in a nine-dimensional parameter space of Klein's model I [Klein (1950)]. Second, Monte Carlo is used as a tool for the elicitation of a uniform prior on a finite region by
New Approaches and Applications for Monte Carlo Perturbation Theory
Energy Technology Data Exchange (ETDEWEB)
Aufiero, Manuele; Bidaud, Adrien; Kotlyar, Dan; Leppänen, Jaakko; Palmiotti, Giuseppe; Salvatores, Massimo; Sen, Sonat; Shwageraus, Eugene; Fratoni, Massimiliano
2017-02-01
This paper presents some of the recent and new advancements in the extension of Monte Carlo Perturbation Theory methodologies and applications. In particular, the discussed problems involve burnup calculation, perturbation calculation based on continuous energy functions, and Monte Carlo Perturbation Theory in loosely coupled systems.
Forest canopy BRDF simulation using Monte Carlo method
Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.
2006-01-01
The Monte Carlo method is a stochastic statistical method that has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopy in the field of visible remote sensing. The random process between photons and the forest canopy was designed using the Monte Carlo method.
Practical schemes for accurate forces in quantum Monte Carlo
Moroni, S.; Saccani, S.; Filippi, C.
2014-01-01
While the computation of interatomic forces has become a well-established practice within variational Monte Carlo (VMC), the use of the more accurate Fixed-Node Diffusion Monte Carlo (DMC) method is still largely limited to the computation of total energies on structures obtained at a lower level of
CERN Summer Student Report 2016 Monte Carlo Data Base Improvement
Caciulescu, Alexandru Razvan
2016-01-01
During my Summer Student project I worked on improving the Monte Carlo Data Base and MonALISA services for the ALICE Collaboration. The project included learning the infrastructure for tracking and monitoring of the Monte Carlo productions as well as developing a new RESTful API for seamless integration with the JIRA issue tracking framework.
Optical coherence tomography: Monte Carlo simulation and improvement by optical amplification
DEFF Research Database (Denmark)
Tycho, Andreas
2002-01-01
An advanced novel Monte Carlo simulation model of the detection process of an optical coherence tomography (OCT) system is presented. For the first time it is shown analytically that the applicability of the incoherent Monte Carlo approach to model the heterodyne detection process of an OCT system … model of the OCT signal. The OCT signal from a scattering medium is obtained for several beam and sample geometries using the new Monte Carlo model, and when comparing to results of an analytical model based on the extended Huygens-Fresnel principle excellent agreement is obtained. With the greater flexibility of Monte Carlo simulations, this new model is demonstrated to be excellent as a numerical phantom, i.e., as a substitute for otherwise difficult experiments. Finally, a new model of the signal-to-noise ratio (SNR) of an OCT system with optical amplification of the light reflected from the sample …
Accelerated GPU based SPECT Monte Carlo simulations
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-01
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99m Tc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency
Monte Carlo modelling of TRIGA research reactor
El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.
2010-10-01
The Moroccan 2 MW TRIGA MARK II research reactor at the Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively implement the various fields of basic nuclear research, manpower training, and production of radioisotopes for use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparison with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous-energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core, with literally no physical approximation. Continuous-energy cross-section data from the more recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3), as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code, were used. The cross-section libraries were generated using the NJOY99 system updated to its more recent patch file "up259". The consistency and accuracy of both the Monte Carlo simulation and the neutron transport physics were established by benchmarking against the TRIGA experiments. Core excess reactivity, total and integral control rod worths, as well as power peaking factors were used in the validation process. Results of the calculations are analysed and discussed.
Accelerated GPU based SPECT Monte Carlo simulations.
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-07
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. In addition, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency.
Monte Carlo scatter correction for SPECT
Liu, Zemei
The goal of this dissertation is to present a quantitatively accurate and computationally fast scatter correction method that is robust and easily accessible for routine applications in SPECT imaging. A Monte Carlo based scatter estimation method is investigated and developed further. The Monte Carlo simulation program SIMIND (Simulating Medical Imaging Nuclear Detectors) was specifically developed to simulate clinical SPECT systems. The SIMIND scatter estimation (SSE) method was developed further using a multithreading technique to distribute the scatter estimation task across multiple threads running concurrently on multi-core CPUs to accelerate the scatter estimation process. An analytical collimator model, which produces less noisy estimates, was used during SSE. The research includes the addition to SIMIND of charge transport modeling in cadmium zinc telluride (CZT) detectors. Phenomena associated with radiation-induced charge transport, including charge trapping, charge diffusion, charge sharing between neighboring detector pixels, as well as uncertainties in the detection process, are addressed. Experimental measurements and simulation studies were designed for scintillation crystal based SPECT and CZT based SPECT systems to verify and evaluate the expanded SSE method. Jaszczak Deluxe and Anthropomorphic Torso Phantoms (Data Spectrum Corporation, Hillsborough, NC, USA) were used for experimental measurements, and digital versions of the same phantoms were employed during simulations to mimic the experimental acquisitions. This study design enabled easy comparison of experimental and simulated data. The results have consistently shown that the SSE method performed similarly to or better than the triple energy window (TEW) and effective scatter source estimation (ESSE) methods in experiments on all the clinical SPECT systems. The SSE method is proven to be viable for scatter estimation in routine clinical use.
Fission Matrix Capability for MCNP Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Carney, Sean E. [Los Alamos National Laboratory; Brown, Forrest B. [Los Alamos National Laboratory; Kiedrowski, Brian C. [Los Alamos National Laboratory; Martin, William R. [Los Alamos National Laboratory
2012-09-05
In a Monte Carlo criticality calculation, before the tallying of quantities can begin, a converged fission source (the fundamental eigenvector of the fission kernel) is required. Tallies of interest may include powers, absorption rates, leakage rates, or the multiplication factor (the fundamental eigenvalue of the fission kernel, k_eff). Just as in the power iteration method of linear algebra, if the dominance ratio (the ratio of the first and zeroth eigenvalues) is high, many iterations of neutron history simulations are required to isolate the fundamental mode of the problem. Optically large systems have large dominance ratios, and systems containing poor neutron communication between regions are also slow to converge. The fission matrix method, implemented into MCNP [1], addresses these problems. When a Monte Carlo random walk from a source is executed, the fission kernel is stochastically applied to the source. Random numbers are used for distances to collision, reaction types, scattering physics, fission reactions, etc. This method is used because the fission kernel is a complex, 7-dimensional operator that is not explicitly known. Deterministic methods use approximations/discretization in energy, space, and direction of the kernel; consequently, they are faster. Monte Carlo directly simulates the physics, which necessitates the use of random sampling. Because of this statistical noise, common convergence acceleration methods used in deterministic methods do not work. In the fission matrix method, we use the random walk information not only to build the next-iteration fission source, but also to tally a spatially averaged fission kernel. Just as in deterministic methods, this involves approximation and discretization. The approximation is the tallying of the spatially discretized fission kernel with an incorrect fission source. We address this by making the spatial mesh fine enough that this error is negligible. As a consequence of discretization we get a
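The power-iteration structure described above can be sketched on a toy, spatially discretized fission matrix. The 20-region Gaussian kernel below is purely illustrative (it does not come from MCNP or any reactor model); its columns are scaled so that the fundamental eigenvalue k_eff equals 1.1 by construction, which makes the sketch easy to check.

```python
import numpy as np

# Toy spatially discretized fission kernel: F[i, j] ~ expected fission
# neutrons born in region i per fission neutron born in region j.
# Illustrative Gaussian coupling between 20 regions, scaled so that
# every column sums to 1.1 (hence k_eff = 1.1 by construction).
n = 20
idx = np.arange(n)
G = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 3.0) ** 2)
F = G * (1.1 / G.sum(axis=0))

def power_iteration(F, tol=1e-12, max_iter=10_000):
    """Fundamental eigenpair (k_eff, fission source) of the fission matrix."""
    s = np.full(F.shape[0], 1.0 / F.shape[0])  # flat starting source
    for _ in range(max_iter):
        s_new = F @ s                  # apply the fission kernel
        k = s_new.sum() / s.sum()      # generation-wise multiplication factor
        s_new /= s_new.sum()           # renormalize the source
        if np.abs(s_new - s).max() < tol:
            return k, s_new
        s = s_new
    return k, s

k_eff, source = power_iteration(F)
```

In the fission matrix method, the same iteration is applied to the kernel tallied during the random walk, so the fundamental source and k_eff can be extracted cheaply between generations.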
Monte Carlo Modeling of Crystal Channeling at High Energies
Schoofs, Philippe; Cerutti, Francesco
Charged particles entering a crystal close to some preferred direction can be trapped in the electromagnetic potential well existing between consecutive planes or strings of atoms. This channeling effect can be used to extract beam particles if the crystal is bent beforehand. Crystal channeling is becoming a reliable and efficient technique for collimating beams and removing halo particles. At CERN, the installation of silicon crystals in the LHC is under scrutiny by the UA9 collaboration with the goal of investigating whether they are a viable option for the collimation system upgrade. This thesis describes a new Monte Carlo model of planar channeling which has been developed from scratch in order to be implemented in the FLUKA code, which simulates particle transport and interactions. Crystal channels are described through the concept of a continuous potential, taking into account the thermal motion of the lattice atoms and using the Molière screening function. The energy of the particle transverse motion determines whether or n...
Vectorized Monte Carlo methods for reactor lattice analysis
Brown, F. B.
1984-01-01
Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for use on the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.
Quantum Monte Carlo methods algorithms for lattice models
Gubernatis, James; Werner, Philipp
2016-01-01
Featuring detailed explanations of the major algorithms used in quantum Monte Carlo simulations, this is the first textbook of its kind to provide a pedagogical overview of the field and its applications. The book provides a comprehensive introduction to the Monte Carlo method, its use, and its foundations, and examines algorithms for the simulation of quantum many-body lattice problems at finite and zero temperature. These algorithms include continuous-time loop and cluster algorithms for quantum spins, determinant methods for simulating fermions, power methods for computing ground and excited states, and the variational Monte Carlo method. Also discussed are continuous-time algorithms for quantum impurity models and their use within dynamical mean-field theory, along with algorithms for analytically continuing imaginary-time quantum Monte Carlo data. The parallelization of Monte Carlo simulations is also addressed. This is an essential resource for graduate students, teachers, and researchers interested in ...
Baräo, Fernando; Nakagawa, Masayuki; Távora, Luis; Vaz, Pedro
2001-01-01
This book focuses on the state of the art of Monte Carlo methods in radiation physics and particle transport simulation and applications, the latter involving, in particular, the use and development of electron-gamma, neutron-gamma and hadronic codes. Besides the basic theory and the methods employed, special attention is paid to algorithm development for modeling, and to the analysis of experiments and measurements in a variety of fields ranging from particle to medical physics.
Improvements in Monte Carlo Simulation of Large Electron Fields
Energy Technology Data Exchange (ETDEWEB)
Faddegon, Bruce A.; /UC, San Francisco; Perl, Joseph; Asai, Makoto; /SLAC
2007-11-28
Two Monte Carlo systems, EGSnrc and Geant4, were used to calculate dose distributions in large electron fields used in radiotherapy. Source and geometry parameters were adjusted to match calculated results with measurement. Both codes were capable of accurately reproducing the measured dose distributions of the 6 electron beams available on the accelerator. Depth penetration was matched to 0.1 cm. Depth dose curves generally agreed to 2% in the build-up region, although there is an additional 2-3% experimental uncertainty in this region. Dose profiles matched to 2% at the depth of maximum dose in the central region of the beam, out to the point of the profile where the dose begins to fall rapidly. A 3%/3mm match was obtained outside the central region except for the 6 MeV beam, where dose differences reached 5%. The discrepancy observed in the bremsstrahlung tail in published results that used EGS4 is no longer evident. The different systems required different source energies, incident beam angles, thicknesses of the exit window and primary foils, and distance between the primary and secondary foil. These results underscore the requirement for an experimental benchmark of electron scatter for beam energies and foils relevant to radiotherapy.
Monte Carlo simulation of zinc protoporphyrin fluorescence in the retina
Chen, Xiaoyan; Lane, Stephen
2010-02-01
We have used Monte Carlo simulation of autofluorescence in the retina to determine that noninvasive detection of nutritional iron deficiency is possible. Nutritional iron deficiency (which leads to iron deficiency anemia) affects more than 2 billion people worldwide, and there is an urgent need for a simple, noninvasive diagnostic test. Zinc protoporphyrin (ZPP) is a fluorescent compound that accumulates in red blood cells and is used as a biomarker for nutritional iron deficiency. We developed a computational model of the eye, using parameters that were identified either by literature search or by direct experimental measurement, to test the possibility of detecting ZPP noninvasively in the retina. By incorporating fluorescence into Steven Jacques' original code for multi-layered tissue, we performed Monte Carlo simulation of fluorescence in the retina and determined that if the beam is not focused on a blood vessel in the neural retina layer, or if only part of the light hits the vessel, ZPP fluorescence will be 10-200 times higher than the background lipofuscin fluorescence coming from the retinal pigment epithelium (RPE) layer directly below. In addition, we found that if the light can be focused entirely onto a blood vessel in the neural retina layer, the fluorescence signal comes only from ZPP; the fluorescence from the layers below does not contribute to the signal. Therefore, the prospect of building a device to detect ZPP fluorescence in the retina looks very promising.
Iterative acceleration methods for Monte Carlo and deterministic criticality calculations
Energy Technology Data Exchange (ETDEWEB)
Urbatsch, T.J.
1995-11-01
If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors.
Application of Monte Carlo methods in tomotherapy and radiation biophysics
Hsiao, Ya-Yun
Helical tomotherapy is an attractive treatment for cancer therapy because highly conformal dose distributions can be achieved while the on-board megavoltage CT provides simultaneous images for accurate patient positioning. The convolution/superposition (C/S) dose calculation methods typically used for tomotherapy treatment planning may overestimate skin (superficial) doses by 3-13%. Although more accurate than C/S methods, Monte Carlo (MC) simulations are too slow for routine clinical treatment planning. However, the computational requirements of MC can be reduced by developing a source model for the parts of the accelerator that do not change from patient to patient. This source model then becomes the starting point for additional simulations of the penetration of radiation through the patient. In the first section of this dissertation, a source model for helical tomotherapy is constructed by condensing information from MC simulations into a series of analytical formulas. The MC-calculated percentage depth dose and beam profiles computed using the source model agree within 2% of measurements for a wide range of field sizes, which suggests that the proposed source model provides an adequate representation of the tomotherapy head for dose calculations. Monte Carlo methods are a versatile technique for simulating many physical, chemical and biological processes. In the second major part of this thesis, a new methodology is developed to simulate the induction of DNA damage by low-energy photons. First, the PENELOPE Monte Carlo radiation transport code is used to estimate the spectrum of initial electrons produced by photons. The initial spectrum of electrons is then combined with DNA damage yields for monoenergetic electrons from the fast Monte Carlo damage simulation (MCDS) developed earlier by Semenenko and Stewart (Purdue University). Single- and double-strand break yields predicted by the proposed methodology are in good agreement (1%) with the results of published
PREFACE: First European Workshop on Monte Carlo Treatment Planning
Reynaert, Nick
2007-07-01
The "First European Workshop on Monte Carlo treatment planning" was an initiative of the European working group on Monte Carlo treatment planning (EWG-MCTP). It was organised at Ghent University (Belgium) on 22-25 October 2006. The meeting was very successful and was attended by 150 participants. The impressive list of invited speakers and the scientific contributions (posters and oral presentations) led to a very interesting program that was well appreciated by all attendees. In addition, the presence of seven vendors of commercial MCTP software systems provided serious added value to the workshop. For each vendor, a representative gave a presentation in a dedicated session, explaining the current status of their system. It is clear that, for "traditional" radiotherapy applications (using photon or electron beams), Monte Carlo dose calculations have become the state of the art, and are being introduced into almost all commercial treatment planning systems. Invited lectures illustrated that scientific challenges are currently associated with 4D applications (e.g. respiratory motion) and the introduction of MC dose calculations in inverse planning. But it was striking that the Monte Carlo technique is also becoming very important in more novel treatment modalities such as BNCT, hadron therapy, stereotactic radiosurgery, Tomotherapy, etc. This emphasizes the continuously growing interest in MCTP. The people who attended the dosimetry session will certainly remember the high-level discussion on the determination of correction factors for different ion chambers used in small fields. The following proceedings will certainly confirm the high scientific level of the meeting. I would like to thank the members of the local organizing committee for all the hard work done before, during and after this meeting. The organisation of such an event is not a trivial task and it would not have been possible without the help of all my colleagues. I would also like to thank
Wieslander, Elinore; Knöös, Tommy
2007-02-01
The introduction of Monte Carlo (MC) techniques for treatment planning and also for verification purposes will have considerable impact on the radiation therapy planning process. The aim of this work was to use a virtual accelerator to study the performance of a MC-based electron dose calculation algorithm, implemented in a commercial treatment planning system. The performance in phantoms containing air and bone as well as in patient-specific geometries (thorax wall, nose, parotid gland and spinal cord) has been studied. The agreement between the virtual accelerator and the MC dose calculation algorithm is generally very good. A gamma-evaluation with criteria of 0.03 Gy/3 mm (per Gy at the depth of maximum dose) shows that, even for the worst cases, only a small volume of about 1.5% has gamma>1.0. In the worst case, with the 0.02 Gy/2 mm criteria, about 92% of the volume receiving more than 0.85 Gy per 100 monitor units (MU) has gamma-values <1.0. The corresponding value for the volume receiving more than 0.10 Gy/100 MU is about 98%. For the 18 MeV spinal-cord case, where a 6 x 20 cm2 insert is used, the TPS underestimates the dose outside the primary field due to inadequate modelling of the insert. The possibility of dose calculations in typical patient cases makes the virtual accelerator a powerful tool for validation and evaluation of dose calculation algorithms present in treatment planning systems.
EL Bakkali, Jaafar; EL Bardouni, Tarek; Safavi, Seyedmostafa; Mohammed, Maged; Saeed, Mroan
2016-08-01
The aim of this work is to assess the capabilities of the Monte Carlo Geant4 code to reproduce the real percentage depth dose (PDD) curves generated in phantoms which mimic three important clinical treatment situations: lung slab, bone slab, and bone-lung slab geometries. It is hoped that this work will lead us to a better understanding of dose distributions in an inhomogeneous medium, and help identify any limitations of the dose calculation algorithm implemented in the Geant4 code. For this purpose, the PDD dosimetric functions associated with the three clinical situations described above were compared to one produced in a homogeneous water phantom. Our results show, first, that the Geant4 simulation produces errors in the shape of the calculated PDD curve of the first physical test object (PTO), and that it is not able to successfully predict dose values in regions near the boundaries between two different materials. This is surely due to the electron transport algorithm and is well known as the interface artifact phenomenon. To deal with this issue, we added and optimized the StepMax parameter in the dose calculation program; consequently, the artifacts due to electron transport virtually disappeared. However, the Geant4 simulation becomes painfully slow when we attempt to completely resolve the electron artifact problems by choosing a smaller value of the electron StepMax parameter. After electron transport optimization, our results demonstrate the medium-level capability of the Geant4 code to model dose distributions in clinical PTOs.
Information-Geometric Markov Chain Monte Carlo Methods Using Diffusions
Directory of Open Access Journals (Sweden)
Samuel Livingstone
2014-06-01
Recent work incorporating geometric ideas in Markov chain Monte Carlo is reviewed in order to highlight these advances and their possible application in a range of domains beyond statistics. A full exposition of Markov chains and their use in Monte Carlo simulation for statistical inference and molecular dynamics is provided, with particular emphasis on methods based on Langevin diffusions. After this, geometric concepts in Markov chain Monte Carlo are introduced. A full derivation of the Langevin diffusion on a Riemannian manifold is given, together with a discussion of the appropriate Riemannian metric choice for different problems. A survey of applications is provided, and some open questions are discussed.
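As a concrete sketch of the Langevin-based methods reviewed here, the following is a minimal Metropolis-adjusted Langevin algorithm (MALA) on flat Euclidean space, before any Riemannian metric is introduced; the standard-normal target, step size, and sample count are illustrative choices, not taken from the paper.

```python
import numpy as np

def mala(logpi, grad_logpi, x0, step=0.5, n_samples=20_000, seed=1):
    """Metropolis-adjusted Langevin algorithm: an Euler-Maruyama step of
    the Langevin diffusion serves as the proposal, and a Metropolis
    accept/reject step corrects the discretization error."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        mean_fwd = x + 0.5 * step * grad_logpi(x)
        prop = mean_fwd + np.sqrt(step) * rng.standard_normal(x.shape)
        mean_bwd = prop + 0.5 * step * grad_logpi(prop)
        # log proposal densities, up to a common additive constant
        log_q_fwd = -np.sum((prop - mean_fwd) ** 2) / (2 * step)
        log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2 * step)
        log_alpha = logpi(prop) - logpi(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        samples[i] = x
    return samples

# Illustrative target: standard normal, log pi(x) = -x^2/2 + const
chain = mala(lambda x: -0.5 * np.sum(x ** 2), lambda x: -x, x0=[3.0])
```

The geometric variants discussed in the review replace the identity preconditioner implicit in this proposal with a position-dependent Riemannian metric.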
The Monte Carlo method the method of statistical trials
Shreider, YuA
1966-01-01
The Monte Carlo Method: The Method of Statistical Trials is a systematic account of the fundamental concepts and techniques of the Monte Carlo method, together with its range of applications. Some of these applications include the computation of definite integrals, neutron physics, and the investigation of servicing processes. This volume comprises seven chapters and begins with an overview of the basic features of the Monte Carlo method and typical examples of its application to simple problems in computational mathematics. The next chapter examines the computation of multi-dimensional...
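The "statistical trials" viewpoint can be illustrated with the book's opening application, computing a definite integral: average the integrand at uniform random points and scale by the interval length. The integrand and sample count below are illustrative choices.

```python
import random

def mc_integrate(f, a, b, n, seed=42):
    """Estimate the integral of f over [a, b] as (b - a) times the
    average of f at n uniform random points (the statistical trials)."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Integral of x^2 over [0, 1]; the exact value is 1/3.
est = mc_integrate(lambda x: x * x, 0.0, 1.0, 100_000)
```

The standard error of such an estimate shrinks as 1/sqrt(n) independently of dimension, which is why the same recipe extends to the multi-dimensional integrals treated next.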
Rare event simulation using Monte Carlo methods
Rubino, Gerardo
2009-01-01
In a probabilistic model, a rare event is an event with a very small probability of occurrence. The forecasting of rare events is a formidable task but is important in many areas: for instance, a catastrophic failure in a transport system or in a nuclear power plant, or the failure of an information processing system in a bank, or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, that is, the simulation of the corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented along with an exposition of how to apply these tools to a variety of fields, ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...
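Importance sampling, one of the two tools the book presents, can be sketched on a textbook rare event: the upper tail P(X > 4) of a standard normal, for which crude Monte Carlo would waste almost every sample. The shifted-mean proposal below is a common generic choice for this target, not a method taken from the book.

```python
import math
import random

def tail_prob_is(t=4.0, n=100_000, seed=7):
    """Estimate P(X > t) for standard normal X by importance sampling:
    draw from the shifted proposal N(t, 1), so the rare region is hit
    about half the time, and reweight each hit by the likelihood ratio
    phi(y) / phi(y - t) = exp(-t*y + t^2/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(t, 1.0)                      # proposal draw
        if y > t:                                  # indicator of the rare event
            total += math.exp(-t * y + t * t / 2)  # likelihood ratio weight
    return total / n

est = tail_prob_is()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))      # closed form, about 3.2e-5
```

Crude Monte Carlo with the same budget would see only a handful of hits; the reweighted estimator reaches a few-percent relative error with 10^5 draws.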
A continuation multilevel Monte Carlo algorithm
Collier, Nathan
2014-09-05
We propose a novel Continuation Multi Level Monte Carlo (CMLMC) algorithm for weak approximation of stochastic models. The CMLMC algorithm solves the given approximation problem for a sequence of decreasing tolerances, ending when the required error tolerance is satisfied. CMLMC assumes discretization hierarchies that are defined a priori for each level and are geometrically refined across levels. The actual choice of computational work across levels is based on parametric models for the average cost per sample and the corresponding variance and weak error. These parameters are calibrated using Bayesian estimation, taking particular notice of the deepest levels of the discretization hierarchy, where only few realizations are available to produce the estimates. The resulting CMLMC estimator exhibits a non-trivial splitting between bias and statistical contributions. We also show the asymptotic normality of the statistical error in the MLMC estimator and justify in this way our error estimate that allows prescribing both required accuracy and confidence in the final result. Numerical results substantiate the above results and illustrate the corresponding computational savings in examples that are described in terms of differential equations either driven by random measures or with random coefficients. © 2014, Springer Science+Business Media Dordrecht.
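A stripped-down, non-continuation multilevel estimator over an a priori geometric hierarchy shows the telescoping structure that CMLMC builds on. The geometric-Brownian-motion toy problem, the Euler coupling sharing Brownian increments between levels, and the fixed per-level sample counts are illustrative simplifications; CMLMC instead calibrates cost, variance, and weak-error models to choose the levels and sample sizes.

```python
import numpy as np

def euler_pair(l, n_paths, rng, s0=1.0, r=0.05, sig=0.2, T=1.0):
    """Coupled fine (2^l steps) and coarse (2^(l-1) steps) Euler paths of
    geometric Brownian motion sharing the same Brownian increments."""
    nf = 2 ** l
    dt = T / nf
    dW = np.sqrt(dt) * rng.standard_normal((n_paths, nf))
    fine = np.full(n_paths, s0)
    for i in range(nf):
        fine = fine * (1 + r * dt + sig * dW[:, i])
    if l == 0:
        return fine, np.zeros(n_paths)   # P_{-1} := 0 at the coarsest level
    coarse = np.full(n_paths, s0)
    dWc = dW[:, 0::2] + dW[:, 1::2]      # pairwise-summed coarse increments
    for i in range(nf // 2):
        coarse = coarse * (1 + r * 2 * dt + sig * dWc[:, i])
    return fine, coarse

def mlmc(L, n_per_level, seed=3):
    """Telescoping MLMC estimate: E[P_L] = sum_l E[P_l - P_{l-1}]."""
    rng = np.random.default_rng(seed)
    est = 0.0
    for l in range(L + 1):
        fine, coarse = euler_pair(l, n_per_level[l], rng)
        est += np.mean(fine - coarse)    # level-l correction term
    return est

est = mlmc(L=4, n_per_level=[100_000, 40_000, 20_000, 10_000, 5_000])
# True value E[S_T] = exp(r*T) is about 1.0513 for these parameters.
```

Because the coupled level differences have small variance, most samples are spent on the cheap coarse level, which is the source of the computational savings quantified in the abstract.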
Monte Carlo Simulations of the Photospheric Process
Santana, Rodolfo; Hernandez, Roberto A; Kumar, Pawan
2015-01-01
We present a Monte Carlo (MC) code we wrote to simulate the photospheric process and to study the photospheric spectrum above the peak energy. Our simulations were performed with a photon to electron ratio $N_{\gamma}/N_{e} = 10^{5}$, as determined by observations of the GRB prompt emission. We searched an exhaustive parameter space to determine if the photospheric process can match the observed high-energy spectrum of the prompt emission. If we do not consider electron re-heating, we determined that the best conditions to produce the observed high-energy spectrum are low photon temperatures and high optical depths. However, for these simulations, the spectrum peaks at an energy below 300 keV by a factor $\sim 10$. For the cases we consider with higher photon temperatures and lower optical depths, we demonstrate that additional energy in the electrons is required to produce a power-law spectrum above the peak energy. By considering electron re-heating near the photosphere, the spectrum for these simulations h...
Finding Planet Nine: a Monte Carlo approach
Marcos, C de la Fuente
2016-01-01
Planet Nine is a hypothetical planet located well beyond Pluto that has been proposed in an attempt to explain the observed clustering in physical space of the perihelia of six extreme trans-Neptunian objects or ETNOs. The predicted approximate values of its orbital elements include a semimajor axis of 700 au, an eccentricity of 0.6, an inclination of 30 degrees, and an argument of perihelion of 150 degrees. Searching for this putative planet is already under way. Here, we use a Monte Carlo approach to create a synthetic population of Planet Nine orbits and study its visibility statistically in terms of various parameters and focusing on the aphelion configuration. Our analysis shows that, if Planet Nine exists and is at aphelion, it might be found projected against one out of four specific areas in the sky. Each area is linked to a particular value of the longitude of the ascending node and two of them are compatible with an apsidal antialignment scenario. In addition and after studying the current statistic...
Atomistic Monte Carlo simulation of lipid membranes.
Wüstner, Daniel; Sklenar, Heinz
2014-01-24
Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain the challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction to the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate, for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches. We use our recently devised chain breakage/closure (CBC) local move set in the bond-/torsion-angle space with the constant-bond-length approximation (CBLA) for the phospholipid dipalmitoylphosphatidylcholine (DPPC). We demonstrate rapid conformational equilibration for a single DPPC molecule, as assessed by calculation of molecular energies and entropies. We also show the transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol.
Parallel Monte Carlo Simulation of Aerosol Dynamics
Directory of Open Access Journals (Sweden)
Kun Zhou
2014-02-01
A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (the Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts in the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI). The parallel computing efficiency is investigated through numerical examples. Near 60% parallel efficiency is achieved for the maximum testing case with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified by simulating various testing cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only a small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and so forth, that is, the low-order moments of the particle size distribution (PSD) function. Accurately predicting the high-order moments of the PSD requires a dramatic increase in the number of MC particles.
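The stochastic half of such a scheme, coagulation alone, without the deterministic nucleation and surface-growth terms, the operator splitting, or the MPI layer, can be sketched as a direct Marcus-Lushnikov simulation for the constant kernel K = 1; all parameter values below are illustrative.

```python
import random

def marcus_lushnikov(n0=1000, t_end=1.0, seed=5):
    """Direct stochastic simulation of coagulation with constant kernel
    K(x, y) = 1: wait an exponential time set by the total pair rate,
    then merge a uniformly chosen pair of particles."""
    rng = random.Random(seed)
    particles = [1.0] * n0               # monomer masses at t = 0
    t = 0.0
    while len(particles) > 1:
        n = len(particles)
        rate = n * (n - 1) / (2.0 * n0)  # total pair rate, volume-scaled by n0
        t += rng.expovariate(rate)       # waiting time to the next coagulation
        if t >= t_end:
            break
        i, j = rng.sample(range(n), 2)   # uniform pair, since K is constant
        particles[i] += particles[j]     # merge: total mass is conserved
        particles.pop(j)
    return particles

parts = marcus_lushnikov()
```

For this kernel the mean-field (Smoluchowski) particle count is n0 / (1 + t/2), so roughly two thirds of the initial 1000 particles remain at t = 1; the low-order moments are already well predicted at this modest particle count, consistent with the observation above.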
Monte Carlo simulations of Protein Adsorption
Sharma, Sumit; Kumar, Sanat K.; Belfort, Georges
2008-03-01
Amyloidogenic diseases, such as Alzheimer's, are caused by adsorption and aggregation of partially unfolded proteins. Adsorption of proteins is a concern in the design of biomedical devices, such as dialysis membranes. Protein adsorption is often accompanied by conformational rearrangements in protein molecules. Such conformational rearrangements are thought to affect many properties of adsorbed protein molecules, such as their adhesion strength to the surface, biological activity, and aggregation tendency. It has been experimentally shown that many naturally occurring proteins, upon adsorption to hydrophobic surfaces, undergo a helix-to-sheet or helix-to-random-coil secondary structural rearrangement. However, to better understand the equilibrium structural complexities of this phenomenon, we have performed Monte Carlo (MC) simulations of the adsorption of a four-helix bundle, modeled as a lattice protein, and studied the adsorption behavior and equilibrium protein conformations at different temperatures and degrees of surface hydrophobicity. To study the free energy and entropic effects on adsorption, canonical ensemble MC simulations have been combined with the Weighted Histogram Analysis Method (WHAM). Conformational transitions of proteins on surfaces will be discussed as a function of surface hydrophobicity and compared to analogous bulk transitions.
Monte Carlo Simulation of River Meander Modelling
Posner, A. J.; Duan, J. G.
2010-12-01
This study first compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johanesson and Parker (1989b). Ikeda et al.'s (1981) linear bank erosion model was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g. cohesiveness, stratigraphy, vegetation density). The developed model was used to predict the evolution of meandering planforms. Then, the modeling results were analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotation, several measures are formulated in order to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolution. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model. Quasi-2D Ikeda (1989) flow solution with Monte Carlo simulation of the bank erosion coefficient.
Commensurabilities between ETNOs: a Monte Carlo survey
de la Fuente Marcos, C.; de la Fuente Marcos, R.
2016-07-01
Many asteroids in the main and trans-Neptunian belts are trapped in mean motion resonances with Jupiter and Neptune, respectively. As a side effect, they experience accidental commensurabilities among themselves. These commensurabilities define characteristic patterns that can be used to trace the source of the observed resonant behaviour. Here, we explore systematically the existence of commensurabilities between the known ETNOs using their heliocentric and barycentric semimajor axes, their uncertainties, and Monte Carlo techniques. We find that the commensurability patterns present in the known ETNO population resemble those found in the main and trans-Neptunian belts. Although based on small number statistics, such patterns can only be properly explained if most, if not all, of the known ETNOs are subjected to the resonant gravitational perturbations of yet undetected trans-Plutonian planets. We show explicitly that some of the statistically significant commensurabilities are compatible with the Planet Nine hypothesis; in particular, a number of objects may be trapped in the 5:3 and 3:1 mean motion resonances with a putative Planet Nine with semimajor axis ~700 au.
Diffusion Monte Carlo in internal coordinates.
Petit, Andrew S; McCoy, Anne B
2013-08-15
An internal coordinate extension of diffusion Monte Carlo (DMC) is described as a first step toward a generalized reduced-dimensional DMC approach. The method places no constraints on the choice of internal coordinates other than the requirement that they all be independent. Using H3+ and its isotopologues as model systems, the methodology is shown to be capable of successfully describing the ground state properties of molecules that undergo large amplitude, zero-point vibrational motions. Combining the approach developed here with the fixed-node approximation allows vibrationally excited states to be treated. Analysis of the ground state probability distribution is shown to provide important insights into the set of internal coordinates that are less strongly coupled and therefore more suitable for use as the nodal coordinates for the fixed-node DMC calculations. In particular, the curvilinear normal mode coordinates are found to provide reasonable nodal surfaces for the fundamentals of H2D+ and D2H+ despite both molecules being highly fluxional.
Monte Carlo Production Management at CMS
Boudoul, G.; Pol, A; Srimanobhas, P; Vlimant, J R; Franzoni, Giovanni
2015-01-01
The analysis of the LHC data at the Compact Muon Solenoid (CMS) experiment requires the production of a large number of simulated events. During Run I of the LHC (2010-2012), CMS produced over 12 billion simulated events, organized in approximately sixty different campaigns, each emulating specific detector conditions and LHC running conditions (pile-up). In order to aggregate the information needed for the configuration and prioritization of the event production, assure the book-keeping of all the processing requests placed by the physics analysis groups, and interface with the CMS production infrastructure, the web-based service Monte Carlo Management (McM) was developed and put in production in 2012. McM is based on recent server infrastructure technology (CherryPy + Java) and relies on a CouchDB database back-end. This contribution covers the one and a half years of operational experience managing samples of simulated events for CMS, the evolution of its functionalities, and the extension of its capabi...
Monte Carlo models of dust coagulation
Zsom, Andras
2010-01-01
The thesis deals with the first stage of planet formation, namely dust coagulation from micron to millimeter sizes in circumstellar disks. For the first time, we collect and compile the recent laboratory experiments on dust aggregates into a collision model that can be implemented into dust coagulation models. We put this model into a Monte Carlo code that uses representative particles to simulate dust evolution. Simulations are performed using three different disk models in a local box (0D) located at 1 AU distance from the central star. We find that the dust evolution does not follow the previously assumed growth-fragmentation cycle; rather, growth is halted by bouncing before the fragmentation regime is reached. We call this the bouncing barrier, which is an additional obstacle during the already complex formation process of planetesimals. The absence of the growth-fragmentation cycle and the halted growth have two important consequences for planet formation. 1) It is observed that disk atmospheres are dusty thr...
Atomistic Monte Carlo Simulation of Lipid Membranes
Directory of Open Access Journals (Sweden)
Daniel Wüstner
2014-01-01
Full Text Available Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate, for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches. We use our recently devised chain breakage/closure (CBC) local move set in the bond-/torsion angle space with the constant-bond-length approximation (CBLA) for the phospholipid dipalmitoylphosphatidylcholine (DPPC). We demonstrate rapid conformational equilibration for a single DPPC molecule, as assessed by calculation of molecular energies and entropies. We also show the transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol.
Parallel Monte Carlo simulation of aerosol dynamics
Zhou, K.
2014-01-01
A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts in the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI). The parallel computing efficiency is investigated through numerical examples. Near 60% parallel efficiency is achieved for the maximum testing case with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified through simulating various testing cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and so forth, that is, low order moments of the Particle Size Distribution (PSD) function. Accurately predicting the high order moments of the PSD needs to dramatically increase the number of MC particles. 2014 Kun Zhou et al.
Measuring Berry curvature with quantum Monte Carlo
Kolodrubetz, Michael
2014-01-01
The Berry curvature and its descendant, the Berry phase, play an important role in quantum mechanics. They can be used to understand the Aharonov-Bohm effect, define topological Chern numbers, and generally to investigate the geometric properties of a quantum ground state manifold. While Berry curvature has been well studied in the regimes of few-body physics and non-interacting particles, its use in the regime of strong interactions is hindered by the lack of numerical methods to compute it. In this paper we fill this gap by implementing a quantum Monte Carlo method to solve for the Berry curvature, based on interpreting Berry curvature as a leading correction to imaginary time ramps. We demonstrate our algorithm using the transverse-field Ising model in one and two dimensions, the latter of which is non-integrable. Despite the fact that the Berry curvature gives information about the phase of the wave function, we show that our algorithm has no sign or phase problem for standard sign-problem-free Hamiltonians...
Streamlining resummed QCD calculations using Monte Carlo integration
Farhi, David; Freytsis, Marat; Schwartz, Matthew D
2015-01-01
Some of the most arduous and error-prone aspects of precision resummed calculations are related to the partonic hard process, having nothing to do with the resummation. In particular, interfacing to parton-distribution functions, combining various channels, and performing the phase space integration can be limiting factors in completing calculations. Conveniently, however, most of these tasks are already automated in many Monte Carlo programs, such as MadGraph, Alpgen or Sherpa. In this paper, we show how such programs can be used to produce distributions of partonic kinematics with associated color structures representing the hard factor in a resummed distribution. These distributions can then be used to weight convolutions of jet, soft and beam functions producing a complete resummed calculation. In fact, only around 1000 unweighted events are necessary to produce precise distributions. A number of examples and checks are provided, including $e^+e^-$ two- and four-jet event shapes, $n$-jettiness and jet-mas...
Geometrical form factor calculation using Monte Carlo integration for lidar
Mao, Feiyue; Gong, Wei; Li, Jun
2012-06-01
We propose a geometrical form factor (GFF) calculation using Monte Carlo integration (GFF-MC) for lidar that is practical and can be applied to any laser intensity distribution. Theoretical results have been calculated with our method based on measured, uniform, and Gaussian laser intensity distribution functions. Two experimental GFF traces on clear days were obtained to verify the validity of the theoretical results. The results indicated that the measured distribution function outperformed the Gaussian and uniform functions, meaning that the deviation of the measured laser intensity distribution from an ideal one can be too large to neglect. In addition, the theoretical GFF of the uniform distribution had a larger error than that of the Gaussian distribution. Furthermore, the effects of the inclination angle of the laser beam and of the central obstruction caused by the support structure of the telescope's secondary mirror are discussed in this study.
Academic Training: Monte Carlo generators for the LHC
Françoise Benz
2005-01-01
2004-2005 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 4, 5, 6, 7 April from 11.00 to 12.00 hrs - Main Auditorium, bldg. 500 Monte Carlo generators for the LHC T. SJOSTRAND / CERN-PH, Lund Univ. SE Event generators today are indispensable as tools for the modelling of complex physics processes, that jointly lead to the production of hundreds of particles per event at LHC energies. Generators are used to set detector requirements, to formulate analysis strategies, or to calculate acceptance corrections. These lectures describe the physics that goes into the construction of an event generator, such as hard processes, initial- and final-state radiation, multiple interactions and beam remnants, hadronization and decays, and how these pieces come together. The current main generators are introduced, and are used to illustrate uncertainties in the physics modelling. Some trends for the future are outlined. ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING Françoise Benz 73127 academic.training@cern.ch
Monte Carlo simulations and benchmark studies at CERN's accelerator chain
AUTHOR|(CDS)2083190; Brugger, Markus
2016-01-01
Mixed particle and energy radiation fields present at the Large Hadron Collider (LHC) and its accelerator chain are responsible for failures on electronic devices located in the vicinity of the accelerator beam lines. These radiation effects on electronics and, more generally, the overall radiation damage issues have a direct impact on component and system lifetimes, as well as on maintenance requirements and radiation exposure to personnel who have to intervene and fix existing faults. The radiation environments and respective radiation damage issues along the CERN’s accelerator chain were studied in the framework of the CERN Radiation to Electronics (R2E) project and are hereby presented. The important interplay between Monte Carlo simulations and radiation monitoring is also highlighted.
Monte-Carlo simulation-based statistical modeling
Chen, John
2017-01-01
This book brings together expert researchers engaged in Monte-Carlo simulation-based statistical modeling, offering them a forum to present and discuss recent issues in methodological development as well as public health applications. It is divided into three parts, with the first providing an overview of Monte-Carlo techniques, the second focusing on missing data Monte-Carlo methods, and the third addressing Bayesian and general statistical modeling using Monte-Carlo simulations. The data and computer programs used here will also be made publicly available, allowing readers to replicate the model development and data analysis presented in each chapter, and to readily apply them in their own research. Featuring highly topical content, the book has the potential to impact model development and data analyses across a wide spectrum of fields, and to spark further research in this direction.
EXTENDED MONTE CARLO LOCALIZATION ALGORITHM FOR MOBILE SENSOR NETWORKS
Institute of Scientific and Technical Information of China (English)
Anonymous
2008-01-01
A real-world localization system for wireless sensor networks that adapts to mobility and an irregular radio propagation model is considered. Traditional range-based techniques and recent range-free localization schemes are not well suited for localization in mobile sensor networks, while the probabilistic approach of Bayesian filtering with particle-based density representations provides a comprehensive solution to such a localization problem. Monte Carlo localization is a Bayesian filtering method that approximates the mobile node's location by a set of weighted particles. In this paper, an enhanced Monte Carlo localization algorithm, Extended Monte Carlo Localization (Ext-MCL), is proposed for the practical wireless network environment, where the radio propagation model is irregular. Simulation results show the proposal achieves better localization accuracy and a higher localizable node number than previously proposed Monte Carlo localization schemes, not only for the ideal radio model but also for irregular ones.
On the Markov Chain Monte Carlo (MCMC) method
Indian Academy of Sciences (India)
Rajeeva L Karandikar
2006-04-01
Markov Chain Monte Carlo (MCMC) is a popular method used to generate samples from arbitrary distributions, which may be specified indirectly. In this article, we give an introduction to this method along with some examples.
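The core MCMC construction introduced in this article can be illustrated with a minimal sketch (not taken from the article itself): a random-walk Metropolis sampler targeting a standard normal whose density is specified only up to its normalizing constant, which is exactly the "indirect specification" MCMC handles.

```python
import math
import random

def metropolis_normal(n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler targeting a standard normal.

    The target is known only through an unnormalized density,
    pi(x) ~ exp(-x^2 / 2)."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step, step)
        # Log acceptance ratio uses only the unnormalized density.
        log_alpha = (x * x - proposal * proposal) / 2.0
        if rng.random() < math.exp(min(0.0, log_alpha)):
            x = proposal
        samples.append(x)
    return samples

chain = metropolis_normal(50_000, step=2.0, seed=42)
burned = chain[5_000:]          # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(mean, var)  # close to 0 and 1, the target's moments
```

The step size trades acceptance rate against mixing speed; the seeded generator makes the run reproducible.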
Bayesian phylogeny analysis via stochastic approximation Monte Carlo
Cheon, Sooyoung; Liang, Faming
2009-11-01
Monte Carlo methods have received much attention in the recent literature of phylogeny analysis. However, the conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees which have the highest similarity to the true trees, and the model parameter estimates which have the smallest mean square errors, but costs the least CPU time. © 2009 Elsevier Inc. All rights reserved.
Monte Carlo techniques for analyzing deep penetration problems
Energy Technology Data Exchange (ETDEWEB)
Cramer, S.N.; Gonnord, J.; Hendricks, J.S.
1985-01-01
A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications. 29 refs.
Monte Carlo simulations: Hidden errors from "good" random number generators
Ferrenberg, Alan M.; Landau, D. P.; Wong, Y. Joanna
1992-12-01
The Wolff algorithm is now accepted as the best cluster-flipping Monte Carlo algorithm for beating "critical slowing down." We show how this method can yield incorrect answers due to subtle correlations in "high quality" random number generators.
An Introduction to Multilevel Monte Carlo for Option Valuation
Higham, Desmond J
2015-01-01
Monte Carlo is a simple and flexible tool that is widely used in computational finance. In this context, it is common for the quantity of interest to be the expected value of a random variable defined via a stochastic differential equation. In 2008, Giles proposed a remarkable improvement to the approach of discretizing with a numerical method and applying standard Monte Carlo. His multilevel Monte Carlo method offers a speed-up of the order of the inverse of epsilon, where epsilon is the required accuracy, so computations can run 100 times more quickly when two digits of accuracy are required. The multilevel philosophy has since been adopted by a range of researchers, and a wealth of practically significant results has arisen, most of which have yet to make their way into the expository literature. In this work, we give a brief, accessible introduction to multilevel Monte Carlo and summarize recent results applicable to the task of option valuation.
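The coupling at the heart of the multilevel idea can be conveyed by a two-level sketch (an illustration, not the article's code): fine and coarse Euler discretizations of the same asset path share their Brownian increments, so the level-correction term has small variance. Black-Scholes dynamics and the parameter values below are illustrative assumptions.

```python
import math
import random

def call_payoffs(rng, n_steps, s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0):
    """One Euler path of geometric Brownian motion; returns discounted call
    payoffs on a fine grid (n_steps) and a coarse grid (n_steps // 2) driven
    by the SAME Brownian increments -- the multilevel coupling."""
    dt = t / n_steps
    s_fine, s_coarse = s0, s0
    for _ in range(n_steps // 2):
        dw1 = rng.gauss(0.0, math.sqrt(dt))
        dw2 = rng.gauss(0.0, math.sqrt(dt))
        # Two fine Euler steps.
        s_fine += r * s_fine * dt + sigma * s_fine * dw1
        s_fine += r * s_fine * dt + sigma * s_fine * dw2
        # One coarse Euler step using the summed increment.
        s_coarse += r * s_coarse * (2 * dt) + sigma * s_coarse * (dw1 + dw2)
    disc = math.exp(-r * t)
    return disc * max(s_fine - k, 0.0), disc * max(s_coarse - k, 0.0)

rng = random.Random(1)
n = 20_000
coarse_vals, corrections = [], []
for _ in range(n):
    fine, coarse = call_payoffs(rng, n_steps=32)
    coarse_vals.append(coarse)
    corrections.append(fine - coarse)

# E[P_fine] = E[P_coarse] + E[P_fine - P_coarse]; in a full multilevel
# estimator the low-variance correction terms need far fewer samples.
estimate = sum(coarse_vals) / n + sum(corrections) / n
print(estimate)
```

A full implementation would sum corrections over many levels with per-level sample counts chosen from the observed variances; this sketch shows only a single coarse/fine pair.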
MODELING LEACHING OF VIRUSES BY THE MONTE CARLO METHOD
A predictive screening model was developed for fate and transport of viruses in the unsaturated zone. A database of input parameters allowed Monte Carlo analysis with the model. The resulting kernel densities of predicted attenuation during percolation indicated very ...
A MONTE-CARLO METHOD FOR ESTIMATING THE CORRELATION EXPONENT
MIKOSCH, T; WANG, QA
1995-01-01
We propose a Monte Carlo method for estimating the correlation exponent of a stationary ergodic sequence. The estimator can be considered as a bootstrap version of the classical Hill estimator. A simulation study shows that the method yields reasonable estimates.
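The classical Hill estimator that the proposed bootstrap method builds on can be sketched as follows; this is an illustration of the classical ingredient only, not the authors' Monte Carlo estimator, and the Pareto test data are an assumption for the demo.

```python
import math
import random

def hill_estimator(data, k):
    """Classical Hill estimator: average of log(X_(i) / X_(k+1)) over the k
    largest order statistics; estimates the reciprocal tail index 1/alpha."""
    xs = sorted(data, reverse=True)
    x_k1 = xs[k]
    return sum(math.log(xs[i] / x_k1) for i in range(k)) / k

# Pareto(alpha) samples via inverse transform: X = U ** (-1 / alpha),
# drawing U from (0, 1] to avoid a zero.
rng = random.Random(7)
alpha = 2.0
data = [(1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(20_000)]

gamma_hat = hill_estimator(data, k=500)
alpha_hat = 1.0 / gamma_hat
print(alpha_hat)  # close to the true tail index alpha = 2.0
```

The choice of k trades bias (k too large) against variance (k too small); resampling schemes such as the bootstrap variant above are one way to stabilize that choice.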
Using Supervised Learning to Improve Monte Carlo Integral Estimation
Tracey, Brendan; Alonso, Juan J
2011-01-01
Monte Carlo (MC) techniques are often used to estimate integrals of a multivariate function using randomly generated samples of the function. In light of the increasing interest in uncertainty quantification and robust design applications in aerospace engineering, the calculation of expected values of such functions (e.g. performance measures) becomes important. However, MC techniques often suffer from high variance and slow convergence as the number of samples increases. In this paper we present Stacked Monte Carlo (StackMC), a new method for post-processing an existing set of MC samples to improve the associated integral estimate. StackMC is based on the supervised learning techniques of fitting functions and cross validation. It should reduce the variance of any type of Monte Carlo integral estimate (simple sampling, importance sampling, quasi-Monte Carlo, MCMC, etc.) without adding bias. We report on an extensive set of experiments confirming that the StackMC estimate of an integral is more accurate than ...
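The flavor of this post-processing idea can be conveyed by a toy sketch: fit a simple surrogate to part of the samples and use its exactly integrable part to reduce the variance on the held-out part. This is a two-fold holdout stand-in for the cross-validated StackMC procedure, not the authors' implementation; the integrand and the linear surrogate are illustrative assumptions.

```python
import random

def f(x):
    """Toy integrand on [0, 1]; its true integral is 1/3."""
    return x * x

def fit_linear(xs, ys):
    """Ordinary least-squares fit of a surrogate g(x) = a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

rng = random.Random(3)
xs = [rng.random() for _ in range(2_000)]
ys = [f(x) for x in xs]

# Plain Monte Carlo estimate of the integral.
plain = sum(ys) / len(ys)

# Surrogate-assisted estimate: fit g on the first half of the samples, then
# average the residual f - g over the held-out half and add the EXACT
# integral of g over [0, 1], which is a + b / 2 for a linear fit.
a, b = fit_linear(xs[:1000], ys[:1000])
residual = sum(y - (a + b * x) for x, y in zip(xs[1000:], ys[1000:])) / 1000
assisted = residual + a + b / 2.0
print(plain, assisted)  # both near 1/3; the assisted estimate varies less
```

Fitting and evaluating on disjoint folds is what keeps the correction from introducing bias, mirroring the cross-validation step described in the abstract.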
Accelerating Monte Carlo Renderers by Ray Histogram Fusion
Directory of Open Access Journals (Sweden)
Mauricio Delbracio
2015-03-01
Full Text Available This paper details the recently introduced Ray Histogram Fusion (RHF) filter for accelerating Monte Carlo renderers [M. Delbracio et al., Boosting Monte Carlo Rendering by Ray Histogram Fusion, ACM Transactions on Graphics, 33 (2014)]. In this filter, each pixel in the image is characterized by the colors of the rays that reach its surface. Pixels are compared using a statistical distance on the associated ray color distributions. Based on this distance, the filter decides whether two pixels can share their rays or not. The RHF filter is consistent: as the number of samples increases, more evidence is required to average two pixels. The algorithm provides a significant gain in PSNR or, equivalently, accelerates the rendering process by using many fewer Monte Carlo samples without observable bias. Since the RHF filter depends only on the Monte Carlo samples' color values, it can be naturally combined with all rendering effects.
Monte Carlo methods and applications in nuclear physics
Energy Technology Data Exchange (ETDEWEB)
Carlson, J.
1990-01-01
Monte Carlo methods for studying few- and many-body quantum systems are introduced, with special emphasis given to their applications in nuclear physics. Variational and Green's function Monte Carlo methods are presented in some detail. The status of calculations of light nuclei is reviewed, including discussions of the three-nucleon interaction, charge and magnetic form factors, the Coulomb sum rule, and studies of low-energy radiative transitions. 58 refs., 12 figs.
Public Infrastructure for Monte Carlo Simulation: publicMC@BATAN
Waskita, A A; Akbar, Z; Handoko, L T; 10.1063/1.3462759
2010-01-01
The first cluster-based public computing system for Monte Carlo simulation in Indonesia is introduced. The system has been developed to enable the public to perform Monte Carlo simulations on a parallel computer through an integrated and user-friendly dynamic web interface. The beta version, so-called publicMC@BATAN, has been released and implemented for internal users at the National Nuclear Energy Agency (BATAN). In this paper the concept and architecture of publicMC@BATAN are presented.
Radiative Equilibrium and Temperature Correction in Monte Carlo Radiation Transfer
Bjorkman, J. E.; Wood, Kenneth
2001-01-01
We describe a general radiative equilibrium and temperature correction procedure for use in Monte Carlo radiation transfer codes with sources of temperature-independent opacity, such as astrophysical dust. The technique utilizes the fact that Monte Carlo simulations track individual photon packets, so we may easily determine where their energy is absorbed. When a packet is absorbed, it heats a particular cell within the envelope, raising its temperature. To enforce radiative equilibrium, the ...
Chemical accuracy from quantum Monte Carlo for the Benzene Dimer
Azadi, Sam; Cohen, R. E
2015-01-01
We report an accurate study of interactions between Benzene molecules using variational quantum Monte Carlo (VMC) and diffusion quantum Monte Carlo (DMC) methods. We compare these results with density functional theory (DFT) using different van der Waals (vdW) functionals. In our QMC calculations, we use accurate correlated trial wave functions including three-body Jastrow factors, and backflow transformations. We consider two benzene molecules in the parallel displaced (PD) geometry, and fin...
de Finetti Priors using Markov chain Monte Carlo computations.
Bacallado, Sergio; Diaconis, Persi; Holmes, Susan
2015-07-01
Recent advances in Monte Carlo methods allow us to revisit work by de Finetti, who suggested the use of approximate exchangeability in the analyses of contingency tables. This paper gives examples of computational implementations using Metropolis-Hastings, Langevin and Hamiltonian Monte Carlo to compute posterior distributions for test statistics relevant for testing independence, reversible or three-way models for discrete exponential families using polynomial priors and Gröbner bases.
Event-chain Monte Carlo for classical continuous spin models
Michel, Manon; Mayer, Johannes; Krauth, Werner
2015-10-01
We apply the event-chain Monte Carlo algorithm to classical continuum spin models on a lattice and clarify the condition for its validity. In the two-dimensional XY model, it outperforms the local Monte Carlo algorithm by two orders of magnitude, although it remains slower than the Wolff cluster algorithm. In the three-dimensional XY spin glass model at low temperature, the event-chain algorithm is far superior to the other algorithms.
Confidence and efficiency scaling in Variational Quantum Monte Carlo calculations
Delyon, François; Holzmann, Markus
2016-01-01
Based on the central limit theorem, we discuss the problem of evaluation of the statistical error of Monte Carlo calculations using a time discretized diffusion process. We present a robust and practical method to determine the effective variance of general observables and show how to verify the equilibrium hypothesis by the Kolmogorov-Smirnov test. We then derive scaling laws of the efficiency illustrated by Variational Monte Carlo calculations on the two dimensional electron gas.
Study of the Transition Flow Regime using Monte Carlo Methods
Hassan, H. A.
1999-01-01
This NASA Cooperative Agreement presents a study of the Transition Flow Regime Using Monte Carlo Methods. The topics included in this final report are: 1) New Direct Simulation Monte Carlo (DSMC) procedures; 2) The DS3W and DS2A Programs; 3) Papers presented; 4) Miscellaneous Applications and Program Modifications; 5) Solution of Transitional Wake Flows at Mach 10; and 6) Turbulence Modeling of Shock-Dominated Flows with a k-Enstrophy Formulation.
Monte Carlo Simulation of Optical Properties of Wake Bubbles
Institute of Scientific and Technical Information of China (English)
CAO Jing; WANG Jiang-An; JIANG Xing-Zhou; SHI Sheng-Wei
2007-01-01
Based on Mie scattering theory and the theory of multiple light scattering, the light scattering properties of air bubbles in a wake are analysed by Monte Carlo simulation. The results show that backscattering is enhanced obviously due to the existence of bubbles, especially with the increase of bubble density, and that it is feasible to use the Monte Carlo method to study the properties of light scattering by air bubbles.
Successful combination of the stochastic linearization and Monte Carlo methods
Elishakoff, I.; Colombi, P.
1993-01-01
A combination of stochastic linearization and Monte Carlo techniques is presented for the first time in the literature. A system with separable nonlinear damping and a nonlinear restoring force is considered. The proposed combination of the energy-wise linearization with the Monte Carlo method yields an error under 5 percent, corresponding to a reduction of the error of conventional stochastic linearization by a factor of 4.6.
Confidence and efficiency scaling in variational quantum Monte Carlo calculations
Delyon, F.; Bernu, B.; Holzmann, Markus
2017-02-01
Based on the central limit theorem, we discuss the problem of evaluation of the statistical error of Monte Carlo calculations using a time-discretized diffusion process. We present a robust and practical method to determine the effective variance of general observables and show how to verify the equilibrium hypothesis by the Kolmogorov-Smirnov test. We then derive scaling laws of the efficiency illustrated by variational Monte Carlo calculations on the two-dimensional electron gas.
Monte Carlo methods for light propagation in biological tissues
Vinckenbosch, Laura; Lacaux, Céline; Tindel, Samy; Thomassin, Magalie; Obara, Tiphaine
2016-01-01
Light propagation in turbid media is driven by the equation of radiative transfer. We give a formal probabilistic representation of its solution in the framework of biological tissues and we implement algorithms based on Monte Carlo methods in order to estimate the quantity of light that is received by a homogeneous tissue when emitted by an optic fiber. A variance reduction method is studied and implemented, as well as a Markov chain Monte Carlo method based on the Metropolis–Hastings algori...
Multiscale Monte Carlo equilibration: pure Yang-Mills theory
Endres, Michael G; Detmold, William; Orginos, Kostas; Pochinsky, Andrew V
2015-01-01
We present a multiscale thermalization algorithm for lattice gauge theory, which enables efficient parallel generation of uncorrelated gauge field configurations. The algorithm combines standard Monte Carlo techniques with ideas drawn from real space renormalization group and multigrid methods. We demonstrate the viability of the algorithm for pure Yang-Mills gauge theory for both heat bath and hybrid Monte Carlo evolution, and show that it ameliorates the problem of topological freezing up to controllable lattice spacing artifacts.
Geometrical and Monte Carlo projectors in 3D PET reconstruction
Aguiar, Pablo; Rafecas López, Magdalena; Ortuno, Juan Enrique; Kontaxakis, George; Santos, Andrés; Pavía, Javier; Ros, Domènec
2010-01-01
Purpose: In the present work, the authors compare geometrical and Monte Carlo projectors in detail. The geometrical projectors considered were the conventional geometrical Siddon ray-tracer (S-RT) and the orthogonal distance-based ray-tracer (OD-RT), based on computing the orthogonal distance from the center of image voxel to the line-of-response. A comparison of these geometrical projectors was performed using different point spread function (PSF) models. The Monte Carlo-based method under c...
Monte Carlo method for solving a parabolic problem
Directory of Open Access Journals (Sweden)
Tian Yi
2016-01-01
In this paper, we present a numerical method based on random sampling for a parabolic problem. The method combines the Crank-Nicolson method with the Monte Carlo method: we first discretize the governing equations by the Crank-Nicolson method, obtaining a large sparse system of linear algebraic equations, and then use the Monte Carlo method to solve that linear system. To illustrate the usefulness of this technique, we apply it to some test problems.
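The second stage, solving a linear system by random sampling, can be sketched with the classical von Neumann-Ulam walk estimator. The Jacobi splitting, the 2x2 example system, and all sample counts below are illustrative assumptions, not taken from the paper.

```python
import random

def mc_solve(H, f, n_walks=100_000, p_stop=0.5, seed=42):
    """von Neumann-Ulam Monte Carlo estimate of the solution of
    x = H x + f; valid when the Neumann series sum_k H^k f converges.
    Each random walk accumulates importance-weighted f-values."""
    rng = random.Random(seed)
    n = len(f)
    x = [0.0] * n
    for i in range(n):
        total = 0.0
        for _ in range(n_walks):
            state, weight, est = i, 1.0, f[i]
            while rng.random() > p_stop:      # continue the walk
                nxt = rng.randrange(n)        # uniform transition kernel
                # importance weight: matrix entry / transition probability
                weight *= H[state][nxt] * n / (1.0 - p_stop)
                state = nxt
                est += weight * f[state]
            total += est
        x[i] = total / n_walks
    return x

# Jacobi splitting of A x = b with A = [[4, 1], [1, 3]], b = [1, 2]:
# x = H x + f, where H = -D^-1 R and f = D^-1 b
H = [[0.0, -0.25], [-1.0 / 3.0, 0.0]]
f = [0.25, 2.0 / 3.0]
print(mc_solve(H, f))  # close to the exact solution [1/11, 7/11]
```

Each component of the solution is estimated independently, which is why such solvers are attractive when only a few entries of a very large system are needed.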
MONTE CARLO SIMULATION OF CHARGED PARTICLE IN AN ELECTRONEGATIVE PLASMA
Directory of Open Access Journals (Sweden)
L SETTAOUTI
2003-12-01
Interest in radio frequency (rf) discharges has grown tremendously in recent years due to their importance in microelectronic technologies. Especially interesting are the properties of discharges in electronegative gases, which are most frequently used in technological applications. Monte Carlo simulations have become increasingly important as a simulation tool, particularly in the area of plasma physics. In this work, we present some detailed properties of rf plasmas in SF6 obtained with a Monte Carlo simulation code.
Lattice gas models and kinetic Monte Carlo simulations of epitaxial growth
Biehl, Michael; Voigt, A
2005-01-01
A brief introduction is given to Kinetic Monte Carlo (KMC) simulations of epitaxial crystal growth. Molecular Beam Epitaxy (MBE) serves as the prototype example for growth far from equilibrium. However, many of the aspects discussed here would carry over to other techniques as well. A variety of app
Monte-Carlo based prediction of radiochromic film response for hadrontherapy dosimetry
Energy Technology Data Exchange (ETDEWEB)
Frisson, T. [Universite de Lyon, F-69622 Lyon (France); CREATIS-LRMN, INSA, Batiment Blaise Pascal, 7 avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France)], E-mail: frisson@creatis.insa-lyon.fr; Zahra, N. [Universite de Lyon, F-69622 Lyon (France); IPNL - CNRS/IN2P3 UMR 5822, Universite Lyon 1, Batiment Paul Dirac, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France); Lautesse, P. [Universite de Lyon, F-69622 Lyon (France); IPNL - CNRS/IN2P3 UMR 5822, Universite Lyon 1, Batiment Paul Dirac, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Sarrut, D. [Universite de Lyon, F-69622 Lyon (France); CREATIS-LRMN, INSA, Batiment Blaise Pascal, 7 avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France)
2009-07-21
A model has been developed to calculate MD-55-V2 radiochromic film response to ion irradiation. This model is based on photon film response and film saturation by high local energy deposition computed by Monte-Carlo simulation. We have studied the response of the film to photon irradiation and we proposed a calculation method for hadron beams.
Curley, Casey Michael
Monte Carlo (MC) and Pencil Beam (PB) calculations are compared to their measured planar dose distributions using a 2-D diode array for lung Stereotactic Body Radiation Therapy (SBRT). The planar dose distributions were studied for two different phantom types: an in-house heterogeneous phantom and a homogeneous phantom. The motivation is to mimic the human anatomy during a lung SBRT treatment and incorporate heterogeneities into the pre-treatment Quality Assurance process, where measured and calculated planar dose distributions are compared before the radiation treatment. Individual and combined field dosimetry has been performed for both fixed gantry angle (anterior to posterior) and planned gantry angle delivery. A gamma analysis has been performed for all beam arrangements. The measurements were obtained using the 2-D diode array MapCHECK 2(TM). MC and PB calculations were performed using the BrainLAB iPlan RT(TM) Dose software. The results suggest that with the heterogeneous phantom as a quality assurance device, the MC calculations are in closer agreement with the measured values when using the planned gantry angle delivery method for composite beams. For the homogeneous phantom, the results suggest that the preferred delivery method is at the fixed anterior to posterior gantry angle. Furthermore, the MC and PB calculations do not show significant differences for a dose difference and distance-to-agreement criterion of 3%/3 mm. However, PB calculations are in better agreement with the measured values for more stringent gamma criteria when considering individual beams, whereas MC agreement is closer for composite beam measurements.
Monte Carlo Volcano Seismic Moment Tensors
Waite, G. P.; Brill, K. A.; Lanza, F.
2015-12-01
Inverse modeling of volcano seismic sources can provide insight into the geometry and dynamics of volcanic conduits. But given the logistical challenges of working on an active volcano, seismic networks are typically deficient in spatial and temporal coverage; this potentially leads to large errors in source models. In addition, uncertainties in the centroid location and moment-tensor components, including volumetric components, are difficult to constrain from the linear inversion results, which leads to a poor understanding of the model space. In this study, we employ a nonlinear inversion using a Monte Carlo scheme with the objective of defining robustly resolved elements of model space. The model space is randomized by centroid location and moment tensor eigenvectors. Point sources densely sample the summit area and moment tensors are constrained to a randomly chosen geometry within the inversion; Green's functions for the random moment tensors are all calculated from modeled single forces, making the nonlinear inversion computationally reasonable. We apply this method to very-long-period (VLP) seismic events that accompany minor eruptions at Fuego volcano, Guatemala. The library of single force Green's functions is computed with a 3D finite-difference modeling algorithm through a homogeneous velocity-density model that includes topography, for a 3D grid of nodes, spaced 40 m apart, within the summit region. The homogeneous velocity and density model is justified by the long wavelengths of the VLP data. The nonlinear inversion reveals well resolved model features and informs the interpretation through a better understanding of the possible models. This approach can also be used to evaluate possible station geometries in order to optimize networks prior to deployment.
Quantum Monte Carlo with directed loops.
Syljuåsen, Olav F; Sandvik, Anders W
2002-10-01
We introduce the concept of directed loops in stochastic series expansion and path-integral quantum Monte Carlo methods. Using the detailed balance rules for directed loops, we show that it is possible to smoothly connect generally applicable simulation schemes (in which it is necessary to include backtracking processes in the loop construction) to more restricted loop algorithms that can be constructed only for a limited range of Hamiltonians (where backtracking can be avoided). The "algorithmic discontinuities" between general and special points (or regions) in parameter space can hence be eliminated. As a specific example, we consider the anisotropic S=1/2 Heisenberg antiferromagnet in an external magnetic field. We show that directed-loop simulations are very efficient for the full range of magnetic fields (zero to the saturation point) and anisotropies. In particular, for weak fields and anisotropies, the autocorrelations are significantly reduced relative to those of previous approaches. The backtracking probability vanishes continuously as the isotropic Heisenberg point is approached. For the XY model, we show that backtracking can be avoided for all fields extending up to the saturation field. The method is hence particularly efficient in this case. We use directed-loop simulations to study the magnetization process in the two-dimensional Heisenberg model at very low temperatures. For L×L lattices with L up to 64, we utilize the step structure in the magnetization curve to extract gaps between different spin sectors. Finite-size scaling of the gaps gives an accurate estimate of the transverse susceptibility in the thermodynamic limit: χ⊥ = 0.0659 ± 0.0002.
Monte Carlo implementation of polarized hadronization
Matevosyan, Hrayr H.; Kotzinian, Aram; Thomas, Anthony W.
2017-01-01
We study the polarized quark hadronization in a Monte Carlo (MC) framework based on the recent extension of the quark-jet framework, where a self-consistent treatment of the quark polarization transfer in a sequential hadronization picture has been presented. Here, we first adopt this approach for MC simulations of the hadronization process with a finite number of produced hadrons, expressing the relevant probabilities in terms of the eight leading twist quark-to-quark transverse-momentum-dependent (TMD) splitting functions (SFs) for elementary q →q'+h transition. We present explicit expressions for the unpolarized and Collins fragmentation functions (FFs) of unpolarized hadrons emitted at rank 2. Further, we demonstrate that all the current spectator-type model calculations of the leading twist quark-to-quark TMD SFs violate the positivity constraints, and we propose a quark model based ansatz for these input functions that circumvents the problem. We validate our MC framework by explicitly proving the absence of unphysical azimuthal modulations of the computed polarized FFs, and by precisely reproducing the earlier derived explicit results for rank-2 pions. Finally, we present the full results for pion unpolarized and Collins FFs, as well as the corresponding analyzing powers from high statistics MC simulations with a large number of produced hadrons for two different model input elementary SFs. The results for both sets of input functions exhibit the same general features of an opposite signed Collins function for favored and unfavored channels at large z and, at the same time, demonstrate the flexibility of the quark-jet framework by producing significantly different dependences of the results at mid to low z for the two model inputs.
kmos: A lattice kinetic Monte Carlo framework
Hoffmann, Max J.; Matera, Sebastian; Reuter, Karsten
2014-07-01
Kinetic Monte Carlo (kMC) simulations have emerged as a key tool for microkinetic modeling in heterogeneous catalysis and other materials applications. Systems, where site-specificity of all elementary reactions allows a mapping onto a lattice of discrete active sites, can be addressed within the particularly efficient lattice kMC approach. To this end we describe the versatile kmos software package, which offers a most user-friendly implementation, execution, and evaluation of lattice kMC models of arbitrary complexity in one- to three-dimensional lattice systems, involving multiple active sites in periodic or aperiodic arrangements, as well as site-resolved pairwise and higher-order lateral interactions. Conceptually, kmos achieves a maximum runtime performance which is essentially independent of lattice size by generating code for the efficiency-determining local update of available events that is optimized for a defined kMC model. For this model definition and the control of all runtime and evaluation aspects kmos offers a high-level application programming interface. Usage proceeds interactively, via scripts, or a graphical user interface, which visualizes the model geometry, the lattice occupations and rates of selected elementary reactions, while allowing on-the-fly changes of simulation parameters. We demonstrate the performance and scaling of kmos with the application to kMC models for surface catalytic processes, where for given operation conditions (temperature and partial pressures of all reactants) central simulation outcomes are catalytic activity and selectivities, surface composition, and mechanistic insight into the occurrence of individual elementary processes in the reaction network.
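kmos generates optimized update code for large models; the core rejection-free (BKL/Gillespie-type) kMC loop that all such codes build on can be sketched in a few lines. The adsorption/desorption model and the rate values below are illustrative assumptions, not part of the kmos package.

```python
import math
import random

def kmc_coverage(n_sites=200, k_ads=1.0, k_des=0.5, t_end=50.0, seed=7):
    """Rejection-free (BKL/Gillespie) kinetic Monte Carlo for a lattice
    gas with adsorption on empty sites (rate k_ads per site) and
    desorption from occupied sites (rate k_des). Returns final coverage."""
    rng = random.Random(seed)
    occ = [False] * n_sites
    n_occ, t = 0, 0.0
    while t < t_end:
        r_ads = k_ads * (n_sites - n_occ)  # total adsorption rate
        r_des = k_des * n_occ              # total desorption rate
        r_tot = r_ads + r_des
        t += -math.log(1.0 - rng.random()) / r_tot  # exponential waiting time
        # choose an event class with probability proportional to its rate
        if rng.random() * r_tot < r_ads:
            empty = [i for i, o in enumerate(occ) if not o]
            occ[rng.choice(empty)] = True
            n_occ += 1
        else:
            filled = [i for i, o in enumerate(occ) if o]
            occ[rng.choice(filled)] = False
            n_occ -= 1
    return n_occ / n_sites

print(kmc_coverage())  # near the Langmuir equilibrium k_ads / (k_ads + k_des)
```

The rebuilt event lists at every step are the efficiency bottleneck that production codes like kmos avoid with incremental local updates of the available-event bookkeeping.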
Perturbation Monte Carlo methods for tissue structure alterations.
Nguyen, Jennifer; Hayakawa, Carole K; Mourant, Judith R; Spanier, Jerome
2013-01-01
This paper describes an extension of the perturbation Monte Carlo method to model light transport when the phase function is arbitrarily perturbed. Current perturbation Monte Carlo methods allow perturbation of both the scattering and absorption coefficients; however, the phase function cannot be varied. The more complex method we develop and test here is not limited in this way. We derive a rigorous perturbation Monte Carlo extension that can be applied to a large family of important biomedical light transport problems and demonstrate its greater computational efficiency compared with using conventional Monte Carlo simulations to produce forward transport problem solutions. The gains of the perturbation method occur because only a single baseline Monte Carlo simulation is needed to obtain forward solutions to other closely related problems whose input is described by perturbing one or more parameters of the baseline problem. The new perturbation Monte Carlo methods are tested using tissue light scattering parameters relevant to epithelia, where many tumors originate. The tissue model has parameters for the number density and average size of three classes of scatterers: whole nuclei; organelles such as lysosomes and mitochondria; and small particles such as ribosomes or large protein complexes. When these parameters or the wavelength are varied, the scattering coefficient and the phase function vary. Perturbation calculations give accurate results over variations of ∼15-25% of the scattering parameters.
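The baseline-reuse idea can be illustrated with the long-established absorption-coefficient perturbation, the case the abstract notes existing methods already handle: sample paths once, then reweight each path by exp(-Δμa·pathlength) for any perturbed absorption. The 1D slab geometry and all coefficient values below are illustrative assumptions, not the paper's tissue model.

```python
import math
import random

def slab_paths(mu_s, depth, n_photons=50_000, seed=3):
    """Path lengths of photons transmitted through a 1D slab with
    scattering rate mu_s (direction resampled to +1/-1 at each event).
    Absorption is left out of the sampling so it can be applied later
    as a multiplicative weight exp(-mu_a * pathlength)."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_photons):
        z, w, travelled = 0.0, 1.0, 0.0
        while True:
            step = -math.log(1.0 - rng.random()) / mu_s
            z += w * step
            if z < 0.0:
                break  # reflected; not counted in transmission
            if z > depth:
                travelled += step - (z - depth)  # truncate at the exit face
                paths.append(travelled)
                break
            travelled += step
            w = 1.0 if rng.random() < 0.5 else -1.0  # isotropic in 1D
    return paths, n_photons

paths, n = slab_paths(mu_s=2.0, depth=1.0)
baseline = sum(math.exp(-0.5 * p) for p in paths) / n   # mu_a = 0.5
# perturbation MC: reuse the same paths for a different absorption
perturbed = sum(math.exp(-1.0 * p) for p in paths) / n  # mu_a' = 1.0
print(baseline, perturbed)
```

Both transmittance estimates come from a single baseline simulation; the paper's contribution is extending this reweighting to perturbations of the phase function itself, which the simple path-length weight above cannot capture.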
A Survey on Multilevel Monte Carlo for European Options
Directory of Open Access Journals (Sweden)
Masoud Moharamnejad
2016-03-01
One of the most common and widely applicable methods for pricing options is Monte Carlo simulation. Among its advantages are ease of use and suitability for different types of options, including vanilla and exotic options. On the other hand, the statistical error of Monte Carlo converges only as O(N^(-1/2)), so for a path-discretized problem, achieving an accuracy of ε entails a computational complexity on the order of ε^(-3) with the Euler scheme. Various variance reduction methods have therefore been proposed within the Monte Carlo framework to improve this convergence. One of the more recent, proposed by Giles in 2006, is the multilevel Monte Carlo method. Besides reducing the computational complexity to O(ε^(-2)(log ε)^2) with Euler discretization and to O(ε^(-2)) with Milstein discretization, it can be combined with other variance reduction methods. In this article, multilevel Monte Carlo with Euler and Milstein discretization is compared against the standard Monte Carlo method, in terms of computational complexity, for pricing European call options.
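A compact sketch of the multilevel estimator for a European call under geometric Brownian motion, with Euler discretization and 2^l time steps on level l. The parameter values and per-level sample counts are illustrative assumptions rather than the optimized Giles allocation.

```python
import numpy as np

def mlmc_euro_call(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                   L=6, N0=200_000, seed=0):
    """Multilevel Monte Carlo price of a European call under geometric
    Brownian motion, Euler-discretized with 2^l steps on level l.
    Level l > 0 uses the coupled correction E[P_fine - P_coarse]."""
    rng = np.random.default_rng(seed)
    price = 0.0
    for l in range(L + 1):
        n = max(1000, N0 // 2 ** l)  # fewer samples on the finer levels
        steps = 2 ** l
        dt = T / steps
        dW = rng.normal(0.0, np.sqrt(dt), size=(n, steps))
        Sf = np.full(n, S0)          # fine path, 2^l Euler steps
        for i in range(steps):
            Sf = Sf * (1.0 + r * dt + sigma * dW[:, i])
        Pf = np.exp(-r * T) * np.maximum(Sf - K, 0.0)
        if l == 0:
            price += Pf.mean()
        else:
            # coarse path driven by the same Brownian increments, summed in pairs
            dWc = dW[:, 0::2] + dW[:, 1::2]
            Sc = np.full(n, S0)
            dtc = 2.0 * dt
            for i in range(steps // 2):
                Sc = Sc * (1.0 + r * dtc + sigma * dWc[:, i])
            Pc = np.exp(-r * T) * np.maximum(Sc - K, 0.0)
            price += (Pf - Pc).mean()
    return price

print(mlmc_euro_call())  # near the Black-Scholes value of about 10.45
```

The telescoping sum keeps most samples on the cheap coarse levels, while the fine/coarse coupling through shared Brownian increments makes the correction variances small.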
Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments
Energy Technology Data Exchange (ETDEWEB)
Pevey, Ronald E.
2005-09-15
Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
Bayesian Optimal Experimental Design Using Multilevel Monte Carlo
Issaid, Chaouki Ben
2015-01-07
Experimental design is very important since experiments are often resource-exhaustive and time-consuming. We carry out experimental design in the Bayesian framework. To measure the amount of information that can be extracted from the data in an experiment, we use the expected information gain as the utility function, which is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data for our purpose. One of the major difficulties in evaluating the expected information gain is that the integral is nested and can be high dimensional. We propose using Multilevel Monte Carlo techniques to accelerate the computation of this nested high-dimensional integral. The advantages are twofold. First, Multilevel Monte Carlo can significantly reduce the cost of the nested integral for a given tolerance by using an optimal sample distribution among the different sample averages of the inner integrals. Second, the Multilevel Monte Carlo method imposes fewer assumptions, such as the concentration of measure required by the Laplace method. We test our Multilevel Monte Carlo technique on a numerical example concerning the design of sensor deployment for a Darcy flow problem governed by the one-dimensional Laplace equation. We also compare the performance of Multilevel Monte Carlo, the Laplace approximation, and direct double-loop Monte Carlo.
Directory of Open Access Journals (Sweden)
Asghar Mesbahi
2015-09-01
Introduction: Radiotherapy with small fields is widely used in newly developed techniques, and the dose calculation accuracy of treatment planning systems in small fields plays a crucial role in treatment outcome. In the present study, the dose calculation accuracy of two commercial treatment planning systems was evaluated against the Monte Carlo method. Materials and Methods: A Siemens Oncor linear accelerator was simulated using the MCNPX Monte Carlo code, according to the manufacturer's instructions. Three analytical dose calculation algorithms, full scatter convolution (FSC) in TiGRT along with convolution and superposition in the XiO system, were evaluated for a small solid liver tumor. This solid tumor, with a diameter of 1.8 cm, was evaluated in a thorax phantom, and calculations were performed for different field sizes (1×1, 2×2, 3×3 and 4×4 cm2). The results obtained from these treatment planning systems were compared with calculations by the MC method, regarded as the most reliable method. Results: For the FSC and convolution algorithms, comparison with MC calculations indicated dose overestimations of up to 120% and 25% inside the lung and tumor, respectively, for the 1×1 cm2 field size, using an 18 MV photon beam. Regarding superposition, close agreement with the MC simulation was seen in all studied field sizes. Conclusion: The obtained results showed that the FSC and convolution algorithms significantly overestimated doses in the lung and solid tumor; significant errors could therefore arise in treatment plans of the lung region, affecting the treatment outcomes. Use of MC-based methods and superposition is recommended for lung treatments using small fields and beamlets.
Yoshizumi, Maíra T; Yoriyaz, Hélio; Caldas, Linda V E
2010-01-01
Backscattered radiation (BSR) from field-defining collimators can affect the response of a monitor chamber in X-radiation fields. This contribution must be considered since this kind of chamber is used to monitor the equipment response. In this work, the dependence of a transmission ionization chamber response on the aperture diameter of the collimators was studied both experimentally and with a Monte Carlo (MC) technique. According to the results, the BSR increases the chamber response by over 4.0% in the case of a totally closed collimator and a 50 kV beam, with both techniques. The results from the Monte Carlo simulation confirm the validity of the simulated geometry.
Meric, N; Bor, D
1999-01-01
Scatter fractions have been determined experimentally for lucite, polyethylene, polypropylene, aluminium and copper of varying thicknesses using a polyenergetic broad X-ray beam of 67 kVp. Simulation of the experiment has been carried out by the Monte Carlo technique under the same input conditions. Comparison of the measured and predicted data with each other and with the previously reported values has been given. The Monte Carlo calculations have also been carried out for water, bakelite and bone to examine the dependence of scatter fraction on the density of the scatterer.
Monte Carlo simulation applied in total reflection x-ray fluorescence: Preliminary results
Energy Technology Data Exchange (ETDEWEB)
Meira, Luiza L. C.; Inocente, Guilherme F.; Vieira, Leticia D.; Mesa, Joel [Departamento de Fisica e Biofisica - Instituto de Biociencias de Botucatu, Universidade Estadual Paulista Julio de Mesquita Filho (Brazil)
2013-05-06
X-ray Fluorescence (XRF) analysis is a technique for the qualitative and quantitative determination of the chemical constituents of a sample. The method is based on detecting the characteristic radiation intensities emitted by the elements of the sample when properly excited. A variant of this technique is Total Reflection X-ray Fluorescence (TXRF), which utilizes electromagnetic radiation as the excitation source. In total reflection of X-rays, the angle of refraction of the incident beam tends to zero and the refracted beam is tangent to the sample support interface. Thus, there is a minimum angle of incidence at which no refracted beam exists and all incident radiation undergoes total reflection. In this study, we evaluated the influence of varying the energy of the incident X-ray beam, using the MCNPX (Monte Carlo N-Particle eXtended) code based on the Monte Carlo method.
Monte Carlo Techniques for Nuclear Systems - Theory Lectures
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications Group; Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.
2016-11-29
These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations.
Jaradat, Adnan K; Biggs, Peter J
2007-05-01
The calculation of shielding barrier thicknesses for radiation therapy facilities according to the NCRP formalism is based on the use of broad beams (that is, the maximum possible field sizes). However, in practice, treatment fields used in radiation therapy are, on average, less than half the maximum size. Indeed, many contemporary treatment techniques call for reduced field sizes to reduce co-morbidity and the risk of second cancers. Therefore, published tenth value layers (TVLs) for shielding materials do not apply to these very small fields. There is, hence, a need to determine the TVLs for various beam modalities as a function of field size. The attenuation of (60)Co gamma rays and photons of 4, 6, 10, 15, and 18 MV bremsstrahlung x-ray beams by concrete has been studied using the Monte Carlo technique (MCNP version 4C2) for beams of half-opening angles of 0, 3, 6, 9, 12, and 14 degrees. The distance between the x-ray source and the distal surface of the shielding wall was fixed at 600 cm, a distance that is typical for modern radiation therapy rooms. The maximum concrete thickness varied between 76.5 cm and 151.5 cm for (60)Co and 18 MV x-rays, respectively. Detectors were placed at 630 cm, 700 cm, and 800 cm from the source. TVLs have been determined down to the third TVL. Energy spectra for 4, 6, 10, 15, and 18 MV x-rays for 10 x 10 cm(2) and 40 x 40 cm(2) field sizes were used to generate depth dose curves in water that were compared with experimentally measured values.
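Given a transmission curve such as those computed in the study above, successive TVLs are the extra thicknesses that each cut the transmission by another factor of ten. The sketch below extracts them by bisection; the single-exponential curve and its attenuation coefficient are illustrative assumptions (real broad-beam data include buildup and spectral hardening, which is precisely why the first TVL differs from the equilibrium one).

```python
import math

def tvls(transmission, x_max=200.0, n_layers=3, tol=1e-6):
    """Successive tenth-value layers of a monotone transmission curve
    T(x): the depths where T falls to 10^-1, 10^-2, ... are found by
    bisection, then converted to individual layer thicknesses."""
    depths = []
    for k in range(1, n_layers + 1):
        target = 10.0 ** (-k)
        lo, hi = 0.0, x_max
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if transmission(mid) > target:
                lo = mid
            else:
                hi = mid
        depths.append(0.5 * (lo + hi))
    return [depths[0]] + [b - a for a, b in zip(depths, depths[1:])]

# Hypothetical narrow-beam case: a pure exponential with mu = 0.045 /cm,
# for which every TVL equals ln(10)/mu.
print(tvls(lambda x: math.exp(-0.045 * x)))  # each value close to 51.2 cm
```

For a pure exponential all three layers coincide; applied to Monte Carlo broad-beam data, the first layer would come out thicker than the later, equilibrium ones.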
Energy Technology Data Exchange (ETDEWEB)
Penna, Rodrigo [UNI-BH, Belo Horizonte, MG (Brazil). Dept. de Ciencias Biologicas, Ambientais e da Saude (DCBAS/DCET); Silva, Clemente Jose Gusmao Carneiro da [Universidade Estadual de Santa Cruz, UESC, Ilheus, BA (Brazil); Gomes, Paulo Mauricio Costa [Universidade FUMEC, Belo Horizonte, MG (Brazil)
2008-07-01
The viability of building a nuclear wood densimeter based on Compton scattering of low-energy photons was studied using the Monte Carlo code MCNP-4C. A collimated 60 keV beam of gamma rays emitted by a {sup 241}Am source and reaching wood blocks was simulated, and the radiation backscattered by the blocks was calculated. The scattered photons were correlated with blocks of different wood densities. The results showed a linear relationship between wood density and scattered photons, and therefore the viability of this wood densimeter. (author)
Energy Technology Data Exchange (ETDEWEB)
Tholomier, M.; Vicario, E.; Doghmane, N.
1987-10-01
The contribution of backscattered electrons to the Auger electron yield was studied with a multiple-scattering Monte-Carlo simulation. The Auger backscattering factor has been calculated in the 5 keV-60 keV energy range. The dependence of the Auger backscattering factor on the primary energy and the beam incidence angle was determined. Spatial distributions of backscattered electrons and Auger electrons are presented for a point incident beam. Correlations between these distributions are briefly investigated.
Reducing quasi-ergodicity in a double well potential by Tsallis Monte Carlo simulation
Iwamatsu, Masao; Okabe, Yutaka
2000-01-01
A new Monte Carlo scheme based on Tsallis's generalized statistical mechanics is applied to a simple double well potential to calculate the canonical thermal average of the potential energy. Although we observed serious quasi-ergodicity when using the standard Metropolis Monte Carlo algorithm, this problem is largely reduced by the use of the new Monte Carlo algorithm. Therefore the ergodicity is guaranteed even for short Monte Carlo steps if we use this new canonical Monte Carlo sc...
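A minimal sketch of the idea: Metropolis sampling with the Tsallis weight, whose power-law (rather than exponential) decay flattens the barrier between the wells. The potential, q, beta, and step size below are illustrative assumptions, not the paper's settings.

```python
import math
import random

def sample_double_well(beta=5.0, q=2.0, n_steps=200_000, seed=11):
    """Metropolis sampling of the double well E(x) = (x^2 - 1)^2 with
    the Tsallis weight W(x) = [1 + (q - 1) beta E(x)]^(-1/(q - 1)),
    whose power-law decay eases crossing of the central barrier."""
    rng = random.Random(seed)
    energy = lambda x: (x * x - 1.0) ** 2
    log_w = lambda x: -math.log1p((q - 1.0) * beta * energy(x)) / (q - 1.0)
    x, samples = -1.0, []
    for _ in range(n_steps):
        y = x + rng.uniform(-0.5, 0.5)
        # Metropolis acceptance for the target density proportional to W
        if rng.random() < math.exp(min(0.0, log_w(y) - log_w(x))):
            x = y
        samples.append(x)
    return samples

xs = sample_double_well()
right = sum(1 for v in xs if v > 0.0) / len(xs)
print(right)  # both wells are visited (near 0.5 when well mixed)
```

With the standard weight exp(-beta*E) at low temperature the chain would stay trapped in the starting well; the flattened Tsallis weight produces frequent crossings, which is the quasi-ergodicity reduction the abstract describes.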
Finding organic vapors - a Monte Carlo approach
Vuollekoski, Henri; Boy, Michael; Kerminen, Veli-Matti; Kulmala, Markku
2010-05-01
drawbacks in accuracy, the inability to find diurnal variation and the lack of size resolution. Here, we aim to shed some light onto the problem by applying an ad hoc Monte Carlo algorithm to a well established aerosol dynamical model, the University of Helsinki Multicomponent Aerosol model (UHMA). By performing a side-by-side comparison with measurement data within the algorithm, this approach has the significant advantage of decreasing the amount of manual labor. But more importantly, by basing the comparison on particle number size distribution data - a quantity that can be quite reliably measured - the accuracy of the results is good.
Coherent Scattering Imaging Monte Carlo Simulation
Hassan, Laila Abdulgalil Rafik
Conventional mammography has poor contrast between healthy and cancerous tissues due to the small difference in their attenuation properties. Coherent scatter potentially provides more information because interference of coherently scattered radiation depends on the average intermolecular spacing, and can be used to characterize tissue types. However, typical coherent scatter analysis techniques are not compatible with rapid low-dose screening techniques. Coherent scatter slot scan imaging is a novel imaging technique which provides new information with higher contrast. In this work, a simulation of coherent scatter was performed for slot scan imaging to assess its performance and provide system optimization. In coherent scatter imaging, the coherent scatter is exploited using a conventional slot scan mammography system with anti-scatter grids tilted at the characteristic angle of cancerous tissues. A Monte Carlo simulation of the coherent scatter imaging system was performed, and the system was optimized across several parameters, including source voltage, tilt angle, grid distances, grid ratio, and shielding geometry. The contrast increased as the grid tilt angle increased beyond the characteristic angle for the modeled carcinoma. A grid tilt angle of 16 degrees yielded the highest contrast and signal-to-noise ratio (SNR). Contrast also increased as the source voltage increased. Increasing the grid ratio improved contrast at the expense of decreasing SNR; a grid ratio of 10:1 was sufficient to give good contrast without reducing the intensity to the noise level. The optimal source-to-sample distance was such that the source is located at the focal distance of the grid. A carcinoma lump of 0.5x0.5x0.5 cm3 was detectable, which is reasonable considering the high noise due to the relatively small number of incident photons used for computational reasons. Further study is needed to assess the effects of breast density and breast thickness.
Quantum-trajectory Monte Carlo method for study of electron-crystal interaction in STEM.
Ruan, Z; Zeng, R G; Ming, Y; Zhang, M; Da, B; Mao, S F; Ding, Z J
2015-07-21
In this paper, a novel quantum-trajectory Monte Carlo simulation method is developed to study electron beam interaction with a crystalline solid for application to electron microscopy and spectroscopy. The method combines the Bohmian quantum trajectory method, which treats electron elastic scattering and diffraction in a crystal, with Monte Carlo sampling of electron inelastic scattering events along the quantum trajectory paths. We study the electron scattering and secondary electron generation process in crystals for a focused incident electron beam, leading to an understanding of the imaging mechanism behind the atomic-resolution secondary electron images recently achieved experimentally with a scanning transmission electron microscope. In this method, the Bohmian quantum trajectories are first calculated from a wave function obtained by numerically solving the time-dependent Schrödinger equation with a multislice method. The impact-parameter-dependent inner-shell excitation cross section then enables Monte Carlo sampling of ionization events produced by incident electron trajectories travelling along atom columns, which excite high-energy knock-on secondary electrons. The subsequent cascade production, transport and emission of very low energy true secondary electrons are traced by a conventional Monte Carlo simulation to generate the image signals. Comparison of the simulated image for a Si(110) crystal with the experimental image indicates that the dominant mechanism behind the atomic resolution of the secondary electron image is the inner-shell ionization events generated by the high-energy electron beam.
Energy Technology Data Exchange (ETDEWEB)
Flint, D B; O’Brien, D J; McFadden, C H; Wolfe, T; Krishnan, S; Sawakuchi, G O [UT MD Anderson Cancer Center, Houston, TX. (United States); Hallacy, T M [UT MD Anderson Cancer Center, Houston, TX. (United States); Rice University, Houston, TX (United States)
2015-06-15
Purpose: To determine the effect of gold nanoparticles (AuNPs) on energy deposition in water for different irradiation conditions. Methods: The TOPAS version B12 Monte Carlo code was used to simulate energy deposition in water from monoenergetic 40 keV and 85 keV photon beams and a 6 MV Varian Clinac photon beam (IAEA phase space file, 10x10 cm{sup 2}, SSD 100 cm). For the 40 and 85 keV beams, monoenergetic 2x2 mm{sup 2} parallel beams were used to irradiate a 30x30x10 µm{sup 3} water mini-phantom located at 1.5 cm depth in a 30x30x50 cm{sup 3} water phantom. 5000 AuNPs of 50 nm diameter were randomly distributed inside the mini-phantom. Energy deposition was scored in the mini-phantom with the AuNPs' material set to gold and then to water. For the 6 MV beam, we created another phase space (PHSP) file on the surface of a 2 mm diameter sphere located at 1.5 cm depth in the water phantom. The PHSP file consisted of all particles entering the sphere, including backscattered particles. Simulations were then performed using the new PHSP as the source, with the mini-phantom centered in a 2 mm diameter water sphere in vacuum. The g4em-livermore reference physics list was used with "EMRangeMin/EMRangeMax = 100 eV/7 MeV" and "SetProductionCutLowerEdge = 990 eV" to create the new PHSP, and "SetProductionCutLowerEdge = 100 eV" for the mini-phantom simulations. All other parameters were set to defaults ("finalRange = 100 µm"). Results: The addition of AuNPs resulted in an increase in the mini-phantom energy deposition of (7.5 ± 8.7)%, (1.6 ± 8.2)%, and (−0.6 ± 1.1)% for the 40 keV, 85 keV and 6 MV beams, respectively. Conclusion: Enhanced energy deposition was seen at low photon energies, but decreased with increasing energy. No enhancement was observed for the 6 MV beam. Future work is required to decrease the statistical uncertainties in the simulations. This research is partially supported from institutional funds from the Center for Radiation Oncology Research, The
Evaluation of Monte Carlo tools for high energy atmospheric physics
Rutjes, Casper; Sarria, David; Broberg Skeltved, Alexander; Luque, Alejandro; Diniz, Gabriel; Østgaard, Nikolai; Ebert, Ute
2016-11-01
The emerging field of high energy atmospheric physics (HEAP) includes terrestrial gamma-ray flashes, electron-positron beams and gamma-ray glows from thunderstorms. Similar emissions of high energy particles occur in pulsed high voltage discharges. Understanding these phenomena requires appropriate models for the interaction of electrons, positrons and photons with energies up to 40 MeV with atmospheric air. In this paper, we benchmark the performance of the Monte Carlo codes Geant4, EGS5 and FLUKA, developed in other fields of physics, and of the custom-made codes GRRR and MC-PEPTITA against each other within the parameter regime relevant for high energy atmospheric physics. We focus on basic tests, namely on the evolution of monoenergetic and directed beams of electrons, positrons and photons with kinetic energies between 100 keV and 40 MeV through homogeneous air in the absence of electric and magnetic fields, using a low energy cutoff of 50 keV. We discuss important differences between the results of the different codes and provide plausible explanations. We also test the computational performance of the codes. The Supplement contains all results, providing a first benchmark for present and future custom-made codes that are more flexible in including electrodynamic interactions.
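The simplest building block that all such transport codes share is exponential sampling of the free path between interactions. A minimal sketch (a toy, not any of the benchmarked codes) estimates beam transmission through a homogeneous slab and can be checked against the analytic Beer-Lambert result exp(-μd):

```python
import math
import random

def transmitted_fraction(mu, depth, n=200000, seed=7):
    # Monte Carlo estimate of the fraction of a monoenergetic beam that
    # crosses a slab of thickness `depth` (same length unit as 1/mu)
    # without interacting, for attenuation coefficient `mu`.
    rng = random.Random(seed)
    survived = 0
    for _ in range(n):
        # Sample the free path from an exponential distribution with mean 1/mu;
        # 1 - random() avoids log(0) since random() lies in [0, 1).
        free_path = -math.log(1.0 - rng.random()) / mu
        if free_path > depth:
            survived += 1
    return survived / n
```

Full codes like Geant4, EGS5 and FLUKA layer interaction physics, secondary particle production and energy cutoffs (the 50 keV cutoff used in this benchmark) on top of exactly this sampling step.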
An unbiased Hessian representation for Monte Carlo PDFs
Energy Technology Data Exchange (ETDEWEB)
Carrazza, Stefano; Forte, Stefano [Universita di Milano, TIF Lab, Dipartimento di Fisica, Milan (Italy); INFN, Sezione di Milano (Italy); Kassabov, Zahari [Universita di Milano, TIF Lab, Dipartimento di Fisica, Milan (Italy); Universita di Torino, Dipartimento di Fisica, Turin (Italy); INFN, Sezione di Torino (Italy); Latorre, Jose Ignacio [Universitat de Barcelona, Departament d' Estructura i Constituents de la Materia, Barcelona (Spain); Rojo, Juan [University of Oxford, Rudolf Peierls Centre for Theoretical Physics, Oxford (United Kingdom)
2015-08-15
We develop a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis. We validate the methodology by first showing that it faithfully reproduces a native Monte Carlo PDF set (NNPDF3.0), and then that, when applied to a Hessian PDF set (MMHT14) that has been transformed into a Monte Carlo set, it gives back the starting PDFs with minimal information loss. We then show that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (MC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea and a Hessian version of the recently suggested PDF compression algorithm (CMC-PDFs). The mc2hessian conversion code is made publicly available together with (through LHAPDF6) Hessian representations of the NNPDF3.0 set and of the MC-H PDF set. (orig.)
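The conversion from a replica ensemble to a Hessian-style set of symmetric error directions can be sketched with a simplified PCA construction. Note the simplification: the paper selects an optimal subset of the replicas themselves as the linear basis via a genetic algorithm, whereas this toy uses covariance eigenvectors directly.

```python
import numpy as np

def mc_to_hessian(replicas, n_eig):
    # Convert Monte Carlo replicas (n_rep x n_param array) into a central
    # value plus one error member per retained eigen-direction of the
    # replica covariance matrix (PCA simplification of mc2hessian).
    central = replicas.mean(axis=0)
    cov = np.cov(replicas, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_eig]
    # Scale each eigenvector by the one-sigma spread along that direction.
    members = [central + np.sqrt(eigvals[i]) * eigvecs[:, i] for i in order]
    return central, members
```

When all eigen-directions are kept, summing the squared member displacements reproduces the diagonal of the replica covariance exactly, which is the sense in which the Hessian set preserves the Monte Carlo uncertainties; keeping fewer directions gives the compression described in the abstract.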
An Unbiased Hessian Representation for Monte Carlo PDFs
Carrazza, Stefano; Kassabov, Zahari; Latorre, Jose Ignacio; Rojo, Juan
2015-01-01
We develop a methodology for the construction of a Hessian representation of Monte Carlo sets of parton distributions, based on the use of a subset of the Monte Carlo PDF replicas as an unbiased linear basis, and of a genetic algorithm for the determination of the optimal basis. We validate the methodology by first showing that it faithfully reproduces a native Monte Carlo PDF set (NNPDF3.0), and then that, when applied to a Hessian PDF set (MMHT14) that has been transformed into a Monte Carlo set, it gives back the starting PDFs with minimal information loss. We then show that, when applied to a large Monte Carlo PDF set obtained as a combination of several underlying sets, the methodology leads to a Hessian representation in terms of a rather smaller set of parameters (CMC-H PDFs), thereby providing an alternative implementation of the recently suggested Meta-PDF idea and a Hessian version of the recently suggested PDF compression algorithm (CMC-PDFs). The mc2hessian conversion code is made publicly available togethe...
Monte Carlo evaluation of kerma in an HDR brachytherapy bunker
Energy Technology Data Exchange (ETDEWEB)
Perez-Calatayud, J [Department of Atomic, Molecular and Nuclear Physics, and IFIC, CSIC-University of Valencia, Burjassot (Spain); Granero, D [Department of Atomic, Molecular and Nuclear Physics, and IFIC, CSIC-University of Valencia, Burjassot (Spain); Ballester, F [Department of Atomic, Molecular and Nuclear Physics, and IFIC, CSIC-University of Valencia, Burjassot (Spain); Casal, E [Department of Atomic, Molecular and Nuclear Physics, and IFIC, CSIC-University of Valencia, Burjassot (Spain); Crispin, V [FIVO, Fundacion Instituto Valenciano De OncologIa, Valencia (Spain); Puchades, V [Grupo IMO-SFA, Madrid (Spain); Leon, A [Department of Chemistry and Nuclear Engineering, Polytechnic University of Valencia, Valencia (Spain); Verdu, G [Department of Chemistry and Nuclear Engineering, Polytechnic University of Valencia, Valencia (Spain)
2004-12-21
In recent years, the use of high dose rate (HDR) after-loader machines has greatly increased due to the shift from traditional Cs-137/Ir-192 low dose rate (LDR) to HDR brachytherapy. The method used to calculate the required concrete and, where appropriate, lead shielding