WorldWideScience

Sample records for based proton monte

  1. Proton Upset Monte Carlo Simulation

    Science.gov (United States)

    O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.

    2009-01-01

    The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (~200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.

  2. A fast GPU-based Monte Carlo simulation of proton transport with detailed modeling of non-elastic interactions

    OpenAIRE

    Tseung, H. Wan Chan; Ma, J.; Beltran, C.

    2014-01-01

    Purpose: Very fast Monte Carlo (MC) simulations of proton transport have been implemented recently on GPUs. However, these usually use simplified models for non-elastic (NE) proton-nucleus interactions. Our primary goal is to build a GPU-based proton transport MC with detailed modeling of elastic and NE collisions. Methods: Using CUDA, we implemented GPU kernels for these tasks: (1) Simulation of spots from our scanning nozzle configurations, (2) Proton propagation through CT geometry, consid...

  3. GPU-based fast Monte Carlo dose calculation for proton therapy

    Science.gov (United States)

    Jia, Xun; Schümann, Jan; Paganetti, Harald; Jiang, Steve B.

    2012-12-01

    Accurate radiation dose calculation is essential for successful proton radiotherapy. Monte Carlo (MC) simulation is considered to be the most accurate method. However, the long computation time limits it from routine clinical applications. Recently, graphics processing units (GPUs) have been widely used to accelerate computationally intensive tasks in radiotherapy. We have developed a fast MC dose calculation package, gPMC, for proton dose calculation on a GPU. In gPMC, proton transport is modeled by the class II condensed history simulation scheme with a continuous slowing down approximation. Ionization, elastic and inelastic proton nucleus interactions are considered. Energy straggling and multiple scattering are modeled. Secondary electrons are not transported and their energies are locally deposited. After an inelastic nuclear interaction event, a variety of products are generated using an empirical model. Among them, charged nuclear fragments are terminated with energy locally deposited. Secondary protons are stored in a stack and transported after finishing transport of the primary protons, while secondary neutral particles are neglected. gPMC is implemented on the GPU under the CUDA platform. We have validated gPMC using the TOPAS/Geant4 MC code as the gold standard. For various cases including homogeneous and inhomogeneous phantoms as well as a patient case, good agreements between gPMC and TOPAS/Geant4 are observed. The gamma passing rate for the 2%/2 mm criterion is over 98.7% in the region with dose greater than 10% maximum dose in all cases, excluding low-density air regions. With gPMC it takes only 6-22 s to simulate 10 million source protons to achieve ˜1% relative statistical uncertainty, depending on the phantoms and energy. This is an extremely high efficiency compared to the computational time of tens of CPU hours for TOPAS/Geant4. Our fast GPU-based code can thus facilitate the routine use of MC dose calculation in proton therapy.
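
    As context for the 2%/2 mm gamma passing rates quoted above, the following is a minimal sketch (not the authors' gPMC implementation) of a one-dimensional global gamma analysis between a reference and an evaluated dose profile; the grid spacing, profiles, and 10% low-dose cutoff are illustrative assumptions.

```python
import numpy as np

def gamma_pass_rate_1d(dose_ref, dose_eval, spacing_mm,
                       dose_crit=0.02, dist_crit_mm=2.0, low_dose_cut=0.10):
    """1D global gamma analysis: fraction of reference points with gamma <= 1."""
    x = np.arange(len(dose_ref)) * spacing_mm
    d_max = dose_ref.max()
    gammas = []
    for xi, di in zip(x, dose_ref):
        if di < low_dose_cut * d_max:          # skip the low-dose region, as in the paper
            continue
        dose_term = (dose_eval - di) / (dose_crit * d_max)   # global dose difference
        dist_term = (x - xi) / dist_crit_mm                   # distance to agreement
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    return (np.array(gammas) <= 1.0).mean()

# Example with two nearly identical toy depth-dose curves (illustrative numbers)
z = np.linspace(0, 100, 201)                    # depth grid, 0.5 mm spacing
ref = np.exp(-((z - 70) / 5.0) ** 2) + 0.2      # toy "Bragg-peak-like" profile
test = ref * (1.0 + 0.01 * np.sin(z / 7.0))     # slightly perturbed evaluation dose
print(gamma_pass_rate_1d(ref, test, spacing_mm=0.5))
```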

  4. A GPU-accelerated and Monte Carlo-based intensity modulated proton therapy optimization system

    International Nuclear Information System (INIS)

    Purpose: Conventional spot scanning intensity modulated proton therapy (IMPT) treatment planning systems (TPSs) optimize proton spot weights based on analytical dose calculations. These analytical dose calculations have been shown to have severe limitations in heterogeneous materials. Monte Carlo (MC) methods do not have these limitations; however, MC-based systems have been of limited clinical use due to the large number of beam spots in IMPT and the extremely long calculation time of traditional MC techniques. In this work, the authors present a clinically applicable IMPT TPS that utilizes a very fast MC calculation. Methods: An in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC generated influence map, a modified least-squares optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitation issues for the large dose influence maps that resulted from maintaining the intrinsic CT resolution. The effects of tail cutoff and starting condition were studied and minimized in this work. Results: For relatively large and complex three-field head and neck cases, i.e., >100 000 spots with a target volume of ∼1000 cm³ and multiple surrounding critical structures, the optimization together with the initial MC dose influence map calculation was done in a clinically viable time frame (less than 30 min) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The in-house MC TPS plans were comparable to commercial TPS plans based on DVH comparisons. Conclusions: An MC-based treatment planning system was developed. The treatment planning can be performed in a clinically viable time frame on a hardware system costing around 45 000 dollars.

  5. A GPU-accelerated and Monte Carlo-based intensity modulated proton therapy optimization system

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Jiasen, E-mail: ma.jiasen@mayo.edu; Beltran, Chris; Seum Wan Chan Tseung, Hok; Herman, Michael G. [Department of Radiation Oncology, Division of Medical Physics, Mayo Clinic, 200 First Street Southwest, Rochester, Minnesota 55905 (United States)

    2014-12-15

    Purpose: Conventional spot scanning intensity modulated proton therapy (IMPT) treatment planning systems (TPSs) optimize proton spot weights based on analytical dose calculations. These analytical dose calculations have been shown to have severe limitations in heterogeneous materials. Monte Carlo (MC) methods do not have these limitations; however, MC-based systems have been of limited clinical use due to the large number of beam spots in IMPT and the extremely long calculation time of traditional MC techniques. In this work, the authors present a clinically applicable IMPT TPS that utilizes a very fast MC calculation. Methods: An in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC generated influence map, a modified least-squares optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitation issues for the large dose influence maps that resulted from maintaining the intrinsic CT resolution. The effects of tail cutoff and starting condition were studied and minimized in this work. Results: For relatively large and complex three-field head and neck cases, i.e., >100 000 spots with a target volume of ∼1000 cm³ and multiple surrounding critical structures, the optimization together with the initial MC dose influence map calculation was done in a clinically viable time frame (less than 30 min) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The in-house MC TPS plans were comparable to commercial TPS plans based on DVH comparisons. Conclusions: An MC-based treatment planning system was developed. The treatment planning can be performed in a clinically viable time frame on a hardware system costing around 45 000 dollars.
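
    To illustrate the kind of optimization described above, here is a minimal sketch of spot-weight optimization against an MC-generated dose influence matrix, using a plain projected-gradient least-squares update rather than the authors' modified least-squares method; the matrix sizes, prescription, and step size are illustrative assumptions.

```python
import numpy as np

def optimize_spot_weights(D, d_presc, n_iter=500, step=None):
    """Least-squares IMPT spot-weight optimization with nonnegativity.

    D        : (n_voxels, n_spots) dose influence matrix (e.g. from an MC engine)
    d_presc  : (n_voxels,) prescribed dose per voxel
    Returns nonnegative spot weights w approximately minimizing ||D w - d_presc||^2.
    """
    n_spots = D.shape[1]
    w = np.ones(n_spots)
    if step is None:
        step = 1.0 / np.linalg.norm(D, 2) ** 2   # safe gradient step from the spectral norm
    for _ in range(n_iter):
        grad = D.T @ (D @ w - d_presc)           # least-squares gradient
        w = np.maximum(w - step * grad, 0.0)     # project onto w >= 0 (physical weights)
    return w

# Toy example: 1000 voxels, 50 spots, uniform target prescription (illustrative only)
rng = np.random.default_rng(0)
D = rng.random((1000, 50)) * 0.02
w = optimize_spot_weights(D, d_presc=np.full(1000, 1.0))
print(w.shape, float(np.abs(D @ w - 1.0).mean()))
```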

  6. Dosimetric investigation of proton therapy on CT-based patient data using Monte Carlo simulation

    Science.gov (United States)

    Chongsan, T.; Liamsuwan, T.; Tangboonduangjit, P.

    2016-03-01

    The aim of radiotherapy is to deliver a high radiation dose to the tumor with a low radiation dose to healthy tissues. Protons have Bragg peaks that give a high radiation dose to the tumor but a low exit dose or dose tail. Therefore, proton therapy is promising for treating deep-seated tumors and tumors located close to organs at risk. Moreover, the physical characteristic of protons is suitable for treating cancer in pediatric patients. This work developed a computational platform for calculating proton dose distribution using the Monte Carlo (MC) technique and the patient's anatomical data. The studied case is a pediatric patient with a primary brain tumor. PHITS will be used for MC simulation. Therefore, patient-specific CT-DICOM files were converted to the PHITS input. A MATLAB optimization program was developed to create a beam delivery control file for this study. The optimization program requires the proton beam data. All these data were calculated in this work using analytical formulas, and the calculation accuracy was tested before the beam delivery control file is used for MC simulation. This study will be useful for researchers who aim to investigate proton dose distribution in patients but do not have access to proton therapy machines.
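
    A small sketch of the CT-to-MC conversion step mentioned above: mapping Hounsfield units to mass density and a coarse material index before writing an MC input (e.g. for PHITS). The calibration breakpoints and HU thresholds below are illustrative assumptions, not the values used in the study.

```python
import numpy as np

# Illustrative piecewise-linear HU -> mass density (g/cm^3) calibration
HU_PTS  = np.array([-1000, -200,   0,  200, 1500])
RHO_PTS = np.array([0.001, 0.80, 1.00, 1.10, 1.85])

def hu_to_density(hu):
    """Interpolate mass density from Hounsfield units on the calibration curve."""
    return np.interp(hu, HU_PTS, RHO_PTS)

def hu_to_material(hu):
    """Very coarse material segmentation by HU range (air / lung / soft tissue / bone)."""
    hu = np.asarray(hu)
    mat = np.full(hu.shape, 2, dtype=int)        # default: soft tissue
    mat[hu < -850] = 0                           # air
    mat[(hu >= -850) & (hu < -200)] = 1          # lung
    mat[hu >= 200] = 3                           # bone
    return mat

ct_slice = np.array([[-1000, -700, 40], [300, 1200, -50]])   # toy HU values
print(hu_to_density(ct_slice))
print(hu_to_material(ct_slice))
```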

  7. A fast GPU-based Monte Carlo simulation of proton transport with detailed modeling of non-elastic interactions

    CERN Document Server

    Tseung, H Wan Chan; Beltran, C

    2014-01-01

    Purpose: Very fast Monte Carlo (MC) simulations of proton transport have been implemented recently on GPUs. However, these usually use simplified models for non-elastic (NE) proton-nucleus interactions. Our primary goal is to build a GPU-based proton transport MC with detailed modeling of elastic and NE collisions. Methods: Using CUDA, we implemented GPU kernels for these tasks: (1) Simulation of spots from our scanning nozzle configurations, (2) Proton propagation through CT geometry, considering nuclear elastic scattering, multiple scattering, and energy loss straggling, (3) Modeling of the intranuclear cascade stage of NE interactions, (4) Nuclear evaporation simulation, and (5) Statistical error estimates on the dose. To validate our MC, we performed: (1) Secondary particle yield calculations in NE collisions, (2) Dose calculations in homogeneous phantoms, (3) Re-calculations of head and neck plans from a commercial treatment planning system (TPS), and compared with Geant4.9.6p2/TOPAS. Results: Yields, en...

  8. Clinical CT-based calculations of dose and positron emitter distributions in proton therapy using the FLUKA Monte Carlo code

    Science.gov (United States)

    Parodi, K.; Ferrari, A.; Sommerer, F.; Paganetti, H.

    2007-07-01

    Clinical investigations on post-irradiation PET/CT (positron emission tomography/computed tomography) imaging for in vivo verification of treatment delivery and, in particular, beam range in proton therapy are underway at Massachusetts General Hospital (MGH). Within this project, we have developed a Monte Carlo framework for CT-based calculation of dose and irradiation-induced positron emitter distributions. Initial proton beam information is provided by a separate Geant4 Monte Carlo simulation modelling the treatment head. Particle transport in the patient is performed in the CT voxel geometry using the FLUKA Monte Carlo code. The implementation uses a discrete number of different tissue types with composition and mean density deduced from the CT scan. Scaling factors are introduced to account for the continuous Hounsfield unit dependence of the mass density and of the relative stopping power ratio to water used by the treatment planning system (XiO (Computerized Medical Systems Inc.)). Resulting Monte Carlo dose distributions are generally found in good correspondence with calculations of the treatment planning program, except in a few cases (e.g. in the presence of air/tissue interfaces). Whereas dose is computed using standard FLUKA utilities, positron emitter distributions are calculated by internally combining proton fluence with experimental and evaluated cross-sections yielding ¹¹C, ¹⁵O, ¹⁴O, ¹³N, ³⁸K and ³⁰P. Simulated positron emitter distributions yield PET images in good agreement with measurements. In this paper, we describe in detail the specific implementation of the FLUKA calculation framework, which may be easily adapted to handle arbitrary phase spaces of proton beams delivered by other facilities or include more reaction channels based on additional cross-section data. Further, we demonstrate the effects of different acquisition time regimes (e.g., PET imaging during or after irradiation) on the intensity and spatial distribution of the irradiation
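
    The folding of proton fluence with reaction cross-sections described above can be sketched for a single channel as follows; the fluence spectrum, cross-section shape, and target number density are placeholder values, not FLUKA data.

```python
import numpy as np

def emitter_yield_in_voxel(fluence_E, sigma_E, dE_MeV, n_target_per_cm3, voxel_volume_cm3):
    """Number of positron emitters produced in one voxel for one reaction channel.

    fluence_E        : proton fluence spectrum per energy bin [protons/cm^2/MeV]
    sigma_E          : reaction cross section per energy bin [cm^2]
    n_target_per_cm3 : number density of target nuclei (e.g. 12C) [1/cm^3]
    """
    # reactions per cm^3 = n_target * integral Phi(E) sigma(E) dE ; times the voxel volume
    return n_target_per_cm3 * np.sum(fluence_E * sigma_E) * dE_MeV * voxel_volume_cm3

# Illustrative numbers only
E = np.arange(20.0, 160.0, 1.0)                       # energy bins [MeV]
fluence = 1e6 * np.exp(-E / 80.0)                     # toy fluence spectrum
sigma = 60e-27 * np.exp(-((E - 45.0) / 30.0) ** 2)    # toy 11C-production-like shape [cm^2]
print(emitter_yield_in_voxel(fluence, sigma, dE_MeV=1.0,
                             n_target_per_cm3=5e22, voxel_volume_cm3=0.008))
```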

  9. Towards offline PET monitoring at a cyclotron-based proton therapy facility. Experiments and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Wuerl, Matthias

    2016-08-01

    Matthias Wuerl presents two essential steps to implement offline PET monitoring of proton dose delivery at a clinical facility, namely the setting up of an accurate Monte Carlo model of the clinical beamline and the experimental validation of positron emitter production cross-sections. In the first part, the field size dependence of the dose output is described for scanned proton beams. Both the Monte Carlo and an analytical computational beam model were able to accurately predict target dose, while the latter tends to overestimate dose in normal tissue. In the second part, the author presents PET measurements of different phantom materials, which were activated by the proton beam. The results indicate that for an irradiation with a high number of protons for the sake of good statistics, dead time losses of the PET scanner may become important and lead to an underestimation of positron-emitter production yields.
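
    The dead-time effect mentioned in the conclusion can be illustrated with the standard non-paralyzable dead-time model; the dead-time constant and count rate below are assumed illustrative numbers, not the scanner's specifications.

```python
# Non-paralyzable dead-time model: measured rate m = n / (1 + n * tau),
# so the true rate can be recovered as n = m / (1 - m * tau).
tau = 2e-6          # assumed detector dead time [s], illustrative
true_rate = 4e5     # true singles rate [counts/s], illustrative

measured = true_rate / (1.0 + true_rate * tau)
recovered = measured / (1.0 - measured * tau)
loss_percent = 100.0 * (1.0 - measured / true_rate)

print(f"measured = {measured:.0f} cps, loss = {loss_percent:.1f}%, recovered = {recovered:.0f} cps")
```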

  10. Clinical implementation of a GPU-based simplified Monte Carlo method for a treatment planning system of proton beam therapy

    International Nuclear Information System (INIS)

    We implemented the simplified Monte Carlo (SMC) method on graphics processing unit (GPU) architecture under the compute unified device architecture (CUDA) platform developed by NVIDIA. The GPU-based SMC was clinically applied for four patients with head and neck, lung, or prostate cancer. The results were compared to those obtained by a traditional CPU-based SMC with respect to the computation time and discrepancy. In the CPU- and GPU-based SMC calculations, the estimated mean statistical errors of the calculated doses in the planning target volume region were within 0.5% rms. The dose distributions calculated by the GPU- and CPU-based SMCs were similar, within statistical errors. The GPU-based SMC showed 12.30–16.00 times faster performance than the CPU-based SMC. The computation time per beam arrangement using the GPU-based SMC for the clinical cases ranged from 9 to 67 s. The results demonstrate the successful application of the GPU-based SMC to clinical proton treatment planning. (note)
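
    A minimal sketch of how the quoted per-voxel statistical errors can be estimated from independent MC batches (the batch method); the batch count and dose values are synthetic.

```python
import numpy as np

def batch_statistical_error(dose_batches):
    """Relative statistical error per voxel from N independent MC batches.

    dose_batches : (n_batches, n_voxels) dose scored in each batch
    Returns the relative standard error of the mean per voxel.
    """
    n = dose_batches.shape[0]
    mean = dose_batches.mean(axis=0)
    std_of_mean = dose_batches.std(axis=0, ddof=1) / np.sqrt(n)
    with np.errstate(divide="ignore", invalid="ignore"):
        rel = np.where(mean > 0, std_of_mean / mean, 0.0)
    return rel

rng = np.random.default_rng(1)
batches = rng.normal(loc=1.0, scale=0.05, size=(10, 1000))   # toy: 10 batches, 1000 voxels
rel_err = batch_statistical_error(batches)
print(f"mean relative error across voxels: {rel_err.mean():.4f}")
```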

  11. MO-A-BRD-10: A Fast and Accurate GPU-Based Proton Transport Monte Carlo Simulation for Validating Proton Therapy Treatment Plans

    Energy Technology Data Exchange (ETDEWEB)

    Wan Chan Tseung, H; Ma, J; Beltran, C [Mayo Clinic, Rochester, MN (United States)

    2014-06-15

    Purpose: To build a GPU-based Monte Carlo (MC) simulation of proton transport with detailed modeling of elastic and non-elastic (NE) proton-nucleus interactions, for use in a very fast and cost-effective proton therapy treatment plan verification system. Methods: Using the CUDA framework, we implemented kernels for the following tasks: (1) Simulation of beam spots from our possible scanning nozzle configurations, (2) Proton propagation through CT geometry, taking into account nuclear elastic and multiple scattering, as well as energy straggling, (3) Bertini-style modeling of the intranuclear cascade stage of NE interactions, and (4) Simulation of nuclear evaporation. To validate our MC, we performed: (1) Secondary particle yield calculations in NE collisions with therapeutically-relevant nuclei, (2) Pencil-beam dose calculations in homogeneous phantoms, (3) A large number of treatment plan dose recalculations, and compared with Geant4.9.6p2/TOPAS. A workflow was devised for calculating plans from a commercially available treatment planning system, with scripts for reading DICOM files and generating inputs for our MC. Results: Yields, energy and angular distributions of secondaries from NE collisions on various nuclei are in good agreement with the Geant4.9.6p2 Bertini and Binary cascade models. The 3D-gamma pass rate at 2%–2 mm for 70–230 MeV pencil-beam dose distributions in water, soft tissue, bone and Ti phantoms is 100%. The pass rate at 2%–2 mm for treatment plan calculations is typically above 98%. The net computational time on a NVIDIA GTX680 card, including all CPU-GPU data transfers, is around 20 s for 1×10⁷ proton histories. Conclusion: Our GPU-based proton transport MC is the first of its kind to include a detailed nuclear model to handle NE interactions on any nucleus. Dosimetric calculations demonstrate very good agreement with Geant4.9.6p2/TOPAS. Our MC is being integrated into a framework to perform fast routine clinical QA of pencil

  12. MO-A-BRD-10: A Fast and Accurate GPU-Based Proton Transport Monte Carlo Simulation for Validating Proton Therapy Treatment Plans

    International Nuclear Information System (INIS)

    Purpose: To build a GPU-based Monte Carlo (MC) simulation of proton transport with detailed modeling of elastic and non-elastic (NE) proton-nucleus interactions, for use in a very fast and cost-effective proton therapy treatment plan verification system. Methods: Using the CUDA framework, we implemented kernels for the following tasks: (1) Simulation of beam spots from our possible scanning nozzle configurations, (2) Proton propagation through CT geometry, taking into account nuclear elastic and multiple scattering, as well as energy straggling, (3) Bertini-style modeling of the intranuclear cascade stage of NE interactions, and (4) Simulation of nuclear evaporation. To validate our MC, we performed: (1) Secondary particle yield calculations in NE collisions with therapeutically-relevant nuclei, (2) Pencil-beam dose calculations in homogeneous phantoms, (3) A large number of treatment plan dose recalculations, and compared with Geant4.9.6p2/TOPAS. A workflow was devised for calculating plans from a commercially available treatment planning system, with scripts for reading DICOM files and generating inputs for our MC. Results: Yields, energy and angular distributions of secondaries from NE collisions on various nuclei are in good agreement with the Geant4.9.6p2 Bertini and Binary cascade models. The 3D-gamma pass rate at 2%–2 mm for 70–230 MeV pencil-beam dose distributions in water, soft tissue, bone and Ti phantoms is 100%. The pass rate at 2%–2 mm for treatment plan calculations is typically above 98%. The net computational time on a NVIDIA GTX680 card, including all CPU-GPU data transfers, is around 20 s for 1×10⁷ proton histories. Conclusion: Our GPU-based proton transport MC is the first of its kind to include a detailed nuclear model to handle NE interactions on any nucleus. Dosimetric calculations demonstrate very good agreement with Geant4.9.6p2/TOPAS. Our MC is being integrated into a framework to perform fast routine clinical QA of pencil

  13. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in calculation results have been reported. The major causes are the implementation of the physics models, the preset value of the ionization potential, or the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed such optimized lists, but those studies were performed with a simple system, such as a water phantom alone. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influence of the customized parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter lists showed different characteristics from the results obtained with the simple system. This led to the conclusion that the physics models, particle transport mechanics and geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.
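
    A minimal sketch of the comparison metric used above: extracting the distal 80% range (R80) from a percentage depth dose curve and reporting the range difference between two codes. The depth-dose curves below are synthetic, not GATE/PHITS/FLUKA output.

```python
import numpy as np

def distal_r80(depth_mm, pdd):
    """Depth at which the dose falls to 80% of the maximum on the distal side."""
    i_max = int(np.argmax(pdd))
    distal_d, distal_z = pdd[i_max:], depth_mm[i_max:]
    # walk down the distal edge and interpolate the 80% crossing
    below = np.where(distal_d <= 0.8 * pdd[i_max])[0]
    j = below[0]
    z1, z2 = distal_z[j - 1], distal_z[j]
    d1, d2 = distal_d[j - 1], distal_d[j]
    return z1 + (0.8 * pdd[i_max] - d1) * (z2 - z1) / (d2 - d1)

z = np.linspace(0, 200, 401)
pdd_ref  = np.exp(-((z - 150) / 6.0) ** 2) + 0.25 * (z < 150)   # toy reference curve
pdd_test = np.exp(-((z - 151) / 6.0) ** 2) + 0.25 * (z < 151)   # toy curve to compare
print(f"range difference: {distal_r80(z, pdd_test) - distal_r80(z, pdd_ref):.2f} mm")
```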

  14. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in calculation results have been reported. The major causes are the implementation of the physics models, the preset value of the ionization potential, or the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed such optimized lists, but those studies were performed with a simple system, such as a water phantom alone. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influence of the customized parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter lists showed different characteristics from the results obtained with the simple system. This led to the conclusion that the physics models, particle transport mechanics and geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.

  15. TH-A-19A-11: Validation of GPU-Based Monte Carlo Code (gPMC) Versus Fully Implemented Monte Carlo Code (TOPAS) for Proton Radiation Therapy: Clinical Cases Study

    International Nuclear Information System (INIS)

    Purpose: For proton radiation therapy, Monte Carlo simulation (MCS) methods are recognized as the gold-standard dose calculation approach. Although previously unrealistic due to limitations in available computing power, GPU-based applications allow MCS of proton treatment fields to be performed in routine clinical use, on time scales comparable to that of conventional pencil-beam algorithms. This study focuses on validating the results of our GPU-based code (gPMC) versus the fully implemented proton therapy based MCS code (TOPAS) for clinical patient cases. Methods: Two treatment sites were selected to provide clinical cases for this study: head-and-neck cases due to anatomical geometrical complexity (air cavities and density heterogeneities), making dose calculation very challenging, and prostate cases due to higher proton energies used and close proximity of the treatment target to sensitive organs at risk. Both gPMC and TOPAS methods were used to calculate 3-dimensional dose distributions for all patients in this study. Comparisons were performed based on target coverage indices (mean dose, V90 and D90) and gamma index distributions for 2% of the prescription dose and 2 mm. Results: For seven out of eight studied cases, mean target dose, V90 and D90 differed by less than 2% between TOPAS and gPMC dose distributions. Gamma index analysis for all prostate patients resulted in a passing rate of more than 99% of voxels in the target. Four out of five head-and-neck cases showed a gamma index passing rate for the target of more than 99%, with the fifth having a passing rate of 93%. Conclusion: Our current work showed excellent agreement between our GPU-based MCS code and the fully implemented proton therapy based MC code for a group of dosimetrically challenging patient cases.
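
    A minimal sketch of the target coverage indices used in the comparison (mean dose, V90, D90) computed from an array of per-voxel target doses; the dose values and prescription are synthetic.

```python
import numpy as np

def dvh_metrics(target_dose, prescription):
    """Mean dose, V90 (fraction of target receiving >= 90% of prescription),
    and D90 (dose received by at least 90% of the target volume)."""
    mean_dose = target_dose.mean()
    v90 = np.mean(target_dose >= 0.9 * prescription)
    d90 = np.percentile(target_dose, 10)   # 90% of voxels get at least this dose
    return mean_dose, v90, d90

rng = np.random.default_rng(2)
dose = rng.normal(loc=70.0, scale=1.5, size=50000)   # toy target voxel doses [Gy]
m, v90, d90 = dvh_metrics(dose, prescription=70.0)
print(f"mean = {m:.2f} Gy, V90 = {100*v90:.1f}%, D90 = {d90:.2f} Gy")
```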

  16. A Monte Carlo dose calculation algorithm for proton therapy

    International Nuclear Information System (INIS)

    A Monte Carlo (MC) code (VMCpro) for treatment planning in proton beam therapy of cancer is introduced. It is based on ideas of the Voxel Monte Carlo algorithm for photons and electrons and is applicable to human tissue for clinical proton energies. In the present paper the implementation of electromagnetic and nuclear interactions is described. They are modeled by a Class II condensed history algorithm with continuous energy loss, ionization, multiple scattering, range straggling, δ-electron transport, nuclear elastic proton nucleus scattering and inelastic proton nucleus reactions. VMCpro is faster than the general purpose MC codes FLUKA by a factor of 13 and GEANT4 by a factor of 35 for simulations in a phantom with inhomogeneities. For dose calculations in patients the speed improvement is larger, because VMCpro has only a weak dependency on the heterogeneity of the calculation grid. Dose distributions produced with VMCpro are in agreement with GEANT4 results. Integrated or broad beam depth dose curves show maximum deviations not larger than 1% or 0.5 mm in regions with large dose gradients for the examples presented here
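
    To make the condensed-history idea concrete, here is a minimal sketch of one step with continuous (CSDA) energy loss and a multiple-scattering deflection sampled from the Highland formula; the constant stopping power and water radiation length are rough illustrative values, not VMCpro's data, and straggling and nuclear interactions are omitted.

```python
import numpy as np

M_P = 938.272          # proton rest mass [MeV]
X0_WATER = 36.08       # radiation length of water [cm], approximate

def highland_sigma(T_MeV, step_cm, X0_cm=X0_WATER):
    """Highland multiple-scattering angle (rad) for a proton of kinetic energy T."""
    pc = np.sqrt(T_MeV * (T_MeV + 2.0 * M_P))        # momentum * c [MeV]
    beta_pc = pc * pc / (T_MeV + M_P)                # beta * p * c [MeV]
    t = step_cm / X0_cm
    return 13.6 / beta_pc * np.sqrt(t) * (1.0 + 0.038 * np.log(t))

def condensed_history_step(T_MeV, step_cm, rng, stopping_power_MeV_per_cm=8.0):
    """One simplified condensed-history step: CSDA energy loss + sampled deflection.

    The constant stopping power is a placeholder; a real code looks it up per energy
    and material, and adds straggling and nuclear interactions on top.
    """
    T_new = max(T_MeV - stopping_power_MeV_per_cm * step_cm, 0.0)
    theta = rng.normal(0.0, highland_sigma(T_MeV, step_cm))
    return T_new, theta

rng = np.random.default_rng(3)
print(condensed_history_step(150.0, 0.5, rng))
```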

  17. Development of an analysis software for comparison between proton treatment planning system and Monte Carlo simulation

    International Nuclear Information System (INIS)

    Currently, many proton therapy facilities are used for cancer radiotherapy. The main advantage of proton therapy is the absence of exit dose, which offers a highly conformal dose to the treatment target as well as better normal organ sparing. Most treatment planning systems (TPSs) in proton therapy calculate the dose distribution using a pencil beam algorithm (PBA). The PBA is suitable for clinical proton therapy because of its fast computation time. However, the PBA shows accuracy limitations, mainly because of the one-dimensional density scaling of proton pencil beams in water. Recently, we developed Monte Carlo simulation tools for the design of the proton therapy facility at the National Cancer Center (NCC) using the GEANT4 toolkit (version GEANT4.9.2p02). Monte Carlo simulation is expected to reproduce the precise influence of complex geometries and material varieties, which is difficult to introduce into the PBA. The data format of the Monte Carlo simulation results differs from DICOM-RT. Consequently, we need analysis software for comparing the TPS and the Monte Carlo simulation. The main objective of this research is to develop an analysis toolkit for verifying the precision and accuracy of the proton treatment planning system and for analyzing the dose calculation algorithm of proton therapy using Monte Carlo simulation. In this work, we developed analysis software for GEANT4-based medical applications. This toolkit is capable of evaluating the accuracy of the dose calculated by the TPS against Monte Carlo simulation.

  18. Development of an analysis software for comparison between proton treatment planning system and Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae Hyun; Suh, Tae Suk [Dept. of Biomedical Engineering, College of Medicine, The Catholic University of Korea, Seoul (Korea, Republic of); Park, Sey Joon; Yoo, Seung Hoon; Lee, Se Byeong [Proton Therapy Center, National Cancer Center, Goyang (Korea, Republic of); Shin, Jung Wook [Dept. of Radiation Oncology, University of California, SanFrancisco (United States)

    2011-11-15

    Currently, many proton therapy facilities are used for cancer radiotherapy. The main advantage of proton therapy is the absence of exit dose, which offers a highly conformal dose to the treatment target as well as better normal organ sparing. Most treatment planning systems (TPSs) in proton therapy calculate the dose distribution using a pencil beam algorithm (PBA). The PBA is suitable for clinical proton therapy because of its fast computation time. However, the PBA shows accuracy limitations, mainly because of the one-dimensional density scaling of proton pencil beams in water. Recently, we developed Monte Carlo simulation tools for the design of the proton therapy facility at the National Cancer Center (NCC) using the GEANT4 toolkit (version GEANT4.9.2p02). Monte Carlo simulation is expected to reproduce the precise influence of complex geometries and material varieties, which is difficult to introduce into the PBA. The data format of the Monte Carlo simulation results differs from DICOM-RT. Consequently, we need analysis software for comparing the TPS and the Monte Carlo simulation. The main objective of this research is to develop an analysis toolkit for verifying the precision and accuracy of the proton treatment planning system and for analyzing the dose calculation algorithm of proton therapy using Monte Carlo simulation. In this work, we developed analysis software for GEANT4-based medical applications. This toolkit is capable of evaluating the accuracy of the dose calculated by the TPS against Monte Carlo simulation.

  19. Scintillating fiber based in-vivo dose monitoring system to the rectum in proton therapy of prostate cancer: A Geant4 Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Biniam Yohannes Tesfamicael

    2014-03-01

    Full Text Available Purpose: To construct a dose monitoring system based on an endorectal balloon coupled to thin scintillating fibers to study the dose to the rectum in proton therapy of prostate cancer. Method: The Geant4 Monte Carlo toolkit was used to simulate the proton therapy of prostate cancer, with an endorectal balloon and a set of scintillating fibers for immobilization and dosimetry measurements, respectively. Results: A linear response of the fibers to the dose delivered was observed to within less than 2%. Results obtained show that fibers close to the prostate recorded a higher dose, with the closest fiber recording about one-third of the dose to the target. A 1/r² decrease (r is defined as the center-to-center distance between the prostate and the fibers) was observed as one goes toward the frontal and distal regions. A very low dose was recorded by the fibers beneath the balloon, which is a clear indication that the overall volume of the rectal wall exposed to a higher dose is relatively minimized. Further analysis showed a relatively linear relationship between the dose to the target and the dose to the top fibers (17 in total), with a slope of (-0.07 ± 0.07) at a large number of events per degree of rotation of the modulator wheel (i.e., dose). Conclusion: Thin (1 mm × 1 mm), long (1 m) scintillating fibers were found to be ideal for real-time in-vivo dose measurement to the rectum during proton therapy of prostate cancer. The linear response of the fibers to the dose delivered makes them good candidates as dosimeters. With thorough calibration and the ability to define a good correlation between the dose to the target and the dose to the fibers, such dosimeters can be used for real-time dose verification to the target. ----------------------------------- Cite this article as: Tesfamicael BY, Avery S, Gueye P, Lyons D, Mahesh M. Scintillating fiber based in-vivo dose monitoring system to the rectum in proton therapy of prostate cancer: A Geant4 Monte Carlo
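
    The reported 1/r² dependence can be illustrated with a short fit of D(r) = k/r² on log-log axes; the distance-dose points below are synthetic numbers chosen to follow that law, not the simulated fiber data.

```python
import numpy as np

# Toy (distance [cm], relative dose) points following an approximate 1/r^2 falloff
r = np.array([2.0, 3.0, 4.0, 6.0, 8.0])
dose = np.array([0.33, 0.15, 0.085, 0.037, 0.021])

# Fit D = k / r^2  <=>  log D = log k - 2 log r ; check that the slope is close to -2
slope, log_k = np.polyfit(np.log(r), np.log(dose), 1)
print(f"fitted exponent: {slope:.2f} (expect ~ -2), k = {np.exp(log_k):.2f}")
```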

  20. Benchmarking of proton transport in Super Monte Carlo simulation program

    International Nuclear Information System (INIS)

    Full text of the publication follows. The Monte Carlo (MC) method has been traditionally applied in nuclear design and analysis due to its capability of dealing with complicated geometries and multi-dimensional physics problems as well as obtaining accurate results. The Super Monte Carlo Simulation Program (SuperMC) is developed by the FDS Team in China for fusion, fission, and other nuclear applications. Simulations of radiation transport, isotope burn-up, material activation, radiation dose, and biological damage can be performed using SuperMC. Complicated geometries and the whole physical process of various types of particles over a broad energy range can be handled well. Bi-directional automatic conversion between general CAD models and full-formed input files of SuperMC is supported by MCAM, which is a CAD/image-based automatic modeling program for neutronics and radiation transport simulation. Mixed visualization of dynamical 3D datasets and geometry models is supported by RVIS, which is a nuclear radiation virtual simulation and assessment system. Continuous-energy cross-section data from the hybrid evaluated nuclear data library HENDL are utilized to support the simulation. Neutronics fixed-source and criticality design parameter calculations for reactors of complex geometry and material distribution, based on the transport of neutrons and photons, were achieved in our former version of SuperMC. Recently, proton transport has also been integrated into SuperMC in the energy region up to 10 GeV. The physical processes considered for proton transport include electromagnetic processes and hadronic processes. The electromagnetic processes include ionization, multiple scattering, bremsstrahlung, and pair production processes. Public evaluated data from HENDL are used in some electromagnetic processes. In hadronic physics, the Bertini intra-nuclear cascade model with excitons, preequilibrium model, nucleus explosion model, fission model, and evaporation model are incorporated to

  1. Clinically applicable Monte Carlo-based biological dose optimization for the treatment of head and neck cancers with spot-scanning proton therapy

    CERN Document Server

    Tseung, H Wan Chan; Kreofsky, C R; Ma, D; Beltran, C

    2016-01-01

    Purpose: To demonstrate the feasibility of fast Monte Carlo (MC) based inverse biological planning for the treatment of head and neck tumors in spot-scanning proton therapy. Methods: Recently, a fast and accurate Graphics Processor Unit (GPU)-based MC simulation of proton transport was developed and used as the dose calculation engine in a GPU-accelerated IMPT optimizer. Besides dose, the dose-averaged linear energy transfer (LETd) can be simultaneously scored, which makes biological dose (BD) optimization possible. To convert from LETd to BD, a linear relation was assumed. Using this novel optimizer, inverse biological planning was applied to 4 patients: 2 small and 1 large thyroid tumor targets, and 1 glioma case. To create these plans, constraints were placed to maintain the physical dose (PD) within 1.25 times the prescription while maximizing target BD. For comparison, conventional IMRT and IMPT plans were created for each case in Eclipse (Varian, Inc). The same critical structure PD constraints were use...

  2. TH-A-19A-12: A GPU-Accelerated and Monte Carlo-Based Intensity Modulated Proton Therapy Optimization System

    International Nuclear Information System (INIS)

    Purpose: To develop a clinically applicable intensity modulated proton therapy (IMPT) optimization system that utilizes more accurate Monte Carlo (MC) dose calculation, rather than analytical dose calculation. Methods: A very fast in-house graphics processing unit (GPU) based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC generated influence map, a modified gradient-based optimization method was used to achieve the desired dose volume histograms (DVH). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve the spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitation issues for the large dose influence maps that result from maintaining the intrinsic CT resolution and the large number of proton spots. The dose effects were studied particularly in cases with heterogeneous materials in comparison with the commercial treatment planning system (TPS). Results: For a relatively large and complex three-field bi-lateral head and neck case (i.e. >100K spots with a target volume of ∼1000 cc and multiple surrounding critical structures), the optimization together with the initial MC dose influence map calculation can be done in a clinically viable time frame (i.e. less than 15 minutes) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The DVHs of the MC TPS plan compare favorably with those of a commercial treatment planning system. Conclusion: A GPU accelerated and MC-based IMPT optimization system was developed. The dose calculation and plan optimization can be performed in less than 15 minutes on a hardware system costing less than 45,000 dollars. The fast calculation and optimization makes the system easily expandable to robust and multi-criteria optimization. This work was funded in part by a grant from Varian Medical Systems, Inc.

  3. Experimental validation of a Monte Carlo proton therapy nozzle model incorporating magnetically steered protons

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, S W; Polf, J; Archambault, L; Beddar, S [Department of Radiation Physics, Unit 94, University of Texas M D Anderson Cancer Center, 1515 Holcombe Blvd., Houston, TX 77030 (United States); Bues, M; Ciangaru, G; Smith, A [Proton Therapy Center, Unit 130, University of Texas M D Anderson Cancer Center, 1840 Old Spanish Trail, Houston, TX 77030 (United States)], E-mail: swpeters@mdanderson.org

    2009-05-21

    The purpose of this study is to validate the accuracy of a Monte Carlo calculation model of a proton magnetic beam scanning delivery nozzle developed using the Geant4 toolkit. The Monte Carlo model was used to produce depth dose and lateral profiles, which were compared to data measured in the clinical scanning treatment nozzle at several energies. Comparisons were also made between measured and simulated off-axis profiles to test the accuracy of the model's magnetic steering. Comparison of the 80% distal dose fall-off values for the measured and simulated depth dose profiles agreed to within 1 mm for the beam energies evaluated. Agreement of the full width at half maximum values for the measured and simulated lateral fluence profiles was within 1.3 mm for all energies. The position of measured and simulated spot positions for the magnetically steered beams agreed to within 0.7 mm of each other. Based on these results, we found that the Geant4 Monte Carlo model of the beam scanning nozzle has the ability to accurately predict depth dose profiles, lateral profiles perpendicular to the beam axis and magnetic steering of a proton beam during beam scanning proton therapy.
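
    Two of the profile metrics used in this validation, the FWHM of a lateral profile and the dose-weighted spot position for a steered beam, can be sketched as follows; the Gaussian spot parameters are illustrative, not measured nozzle data.

```python
import numpy as np

def fwhm(x_mm, profile):
    """Full width at half maximum of a lateral profile by linear interpolation."""
    half = 0.5 * profile.max()
    above = np.where(profile >= half)[0]
    i0, i1 = above[0], above[-1]
    # interpolate the two half-maximum crossings
    left = np.interp(half, [profile[i0 - 1], profile[i0]], [x_mm[i0 - 1], x_mm[i0]])
    right = np.interp(half, [profile[i1 + 1], profile[i1]], [x_mm[i1 + 1], x_mm[i1]])
    return right - left

def spot_centroid(x_mm, profile):
    """Dose-weighted spot position, for comparing steered-beam offsets."""
    return np.sum(x_mm * profile) / np.sum(profile)

x = np.linspace(-30, 30, 601)
spot = np.exp(-0.5 * ((x - 5.0) / 4.0) ** 2)     # toy Gaussian spot steered to +5 mm
print(f"FWHM = {fwhm(x, spot):.2f} mm, centroid = {spot_centroid(x, spot):.2f} mm")
```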

  4. Neutron shielding calculations in a proton therapy facility based on Monte Carlo simulations and analytical models: Criterion for selecting the method of choice

    International Nuclear Information System (INIS)

    Proton therapy facilities are shielded to limit the amount of secondary radiation to which patients, occupational workers and members of the general public are exposed. The most commonly applied shielding design methods for proton therapy facilities comprise semi-empirical and analytical methods to estimate the neutron dose equivalent. This study compares the results of these methods with a detailed simulation of a proton therapy facility by using the Monte Carlo technique. A comparison of neutron dose equivalent values predicted by the various methods reveals the superior accuracy of the Monte Carlo predictions in locations where the calculations converge. However, the reliability of the overall shielding design increases if simulation results, for which solutions have not converged, e.g. owing to too few particle histories, can be excluded, and deterministic models are being used at these locations. Criteria to accept or reject Monte Carlo calculations in such complex structures are not well understood. An optimum rejection criterion would allow all converging solutions of Monte Carlo simulation to be taken into account, and reject all solutions with uncertainties larger than the design safety margins. In this study, the optimum rejection criterion of 10% was found. The mean ratio was 26, 62% of all receptor locations showed a ratio between 0.9 and 10, and 92% were between 1 and 100. (authors)

  5. Reconstruction for proton computed tomography by tracing proton trajectories: A Monte Carlo study

    International Nuclear Information System (INIS)

    Proton computed tomography (pCT) has been explored in the past decades because of its unique imaging characteristics, low radiation dose, and its possible use for treatment planning and on-line target localization in proton therapy. However, reconstruction of pCT images is challenging because the proton path within the object to be imaged is statistically affected by multiple Coulomb scattering. In this paper, we employ GEANT4-based Monte Carlo simulations of the two-dimensional pCT reconstruction of an elliptical phantom to investigate the possible use of the algebraic reconstruction technique (ART) with three different path-estimation methods for pCT reconstruction. The first method assumes a straight-line path (SLP) connecting the proton entry and exit positions, the second method adapts the most-likely path (MLP) theoretically determined for a uniform medium, and the third method employs a cubic spline path (CSP). The ART reconstructions showed progressive improvement of spatial resolution when going from the SLP [2 line pairs (lp) cm⁻¹] to the curved CSP and MLP path estimates (5 lp cm⁻¹). The MLP-based ART algorithm had the fastest convergence and smallest residual error of all three estimates. This work demonstrates the advantage of tracking curved proton paths in conjunction with the ART algorithm and curved path estimates
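
    A minimal sketch of the ART (Kaczmarz) update at the core of the reconstruction, applied per proton history with some estimate of the chord lengths of its path through the voxels (straight-line here for brevity; the study found curved MLP/CSP estimates superior); the geometry and measurement form a toy noiseless example.

```python
import numpy as np

def art_update(image, path_weights, wepl_measured, relax=0.1):
    """One ART (Kaczmarz) update for a single proton history.

    image         : flattened RSP image estimate
    path_weights  : chord lengths of this proton's estimated path through each voxel
                    (from a straight-line, cubic-spline, or most-likely-path estimate)
    wepl_measured : measured water-equivalent path length for this proton
    """
    residual = wepl_measured - path_weights @ image
    norm2 = path_weights @ path_weights
    if norm2 > 0:
        image = image + relax * residual / norm2 * path_weights
    return image

# Toy 1D example: 5 voxels of true RSP 1.0, one proton crossing all of them
true_rsp = np.ones(5)
chords = np.full(5, 0.1)                           # cm of path in each voxel
wepl = chords @ true_rsp                           # ideal, noiseless measurement
estimate = np.full(5, 0.8)
for _ in range(200):
    estimate = art_update(estimate, chords, wepl, relax=0.5)
print(estimate)
```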

  6. Proton therapy Monte Carlo SRNA-VOX code

    OpenAIRE

    Ilić Radovan D.

    2012-01-01

    The most powerful feature of the Monte Carlo method is the possibility of simulating all individual particle interactions in three dimensions and performing numerical experiments with a preset error. These facts were the motivation behind the development of a general-purpose Monte Carlo SRNA program for proton transport simulation in technical systems described by standard geometrical forms (plane, sphere, cone, cylinder, cube). Some of the possible applications of the SRNA program are:...

  7. Monte Carlo calculations of positron emitter yields in proton radiotherapy

    International Nuclear Information System (INIS)

    Positron emission tomography (PET) is a promising tool for monitoring the three-dimensional dose distribution in charged particle radiotherapy. PET imaging during or shortly after proton treatment is based on the detection of annihilation photons following the β+-decay of radionuclides resulting from nuclear reactions in the irradiated tissue. Therapy monitoring is achieved by comparing the measured spatial distribution of irradiation-induced β+-activity with the predicted distribution based on the treatment plan. The accuracy of the calculated distribution depends on the correctness of the computational models, implemented in the employed Monte Carlo (MC) codes that describe the interactions of the charged particle beam with matter and the production of radionuclides and secondary particles. However, no well-established theoretical models exist for predicting the nuclear interactions and so phenomenological models are typically used based on parameters derived from experimental data. Unfortunately, the experimental data presently available are insufficient to validate such phenomenological hadronic interaction models. Hence, a comparison among the models used by the different MC packages is desirable. In this work, starting from a common geometry, we compare the performances of MCNPX, GATE and PHITS MC codes in predicting the amount and spatial distribution of proton-induced activity, at therapeutic energies, to the already experimentally validated PET modelling based on the FLUKA MC code. In particular, we show how the amount of β+-emitters produced in tissue-like media depends on the physics model and cross-sectional data used to describe the proton nuclear interactions, thus calling for future experimental campaigns aiming at supporting improvements of MC modelling for clinical application of PET monitoring. (authors)

  8. Monte Carlo comparison of x-ray and proton CT for range calculations of proton therapy beams

    International Nuclear Information System (INIS)

    Proton computed tomography (CT) has been described as a solution for imaging the proton stopping power of patient tissues, therefore reducing the uncertainty of the conversion of x-ray CT images to relative stopping power (RSP) maps and its associated margins. This study aimed to investigate this assertion under the assumption of ideal detection systems. We have developed a Monte Carlo framework to assess proton CT performances for the main steps of a proton therapy treatment planning, i.e. proton or x-ray CT imaging, conversion to RSP maps based on the calibration of a tissue phantom, and proton dose simulations. Irradiations of a computational phantom with pencil beams were simulated on various anatomical sites and the proton range was assessed on the reference, the proton CT-based and the x-ray CT-based material maps. Errors on the tissue’s RSP reconstructed from proton CT were found to be significantly smaller and less dependent on the tissue distribution. The imaging dose was also found to be much more uniform and conformal to the primary beam. The mean absolute deviation for range calculations based on x-ray CT varies from 0.18 to 2.01 mm depending on the localization, while it is smaller than 0.1 mm for proton CT. Under the assumption of a perfect detection system, proton range predictions based on proton CT are therefore both more accurate and more uniform than those based on x-ray CT. (paper)
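
    The link between RSP accuracy and range accuracy discussed above can be shown with a short worked example: the water-equivalent path length is the sum of RSP times step length, so a systematic RSP bias shifts the predicted range proportionally. The bias values below are assumed for illustration, not the study's results.

```python
import numpy as np

# Worked example: range error introduced by a systematic RSP error along the beam.
dz = 0.1                                    # voxel size along the beam [cm]
rsp_true = np.full(200, 1.04)               # 20 cm of soft tissue, true RSP
rsp_xray_ct = rsp_true * 1.015              # assumed +1.5% calibration bias (illustrative)
rsp_proton_ct = rsp_true * 1.001            # assumed +0.1% bias (illustrative)

wepl_true = np.sum(rsp_true * dz)
shift_x = np.sum((rsp_xray_ct - rsp_true) * dz) * 10.0     # mm
shift_p = np.sum((rsp_proton_ct - rsp_true) * dz) * 10.0   # mm
print(f"WEPL = {wepl_true:.1f} cm, x-ray CT range shift = {shift_x:.2f} mm, "
      f"proton CT range shift = {shift_p:.2f} mm")
```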

  9. Proton therapy Monte Carlo SRNA-VOX code

    Directory of Open Access Journals (Sweden)

    Ilić Radovan D.

    2012-01-01

    Full Text Available The most powerful feature of the Monte Carlo method is the possibility of simulating all individual particle interactions in three dimensions and performing numerical experiments with a preset error. These facts were the motivation behind the development of a general-purpose Monte Carlo SRNA program for proton transport simulation in technical systems described by standard geometrical forms (plane, sphere, cone, cylinder, cube). Some of the possible applications of the SRNA program are: (a) a general code for proton transport modeling, (b) design of accelerator-driven systems, (c) simulation of proton scattering and degrading shapes and composition, (d) research on proton detectors; and (e) radiation protection at accelerator installations. This wide range of possible applications of the program demands the development of various versions of SRNA-VOX codes for proton transport modeling in voxelized geometries and has, finally, resulted in the ISTAR package for the calculation of deposited energy distribution in patients on the basis of CT data in radiotherapy. All of the said codes are capable of using 3-D proton sources with an arbitrary energy spectrum in an interval of 100 keV to 250 MeV.

  10. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    Science.gov (United States)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the inherent choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA that was observed for the uniform scanning proton beam needs to be evaluated. This means that the relationship between the effect of input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that the gold standard for setting computational parameters for any proton therapy application cannot be determined consistently since the impact of setting parameters depends on the proton irradiation technique. We

  11. Microdosimetry of the full slowing down of protons using Monte Carlo track structure simulations.

    Science.gov (United States)

    Liamsuwan, T; Uehara, S; Nikjoo, H

    2015-09-01

    The article investigates two approaches in microdosimetric calculations based on Monte Carlo track structure (MCTS) simulations of a 160-MeV proton beam. In the first approach, microdosimetric parameters of the proton beam were obtained using the weighted sum of proton energy distributions and microdosimetric parameters of proton track segments (TSMs). In the second approach, phase spaces of energy depositions obtained using MCTS simulations in the full slowing down (FSD) mode were used for the microdosimetric calculations. Targets of interest were water cylinders of 2.3-100 nm in diameters and heights. Frequency-averaged lineal energies (ȳF) obtained using both approaches agreed within the statistical uncertainties. Discrepancies beyond this level were observed for dose-averaged lineal energies (ȳD) towards the Bragg peak region due to the small number of proton energies used in the TSM approach and different energy deposition patterns in the TSM and FSD of protons. PMID:25904698
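
    A minimal sketch of computing the two quantities compared above, the frequency-averaged (ȳF) and dose-averaged (ȳD) lineal energy, from sampled event lineal energies; the event spectrum is synthetic, not MCTS output.

```python
import numpy as np

def lineal_energy_averages(y_events):
    """Frequency-averaged (yF) and dose-averaged (yD) lineal energy from event samples."""
    y = np.asarray(y_events, dtype=float)
    y_f = y.mean()                       # frequency-mean lineal energy
    y_d = np.sum(y**2) / np.sum(y)       # dose-mean lineal energy
    return y_f, y_d

rng = np.random.default_rng(4)
y_samples = rng.lognormal(mean=1.0, sigma=0.8, size=100000)   # toy event spectrum [keV/um]
y_f, y_d = lineal_energy_averages(y_samples)
print(f"yF = {y_f:.2f} keV/um, yD = {y_d:.2f} keV/um")
```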

  12. Quantum Monte Carlo study of the protonated water dimer

    CERN Document Server

    Dagrada, Mario; Saitta, Antonino M; Sorella, Sandro; Mauri, Francesco

    2013-01-01

    We report an extensive theoretical study of the protonated water dimer (Zundel ion) by means of the highly correlated variational Monte Carlo and lattice regularized Monte Carlo approaches. This system represents the simplest model for proton transfer (PT) and a correct description of its properties is essential in order to understand the PT mechanism in more complex aqueous systems. Our Jastrow correlated AGP wave function ensures an accurate treatment of electron correlations. Exploiting the advantages of contracting the primitive basis set over atomic hybrid orbitals, we are able to limit dramatically the number of variational parameters with a systematic control on the numerical precision, crucial in order to simulate larger systems. We investigate energetics and geometrical properties of the Zundel ion as a function of the oxygen-oxygen distance, taken as reaction coordinate. In both cases, our QMC results are found in excellent agreement with coupled cluster CCSD(T) technique, the quantum chemistry "go...

  13. Clinical implementation of full Monte Carlo dose calculation in proton beam therapy

    Energy Technology Data Exchange (ETDEWEB)

    Paganetti, Harald; Jiang, Hongyu; Parodi, Katia; Slopsema, Roelf; Engelsman, Martijn [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA 02114 (United States)

    2008-09-07

    The goal of this work was to facilitate the clinical use of Monte Carlo proton dose calculation to support routine treatment planning and delivery. The Monte Carlo code Geant4 was used to simulate the treatment head setup, including a time-dependent simulation of modulator wheels (for broad beam modulation) and magnetic field settings (for beam scanning). Any patient-field-specific setup can be modeled according to the treatment control system of the facility. The code was benchmarked against phantom measurements. Using a simulation of the ionization chamber reading in the treatment head allows the Monte Carlo dose to be specified in absolute units (Gy per ionization chamber reading). Next, the capability of reading CT data information was implemented into the Monte Carlo code to model patient anatomy. To allow time-efficient dose calculation, the standard Geant4 tracking algorithm was modified. Finally, a software link of the Monte Carlo dose engine to the patient database and the commercial planning system was established to allow data exchange, thus completing the implementation of the proton Monte Carlo dose calculation engine ('DoC++'). Monte Carlo re-calculated plans are a valuable tool to revisit decisions in the planning process. Identification of clinically significant differences between Monte Carlo and pencil-beam-based dose calculations may also drive improvements of current pencil-beam methods. As an example, four patients (29 fields in total) with tumors in the head and neck regions were analyzed. Differences between the pencil-beam algorithm and Monte Carlo were identified in particular near the end of range, both due to dose degradation and overall differences in range prediction due to bony anatomy in the beam path. Further, the Monte Carlo reports dose-to-tissue as compared to dose-to-water by the planning system. Our implementation is tailored to a specific Monte Carlo code and the treatment planning system XiO (Computerized Medical

  14. Centrality measures highlight proton traps and access points to proton highways in kinetic Monte Carlo trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Krueger, Rachel A. [Department of Chemistry, California Institute of Technology, Pasadena, California 91125 (United States); Haibach, Frederick G. [Confluent Science, Wilbraham, Massachusetts 01095 (United States); Fry, Dana L.; Gomez, Maria A., E-mail: magomez@mtholyoke.edu [Department of Chemistry, Mount Holyoke College, South Hadley, Massachusetts 01075 (United States)

    2015-04-21

    A centrality measure based on the time of first returns rather than the number of steps is developed and applied to finding proton traps and access points to proton highways in the doped perovskite oxides AZr0.875D0.125O3, where A is Ba or Sr and the dopant D is Y or Al. The high centrality region near the dopant is wider in the SrZrO3 systems than the BaZrO3 systems. In the aluminum-doped systems, a region of intermediate centrality (secondary region) is found in a plane away from the dopant. Kinetic Monte Carlo (kMC) trajectories show that this secondary region is an entry to fast conduction planes in the aluminum-doped systems in contrast to the highest centrality area near the dopant trap. The yttrium-doped systems do not show this secondary region because the fast conduction routes are in the same plane as the dopant and hence already in the high centrality trapped area. This centrality measure complements kMC by highlighting key areas in trajectories. The limiting activation barriers found via kMC are in very good agreement with experiments and related to the barriers to escape dopant traps.
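
    The idea of ranking sites by first-return times rather than step counts can be illustrated with a short post-processing routine over a kMC trajectory. The sketch below is not the authors' code; the trajectory format, site labels, and the choice of inverse mean first-return time as the centrality score are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): a first-return-time centrality
# estimated from a kinetic Monte Carlo trajectory. A site that is visited
# often and revisited quickly (short first-return times) scores high,
# flagging it as a likely trap or access point.
from collections import defaultdict

def first_return_centrality(trajectory):
    """trajectory: list of (time, site) pairs ordered in time."""
    last_left = {}                # site -> time the walker last left it
    returns = defaultdict(list)   # site -> list of first-return times
    for i in range(1, len(trajectory)):
        _, s_prev = trajectory[i - 1]
        t_now, s_now = trajectory[i]
        if s_now != s_prev:
            last_left[s_prev] = t_now          # walker leaves s_prev at t_now
            if s_now in last_left:
                returns[s_now].append(t_now - last_left.pop(s_now))
    # Shorter mean first-return time -> higher centrality
    return {s: len(ts) / sum(ts) for s, ts in returns.items() if ts}

# Toy trajectory: site "O1" acts as a trap (frequent, fast returns)
traj = [(0.0, "O1"), (0.2, "O2"), (0.3, "O1"), (0.9, "O3"),
        (1.0, "O1"), (1.1, "O2"), (1.3, "O1")]
print(first_return_centrality(traj))
```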

  15. Monte Carlo simulations of a novel Micromegas 2D array for proton dosimetry

    Science.gov (United States)

    Dolney, D.; Ainsley, C.; Hollebeek, R.; Maughan, R.

    2016-02-01

    Modern proton therapy affords control of the delivery of radiotherapeutic dose on fine length and temporal scales. The authors have developed a novel detector technology based on Micromesh Gaseous Structure (Micromegas) that is uniquely tailored for applications using therapeutic proton beams. An implementation of a prototype Micromegas detector in Monte Carlo using Geant4 is presented here. Comparison of simulation results with measurements demonstrates agreement in relative dose along the longitudinal proton dose profile at the 1% level. The effect of a radioactive calibration source embedded in the chamber gas is demonstrated by measurements and reproduced by simulations, also at the 1% level. Our Monte Carlo simulations are shown to reproduce the time structure of ionization pulses produced by a double-scattering delivery system.

  16. A Monte Carlo track structure code for low energy protons

    CERN Document Server

    Endo, S; Nikjoo, H; Uehara, S; Hoshi, M; Ishikawa, M; Shizuma, K

    2002-01-01

    A code is described for the simulation of proton (100 eV to 10 MeV) track structure in water vapor. The code simulates, interaction by interaction, the transport of primary ions and secondary electrons in the form of ionizations and excitations. When a low velocity ion collides with the atoms or molecules of a target, the ion may also capture or lose electrons. The probabilities for these processes are described by cross sections. Although proton track simulation at energies above the Bragg peak (>0.3 MeV) has been achieved to a high degree of precision, simulations at energies near or below the Bragg peak have only been attempted recently because of the lack of relevant cross-section data. As the hydrogen atom has a different ionization cross-section from that of a proton, charge exchange processes need to be considered in order to calculate the stopping power for low energy protons. In this paper, we have used state-of-the-art Monte Carlo track simulation techniques, in conjunction with the pub...
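
    The interaction-by-interaction transport described above boils down to repeatedly sampling a free path from the total cross section and then a channel (ionisation, excitation, electron capture or loss) from the partial cross sections. A minimal sketch of one such step is given below; the cross-section values and molecular density are placeholders, not the tabulated data such a code relies on.

```python
# Minimal sketch (illustrative cross-section values, not tabulated data):
# one transport step of an interaction-by-interaction track-structure code.
# The free path is sampled from the total cross section and the channel
# from the partial cross sections.
import math
import random

def sample_step(partial_xs_cm2, molecules_per_cm3):
    """partial_xs_cm2: {channel: cross section in cm^2 per molecule}."""
    total = sum(partial_xs_cm2.values())
    mean_free_path = 1.0 / (molecules_per_cm3 * total)            # cm
    step = -mean_free_path * math.log(1.0 - random.random())      # sampled free path
    # Choose the interaction channel with probability sigma_i / sigma_total
    r, cumulative = random.random() * total, 0.0
    for channel, sigma in partial_xs_cm2.items():
        cumulative += sigma
        if r <= cumulative:
            break
    return step, channel

# Placeholder cross sections for a low-energy proton in water vapour
xs = {"ionisation": 1.2e-16, "excitation": 4.0e-17, "electron_capture": 8.0e-18}
n_vapour = 1.0e19   # molecules per cm^3 (assumed vapour density)
print(sample_step(xs, n_vapour))
```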

  17. Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries

    Directory of Open Access Journals (Sweden)

    Ilić Radovan D.

    2002-01-01

    This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we will present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of the proton beam characterization by means of the Multi-Layer Faraday Cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice.

  18. Srna-Monte Carlo codes for proton transport simulation in combined and voxelized geometries

    International Nuclear Information System (INIS)

    This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we will present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of the proton beam characterization by means of the Multi-Layer Faraday Cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice. (author)

  19. TOPAS: An innovative proton Monte Carlo platform for research and clinical applications

    Energy Technology Data Exchange (ETDEWEB)

    Perl, J.; Shin, J.; Schuemann, J.; Faddegon, B.; Paganetti, H. [SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); University of California San Francisco Comprehensive Cancer Center, 1600 Divisadero Street, San Francisco, California 94143-1708 (United States); Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States); University of California San Francisco Comprehensive Cancer Center, 1600 Divisadero Street, San Francisco, California 94143-1708 (United States); Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States)

    2012-11-15

    Purpose: While Monte Carlo particle transport has proven useful in many areas (treatment head design, dose calculation, shielding design, and imaging studies) and has been particularly important for proton therapy (due to the conformal dose distributions and a finite beam range in the patient), the available general purpose Monte Carlo codes in proton therapy have been overly complex for most clinical medical physicists. The learning process has large costs not only in time but also in reliability. To address this issue, we developed an innovative proton Monte Carlo platform and tested the tool in a variety of proton therapy applications. Methods: Our approach was to take one of the already-established general purpose Monte Carlo codes and wrap and extend it to create a specialized user-friendly tool for proton therapy. The resulting tool, TOol for PArticle Simulation (TOPAS), should make Monte Carlo simulation more readily available for research and clinical physicists. TOPAS can model a passive scattering or scanning beam treatment head, model a patient geometry based on computed tomography (CT) images, score dose, fluence, etc., save and restart a phase space, provide advanced graphics, and is fully four-dimensional (4D) to handle variations in beam delivery and patient geometry during treatment. A custom-designed TOPAS parameter control system was placed at the heart of the code to meet requirements for ease of use, reliability, and repeatability without sacrificing flexibility. Results: We built and tested the TOPAS code. We have shown that the TOPAS parameter system provides easy yet flexible control over all key simulation areas such as geometry setup, particle source setup, scoring setup, etc. Through design consistency, we have ensured that user experience gained in configuring one component, scorer or filter applies equally well to configuring any other component, scorer or filter. We have incorporated key lessons from safety management, proactively

  20. Nanodosimetric verification in proton therapy: Monte Carlo Codes Comparison

    International Nuclear Information System (INIS)

    Nanodosimetry strives to develop a novel dosimetry concept suitable for advanced modalities of cancer radiotherapy, such as proton therapy. This project aims to evaluate the plausibility of the physical models implemented in the Geant4 Very Low Energy (Geant4-DNA) extensions by comparing nanodosimetric quantities calculated with Geant4-DNA and the PTB Monte Carlo track structure code. Nanodosimetric track structure parameters were calculated for cylindrical targets representing DNA and nucleosome segments and converted into the probability of producing a DSB using the model proposed by Garty et al. [1]. Monoenergetic protons and electrons of energies typical for δ-electron spectra were considered as primary particles. Good agreement was found between the two codes for electrons of energies above 200 eV. Below this energy, Geant4-DNA produced slightly higher numbers of ionisations in the sensitive volumes and higher probabilities for DSB formation. For protons, Geant4-DNA also gave higher numbers of ionisations and DSB probabilities, particularly in the low energy range, while satisfactory agreement was found for energies higher than 1 MeV. Comparing two codes can be useful, as any observed divergence in results between the two codes provides valuable information as to where further consideration of the underlying physical models used in each code may be required. Consistently, it was seen that the largest difference between the codes was in the low energy range for each particle type. (author)

  1. Monte Carlo calculated stopping-power ratios, water/air, for clinical proton dosimetry (50-250 MeV)

    International Nuclear Information System (INIS)

    Calculations of stopping-power ratios, water to air, for the determination of absorbed dose to water in clinical proton beams using ionization chamber measurements have been undertaken using the Monte Carlo method. A computer code to simulate the transport of protons in water (PETRA) has been used to calculate s_w,air data at different degrees of complexity, ranging from values based on primary protons only to data including secondary electrons and high-energy secondary protons produced in nonelastic nuclear collisions. All numerical data are based on ICRU 49 proton stopping powers. Calculations using primary protons have been compared to the simple continuous slowing-down approximation (c.s.d.a.) analytical technique used in proton dosimetry protocols, finding no significant differences that justify elaborate Monte Carlo simulations except beyond the mean range of the protons (the far side of the Bragg peak). The influence of nuclear nonelastic processes, through the detailed generation and transport of secondary protons, on the calculated stopping-power ratios has been found to be negligible. The effect of alpha particles has also been analysed, finding differences smaller than 0.1% from the results excluding them. Discrepancies of up to 0.6% in the plateau region have been found, however, when the production and transport of secondary electrons are taken into account. The large influence of nonelastic nuclear interactions on proton depth-dose distributions shows that the removal of primary protons from the incident beam decreases the peak-to-plateau ratio by a large factor, up to 40% at 250 MeV. It is therefore emphasized that nonelastic nuclear reactions should be included in Monte Carlo simulations of proton beam depth-dose distributions. (author)
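
    The quantity being computed is the Bragg-Gray stopping-power ratio, a fluence-weighted average over the local proton spectrum. The sketch below illustrates the bookkeeping only; the spectrum and stopping-power values are invented stand-ins rather than ICRU 49 data, and the actual calculation is performed in-line during transport rather than from a stored spectrum.

```python
# Minimal sketch (illustrative numbers, not ICRU 49 data): the Bragg-Gray
# water/air stopping-power ratio from a proton fluence spectrum,
#   s_w,air = sum_E Phi(E) S_w(E) / sum_E Phi(E) S_air(E).
import numpy as np

energy_MeV = np.array([50.0, 100.0, 150.0, 200.0, 250.0])
fluence = np.array([0.05, 0.15, 0.30, 0.35, 0.15])      # relative Phi(E), assumed
S_water = np.array([12.45, 7.29, 5.45, 4.49, 3.91])     # MeV cm^2/g (illustrative)
S_air = np.array([11.01, 6.45, 4.82, 3.97, 3.46])       # MeV cm^2/g (illustrative)

s_w_air = np.sum(fluence * S_water) / np.sum(fluence * S_air)
print(f"fluence-weighted s_w,air = {s_w_air:.4f}")
```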

  2. SU-E-T-289: Scintillating Fiber Based In-Vivo Dose Monitoring System to the Rectum in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Purpose: To construct a dose monitoring system based on an endorectal balloon coupled to thin scintillating fibers to study the dose delivered to the rectum during prostate cancer proton therapy. Methods: The Geant4 Monte Carlo toolkit version 9.6p02 was used to simulate prostate cancer proton therapy treatments of an endorectal balloon (for immobilization of a 2.9 cm diameter prostate gland) and a set of 34 scintillating fibers symmetrically placed around the balloon and perpendicular to the proton beam direction (for dosimetry measurements). Results: A linear response of the fibers to the dose delivered was observed within <2%, a property that makes them good candidates for real time dosimetry. Results obtained show that the closest fiber recorded about 1/3 of the dose to the target, with a 1/r² decrease in the dose distribution as one goes toward the frontal and distal top fibers. A very low dose was recorded by the bottom fibers (about 45 times lower), a clear indication that the overall volume of the rectal wall exposed to a higher dose is relatively minimized. Further analysis indicated a simple scaling relationship between the dose to the prostate and the dose to the top fibers (a linear fit gave a slope of −0.07±0.07 MeV per treatment Gy). Conclusion: Thin (1 mm × 1 mm × 100 cm) long scintillating fibers were found to be ideal for real time in-vivo dose measurement to the rectum during prostate cancer proton therapy. The linear response of the fibers to the dose delivered makes them good candidates for dosimeters. With thorough calibration and the ability to define a good correlation between the dose to the target and the dose to the fibers, such dosimeters can be used for real time dose verification to the target.

  3. Monte Carlo Calculations Supporting Patient Plan Verification in Proton Therapy.

    Science.gov (United States)

    Lima, Thiago V M; Dosanjh, Manjit; Ferrari, Alfredo; Molineli, Silvia; Ciocca, Mario; Mairani, Andrea

    2016-01-01

    Patient treatment plan verification covers a substantial amount of the quality assurance (QA) resources; this is especially true for Intensity-Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed, and several methods have been proposed. In this paper, we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalyzed previously published data (Molinelli et al. (1)), where 9 patient plans were investigated in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modeling (Treatment Planning System (TPS) vs. MC), limitations of the dose delivery system, or detector mispositioning was originally explored, but other factors, such as the geometric description of the detectors, were not ruled out. For the purpose of this work, we compared ionization chamber measurements with different MC simulation results. We also studied some physical effects introduced by this new approach, for example, inter-detector interference and the delta-ray thresholds. The simulations accounting for a detailed geometry are typically superior (statistical difference - p-value around 0.01) to most of the MC simulations used at CNAO (only inferior to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases. But in other cases, position uncertainty represents the dominant uncertainty. The inter-chamber disturbance was not detected for the therapeutic proton energies, and the results from the current delta threshold

  4. Monte Carlo calculations supporting patient plan verification in proton therapy

    Directory of Open Access Journals (Sweden)

    Thiago Viana Miranda Lima

    2016-03-01

    Patient treatment plan verification covers a substantial amount of the quality assurance (QA) resources; this is especially true for Intensity Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed and several methods have been proposed. In this paper we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalysed the previously published data (Molinelli et al. 2013), where 9 patient plans were investigated in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modelling (Treatment Planning System (TPS) vs MC), limitations of the dose delivery system or detector mispositioning was originally explored, but other factors such as the geometric description of the detectors were not ruled out. For the purpose of this work we compared ionisation-chamber measurements with different MC simulation results. We also studied some physical effects introduced by this new approach, for example inter-detector interference and the delta-ray thresholds. The simulations accounting for a detailed geometry are typically superior (statistical difference - p-value around 0.01) to most of the MC simulations used at CNAO (only inferior to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV) and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases. But in other cases position uncertainty represents the dominant uncertainty. The inter-chamber disturbance was not detected for the therapeutic proton energies and the results from the current delta threshold are

  5. Monte Carlo Calculations Supporting Patient Plan Verification in Proton Therapy

    Science.gov (United States)

    Lima, Thiago V. M.; Dosanjh, Manjit; Ferrari, Alfredo; Molineli, Silvia; Ciocca, Mario; Mairani, Andrea

    2016-01-01

    Patient treatment plan verification covers a substantial amount of the quality assurance (QA) resources; this is especially true for Intensity-Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed, and several methods have been proposed. In this paper, we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalyzed previously published data (Molinelli et al. (1)), where 9 patient plans were investigated in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modeling (Treatment Planning System (TPS) vs. MC), limitations of the dose delivery system, or detector mispositioning was originally explored, but other factors, such as the geometric description of the detectors, were not ruled out. For the purpose of this work, we compared ionization chamber measurements with different MC simulation results. We also studied some physical effects introduced by this new approach, for example, inter-detector interference and the delta-ray thresholds. The simulations accounting for a detailed geometry are typically superior (statistical difference - p-value around 0.01) to most of the MC simulations used at CNAO (only inferior to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases. But in other cases, position uncertainty represents the dominant uncertainty. The inter-chamber disturbance was not detected for the therapeutic proton energies, and the results from the current delta

  6. SU-E-T-239: Monte Carlo Modelling of SMC Proton Nozzles Using TOPAS

    International Nuclear Information System (INIS)

    Purpose: To expedite and cross-check the commissioning of the proton therapy nozzles at Samsung Medical Center using TOPAS. Methods: We have two different types of nozzles at Samsung Medical Center (SMC), a multi-purpose nozzle and a pencil beam scanning dedicated nozzle. Both nozzles have been modelled in Monte Carlo simulation using TOPAS, based on the vendor-provided geometry. The multi-purpose nozzle is mainly composed of wobbling magnets, scatterers, ridge filters and multi-leaf collimators (MLC). Including patient specific apertures and compensators, all the parts of the nozzle have been implemented in TOPAS following the geometry information from the vendor. The dedicated scanning nozzle has a simpler structure than the multi-purpose nozzle, with a vacuum pipe at the downstream end of the nozzle. A simple water tank volume has been implemented to measure the dosimetric characteristics of proton beams from the nozzles. Results: We have simulated the two proton beam nozzles at SMC. Two different ridge filters have been tested for the spread-out Bragg peak (SOBP) generation in wobbling mode in the multi-purpose nozzle. The spot sizes and lateral penumbra in the two nozzles have been simulated and analyzed using a double Gaussian model. Using parallel geometry, both the depth dose curve and dose profile have been measured simultaneously. Conclusion: The proton therapy nozzles at SMC have been successfully modelled in Monte Carlo simulation using TOPAS. We will perform a validation with measured base data and then use the MC simulation to interpolate/extrapolate the measured data. We believe it will expedite the commissioning process of the proton therapy nozzles at SMC.
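
    The double Gaussian spot analysis mentioned above amounts to fitting a narrow core plus a wide, low-amplitude halo to each scored lateral profile. A minimal sketch with synthetic data is shown below; the profile shape, noise level and starting parameters are assumptions for illustration only.

```python
# Minimal sketch (synthetic data, hypothetical parameters): fitting a lateral
# spot profile with a double Gaussian, i.e. a narrow primary core plus a wide
# low-amplitude halo, as is commonly done for scanned proton spots.
import numpy as np
from scipy.optimize import curve_fit

def double_gaussian(x, a1, sigma1, a2, sigma2):
    return (a1 * np.exp(-0.5 * (x / sigma1) ** 2)
            + a2 * np.exp(-0.5 * (x / sigma2) ** 2))

x_mm = np.linspace(-40.0, 40.0, 161)
profile = double_gaussian(x_mm, 1.0, 4.0, 0.03, 15.0)      # synthetic "scored" profile
profile += np.random.normal(0.0, 0.002, x_mm.size)         # statistical noise

popt, _ = curve_fit(double_gaussian, x_mm, profile, p0=[1.0, 5.0, 0.05, 20.0])
print("core sigma = %.2f mm, halo sigma = %.2f mm" % (popt[1], popt[3]))
```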

  7. Srna-Monte Carlo codes for proton transport simulation in combined and voxelized geometries

    CERN Document Server

    Ilic, R D; Stankovic, S J

    2002-01-01

    This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we will present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtaine...

  8. Monte Carlo simulation in proton computed tomography: a study of image reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Inocente, Guilherme Franco; Stenico, Gabriela V.; Hormaza, Joel Mesa [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Inst. de Biociencias. Dept. de Fisica e Biofisica

    2012-07-01

    Radiation therapy is one of the most widely used methods for cancer treatment. In this context, therapy with proton beams has emerged as an alternative to conventional radiotherapy. Proton therapy offers advantages to the treated patient when compared with more conventional methods: the dose deposited along the path, especially in healthy tissues neighbouring the tumor, is smaller, and the accuracy of the treatment is much better. To carry out the treatment, the patient undergoes planning based on images for visualization and localization of the target volume. The main method for obtaining these images is X-ray computed tomography (XCT). For treatment with proton beams, this imaging technique can generate some uncertainties. The purpose of this project is to study the feasibility of reconstructing images generated from irradiation with proton beams, thereby reducing some of these inaccuracies, since the imaging radiation would be of the same type as that used in treatment, and also drastically reducing localization errors, since the planning can be done at the same place, and just before, where the patient is treated. This study aims to obtain a relationship between the intrinsic properties of the interaction of photons and of protons with matter. For this we use computational simulation based on the Monte Carlo method, with the codes SRIM 2008 and MCNPX v.2.5.0, to reconstruct images using the technique used in conventional computed tomography. (author)

  9. Monte Carlo and Analytical Calculation of Lateral Deflection of Proton Beams in Homogeneous Targets

    International Nuclear Information System (INIS)

    Proton radiation therapy is a precise form of radiation therapy, but the avoidance of damage to critical normal tissues and the prevention of geographical tumor misses require accurate knowledge of the dose delivered to the patient, and verification of the patient's position demands a precise imaging technique. In proton therapy facilities, X-ray computed tomography (xCT) is the preferred technique for treatment planning. This situation has been changing with the development of proton accelerators for health care and the increase in the number of treated patients; in fact, protons could be more efficient than xCT for this task. One essential difficulty in pCT image reconstruction systems comes from the scattering of the protons inside the target due to the numerous small-angle deflections by nuclear Coulomb fields. The purpose of this study is the comparison of an analytical formulation for the determination of beam lateral deflection, based on Moliere's theory and Rutherford scattering, with Monte Carlo calculations by the SRIM 2008 and MCNPX codes.
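
    As a rough stand-in for the Moliere-based analytical model studied in the paper, the Highland parameterisation gives the flavour of how the lateral spread scales with momentum, velocity and slab thickness. The sketch below uses the 14.1 MeV form of Highland's formula and a crude thin-slab estimate of the lateral displacement; all numerical inputs are illustrative.

```python
# Minimal sketch: Highland's parameterisation of the multiple Coulomb
# scattering angle, used here as a simple stand-in for the Moliere-based
# analytical model; the numerical inputs are illustrative only.
import math

def highland_theta0(p_MeV_c, beta, thickness_cm, rad_length_cm):
    """RMS projected scattering angle (rad) after a slab of material."""
    t = thickness_cm / rad_length_cm          # thickness in radiation lengths
    return 14.1 / (p_MeV_c * beta) * math.sqrt(t) * (1.0 + math.log10(t) / 9.0)

# 200 MeV proton (p ~ 644 MeV/c, beta ~ 0.57) through 10 cm of water (X0 ~ 36.1 cm)
theta0 = highland_theta0(644.0, 0.566, 10.0, 36.1)
depth_mm = 100.0
# Crude thin-slab estimate: lateral spread ~ theta0 * depth / sqrt(3)
print(f"theta0 = {theta0 * 1e3:.1f} mrad, sigma_x ~ {theta0 * depth_mm / math.sqrt(3):.2f} mm")
```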

  10. Comparison of linear energy transfer scoring techniques in Monte Carlo simulations of proton beams

    International Nuclear Information System (INIS)

    Monte Carlo (MC) simulations are commonly used to study linear energy transfer (LET) distributions in therapeutic proton beams. Various techniques have been used to score LET in MC simulations. The goal of this work was to compare LET distributions obtained using different LET scoring techniques and examine the sensitivity of these distributions to changes in commonly adjusted simulation parameters. We used three different techniques to score average proton LET in TOPAS, which is a MC platform based on the Geant4 simulation toolkit. We determined the sensitivity of each scoring technique to variations in the range production thresholds for secondary electrons and protons. We also compared the depth-LET distributions that we acquired using each technique in a simple monoenergetic proton beam and in a more clinically relevant modulated proton therapy beam. Distributions of both fluence-averaged LET (LETΦ) and dose-averaged LET (LETD) were studied. We found that LETD values varied more between different scoring techniques than the LETΦ values did, and different LET scoring techniques showed different sensitivities to changes in simulation parameters. (note)
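
    The two averaging schemes compared in such studies differ only in the weight given to each simulated step: track length for the fluence-averaged LET and deposited energy for the dose-averaged LET. A minimal sketch with hypothetical step data is shown below.

```python
# Minimal sketch (synthetic step data): fluence-averaged and dose-averaged LET
# from a list of simulated step segments, using the usual definitions
#   LET_Phi = sum(l_i * L_i) / sum(l_i),   LET_D = sum(e_i * L_i) / sum(e_i),
# with L_i = e_i / l_i the restricted stopping of step i.
import numpy as np

step_length_um = np.array([1.0, 1.0, 0.8, 0.5, 0.2])    # hypothetical step lengths
energy_dep_keV = np.array([0.5, 0.7, 0.9, 1.5, 2.0])    # hypothetical energy deposits

let = energy_dep_keV / step_length_um                    # keV/um per step
let_fluence = np.sum(step_length_um * let) / np.sum(step_length_um)
let_dose = np.sum(energy_dep_keV * let) / np.sum(energy_dep_keV)
print(f"LET_Phi = {let_fluence:.2f} keV/um, LET_D = {let_dose:.2f} keV/um")
```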

  11. The Monte Carlo SRNA-VOX code for 3D proton dose distribution in voxelized geometry using CT data

    Energy Technology Data Exchange (ETDEWEB)

    Ilic, Radovan D [Laboratory of Physics (010), Vinca Institute of Nuclear Sciences, PO Box 522, 11001 Belgrade (Serbia and Montenegro); Spasic-Jokic, Vesna [Laboratory of Physics (010), Vinca Institute of Nuclear Sciences, PO Box 522, 11001 Belgrade (Serbia and Montenegro); Belicev, Petar [Laboratory of Physics (010), Vinca Institute of Nuclear Sciences, PO Box 522, 11001 Belgrade (Serbia and Montenegro); Dragovic, Milos [Center for Nuclear Medicine MEDICA NUCLEARE, Bulevar Despota Stefana 69, 11000 Belgrade (Serbia and Montenegro)

    2005-03-07

    This paper describes the application of the SRNA Monte Carlo package for proton transport simulations in complex geometry and different material compositions. The SRNA package was developed for 3D dose distribution calculation in proton therapy and dosimetry and is based on the theory of multiple scattering. The decay of proton-induced compound nuclei was simulated by the Russian MSDM model and by our own model using ICRU 63 data. The developed package consists of two codes: the SRNA-2KG, which simulates proton transport in combinatorial geometry, and the SRNA-VOX, which uses voxelized geometry based on CT data and conversion of the Hounsfield numbers to tissue elemental composition. Transition probabilities for both codes are prepared by the SRNADAT code. The simulation of the proton beam characterization by a multi-layer Faraday cup, the spatial distribution of positron emitters obtained by the SRNA-2KG code, and an intercomparison of computational codes in radiation dosimetry indicate the immediate applicability of Monte Carlo techniques in clinical practice. In this paper, we briefly present the physical model implemented in the SRNA package, the ISTAR proton dose planning software, as well as the results of numerical experiments with proton beams to obtain 3D dose distributions in the eye and breast tumour.
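
    The Hounsfield-number conversion used by voxelized codes of this kind is essentially a calibration curve mapping CT numbers to mass density, plus a coarse assignment of elemental composition. The sketch below illustrates the idea with invented calibration nodes; it is not the SRNA-VOX conversion.

```python
# Minimal sketch (hypothetical calibration nodes, not a clinical curve):
# piecewise-linear Hounsfield-number-to-density conversion plus a coarse
# mapping to tissue groups, as needed by a voxelized proton MC for CT data.
import numpy as np

hu_nodes = np.array([-1000.0, -100.0, 0.0, 100.0, 1000.0, 3000.0])
rho_nodes = np.array([0.001, 0.93, 1.00, 1.07, 1.60, 2.80])   # g/cm^3

def hu_to_density(hu):
    return np.interp(hu, hu_nodes, rho_nodes)

def hu_to_tissue(hu):
    if hu < -900.0:
        return "air"
    if hu < 100.0:
        return "soft tissue"
    return "bone"

for hu in (-1000, -50, 40, 700):
    print(hu, f"{hu_to_density(hu):.3f} g/cm^3", hu_to_tissue(hu))
```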

  12. The Monte Carlo SRNA-VOX code for 3D proton dose distribution in voxelized geometry using CT data

    Science.gov (United States)

    Ilic, Radovan D.; Spasic-Jokic, Vesna; Belicev, Petar; Dragovic, Milos

    2005-03-01

    This paper describes the application of the SRNA Monte Carlo package for proton transport simulations in complex geometry and different material compositions. The SRNA package was developed for 3D dose distribution calculation in proton therapy and dosimetry and is based on the theory of multiple scattering. The decay of proton-induced compound nuclei was simulated by the Russian MSDM model and by our own model using ICRU 63 data. The developed package consists of two codes: the SRNA-2KG, which simulates proton transport in combinatorial geometry, and the SRNA-VOX, which uses voxelized geometry based on CT data and conversion of the Hounsfield numbers to tissue elemental composition. Transition probabilities for both codes are prepared by the SRNADAT code. The simulation of the proton beam characterization by a multi-layer Faraday cup, the spatial distribution of positron emitters obtained by the SRNA-2KG code, and an intercomparison of computational codes in radiation dosimetry indicate the immediate applicability of Monte Carlo techniques in clinical practice. In this paper, we briefly present the physical model implemented in the SRNA package, the ISTAR proton dose planning software, as well as the results of numerical experiments with proton beams to obtain 3D dose distributions in the eye and breast tumour.

  13. The Monte Carlo SRNA-VOX code for 3D proton dose distribution in voxelized geometry using CT data

    International Nuclear Information System (INIS)

    This paper describes the application of the SRNA Monte Carlo package for proton transport simulations in complex geometry and different material compositions. The SRNA package was developed for 3D dose distribution calculation in proton therapy and dosimetry and is based on the theory of multiple scattering. The decay of proton-induced compound nuclei was simulated by the Russian MSDM model and by our own model using ICRU 63 data. The developed package consists of two codes: the SRNA-2KG, which simulates proton transport in combinatorial geometry, and the SRNA-VOX, which uses voxelized geometry based on CT data and conversion of the Hounsfield numbers to tissue elemental composition. Transition probabilities for both codes are prepared by the SRNADAT code. The simulation of the proton beam characterization by a multi-layer Faraday cup, the spatial distribution of positron emitters obtained by the SRNA-2KG code, and an intercomparison of computational codes in radiation dosimetry indicate the immediate applicability of Monte Carlo techniques in clinical practice. In this paper, we briefly present the physical model implemented in the SRNA package, the ISTAR proton dose planning software, as well as the results of numerical experiments with proton beams to obtain 3D dose distributions in the eye and breast tumour.

  14. The Monte Carlo srna code as the engine in istar proton dose planning software for the tesla accelerator installation

    Directory of Open Access Journals (Sweden)

    Ilić Radovan D.

    2004-01-01

    This paper describes the application of the SRNA Monte Carlo package for proton transport simulations in complex geometry and different material compositions. The SRNA package was developed for 3D dose distribution calculation in proton therapy and dosimetry and is based on the theory of multiple scattering. The compound nucleus decay was simulated by our own model and the Russian MSDM model using ICRU 63 data. The developed package consists of two codes: SRNA-2KG, which simulates proton transport in combinatorial geometry, and SRNA-VOX, which uses voxelized geometry based on CT data and conversion of the Hounsfield numbers to tissue elemental composition. Transition probabilities for both codes are prepared by the SRNADAT code. The simulation of proton beam characterization by a Multi-Layer Faraday Cup, the spatial distribution of positron emitters obtained by the SRNA-2KG code, and an intercomparison of computational codes in radiation dosimetry indicate the immediate applicability of Monte Carlo techniques in clinical practice. In this paper, we briefly present the physical model implemented in the SRNA package, the ISTAR proton dose planning software, as well as the results of numerical experiments with proton beams to obtain 3D dose distributions in the eye and breast tumor.

  15. The Monte Carlo SRNA code as the engine in ISTAR proton dose planning software for the tesla accelerator installation

    International Nuclear Information System (INIS)

    This paper describes the application of the SRNA Monte Carlo package for proton transport simulations in complex geometry and different material compositions. The SRNA package was developed for 3D dose distribution calculation in proton therapy and dosimetry and is based on the theory of multiple scattering. The compound nucleus decay was simulated by our own model and the Russian MSDM model using ICRU 63 data. The developed package consists of two codes: SRNA-2KG, which simulates proton transport in combinatorial geometry, and SRNA-VOX, which uses voxelized geometry based on CT data and conversion of the Hounsfield numbers to tissue elemental composition. Transition probabilities for both codes are prepared by the SRNADAT code. The simulation of proton beam characterization by a Multi-Layer Faraday Cup, the spatial distribution of positron emitters obtained by the SRNA-2KG code, and an intercomparison of computational codes in radiation dosimetry indicate the immediate applicability of Monte Carlo techniques in clinical practice. In this paper, we briefly present the physical model implemented in the SRNA package, the ISTAR proton dose planning software, as well as the results of numerical experiments with proton beams to obtain 3D dose distributions in the eye and breast tumor. (author)

  16. Transmission calculation by empirical numerical model and Monte Carlo simulation in high energy proton radiography of thick objects

    Science.gov (United States)

    Zheng, Na; Xu, Hai-Bo

    2015-10-01

    An empirical numerical model that includes nuclear absorption, multiple Coulomb scattering and energy loss is presented for the calculation of transmission through thick objects in high energy proton radiography. In this numerical model the angular distributions are treated as Gaussians in the laboratory frame. A Monte Carlo program based on the Geant4 toolkit was developed and used for high energy proton radiography experiment simulations and verification of the empirical numerical model. The two models are used to calculate the transmission fraction of carbon and lead step-wedges in proton radiography at 24 GeV/c, and to calculate radial transmission of the French Test Object in proton radiography at 24 GeV/c with different angular cuts. It is shown that the results of the two models agree with each other, and an analysis of the slight differences is given. Supported by NSAF (11176001) and Science and Technology Developing Foundation of China Academy of Engineering Physics (2012A0202006)
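
    In a model of this type, the transmission through a step can be approximated as the product of nuclear survival and the fraction of the (Gaussian) multiple-scattering angular distribution accepted by the collimation cut. The sketch below shows that factorisation with illustrative material constants, not the parameters used in the paper.

```python
# Minimal sketch (illustrative constants): transmission of a high energy
# proton beam through a thick slab as nuclear attenuation times the fraction
# of multiply scattered protons falling inside an angular acceptance cut,
# with the scattering angle treated as a Gaussian in the laboratory frame.
import math

def transmission(thickness_cm, nuclear_mfp_cm, theta0_rad, theta_cut_rad):
    survive_nuclear = math.exp(-thickness_cm / nuclear_mfp_cm)
    # 2D Gaussian angular distribution: P(theta < cut) = 1 - exp(-cut^2 / (2 theta0^2))
    inside_cut = 1.0 - math.exp(-theta_cut_rad ** 2 / (2.0 * theta0_rad ** 2))
    return survive_nuclear * inside_cut

# Illustrative values for a lead step at 24 GeV/c with a few-mrad collimator cut
print(transmission(thickness_cm=5.0, nuclear_mfp_cm=18.0,
                   theta0_rad=2.0e-3, theta_cut_rad=3.0e-3))
```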

  17. Water equivalence of various materials for clinical proton dosimetry by experiment and Monte Carlo simulation

    Science.gov (United States)

    Al-Sulaiti, Leena; Shipley, David; Thomas, Russell; Kacperek, Andrzej; Regan, Patrick; Palmans, Hugo

    2010-07-01

    The accurate conversion of dose measured in various materials used in clinical proton dosimetry to dose-to-water is based on fluence correction factors, accounting for the attenuation of primary protons and the production of secondary particles in non-elastic nuclear interactions. This work aims to investigate the depth dose distribution and the fluence correction with respect to water or graphite at water equivalent depths (WED) in different target materials relevant for dosimetry, such as polymethyl methacrylate (PMMA), graphite, A-150, aluminium and copper, at 60 and 200 MeV. This was done through a comparison between Monte Carlo simulations using MCNPX 2.5.0, analytical model calculations and experimental measurements at the Clatterbridge Centre of Oncology (CCO) in a 60 MeV modulated and un-modulated proton beam. MCNPX simulations indicated small fluence corrections for all materials with respect to graphite and water at 60 and 200 MeV, except for aluminium. The analytical calculations showed an increase of the fluence correction factor to a few percent for all materials with respect to water at 200 MeV. The experimental measurements for the 60 MeV un-modulated beam indicated good agreement with MCNPX. For the modulated beam, the fluence correction factor was found to decrease below unity by up to a few percent with depth for aluminium and copper, but remained almost constant and close to unity for A-150.

  18. Monte Carlo calculation of lateral deflection of proton beams in homogeneous targets

    International Nuclear Information System (INIS)

    Proton radiation therapy is a precise form of radiation therapy, but the avoidance of damage to critical normal tissues and the prevention of geographical tumor misses require accurate knowledge of the dose delivered to the patient, and verification of the patient's position demands a precise imaging technique. In proton therapy facilities, x-ray computed tomography (xCT) is the preferred technique for treatment planning. This situation has been changing with the development of proton accelerators for health care and the increase in the number of treated patients; in fact, protons could be more efficient than xCT for this task. One essential difficulty in pCT image reconstruction systems comes from the scattering of the protons inside the target due to the numerous small-angle deflections by nuclear Coulomb fields: high energy protons are slowed down as they lose their kinetic energy, mainly through ionization of the medium, while undergoing multiple Coulomb scattering (MCS). The purpose of this study is to determine the lateral deflection of a proton pencil beam, in the energy range between 100 MeV and 200 MeV, with Monte Carlo calculations using MCNPX (Monte Carlo N-Particle eXtended) v2.50. (author)

  19. Shielding analysis of proton therapy accelerators: a demonstration using Monte Carlo-generated source terms and attenuation lengths.

    Science.gov (United States)

    Lai, Bo-Lun; Sheu, Rong-Jiun; Lin, Uei-Tyng

    2015-05-01

    Monte Carlo simulations are generally considered the most accurate method for complex accelerator shielding analysis. Simplified models based on the point-source line-of-sight approximation are often preferable in practice because they are intuitive and easy to use. A set of shielding data, including source terms and attenuation lengths for several common targets (iron, graphite, tissue, and copper) and shielding materials (concrete, iron, and lead), was generated by performing Monte Carlo simulations for 100-300 MeV protons. Possible applications and the proper use of the data set were demonstrated through a practical case study, in which a shielding analysis of a typical proton treatment room was conducted. A thorough and consistent comparison between the predictions of our point-source line-of-sight model and those obtained by Monte Carlo simulations for a 360° dose distribution around the room perimeter showed that the data set can yield fairly accurate or conservative estimates of the transmitted doses, except for those near the maze exit. In addition, this study demonstrated that appropriate coupling between the generated source term and empirical formulae for radiation streaming can be used to predict a reasonable dose distribution along the maze. This case study proved the effectiveness and advantage of applying the data set to a quick shielding design and dose evaluation for proton therapy accelerators. PMID:25811254
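
    The point-source line-of-sight model referred to above combines a source term, an exponential attenuation along the slant thickness and inverse-square spreading. A minimal sketch is given below; the source term and attenuation length are assumed numbers, not values from the generated data set.

```python
# Minimal sketch (hypothetical source term and attenuation length): the
# point-source line-of-sight estimate of transmitted dose behind a shield,
#   H = H0(theta) * exp(-d / lambda(theta)) / r^2,
# with d the slant thickness along the line of sight and r the distance
# from the source to the scoring point.
import math

def transmitted_dose(H0, attenuation_length_cm, shield_thickness_cm,
                     distance_m, angle_deg):
    slant = shield_thickness_cm / math.cos(math.radians(angle_deg))
    return H0 * math.exp(-slant / attenuation_length_cm) / distance_m ** 2

# Illustrative: 230 MeV protons on an iron target, concrete shield, 0 degrees
H0 = 2.0e-15   # Sv m^2 per proton at 1 m (assumed source term)
print(transmitted_dose(H0, attenuation_length_cm=45.0,
                       shield_thickness_cm=200.0, distance_m=4.0, angle_deg=0.0))
```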

  20. Computed tomography with a low-intensity proton flux: results of a Monte Carlo simulation study

    Science.gov (United States)

    Schulte, Reinhard W.; Klock, Margio C. L.; Bashkirov, Vladimir; Evseev, Ivan G.; de Assis, Joaquim T.; Yevseyeva, Olga; Lopes, Ricardo T.; Li, Tianfang; Williams, David C.; Wroe, Andrew J.; Schelin, Hugo R.

    2004-10-01

    Conformal proton radiation therapy requires accurate prediction of the Bragg peak position. This problem may be solved by using protons rather than conventional x-rays to determine the relative electron density distribution via proton computed tomography (proton CT). However, proton CT has its own limitations, which need to be carefully studied before this technique can be introduced into routine clinical practice. In this work, we have used analytical relationships as well as the Monte Carlo simulation tool GEANT4 to study the principal resolution limits of proton CT. The GEANT4 simulations were validated by comparing them to predictions of the Bethe Bloch theory and Tschalar's theory of energy loss straggling, and were found to be in good agreement. The relationship between phantom thickness, initial energy, and the relative electron density uncertainty was systematically investigated to estimate the number of protons and dose needed to obtain a given density resolution. The predictions of this study were verified by simulating the performance of a hypothetical proton CT scanner when imaging a cylindrical water phantom with embedded density inhomogeneities. We show that a reasonable density resolution can be achieved with a relatively small number of protons, thus providing a possible dose advantage over x-ray CT.

  1. Monte Carlo study of secondary electron production from gold nanoparticle in proton beam irradiation

    Directory of Open Access Journals (Sweden)

    Jeff Gao

    2014-03-01

    Purpose: In this study, we examined some characteristics of the secondary electrons produced by a gold nanoparticle (NP) during proton beam irradiation. Method: Using the Geant4 Monte Carlo simulation toolkit, we simulated NPs with radii ranging from r = 17.5 nm, 25 nm and 35 nm to r = 50 nm. The proton beam energies used were 20 MeV, 50 MeV, and 100 MeV. Findings on secondary electron production and average kinetic energy are presented in this paper. Results: Firstly, for an NP of finite size, secondary electron production increases with decreasing incident proton beam energy, and a secondary buildup exists outside the NP. Secondly, the average kinetic energy of the secondary electrons produced by a gold NP increases with incident proton beam energy. Thirdly, the larger the NP, the greater the secondary electron production. Conclusion: Collectively, our results suggest that, apart from biological uptake efficiency, the secondary electron production effect should be taken into account when considering the potential use of NPs in proton beam irradiation. Cite this article as: Gao J, Zheng Y. Monte Carlo study of secondary electron production from gold nanoparticle in proton beam irradiation. Int J Cancer Ther Oncol 2014; 2(2):02025. DOI: http://dx.doi.org/10.14319/ijcto.0202.5

  2. The proton therapy nozzles at Samsung Medical Center: A Monte Carlo simulation study using TOPAS

    Science.gov (United States)

    Chung, Kwangzoo; Kim, Jinsung; Kim, Dae-Hyun; Ahn, Sunghwan; Han, Youngyih

    2015-07-01

    To expedite the commissioning process of the proton therapy system at Samsung Medical Center (SMC), we have developed a Monte Carlo simulation model of the proton therapy nozzles by using TOol for PArticle Simulation (TOPAS). At the SMC proton therapy center, we have two gantry rooms with different types of nozzles: a multi-purpose nozzle and a dedicated scanning nozzle. Each nozzle has been modeled in detail following the geometry information provided by the manufacturer, Sumitomo Heavy Industries, Ltd. For this purpose, the novel features of TOPAS, such as the time feature or the ridge filter class, have been used, and the appropriate physics models for proton nozzle simulation have been defined. Dosimetric properties, like the percent depth dose curve, the spread-out Bragg peak (SOBP), and the beam spot size, have been simulated and verified against measured beam data. Beyond the Monte Carlo nozzle modeling, we have developed an interface between TOPAS and the treatment planning system (TPS), RayStation. An exported radiotherapy (RT) plan from the TPS is interpreted by the interface and is then translated into TOPAS input text. The developed Monte Carlo nozzle model can be used to estimate the non-beam performance, such as the neutron background, of the nozzles. Furthermore, the nozzle model can be used to study the mechanical optimization of the nozzle design.

  3. Comparison of some popular Monte Carlo solution for proton transportation within pCT problem

    Energy Technology Data Exchange (ETDEWEB)

    Evseev, Ivan; Assis, Joaquim T. de; Yevseyeva, Olga [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico], E-mail: evseev@iprj.uerj.br, E-mail: joaquim@iprj.uerj.br, E-mail: yevseyeva@iprj.uerj.br; Lopes, Ricardo T.; Cardoso, Jose J.B.; Silva, Ademir X. da [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE). Lab. de Instrumentacao Nuclear], E-mail: ricardo@lin.ufrj.br, E-mail: jjbrum@oi.com.br, E-mail: ademir@con.ufrj.br; Vinagre Filho, Ubirajara M. [Instituto de Engenharia Nuclear IEN/CNEN-RJ, Rio de Janeiro, RJ (Brazil)], E-mail: bira@ien.gov.br; Hormaza, Joel M. [UNESP, Botucatu, SP (Brazil). Inst. de Biociencias], E-mail: jmesa@ibb.unesp.br; Schelin, Hugo R.; Paschuk, Sergei A.; Setti, Joao A.P.; Milhoretto, Edney [Universidade Tecnologica Federal do Parana, Curitiba, PR (Brazil)], E-mail: schelin@cpgei.cefetpr.br, E-mail: sergei@utfpr.edu.br, E-mail: jsetti@gmail.com, E-mail: edneymilhoretto@yahoo.com

    2007-07-01

    The proton transport in matter is described by the Boltzmann kinetic equation for the proton flux density. This equation, however, does not have a general analytical solution. Some approximate analytical solutions have been developed within a number of significant simplifications. Alternatively, Monte Carlo simulations are widely used. The current work is devoted to a discussion of the proton energy spectra obtained by simulation with the SRIM2006, GEANT4 and MCNPX packages. The simulations have been performed considering some further applications of the obtained results in computed tomography with proton beams (pCT). Thus the initial and outgoing proton energies (3-300 MeV), as well as the thickness of the irradiated target (water and aluminum phantoms within 90% of the full range for a given proton beam energy), were considered in the interval of values typical for pCT applications. One of the most interesting results of this comparison is that, while the MCNPX spectra are in good agreement with the analytical description within the Fokker-Planck approximation and the GEANT4 simulated spectra are only slightly shifted from them, the SRIM2006 simulations predict a notably higher mean energy loss for protons. (author)

  4. Monte Carlo evaluation of the Filtered Back Projection method for image reconstruction in proton computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Cirrone, G.A.P., E-mail: cirrone@lns.infn.it [Laboratori Nazionali del Sud - National Instiute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Bucciolini, M. [Department of ' Fisiopatologia Clinica' , University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Bruzzi, M. [Energetic Department, University of Florence, Via S. Marta 3, I-50139 Florence (Italy); Candiano, G. [Laboratorio di Tecnologie Oncologiche HSR, Giglio Contrada, Pietrapollastra-Pisciotto, 90015 Cefalu, Palermo (Italy); Civinini, C. [National Institute for Nuclear Physics INFN, Section of Florence, Via G. Sansone 1, Sesto Fiorentino, I-50019 Florence (Italy); Cuttone, G. [Laboratori Nazionali del Sud - National Instiute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Guarino, P. [Nuclear Engineering Department, University of Palermo, Via... Palermo (Italy); Laboratori Nazionali del Sud - National Instiute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Lo Presti, D. [Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Mazzaglia, S.E. [Laboratori Nazionali del Sud - National Instiute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Pallotta, S. [Department of ' Fisiopatologia Clinica' , University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Randazzo, N. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Sipala, V. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Stancampiano, C. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); and others

    2011-12-01

    In this paper, the use of the Filtered Back Projection (FBP) algorithm to reconstruct tomographic images using high energy (200-250 MeV) proton beams is investigated. The algorithm has been studied in detail with a Monte Carlo approach, and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus, developed by our group, has been fully simulated to exploit the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using the FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, and spatial and density resolution.
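
    The reconstruction step itself is standard filtered back projection. The sketch below exercises it on an x-ray-style toy sinogram using scikit-image (assumed to be available); in an actual pCT study the projections would be water-equivalent path lengths estimated from individual proton histories rather than ideal line integrals.

```python
# Minimal sketch (toy data, not the pCT chain of the paper): filtered back
# projection of a simulated sinogram, using scikit-image's radon/iradon as a
# stand-in for the reconstruction step.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

phantom = rescale(shepp_logan_phantom(), 0.25)            # small test phantom
theta = np.linspace(0.0, 180.0, 90, endpoint=False)       # projection angles
sinogram = radon(phantom, theta=theta)                     # forward projections
reconstruction = iradon(sinogram, theta=theta)              # ramp-filtered back projection

rms_error = np.sqrt(np.mean((reconstruction - phantom) ** 2))
print(f"RMS reconstruction error: {rms_error:.4f}")
```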

  5. Monte Carlo evaluation of the Filtered Back Projection method for image reconstruction in proton computed tomography

    International Nuclear Information System (INIS)

    In this paper, the use of the Filtered Back Projection (FBP) algorithm to reconstruct tomographic images using high energy (200-250 MeV) proton beams is investigated. The algorithm has been studied in detail with a Monte Carlo approach, and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus, developed by our group, has been fully simulated to exploit the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using the FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, and spatial and density resolution.

  6. Epistemic and systematic uncertainties in Monte Carlo simulation: an investigation in proton Bragg peak simulation

    OpenAIRE

    Pia, Maria Grazia (INFN Sezione di Genova); Begalli, Marcia (State University Rio de Janeiro); Lechner, Anton (Vienna University of Technology); Quintieri, Lina (INFN Laboratori Nazionali di Frascati); Saracco, Paolo (INFN Sezione di Genova)

    2014-01-01

    The issue of how epistemic uncertainties affect the outcome of Monte Carlo simulation is discussed by means of a concrete use case: the simulation of the longitudinal energy deposition profile of low energy protons. A variety of electromagnetic and hadronic physics models is investigated, and their effects are analyzed. Possible systematic effects are highlighted. The results identify requirements for experimental measurements capable of reducing epistemic uncertainties in the physics models.

  7. Epistemic and systematic uncertainties in Monte Carlo simulation: an investigation in proton Bragg peak simulation

    CERN Document Server

    Pia, Maria Grazia; Lechner, Anton; Quintieri, Lina; Saracco, Paolo

    2010-01-01

    The issue of how epistemic uncertainties affect the outcome of Monte Carlo simulation is discussed by means of a concrete use case: the simulation of the longitudinal energy deposition profile of low energy protons. A variety of electromagnetic and hadronic physics models is investigated, and their effects are analyzed. Possible systematic effects are highlighted. The results identify requirements for experimental measurements capable of reducing epistemic uncertainties in the physics models.

  8. Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study

    OpenAIRE

    Kim, Jin Sung; Shin, Jung Suk; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih

    2015-01-01

    Two full rotating gantries with different nozzles (a multipurpose nozzle with MLC and a scanning dedicated nozzle), together with a conventional cyclotron system, are installed and under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate the neutron dose equivalent per therapeutic dose, H/D, to x-ray imaging equipment under various treatment conditions with Monte Carlo simulation. At first, we investigated H/D with the various modifications...

  9. Monte-Carlo approach to calculate the proton stopping in warm dense matter within particle-in-cell simulations

    OpenAIRE

    Wu, D; X. T. He; Yu, W.; Fritzsche, S.

    2016-01-01

    A Monte-Carlo approach to proton stopping in warm dense matter is implemented into an existing particle-in-cell code. The model is based on multiple binary collisions among electron-electron, electron-ion and ion-ion pairs, takes into account contributions from both free and bound electrons, and allows particle stopping to be calculated in a much more natural manner. In the low-temperature limit, when "all" electrons are bound to the nucleus, the stopping power converges to the predictions of Bethe-Bloch...
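
    The cold-matter (Bethe-Bloch) limit mentioned above is easy to evaluate directly. The snippet below is a plain Bethe formula without shell, Barkas or density-effect corrections, written here purely as a reference point; it is not the binary-collision model implemented in the paper.

      import numpy as np

      K = 0.307075      # 4*pi*N_A*r_e^2*m_e*c^2, MeV cm^2/mol
      ME_C2 = 0.511     # electron rest energy, MeV
      MP_C2 = 938.272   # proton rest energy, MeV

      def bethe_mass_stopping_power(E_kin_MeV, Z, A, I_MeV):
          """Electronic mass stopping power of a proton, -dE/d(rho*x), in MeV cm^2/g.

          Plain Bethe formula, so it is only indicative above a few MeV.
          """
          gamma = 1.0 + E_kin_MeV / MP_C2
          beta2 = 1.0 - 1.0 / gamma**2
          return K * (Z / A) / beta2 * (np.log(2 * ME_C2 * beta2 * gamma**2 / I_MeV) - beta2)

      # 200 MeV protons in water (Z/A ~ 0.555, I ~ 75 eV): roughly 4.5 MeV cm^2/g
      print(bethe_mass_stopping_power(200.0, Z=10, A=18.015, I_MeV=75e-6))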

  10. The Proton Therapy Nozzles at Samsung Medical Center: A Monte Carlo Simulation Study using TOPAS

    CERN Document Server

    Chung, Kwangzoo; Kim, Dae-Hyun; Ahn, Sunghwan; Han, Youngyih

    2015-01-01

    To expedite the commissioning process of the proton therapy system at Samsung Medical Center (SMC), we have developed a Monte Carlo simulation model of the proton therapy nozzles using TOPAS. At the SMC proton therapy center, we have two gantry rooms with different types of nozzles: a multi-purpose nozzle and a dedicated scanning nozzle. Each nozzle has been modeled in detail following the geometry information provided by the manufacturer, Sumitomo Heavy Industries, Ltd. For this purpose, novel features of TOPAS, such as the time feature or the ridge filter class, have been used, and the appropriate physics models for the proton nozzle simulation were defined. Dosimetric properties, such as percent depth dose curves, spread-out Bragg peaks (SOBP) and beam spot sizes, have been simulated and verified against measured beam data. Beyond the Monte Carlo nozzle modeling, we have developed an interface between TOPAS and the treatment planning system (TPS), RayStation. An RT plan exported from the TPS has been interpreted by th...

  11. Monte Carlo calculated stopping power ratio water/air for clinical proton therapy

    International Nuclear Information System (INIS)

    In order to compute stopping-power ratios water/air for use in clinical proton dosimetry a Monte Carlo code has been developed. The main difference between the present code and other codes for proton transport is the inclusion of the detailed production of secondary electrons along the proton track. For this purpose the code is a Class-II type, where single proton-electron collisions yielding energy losses larger than a specific cut-off are considered individually. Proton multiple scattering is sampled from the complete Molière distribution. To take into account in an approximate way the effect of inelastic nuclear collisions, the fraction of the incident energy that is converted to kinetic energy of charged particles in the interaction is deposited on the spot. The energy that goes to neutral particles is assumed to leave the scoring geometry without any energy deposition. Stopping-power ratios are calculated in-line, i.e. during the transport, thereby reducing the uncertainty of the calculated value. The production and transport of the secondary electrons are used to determine an additional contribution to the stopping-power ratios obtained using the proton spectra alone.
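
    The quantity being computed here is essentially the Bragg-Gray, spectrum-averaged stopping-power ratio. A minimal sketch of that average is shown below; the fluence spectrum and the stopping-power curves are crude placeholders of our own, not ICRU data or the proton spectra produced by the code described in this record.

      import numpy as np

      def spectrum_averaged_spr(energies, fluence, s_water, s_air):
          """Bragg-Gray water/air stopping-power ratio averaged over a proton spectrum.

          energies: proton kinetic energies (MeV)
          fluence:  fluence differential in energy on the same grid (arbitrary units)
          s_water, s_air: mass stopping powers (MeV cm^2/g) on the same grid
          """
          numerator = np.trapz(fluence * s_water, energies)
          denominator = np.trapz(fluence * s_air, energies)
          return numerator / denominator

      # Toy spectrum centred on 150 MeV and placeholder power-law stopping powers
      E = np.linspace(100.0, 200.0, 101)
      phi = np.exp(-0.5 * ((E - 150.0) / 10.0) ** 2)
      s_w = E ** -0.77 / 0.0039        # water-like S(E), illustration only
      s_a = s_w / 1.133                # assume a constant ratio of 1.133
      print(spectrum_averaged_spr(E, phi, s_w, s_a))   # ~1.133 by construction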

  12. Intensity modulated radiation therapy using laser-accelerated protons: a Monte Carlo dosimetric study

    International Nuclear Information System (INIS)

    In this paper we present Monte Carlo studies of intensity modulated radiation therapy using laser-accelerated proton beams. Laser-accelerated protons coming out of a solid high-density target have broad energy and angular spectra leading to dose distributions that cannot be directly used for therapeutic applications. Through the introduction of a spectrometer-like particle selection system that delivers small pencil beams of protons with desired energy spectra it is feasible to use laser-accelerated protons for intensity modulated radiotherapy. The method presented in this paper is a three-dimensional modulation in which the proton energy spectrum and intensity of each individual beamlet are modulated to yield a homogeneous dose in both the longitudinal and lateral directions. As an evaluation of the efficacy of this method, it has been applied to two prostate cases using a variety of beam arrangements. We have performed a comparison study between intensity modulated photon plans and those for laser-accelerated protons. For identical beam arrangements and the same optimization parameters, proton plans exhibit superior coverage of the target and sparing of neighbouring critical structures. Dose-volume histogram analysis of the resulting dose distributions shows up to 50% reduction of dose to the critical structures. As the number of fields is decreased, the proton modality exhibits a better preservation of the optimization requirements on the target and critical structures. It is shown that for a two-beam arrangement (parallel-opposed) it is possible to achieve both superior target coverage with 5% dose inhomogeneity within the target and excellent sparing of surrounding tissue
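
    The longitudinal part of the modulation described above amounts to choosing weights for a stack of pristine peaks so that their sum is flat over the target. The sketch below solves that weighting problem with non-negative least squares; the depth-dose curve is a crude analytic stand-in of our own, not a laser-accelerated proton spectrum, and the method is shown only to make the idea concrete.

      import numpy as np
      from scipy.optimize import nnls

      def pristine_peak(z, range_cm, sigma=0.3):
          """Crude stand-in for a pristine Bragg curve: a slowly rising entrance
          region plus a Gaussian peak near the range (illustration only)."""
          entrance = 0.4 + 0.3 * np.clip(z / range_cm, 0.0, 1.0)
          peak = np.exp(-0.5 * ((z - range_cm) / sigma) ** 2)
          return np.where(z <= range_cm + 2 * sigma, entrance + peak, 0.0)

      z = np.linspace(0.0, 20.0, 400)                   # depth grid (cm)
      ranges = np.linspace(10.0, 16.0, 13)              # pristine-peak ranges (cm)
      peaks = np.array([pristine_peak(z, r) for r in ranges])

      # Solve for non-negative weights giving a flat dose of 1.0 across 10-16 cm
      mask = (z >= 10.0) & (z <= 16.0)
      weights, _ = nnls(peaks[:, mask].T, np.ones(mask.sum()))
      sobp = weights @ peaks
      print("plateau inhomogeneity: %.3f" % np.ptp(sobp[mask]))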

  13. Coevolution Based Adaptive Monte Carlo Localization (CEAMCL)

    Directory of Open Access Journals (Sweden)

    Luo Ronghua

    2008-11-01

    An adaptive Monte Carlo localization algorithm based on the coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between the species ensures that the multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments can be solved. The sample size can also be adjusted adaptively over time according to the uncertainty of the robot's pose by using the population growth model. In addition, by using the crossover and mutation operators of evolutionary computation, intra-species evolution can drive the samples to move towards the regions where the desired posterior density is large, so a small set of samples can represent the desired density well enough to achieve precise localization. The new algorithm is termed coevolution based adaptive Monte Carlo localization (CEAMCL). Experiments have been carried out to prove the efficiency of the new localization algorithm.
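
    For readers unfamiliar with MCL itself, the sketch below shows one predict/weight/resample cycle of the plain particle-filter version in a 1D toy world; the coevolution, species clustering and adaptive sample sizing that CEAMCL adds are deliberately omitted, and all names are ours.

      import numpy as np

      rng = np.random.default_rng(0)

      def mcl_update(particles, control, measurement, motion_noise=0.1, meas_noise=0.5):
          """One cycle of plain Monte Carlo localization for a 1D robot.

          particles: candidate robot positions; control: commanded displacement;
          measurement: observed distance to a landmark located at the origin.
          """
          # Predict: apply the motion model with noise
          particles = particles + control + rng.normal(0.0, motion_noise, particles.shape)
          # Weight: likelihood of the measurement under each pose hypothesis
          expected = np.abs(particles)
          weights = np.exp(-0.5 * ((measurement - expected) / meas_noise) ** 2)
          weights /= weights.sum()
          # Resample proportionally to the weights
          return particles[rng.choice(len(particles), size=len(particles), p=weights)]

      particles = rng.uniform(-10.0, 10.0, 500)         # initial global uncertainty
      for true_pos in (2.0, 3.0, 4.0):                  # robot moves +1 per step
          particles = mcl_update(particles, control=1.0, measurement=abs(true_pos))
      print("estimated position:", particles.mean())

    The sign ambiguity of the distance measurement in this toy is exactly the kind of multi-hypothesis situation that the species-based coevolution in CEAMCL is designed to keep alive until the motion disambiguates it.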

  14. Coevolution Based Adaptive Monte Carlo Localization (CEAMCL)

    OpenAIRE

    Luo Ronghua; Hong Bingrong

    2004-01-01

    An adaptive Monte Carlo localization algorithm based on coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between the species ensures that the multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments can be solved. And the sample size can be adjusted adaptively over time according to the unce...

  15. SU-E-T-586: Field Size Dependence of Output Factor for Uniform Scanning Proton Beams: A Comparison of TPS Calculation, Measurement and Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Purpose: Output dependence on field size for uniform scanning beams, and the accuracy of treatment planning system (TPS) calculation, are not well studied. The purpose of this work is to investigate the dependence of output on field size for uniform scanning beams and compare it among TPS calculation, measurements and Monte Carlo simulations. Methods: Field size dependence was studied using various field sizes between 2.5 cm and 10 cm in diameter. The field size factor was studied for a number of proton range and modulation combinations based on the output at the center of the spread-out Bragg peak, normalized to a 10 cm diameter field. Three methods were used and compared in this study: 1) TPS calculation, 2) ionization chamber measurement, and 3) Monte Carlo simulation. The XiO TPS (Elekta, St. Louis) was used to calculate the output factor using a pencil beam algorithm; a pinpoint ionization chamber was used for measurements; and the Fluka code was used for Monte Carlo simulations. Results: The field size factor varied with proton beam parameters, such as range, modulation, and calibration depth, and could decrease by over 10% from a 10 cm to a 3 cm diameter field for a large-range proton beam. The XiO TPS predicted the field size factor relatively well at large field sizes, but could differ from measurements by 5% or more for small-field, large-range beams. Monte Carlo simulations predicted the field size factor within 1.5% of measurements. Conclusion: The output factor can vary largely with field size, and needs to be accounted for to ensure accurate proton beam delivery. This is especially important for small-field beams such as in stereotactic proton therapy, where the field size dependence is large and TPS calculation is inaccurate. Measurements or Monte Carlo simulations are recommended for output determination in such cases.
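
    Much of the drop in output for small fields typically comes from loss of lateral scatter equilibrium on the central axis. A back-of-the-envelope way to see the trend is to integrate a single Gaussian lateral-spread kernel over circular fields, as in the sketch below; the kernel width is an assumed number, not a measured beam parameter from this work.

      import numpy as np

      def central_axis_fraction(field_diameter_cm, sigma_cm):
          """Fraction of a 2D Gaussian scatter kernel returned to the central axis
          by a uniform circular field of the given diameter."""
          r = field_diameter_cm / 2.0
          return 1.0 - np.exp(-r**2 / (2.0 * sigma_cm**2))

      sigma = 0.8      # assumed effective lateral spread at the calibration depth (cm)
      for d in (2.5, 3.0, 5.0, 10.0):
          fsf = central_axis_fraction(d, sigma) / central_axis_fraction(10.0, sigma)
          print("%4.1f cm field: relative output %.3f" % (d, fsf))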

  16. Shielding properties of iron at high energy proton accelerators studied by a Monte Carlo code

    International Nuclear Information System (INIS)

    Shielding properties of a lateral iron shield and of iron and concrete shields at angles between 5° and 30° are studied by means of the Monte Carlo program FLUNEV (DESY-D3 version of the FLUKA code extended for emission and transport of low energy neutrons). The following quantities were calculated for a high energy proton beam hitting an extended iron target: total and partial dose equivalents, attenuation coefficients, neutron spectra, star densities (compared also with the CASIM code) and quality factors. The dependence of the dose equivalent on the energy of primary protons, the effect of a concrete layer behind a lateral iron shielding and the total number of neutrons produced in the target were also estimated. (orig.)

  17. TH-A-19A-10: Fast Four Dimensional Monte Carlo Dose Computations for Proton Therapy of Lung Cancer

    International Nuclear Information System (INIS)

    Purpose: To develop and validate a fast and accurate four dimensional (4D) Monte Carlo (MC) dose computation system for proton therapy of lung cancer and other thoracic and abdominal malignancies in which the delivered dose distributions can be affected by respiratory motion of the patient. Methods: A 4D computer tomography (CT) scan for a lung cancer patient treated with protons in our clinic was used to create a time dependent patient model using our in-house, MCNPX-based Monte Carlo system (“MC²”). The beam line configurations for two passively scattered proton beams used in the actual treatment were extracted from the clinical treatment plan and a set of input files was created automatically using MC². A full MC simulation of the beam line was computed using MCNPX and a set of phase space files for each beam was collected at the distal surface of the range compensator. The particles from these phase space files were transported through the 10 voxelized patient models corresponding to the 10 phases of the breathing cycle in the 4DCT, using MCNPX and an accelerated (fast) MC code called “FDC”, developed by us and which is based on the track repeating algorithm. The accuracy of the fast algorithm was assessed by comparing the two time dependent dose distributions. Results: The error of less than 1% in 100% of the voxels in all phases of the breathing cycle was achieved using this method with a speedup of more than 1000 times. Conclusion: The proposed method, which uses full MC to simulate the beam line and the accelerated MC code FDC for the time consuming particle transport inside the complex, time dependent, geometry of the patient shows excellent accuracy together with an extraordinary speed.

  18. TH-A-19A-10: Fast Four Dimensional Monte Carlo Dose Computations for Proton Therapy of Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Mirkovic, D; Titt, U; Mohan, R [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Yepes, P [Rice University, Houston, TX (United States)

    2014-06-15

    Purpose: To develop and validate a fast and accurate four dimensional (4D) Monte Carlo (MC) dose computation system for proton therapy of lung cancer and other thoracic and abdominal malignancies in which the delivered dose distributions can be affected by respiratory motion of the patient. Methods: A 4D computer tomography (CT) scan for a lung cancer patient treated with protons in our clinic was used to create a time dependent patient model using our in-house, MCNPX-based Monte Carlo system (“MC²”). The beam line configurations for two passively scattered proton beams used in the actual treatment were extracted from the clinical treatment plan and a set of input files was created automatically using MC². A full MC simulation of the beam line was computed using MCNPX and a set of phase space files for each beam was collected at the distal surface of the range compensator. The particles from these phase space files were transported through the 10 voxelized patient models corresponding to the 10 phases of the breathing cycle in the 4DCT, using MCNPX and an accelerated (fast) MC code called “FDC”, developed by us and which is based on the track repeating algorithm. The accuracy of the fast algorithm was assessed by comparing the two time dependent dose distributions. Results: The error of less than 1% in 100% of the voxels in all phases of the breathing cycle was achieved using this method with a speedup of more than 1000 times. Conclusion: The proposed method, which uses full MC to simulate the beam line and the accelerated MC code FDC for the time consuming particle transport inside the complex, time dependent, geometry of the patient shows excellent accuracy together with an extraordinary speed.
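
    The track-repeating idea behind FDC can be sketched very compactly: steps of a proton track precomputed in water are replayed through the patient grid, with each step length rescaled by the local relative stopping power while the per-step energy deposit is kept. The 1D toy below is our own schematic of that bookkeeping, not the FDC implementation, and all names and numbers are placeholders.

      import numpy as np

      def repeat_track(water_steps, rsp_grid, voxel_cm, start_cm=0.0):
          """Replay a precomputed water track through a 1D grid of relative stopping powers.

          water_steps: sequence of (step_length_in_water_cm, energy_deposit_MeV)
          rsp_grid: relative stopping power of each voxel along the beam axis
          Returns the energy deposited in each voxel (MeV).
          """
          edep = np.zeros(len(rsp_grid))
          pos = start_cm
          for length_w, de in water_steps:
              i = int(pos / voxel_cm)
              if i >= len(rsp_grid):
                  break                                  # track left the grid
              edep[i] += de                              # same energy lost per step
              pos += length_w / rsp_grid[i]              # water step shrinks in denser voxels
          return edep

      steps = [(0.5, 2.0)] * 40                          # placeholder 20 cm water track
      rsp = np.ones(100); rsp[30:60] = 1.5               # a denser slab in the middle
      dose = repeat_track(steps, rsp, voxel_cm=0.25)
      print("range pulled back to voxel", int(np.nonzero(dose)[0].max()))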

  19. Predicting image blur in proton radiography: comparisons between measurements and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    von Wittenau, A; Aufderheide, M B; Henderson, G L

    2010-05-07

    Given the cost and lead-times involved in high-energy proton radiography, it is prudent to model proposed radiographic experiments to see if the images predicted would return useful information. We recently modified our raytracing transmission radiography modeling code HADES to perform simplified Monte Carlo simulations of the transport of protons in a proton radiography beamline. Beamline objects include the initial diffuser, vacuum magnetic fields, windows, angle-selecting collimators, and objects described as distorted 2D (planar or cylindrical) meshes or as distorted 3D hexahedral meshes. We present an overview of the algorithms used for the modeling and code timings for simulations through typical 2D and 3D meshes. We next calculate expected changes in image blur as scattering materials are placed upstream and downstream of a resolution test object (a 3 mm thick sheet of tantalum, into which 0.4 mm wide slits have been cut), and as the current supplied to the focusing magnets is varied. We compare and contrast the resulting simulations with the results of measurements obtained at the 800 MeV Los Alamos LANSCE Line-C proton radiography facility.

  20. Predicting image blur in proton radiography: Comparisons between measurements and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Schach von Wittenau, Alexis E., E-mail: schachvonwittenau1@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States); Aufderheide, Maurice; Henderson, Gary [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States)

    2011-10-01

    Given the cost and lead-times involved in high-energy proton radiography, it is prudent to model proposed radiographic experiments to see if the images predicted would return useful information. We recently modified our raytracing transmission radiography modeling code HADES to perform simplified Monte Carlo simulations of the transport of protons in a proton radiography beamline. Beamline objects include the initial diffuser, vacuum magnetic fields, windows, angle-selecting collimators, and objects described as distorted 2D (planar or cylindrical) meshes or as distorted 3D hexahedral meshes. We describe the algorithms used for simulations through typical 2D and 3D meshes. We calculate expected changes in image blur as scattering materials are placed upstream and downstream of a resolution test object (a 3 mm thick sheet of tantalum, into which 0.4 mm wide slits have been cut), and as the current supplied to the focusing magnets is varied. We compare and contrast the resulting simulations with the results of measurements obtained at the 800 MeV Los Alamos LANSCE Line-C proton radiography facility.
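
    A rough feel for the scale of scattering-induced blur can be obtained from the Highland (PDG) formula for the RMS multiple-Coulomb-scattering angle. The sketch below applies it to a proton crossing a thin slab and drifting to an image plane; the radiation length and drift distance are assumed values for illustration, and the magnetic lens system of an actual proton radiography beamline is ignored.

      import numpy as np

      MP_C2 = 938.272   # proton rest energy, MeV

      def highland_angle_rad(E_kin_MeV, thickness_over_X0):
          """Highland/PDG estimate of the RMS projected scattering angle (radians)."""
          E = E_kin_MeV + MP_C2
          pc = np.sqrt(E**2 - MP_C2**2)        # momentum times c, MeV
          beta = pc / E
          x = thickness_over_X0
          return 13.6 / (beta * pc) * np.sqrt(x) * (1.0 + 0.038 * np.log(x))

      # 800 MeV protons through 3 mm of tantalum (X0 ~ 0.41 cm assumed),
      # followed by a hypothetical 1 m field-free drift to the detector
      theta = highland_angle_rad(800.0, 0.3 / 0.41)
      print("RMS displacement at the image plane ~ %.1f mm" % (theta * 1000.0))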

  1. Monte Carlo approach for hadron azimuthal correlations in high energy proton and nuclear collisions

    CERN Document Server

    Ayala, Alejandro; Jalilian-Marian, Jamal; Magnin, J; Tejeda-Yeomans, Maria Elena

    2012-01-01

    We use a Monte Carlo approach to study hadron azimuthal angular correlations in high energy proton-proton and central nucleus-nucleus collisions at the BNL Relativistic Heavy Ion Collider (RHIC) energies at mid-rapidity. We build a hadron event generator that incorporates the production of $2\\to 2$ and $2\\to 3$ parton processes and their evolution into hadron states. For nucleus-nucleus collisions we include the effect of parton energy loss in the Quark-Gluon Plasma using a modified fragmentation function approach. In the presence of the medium, for the case when three partons are produced in the hard scattering, we analyze the Monte Carlo sample in parton and hadron momentum bins to reconstruct the angular correlations. We characterize this sample by the number of partons that are able to hadronize by fragmentation within the selected bins. In the nuclear environment the model allows hadronization by fragmentation only for partons with momentum above a threshold $p_T^{{\\tiny{thresh}}}=2.4$ GeV. We argue that...

  2. Monte Carlo approach for hadron azimuthal correlations in high energy proton and nuclear collisions

    Science.gov (United States)

    Ayala, Alejandro; Dominguez, Isabel; Jalilian-Marian, Jamal; Magnin, J.; Tejeda-Yeomans, Maria Elena

    2012-09-01

    We use a Monte Carlo approach to study hadron azimuthal angular correlations in high-energy proton-proton and central nucleus-nucleus collisions at the BNL Relativistic Heavy Ion Collider energies at midrapidity. We build a hadron event generator that incorporates the production of 2→2 and 2→3 parton processes and their evolution into hadron states. For nucleus-nucleus collisions we include the effect of parton energy loss in the quark-gluon plasma using a modified fragmentation function approach. In the presence of the medium, for the case when three partons are produced in the hard scattering, we analyze the Monte Carlo sample in parton and hadron momentum bins to reconstruct the angular correlations. We characterize this sample by the number of partons that are able to hadronize by fragmentation within the selected bins. In the nuclear environment the model allows hadronization by fragmentation only for partons with momentum above a threshold pTthresh=2.4 GeV. We argue that one should treat properly the effect of those partons with momentum below the threshold, because their interaction with the medium may lead to showers of low-momentum hadrons along the direction of motion of the original partons as the medium becomes diluted.
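
    Once an event sample exists, the correlation observable itself reduces to histogramming the azimuthal difference between trigger and associated hadrons. The toy below does exactly that for a fabricated sample of back-to-back pairs on a soft background; it reproduces none of the generator, medium or threshold physics of the paper and is meant only to make the observable concrete.

      import numpy as np

      rng = np.random.default_rng(1)

      def delta_phi(phi_assoc, phi_trig):
          """Azimuthal difference wrapped into [-pi/2, 3pi/2)."""
          d = (phi_assoc - phi_trig) % (2.0 * np.pi)
          return np.where(d >= 1.5 * np.pi, d - 2.0 * np.pi, d)

      def correlation_histogram(events, pt_trig=4.0, pt_assoc=2.0, nbins=36):
          """Delta-phi histogram between trigger (pT > pt_trig) and associated hadrons."""
          edges = np.linspace(-0.5 * np.pi, 1.5 * np.pi, nbins + 1)
          hist = np.zeros(nbins)
          for pts, phis in events:
              trig_phis = phis[pts > pt_trig]
              assoc_phis = phis[(pts > pt_assoc) & (pts <= pt_trig)]
              for tphi in trig_phis:
                  hist += np.histogram(delta_phi(assoc_phis, tphi), bins=edges)[0]
          return edges, hist

      # Fabricated events: one back-to-back hard pair plus ten soft hadrons each
      events = []
      for _ in range(2000):
          phi0 = rng.uniform(0.0, 2.0 * np.pi)
          phis = np.concatenate(([phi0, phi0 + np.pi + rng.normal(0.0, 0.3)],
                                 rng.uniform(0.0, 2.0 * np.pi, 10)))
          pts = np.concatenate(([6.0, 3.0], rng.exponential(0.5, 10)))
          events.append((pts, phis))
      edges, hist = correlation_histogram(events)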

  3. A pencil beam algorithm for intensity modulated proton therapy derived from Monte Carlo simulations.

    Science.gov (United States)

    Soukup, Martin; Fippel, Matthias; Alber, Markus

    2005-11-01

    A pencil beam algorithm as a component of an optimization algorithm for intensity modulated proton therapy (IMPT) is presented. The pencil beam algorithm is tuned to the special accuracy requirements of IMPT, where in heterogeneous geometries both the position and distortion of the Bragg peak and the lateral scatter pose problems which are amplified by the spot weight optimization. Heterogeneity corrections are implemented by a multiple raytracing approach using fluence-weighted sub-spots. In order to derive nuclear interaction corrections, Monte Carlo simulations were performed. The contribution of long ranged products of nuclear interactions is taken into account by a fit to the Monte Carlo results. Energy-dependent stopping power ratios are also implemented. Scatter in optional beam line accessories such as range shifters or ripple filters is taken into account. The collimator can also be included, but without additional scattering. Finally, dose distributions are benchmarked against Monte Carlo simulations, showing 3%/1 mm agreement for simple heterogeneous phantoms. In the case of more complicated phantoms, principal shortcomings of pencil beam algorithms are evident. The influence of these effects on IMPT dose distributions is shown in clinical examples. PMID:16237243
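
    The basic structure of a proton pencil beam dose engine, an integrated depth-dose curve multiplied by a depth-dependent lateral Gaussian, is easy to write down. The sketch below uses crude analytic stand-ins of our own for both factors; it is not the algorithm of this record and omits the sub-spot raytracing, nuclear interaction corrections and stopping-power-ratio handling described in the abstract.

      import numpy as np

      def pencil_beam_dose(x_cm, z_cm, range_cm=15.0, sigma0_cm=0.4):
          """Toy proton pencil beam: separable depth-dose times lateral Gaussian.

          x_cm: lateral offsets from the spot axis; z_cm: depths in water.
          Returns dose on the (z, x) grid in arbitrary units.
          """
          z = np.asarray(z_cm, dtype=float)[:, None]
          x = np.asarray(x_cm, dtype=float)[None, :]
          # Crude depth-dose: rising entrance region plus a Gaussian Bragg peak
          idd = (0.3 + 0.4 * np.clip(z / range_cm, 0.0, 1.0)
                 + np.exp(-0.5 * ((z - range_cm) / 0.4) ** 2))
          idd = np.where(z <= range_cm + 0.8, idd, 0.0)
          # Lateral spread grows with depth; quadratic growth is an assumption
          sigma = sigma0_cm + 0.002 * z**2
          lateral = np.exp(-0.5 * (x / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)
          return idd * lateral

      dose = pencil_beam_dose(np.linspace(-3.0, 3.0, 121), np.linspace(0.0, 18.0, 181))
      print(dose.shape)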

  4. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  5. Effect of elemental compositions on Monte Carlo dose calculations in proton therapy of eye tumors

    Science.gov (United States)

    Rasouli, Fatemeh S.; Farhad Masoudi, S.; Keshazare, Shiva; Jette, David

    2015-12-01

    Recent studies in eye plaque brachytherapy have found considerable differences between the dosimetric results obtained with a water phantom and those obtained with a complete human eye model. Since the eye continues to be simulated as water-equivalent tissue in the proton therapy literature, a similar study investigating such a difference in treating eye tumors by protons is indispensable. The present study inquires into this effect in proton therapy utilizing Monte Carlo simulations. A three-dimensional eye model with elemental compositions is simulated and used to examine the dose deposition to the phantom. The beam is planned to pass through a designed beam line to moderate the protons to the desired energies for ocular treatments. The results are compared with similar irradiation to a water phantom, as well as to a material with uniform density throughout the whole volume. Spread-out Bragg peaks (SOBPs) are created by adding pristine peaks to cover a typical tumor volume. Moreover, the corresponding beam parameters recommended by the ICRU are calculated, and the isodose curves are computed. The results show that the maximum dose deposited in ocular media is approximately 5-7% more than in the water phantom, and about 1-1.5% less than in the homogenized material of density 1.05 g cm-3. Furthermore, there is about a 0.2 mm shift in the Bragg peak due to the tissue composition difference between the models. It is found that using the weighted dose profiles optimized in a water phantom for the realistic eye model leads to a small disturbance of the SOBP plateau dose. In contrast to the plaque brachytherapy results for treatment of eye tumors, it is found that the differences between the simplified models presented in this work, especially the phantom containing the homogenized material, are not clinically significant in proton therapy. Taking into account the intrinsic uncertainty of the patient dose calculation for protons, and practical problems corresponding to applying patient

  6. A Monte Carlo pencil beam scanning model for proton treatment plan simulation using GATE/GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Grevillot, L; Freud, N; Sarrut, D [Universite de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Universite Lyon 1, Centre Leon Berard, Lyon (France); Bertrand, D; Dessy, F, E-mail: loic.grevillot@creatis.insa-lyon.fr [IBA, B-1348, Louvain-la Neuve (Belgium)

    2011-08-21

    This work proposes a generic method for modeling scanned ion beam delivery systems, without simulation of the treatment nozzle and based exclusively on beam data library (BDL) measurements required for treatment planning systems (TPS). To this aim, new tools dedicated to treatment plan simulation were implemented in the Gate Monte Carlo platform. The method was applied to a dedicated nozzle from IBA for proton pencil beam scanning delivery. Optical and energy parameters of the system were modeled using a set of proton depth-dose profiles and spot sizes measured at 27 therapeutic energies. For further validation of the beam model, specific 2D and 3D plans were produced and then measured with appropriate dosimetric tools. Dose contributions from secondary particles produced by nuclear interactions were also investigated using field size factor experiments. Pristine Bragg peaks were reproduced with 0.7 mm range and 0.2 mm spot size accuracy. A 32 cm range spread-out Bragg peak with 10 cm modulation was reproduced with 0.8 mm range accuracy and a maximum point-to-point dose difference of less than 2%. A 2D test pattern consisting of a combination of homogeneous and high-gradient dose regions passed a 2%/2 mm gamma index comparison for 97% of the points. In conclusion, the generic modeling method proposed for scanned ion beam delivery systems was applicable to an IBA proton therapy system. The key advantage of the method is that it only requires BDL measurements of the system. The validation tests performed so far demonstrated that the beam model achieves clinical performance, paving the way for further studies toward TPS benchmarking. The method involves new sources that are available in the new Gate release V6.1 and could be further applied to other particle therapy systems delivering protons or other types of ions like carbon.
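
    The 2%/2 mm gamma-index test used above for the 2D pattern is itself a small algorithm worth spelling out. Below is a brute-force 1D version with global dose normalization and no interpolation refinements; it is a generic sketch, not the dosimetric analysis software used in this work.

      import numpy as np

      def gamma_index_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=2.0, dd_frac=0.02):
          """Brute-force 1D gamma index (global normalization, no sub-grid interpolation)."""
          d_max = d_ref.max()
          gammas = np.empty_like(d_ref)
          for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
              dist2 = ((x_eval - xr) / dta_mm) ** 2
              dose2 = ((d_eval - dr) / (dd_frac * d_max)) ** 2
              gammas[i] = np.sqrt(np.min(dist2 + dose2))
          return gammas

      # Toy check: a Gaussian profile versus a 1% hotter, 0.5 mm shifted copy
      x = np.linspace(0.0, 100.0, 501)                       # position in mm
      ref = np.exp(-0.5 * ((x - 50.0) / 15.0) ** 2)
      ev = 1.01 * np.exp(-0.5 * ((x - 50.5) / 15.0) ** 2)
      g = gamma_index_1d(x, ref, x, ev)
      print("pass rate: %.1f%%" % (100.0 * np.mean(g <= 1.0)))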

  7. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to a few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4 based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bits, Amazon EC2). Single spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of StarCluster software developed at MIT, a Linux cluster with 2–100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm², 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and the worker nodes as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy to maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.

  8. Accelerated GPU based SPECT Monte Carlo simulations.

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-01

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational

  9. Accelerated GPU based SPECT Monte Carlo simulations

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-01

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99m Tc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency

  10. The impact of different Monte Carlo models on the cross section measurement of top-pair production at 7 TeV proton-proton collisions

    Energy Technology Data Exchange (ETDEWEB)

    Krause, Claudius

    2012-04-15

    High energy proton-proton collisions lead to a large number of secondary particles to be measured in a detector. A final state containing top quarks is of particular interest. But top quarks are only produced in a small fraction of the collisions. Hence, criteria must be defined to separate events containing top quarks from the background. From detectors, we record signals, for example hits in the tracker system or deposits in the calorimeters. In order to obtain the momentum of the particles, we apply algorithms to reconstruct tracks in space. More sophisticated algorithms are needed to identify the flavour of quarks, such as b-tagging. Several steps are needed to test these algorithms. Collision products of proton-proton events are generated using Monte Carlo techniques and their passage through the detector is simulated. After that, the algorithms are applied and the signal efficiency and the mistagging rate can be obtained. There are, however, many different approaches and algorithms realized in programs, so the question arises whether the choice of the Monte Carlo generator influences the measured quantities. In this thesis, two commonly used Monte Carlo generators, SHERPA and MadGraph/MadEvent, are compared and the differences in the selection efficiency of semimuonic tt events are estimated. In addition, the distributions of kinematic variables are shown. A special chapter about the matching of matrix elements with parton showers is included. The main algorithms, CKKW for SHERPA and MLM for MadGraph/MadEvent, are introduced.

  11. Use of Monte Carlo software to aid design of a proton therapy nozzle

    Energy Technology Data Exchange (ETDEWEB)

    Swanepoel, M.W. [Medical Radiation Group, iThemba LABS, P.O. Box 22, Somerset West 7129 (South Africa)], E-mail: mark@tlabs.ac.za; Jones, D.T.L. [Medical Radiation Group, iThemba LABS, P.O. Box 22, Somerset West 7129 (South Africa)

    2007-09-21

    A second proton therapy nozzle is being developed at iThemba LABS to irradiate lesions in the body, thus complementing an existing facility for head and neck treatments. A passive scattering system is being developed, the complexity of which necessitates Monte Carlo simulations. We have used MCNPX to set the apertures and spacing of collimators, to model dose distributions in water, to check and modify beam scattering and energy modulating components, and to check radiation shields. The comprehensive shielding model was adapted for other problems by reducing the types of particles transported, limiting the extent and complexity of the geometry, and where possible killing particles by setting their importance to zero. Our results appear to indicate that the Rossi and Greisen description of multiple Coulomb scattering as used in MCNPX predicts high-Z, large angle scattering acceptably well for modeling proton therapy nozzles. MCNPX is easy to learn and implement, but has disadvantages when used to model therapy nozzles: (1) it does not yet offer a true capability to model electromagnetic interactions, (2) it cannot model moving components, and (3) it uses energy rather than range cut-offs for particles. Hence a GEANT4 model of the new nozzle is also being implemented.

  12. Improved efficiency in Monte Carlo simulation for passive-scattering proton therapy

    International Nuclear Information System (INIS)

    The aim of this work was to improve the computational efficiency of Monte Carlo simulations when tracking protons through a proton therapy treatment head. Two proton therapy facilities were considered, the Francis H Burr Proton Therapy Center (FHBPTC) at the Massachusetts General Hospital and the Crocker Lab eye treatment facility used by University of California at San Francisco (UCSFETF). The computational efficiency was evaluated for phase space files scored at the exit of the treatment head to determine optimal parameters to improve efficiency while maintaining accuracy in the dose calculation. For FHBPTC, particles were split by a factor of 8 upstream of the second scatterer and upstream of the aperture. The radius of the region for Russian roulette was set to 2.5 or 1.5 times the radius of the aperture and a secondary particle production cut (PC) of 50 mm was applied. For UCSFETF, particles were split by a factor of 16 upstream of a water absorber column and upstream of the aperture. Here, the radius of the region for Russian roulette was set to 4 times the radius of the aperture and a PC of 0.05 mm was applied. In both setups, the cylindrical symmetry of the proton beam was exploited to position the split particles randomly spaced around the beam axis. When simulating a phase space for subsequent water phantom simulations, efficiency gains between a factor of 19.9 ± 0.1 and 52.21 ± 0.04 for the FHBPTC setups and 57.3 ± 0.5 for the UCSFETF setups were obtained. For a phase space used as input for simulations in a patient geometry, the gain was a factor of 78.6 ± 7.5. Lateral-dose curves in water were within the accepted clinical tolerance of 2%, with statistical uncertainties of 0.5% for the two facilities. For the patient geometry and by considering the 2% and 2 mm criteria, 98.4% of the voxels showed a gamma index lower than unity. An analysis of the dose distribution resulted in systematic deviations below 0.88% for 20
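
    Particle splitting and Russian roulette are standard variance-reduction moves, and their weight bookkeeping is simple enough to show in a few lines. The toy below is our own schematic: split factors, survival probabilities and the aperture test are placeholders, not the Geant4/TOPAS implementation tuned in this work.

      import random

      def split(particle, factor):
          """Replace one particle by `factor` copies, each carrying 1/factor of the weight."""
          w = particle["weight"] / factor
          return [dict(particle, weight=w) for _ in range(factor)]

      def russian_roulette(particle, survival_prob):
          """Kill with probability 1-p; boost the survivor's weight by 1/p so the
          expected weight is unchanged."""
          if random.random() < survival_prob:
              return dict(particle, weight=particle["weight"] / survival_prob)
          return None

      random.seed(0)
      beam = [{"weight": 1.0, "r_cm": 1.0}, {"weight": 1.0, "r_cm": 5.0}]
      kept = []
      for p in beam:
          if p["r_cm"] < 2.5:                       # headed toward the aperture: split by 8
              kept.extend(split(p, 8))
          else:                                     # unlikely to contribute: play roulette
              survivor = russian_roulette(p, survival_prob=0.25)
              if survivor is not None:
                  kept.append(survivor)
      # Total weight is conserved only in expectation, which is what keeps the estimate unbiased
      print(len(kept), sum(q["weight"] for q in kept))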

  13. Monte Carlo simulation of electron and proton irradiation of carbon nanotube and graphene transistors

    OpenAIRE

    Chatzikyriakou, Eleni; Smyrnis, Chris; Chatwin, Chris

    2014-01-01

    Carbon-based nanotechnology electronics can provide high performance, low-power and low-weight solutions, which are very suitable for innovative aerospace applications. However, its application in the space environment where there is a radiation hazard, requires an assessment of the response of such electronic products to the background irradiance. To explore the potential of carbon-based nanotechnology, Monte Carlo simulations of radiation interacting with a gate-all-around carbo...

  14. Monte-Carlo approach to calculate the proton stopping in warm dense matter within particle-in-cell simulations

    CERN Document Server

    Wu, D; Yu, W; Fritzsche, S

    2016-01-01

    A Monte-Carlo approach to proton stopping in warm dense matter is implemented into an existing particle-in-cell code. The model is based on multiple binary collisions among electron-electron, electron-ion and ion-ion pairs, takes into account contributions from both free and bound electrons, and allows particle stopping to be calculated in a much more natural manner. In the low-temperature limit, when "all" electrons are bound to the nucleus, the stopping power converges to the predictions of Bethe-Bloch theory, which shows good consistency with data provided by NIST. As the temperature rises, more and more bound electrons are ionized, giving rise to a stopping power increased with respect to cold matter, which is consistent with a recently reported experimental measurement [Phys. Rev. Lett. 114, 215002 (2015)]. When the temperature is further increased, with ionization reaching its maximum, a lowered stopping power is observed, which is due to the suppression of the collision frequency between the projected proton beam and h...

  15. SU-E-T-546: Modeling and Validation of a New Proton Therapy System Using a Monte-Carlo Environment Optimized for Protons

    International Nuclear Information System (INIS)

    Purpose: Monte-Carlo modeling is an important tool for understanding the behavior of therapeutic proton beams in a heterogeneous medium such as the patient. To gain confidence that a Monte-Carlo model is accurate in complex geometries and media, it must first be compared with measurement in simple situations. This study documents the validation of our Monte-Carlo model. Methods: A model of the MEVION S250 proton therapy system was created in the TOPAS Monte-Carlo environment using machine geometry and field shaping system information provided by the vendor. For each of 24 options, validation of the TOPAS model was performed by comparing the dose scored by TOPAS to the dose measurements obtained during the commissioning of the treatment planning system. The measurements compared consisted of: pristine peak depth-dose profiles, in-air profiles for a standard-sized square field (20 cm × 20 cm or 10 cm × 10 cm, depending on the maximum field size for each option) at isocenter and at 20 cm upstream and downstream of isocenter, and in-air profiles with a half-beam blocked aperture at isocenter and at 20 cm upstream and downstream of isocenter. For all Monte-Carlo simulations, 8 particle histories were run. Results: Range measurements of the Monte-Carlo simulations matched the measured data within 1 mm. Distal fall-off of the simulated fields matched within <1 mm. Lateral penumbra and field size measurements of the standard-sized square and half-beam blocked fields matched within 1 mm at all three planes compared. A small difference was seen in the in-air profiles at doses <0%. The suspected cause of the difference was the aperture shape: the measured data utilized a divergent aperture, whereas the Monte-Carlo calculation used a non-divergent aperture. Conclusion: The validation measurements indicate that we were able to accurately model the MEVION S250 proton therapy system using Monte-Carlo calculations. This may reduce the commissioning time for future users.

  16. CERN Summer Student Report 2016 Monte Carlo Data Base Improvement

    CERN Document Server

    Caciulescu, Alexandru Razvan

    2016-01-01

    During my Summer Student project I worked on improving the Monte Carlo Data Base and MonALISA services for the ALICE Collaboration. The project included learning the infrastructure for tracking and monitoring of the Monte Carlo productions as well as developing a new RESTful API for seamless integration with the JIRA issue tracking framework.

  17. Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study

    CERN Document Server

    Kim, Jin Sung; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih

    2015-01-01

    Two full rotating gantries with different nozzles (a multipurpose nozzle with MLC and a dedicated scanning nozzle), driven by a conventional cyclotron system, are installed and under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate the neutron dose equivalent per therapeutic dose, H/D, to x-ray imaging equipment under various treatment conditions with Monte Carlo simulation. At first, we investigated H/D with the various modifications of the beam line devices (Scattering, Scanning, Multi-leaf collimator, Aperture, Compensator) at isocenter and at 20, 40 and 60 cm distance from the isocenter, and compared the results with other research groups. Next, we investigated the neutron dose at the x-ray equipment used for real-time imaging under various treatment conditions. Our investigation showed 0.07–0.19 mSv/Gy at the x-ray imaging equipment according to the various treatment options and, interestingly, a 50% neutron dose reduction effect of the flat panel detector was observed due to multi-lea...

  18. Formal quality control for a proton Monte Carlo system in radiation therapy

    International Nuclear Information System (INIS)

    TOPAS (TOol for PArticle Simulation) is a Monte Carlo particle transport tool being released to a wide variety of proton therapy users worldwide. Because TOPAS provides unprecedented ease in 4D placement of geometry components, beam sources and scoring, including options to place geometry components, beam sources or scorers within each other, Quality Control (QC) for TOPAS is both critical and challenging. All simulation details (geometry, particle sources, scoring, physics settings, time-dependent motions, gating, etc.) are specified in the TOPAS Parameter Control System (which catches many user errors). QC includes Unit and End-to-End Testing. Each code unit is tested (each geometry component, particle source option, scoring option, etc.) and these unit testing procedures are shared with end users so they can reproduce tests. End-to-End testing of several full clinical setups is routinely performed. End-to-End testing presents a challenge since one cannot anticipate all the ways users will combine TOPAS flexible units for their specific project. Automated checking catches geometry overlaps and some other problematic setups, but one can never rule out the potential for problems when users combine units in new setups. QC is ultimately a partnership between the tool developer and the user. Key is that the developer be clear to the end user about what has been tested and what has not.

  19. Distributions of secondary particles in proton and carbon-ion therapy: a comparison between GATE/Geant4 and FLUKA Monte Carlo codes

    International Nuclear Information System (INIS)

    Monte Carlo simulations play a crucial role for in-vivo treatment monitoring based on PET and prompt gamma imaging in proton and carbon-ion therapies. The accuracy of the nuclear fragmentation models implemented in these codes might affect the quality of the treatment verification. In this paper, we investigate the nuclear models implemented in GATE/Geant4 and FLUKA by comparing the angular and energy distributions of secondary particles exiting a homogeneous target of PMMA. Comparison results were restricted to fragmentation of 16O and 12C. Despite the very simple target and set-up, substantial discrepancies were observed between the two codes. For instance, the number of high energy (>1 MeV) prompt gammas exiting the target was about twice as large with GATE/Geant4 as with FLUKA, both for proton and carbon ion beams. Such differences were not observed for the predicted annihilation photon production yields, for which ratios of 1.09 and 1.20 were obtained between GATE and FLUKA for the proton beam and the carbon ion beam, respectively. For neutrons and protons, discrepancies from 14% (exiting protons–carbon ion beam) to 57% (exiting neutrons–proton beam) have been identified in production yields as well as in the energy spectra for neutrons. (paper)

  20. SU-E-T-569: Neutron Shielding Calculation Using Analytical and Multi-Monte Carlo Method for Proton Therapy Facility

    Energy Technology Data Exchange (ETDEWEB)

    Cho, S; Shin, E H; Kim, J; Ahn, S H; Chung, K; Kim, D-H; Han, Y; Choi, D H [Samsung Medical Center, Seoul (Korea, Republic of)

    2015-06-15

    Purpose: To evaluate the shielding wall design that protects patients, staff and members of the general public from secondary neutrons, using a simple analytic solution and the multi-Monte Carlo codes MCNPX, ANISN and FLUKA. Methods: Analytical and multi-Monte Carlo calculations were performed for the proton facility (Sumitomo Heavy Industries, Ltd.) at Samsung Medical Center in Korea. The NCRP-144 analytical evaluation methods, which produce conservative estimates of the dose equivalent values for the shielding, were used for the analytical evaluations. The radiation transport was then simulated with the multi-Monte Carlo codes. The neutron dose at each evaluation point is obtained as the product of the simulated value and the neutron dose coefficient introduced in ICRP-74. Results: The evaluation points at the accelerator control room and the control room entrance are mainly influenced by the proton beam loss point. The neutron dose equivalent at the accelerator control room evaluation point is 0.651, 1.530, 0.912 and 0.943 mSv/yr, and at the entrance of the cyclotron room 0.465, 0.790, 0.522 and 0.453 mSv/yr, as calculated with the NCRP-144 formalism, ANISN, FLUKA and MCNP, respectively. Most of the MCNPX and FLUKA results, which used the complicated geometry, were smaller than the ANISN results. Conclusion: The neutron shielding for a proton therapy facility has been evaluated with the analytic model and multi-Monte Carlo methods. We confirmed that the shielding is adequate for the areas accessible to people when the proton facility is operated.
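
    The analytical part of such an evaluation typically follows a point-source line-of-sight model: inverse-square fall-off from the beam-loss point combined with exponential attenuation through the shield. The sketch below shows that structure with placeholder numbers; the actual NCRP-144 source terms and attenuation lengths used in this work are not reproduced.

      import numpy as np

      def dose_behind_shield(H0_mSv_per_Gy_at_1m, distance_m, shield_cm, atten_length_cm):
          """Point-source line-of-sight estimate of the neutron dose equivalent behind
          a shield: inverse-square fall-off times exponential attenuation."""
          return (H0_mSv_per_Gy_at_1m / distance_m**2
                  * np.exp(-shield_cm / atten_length_cm))

      # Placeholder source term and attenuation length, for illustration only
      H = dose_behind_shield(H0_mSv_per_Gy_at_1m=5.0, distance_m=6.0,
                             shield_cm=200.0, atten_length_cm=45.0)
      print("%.2e mSv per Gy delivered" % H)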

  1. Prediction of production of 22Na in a gas-cell target irradiated by protons using Monte Carlo tracking

    International Nuclear Information System (INIS)

    Highlights: • Angular distribution of the proton beam in a gaseous environment. • Particle energy distribution profile and proton flux within the gas-cell target with MCNPX. • Detection of the residual nuclei during the nuclear reactions. • Estimation of the production yield for 22,natNe(p,x)22Na reactions. - Abstract: The 22Ne(p,n)22Na reaction is an optimal reaction for the cyclotron production of 22Na. This work sets out to model the proton-induced production of 22Na in a gas-cell target, containing natural or enriched neon gas, using the Monte Carlo method. The excitation functions of the reactions are calculated with both the TALYS-1.6 and ALICE/ASH codes, and the optimum projectile energy range for high-yield production is then selected. A free gaseous environment of neon at a particular pressure and temperature is set up, and the proton beam is transported within it using the Monte Carlo codes MCNPX and SRIM. The beam monitoring performed by each of these codes indicates that the gas-cell has to be designed as a conical frustum to reach the desired interactions. MCNPX is also employed to calculate the energy distribution of the protons in the designed target and to estimate the residual nuclei produced during irradiation. The production yields of 22Na in the 22Ne(p,n)22Na and natNe(p,x)22Na reactions are estimated and show good agreement with the experimental results. The results demonstrate that Monte Carlo tracking provides a useful way to design and optimize gas targets, as well as to calibrate detectors, for radionuclide production purposes.
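
    The production yield being estimated here is, in essence, the thick-target yield integral of the excitation function divided by the stopping power over the proton's slowing-down path. The sketch below evaluates that integral for a generic sigma(E) and S(E); both curves and all numbers are placeholders, not the TALYS/ALICE excitation functions or the MCNPX/SRIM transport results of the paper.

      import numpy as np

      N_A = 6.022e23   # Avogadro's number

      def thick_target_yield(E_in, E_out, sigma_mb, mass_stopping, A_target, abundance=1.0):
          """Nuclei produced per incident proton while slowing from E_in to E_out (MeV).

          sigma_mb(E): excitation function in millibarn
          mass_stopping(E): mass stopping power in MeV cm^2/g
          """
          E = np.linspace(E_out, E_in, 500)
          integrand = sigma_mb(E) * 1e-27 / mass_stopping(E)   # cm^2 g / MeV
          return abundance * N_A / A_target * np.trapz(integrand, E)

      # Placeholder Gaussian excitation function (~100 mb near 10 MeV) and a
      # Bragg-Kleeman-style stopping power; illustration only
      sigma = lambda E: 100.0 * np.exp(-0.5 * ((E - 10.0) / 4.0) ** 2)
      stop = lambda E: E ** (1.0 - 1.77) / (0.0022 * 1.77)
      print(thick_target_yield(E_in=18.0, E_out=5.0, sigma_mb=sigma,
                               mass_stopping=stop, A_target=22.0))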

  2. MONTE: An automated Monte Carlo based approach to nuclear magnetic resonance assignment of proteins

    Energy Technology Data Exchange (ETDEWEB)

    Hitchens, T. Kevin; Lukin, Jonathan A.; Zhan Yiping; McCallum, Scott A.; Rule, Gordon S. [Carnegie Mellon University, Department of Biological Sciences (United States)], E-mail: rule@andrew.cmu.edu

    2003-01-15

    A general-purpose Monte Carlo assignment program has been developed to aid in the assignment of NMR resonances from proteins. By virtue of its flexible data requirements the program is capable of obtaining assignments of both heavily deuterated and fully protonated proteins. A wide variety of source data, such as inter-residue scalar connectivity, inter-residue dipolar (NOE) connectivity, and residue specific information, can be utilized in the assignment process. The program can also use known assignments from one form of a protein to facilitate the assignment of another form of the protein. This attribute is useful for assigning protein-ligand complexes when the assignments of the unliganded protein are known. The program can also be used as an interactive research tool to assist in the choice of additional experimental data to facilitate completion of assignments. The assignment of a deuterated 45 kDa homodimeric Glutathione-S-transferase illustrates the principal features of the program.

  3. Investigations on Monte Carlo based coupled core calculations

    International Nuclear Information System (INIS)

    The present trend in advanced and next generation nuclear reactor core designs is towards increased material heterogeneity and geometry complexity. The continuous energy Monte Carlo method has the capability of modeling such core environments with high accuracy. This paper presents results from feasibility studies being performed at the Pennsylvania State University (PSU) on both accelerating Monte Carlo criticality calculations by using hybrid nodal diffusion Monte Carlo schemes and thermal-hydraulic feedback modeling in Monte Carlo core calculations. The computation process is greatly accelerated by calculating the three-dimensional (3D) distributions of fission source and thermal-hydraulics parameters with the coupled NEM/COBRA-TF code and then using the coupled MCNP5/COBRA-TF code to fine-tune the results to obtain an increased accuracy. The PSU NEM code employs cross-sections generated by MCNP5 for pin-cell based nodal compositions. The implementation of the different code modifications facilitating coupled calculations is presented first. Then the coupled hybrid Monte Carlo based code system is applied to a 3D 2×2 pin array extracted from a Boiling Water Reactor (BWR) assembly with reflective radial boundary conditions. The obtained results are discussed and it is shown that performing Monte Carlo based coupled core steady-state calculations is feasible. (authors)

  4. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    International Nuclear Information System (INIS)

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the GEANT4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from GEANT4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt and dose-averaged LET, LETd) using GEANT4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in GEANT4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to determine fluctuations in energy deposition along the
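
    Written out explicitly, the two averages weight each step's LET either by the step length (track average) or by the step's energy deposit (dose average). The minimal scoring sketch below uses synthetic step data; a naive implementation like this is precisely what is exposed to the step-size-limit artefacts analysed in the abstract.

      import numpy as np

      def track_and_dose_averaged_let(edep_MeV, step_cm):
          """Track- and dose-averaged LET (keV/um) from per-step deposits and lengths."""
          edep = np.asarray(edep_MeV)
          step = np.asarray(step_cm)
          let = edep / step                               # MeV/cm for each step
          let_t = np.sum(step * let) / np.sum(step)       # length-weighted (fluence proxy)
          let_d = np.sum(edep * let) / np.sum(edep)       # energy-deposit weighted
          return 0.1 * let_t, 0.1 * let_d                 # 1 MeV/cm = 0.1 keV/um

      # Synthetic step data: fixed 5 um steps with exponentially fluctuating deposits
      rng = np.random.default_rng(2)
      steps = np.full(1000, 5.0e-4)                       # cm
      edep = rng.exponential(4.0e-4, 1000)                # MeV
      print(track_and_dose_averaged_let(edep, steps))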

  5. Fluctuations in the EAS radio signal derived with improved Monte Carlo simulations based on CORSIKA

    CERN Document Server

    Huege, T; Badea, F; Bähren, L; Bekk, K; Bercuci, A; Bertaina, M; Biermann, P L; Blumer, J; Bozdog, H; Brancus, I M; Buitink, S; Bruggemann, M; Buchholz, P; Butcher, H; Chiavassa, A; Daumiller, K; De Bruyn, A G; De Vos, C M; Di Pierro, F; Doll, P; Engel, R; Falcke, H; Gemmeke, H; Ghia, P L; Glasstetter, R; Grupen, C; Haungs, A; Heck, D; Hörandel, J R; Horneffer, A; Kampert, K H; Kant, G W; Klein, U; Kolotaev, Yu; Koopman, Y; Krömer, O; Kuijpers, J; Lafebre, S; Maier, G; Mathes, H J; Mayer, H J; Milke, J; Mitrica, B; Morello, C; Navarra, G; Nehls, S; Nigl, A; Obenland, R; Oehlschläger, J; Ostapchenko, S; Over, S; Pepping, H J; Petcu, M; Petrovic, J; Pierog, T; Plewnia, S; Rebel, H; Risse, A; Roth, M; Schieler, H; Schoonderbeek, G; Sima, O; Stumpert, M; Toma, G; Trinchero, G C; Ulrich, H; Valchierotti, S; Van Buren, J; Van Capellen, W; Walkowiak, W; Weindl, A; Wijnholds, S J; Wochele, J; Zabierowski, J; Zensus, J A; Zimmermann, D; Bowman, J D; Huege, Tim

    2005-01-01

    Cosmic ray air showers are known to emit pulsed radio emission which can be understood as coherent geosynchrotron radiation arising from the deflection of electron-positron pairs in the earth's magnetic field. Here, we present simulations carried out with an improved version of our Monte Carlo code for the calculation of geosynchrotron radiation. Replacing the formerly analytically parametrised longitudinal air shower development with CORSIKA-generated longitudinal profiles, we study the radio flux variations arising from inherent fluctuations between individual air showers. Additionally, we quantify the dependence of the radio emission on the nature of the primary particle by comparing the emission generated by proton- and iron-induced showers. This is only the first step in the incorporation of a more realistic air shower model into our Monte Carlo code. The inclusion of highly realistic CORSIKA-based particle energy, momentum and spatial distributions together with an analytical treatment of ionisation los...

  6. Evaluation of ion chamber dependent correction factors for ionisation chamber dosimetry in proton beams using a Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Palmans, H. [Ghent Univ. (Belgium). Dept. of Biomedical Physics; Verhaegen, F.

    1995-12-01

    In the last decade, several clinical proton beam therapy facilities have been developed. To satisfy the demand for uniformity in clinical (routine) proton beam dosimetry, two dosimetry protocols (ECHED and AAPM) have been published. Both protocols neglect the influence of ion chamber dependent parameters on dose determination in proton beams because of the scatter properties of these beams, although the problem has not been studied thoroughly yet. A comparison between water calorimetry and ionisation chamber dosimetry showed a discrepancy of 2.6% between the former method and ionometry following the ECHED protocol. Possibly, a small part of this difference can be attributed to chamber dependent correction factors. Indications for this possibility are found in ionometry measurements. To allow the simulation of complex geometries with different media necessary for the study of those corrections, an existing proton Monte Carlo code (PTRAN, Berger) has been modified. The original code, which applies Molière's multiple scattering theory and Vavilov's energy straggling theory, calculates depth dose profiles, energy distributions and radial distributions for pencil beams in water. Comparisons with measurements and calculations reported in the literature are done to test the program's accuracy. Preliminary results of the influence of chamber design and chamber materials on dose to water determination are presented.

  7. Evaluation of ion chamber dependent correction factors for ionisation chamber dosimetry in proton beams using a Monte Carlo method

    International Nuclear Information System (INIS)

    In the last decade, several clinical proton beam therapy facilities have been developed. To satisfy the demand for uniformity in clinical (routine) proton beam dosimetry, two dosimetry protocols (ECHED and AAPM) have been published. Both protocols neglect the influence of ion chamber dependent parameters on dose determination in proton beams because of the scatter properties of these beams, although the problem has not been studied thoroughly yet. A comparison between water calorimetry and ionisation chamber dosimetry showed a discrepancy of 2.6% between the former method and ionometry following the ECHED protocol. Possibly, a small part of this difference can be attributed to chamber dependent correction factors. Indications for this possibility are found in ionometry measurements. To allow the simulation of complex geometries with different media necessary for the study of those corrections, an existing proton Monte Carlo code (PTRAN, Berger) has been modified. The original code, which applies Molière's multiple scattering theory and Vavilov's energy straggling theory, calculates depth dose profiles, energy distributions and radial distributions for pencil beams in water. Comparisons with measurements and calculations reported in the literature are done to test the program's accuracy. Preliminary results of the influence of chamber design and chamber materials on dose to water determination are presented

  8. Neutron H*(10) inside a proton therapy facility: comparison between Monte Carlo simulations and WENDI-2 measurements

    International Nuclear Information System (INIS)

    Inside an IBA proton therapy centre, secondary neutrons are produced due to nuclear interactions of the proton beam with matter mainly inside the cyclotron, the beam line, the treatment nozzle and the patient. Accurate measurements of the neutron ambient dose equivalent H*(10) in such a facility require the use of a detector that has a good sensitivity for neutrons ranging from thermal energies up to 230 MeV, such as for instance the WENDI-2 detector. WENDI-2 measurements have been performed at the Westdeutsches Protonentherapiezentrum Essen, at several positions around the cyclotron room and around a gantry treatment room operated in two different beam delivery modes: Pencil Beam Scanning and Double Scattering. These measurements are compared with Monte Carlo simulation results for the neutron H*(10) obtained with MCNPX 2.5.0 and GEANT4 9.6. In proton therapy, proton beams with energies up to typically 230 MeV are used to treat cancerous tumours very efficiently while sparing surrounding healthy tissues as much as possible. Due to nuclear interactions of the proton beams with matter, mainly inside the cyclotron, the beam line, the treatment nozzle and the patient, secondary neutrons with energies up to 230 MeV are unfortunately produced, as well as photons up to ∼10 MeV. Behind the thick concrete shielding walls which are necessary to attenuate the stray radiation fields, the total ambient dose equivalent H*(10) is very large due to the neutron component. In shielding studies for proton therapy facilities, the neutron H*(10) component is often evaluated using the Monte Carlo codes MCNPX(5), FLUKA(6) or PHITS(7). Recent benchmark simulations performed with GEANT4 have shown that this code would also be a suitable tool for the shielding studies of proton therapy centres. The experimental validation of such shielding studies requires the use of a detector with a good sensitivity for neutrons ranging from thermal energies up to 230 MeV, such as for example the

  9. MCHIT - Monte Carlo model for proton and heavy-ion therapy

    CERN Document Server

    Pshenichnov, Igor; Greiner, Walter

    2007-01-01

    We study the propagation of nucleons and nuclei in tissue-like media within a Monte Carlo Model for Heavy-ion Therapy (MCHIT) based on the GEANT4 toolkit (version 8.2). The model takes into account fragmentation of projectile nuclei and secondary interactions of produced nuclear fragments. Model predictions are validated with available experimental data obtained for water and PMMA phantoms irradiated by monoenergetic carbon-ion beams. The MCHIT model describes well (1) the depth-dose distributions in water and PMMA, (2) the doses measured for fragments of certain charge, (3) the distributions of positron emitting nuclear fragments produced by carbon-ion beams, and (4) the energy spectra of secondary neutrons measured at different angles to the beam direction. Radial dose profiles for primary nuclei and for different projectile fragments are calculated and discussed as possible input for evaluation of biological dose distributions. It is shown that at the periphery of the transverse dose profile close to the B...

  10. Comparison of Monte Carlo simulations with proton experiment for a thick Au absorber

    International Nuclear Information System (INIS)

    Proton therapy applications deal with relatively thick targets like the human head or the trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4, could lead to significant disagreement in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents a comparison of proton energy spectra for 49.1 MeV protons passing through a couple of Au absorbers with different thicknesses obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models. The comparison was made with the experimental data of Tschalaer, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and the Payne analytical solution for the transport equation in the Fokker-Planck approximation. It is shown that the simulations reproduce the experimental spectra with some detectable contradictions. It should be noted that all the spectra lie at proton energies significantly above 2 MeV, i.e. in the so-called 'Bethe-Bloch region'. Therefore the observed disagreements in GEANT4 results, simulated with different models, are somewhat unexpected. Further studies for a better understanding and to obtain definitive conclusions are necessary. (author)

  11. Comparison of Monte Carlo simulations with proton experiment for a thick Au absorber

    Energy Technology Data Exchange (ETDEWEB)

    Yevseyeva, Olga; Assis, Joaquim T. de, E-mail: yevseveva@iprj.uerj.b, E-mail: joaquim@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, Joao A.P., E-mail: evseev@utfpr.edu.b, E-mail: schelin@utfpr.edu.b, E-mail: sergei@utfpr.edu.b, E-mail: edneymilhoretto@yahoo.co, E-mail: jsetti@gmail.co [Universidade Tecnologica Federal do Parana, Curitiba, PR (Brazil); Diaz, Katherin S., E-mail: kshtejer@infomed.sld.c [Centro de Aplicaciones Tecnologicas y Desarrollo Nuclear, Havana (Cuba); Hormaza, Joel M., E-mail: jmesa@ibb.unesp.b [UNESP, Botucatu, SP (Brazil). Inst. de Biociencias; Lopes, Ricardo T., E-mail: ricardo@lin.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear

    2009-07-01

    Proton therapy applications deal with relatively thick targets like the human head or the trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4, could lead to significant disagreement in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents a comparison of proton energy spectra for 49.1 MeV protons passing through a couple of Au absorbers with different thicknesses obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models. The comparison was made with the experimental data of Tschalaer, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and the Payne analytical solution for the transport equation in the Fokker-Planck approximation. It is shown that the simulations reproduce the experimental spectra with some detectable contradictions. It should be noted that all the spectra lie at proton energies significantly above 2 MeV, i.e. in the so-called 'Bethe-Bloch region'. Therefore the observed disagreements in GEANT4 results, simulated with different models, are somewhat unexpected. Further studies for a better understanding and to obtain definitive conclusions are necessary. (author)

  12. Monte Carlo study on the sensitivity of prompt gamma imaging to proton range variations due to interfractional changes in prostate cancer patients

    Science.gov (United States)

    Schmid, S.; Landry, G.; Thieke, C.; Verhaegen, F.; Ganswindt, U.; Belka, C.; Parodi, K.; Dedes, G.

    2015-12-01

    Proton range verification based on prompt gamma imaging is increasingly considered in proton therapy. Tissue heterogeneity normal to the beam direction or near the end of range may considerably degrade the ability of prompt gamma imaging to detect proton range shifts. The goal of this study was to systematically investigate the accuracy and precision of range detection from prompt gamma emission profiles for various fractions for intensity modulated proton therapy of prostate cancer, using a comprehensive clinical dataset of 15 different CT scans for 5 patients. Monte Carlo simulations using Geant4 were performed to generate spot-by-spot dose distributions and prompt gamma emission profiles for prostate treatment plans. The prompt gammas were scored at their point of emission. Three CT scans of the same patient were used to evaluate the impact of inter-fractional changes on proton range. The range shifts deduced from the comparison of prompt gamma emission profiles in the planning CT and subsequent CTs were then correlated to the corresponding range shifts deduced from the dose distributions for individual pencil beams. The distributions of range shift differences between prompt gamma and dose were evaluated in terms of precision (defined as half the 95% inter-percentile range, IPR) and accuracy (median). In total about 1700 individual proton pencil beams were investigated. The IPR of the relative range shift differences between the dose profiles and the prompt gamma profiles varied between ±1.4 mm and ±2.9 mm when using the more robust profile shifting analysis. The median was found to be smaller than 1 mm. Methods to identify and reject unreliable spots for range verification due to range mixing were derived and resulted in an average 10% spot rejection, clearly improving the prompt gamma-dose correlation. This work supports the conclusion that prompt gamma imaging can offer a reliable indicator of range changes due to anatomical variations and tissue heterogeneity
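
    The precision and accuracy metrics defined above can be computed directly from the per-pencil-beam range-shift differences. A minimal sketch (the input values below are fabricated for illustration only):

        import numpy as np

        # Range-shift differences (prompt gamma minus dose, in mm), one per
        # pencil beam; fabricated values standing in for the ~1700 spots.
        shift_diff_mm = np.random.default_rng(1).normal(0.3, 1.2, 1700)

        precision = 0.5 * (np.percentile(shift_diff_mm, 97.5)
                           - np.percentile(shift_diff_mm, 2.5))   # half 95% IPR
        accuracy = np.median(shift_diff_mm)                        # median offset
        print(f"precision = ±{precision:.1f} mm, accuracy = {accuracy:.1f} mm")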

  13. Impact of the material composition on proton range variation - A Monte Carlo study

    Science.gov (United States)

    Wu, S. W.; Tung, C. J.; Lee, C. C.; Fan, K. H.; Huang, H. C.; Chao, T. C.

    2015-11-01

    In this study, we used the Geant4 toolkit to demonstrate the impacts of the material composition of tissues on proton range variation. Bragg curves of different materials subjected to a 250 MeV mono-energy proton beam were simulated and compared. These simulated materials included adipose, heart, brain, cartilage, cortical bone and water. The results showed that there was significant proton range deviation between Bragg curves, especially for cortical bone. The R50 values for a 250 MeV proton beam were approximately 39.55 cm, 35.52 cm, 37.00 cm, 36.51 cm, 36.72 cm, 22.53 cm, and 38.52 cm in the phantoms that were composed completely of adipose, cartilage, tissue, heart, brain, cortical bone, and water, respectively. Mass density and electron density were used to scale the proton range for each material; electron density provided better range scaling. In addition, a similar comparison was performed by artificially setting all material densities to 1.0 g/cm3 to evaluate the range deviation due to the chemical composition alone. Tissue heterogeneity effects due to density variation were more significant than those due to variation in chemical composition, unless the Z/A ratio was very different.
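
    A rough illustration of the two scaling approaches compared above: dividing the water range by the relative mass density or by the relative electron density (proportional to rho times Z/A). The material values below are nominal ICRU-like numbers used only as examples, not the exact compositions of the study.

        # Range scaling sketch; material data are assumed example values.
        R50_WATER_CM = 38.52          # 250 MeV R50 in water quoted above
        Z_OVER_A_WATER = 0.5551

        materials = {                  # name: (mass density g/cm3, Z/A) -- assumed
            "adipose":       (0.95, 0.5558),
            "cortical bone": (1.92, 0.5148),
        }

        for name, (rho, z_over_a) in materials.items():
            by_mass_density = R50_WATER_CM / rho
            rel_electron_density = rho * z_over_a / Z_OVER_A_WATER
            by_electron_density = R50_WATER_CM / rel_electron_density
            print(f"{name:14s}  rho-scaled {by_mass_density:5.1f} cm,"
                  f"  e-density-scaled {by_electron_density:5.1f} cm")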

  14. SU-E-T-591: Measurement and Monte Carlo Simulation of Stray Neutrons in Passive Scattering Proton Therapy: Needs and Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Farah, J; Bonfrate, A; Donadille, L; Dubourg, N; Lacoste, V; Martinetti, F; Sayah, R; Trompier, F; Clairand, I [IRSN - Institute for Radiological Protection and Nuclear Safety, Fontenay-aux-roses (France); Caresana, M [Politecnico di Milano, Milano (Italy); Delacroix, S; Nauraye, C [Institut Curie - Centre de Protontherapie d Orsay, Orsay (France); Herault, J [Centre Antoine Lacassagne, Nice (France); Piau, S; Vabre, I [Institut de Physique Nucleaire d Orsay, Orsay (France)

    2014-06-01

    Purpose: Measure stray radiation inside a passive scattering proton therapy facility, compare values to Monte Carlo (MC) simulations and identify the actual needs and challenges. Methods: Measurements and MC simulations were used to assess the neutron exposure associated with 75 MeV ocular or 180 MeV intracranial passively scattered proton treatments. First, using a specifically designed high-sensitivity Bonner Sphere system, neutron spectra were measured at different positions inside the treatment rooms. Next, measurement-based mapping of the neutron ambient dose equivalent was performed using several TEPCs and rem-meters. Finally, photon and neutron organ doses were measured using TLDs, RPLs and PADCs set inside anthropomorphic phantoms (Rando, 1- and 5-year-old CIRS). All measurements were also simulated with MCNPX to investigate the efficiency of MC models in predicting stray neutrons considering different nuclear cross sections and models. Results: Knowledge of the neutron fluence and energy distribution inside a proton therapy room is critical for stray radiation dosimetry. However, as spectrometry unfolding is initiated using an MC guess spectrum and suffers from algorithmic limits, a 20% spectrometry uncertainty is expected. H*(10) mapping with TEPCs and rem-meters showed a good agreement between the detectors. Differences within measurement uncertainty (10–15%) were observed and are inherent to the energy, fluence and directional response of each detector. For a typical ocular and intracranial treatment respectively, neutron doses outside the clinical target volume of 0.4 and 11 mGy were measured inside the Rando phantom. Photon doses were 2–10 times lower depending on organ position. High uncertainties (40%) are inherent to TLD and PADC measurements due to the need for neutron spectra at the detector position. Finally, stray neutron prediction with MC simulations proved to be extremely dependent on proton beam energy and the used nuclear models and

  15. Integration and evaluation of automated Monte Carlo simulations in the clinical practice of scanned proton and carbon ion beam therapy

    International Nuclear Information System (INIS)

    Monte Carlo (MC) simulations of beam interaction and transport in matter are increasingly considered as essential tools to support several aspects of radiation therapy. Despite the vast application of MC to photon therapy and scattered proton therapy, clinical experience in scanned ion beam therapy is still scarce. This is especially the case for ions heavier than protons, which pose additional issues like nuclear fragmentation and varying biological effectiveness. In this work, we present the evaluation of a dedicated framework which has been developed at the Heidelberg Ion Beam Therapy Center to provide automated FLUKA MC simulations of clinical patient treatments with scanned proton and carbon ion beams. Investigations on the number of transported primaries and the dimension of the geometry and scoring grids have been performed for a representative class of patient cases in order to provide recommendations on the simulation settings, showing that recommendations derived from the experience in proton therapy cannot be directly translated to the case of carbon ion beams. The MC results with the optimized settings have been compared to the calculations of the analytical treatment planning system (TPS), showing that regardless of the consistency of the two systems (in terms of beam model in water and range calculation in different materials) relevant differences can be found in dosimetric quantities and range, especially in the case of heterogeneous and deep seated treatment sites depending on the ion beam species and energies, homogeneity of the traversed tissue and size of the treated volume. The analysis of typical TPS speed-up approximations highlighted effects which deserve accurate treatment, in contrast to adequate beam model simplifications for scanned ion beam therapy. In terms of biological dose calculations, the investigation of the mixed field components in realistic anatomical situations confirmed the findings of previous groups so far reported only in

  16. Monte Carlo simulations of soft proton flares: testing the physics with XMM-Newton

    CERN Document Server

    Fioretti, Valentina; Malaguti, Giuseppe; Spiga, Daniele; Tiengo, Andrea

    2016-01-01

    Low energy protons (<100-300 keV) in the Van Allen belt and the outer regions can enter the field of view of X-ray focusing telescopes, interact with the Wolter-I optics, and reach the focal plane. The use of special filters protects the XMM-Newton focal plane below an altitude of 70000 km, but above this limit the effect of soft protons is still present in the form of sudden flares in the count rate of the EPIC instruments, causing the loss of large amounts of observing time. We try to characterize the input proton population and the physics interaction by simulating, using the BoGEMMS framework, the proton interaction with a simplified model of the X-ray mirror module and the focal plane, and comparing the result with a real observation. The analysis of ten orbits of observations of the EPIC/pn instrument shows that the detection of flares in regions far outside the radiation belt is largely influenced by the different orientation of the Earth's magnetosphere with respect to XMM-Newton's orbit, confirming th...

  17. EPEWAX - a Monte Carlo generator for W production in electron proton scattering

    Energy Technology Data Exchange (ETDEWEB)

    Theuer, E. (RWTH Aachen, 1. Physikalisches Institut (Germany))

    1992-04-01

    EPEWAX is a Monte Carlo event generator intended to simulate the production of free single W bosons at an eP collider according to the process: e-P → e-W(→f1f2) X. It offers the possibility of completely simulating W production down to the particle level. Because the WW-photon couplings are not fixed, it allows the study of non-standard-model W production. (orig.).

  18. EPEWAX - a Monte Carlo generator for W production in electron proton scattering

    International Nuclear Information System (INIS)

    EPEWAX is a Monte Carlo event generator intended to simulate the production of free single W bosons at an eP collider according to the process: e-P → e-W(→f1f2) X. It offers the possibility of completely simulating W production down to the particle level. Because the WW-photon couplings are not fixed, it allows the study of non-standard-model W production. (orig.)

  19. Self-consistent Monte Carlo simulations of proton acceleration in coronal shocks: Effect of anisotropic pitch-angle scattering of particles

    CERN Document Server

    Afanasiev, Alexandr; Vainio, Rami

    2016-01-01

    Context. Solar energetic particles observed in association with coronal mass ejections (CMEs) are produced by the CME-driven shock waves. The acceleration of particles is considered to be due to diffusive shock acceleration (DSA). Aims. We aim at a better understanding of DSA in the case of quasi-parallel shocks, in which self-generated turbulence in the shock vicinity plays a key role. Methods. We have developed and applied a new Monte Carlo simulation code for acceleration of protons in parallel coronal shocks. The code performs a self-consistent calculation of resonant interactions of particles with Alfvén waves based on the quasi-linear theory. In contrast to the existing Monte Carlo codes of DSA, the new code features the full quasi-linear resonance condition of particle pitch-angle scattering. This allows us to take anisotropy of particle pitch-angle scattering into account, while the older codes implement an approximate resonance condition leading to isotropic scattering. We performed simulations with...

  20. GPU based Monte Carlo for PET image reconstruction: detector modeling

    International Nuclear Information System (INIS)

    Given the similarities between visible light transport and neutral particle trajectories, Graphics Processing Units (GPUs) are almost like dedicated hardware designed for the specific task of Monte Carlo (MC) particle transport calculations. A GPU based MC gamma transport code has been developed for Positron Emission Tomography iterative image reconstruction, calculating the projection from unknowns to data at each iteration step and taking into account the full physics of the system. This paper describes the simplified scintillation detector modeling and its effect on convergence. (author)

  1. Feasibility study of the neutron dose for real-time image-guided proton therapy: A Monte Carlo study

    Science.gov (United States)

    Kim, Jin Sung; Shin, Jung Suk; Kim, Daehyun; Shin, Eunhyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih

    2015-07-01

    Two full rotating gantries with different nozzles (a multipurpose nozzle with MLC and a scanning-dedicated nozzle) for a conventional cyclotron system are installed and being commissioned for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to use Monte Carlo simulation to investigate the neutron dose equivalent per therapeutic dose, H/D, for X-ray imaging equipment under various treatment conditions. First, we investigated the H/D for various modifications of the beamline devices (scattering, scanning, multi-leaf collimator, aperture, compensator) at the isocenter and at 20, 40 and 60 cm distances from the isocenter, and we compared our results with those of other research groups. Next, we investigated the neutron dose at the X-ray equipment used for real-time imaging under various treatment conditions. Our investigation showed doses of 0.07-0.19 mSv/Gy at the X-ray imaging equipment, depending on the treatment option; interestingly, a 50% neutron dose reduction due to the multileaf collimator was observed during proton scanning treatment with the multipurpose nozzle. In future studies, we plan to measure the neutron dose experimentally and to validate the simulation data for X-ray imaging equipment for use as an additional neutron dose reduction method.

  2. Monte Carlo Predictions of Proton SEE Cross-Sections from Heavy Ion Test Data

    CERN Document Server

    Xi, Kai; Zhang, Zhan-Gang; Hou, Ming-Dong; Sun, You-Mei; Luo, Jie; Liu, Tian-Qi; Wang, Bin; Ye, Bing; Yin, Ya-Nan; Liu, Jie

    2015-01-01

    The limitations of previous methods prompted us to design a new approach (named PRESTAGE) to predict proton single event effect (SEE) cross-sections using heavy-ion test data. To more realistically simulate the SEE mechanisms, we adopt Geant4 and a location-dependent strategy to describe the physics processes and the sensitivity of the device. Cross-sections predicted by PRESTAGE for over twenty devices are compared with the measured data. Evidence shows that PRESTAGE can calculate not only single event upsets induced by proton indirect ionization, but also direct ionization effects and single event latch-ups. Most of the PRESTAGE-calculated results agree with the experimental data within a factor of 2-3.

  3. Range degradation and distal edge behavior of proton radiotherapy beams using 11C activation and Monte Carlo simulation

    Science.gov (United States)

    Elmekawy, Ahmed Farouk

    The distal edge of therapeutic proton radiation beams was investigated by different methods. Proton beams produced at the Hampton University Proton Therapy Institute (HUPTI) were used to irradiate a polymethylmethacrylate (PMMA) phantom at three different ranges (13.5, 17.0 and 21.0 cm) to investigate the distal slope dependence of the Bragg peak. The activation of 11C was studied by scanning the phantom less than 10 minutes post-irradiation with a Philips Big Bore Gemini PET/CT. The DICOM images were imported into the Varian Eclipse Treatment Planning System (TPS) for analysis and then analyzed with ImageJ. The distal slope ranged from -0.1671 +/- 0.0036 to -0.1986 +/- 0.0052 (pixel intensity/slice number) for ranges 13.5 to 21.0 cm, respectively. A realistic description of the setup was modeled using the GATE 7.0 Monte Carlo simulation tool and compared to the experimental data. The results show that the distal slope ranged from -0.1158 +/- 0.0133 to -0.0787 +/- 0.002 (Gy/mm). Additionally, low-activity 11C distributions were simulated to study the dependence of the reconstructed 11C half-life on the initial activity for six ranges chosen around the previous activation study. The expected/nominal half-life vs. activity ranged from -5 x 10^-4 +/- 2.8104 x 10^-4 to 1.6 x 10^-3 +/- 9.44 x 10^-4 (% diff./Bq). The comparison between two experiments with proton beams on a PMMA phantom and a multi-layer ion chamber, and two GATE simulations of a proton beam incident on a water phantom and an 11C PET study, shows that: (i) the variations in the steepness of the distal fall-off slopes are found to be similar, thus validating the sensitivity of the PET technique to range degradation, and (ii) the average difference of the super-ratios observed between all studies is primarily due to the difference in the dose deposited in the media.

  4. Validation of proton ionization cross section generators for Monte Carlo particle transport

    CERN Document Server

    Batic, Matej; Saracco, Paolo

    2011-01-01

    Three software systems, ERCS08, ISICS 2011 and Šmit's code, that implement theoretical calculations of inner shell ionization cross sections by proton impact, are validated with respect to experimental data. The accuracy of the cross sections they generate is quantitatively estimated and inter-compared through statistical methods. Updates and extensions of a cross section data library relevant to PIXE simulation with Geant4 are discussed.

  5. Analysis of the track- and dose-averaged LET and LET spectra in proton therapy using the GEANT4 Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Guan, Fada; Peeler, Christopher; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Mohan, Radhe; Titt, Uwe, E-mail: UTitt@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States); Bronk, Lawrence [Department of Experimental Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States); Geng, Changran [Department of Nuclear Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China and Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States); Grosshans, David [Department of Experimental Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States)

    2015-11-15

    Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the GEANT 4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from GEANT 4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LETt and dose-averaged LET, LETd) using GEANT 4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LETt and LETd of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LETt but significant for LETd. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in GEANT 4 can result in incorrect LETd calculation results in the dose plateau region for small step limits. The erroneous LETd results can be attributed to the algorithm to

  6. 18F-FET-PET-based dose painting by numbers with protons

    Energy Technology Data Exchange (ETDEWEB)

    Rickhey, Mark; Moravek, Zdenek; Koelbl, Oliver; Bogner, Ludwig [Dept. of Radiotherapy, Univ. of Regensburg (Germany); Eilles, Christoph [Dept. of Nuclear Medicine, Univ. of Regensburg (Germany)

    2010-06-15

    Purpose: to investigate the potential of 18F-fluoroethyltyrosine positron emission tomography (18F-FET-PET)-based dose painting by numbers with protons. Material and methods: due to its high specificity to brain tumor cells, FET has a high potential to serve as a target for dose painting by numbers. Biological image-based dose painting might lead to an inhomogeneous dose prescription. For precise treatment planning of such a prescribed dose, an intensity-modulated radiotherapy (IMRT) algorithm including a Monte Carlo dose-calculation algorithm for spot-scanning protons was used. A linear tracer-uptake-to-dose model was used to derive a dose prescription from the 18F-FET-PET. As a first investigation, a modified modulation transfer function (MTF) of protons was evaluated and compared to the MTF of photons. In a clinically adapted planning study, the feasibility of 18F-FET-PET-based dose painting with protons was demonstrated using three patients with glioblastoma multiforme. The resulting dose distributions were evaluated by means of dose-difference and dose-volume histograms and compared to IMRT data. Results: the MTF for protons was constantly above that for photons. The standard deviations of the dose differences between the prescribed and the optimized dose were smaller in the case of protons compared to photons. Furthermore, the escalation study showed that the doses within the subvolumes identified by biological imaging techniques could be escalated considerably while the dose within the organs at risk was kept at a constant level. Conclusion: the presented investigation confirms the feasibility of 18F-FET-PET-based dose painting with protons. (orig.)

  7. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Purpose: Predicted PET images based on an analytical filtering approach for proton range verification have been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification with the GATE/GEANT4 Monte Carlo simulation codes. Methods: In this study, we performed two experiments for validation of the β+-isotope yields predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of predicting β+ yields as a function of the irradiated proton energies. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The β+-yield distributions filtered by the analytical model are compared with the MC-simulated β+ yields in the proximal and distal fall-off ranges. Results: The results compare the filtered and MC-simulated β+-yield distributions under different conditions. First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range difference between the filtered and MC-simulated β+ yields at the distal fall-off region is within 1.5 mm for all materials used. These findings validate the usefulness of the analytical filtering model for range verification of proton therapy on GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+ yields using the GATE code, especially in the proximal region. This discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies between the MC-simulated and predicted β+-yield distributions were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using
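
    As a schematic illustration only (not the authors' specific filter functions), the filtering idea can be sketched as a convolution of a proton depth profile with a kernel to predict the β+-emitter depth distribution; the profile and the Gaussian-like kernel below are arbitrary placeholders.

        import numpy as np

        # Schematic of the analytical filtering idea: predict the beta+ depth
        # distribution by filtering a depth profile; all inputs are fabricated.
        z = np.linspace(0.0, 200.0, 401)                            # depth grid (mm)
        depth_profile = np.where(z < 150.0, 1.0 + z / 300.0, 0.0)   # toy proton profile

        kernel_z = np.arange(-30.0, 30.1, 0.5)
        kernel = np.exp(-0.5 * (kernel_z / 8.0) ** 2)               # placeholder kernel
        kernel /= kernel.sum()

        predicted_beta_plus = np.convolve(depth_profile, kernel, mode="same")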

  8. Proton irradiation on textured bismuth based cuprate superconductors

    International Nuclear Information System (INIS)

    Textured bulk polycrystalline samples of bismuth based cuprate superconductors have been subjected to irradiation with 15 MeV protons. In case of Bi-2212, there has been substantial increase in Tc, which may be due to proton induced knock-out of loosely bound oxygen. In case of (Bi,Pb)-2223, there has been a reduction in Tc. The difference in behaviour in these two systems towards proton irradiation has been explained. (author). 7 refs., 3 figs., 1 tab

  9. Fuel-Cell Electrolytes Based on Organosilica Hybrid Proton Conductors

    Science.gov (United States)

    Narayan, Sri R.; Yen, Shiao-Pin S.

    2008-01-01

    A new membrane composite material that combines an organosilica proton conductor with perfluorinated Nafion material to achieve good proton conductivity and high-temperature performance for membranes used for fuel cells in stationary, transportation, and portable applications has been developed. To achieve high proton conductivities of the order of 10^-1 S/cm over a wide range of temperatures, a composite membrane based on a new class of mesoporous, proton-conducting, hydrogen-bonded organosilica, used with Nafion, will allow for water retention and high proton conductivity over a wider range of temperatures than currently offered by Nafion alone. At the time of this reporting, this innovation is at the concept level. Some of the materials and processes investigated have shown good proton conductivity, but membranes have not yet been prepared and demonstrated.

  10. Proton Spin Based On Chiral Dynamics

    OpenAIRE

    Weber, H. J.

    1999-01-01

    Chiral spin fraction models agree with the proton spin data only when the chiral quark-Goldstone boson couplings are pure spin-flip. For axial-vector coupling from soft-pion physics this is true for massless quarks but not for constituent quarks. Axial-vector quark-Goldstone boson couplings with constituent quarks are found to be inconsistent with the proton spin data.

  11. Monte Carlo based weighting functions in neutron capture measurements

    International Nuclear Information System (INIS)

    To determine neutron capture cross sections using C6D6 detectors, the Pulse Height Weighting Technique (PHWT) is mostly applied. The weighting function depends on the response function of the detection system in use. Therefore, the quality of the data depends on the detector response used for the calculation of the weighting function. An experimental determination of the response of C6D6 detectors is not always straightforward. We determined the detector response and, hence, the weighting function from Monte Carlo simulations, using the MCNP 4C2 code. To obtain reliable results, a considerable effort was made in preparing geometry input files describing the experimental conditions. To validate the results of the Monte Carlo simulations we performed several experiments at GELINA. First, we measured the C6D6 detector response for standard γ-ray sources and for selected resonances in the 206Pb(n,γ) reaction. These responses were compared with the ones based on Monte Carlo simulations. The good agreement between experimental and simulated data confirms the reliability of the Monte Carlo simulations. As a second validation exercise, we also determined the normalization factor for Ag and Au samples of different composition and thickness and the neutron width of the 1.15 keV resonance in 56Fe using samples of different compositions. The result of this validation exercise was that the photon transport and the coupling of the photon and neutron transport must be accounted for in the determination of the weighting function. Accurate weighting functions are required for capture reactions in nuclei where the gamma cascade differs strongly from resonance to resonance, and are extremely important for neutron data related to reactor technologies where Pb isotopes play an important role. The Monte Carlo based weighting functions have been used to deduce the capture yield of 206Pb between 3 and 620 keV and of 232Th between 5 and 150 keV. This method will also be used for the analysis of other neutron capture
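
    The weighting principle can be illustrated with a small sketch: given a simulated detector response matrix, polynomial coefficients are fitted so that the weighted response to a gamma ray of energy E_gamma is proportional to E_gamma. The toy response matrix below is fabricated; in the work above it would come from the MCNP simulations of the C6D6 setup.

        import numpy as np

        # Toy Pulse Height Weighting Technique: fit a polynomial W(E_dep) so
        # that sum_i R(E_gamma, E_i) * W(E_i) ~ E_gamma.  R is fabricated here.
        e_dep = np.linspace(0.1, 8.0, 80)            # deposited-energy bins (MeV)
        e_gamma = np.linspace(0.5, 8.0, 16)          # incident gamma energies (MeV)

        # fabricated response: flat Compton-like continuum up to E_gamma
        resp = np.array([(e_dep <= eg).astype(float) / max((e_dep <= eg).sum(), 1)
                         for eg in e_gamma])

        order = 4
        basis = np.vander(e_dep, order + 1, increasing=True)    # 1, E, E^2, ...
        design = resp @ basis                                    # weighted responses
        coeffs, *_ = np.linalg.lstsq(design, e_gamma, rcond=None)

        weighting_function = basis @ coeffs    # W(E_dep) on the deposited-energy grid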

  12. Quantitative Monte Carlo-based holmium-166 SPECT reconstruction

    International Nuclear Information System (INIS)

    Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 (166Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative 166Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum. Methods: A fast Monte Carlo (MC) simulator was developed for simulation of 166Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full 166Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (Aest) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six 166Ho RE patients. Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80% (SPECT-ppMC+DSW) to 76%–103
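
    A Monte Carlo forward projector of this kind is typically embedded in an iterative statistical reconstruction. The generic MLEM update below (with a small random stand-in for the system matrix, not the actual SPECT-fMC implementation) shows where such a projector enters.

        import numpy as np

        # Generic MLEM loop; in the method above, the forward projection A @ x
        # and backprojection A.T @ ratio would be evaluated by the fast MC code.
        rng = np.random.default_rng(5)
        A = rng.random((64, 32))                 # stand-in projector (detector x voxel)
        true_x = rng.random(32)
        y = rng.poisson(A @ true_x * 50) / 50.0  # noisy measured projections

        x = np.ones(32)                          # uniform initial activity estimate
        sens = A.sum(axis=0)                     # sensitivity (backprojection of ones)
        for _ in range(50):
            ratio = y / np.maximum(A @ x, 1e-12)
            x *= (A.T @ ratio) / sens            # multiplicative MLEM update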

  13. Feasibility study of proton-based quality assurance of proton range compensator

    Science.gov (United States)

    Park, S.; Jeong, C.; Min, B. J.; Kwak, J.; Lee, J.; Cho, S.; Shin, D.; Lim, Y. K.; Park, S. Y.; Lee, S. B.

    2013-06-01

    All patient-specific range compensators (RCs) are customized to achieve distal dose conformity to the target volume in passively scattered proton therapy. Compensators are milled precisely using a computerized machine. In proton therapy, precision of the compensator is critical, and quality assurance (QA) is required to protect normal tissues and organs from radiation damage. This study aims to evaluate the precision of proton-based quality assurance of the range compensator. First, the geometry information of two compensators was extracted from the DICOM Radiotherapy (RT) plan. Next, the RCs were individually irradiated onto EBT film with a proton beam modulated to have a photon-like percent depth dose (PDD). Step phantoms were also irradiated onto EBT film to generate a calibration curve relating the optical density of the irradiated film to the perpendicular depth of the compensator. Comparisons were made using the mean absolute difference (MAD) between the coordinate information from the DICOM RT plan and the depth information converted from the EBT film. The MAD over the whole region was 1.7 and 2.0 mm for the two compensators. However, the MAD over the relatively flat regions selected on each compensator for comparison was within 1 mm. These results show that proton-based quality assurance of the range compensator is feasible, and a whole-region MAD of less than 1 mm is expected to be achievable with further correction for the scattering effect in proton imaging.
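
    The comparison metric is straightforward to reproduce; a minimal sketch, with both depth maps fabricated purely for illustration (in the study they come from the DICOM RT plan and the film-calibration conversion):

        import numpy as np

        # Mean absolute difference (MAD) between planned and film-derived depths.
        planned_depth_mm = np.random.default_rng(2).uniform(5, 60, (50, 50))
        film_depth_mm = planned_depth_mm + np.random.default_rng(3).normal(0, 2, (50, 50))

        mad_whole = np.mean(np.abs(film_depth_mm - planned_depth_mm))

        flat_region = (slice(10, 20), slice(10, 20))          # hypothetical flat area
        mad_flat = np.mean(np.abs(film_depth_mm[flat_region]
                                  - planned_depth_mm[flat_region]))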

  14. Monte Carlo simulation of single spin asymmetries in pion-proton collisions

    CERN Document Server

    Bianconi, A; Bianconi, Andrea; Radici, Marco

    2006-01-01

    We present Monte Carlo simulations of both the Sivers and the Boer-Mulders effects in the polarized Drell-Yan $\pi^\pm p^\uparrow \to \mu^+ \mu^- X$ process at the center-of-mass energy $\sqrt{s} \sim 14$ GeV reachable at COMPASS with pion beams of energy 100 GeV. For the Sivers effect, we adopt two different parametrizations for the Sivers function to explore the statistical accuracy required to extract unambiguous information on this parton density. In particular, we verify the possibility of checking its predicted sign change between Semi-Inclusive Deep-Inelastic Scattering (SIDIS) and Drell-Yan processes, a crucial test of nonperturbative QCD. For the Boer-Mulders effect, because of the lack of parametrizations we can make only guesses. The goal is to explore the possibility of extracting information on the transversity distribution, the missing piece necessary to complete the knowledge of the nucleon spin structure at leading twist, and the Boer-Mulders function, which is related to the long-standing pro...

  15. Measurement of LET (linear energy transfer) spectra using CR-39 at different depths of water irradiated by 171 MeV protons: A comparison with Monte Carlo simulation

    Science.gov (United States)

    Sahoo, G. S.; Tripathy, S. P.; Molokanov, A. G.; Aleynikov, V. E.; Sharma, S. D.; Bandyopadhyay, T.

    2016-05-01

    In this work, we have used CR-39 detectors to estimate the LET (linear energy transfer) spectrum of secondary particles due to a 171 MeV proton beam at different depths in water, including the Bragg peak region. The measured LET spectra were compared with those obtained from a FLUKA Monte Carlo simulation. The absorbed dose (DLET) and dose equivalent (HLET) were estimated using the LET spectra. The values of DLET and HLET per incident proton fluence were found to increase with increasing depth of water and were maximum at the Bragg peak.
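
    The step from an LET-fluence spectrum to absorbed dose and dose equivalent can be sketched as follows. The spectrum below is fabricated, and the ICRP 60 quality factor Q(L) is used as an assumed choice; the paper does not state which Q(L) relation was applied.

        import numpy as np

        # Absorbed dose and dose equivalent from an LET-fluence spectrum.
        def q_icrp60(let_kev_um):
            # ICRP 60 quality factor vs. unrestricted LET in keV/um
            L = np.asarray(let_kev_um, dtype=float)
            q = np.ones_like(L)
            mid = (L >= 10) & (L < 100)
            q[mid] = 0.32 * L[mid] - 2.2
            q[L >= 100] = 300.0 / np.sqrt(L[L >= 100])
            return q

        let = np.logspace(0, 3, 60)               # LET bins, keV/um
        fluence = 1e6 * let**-2                   # fabricated fluence spectrum, cm^-2

        # 1 keV/um per unit fluence (cm^-2) in unit-density tissue -> 1.602e-9 Gy
        dose_gy = np.sum(fluence * let) * 1.602e-9
        dose_equiv_sv = np.sum(fluence * let * q_icrp60(let)) * 1.602e-9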

  16. Contribution to proton transport simulation from the MeV range to the keV range by the Monte-Carlo method

    International Nuclear Information System (INIS)

    This study is a contribution to the development of slow-proton transport simulation. Atomic inner-shell ionization is studied in the Plane Wave Born Approximation and in the Binary Encounter Approximation. The Brinkman-Kramers and Dmitriev theories are used to study charge-exchange phenomena. Proton slowing down is studied with Brice's stopping power, with the Vavilov and Symon energy straggling distributions, and with the Molière, Keil and Meyer angular deflection distributions. The transport simulation is performed with the Monte Carlo method; the motion of K-shell electrons is also taken into account

  17. SU-E-T-243: MonteCarlo Simulation Study of Polymer and Radiochromic Gel for Three-Dimensional Proton Dose Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Park, M; Jung, H; Kim, G; Ji, Y; Kim, K; Park, S [Korea Institute of Radiological and Medical Sciences, Seoul, Nowon-gu (Korea, Republic of)

    2014-06-01

    Purpose: To estimate the three-dimensional dose distributions in a polymer gel and a radiochromic gel exposed to proton beams, in comparison with a virtual water phantom, by applying Monte Carlo simulation. Methods: The polymer gel dosimeter is a composite material of gelatin, methacrylic acid, hydroquinone, tetrakis, and distilled water. The radiochromic gel is a PRESAGE product. The densities of the polymer and radiochromic gels were 1.040 and 1.0005 g/cm3, respectively. The water phantom was a hexahedron with a size of 13 × 13 × 15 cm3. Proton beam energies of 72 and 116 MeV were used in the simulation. The proton beam was directed at the top of the phantom along the Z-axis, and the beam had a square 10 × 10 cm2 cross-section. The percent depth dose and the dose distribution were evaluated to estimate the dose distribution of the protons in the two gel dosimeters and compared with the virtual water phantom. Results: The Bragg peak for protons in the two gel dosimeters was similar to that in the virtual water phantom. The Bragg peak regions of the polymer gel, radiochromic gel, and virtual water phantom were located at the same depth (4.3 cm) for the 72 MeV proton beam. For the 116 MeV proton beam, the Bragg peak regions of the polymer gel, radiochromic gel, and virtual water phantom were located at 9.9, 9.9, and 9.7 cm, respectively. The dose distributions of protons in the polymer gel, radiochromic gel, and virtual water phantom were approximately identical for the 72 and 116 MeV energies. The errors for the simulation were under 10%. Conclusion: This work evaluates the three-dimensional dose distributions obtained by exposing polymer and radiochromic gel dosimeters to protons, in comparison with the water phantom. The polymer gel and the radiochromic gel dosimeter show similar dose distributions for the proton beams.

  18. Adaptation of GEANT4 to Monte Carlo dose calculations based on CT data

    International Nuclear Information System (INIS)

    The GEANT4 Monte Carlo code provides many powerful functions for conducting particle transport simulations with great reliability and flexibility. However, as a general purpose Monte Carlo code, not all the functions were specifically designed and fully optimized for applications in radiation therapy. One of the primary issues is the computational efficiency, which is especially critical when patient CT data have to be imported into the simulation model. In this paper we summarize the relevant aspects of the GEANT4 tracking and geometry algorithms and introduce our work on using the code to conduct dose calculations based on CT data. The emphasis is focused on modifications of the GEANT4 source code to meet the requirements for fast dose calculations. The major features include a quick voxel search algorithm, fast volume optimization, and the dynamic assignment of material density. These features are ready to be used for tracking the primary types of particles employed in radiation therapy such as photons, electrons, and heavy charged particles. Re-calculation of a proton therapy treatment plan generated by a commercial treatment planning program for a paranasal sinus case is presented as an example
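
    As an illustration of the dynamic density assignment idea mentioned above, CT numbers can be converted to mass density with a piecewise-linear calibration and binned into a small set of base materials, so that voxels sharing a material differ only by a density scaling factor. The calibration points and thresholds below are assumed values for the sketch, not the validated curve used with the modified GEANT4 code.

        import numpy as np

        # Sketch of HU -> density and HU -> material assignment for CT voxels.
        hu_points = np.array([-1000.0, 0.0, 1000.0, 3000.0])   # assumed HU nodes
        rho_points = np.array([0.001, 1.0, 1.6, 2.8])          # assumed g/cm3

        def hu_to_density(hu):
            return np.interp(hu, hu_points, rho_points)

        def hu_to_material(hu):
            # coarse material bins (air/lung/soft tissue/bone); thresholds illustrative
            return np.digitize(hu, bins=[-950.0, -100.0, 100.0])

        ct = np.random.default_rng(4).integers(-1000, 1500, size=(4, 4, 4))
        density = hu_to_density(ct)
        material_index = hu_to_material(ct)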

  19. Clinical dosimetry in photon radiotherapy. A Monte Carlo based investigation

    International Nuclear Information System (INIS)

    Practical clinical dosimetry is a fundamental step within the radiation therapy process and aims at quantifying the absorbed radiation dose within a 1-2% uncertainty. To achieve this level of accuracy, corrections are needed for the calibrated, air-filled ionization chambers used for dose measurement. The correction procedures are based on the Spencer-Attix cavity theory and are defined in current dosimetry protocols. Energy-dependent corrections for deviations from the calibration beams account for the changed ionization chamber response in the treatment beam. The corrections applied are usually based on semi-analytical models or measurements and are generally hard to determine because their magnitude is only a few percent or even less. Furthermore, the corrections are defined for fixed geometrical reference conditions and do not apply to non-reference conditions in modern radiotherapy applications. The stochastic Monte Carlo method for the simulation of radiation transport is becoming a valuable tool in the field of medical physics. As a suitable tool for calculating these corrections with high accuracy, such simulations enable the investigation of ionization chambers under various conditions. The aim of this work is the consistent investigation of ionization chamber dosimetry in photon radiation therapy with the use of Monte Carlo methods. Monte Carlo systems now exist which, in principle, enable the accurate calculation of the ionization chamber response. Still, their bare use for studies of this type is limited by the long calculation times needed for a meaningful result with a small statistical uncertainty, inherent to every result of a Monte Carlo simulation. Besides heavy use of computer hardware, variance reduction techniques can be applied to reduce the required calculation time. Methods for increasing the efficiency of the simulations were developed and incorporated into a modern and established Monte Carlo simulation environment

  20. SU-E-T-290: Secondary Dose Monitoring Using Scintillating Fibers in Proton Therapy of Prostate Cancer: A Geant4 Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Purpose: To monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate prostate cancer proton therapy based treatments. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm3 DuPont™ Delrin blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and the same vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were used to extract the energy deposited in each fiber and in the scintillating block. Results: The transverse dose distributions from secondary particles in the two cases agree to within 5% and show very good symmetry. The deposited energy not only gradually increases as one moves from the peripheral rows of fibers towards the center of the block (aligned with the center of the prostate) but also decreases as one goes from the frontal to the distal region of the block. The ratio of the doses from the prostate to those in the middle two rows of fibers showed a linear relationship with a slope of (−3.55±2.26) × 10^−5 MeV per treatment Gy. The distal detectors recorded a very small deposited energy due to attenuation in the water. Conclusion: With a good calibration and the ability to define a good correlation between the dose to the external fibers and that to the prostate, such fibers can be used for real-time dose verification of the target

  1. An 8-GeV Synchrotron-Based Proton Driver

    International Nuclear Information System (INIS)

    In January 2002, the Fermilab Director initiated a design study for a high average power, modest energy proton facility. Such a facility is a possible candidate for a construction project in the U.S. starting in the middle of this decade. The key technical element is a new machine, dubbed the ''Proton Driver,'' as a replacement of the present Booster. The study of an 8-GeV synchrotron-based proton driver has been completed and published. This paper will give a summary report, including machine layout and performance, optics, beam dynamics issues, technical systems design, civil construction, cost estimate and schedule

  2. A Monte Carlo Study of the Relationship between the Time Structures of Prompt Gammas and in vivo Radiation Dose in Proton Therapy

    CERN Document Server

    Shin, Wook-Geun; Shin, Jae-Ik; Jeong, Jong Hwi; Lee, Se Byeong

    2015-01-01

    For in vivo range verification in proton therapy, attempts have been made to measure the spatial distribution of the prompt gammas generated by proton-induced interactions, which is closely related to the proton dose distribution. However, the high energy of the prompt gammas and the background gammas still make this distribution difficult to measure. In this study, we suggest a new method for determining the in vivo range by utilizing the time structure of the prompt gammas formed by the rotation of a range modulation wheel (RMW) in passive scattering proton therapy. To validate the Monte Carlo code simulating the proton beam nozzle, axial percent depth doses (PDDs) were compared with measured PDDs for beam ranges varying between 4.73 and 24.01 cm. The relationship between the proton dose rate and the time structure of the prompt gammas was then assessed and compared in a water phantom. The PDD results showed agreement within relative errors of 1.1% in the distal range and 2.9% in...

  3. Monte-Carlo Modeling of Parameters of a Subcritical Cascade Reactor Based on MSBR and LMFBR Technologies

    CERN Document Server

    Bznuni, S A; Zhamkochyan, V M; Polanski, A; Sosnin, A N; Khudaverdyan, A H

    2001-01-01

    Parameters are calculated for a subcritical cascade reactor driven by a proton accelerator and based on a primary lead-bismuth target, a main reactor constructed analogously to the molten salt breeder reactor (MSBR) core, and a booster reactor analogous to the core of the BN-350 liquid metal cooled fast breeder reactor (LMFBR). It is shown by means of Monte-Carlo modeling that the reactor under study provides safe operation modes (k_{eff}=0.94-0.98), is capable of effectively transmuting radioactive nuclear waste and reduces by an order of magnitude the requirements on the accelerator beam current. Calculations show that the maximal neutron flux in the thermal zone is 10^{14} cm^{-2}\cdot s^{-1} and in the fast booster zone is 5.12\cdot 10^{15} cm^{-2}\cdot s^{-1} at k_{eff}=0.98 and a proton beam current I=2.1 mA.

  4. Quantitative Monte Carlo-based holmium-166 SPECT reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Elschot, Mattijs; Smits, Maarten L. J.; Nijsen, Johannes F. W.; Lam, Marnix G. E. H.; Zonnenberg, Bernard A.; Bosch, Maurice A. A. J. van den; Jong, Hugo W. A. M. de [Department of Radiology and Nuclear Medicine, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands); Viergever, Max A. [Image Sciences Institute, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands)

    2013-11-15

    Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 ({sup 166}Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative {sup 166}Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum.Methods: A fast Monte Carlo (MC) simulator was developed for simulation of {sup 166}Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full {sup 166}Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (A{sup est}) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six {sup 166}Ho RE patients.Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80

  5. Electron emission from amorphous solid water after proton impact: Benchmarking PTra and Geant4 track structure Monte Carlo simulations

    International Nuclear Information System (INIS)

    Track structure Monte Carlo simulations of ionising radiation in water are often used to estimate radiation damage to DNA. For this purpose, an accurate simulation of the transport of densely ionising low-energy secondary electrons is particularly important, but is impaired by a high uncertainty of the required physical interaction cross section data of liquid water. A possible tool for the verification of the secondary electron transport in a track structure simulation has been suggested by Toburen et al. (2010), who have measured the angle-dependent energy spectra of electrons, emitted from a thin layer of amorphous solid water (ASW) upon a passage of 6 MeV protons. In this work, simulations were performed for the setup of their experiment, using the PTB Track structure code (PTra) and Geant4-DNA. To enable electron transport below the ionisation threshold, additional excitation and dissociative attachment anion states were included in PTra and activated in Geant4. Additionally, a surface potential was considered in both simulations, such that the escape probability for an electron is dependent on its energy and impact angle at the ASW/vacuum interface. For vanishing surface potential, the simulated spectra are in good agreement with the measured spectra for energies above 50 eV. Below, the simulations overestimate the yield of electrons by a factor up to 4 (PTra) or 7 (Geant4-DNA), which is still a better agreement than obtained in previous simulations of this experimental situation. The agreement of the simulations with experimental data was significantly improved by using a step-like increase of the potential energy at the ASW surface. - Highlights: ► Benchmarked electron transport in track structure simulations using liquid water. ► Simulated differential electron spectra agree with measured data. ► The agreement was improved by including a 3 eV surface potential step.
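
    The surface potential described above acts as an energy- and angle-dependent filter on the escaping electrons. As a purely illustrative sketch (not the actual PTra or Geant4-DNA implementation), a classical step of height U0 lets an electron escape only if the kinetic energy associated with the motion normal to the surface exceeds the step, E·cos²θ > U0; the escape probability for isotropically directed electrons can then be estimated by simple sampling:

```python
import numpy as np

def escape_probability(energy_eV, step_eV=3.0, n_samples=100_000, rng=None):
    """Monte Carlo estimate of the classical escape probability for electrons
    of kinetic energy `energy_eV` reaching a planar surface with a potential
    step `step_eV`, assuming isotropic directions towards the surface and the
    criterion E * cos(theta)^2 > U0 (theta measured from the surface normal)."""
    rng = rng or np.random.default_rng(0)
    cos_theta = rng.uniform(0.0, 1.0, n_samples)  # isotropic over a hemisphere
    return float(np.mean(energy_eV * cos_theta**2 > step_eV))

for energy in (2.0, 5.0, 10.0, 50.0):
    # approaches 1 - sqrt(U0/E) for E > U0, and 0 below the step height
    print(energy, escape_probability(energy))
```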

  6. An experimental approach to improve the Monte Carlo modelling of offline PET/CT-imaging of positron emitters induced by scanned proton beams.

    Science.gov (United States)

    Bauer, J; Unholtz, D; Kurz, C; Parodi, K

    2013-08-01

    We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. Therefore, five different materials have been irradiated by monoenergetic proton pencil beams at various energies and the induced β(+) activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow to model the absolute level of measured β(+) activity induced in the investigated

  7. Poster — Thur Eve — 45: Comparison of different Monte Carlo methods of scoring linear energy transfer in modulated proton therapy beams

    International Nuclear Information System (INIS)

    In this work, we demonstrate inconsistencies in commonly used Monte Carlo methods of scoring linear energy transfer (LET) in proton therapy beams. In particle therapy beams, the LET is an important parameter because the relative biological effectiveness (RBE) depends on it. LET is often determined using Monte Carlo techniques. We used a realistic Monte Carlo model of a proton therapy nozzle to score proton LET in spread-out Bragg peak (SOBP) depth-dose distributions. We used three different scoring and calculation techniques to determine average LET at varying depths within a 140 MeV beam with a 4 cm SOBP and a 250 MeV beam with a 10 cm SOBP. These techniques included fluence-weighted (Φ-LET) and dose-weighted average (D-LET) LET calculations from: 1) scored energy spectra converted to LET spectra through a lookup table, 2) directly scored LET spectra and 3) accumulated LET scored ‘on-the-fly’ during simulations. All protons (primary and secondary) were included in the scoring. Φ-LET was found to be less sensitive to changes in scoring technique than D-LET. In addition, the spectral scoring methods were sensitive to low-energy (high-LET) cutoff values in the averaging. Using cutoff parameters chosen carefully for consistency between techniques, we found variations in Φ-LET values of up to 1.6% and variations in D-LET values of up to 11.2% for the same irradiation conditions, depending on the method used to score LET. Variations were largest near the end of the SOBP, where the LET and energy spectra are broader
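
    Using the usual definitions, the fluence-weighted average LET weights each spectrum bin by its fluence, while the dose-weighted average additionally weights by LET (the dose contributed by a bin being proportional to fluence × LET). The sketch below illustrates these two averages and the effect of a high-LET cutoff for a hypothetical scored spectrum; it is not the Monte Carlo scoring code used in the abstract.

```python
import numpy as np

def average_let(fluence, let, let_cutoff=None):
    """Fluence-weighted (phi-LET) and dose-weighted (D-LET) averages of a
    scored LET spectrum. `fluence` and `let` are arrays over spectrum bins;
    bins above `let_cutoff` (the low-energy, high-LET tail) can be excluded
    to mimic the cutoff sensitivity discussed in the abstract."""
    fluence = np.asarray(fluence, dtype=float)
    let = np.asarray(let, dtype=float)
    if let_cutoff is not None:
        keep = let <= let_cutoff
        fluence, let = fluence[keep], let[keep]
    phi_let = np.sum(fluence * let) / np.sum(fluence)
    # the dose contributed by a bin is proportional to fluence * LET
    d_let = np.sum(fluence * let**2) / np.sum(fluence * let)
    return phi_let, d_let

# hypothetical LET spectrum at one depth of an SOBP (keV/um)
let_bins = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 25.0])
fluence = np.array([0.40, 0.30, 0.15, 0.10, 0.04, 0.01])
print(average_let(fluence, let_bins))
print(average_let(fluence, let_bins, let_cutoff=10.0))
```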

  8. GPU-Monte Carlo based fast IMRT plan optimization

    Directory of Open Access Journals (Sweden)

    Yongbao Li

    2014-03-01

    Full Text Available Purpose: Intensity-modulated radiation treatment (IMRT) plan optimization needs pre-calculated beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computation speed. However, inaccurate beamlet dose distributions, particularly in cases with high levels of inhomogeneity, may mislead the optimization and hinder the resulting plan quality. It is desirable to use Monte Carlo (MC) methods for beamlet dose calculations. Yet, the long computational time of repeated dose calculations for a large number of beamlets prevents this application. Our objective is to integrate a GPU-based MC dose engine in lung IMRT optimization using a novel two-step workflow. Methods: A GPU-based MC code, gDPM, is used. Each particle is tagged with the index of the beamlet from which the source particle originates. Deposited doses are stored separately for the beamlets based on this index. Due to the limited GPU memory size, a pyramid space is allocated for each beamlet, and dose outside this space is neglected. A two-step optimization workflow is proposed for fast MC-based optimization. In the first step, a rough dose calculation is conducted with only a small number of particles per beamlet. Plan optimization then follows to obtain an approximate fluence map. In the second step, more accurate beamlet doses are calculated, where the number of particles sampled for a beamlet is proportional to the intensity determined previously. A second-round optimization is conducted, yielding the final result. Results: For a lung case with 5317 beamlets, 10^5 particles per beamlet in the first round and 10^8 particles per beam in the second round are enough to get a good plan quality. The total simulation time is 96.4 sec. Conclusion: A fast GPU-based MC dose calculation method along with a novel two-step optimization workflow was developed. The high efficiency allows the use of MC for IMRT optimizations.
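
    In the second round described above, the particle budget is redistributed so that high-intensity beamlets receive proportionally more histories. A minimal sketch of such an allocation is given below; the beamlet intensities and the budget are hypothetical and the actual gDPM implementation is not reproduced here.

```python
import numpy as np

def allocate_particles(intensities, total_particles):
    """Distribute a total particle budget over beamlets in proportion to the
    fluence intensities from the first-round optimization (illustration of
    the second-round sampling rule, with hypothetical names and values)."""
    intensities = np.asarray(intensities, dtype=float)
    weights = intensities / intensities.sum()
    counts = np.floor(weights * total_particles).astype(int)
    # hand any remainder to the highest-intensity beamlets
    remainder = int(total_particles - counts.sum())
    counts[np.argsort(weights)[::-1][:remainder]] += 1
    return counts

first_round_fluence = np.array([0.0, 0.2, 1.0, 0.5, 0.3])
print(allocate_particles(first_round_fluence, total_particles=10_000_000))
```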

  9. A CNS calculation line based on a Monte Carlo method

    International Nuclear Information System (INIS)

    Full text: The design of the moderator cell of a Cold Neutron Source (CNS) involves many different considerations regarding geometry, location, and materials. Decisions taken in this respect affect not only the neutron flux in the source neighborhood, which can be evaluated by a standard empirical method, but also the neutron flux values at experimental positions far away from the neutron source. At long distances from the neutron source, very time consuming 3D deterministic methods or Monte Carlo transport methods are necessary in order to get accurate figures. Standard quantities such as average neutron flux, neutron current, angular flux, and luminosity are very difficult to evaluate at positions located several meters away from the neutron source. The Monte Carlo method is a unique and powerful tool for transporting neutrons, and its use in a bootstrap scheme appears to be an appropriate solution for this type of system. The proper use of MCNP as the main tool leads to a fast and reliable method to perform calculations in a relatively short time with low statistical errors. The design goal is to evaluate the performance of the neutron sources, their beam tubes and neutron guides at specific experimental locations in the reactor hall as well as in the neutron or experimental hall. In this work, the calculation methodology used to design Cold, Thermal and Hot Neutron Sources and their associated Neutron Beam Transport Systems, based on the use of the MCNP code, is presented. This work also presents some changes made to the cross section libraries in order to cope with cryogenic moderators such as liquid hydrogen and liquid deuterium. (author)

  10. GPU based Monte Carlo for PET image reconstruction: parameter optimization

    International Nuclear Information System (INIS)

    This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method all the physical effects in a PET system are taken into account, thus superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphical Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required by the iterative reconstruction algorithm, so to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data; this allows the best image quality to be achieved with the least possible computational effort. Based on this research, recommendations can be given for effectively partitioning the computational effort among the iterations of time-limited reconstructions. (author)

  11. Proton radiography and proton computed tomography based on time-resolved dose measurements.

    Science.gov (United States)

    Testa, Mauro; Verburg, Joost M; Rose, Mark; Min, Chul Hee; Tang, Shikui; Bentefour, El Hassane; Paganetti, Harald; Lu, Hsiao-Ming

    2013-11-21

    We present a proof of principle study of proton radiography and proton computed tomography (pCT) based on time-resolved dose measurements. We used a prototype, two-dimensional, diode-array detector capable of fast dose rate measurements to acquire proton radiographic images expressed directly in water equivalent path length (WEPL). The technique is based on the time dependence of the dose distribution delivered by a proton beam traversing a range modulator wheel in passive scattering proton therapy systems. The dose rate produced in the medium by such a system is periodic and has a unique pattern in time at each point along the beam path, and thus encodes the WEPL. By measuring the time-dose pattern at the point of interest, the WEPL to this point can be decoded. If one measures the time-dose patterns at points on a plane behind the patient for a beam with sufficient energy to penetrate the patient, the obtained 2D distribution of the WEPL forms an image. The technique requires only a 2D dosimeter array and it uses only the clinical beam for a fraction of a second, with negligible dose to the patient. We first evaluated the accuracy of the technique in determining the WEPL for static phantoms, aiming at beam range verification of the brain fields of medulloblastoma patients. Accurate beam ranges for these fields can significantly reduce the dose to the cranial skin of the patient and thus the risk of permanent alopecia. Second, we investigated the potential of the technique for real-time imaging of a moving phantom. Real-time tumor tracking by proton radiography could provide more accurate validations of tumor motion models due to the more sensitive dependence of the proton beam on tissue density compared to x-rays. Our radiographic technique is rapid (~100 ms) and simultaneous over the whole field; it can image mobile tumors without the interplay effect that is inherently challenging for methods based on pencil beams. Third, we present the reconstructed p

  13. Application of equivalence methods on Monte Carlo method based homogenization multi-group constants

    International Nuclear Information System (INIS)

    The multi-group constants generated via the continuous-energy Monte Carlo method do not satisfy the equivalence between the reference calculation and the diffusion calculation applied in reactor core analysis. To satisfy the equivalence theory, the general equivalence theory (GET) and the super homogenization (SPH) method were applied to the Monte Carlo method based group constants, and a simplified reactor core and the C5G7 benchmark were examined with the Monte Carlo constants. The results show that the calculation precision of the group constants is improved, and that GET and SPH are good candidates for the equivalence treatment of Monte Carlo homogenization. (authors)

  14. Secondary Neutron Doses to Pediatric Patients During Intracranial Proton Therapy: Monte Carlo Simulation of the Neutron Energy Spectrum and its Organ Doses.

    Science.gov (United States)

    Matsumoto, Shinnosuke; Koba, Yusuke; Kohno, Ryosuke; Lee, Choonsik; Bolch, Wesley E; Kai, Michiaki

    2016-04-01

    Proton therapy has the physical advantage of a Bragg peak that can provide a better dose distribution than conventional x-ray therapy. However, radiation exposure of normal tissues cannot be ignored because it is likely to increase the risk of secondary cancer. Evaluating the secondary neutrons generated by the interaction of the proton beam with the treatment beam-line structure is therefore necessary for optimizing radiation protection in proton therapy. In this research, the organ doses and energy spectra of secondary neutrons were calculated using Monte Carlo simulations. The Monte Carlo code known as the Particle and Heavy Ion Transport code System (PHITS) was used to simulate proton transport and the interactions with the treatment beam-line structure, which modeled the double scattering body of the treatment nozzle at the National Cancer Center Hospital East. The organ doses in a hybrid computational phantom simulating a 5-y-old boy were calculated. In general, secondary neutron doses were found to decrease with increasing distance to the treatment field. Secondary neutron energy spectra were characterized by three energy peaks: 1×10, 1, and 100 MeV. A block collimator and a patient collimator contributed significantly to the organ doses. In particular, the secondary neutrons from the patient collimator were 30 times higher than those from the first scatterer. These results suggest that proactive protection will be required in the design of treatment beam-line structures and that it may be possible to reduce organ doses from secondary neutrons. PMID:26910030

  15. Cell death following BNCT: A theoretical approach based on Monte Carlo simulations

    International Nuclear Information System (INIS)

    In parallel to boron measurements and animal studies, investigations on radiation-induced cell death are also in progress in Pavia, with the aim of better characterisation of the effects of a BNCT treatment down to the cellular level. Such studies are being carried out not only experimentally but also theoretically, based on a mechanistic model and a Monte Carlo code. Such model assumes that: (1) only clustered DNA strand breaks can lead to chromosome aberrations; (2) only chromosome fragments within a certain threshold distance can undergo misrejoining; (3) the so-called 'lethal aberrations' (dicentrics, rings and large deletions) lead to cell death. After applying the model to normal cells exposed to monochromatic fields of different radiation types, the irradiation section of the code was purposely extended to mimic the cell exposure to a mixed radiation field produced by the 10B(n,α) 7Li reaction, which gives rise to alpha particles and Li ions of short range and high biological effectiveness, and by the 14N(n,p)14C reaction, which produces 0.58 MeV protons. Very good agreement between model predictions and literature data was found for human and animal cells exposed to X- or gamma-rays, protons and alpha particles, thus allowing to validate the model for cell death induced by monochromatic radiation fields. The model predictions showed good agreement also with experimental data obtained by our group exposing DHD cells to thermal neutrons in the TRIGA Mark II reactor of University of Pavia; this allowed to validate the model also for a BNCT exposure scenario, providing a useful predictive tool to bridge the gap between irradiation and cell death.

  16. Cell death following BNCT: A theoretical approach based on Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ballarini, F., E-mail: francesca.ballarini@pv.infn.it [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Bakeine, J. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy); Bortolussi, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Bruschi, P. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy); Cansolino, L.; Clerici, A.M.; Ferrari, C. [University of Pavia, Department of Surgery, Experimental Surgery Laboratory, Pavia (Italy); Protti, N.; Stella, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Zonta, A.; Zonta, C. [University of Pavia, Department of Surgery, Experimental Surgery Laboratory, Pavia (Italy); Altieri, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy)

    2011-12-15

    In parallel to boron measurements and animal studies, investigations on radiation-induced cell death are also in progress in Pavia, with the aim of better characterisation of the effects of a BNCT treatment down to the cellular level. Such studies are being carried out not only experimentally but also theoretically, based on a mechanistic model and a Monte Carlo code. Such model assumes that: (1) only clustered DNA strand breaks can lead to chromosome aberrations; (2) only chromosome fragments within a certain threshold distance can undergo misrejoining; (3) the so-called 'lethal aberrations' (dicentrics, rings and large deletions) lead to cell death. After applying the model to normal cells exposed to monochromatic fields of different radiation types, the irradiation section of the code was purposely extended to mimic the cell exposure to a mixed radiation field produced by the {sup 10}B(n,{alpha}) {sup 7}Li reaction, which gives rise to alpha particles and Li ions of short range and high biological effectiveness, and by the {sup 14}N(n,p){sup 14}C reaction, which produces 0.58 MeV protons. Very good agreement between model predictions and literature data was found for human and animal cells exposed to X- or gamma-rays, protons and alpha particles, thus allowing to validate the model for cell death induced by monochromatic radiation fields. The model predictions showed good agreement also with experimental data obtained by our group exposing DHD cells to thermal neutrons in the TRIGA Mark II reactor of University of Pavia; this allowed to validate the model also for a BNCT exposure scenario, providing a useful predictive tool to bridge the gap between irradiation and cell death.

  17. Cell death following BNCT: a theoretical approach based on Monte Carlo simulations.

    Science.gov (United States)

    Ballarini, F; Bakeine, J; Bortolussi, S; Bruschi, P; Cansolino, L; Clerici, A M; Ferrari, C; Protti, N; Stella, S; Zonta, A; Zonta, C; Altieri, S

    2011-12-01

    In parallel to boron measurements and animal studies, investigations on radiation-induced cell death are also in progress in Pavia, with the aim of better characterisation of the effects of a BNCT treatment down to the cellular level. Such studies are being carried out not only experimentally but also theoretically, based on a mechanistic model and a Monte Carlo code. Such model assumes that: (1) only clustered DNA strand breaks can lead to chromosome aberrations; (2) only chromosome fragments within a certain threshold distance can undergo misrejoining; (3) the so-called "lethal aberrations" (dicentrics, rings and large deletions) lead to cell death. After applying the model to normal cells exposed to monochromatic fields of different radiation types, the irradiation section of the code was purposely extended to mimic the cell exposure to a mixed radiation field produced by the (10)B(n,α) (7)Li reaction, which gives rise to alpha particles and Li ions of short range and high biological effectiveness, and by the (14)N(n,p)(14)C reaction, which produces 0.58 MeV protons. Very good agreement between model predictions and literature data was found for human and animal cells exposed to X- or gamma-rays, protons and alpha particles, thus allowing to validate the model for cell death induced by monochromatic radiation fields. The model predictions showed good agreement also with experimental data obtained by our group exposing DHD cells to thermal neutrons in the TRIGA Mark II reactor of the University of Pavia; this allowed to validate the model also for a BNCT exposure scenario, providing a useful predictive tool to bridge the gap between irradiation and cell death. PMID:21481595

  18. SU-E-T-519: Emission of Secondary Particles From a PMMA Phantom During Proton Irradiation: A Simulation Study with the Geant4 Monte Carlo Toolkit

    International Nuclear Information System (INIS)

    Purpose: Proton therapy exhibits several advantages over photon therapy due to the depth-dose distributions resulting from proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify the beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system implementing the PG technique for range verification applications in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 x 5 x 50 cm3 were simulated with the Geant4 toolkit using the standard electromagnetic packages as well as the packages based on the binary-cascade nuclear model. The emitted secondary particles were analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several energy peaks in the range between 0 and 10 MeV. The energy peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emission was found for lower energy neutrons (<10 MeV) and for photons of all energies, while higher energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with increasing incident proton energy. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study

  19. Proton irradiation of liquid crystal based adaptive optical devices

    International Nuclear Information System (INIS)

    To assess its radiation hardness, a liquid crystal based adaptive optical element has been irradiated with a 60 MeV proton beam. The device, with the functionality of an optical beam steerer, was characterised before, during and after the irradiation. A systematic set of measurements of the transmission and beam deflection angles was carried out. The measurements showed that the transmission decreased only marginally and that the optical performance degraded only after a very high proton fluence (10^10 p/cm2). The device showed complete annealing of its functionality as a beam steerer, which leads to the conclusion that liquid crystal technology for optical devices is not vulnerable to the proton irradiation expected in space.

  20. Refined Stratified Sampling for efficient Monte Carlo based uncertainty quantification

    International Nuclear Information System (INIS)

    A general adaptive approach rooted in stratified sampling (SS) is proposed for sample-based uncertainty quantification (UQ). To motivate its use in this context, the space-filling, orthogonality, and projective properties of SS are compared with those of simple random sampling and Latin hypercube sampling (LHS). SS is demonstrated to provide attractive properties for certain classes of problems. The proposed approach, Refined Stratified Sampling (RSS), capitalizes on these properties through an adaptive process that adds samples sequentially by dividing the existing subspaces of a stratified design, as sketched below. RSS is proven to reduce variance compared to traditional stratified sample extension methods while providing comparable or enhanced variance reduction compared to sample size extension methods for LHS – which do not afford the same degree of flexibility to facilitate a truly adaptive UQ process. An initial investigation of optimal stratification is presented and motivates the potential for major advances in variance reduction through optimally designed RSS. Potential paths for extension of the method to high dimension are discussed. Two examples are provided. The first involves UQ for a low dimensional function where convergence is evaluated analytically. The second presents a study to assess the response variability of a floating structure subjected to an underwater shock. - Highlights: • An adaptive process, rooted in stratified sampling, is proposed for Monte Carlo-based uncertainty quantification. • Space-filling, orthogonality, and projective properties of stratified sampling are investigated. • Stratified sampling is shown to possess attractive properties for certain classes of problems. • Refined Stratified Sampling, a new sampling method, is proposed that enables the adaptive UQ process. • Optimality of RSS stratum division is explored
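
    The sequential stratum-splitting idea can be illustrated with a toy implementation: each stratum is a hyper-rectangle holding one sample; to add a sample, a stratum is split in two and the new sample is drawn in the empty half. The stratum-selection rule used here (split the largest stratum along its longest edge) is a simplifying assumption for illustration, not the optimal division strategy studied in the paper.

```python
import numpy as np

def refined_stratified_samples(n_samples, dim=2, rng=None):
    """Toy refined stratified sampling: each stratum is a hyper-rectangle
    holding one sample; new samples are added by splitting a stratum and
    drawing a point in the empty half, so the design stays stratified as it
    grows. The selection rule (largest stratum, longest edge) is a stand-in
    for the division strategies studied in the paper."""
    rng = rng or np.random.default_rng(0)
    lower, upper = np.zeros(dim), np.ones(dim)
    strata = [(lower, upper, rng.uniform(lower, upper))]
    while len(strata) < n_samples:
        volumes = [np.prod(hi - lo) for lo, hi, _ in strata]
        lo, hi, point = strata.pop(int(np.argmax(volumes)))
        axis = int(np.argmax(hi - lo))          # split along the longest edge
        mid = 0.5 * (lo[axis] + hi[axis])
        lo_half = (lo.copy(), hi.copy())        # lower half of the stratum
        lo_half[1][axis] = mid
        hi_half = (lo.copy(), hi.copy())        # upper half of the stratum
        hi_half[0][axis] = mid
        occupied, empty = (lo_half, hi_half) if point[axis] <= mid else (hi_half, lo_half)
        strata.append((occupied[0], occupied[1], point))
        strata.append((empty[0], empty[1], rng.uniform(empty[0], empty[1])))
    return np.array([point for _, _, point in strata])

print(refined_stratified_samples(8))
```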

  1. Clinical results of proton beam therapy for skull base chordoma

    International Nuclear Information System (INIS)

    Purpose: To evaluate clinical results of proton beam therapy for patients with skull base chordoma. Methods and materials: Thirteen patients with skull base chordoma who were treated with proton beams with or without X-rays at the University of Tsukuba between 1989 and 2000 were retrospectively reviewed. A median total tumor dose of 72.0 Gy (range, 63.0-95.0 Gy) was delivered. The patients were followed for a median period of 69.3 months (range, 14.6-123.4 months). Results: The 5-year local control rate was 46.0%. Cause-specific, overall, and disease-free survival rates at 5 years were 72.2%, 66.7%, and 42.2%, respectively. The local control rate was higher, without statistical significance, for those with preoperative tumors <30 mL. Partial or subtotal tumor removal did not yield better local control rates than for patients who underwent biopsy only as the latest surgery. Conclusion: Proton beam therapy is effective for patients with skull base chordoma, especially for those with small tumors. For a patient with a tumor of <30 mL with no prior treatment, biopsy without tumor removal seems to be appropriate before proton beam therapy

  2. A global reaction route mapping-based kinetic Monte Carlo algorithm

    Science.gov (United States)

    Mitchell, Izaac; Irle, Stephan; Page, Alister J.

    2016-07-01

    We propose a new on-the-fly kinetic Monte Carlo (KMC) method that is based on exhaustive potential energy surface searching carried out with the global reaction route mapping (GRRM) algorithm. Starting from any given equilibrium state, this GRRM-KMC algorithm performs a one-step GRRM search to identify all surrounding transition states. Intrinsic reaction coordinate pathways are then calculated to identify potential subsequent equilibrium states. Harmonic transition state theory is used to calculate rate constants for all potential pathways, before a standard KMC accept/reject selection is performed. The selected pathway is then used to propagate the system forward in time, which is calculated on the basis of 1st order kinetics. The GRRM-KMC algorithm is validated here in two challenging contexts: intramolecular proton transfer in malonaldehyde and surface carbon diffusion on an iron nanoparticle. We demonstrate that in both cases the GRRM-KMC method is capable of reproducing the 1st order kinetics observed during independent quantum chemical molecular dynamics simulations using the density-functional tight-binding potential.
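
    The rate-constant and time-propagation steps described above follow standard kinetic Monte Carlo practice; a minimal sketch (with hypothetical barriers and attempt frequencies, not values from the paper) is given below. The GRRM transition-state search itself is not reproduced.

```python
import numpy as np

KB_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def kmc_step(barriers_eV, prefactors_hz, temperature_K, rng):
    """One kinetic Monte Carlo step: harmonic transition-state-theory rates
    k_i = nu_i * exp(-Ea_i / kB T) for each pathway found around the current
    minimum, rate-proportional selection of the next pathway, and a
    first-order (exponential) waiting time dt = -ln(u) / sum(k)."""
    rates = prefactors_hz * np.exp(-barriers_eV / (KB_EV * temperature_K))
    total_rate = rates.sum()
    chosen = rng.choice(len(rates), p=rates / total_rate)
    dt = -np.log(rng.random()) / total_rate
    return chosen, dt

rng = np.random.default_rng(0)
barriers = np.array([0.20, 0.35, 0.50])   # hypothetical barriers (eV)
prefactors = np.full(3, 1.0e13)           # hypothetical attempt frequencies (1/s)
pathway, waiting_time = kmc_step(barriers, prefactors, temperature_K=300.0, rng=rng)
print(pathway, waiting_time)
```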

  3. Monte Carlo-based simulation of dynamic jaws tomotherapy

    International Nuclear Information System (INIS)

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the ''cheese'' phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed (''running start stop,'' RSS) and symmetric jaws-variable couch speed (''symmetric running start stop,'' SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is greater than 30% of the prescription dose (gamma analysis

  4. MCHITS: Monte Carlo based Method for Hyperlink Induced Topic Search on Networks

    Directory of Open Access Journals (Sweden)

    Zhaoyan Jin

    2013-10-01

    Full Text Available Hyperlink Induced Topic Search (HITS) is the most authoritative and most widely used personalized ranking algorithm on networks. The HITS algorithm ranks nodes on a network by power iteration and has a high computational complexity. This paper models the HITS algorithm with the Monte Carlo method and proposes Monte Carlo based algorithms for the HITS computation. Theoretical analysis and experiments show that Monte Carlo based approximate computation of the HITS ranking greatly reduces the required computing resources while maintaining high accuracy, and is significantly better than related works

  5. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    International Nuclear Information System (INIS)

    Proton interaction with the material of an exposed object needs to be modeled with account taken of three basic processes: electromagnetic stopping of protons in matter, multiple Coulomb scattering and nuclear interactions. Only the last type of process is the topic of this paper. Monte Carlo codes are often used to simulate high-energy particle interactions with matter. However, the nuclear interaction models implemented in these codes are rather extensive, and their use in treatment planning systems requires huge computational resources. We have selected the IThMC code for its ability to reproduce experiments which measure the distribution of the projected ranges of nuclear secondary particles generated by proton beams in a multi-layer Faraday cup. Multi-layer Faraday cup detectors measure charge rather than dose and allow distinguishing between electromagnetic and nuclear interactions. The event generator used in the IThMC code is faster, but less accurate, than any other tested. Our model of nuclear reactions shows quite good agreement with experiment in terms of the effect of these reactions on the Bragg peak in therapeutic applications

  6. Unfolding an under-determined neutron spectrum using genetic algorithm based Monte Carlo

    International Nuclear Information System (INIS)

    Spallation, in addition to the other photon-neutron reactions in target materials and different accelerator components, may result in the production of a huge number of energetic protons, which in turn leads to the production of neutrons that contribute the main component of the total dose. For dosimetric purposes in accelerator facilities, detector measurements do not directly provide the actual neutron flux values, but only a cumulative picture. To obtain the neutron spectrum from the measured data, the response functions of the measuring instrument, together with the measurements, are fed into unfolding techniques, which are frequently used to recover the hidden spectral information. Here we discuss a genetic algorithm based unfolding technique which is under development. A genetic algorithm is a stochastic method based on natural selection, mimicking the Darwinian principle of survival of the fittest. The method has been tested by unfolding the neutron spectra obtained from a reaction carried out at an accelerator facility, with energetic carbon ions on a thick silver target, together with the corresponding neutron response of a BC501A liquid scintillation detector. The problem dealt with here is under-determined, in that the number of measurements is smaller than the number of energy bins required. The results so obtained were compared with those obtained using the established unfolding code FERDOR, which unfolds data for completely determined problems. It is seen that the genetic algorithm based solution matches the FERDOR results reasonably well when the smoothing carried out by Monte Carlo is taken into consideration. This method appears to be a promising candidate for unfolding neutron spectra in under-determined cases as well as in over-determined cases, where there are more measurements. The method also has the advantages of flexibility and computational simplicity, and it works well without the need for any initial guess spectrum. (author)
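
    As a rough illustration of the idea (not the code described above, whose operators and Monte Carlo smoothing are its own), an unfolding problem of this kind can be posed as finding a non-negative spectrum phi that minimises ||R·phi − M||², where R is the detector response matrix and M the measurement vector, and a simple genetic algorithm can search for it:

```python
import numpy as np

def ga_unfold(response, measured, n_bins, pop_size=200, generations=500,
              mutation_scale=0.05, rng=None):
    """Toy genetic-algorithm unfolding: search for a non-negative spectrum phi
    minimising ||R.phi - M||^2, with R the detector response matrix and M the
    measurement vector. Selection keeps the better half, children are made by
    uniform crossover and Gaussian mutation."""
    rng = rng or np.random.default_rng(0)
    population = rng.random((pop_size, n_bins))          # random candidate spectra

    def fitness(pop):
        residual = pop @ response.T - measured           # (pop_size, n_measurements)
        return -np.sum(residual**2, axis=1)              # larger is better

    for _ in range(generations):
        order = np.argsort(fitness(population))[::-1]
        parents = population[order[:pop_size // 2]]      # keep the better half
        pairs = rng.integers(0, len(parents), size=(pop_size - len(parents), 2))
        mask = rng.random((len(pairs), n_bins)) < 0.5    # uniform crossover
        children = np.where(mask, parents[pairs[:, 0]], parents[pairs[:, 1]])
        children += rng.normal(0.0, mutation_scale, children.shape)
        children = np.clip(children, 0.0, None)          # keep the spectrum non-negative
        population = np.vstack([parents, children])
    return population[np.argmax(fitness(population))]

# hypothetical under-determined problem: 4 measurements, 10 energy bins
rng = np.random.default_rng(1)
R = rng.random((4, 10))
phi_true = rng.random(10)
M = R @ phi_true
print(np.round(ga_unfold(R, M, n_bins=10, rng=rng), 3))
```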

  7. Neutron production in spallation reactions of 0.9 and 1.5 GeV protons on a thick lead target. Comparison between experimental data and Monte-Carlo simulations

    International Nuclear Information System (INIS)

    This paper reports on two experiments performed at the Synchrophasotron/Nuclotron accelerator complex at JINR. Relativistic protons with energies of 885 MeV and 1.5 GeV hit a massive cylindrical lead target. The spatial and energy distributions of the neutron field produced by the spallation reactions were measured by the activation of Al, Au, Bi, Co, and Cu foils placed on the surface of the target and close to it. The yields of the radioactive nuclei produced by threshold reactions in these foils were determined by analysing their γ spectra. The comparison with Monte-Carlo based simulations was performed with both the LAHET+MCNP code and the MCNPX code

  8. An empirical formula based on Monte Carlo simulation for diffuse reflectance from turbid media

    Science.gov (United States)

    Gnanatheepam, Einstein; Aruna, Prakasa Rao; Ganesan, Singaravelu

    2016-03-01

    Diffuse reflectance spectroscopy has been widely used in diagnostic oncology and in the characterization of laser-irradiated tissue. However, an accurate and simple analytical equation for estimating the diffuse reflectance from turbid media still does not exist. In this work, a diffuse reflectance lookup table for a range of tissue optical properties was generated using Monte Carlo simulation. Based on the generated Monte Carlo lookup table, an empirical formula for the diffuse reflectance was developed using a surface fitting method. The variance between the Monte Carlo lookup table surface and the surface obtained from the proposed empirical formula is less than 1%. The proposed empirical formula may be used for modeling the diffuse reflectance from tissue.
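
    The abstract does not reproduce the empirical formula, but the general workflow – tabulate reflectance over a grid of optical properties with Monte Carlo, then fit a smooth surface to the table – can be sketched as follows. The polynomial form, the optical-property grid and the stand-in "lookup table" values in this example are all assumptions for illustration.

```python
import numpy as np

def fit_reflectance_surface(mu_a, mu_s, reflectance, degree=3):
    """Fit a bivariate polynomial R(mu_a, mu_s') to a Monte Carlo lookup table
    of diffuse reflectance by linear least squares and return a predictor.
    The polynomial form is a generic choice for illustration only."""
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    design = np.column_stack([mu_a**i * mu_s**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(design, reflectance, rcond=None)

    def predict(a, s):
        return sum(c * a**i * s**j for c, (i, j) in zip(coeffs, terms))
    return predict

# hypothetical optical-property grid (per mm) and stand-in "lookup table"
mu_a_grid, mu_s_grid = np.meshgrid(np.linspace(0.01, 0.5, 20),
                                   np.linspace(0.5, 3.0, 20))
mu_a, mu_s = mu_a_grid.ravel(), mu_s_grid.ravel()
table = mu_s / (mu_s + 8.0 * mu_a)           # placeholder for MC-simulated reflectance
predict = fit_reflectance_surface(mu_a, mu_s, table)
print(predict(0.1, 1.5), 1.5 / (1.5 + 0.8))  # fitted surface vs. placeholder value
```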

  9. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Schuemann, J; Grassberger, C; Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Dowdell, S [Illawarra Shoalhaven Local Health District, Wollongong (Australia)

    2014-06-15

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated and the root mean square differences (RMSD), average range difference (ARD) and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80–R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than the currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain). However, we recommend
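
    Range metrics such as R90, R50 and the R80–R20 distal falloff are obtained from a depth-dose curve by locating where the distal edge crosses a given fraction of the maximum dose. The sketch below extracts such metrics by linear interpolation for two hypothetical (Gaussian stand-in) curves; the actual analysis in the abstract works on full 2D distal dose surfaces.

```python
import numpy as np

def distal_range(depth_cm, dose, level=0.9):
    """Depth at which the dose falls to `level` times the maximum on the
    distal side of the peak, found by linear interpolation (level=0.9 gives
    R90, level=0.5 gives R50, and so on)."""
    dose = np.asarray(dose, dtype=float) / np.max(dose)
    i_peak = int(np.argmax(dose))
    distal_dose, distal_depth = dose[i_peak:], depth_cm[i_peak:]
    # dose decreases past the peak, so reverse to get an increasing abscissa
    return float(np.interp(level, distal_dose[::-1], distal_depth[::-1]))

# hypothetical analytical vs Monte Carlo depth-dose curves (Gaussian stand-ins)
z = np.linspace(0.0, 20.0, 401)
analytical = np.exp(-0.5 * ((z - 15.0) / 1.0) ** 2)
monte_carlo = np.exp(-0.5 * ((z - 14.7) / 1.1) ** 2)

r90_difference = distal_range(z, analytical) - distal_range(z, monte_carlo)
r80_r20_falloff = distal_range(z, monte_carlo, 0.2) - distal_range(z, monte_carlo, 0.8)
print(r90_difference, r80_r20_falloff)
```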

  10. Capture and Transport of Laser Accelerated Protons by Pulsed Magnetic Fields: Advancements Toward Laser-Based Proton Therapy

    Science.gov (United States)

    Burris-Mog, Trevor J.

    The interaction of intense laser light (I > 10^18 W/cm2) with a thin target foil leads to the Target Normal Sheath Acceleration mechanism (TNSA). TNSA is responsible for the generation of high current, ultra-low emittance proton beams, which may allow for the development of a compact and cost effective proton therapy system for the treatment of cancer. Before this application can be realized, control is needed over the large divergence and the 100% kinetic energy spread that are characteristic of TNSA proton beams. The work presented here demonstrates control over the divergence and energy spread using strong magnetic fields generated by a pulse power solenoid. The solenoidal field results in a parallel proton beam with a kinetic energy spread of ΔE/E = 10%. Assuming that next generation lasers will be able to operate at 10 Hz, the 10% spread in the kinetic energy along with the 23% capture efficiency of the solenoid yield enough protons per laser pulse to, for the first time, consider applications in Radiation Oncology. Current lasers can generate proton beams with kinetic energies up to 67.5 MeV, but for therapy applications, the proton kinetic energy must reach 250 MeV. Since the maximum kinetic energy Emax of the protons scales with the laser light intensity as Emax ∝ I^0.5, next generation lasers may very well accelerate 250 MeV protons. As the kinetic energy of the protons is increased, the magnetic field strength of the solenoid will need to increase. The scaling of the magnetic field B with the kinetic energy of the protons follows B ∝ E^1/2. Therefore, the field strength of the solenoid presented in this work will need to be increased by a factor of 2.4 in order to accommodate 250 MeV protons. This scaling factor seems reasonable, even with present technology. This work not only demonstrates control over beam divergence and energy spread, it also allows us to now perform feasibility studies to further research what a laser-based proton therapy system

  11. Machine learning-based patient specific prompt-gamma dose monitoring in proton therapy

    International Nuclear Information System (INIS)

    Online dose monitoring in proton therapy is currently being investigated with prompt-gamma (PG) devices. PG emission was shown to be correlated with dose deposition. This relationship is mostly unknown under real conditions. We propose a machine learning approach based on simulations to create optimized treatment-specific classifiers that detect discrepancies between planned and delivered dose. Simulations were performed with the Monte-Carlo platform Gate/Geant4 for a spot-scanning proton therapy treatment and a PG camera prototype currently under investigation. The method first builds a learning set of perturbed situations corresponding to a range of patient translation. This set is then used to train a combined classifier using distal falloff and registered correlation measures. Classifier performances were evaluated using receiver operating characteristic curves and maximum associated specificity and sensitivity. A leave-one-out study showed that it is possible to detect discrepancies of 5 mm with specificity and sensitivity of 85% whereas using only distal falloff decreases the sensitivity down to 77% on the same data set. The proposed method could help to evaluate performance and to optimize the design of PG monitoring devices. It is generic: other learning sets of deviations, other measures and other types of classifiers could be studied to potentially reach better performance. At the moment, the main limitation lies in the computation time needed to perform the simulations. (paper)
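
    A minimal sketch of the classification idea follows: features extracted from simulated prompt-gamma profiles (here a hypothetical distal-falloff shift and a correlation with the planned profile, with entirely made-up values) are used to train a classifier that flags deliveries deviating from the plan, and performance is summarised by the area under the ROC curve. The actual study uses its own learning set, features and classifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical learning set built from simulated prompt-gamma profiles: each row
# holds (distal-falloff shift in mm, correlation with the planned profile); the
# label is 1 when the simulated delivery deviates from the plan by >= 5 mm.
rng = np.random.default_rng(0)
n = 500
shift_ok, corr_ok = rng.normal(0.0, 1.5, n), rng.normal(0.95, 0.02, n)
shift_bad, corr_bad = rng.normal(5.0, 1.5, n), rng.normal(0.85, 0.05, n)
X = np.vstack([np.column_stack([shift_ok, corr_ok]),
               np.column_stack([shift_bad, corr_bad])])
y = np.concatenate([np.zeros(n), np.ones(n)])

# combined classifier using both measures, evaluated by the area under the ROC curve
clf = LogisticRegression().fit(X, y)
print("ROC AUC:", roc_auc_score(y, clf.predict_proba(X)[:, 1]))
```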

  12. Proton-Coupled Electron Transfer Reactions with Photometric Bases Reveal Free Energy Relationships for Proton Transfer.

    Science.gov (United States)

    Eisenhart, Thomas T; Howland, William C; Dempsey, Jillian L

    2016-08-18

    The proton-coupled electron transfer (PCET) oxidation of p-aminophenol in acetonitrile was initiated via stopped-flow rapid-mixing and spectroscopically monitored. For oxidation by ferrocenium in the presence of 7-(dimethylamino)quinoline proton acceptors, both the electron transfer and proton transfer components could be optically monitored in the visible region; the decay of the ferrocenium absorbance is readily monitored (λmax = 620 nm), and the absorbance of the 2,4-substituted 7-(dimethylamino)quinoline derivatives (λmax = 370-392 nm) red-shifts substantially (ca. 70 nm) upon protonation. Spectral analysis revealed the reaction proceeds via a stepwise electron transfer-proton transfer process, and modeling of the kinetics traces monitoring the ferrocenium and quinolinium signals provided rate constants for elementary proton and electron transfer steps. As the pKa values of the conjugate acids of the 2,4-R-7-(dimethylamino)quinoline derivatives employed were readily tuned by varying the substituents at the 2- and 4-positions of the quinoline backbone, the driving force for proton transfer was systematically varied. Proton transfer rate constants (kPT,2 = (1.5-7.5) × 10^8 M^-1 s^-1, kPT,4 = (0.55-3.0) × 10^7 M^-1 s^-1) were found to correlate with the pKa of the conjugate acid of the proton acceptor, in agreement with anticipated free energy relationships for proton transfer processes in PCET reactions. PMID:27500804

  13. Comparison of experimental proton-induced fluorescence spectra for a selection of thin high-Z samples with Geant4 Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Incerti, S., E-mail: sebastien.incerti@tdt.edu.vn [Division of Nuclear Physics, Ton Duc Thang University, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Ho Chi Minh City (Viet Nam); Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Barberet, Ph.; Dévès, G.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Francis, Z. [Université Saint Joseph, Science Faculty, Department of Physics, Beirut (Lebanon); Ivantchenko, V. [Ecoanalytica, Moscow (Russian Federation); Geant4 Associates International Ltd, Hebden Bridge (United Kingdom); Mantero, A. [SWHARD srl, via Greto di Cornigliano 6r, 16152 Genova (Italy); El Bitar, Z. [Institut Pluridisciplinaire Hubert Curien, CNRS/IN2P3, 67037 Strasbourg Cedex (France); Bernal, M.A. [Instituto de Física Gleb Wataghin, Universidade Estadual de Campinas, SP (Brazil); Tran, H.N. [Division of Nuclear Physics, Ton Duc Thang University, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Ho Chi Minh City (Viet Nam); Karamitros, M.; Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France)

    2015-09-01

    The general purpose Geant4 Monte Carlo simulation toolkit is able to simulate radiative and non-radiative atomic de-excitation processes such as fluorescence and Auger electron emission, occurring after interaction of incident ionising radiation with target atomic electrons. In this paper, we evaluate the Geant4 modelling capability for the simulation of fluorescence spectra induced by 1.5 MeV proton irradiation of thin high-Z foils (Fe, GdF{sub 3}, Pt, Au) with potential interest for nanotechnologies and life sciences. Simulation results are compared to measurements performed at the Centre d’Etudes Nucléaires de Bordeaux-Gradignan AIFIRA nanobeam line irradiation facility in France. Simulation and experimental conditions are described and the influence of Geant4 electromagnetic physics models is discussed.

  14. Proton radiotherapy in management of pediatric base of skull tumors

    International Nuclear Information System (INIS)

    Purpose: Primary skull base tumors of the developing child are rare and present a formidable challenge to both surgeons and radiation oncologists. Gross total resection with negative margins is rarely achieved, and the risks of functional, structural, and cosmetic deficits limit the radiation dose using conventional radiation techniques. Twenty-nine children and adolescents treated with conformal proton radiotherapy (proton RT) were analyzed to assess treatment efficacy and safety. Methods and Materials: Between July 1992 and April 1999, 29 patients with mesenchymal tumors underwent fractionated proton (13 patients) or fractionated combined proton and photon (16 patients) irradiation. The age at treatment ranged from 1 to 19 years (median 12); 14 patients were male and 15 female. Tumors were grouped as malignant or benign. Twenty patients had malignant histologic findings, including chordoma (n=10), chondrosarcoma (n=3), rhabdomyosarcoma (n=4), and other sarcomas (n=3). Target doses ranged between 50.4 and 78.6 Gy/cobalt Gray equivalent (CGE), delivered at doses of 1.8-2.0 Gy/CGE per fraction. The benign histologic findings included giant cell tumors (n=6), angiofibromas (n=2), and chondroblastoma (n=1). RT doses for this group ranged from 45.0 to 71.8 Gy/CGE. Despite maximal surgical resection, 28 (97%) of 29 patients had gross disease at the time of proton RT. Follow-up after proton RT ranged from 13 to 92 months (mean 40). Results: Of the 20 patients with malignant tumors, 5 (25%) had local failure; 1 patient had failure in the surgical access route and 3 patients developed distant metastases. Seven patients had died of progressive disease at the time of analysis. Local tumor control was maintained in 6 (60%) of 10 patients with chordoma, 3 (100%) of 3 with chondrosarcoma, 4 (100%) of 4 with rhabdomyosarcoma, and 2 (66%) of 3 with other sarcomas. The actuarial 5-year local control and overall survival rate was 72% and 56%, respectively, and the overall survival

  15. TH-A-19A-08: Intel Xeon Phi Implementation of a Fast Multi-Purpose Monte Carlo Simulation for Proton Therapy

    International Nuclear Information System (INIS)

    Purpose: Recent studies have demonstrated the capability of graphics processing units (GPUs) to compute dose distributions using Monte Carlo (MC) methods within clinical time constraints. However, GPUs have a rigid vectorial architecture that favors the implementation of simplified particle transport algorithms, adapted to specific tasks. Our new, fast, and multipurpose MC code, named MCsquare, runs on Intel Xeon Phi coprocessors. This technology offers 60 independent cores, and therefore more flexibility to implement fast and yet generic MC functionalities, such as prompt gamma simulations. Methods: MCsquare implements several models and hence allows users to make their own tradeoff between speed and accuracy. A 200 MeV proton beam is simulated in a heterogeneous phantom using Geant4 and two configurations of MCsquare. The first one is the most conservative and accurate. The method of fictitious interactions handles the interfaces and secondary charged particles emitted in nuclear interactions are fully simulated. The second, faster configuration simplifies interface crossings and simulates only secondary protons after nuclear interaction events. Integral depth-dose and transversal profiles are compared to those of Geant4. Moreover, the production profile of prompt gammas is compared to PENH results. Results: Integral depth dose and transversal profiles computed by MCsquare and Geant4 are within 3%. The production of secondaries from nuclear interactions is slightly inaccurate at interfaces for the fastest configuration of MCsquare but this is unlikely to have any clinical impact. The computation time varies from 90 seconds for the most conservative settings to merely 59 seconds in the fastest configuration. Finally, prompt gamma profiles are also in very good agreement with PENH results. Conclusion: Our new, fast, and multi-purpose Monte Carlo code simulates prompt gammas and calculates dose distributions in less than a minute, which complies with clinical time

  16. Foundation of an analytical proton beamlet model for inclusion in a general proton dose calculation system

    OpenAIRE

    Ulmer, W.; Schaffner, B.

    2010-01-01

    We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher energy depth dose curves, is modeled by inclusion of two different origins: 1. S...

  17. Sci—Fri PM: Topics — 07: Monte Carlo Simulation of Primary Dose and PET Isotope Production for the TRIUMF Proton Therapy Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lindsay, C; Jirasek, A [University of Victoria (Canada)]; Blackmore, E; Hoehr, C; Schaffer, P; Trinczek, M [TRIUMF (Canada)]; Sossi, V [University of British Columbia (Canada)]

    2014-08-15

    Uveal melanoma is a rare and deadly tumour of the eye with primary metastases in the liver resulting in an 8% 2-year survival rate upon detection. Large growths, or those in close proximity to the optic nerve, pose a particular challenge to the commonly employed eye-sparing technique of eye-plaque brachytherapy. In these cases external beam charged particle therapy offers improved odds in avoiding catastrophic side effects such as neuropathy or blindness. Since 1995, the British Columbia Cancer Agency, in partnership with the TRIUMF national laboratory, has offered proton therapy in the treatment of difficult ocular tumors. Having seen 175 patients, yielding 80% globe preservation and 82% metastasis free survival as of 2010, this modality has proven to be highly effective. Despite this success, there have been few studies into the use of the world's largest cyclotron in patient care. Here we describe the first efforts at modeling the TRIUMF dose delivery system using the FLUKA Monte Carlo package. Details on geometry, estimating beam parameters, measurement of primary dose and simulation of PET isotope production are discussed. Proton depth dose in both modulated and pristine beams is successfully simulated to sub-millimeter precision in range (within limits of measurement) and 2% agreement with measurement within a treatment volume. With the goal of using PET signals for in vivo dosimetry (alignment), a first look at PET isotope depth distribution is presented — comparing favourably to a naive method of approximating simulated PET slice activity in a Lucite phantom.

  18. Spot-Scanning-Based Proton Therapy for Extracranial Chordoma

    Energy Technology Data Exchange (ETDEWEB)

    Staab, Adrian, E-mail: adrian.staab@psi.ch [Center for Proton Therapy, Paul Scherrer Institute, Villigen (Switzerland); Rutz, Hans Peter; Ares, Carmen; Timmermann, Beate; Schneider, Ralf; Bolsi, Alessandra; Albertini, Francesca; Lomax, Antony; Goitein, Gudrun; Hug, Eugen [Center for Proton Therapy, Paul Scherrer Institute, Villigen (Switzerland)

    2011-11-15

    Purpose: To evaluate effectiveness and safety of spot-scanning-based proton-radiotherapy (PT) for extracranial chordomas (ECC). Methods and Materials: Between 1999-2006, 40 patients with chordoma of C-, T-, and L-spine and sacrum were treated at Paul Scherrer Institute (PSI) with PT using spot-scanning. Median patient age was 58 years (range, 10-81 years); 63% were male, and 36% were female. Nineteen patients (47%) had gross residual disease (mean 69 cc; range, 13-495 cc) before PT, and 21 patients (53%) had undergone prior titanium-based surgical stabilization (SS) and reconstruction of the axial skeleton. Proton doses were expressed as Gy(RBE). A conversion factor of 1.1 was used to account for the higher relative biological effectiveness (RBE) of protons compared with photons. Mean total dose was 72.5 Gy(RBE) [range, 59.4-75.2 Gy(RBE)] delivered at 1.8-2.0 Gy(RBE) per fraction. Median follow-up time was 43 months. Results: In 19 patients without surgical stabilization, actuarial local control (LC) rate at 5 years was 100%. LC for patients with gross residual disease but without surgical stabilization was also 100% at 5 years. In contrast, 12 failures occurred in 21 patients with SS, yielding a significantly decreased 5-year LC rate of 30% (p = 0.0003). For the entire cohort, 5-year LC rates were 62%, disease-free survival rates were 57%, and overall survival rates were 80%. Rates were 100% for patients without SS. No other factor, including the dosimetric parameters (V95, V80), was predictive of tumor control on univariate analysis. Conclusion: Spot-scanning-based PT at PSI delivered subsequent to function-preserving surgery for tumor debulking, decompression of spinal cord, or biopsy only is safe and highly effective in patients with ECC without major surgical instrumentation even in view of large, unresectable disease.

  19. Pentanol-based target material with polarized protons

    International Nuclear Information System (INIS)

    1-pentanol is a promising material for a target with polarized protons owing to its high resistance to radiation damage. To develop the target, solutions of 1-pentanol or 2-pentanol with complexes of pentavalent chromium were investigated. A material based on an EHBA-Cr(V) solution in a glass-like matrix, consisting of 1-pentanol, 3-pentanol and 1,2-propanediol, was proposed as a target material. It was investigated by electron paramagnetic resonance and differential scanning calorimetry methods. 24 refs.; 3 figs.; 1 tab

  20. Excited States of Proton-bound DNA/RNA Base Homo-dimers: Pyrimidines

    CERN Document Server

    Féraud, Géraldine; Dedonder, Claude; Jouvet, Christophe; Pino, Gustavo A

    2015-01-01

    We present the electronic photofragment spectra of the protonated pyrimidine DNA base homo-dimers. Only the thymine dimer exhibits a well-structured vibrational progression, while the protonated monomer shows broad vibrational bands. This shows that proton bonding can block some non-radiative processes present in the monomer.

  1. Monte-Carlo based uncertainty analysis: Sampling efficiency and sampling convergence

    International Nuclear Information System (INIS)

    Monte Carlo analysis has become nearly ubiquitous since its introduction, now over 65 years ago. It is an important tool in many assessments of the reliability and robustness of systems, structures or solutions. As the deterministic core simulation can be lengthy, the computational costs of Monte Carlo can be a limiting factor. To reduce that computational expense as much as possible, sampling efficiency and convergence for Monte Carlo are investigated in this paper. The first section shows that non-collapsing space-filling sampling strategies, illustrated here with the maximin and uniform Latin hypercube designs, greatly enhance the sampling efficiency, and render a desired level of accuracy of the outcomes attainable with far fewer runs. In the second section it is demonstrated that standard sampling statistics are inapplicable for Latin hypercube strategies. A sample-splitting approach is put forward, which in combination with a replicated Latin hypercube sampling allows assessing the accuracy of Monte Carlo outcomes. The assessment in turn permits halting the Monte Carlo simulation when the desired levels of accuracy are reached. Both measures form fairly noncomplex upgrades of the current state-of-the-art in Monte-Carlo based uncertainty analysis but provide substantial further progress with respect to its applicability.
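
    As a rough illustration of the non-collapsing, space-filling sampling discussed above, the sketch below (Python with NumPy) builds a plain, unoptimized Latin hypercube sample and compares the spread of replicated estimates against simple random sampling for a toy integrand; the toy model, sample size and number of replicates are illustrative assumptions, not values from the paper.

    import numpy as np

    def latin_hypercube(n_samples, n_dims, rng):
        """Plain (unoptimized) Latin hypercube sample on the unit hypercube."""
        jitter = rng.uniform(size=(n_samples, n_dims))  # position within each stratum
        strata = np.column_stack([rng.permutation(n_samples) for _ in range(n_dims)])
        return (strata + jitter) / n_samples            # exactly one point per stratum in every dimension

    rng = np.random.default_rng(42)
    toy_model = lambda x: np.sum(x ** 2, axis=1)        # stand-in for a lengthy deterministic core simulation

    n, d, replicates = 100, 3, 20
    lhs_means = [toy_model(latin_hypercube(n, d, rng)).mean() for _ in range(replicates)]
    srs_means = [toy_model(rng.uniform(size=(n, d))).mean() for _ in range(replicates)]

    # The replicate-to-replicate spread plays the role of the sample-splitting accuracy estimate.
    print("LHS estimate:", np.mean(lhs_means), "+/-", np.std(lhs_means))
    print("SRS estimate:", np.mean(srs_means), "+/-", np.std(srs_means))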

  2. Foundation of an analytical proton beamlet model for inclusion in a general proton dose calculation system

    CERN Document Server

    Ulmer, W

    2010-01-01

    We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher energy depth dose curves, is modeled by inclusion of two different origins: 1. Secondary reaction protons with a contribution of ca. 65 % of the buildup (for monoenergetic protons). 2. Landau tails as well as Gaussian type of fluctuations for range straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked by them. Furthermore, there are a few parameters, which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline - the most important b...

  3. Density Functional Theory (DFT) modeling and Monte Carlo simulation assessment of inhibition performance of some carbohydrazide Schiff bases for steel corrosion

    Science.gov (United States)

    Obot, I. B.; Kaya, Savaş; Kaya, Cemal; Tüzün, Burak

    2016-06-01

    DFT and Monte Carlo simulation were performed on three Schiff bases, namely 4-(4-bromophenyl)-N′-(4-methoxybenzylidene)thiazole-2-carbohydrazide (BMTC), 4-(4-bromophenyl)-N′-(2,4-dimethoxybenzylidene)thiazole-2-carbohydrazide (BDTC), and 4-(4-bromophenyl)-N′-(4-hydroxybenzylidene)thiazole-2-carbohydrazide (BHTC), recently studied as corrosion inhibitors for steel in acid medium. Electronic parameters relevant to their inhibition activity, such as EHOMO, ELUMO, the energy gap (ΔE), hardness (η), softness (σ), absolute electronegativity (χ), proton affinity (PA) and nucleophilicity (ω), were computed and discussed. Monte Carlo simulations were applied to search for the most stable configuration and adsorption energies for the interaction of the inhibitors with the Fe (110) surface. The theoretical data obtained are in most cases in agreement with experimental results.
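
    For reference, the conceptual-DFT descriptors listed above are commonly evaluated from the frontier orbital energies via Koopmans-type approximations; a sketch of the usual definitions is given below (the paper may use somewhat different conventions, and ω defined this way is most often called the electrophilicity index).

    \chi \approx -\tfrac{1}{2}\left(E_{\mathrm{HOMO}} + E_{\mathrm{LUMO}}\right), \qquad \eta \approx \tfrac{1}{2}\left(E_{\mathrm{LUMO}} - E_{\mathrm{HOMO}}\right), \qquad \sigma = \frac{1}{\eta}, \qquad \omega = \frac{\chi^{2}}{2\eta}, \qquad \Delta E = E_{\mathrm{LUMO}} - E_{\mathrm{HOMO}}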

  4. Implementation of a Monte Carlo based inverse planning model for clinical IMRT with MCNP code

    Science.gov (United States)

    He, Tongming Tony

    In IMRT inverse planning, inaccurate dose calculations and limitations in optimization algorithms introduce both systematic and convergence errors to treatment plans. The goal of this work is to practically implement a Monte Carlo based inverse planning model for clinical IMRT. The intention is to minimize both types of error in inverse planning and obtain treatment plans with better clinical accuracy than non-Monte Carlo based systems. The strategy is to calculate the dose matrices of small beamlets by using a Monte Carlo based method. Optimization of beamlet intensities then follows, based on the calculated dose data, using an optimization algorithm that is capable of escaping local minima and prevents possible premature convergence. The MCNP 4B Monte Carlo code is improved to perform fast particle transport and dose tallying in lattice cells by adopting a selective transport and tallying algorithm. Efficient dose matrix calculation for small beamlets is made possible by adopting a scheme that allows concurrent calculation of multiple beamlets of a single port. A finite-sized point source (FSPS) beam model is introduced for easy and accurate beam modeling. A DVH based objective function and a parallel platform based algorithm are developed for the optimization of intensities. The calculation accuracy of the improved MCNP code and the FSPS beam model is validated by dose measurements in phantoms. Agreements better than 1.5% or 0.2 cm have been achieved. Applications of the implemented model to clinical cases of brain, head/neck, lung, spine, pancreas and prostate have demonstrated the feasibility and capability of Monte Carlo based inverse planning for clinical IMRT. Dose distributions of selected treatment plans from a commercial non-Monte Carlo based system are evaluated in comparison with Monte Carlo based calculations. Systematic errors of up to 12% in tumor doses and up to 17% in critical structure doses have been observed. The clinical importance of Monte Carlo based

  5. Reactor physics analysis method based on Monte Carlo homogenization

    International Nuclear Information System (INIS)

    Background: Many new concepts of nuclear energy systems with complicated geometric structures and diverse energy spectra have been put forward to meet the future demand of the nuclear energy market. The traditional deterministic neutronics analysis method has been challenged in two aspects: one is the ability of generic geometry processing; the other is the multi-spectrum applicability of the multi-group cross section libraries. The Monte Carlo (MC) method excels in both geometric and spectral flexibility, but faces the problems of long computation time and slow convergence. Purpose: This work aims to find a novel scheme that takes advantage of both the deterministic core analysis method and the MC method. Methods: A new two-step core analysis scheme is proposed to combine the geometry modeling capability and continuous energy cross section libraries of the MC method with the higher computational efficiency of the deterministic method. First of all, the MC simulations are performed at the assembly level, and the assembly-homogenized multi-group cross sections are tallied at the same time. Then, the core diffusion calculations can be done with these multi-group cross sections. Results: The new scheme can achieve high efficiency while maintaining acceptable precision. Conclusion: The new scheme can be used as an effective tool for the design and analysis of innovative nuclear energy systems, which has been verified by numeric tests. (authors)
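
    The assembly-homogenized multi-group constants mentioned in the two-step scheme are conventionally flux-volume-weighted collapses of the continuous-energy data tallied during the assembly-level MC run; a generic sketch of the standard definition (notation not specific to this paper) is:

    \Sigma_{x,g}^{\mathrm{hom}} = \frac{\int_{V}\int_{E_{g}}^{E_{g-1}} \Sigma_{x}(\mathbf{r},E)\,\phi(\mathbf{r},E)\,\mathrm{d}E\,\mathrm{d}V}{\int_{V}\int_{E_{g}}^{E_{g-1}} \phi(\mathbf{r},E)\,\mathrm{d}E\,\mathrm{d}V}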

  6. Application of Photon Transport Monte Carlo Module with GPU-based Parallel System

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je [Sejong University, Seoul (Korea, Republic of); Shon, Heejeong [Golden Eng. Co. LTD, Seoul (Korea, Republic of); Lee, Donghak [CoCo Link Inc., Seoul (Korea, Republic of)

    2015-05-15

    In general, it takes a great deal of computing time to obtain reliable results in Monte Carlo simulations, especially in deep penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. At the same time, advanced computing hardware systems such as GPU (Graphics Processing Units)-based parallel machines are used to improve the performance of Monte Carlo simulations. A GPU is much easier to access and manage than a CPU cluster system, and it has also become less expensive due to advances in computer technology. Therefore, many engineering areas have adopted GPU-based massively parallel computation techniques, including the GPU-based photon transport Monte Carlo method presented here. It provides an almost 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of the Monte Carlo module, which requires quick and accurate simulations.

  7. Application of Photon Transport Monte Carlo Module with GPU-based Parallel System

    International Nuclear Information System (INIS)

    In general, it takes a great deal of computing time to obtain reliable results in Monte Carlo simulations, especially in deep penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. At the same time, advanced computing hardware systems such as GPU (Graphics Processing Units)-based parallel machines are used to improve the performance of Monte Carlo simulations. A GPU is much easier to access and manage than a CPU cluster system, and it has also become less expensive due to advances in computer technology. Therefore, many engineering areas have adopted GPU-based massively parallel computation techniques, including the GPU-based photon transport Monte Carlo method presented here. It provides an almost 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of the Monte Carlo module, which requires quick and accurate simulations.

  8. Acceptance and implementation of a Monte Carlo-based computerized planning system

    International Nuclear Information System (INIS)

    Acceptance for clinical use of the Monaco computerized planning system has been carried out. The system is based on a virtual model of the energy yield of the linear electron accelerator head and performs the dose calculation with an X-ray algorithm (XVMC) based on the Monte Carlo method. (Author)

  9. A Monte-Carlo-Based Network Method for Source Positioning in Bioluminescence Tomography

    OpenAIRE

    Zhun Xu; Xiaolei Song; Xiaomeng Zhang; Jing Bai

    2007-01-01

    We present an approach based on an improved Levenberg-Marquardt (LM) backpropagation (BP) neural network algorithm to estimate the light source position in bioluminescence imaging. For solving the forward problem, the table-based random sampling algorithm (TBRS), a fast Monte Carlo simulation method ...

  10. VIP-Man: An image-based whole-body adult male model constructed from color photographs of the visible human project for multi-particle Monte Carlo calculations

    International Nuclear Information System (INIS)

    Human anatomical models have been indispensable to radiation protection dosimetry using Monte Carlo calculations. Existing MIRD-based mathematical models are easy to compute and standardize, but they are simplified and crude compared to human anatomy. This article describes the development of an image-based whole-body model, called VIP-Man, using transversal color photographic images obtained from the National Library of Medicine's Visible Human Project for Monte Carlo organ dose calculations involving photons, electrons, neutrons, and protons. As the first of a series of papers on dose calculations based on VIP-Man, this article provides detailed information about how to construct an image-based model, as well as how to adopt it into well-tested Monte Carlo codes, EGS4, MCNP4B, and MCNPX

  11. Proton therapy for tumors of the skull base

    Energy Technology Data Exchange (ETDEWEB)

    Munzenrider, J.E.; Liebsch, N.J. [Dept. of Radiation Oncology, Harvard Univ. Medical School, Boston, MA (United States)

    1999-06-01

    Charged particle beams are ideal for treating skull base and cervical spine tumors: dose can be focused in the target, while achieving significant sparing of the brain, brain stem, cervical cord, and optic nerves and chiasm. For skull base tumors, 10-year local control rates with combined proton-photon therapy are highest for chondrosarcomas, intermediate for male chordomas, and lowest for female chordomas (94%, 65%, and 42%, respectively). For cervical spine tumors, 10-year local control rates are not significantly different for chordomas and chondrosarcomas (54% and 48%, respectively), nor is there any difference in local control between males and females. Observed treatment-related morbidity has been judged acceptable, in view of the major morbidity and mortality which accompany uncontrolled tumor growth. (orig.)

  12. The influence of lateral beam profile modifications in scanned proton and carbon ion therapy: a Monte Carlo study

    CERN Document Server

    Parodi, K; Kraemer, M; Sommerer, F; Naumann, J; Mairani, A; Brons, S

    2010-01-01

    Scanned ion beam delivery promises superior flexibility and accuracy for highly conformal tumour therapy in comparison to the usage of passive beam shaping systems. The attainable precision demands correct overlapping of the pencil-like beams which build up the entire dose distribution in the treatment field. In particular, improper dose application due to deviations of the lateral beam profiles from the nominal planning conditions must be prevented via appropriate beam monitoring in the beamline, prior to the entrance in the patient. To assess the necessary tolerance thresholds of the beam monitoring system at the Heidelberg Ion Beam Therapy Center, Germany, this study has investigated several worst-case scenarios for a sensitive treatment plan, namely scanned proton and carbon ion delivery to a small target volume at a shallow depth. Deviations from the nominal lateral beam profiles were simulated, which may occur because of misaligned elements or changes of the beam optic in the beamline. Data have been an...

  13. Acceptance and implementation of a Monte Carlo-based computerized planning system; Aceptacion y puesta en marcha de un sistema de planificacion comutarizada basado en Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.

    2013-07-01

    Acceptance for clinical use of the Monaco computerized planning system has been carried out. The system is based on a virtual model of the energy yield of the linear electron accelerator head and performs the dose calculation with an X-ray algorithm (XVMC) based on the Monte Carlo method. (Author)

  14. The future of new calculation concepts in dosimetry based on the Monte Carlo Methods; Avenir des nouveaux concepts des calculs dosimetriques bases sur les methodes de Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Makovicka, L.; Vasseur, A.; Sauget, M.; Martin, E.; Gschwind, R.; Henriet, J. [Universite de Franche-Comte, Equipe IRMA/ENISYS/FEMTO-ST, UMR6174 CNRS, 25 - Montbeliard (France); Vasseur, A.; Sauget, M.; Martin, E.; Gschwind, R.; Henriet, J.; Salomon, M. [Universite de Franche-Comte, Equipe AND/LIFC, 90 - Belfort (France)

    2009-01-15

    Monte Carlo codes, precise but slow, are very important tools in the vast majority of specialities connected to Radiation Physics, Radiation Protection and Dosimetry. A discussion of other computing solutions is carried out: solutions based not only on the enhancement of computer power, or on the 'biasing' used for the relative acceleration of these codes (in the case of photons), but on more efficient methods (A.N.N. - artificial neural networks, C.B.R. - case-based reasoning - or other computer science techniques) that have already been used successfully for a long time in other scientific and industrial applications, and not only in Radiation Protection or Medical Dosimetry. (authors)

  15. Response matrix Monte Carlo based on a general geometry local calculation for electron transport

    International Nuclear Information System (INIS)

    A Response Matrix Monte Carlo (RMMC) method has been developed for solving electron transport problems. This method was born of the need to have a reliable, computationally efficient transport method for low energy electrons (below a few hundred keV) in all materials. Today, condensed history methods are used which reduce the computation time by modeling the combined effect of many collisions but fail at low energy because of the assumptions required to characterize the electron scattering. Analog Monte Carlo simulations are prohibitively expensive since electrons undergo coulombic scattering with little state change after a collision. The RMMC method attempts to combine the accuracy of an analog Monte Carlo simulation with the speed of the condensed history methods. Like condensed history, the RMMC method uses probability distribution functions (PDFs) to describe the energy and direction of the electron after several collisions. However, unlike the condensed history method, the PDFs are based on an analog Monte Carlo simulation over a small region. Condensed history theories require assumptions about the electron scattering to derive the PDFs for direction and energy. Thus the RMMC method samples from PDFs which more accurately represent the electron random walk. Results show good agreement between the RMMC method and analog Monte Carlo. 13 refs., 8 figs

  16. Convex-based void filling method for CAD-based Monte Carlo geometry modeling

    International Nuclear Information System (INIS)

    Highlights: • We present a new void filling method named CVF for CAD based MC geometry modeling. • We describe a convex-based void description and quality-based space subdivision. • The results showed improvements provided by CVF for both modeling and MC calculation efficiency. - Abstract: CAD based automatic geometry modeling tools have been widely applied to generate Monte Carlo (MC) calculation geometry for complex systems according to CAD models. Automatic void filling is one of the main functions in the CAD based MC geometry modeling tools, because the void space between parts in CAD models is traditionally not modeled while MC codes such as MCNP need all the problem space to be described. A dedicated void filling method, named Convex-based Void Filling (CVF), is proposed in this study for efficient void filling and concise void descriptions. The method subdivides all the problem space into disjoint regions using Quality based Subdivision (QS) and describes the void space in each region with complementary descriptions of the convex volumes intersecting with that region. It has been implemented in SuperMC/MCAM, the Multiple-Physics Coupling Analysis Modeling Program, and tested on the International Thermonuclear Experimental Reactor (ITER) Alite model. The results showed that the new method reduced both automatic modeling time and MC calculation time

  17. Comparison of depth-dose distributions of proton therapeutic beams calculated by means of logical detectors and ionization chamber modeled in Monte Carlo codes

    Science.gov (United States)

    Pietrzak, Robert; Konefał, Adam; Sokół, Maria; Orlef, Andrzej

    2016-08-01

    The success of proton therapy depends strongly on the precision of treatment planning. Dose distribution in biological tissue may be obtained from Monte Carlo simulations using various scientific codes making it possible to perform very accurate calculations. However, there are many factors affecting the accuracy of modeling. One of them is the structure of the objects, called bins, that register the dose. In this work the influence of bin structure on the dose distributions was examined. The MCNPX code calculations of the Bragg curve for the 60 MeV proton beam were done in two ways: using simple logical detectors (volumes defined in water), and using a precise model of the ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in the water phantom with a Markus ionization chamber. The average local dose difference between the measured relative doses in the water phantom and those calculated by means of the logical detectors was 1.4% over the first 25 mm, whereas in the full depth range this difference was 1.6%, for a maximum uncertainty in the calculations of less than 2.4% and a maximum measuring error of 1%. In the case of the relative doses calculated with the use of the ionization chamber model this average difference was somewhat greater, being 2.3% at depths up to 25 mm and 2.4% in the full range of depths, for a maximum uncertainty in the calculations of 3%. In the dose calculations the ionization chamber model does not offer any additional advantages over the logical detectors. The results provided by both models are similar and in good agreement with the measurements; however, the logical detector approach is a more time-effective method.

  18. Monte-Carlo based prediction of radiochromic film response for hadrontherapy dosimetry

    International Nuclear Information System (INIS)

    A model has been developed to calculate MD-55-V2 radiochromic film response to ion irradiation. This model is based on photon film response and film saturation by high local energy deposition computed by Monte-Carlo simulation. We have studied the response of the film to photon irradiation and we proposed a calculation method for hadron beams.

  19. New memory devices based on the proton transfer process

    Science.gov (United States)

    Wierzbowska, Małgorzata

    2016-01-01

    Memory devices operating due to the fast proton transfer (PT) process are proposed by the means of first-principles calculations. Writing information is performed using the electrostatic potential of scanning tunneling microscopy (STM). Reading information is based on the effect of the local magnetization induced at the zigzag graphene nanoribbon (Z-GNR) edge—saturated with oxygen or the hydroxy group—and can be realized with the use of giant magnetoresistance (GMR), a magnetic tunnel junction or spin-transfer torque devices. The energetic barriers for the hop forward and backward processes can be tuned by the distance and potential of the STM tip; this thus enables us to tailor the non-volatile logic states. The proposed system enables very dense packing of the logic cells and could be used in random access and flash memory devices.

  20. New memory devices based on the proton transfer process.

    Science.gov (United States)

    Wierzbowska, Małgorzata

    2016-01-01

    Memory devices operating due to the fast proton transfer (PT) process are proposed by the means of first-principles calculations. Writing information is performed using the electrostatic potential of scanning tunneling microscopy (STM). Reading information is based on the effect of the local magnetization induced at the zigzag graphene nanoribbon (Z-GNR) edge—saturated with oxygen or the hydroxy group—and can be realized with the use of giant magnetoresistance (GMR), a magnetic tunnel junction or spin-transfer torque devices. The energetic barriers for the hop forward and backward processes can be tuned by the distance and potential of the STM tip; this thus enables us to tailor the non-volatile logic states. The proposed system enables very dense packing of the logic cells and could be used in random access and flash memory devices. PMID:26596910

  1. Proton beam micromachining on strippable aqueous base developable negative resist

    Energy Technology Data Exchange (ETDEWEB)

    Rajta, I. [Institute of Nuclear Research of the Hungarian Academy of Sciences, H-4001 Debrecen, P.O. Box 51 (Hungary)]. E-mail: rajta@atomki.hu; Baradacs, E. [University of Debrecen, Department of Environmental Physics, H-4026 Debrecen, Poroszlay u. 6 (Hungary); Chatzichristidi, M. [Institute of Microelectronics, NCSR-'Demokritos', POB 62230, 153 10 Ag. Paraskevi (Greece); Valamontes, E.S. [Department of Electronics, Technological Educational Institute of Athens, 12210 Aegaleo (Greece); Uzonyi, I. [Institute of Nuclear Research of the Hungarian Academy of Sciences, H-4001 Debrecen, P.O. Box 51 (Hungary); Raptis, I. [Institute of Microelectronics, NCSR-'Demokritos', POB 62230, 153 10 Ag. Paraskevi (Greece)

    2005-04-01

    Nowadays a significant amount of research effort is devoted to the development of technologies for the fabrication of microcomponents and microsystems worldwide. In certain applications of micromachining high aspect ratio (HAR) structures are required. However, the resist materials used in HAR technologies are usually not compatible with the IC fabrication, either because they cannot be stripped away or because they are developed in organic solvents. In the present work the application of a novel chemically amplified resist for proton beam micromachining is presented. The resist based on epoxy and polyhydroxystyrene polymers is developed in the IC standard aqueous developers. The exposed areas can be stripped away using conventional organic stripping solutions. In order to test the exposure dose sensitivity and the lateral resolution, various test structures were irradiated. Using this formulation 5-8 μm wide lines with aspect ratio 4-6 were resolved.

  2. Proton beam micromachining on strippable aqueous base developable negative resist

    International Nuclear Information System (INIS)

    Nowadays a significant amount of research effort is devoted to the development of technologies for the fabrication of microcomponents and microsystems worldwide. In certain applications of micromachining high aspect ratio (HAR) structures are required. However, the resist materials used in HAR technologies are usually not compatible with the IC fabrication, either because they cannot be stripped away or because they are developed in organic solvents. In the present work the application of a novel chemically amplified resist for proton beam micromachining is presented. The resist based on epoxy and polyhydroxystyrene polymers is developed in the IC standard aqueous developers. The exposed areas can be stripped away using conventional organic stripping solutions. In order to test the exposure dose sensitivity and the lateral resolution, various test structures were irradiated. Using this formulation 5-8 μm wide lines with aspect ratio 4-6 were resolved

  3. A Muon Source Proton Driver at JPARC-based Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Neuffer, David [Fermilab

    2016-06-01

    An "ultimate" high intensity proton source for neutrino factories and/or muon colliders was projected to be a ~4 MW multi-GeV proton source providing short, intense proton pulses at ~15 Hz. The JPARC ~1 MW accelerators provide beam at parameters that in many respects overlap these goals. Proton pulses from the JPARC Main Ring can readily meet the pulsed intensity goals. We explore these parameters, describe the overlap, and consider extensions that may take a JPARC-like facility toward this "ultimate" source. JPARC itself could serve as a stage 1 source for such a facility.

  4. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    CERN Document Server

    Magro, G; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Ferrari, A; Valvo, F; Fossati, P; Ciocca, M

    2015-01-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5–30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the only goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size r...

  5. Development of Monte Carlo-based pebble bed reactor fuel management code

    International Nuclear Information System (INIS)

    Highlights: • A new Monte Carlo-based fuel management code for OTTO cycle pebble bed reactor was developed. • The double-heterogeneity was modeled using statistical method in MVP-BURN code. • The code can perform analysis of equilibrium and non-equilibrium phase. • Code-to-code comparisons for Once-Through-Then-Out case were investigated. • Ability of the code to accommodate the void cavity was confirmed. - Abstract: A fuel management code for pebble bed reactors (PBRs) based on the Monte Carlo method has been developed in this study. The code, named Monte Carlo burnup analysis code for PBR (MCPBR), enables a simulation of the Once-Through-Then-Out (OTTO) cycle of a PBR from the running-in phase to the equilibrium condition. In MCPBR, a burnup calculation based on a continuous-energy Monte Carlo code, MVP-BURN, is coupled with an additional utility code to be able to simulate the OTTO cycle of PBR. MCPBR has several advantages in modeling PBRs, namely its Monte Carlo neutron transport modeling, its capability of explicitly modeling the double heterogeneity of the PBR core, and its ability to model different axial fuel speeds in the PBR core. Analysis at the equilibrium condition of the simplified PBR was used as the validation test of MCPBR. The calculation results of the code were compared with the results of diffusion-based fuel management PBR codes, namely the VSOP and PEBBED codes. Using JENDL-4.0 nuclide library, MCPBR gave a 4.15% and 3.32% lower keff value compared to VSOP and PEBBED, respectively. While using JENDL-3.3, MCPBR gave a 2.22% and 3.11% higher keff value compared to VSOP and PEBBED, respectively. The ability of MCPBR to analyze neutron transport in the top void of the PBR core and its effects was also confirmed

  6. Comparison between experimental data and Monte-Carlo simulations of neutron production in spallation reactions of 0.7-1.5 GeV protons on a thick, lead target

    International Nuclear Information System (INIS)

    Relativistic protons with energies 0.7-1.5 GeV interacting with a thick, cylindrical, lead target, surrounded by a uranium blanket and a polyethylene moderator, produced spallation neutrons. The spatial and energetic distributions of the produced neutron field were measured by the Activation Analysis Method using Al, Au, Bi, and Co radio-chemical sensors. The experimental yields of isotopes induced in the sensors were compared with Monte-Carlo calculations performed with the MCNPX 2.4.0 code

  7. The effect of statistical uncertainty on inverse treatment planning based on Monte Carlo dose calculation

    Science.gov (United States)

    Jeraj, Robert; Keall, Paul

    2000-12-01

    The effect of the statistical uncertainty, or noise, in inverse treatment planning for intensity modulated radiotherapy (IMRT) based on Monte Carlo dose calculation was studied. Sets of Monte Carlo beamlets were calculated to give uncertainties at Dmax ranging from 0.2% to 4% for a lung tumour plan. The weights of these beamlets were optimized using a previously described procedure based on a simulated annealing optimization algorithm. Several different objective functions were used. It was determined that the use of Monte Carlo dose calculation in inverse treatment planning introduces two errors in the calculated plan. In addition to the statistical error due to the statistical uncertainty of the Monte Carlo calculation, a noise convergence error also appears. For the statistical error it was determined that apparently successfully optimized plans with a noisy dose calculation (3% 1σ at Dmax ), which satisfied the required uniformity of the dose within the tumour, showed as much as 7% underdose when recalculated with a noise-free dose calculation. The statistical error is larger towards the tumour and is only weakly dependent on the choice of objective function. The noise convergence error appears because the optimum weights are determined using a noisy calculation, which is different from the optimum weights determined for a noise-free calculation. Unlike the statistical error, the noise convergence error is generally larger outside the tumour, is case dependent and strongly depends on the required objectives.
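
    The mechanism behind the noise convergence error can be illustrated with a toy optimization in which beamlet weights are fitted against noisy dose kernels and then re-evaluated with noise-free ones; everything below (problem size, noise model, the non-negative least-squares optimizer) is an illustrative assumption, not the simulated-annealing setup of the paper.

    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)

    n_vox, n_beamlets = 200, 40
    D_true = rng.uniform(0.0, 1.0, size=(n_vox, n_beamlets))   # "noise-free" beamlet dose matrix
    target = np.full(n_vox, 60.0)                               # uniform prescription (Gy)

    for rel_noise in (0.0, 0.03, 0.10):
        # Weights optimized against a *noisy* dose calculation ...
        D_noisy = D_true * (1.0 + rel_noise * rng.standard_normal(D_true.shape))
        w, _ = nnls(D_noisy, target)
        # ... then the plan is re-evaluated with the noise-free calculation.
        dose = D_true @ w
        print(f"noise {rel_noise:4.0%}: max underdose on re-evaluation "
              f"{100.0 * (1.0 - dose.min() / 60.0):.1f}%")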

  8. Proton beam micromachining on strippable aqueous base developable negative resist

    International Nuclear Information System (INIS)

    Complete text of publication follows. Proton Beam Micromachining (PBM, also known as P-beam writing), a novel direct-write process for the production of 3D microstructures, can be used to make multilevel structures in a single layer of resist by varying the ion energy. The interaction between the bombarding ions and the target material is mainly ionization, and very few ions suffer high angle nuclear collisions, therefore structures made with PBM have smooth, near-vertical side walls. The most commonly applied resists in PBM are the positive, conventional, polymethyl methacrylate (PMMA); and the negative, chemically amplified, SU-8 (Micro Chem Corp). SU-8 is an epoxy-based resist also suitable for LIGA and UV-LIGA processes; it offers good sensitivity, good process latitude and very high aspect ratios, and therefore it dominates in high aspect ratio micromachining applications. SU-8 requires a fluence of 30 nC/mm2 for PBM irradiation with 2 MeV protons. Its crosslinking chemistry is based on the eight epoxy rings in the polymer chain, which provide a very dense three-dimensional network in the presence of suitably activated photo acid generators (PAGs) that is very difficult to strip away after development. Thus, stripping has to be assisted with plasma processes or with special liquid removers. Moreover, the SU-8 developer is organic, propylene glycol methyl ether acetate (PGMEA), and thus environmentally unfriendly. To overcome the SU-8 stripping limitations, the design of a negative resist system where the solubility change is not based solely on cross-linking but also on the differentiation of hydrophilicity between exposed and non-exposed areas is desirable. A new resist formulation fulfilling the above specifications has been developed recently [1]. This formulation is based on a specific grade epoxy novolac (EP) polymer, a partially hydrogenated poly-4-hydroxy styrene (PHS) polymer, and an onium salt as photoacid generator (PAG), and has been successfully applied

  9. Feasibility Study of Accumulator and Compressor for the 6-bunches SPL based Proton Driver

    CERN Document Server

    Aiba, M

    2008-01-01

    The feasibility of the accumulator and the compressor ring for the SPL based proton driver has been studied for a future neutrino factory. The scenario retained for the SPL proton driver uses six bunches, with 10^14 protons in total at 50 Hz. Possible lattices for the accumulator and the compressor are presented. The beam injection/accumulation and the bunch compression are delicate issues and are discussed in detail in this note. Throughout the presented study, these difficulties are shown not to be critical issues, and together with a discussion of the focusing towards the production target, the feasibility of the 6-bunches SPL based proton driver has been confirmed.

  10. Feasibility study of accumulator and compressor for the 6-bunches SPL based proton driver

    CERN Document Server

    Aiba, M

    2008-01-01

    Feasibility studies of the accumulator and the compressor rings for the SPL based proton driver have been performed for a future neutrino factory. The scenario retained for the SPL proton driver uses six bunches, with 10^14 protons in total at 50 Hz. Possible lattices for the accumulator and the compressor are presented. The beam injection/accumulation and the bunch compression are delicate issues and are discussed in detail in this note. Throughout the presented study, these difficulties are shown not to be critical issues, and together with a discussion of the focusing towards the production target, the feasibility of the 6-bunches SPL based proton driver has been confirmed.

  11. Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT

    OpenAIRE

    Weinmann Martin; Söhn Matthias; Muzik Jan; Sikora Marcin; Alber Markus

    2009-01-01

    Background: The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods: A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase static CT, ...

  12. Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT

    OpenAIRE

    Sikora, Marcin; Muzik, Jan; Söhn, Matthias; Weinmann, Martin; Alber, Markus

    2009-01-01

    Background: The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods: A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase static CT, density o...

  13. Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT

    OpenAIRE

    Sikora, Marcin Pawel; Muzik, Jan; Söhn, Matthias; Weinmann, Martin; Alber, Markus

    2009-01-01

    Background: The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods: A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase ...

  14. Verification of the use of GEANT4 and MCNPX Monte Carlo Codes for Calculations of the Depth-Dose Distributions in Water for the Proton Therapy of Eye Tumours

    Directory of Open Access Journals (Sweden)

    Grządziel Małgorzata

    2014-07-01

    Verification of calculations of the depth-dose distributions in water, using the GEANT4 (version 4.9.3) and MCNPX (version 2.7.0) Monte Carlo codes, was performed for the scatterer-phantom system used in the dosimetry measurements in the proton therapy of eye tumours. The simulated primary proton beam had an energy spectrum distributed according to a Gaussian distribution, with a cut at energies greater than that corresponding to the maximum of the spectrum. The energy spectra of the primary protons were chosen to obtain the best possible agreement between the measured relative depth-dose distributions along the central axis of the proton beam in a water phantom and those derived from the Monte Carlo calculations, separately for both tested codes. The local depth-dose differences between the results of the calculations and the measurements were mostly less than 5% (mean values of 2.1% and 3.6% for the MCNPX and GEANT4 calculations, respectively). In the case of the MCNPX calculations, the best fit to the experimental data was obtained for the spectrum with a maximum at 60.8 MeV (the most probable energy), a FWHM of 0.4 MeV and an energy cut at 60.85 MeV, whereas in the GEANT4 calculations the most probable energy was 60.5 MeV, the FWHM 0.5 MeV and the energy cut 60.7 MeV. Thus, one can say that the results obtained by means of both considered Monte Carlo codes are similar but not the same. Therefore the agreement between the calculations and the measurements has to be verified before each application of the MCNPX and GEANT4 codes for the determination of the depth-dose curves for the therapeutic protons.
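
    A minimal way to reproduce the kind of primary spectrum described above (Gaussian with a high-energy cut) is rejection sampling; the sketch below reuses the MCNPX values quoted in the abstract, but the implementation itself is only an illustrative assumption, not the code used by the authors.

    import numpy as np

    def sample_cut_gaussian(mean, fwhm, e_cut, n, rng):
        """Sample proton energies from a Gaussian spectrum, rejecting values above e_cut."""
        sigma = fwhm / 2.3548          # convert FWHM to standard deviation
        energies = np.empty(0)
        while energies.size < n:
            batch = rng.normal(mean, sigma, size=2 * n)
            energies = np.concatenate([energies, batch[batch <= e_cut]])
        return energies[:n]

    rng = np.random.default_rng(1)
    E = sample_cut_gaussian(mean=60.8, fwhm=0.4, e_cut=60.85, n=100_000, rng=rng)
    print(f"mean = {E.mean():.2f} MeV, max = {E.max():.2f} MeV")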

  15. A Markov Chain Monte Carlo Based Method for System Identification

    Energy Technology Data Exchange (ETDEWEB)

    Glaser, R E; Lee, C L; Nitao, J J; Hanley, W G

    2002-10-22

    This paper describes a novel methodology for the identification of mechanical systems and structures from vibration response measurements. It combines prior information, observational data and predictive finite element models to produce configurations and system parameter values that are most consistent with the available data and model. Bayesian inference and a Metropolis simulation algorithm form the basis for this approach. The resulting process enables the estimation of distributions of both individual parameters and system-wide states. Attractive features of this approach include its ability to: (1) provide quantitative measures of the uncertainty of a generated estimate; (2) function effectively when exposed to degraded conditions including: noisy data, incomplete data sets and model misspecification; (3) allow alternative estimates to be produced and compared, and (4) incrementally update initial estimates and analysis as more data becomes available. A series of test cases based on a simple fixed-free cantilever beam is presented. These results demonstrate that the algorithm is able to identify the system, based on the stiffness matrix, given applied force and resultant nodal displacements. Moreover, it effectively identifies locations on the beam where damage (represented by a change in elastic modulus) was specified.
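
    As a schematic of the Metropolis-based identification described above, the sketch below infers a single stiffness value from noisy displacements of a one-parameter surrogate model; the forward model u = F/k, the flat prior and the noise level are toy assumptions, not the finite element model or priors of the paper.

    import numpy as np

    rng = np.random.default_rng(3)

    # Toy forward model: static displacement u = F / k of a one-parameter structure under load F.
    F, k_true, sigma_noise = 100.0, 2.0e4, 1.0e-4
    u_obs = F / k_true + sigma_noise * rng.standard_normal(50)

    def log_post(k):
        """Log posterior: flat prior on a physical stiffness range plus a Gaussian likelihood."""
        if not (1.0e3 < k < 1.0e6):
            return -np.inf
        resid = u_obs - F / k
        return -0.5 * np.sum((resid / sigma_noise) ** 2)

    k, chain = 5.0e4, []
    for _ in range(20_000):
        k_prop = k + 2.0e2 * rng.standard_normal()   # symmetric random-walk (Metropolis) proposal
        if np.log(rng.uniform()) < log_post(k_prop) - log_post(k):
            k = k_prop
        chain.append(k)

    posterior = np.array(chain[5_000:])              # discard burn-in
    print(f"k = {posterior.mean():.3e} +/- {posterior.std():.1e}  (true {k_true:.3e})")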

  16. Effects of CT based Voxel Phantoms on Dose Distribution Calculated with Monte Carlo Method

    Institute of Scientific and Technical Information of China (English)

    Chen Chaobin; Huang Qunying; Wu Yican

    2005-01-01

    A few CT-based voxel phantoms were produced to investigate the sensitivity of Monte Carlo simulations of X-ray beam and electron beam to the proportions of elements and the mass densities of the materials used to express the patient's anatomical structure. The human body can be well outlined by air, lung, adipose, muscle, soft bone and hard bone to calculate the dose distribution with Monte Carlo method. The effects of the calibration curves established by using various CT scanners are not clinically significant based on our investigation. The deviation from the values of cumulative dose volume histogram derived from CT-based voxel phantoms is less than 1% for the given target.

  17. Effects of CT based Voxel Phantoms on Dose Distribution Calculated with Monte Carlo Method

    Science.gov (United States)

    Chen, Chaobin; Huang, Qunying; Wu, Yican

    2005-04-01

    A few CT-based voxel phantoms were produced to investigate the sensitivity of Monte Carlo simulations of x-ray beam and electron beam to the proportions of elements and the mass densities of the materials used to express the patient's anatomical structure. The human body can be well outlined by air, lung, adipose, muscle, soft bone and hard bone to calculate the dose distribution with Monte Carlo method. The effects of the calibration curves established by using various CT scanners are not clinically significant based on our investigation. The deviation from the values of cumulative dose volume histogram derived from CT-based voxel phantoms is less than 1% for the given target.
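
    The scanner-specific calibration curves referred to above map CT numbers to mass densities before materials are assigned to the voxels; the snippet below is a generic two-segment example whose break point and slopes are textbook-style, purely illustrative values, not those of any scanner in the study.

    import numpy as np

    def hu_to_density(hu):
        """Illustrative two-segment HU-to-mass-density calibration (g/cm^3)."""
        hu = np.asarray(hu, dtype=float)
        rho = np.where(hu <= 0,
                       1.0 + hu / 1000.0,      # air / lung / soft-tissue branch
                       1.0 + 0.0006 * hu)      # bone branch (shallower slope)
        return np.clip(rho, 0.001, None)

    print(hu_to_density([-1000, -700, 0, 500, 1200]))  # air, lung, water, soft bone, hard bone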

  18. VHEeP: A very high energy electron-proton collider based on proton-driven plasma wakefield acceleration

    CERN Document Server

    Caldwell, Allen

    2015-01-01

    Based on current CERN infrastructure, an electron-proton collider is proposed at a centre-of-mass energy of about 9 TeV. A 7 TeV LHC bunch is used as the proton driver to create a plasma wakefield which then accelerates electrons to 3 TeV, these then colliding with the other 7 TeV LHC proton beam. The basic parameters of the collider are presented, which although of very high energy, has integrated luminosities of the order of 1 pb$^{-1}$/year. For such a collider, with a centre-of-mass energy 30 times greater than HERA, parton momentum fractions, $x$, down to about $10^{-8}$ are accessible for $Q^2$ of 1 GeV$^2$ and could lead to effects of saturation or some other breakdown of DGLAP being observed. The total photon-proton cross section can be measured up to very high energies and also at different energies as the possibility of varying the electron beam energy is assumed; this could have synergy with cosmic-ray physics. Other physics which can be pursued at such a collider are contact interaction searches, ...

  19. Interplay effects in proton scanning for lung: a 4D Monte Carlo study assessing the impact of tumor and beam delivery parameters

    International Nuclear Information System (INIS)

    Relative motion between a tumor and a scanning proton beam results in a degradation of the dose distribution (interplay effect). This study investigates the relationship between beam scanning parameters and the interplay effect, with the goal of finding parameters that minimize interplay. 4D Monte Carlo simulations of pencil beam scanning proton therapy treatments were performed using the 4DCT geometry of five lung cancer patients of varying tumor size (50.4–167.1 cc) and motion amplitude (2.9–30.1 mm). Treatments were planned assuming delivery in 35 × 2.5 Gy(RBE) fractions. The spot size, time to change the beam energy (τes), time required for magnet settling (τss), initial breathing phase, spot spacing, scanning direction, scanning speed, beam current and patient breathing period were varied for each of the five patients. Simulations were performed for a single fraction and an approximation of conventional fractionation. For the patients considered, the interplay effect could not be predicted using the superior–inferior motion amplitude alone. Larger spot sizes (σ ∼ 9–16 mm) were less susceptible to interplay, giving an equivalent uniform dose (EUD) of 99.0 ± 4.4% (1 standard deviation) in a single fraction compared to 86.1 ± 13.1% for smaller spots (σ ∼ 2–4 mm). The smaller spot sizes gave EUD values as low as 65.3% of the prescription dose in a single fraction. Reducing the spot spacing improved the target dose homogeneity. The initial breathing phase can have a significant effect on the interplay, particularly for shorter delivery times. No clear benefit was evident when scanning either parallel or perpendicular to the predominant axis of motion. Longer breathing periods decreased the EUD. In general, longer delivery times led to lower interplay effects. Conventional fractionation showed significant improvement in terms of interplay, giving a EUD of at least 84.7% and 100.0% of the prescription dose for the small and larger spot sizes

  20. A portable secondary dose monitoring system using scintillating fibers for proton therapy of prostate cancer: A Geant4 Monte Carlo simulation study

    Directory of Open Access Journals (Sweden)

    Biniam Tesfamicael

    2016-03-01

    Full Text Available Purpose: The main purpose of this study was to monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate proton therapy of prostate cancer. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm3 Delrin® blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and the same vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were implemented to extract the energy deposited in each fiber and inside the scintillating block. Results: The transverse dose distributions from the detected secondary particles in both cases are symmetric and agree to within <3.6%. The energy deposited gradually increases as one moves from the peripheral row of fibers towards the center of the block (aligned with the center of the prostate) by a factor of approximately 5. The energy deposited was also observed to decrease as one goes from the frontal to the distal region of the block. The ratio of the energy deposited in the prostate to the energy deposited in the middle two rows of fibers showed a linear relationship with a slope of (-3.55 ± 2.26) × 10^-5 MeV per treatment Gy delivered. The distal detectors recorded a negligible amount of energy deposited due to higher attenuation of the secondary particles by the water in that direction. Conclusion: With a good calibration and with the ability to define a good correlation between the radiation flux recorded by the external fibers and the dose delivered to the prostate, such fibers can be used for real time dose verification to the target. The system was also observed to respond to the series of Bragg Peaks used to generate the

  1. Interplay effects in proton scanning for lung: a 4D Monte Carlo study assessing the impact of tumor and beam delivery parameters

    Science.gov (United States)

    Dowdell, S.; Grassberger, C.; Sharp, G. C.; Paganetti, H.

    2013-06-01

    Relative motion between a tumor and a scanning proton beam results in a degradation of the dose distribution (interplay effect). This study investigates the relationship between beam scanning parameters and the interplay effect, with the goal of finding parameters that minimize interplay. 4D Monte Carlo simulations of pencil beam scanning proton therapy treatments were performed using the 4DCT geometry of five lung cancer patients of varying tumor size (50.4-167.1 cc) and motion amplitude (2.9-30.1 mm). Treatments were planned assuming delivery in 35 × 2.5 Gy(RBE) fractions. The spot size, time to change the beam energy (τes), time required for magnet settling (τss), initial breathing phase, spot spacing, scanning direction, scanning speed, beam current and patient breathing period were varied for each of the five patients. Simulations were performed for a single fraction and an approximation of conventional fractionation. For the patients considered, the interplay effect could not be predicted using the superior-inferior motion amplitude alone. Larger spot sizes (σ ˜ 9-16 mm) were less susceptible to interplay, giving an equivalent uniform dose (EUD) of 99.0 ± 4.4% (1 standard deviation) in a single fraction compared to 86.1 ± 13.1% for smaller spots (σ ˜ 2-4 mm). The smaller spot sizes gave EUD values as low as 65.3% of the prescription dose in a single fraction. Reducing the spot spacing improved the target dose homogeneity. The initial breathing phase can have a significant effect on the interplay, particularly for shorter delivery times. No clear benefit was evident when scanning either parallel or perpendicular to the predominant axis of motion. Longer breathing periods decreased the EUD. In general, longer delivery times led to lower interplay effects. Conventional fractionation showed significant improvement in terms of interplay, giving an EUD of at least 84.7% and 100.0% of the prescription dose for the small and larger spot sizes respectively.

  2. Development of 3d reactor burnup code based on Monte Carlo method and exponential Euler method

    International Nuclear Information System (INIS)

    Burnup analysis plays a key role in fuel breeding, transmutation and post-processing in nuclear reactors. Burnup codes based on one-dimensional and two-dimensional transport methods have difficulties in meeting the accuracy requirements. A three-dimensional burnup analysis code based on the Monte Carlo method and the exponential Euler method has been developed. The coupling code combines the advantage of the Monte Carlo method in complex-geometry neutron transport calculation with that of FISPACT in fast and precise inventory calculation, while the resonance self-shielding effect in the inventory calculation can also be considered. The IAEA benchmark test problem has been adopted for code validation. Good agreement was shown in the comparison with other participants' results. (authors)
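    A depletion step of the kind described couples a transport-derived transmutation/decay matrix to the Bateman equations; for a flux held constant over the step, the exponential Euler update reduces to a matrix exponential. A minimal sketch under that assumption (the rate constants below are illustrative placeholders, not evaluated nuclear data):

```python
import numpy as np
from scipy.linalg import expm

def depletion_step(n0, A, dt):
    """Advance nuclide number densities n0 over a time step dt (s) by solving
    dN/dt = A N with a matrix exponential: N(t + dt) = exp(A dt) N(t)."""
    return expm(A * dt) @ n0

# Illustrative 2-nuclide chain: species 0 is removed (capture + decay) and
# partially feeds species 1, which only decays. Rates are placeholders (1/s).
lam0, lam1, feed = 1.0e-9, 2.0e-10, 0.8
A = np.array([[-lam0,        0.0 ],
              [ feed * lam0, -lam1]])
n0 = np.array([1.0e24, 0.0])                      # atoms per cm^3
print(depletion_step(n0, A, dt=30 * 24 * 3600))   # densities after 30 days
```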

  3. Proton pump inhibitors in cirrhosis: Tradition or evidence based practice?

    Institute of Scientific and Technical Information of China (English)

    Francesca Lodato; Francesco Azzaroli; Maria Di Girolamo; Valentina Feletti; Paolo Cecinato; Andrea Lisotti; Davide Festi; Enrico Roda; Giuseppe Mazzella

    2008-01-01

    Proton Pump Inhibitors (PPI) are very effective in inhibiting acid secretion and are extensively used in many acid related diseases. They are also often used in patients with cirrhosis, sometimes in the absence of a specific acid related disease, with the aim of preventing peptic complications in patients with variceal or hypertensive gastropathic bleeding receiving multidrug treatment. Reports supporting their use in cirrhosis are contradictory, and evidence of their efficacy in this condition is poor. Moreover, there are convincing papers suggesting that acid secretion is reduced in patients with liver cirrhosis. With regard to H pylori infection, its prevalence in patients with cirrhosis varies largely among different studies, and it seems that H pylori eradication does not prevent gastro-duodenal ulcer formation and bleeding. With regard to the prevention and treatment of oesophageal complications after banding or sclerotherapy of oesophageal varices, there is little evidence for a protective role of PPI. Moreover, due to the liver metabolism of PPI, the dose of most available PPIs should be reduced in cirrhotics. In conclusion, the use of this class of drugs seems more habit related than evidence-based, eventually leading to an increase in health costs.

  4. New approach to polarized proton scattering based on Dirac dynamics

    International Nuclear Information System (INIS)

    The Dirac impulse approximation has to date provided dramatic improvement in our ability to predict, with no free parameters, spin observables in proton-nucleus elastic scattering at intermediate energies. The key ingredients of this approach are Dirac propagation and the nucleon-nucleon invariant amplitudes. So far, local approximations to the NN amplitudes have been used. The standard representation of the NN amplitudes in terms of Dirac scalar, vector, and similar parts, which is free of kinematical singularities, seems to naturally predict the correct coupling to negative energy states for energies above 300 MeV. At low energy, this coupling is subject to an ambiguity between pseudoscalar and pseudovector πN coupling mechanisms, and it is evident that the pseudoscalar coupling treated in a local approximation causes too large a scalar-vector difference and thus too large pair contributions. Once this problem is remedied, the Dirac optical potential is expected to be calculable from a nucleon-nucleon quasi-potential over the range 0 to 1000 MeV. For the energy region above about 300 MeV, the large scalar and vector potentials of Dirac phenomenology are seen to be accurately predicted by the impulse approximation. Work by Shakin and collaborators provides complementary results at low energy based on a nuclear matter g-matrix. A basic conclusion is that relativistic spin effects cannot be neglected in nuclear physics. 36 references

  5. Four-dimensional Monte Carlo simulations demonstrating how the extent of intensity-modulation impacts motion effects in proton therapy lung treatments

    Energy Technology Data Exchange (ETDEWEB)

    Dowdell, Stephen; Paganetti, Harald [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States); Grassberger, Clemens [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 and Centre for Proton Therapy, Paul Scherrer Institut, 5232 Villigen-PSI (Switzerland)

    2013-12-15

    Purpose: To compare motion effects in intensity modulated proton therapy (IMPT) lung treatments with different levels of intensity modulation. Methods: Spot scanning IMPT treatment plans were generated for ten lung cancer patients for 2.5 Gy(RBE) and 12 Gy(RBE) fractions and two distinct energy-dependent spot sizes (σ ∼ 8–17 mm and ∼2–4 mm). IMPT plans were generated with the target homogeneity of each individual field restricted to <20% (IMPT_20%). These plans were compared to full IMPT (IMPT_full), which had no restriction on the single field homogeneity. 4D Monte Carlo simulations were performed upon the patient 4DCT geometry, including deformable image registration and incorporating the detailed timing structure of the proton delivery system. Motion effects were quantified via comparison of the results of the 4D simulations (4D-IMPT_20%, 4D-IMPT_full) with those of a 3D Monte Carlo simulation (3D-IMPT_20%, 3D-IMPT_full) upon the planning CT using the equivalent uniform dose (EUD), V_95 and D_1–D_99. The effects in normal lung were quantified using mean lung dose (MLD) and V_90%. Results: For 2.5 Gy(RBE), the mean EUD for the large spot size is 99.9% ± 2.8% for 4D-IMPT_20% compared to 100.1% ± 2.9% for 4D-IMPT_full. The corresponding values are 88.6% ± 8.7% (4D-IMPT_20%) and 91.0% ± 9.3% (4D-IMPT_full) for the smaller spot size. The EUD value is higher in 69.7% of the considered deliveries for 4D-IMPT_full. The V_95 is also higher in 74.7% of the plans for 4D-IMPT_full, implying that IMPT_full plans experience less underdose compared to IMPT_20%. However, the target dose homogeneity is improved in the majority (67.8%) of plans for 4D-IMPT_20%. The higher EUD and V_95 suggest that the degraded homogeneity in IMPT_full is actually due to the introduction of hot spots in the target volume, perhaps resulting from the sharper in-target dose gradients. The
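    The plan-comparison metrics used above (V_95 and the D_1–D_99 homogeneity measure) are simple functionals of the target dose array; a minimal sketch, assuming doses expressed in percent of prescription (the helper names are illustrative, not the authors' code):

```python
import numpy as np

def v95(dose):
    """Fraction of target voxels receiving at least 95% of the prescription."""
    return np.mean(np.asarray(dose, dtype=float) >= 95.0)

def d1_minus_d99(dose):
    """Homogeneity measure D1 - D99: the dose covering the hottest 1% of the
    target (99th percentile of voxel doses) minus the dose covering 99% of
    the target (1st percentile); larger values mean a less homogeneous dose."""
    dose = np.asarray(dose, dtype=float)
    return np.percentile(dose, 99.0) - np.percentile(dose, 1.0)
```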

  6. Modeling radiation effects at the atomic scale with artificial neural network based kinetic Monte Carlo

    International Nuclear Information System (INIS)

    We briefly present our atomistic kinetic Monte Carlo approach to model the diffusion of point defects in Fe-based alloys, and thereby to simulate diffusion-induced mass transport and the subsequent nano-structural and microchemical changes. This methodology has hitherto been successfully applied to the simulation of thermal annealing experiments. We here present our achievements in generalizing this method to the simulation of neutron irradiation damage. (authors)
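    The kinetic Monte Carlo scheme referred to here advances the system one defect jump at a time, choosing each move with probability proportional to its rate; a minimal sketch of the residence-time (BKL) selection step, with the rate model acting as a stand-in for the neural-network regression described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmc_step(rates):
    """One residence-time (BKL) kinetic Monte Carlo step.

    rates : array of jump rates (1/s) for every candidate defect move, e.g.
            predicted from the local atomic environment by a trained neural
            network (stand-in here for the ANN described in the abstract).
    Returns the index of the chosen move and the elapsed time increment.
    """
    rates = np.asarray(rates, dtype=float)
    total = rates.sum()
    event = rng.choice(len(rates), p=rates / total)   # pick a move ~ its rate
    dt = -np.log(rng.random()) / total                # exponential waiting time
    return event, dt

# Illustrative: four candidate vacancy jumps with placeholder rates (1/s)
print(kmc_step([1.0e6, 5.0e5, 2.0e6, 1.0e5]))
```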

  7. Validation of a Monte Carlo Based Depletion Methodology Using HFIR Post-Irradiation Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, David [ORNL; Maldonado, G Ivan [ORNL; Primm, Trent [ORNL

    2009-11-01

    Post-irradiation uranium isotopic atomic densities within the core of the High Flux Isotope Reactor (HFIR) were calculated and compared to uranium mass spectrographic data measured in the late 1960s and early 70s [1]. This study was performed in order to validate a Monte Carlo based depletion methodology for calculating the burn-up dependent nuclide inventory, specifically the post-irradiation uranium

  8. Laser-based detection and tracking moving objects using data-driven Markov chain Monte Carlo

    OpenAIRE

    Vu, Trung-Dung; Aycard, Olivier

    2009-01-01

    We present a method for simultaneous detection and tracking of moving objects from a moving vehicle equipped with a single layer laser scanner. A model-based approach is introduced to interpret the laser measurement sequence by hypotheses of moving object trajectories over a sliding window of time. Knowledge of various aspects, including the object model, measurement model and motion model, is integrated in one theoretically sound Bayesian framework. The data-driven Markov chain Monte Carlo (DDMCMC) tech...

  9. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  10. The influence of air cavities within the PTV on Monte Carlo-based IMRT optimization

    Energy Technology Data Exchange (ETDEWEB)

    Smedt, Bart de [Department of Medical Physics, Ghent University, Gent (Belgium); Vanderstraeten, Barbara [Department of Medical Physics, Ghent University, Gent (Belgium); Reynaert, Nick [Department of Medical Physics, Ghent University, Gent (Belgium); Gersem, Werner de [Department of Radiotherapy, Ghent University Hospital, Gent (Belgium); Neve, Wilfried de [Department of Radiotherapy, Ghent University Hospital, Gent (Belgium); Thierens, Hubert [Department of Medical Physics, Ghent University, Gent (Belgium)

    2007-06-15

    Integrating Monte Carlo calculated dose distributions into an iterative aperture-based IMRT optimization process can improve the final treatment plan. However, the influence of large air cavities in the planning target volume (PTV) on the outcome of the optimization process should not be underestimated. To study this influence, the treatment plan of an ethmoid sinus cancer patient, which has large air cavities included in the PTV, is iteratively optimized in two different situations, namely when the large air cavities are included in the PTV and when these air cavities are excluded from the PTV. Two optimization methods were applied to integrate the Monte Carlo calculated dose distributions into the optimization process, namely the 'Correction-method' and the 'Per Segment-method'. The 'Correction-method' takes the Monte Carlo calculated global dose distribution into account in the optimization process by means of a correction matrix, which is in fact a dose distribution equal to the difference between the Monte Carlo calculated global dose distribution and the global dose distribution calculated by a conventional dose calculation algorithm. The 'Per Segment-method' uses the Monte Carlo calculated dose distributions of the individual segments directly in the optimization process. Both methods tend to converge whether or not large air cavities are excluded from the PTV during the optimization process. However, the 'Per Segment-method' performs better than the 'Correction-method' in both situations, and the 'Per Segment-method' with the large air cavities excluded from the PTV leads to a better treatment plan than when these air cavities are included. Therefore we advise excluding large air cavities from the PTV and applying the 'Per Segment-method' to integrate the Monte Carlo dose calculations into an iterative aperture-based optimization process.
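    The two coupling schemes described above reduce to a few lines of array arithmetic once the dose distributions are available; a minimal sketch, assuming pre-computed dose arrays on a common voxel grid (the function and variable names are illustrative, not taken from the planning system):

```python
import numpy as np

def corrected_dose(d_conv_initial, d_mc_initial, d_conv_current):
    """'Correction-method': the fixed difference between the Monte Carlo and
    conventional dose of the initial plan is added to the conventional dose
    of the current iteration of the optimization."""
    correction = d_mc_initial - d_conv_initial
    return d_conv_current + correction

def per_segment_dose(d_mc_segments, weights):
    """'Per Segment-method': the total dose is a weighted sum of Monte Carlo
    dose distributions calculated separately for each aperture/segment."""
    d_mc_segments = np.asarray(d_mc_segments)   # shape (n_segments, n_voxels)
    weights = np.asarray(weights)               # current segment weights
    return weights @ d_mc_segments
```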

  11. Detailed parametrization of neutrino and gamma-ray energy spectra from high energy proton-proton interactions

    Science.gov (United States)

    Supanitsky, A. D.

    2016-02-01

    Gamma rays and neutrinos are produced as a result of proton-proton interactions that occur in different astrophysical contexts. The detection of these two types of messengers is of great importance for the study of different physical phenomena, related to nonthermal processes, taking place in different astrophysical scenarios. Therefore, the knowledge of the energy spectrum of these two types of particles, as a function of the incident proton energy, is essential for the interpretation of the observational data. In this paper, parametrizations of the energy spectra of gamma rays and neutrinos, originated in proton-proton collisions, are presented. The energy range of the incident protons considered extends from 10^2 to 10^8 GeV. The parametrizations are based on Monte Carlo simulations of proton-proton interactions performed with the hadronic interaction models QGSJET-II-04 and EPOS-LHC, which have recently been updated with the data taken by the Large Hadron Collider.

  12. Protonation Equilibrium of Linear Homopolyacids

    Directory of Open Access Journals (Sweden)

    Požar J.

    2015-07-01

    Full Text Available The paper presents a short summary of investigations dealing with the protonation equilibrium of linear homopolyacids, in particular those of high charge density. Apart from a review of experimental results which can be found in the literature, a brief description is given of the theoretical models used in processing the dependence of protonation constants on monomer dissociation degree and ionic strength (the cylindrical model based on the Poisson-Boltzmann equation, the cylindrical Stern model, and the models according to Ising, Högfeldt, Mandel and Katchalsky). The applicability of these models with regard to the polyion charge density, electrolyte concentration and counterion type is discussed. The results of Monte Carlo simulations of protonation equilibrium are also briefly mentioned. In addition, frequently encountered errors connected with the calibration of the glass electrode and the related unreliability of the determined protonation constants are pointed out.
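    The quantity analysed by the cited models is typically the apparent protonation constant extracted from the titration curve; a minimal statement of the relations involved, assuming α denotes the degree of dissociation of the monomer units:

    $$ \mathrm{p}K_{\mathrm{app}}(\alpha) = \mathrm{pH} + \log_{10}\frac{1-\alpha}{\alpha}, \qquad \mathrm{p}K_{\mathrm{app}}(\alpha) = \mathrm{p}K_{0} + \frac{1}{\ln 10}\,\frac{1}{RT}\,\frac{\partial G_{\mathrm{el}}}{\partial \alpha}, $$

    where pK_0 is the intrinsic constant of an isolated monomer and G_el is the electrostatic free energy per mole of monomer, which the cylindrical Poisson-Boltzmann and Stern models listed above are used to evaluate.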

  13. ERSN-OpenMC, a Java-based GUI for OpenMC Monte Carlo code

    Directory of Open Access Journals (Sweden)

    Jaafar EL Bakkali

    2016-07-01

    Full Text Available OpenMC is a new Monte Carlo particle transport simulation code focused on solving two types of neutronic problems, mainly k-eigenvalue criticality fission source problems and external fixed fission source problems. OpenMC does not have any graphical user interface, and one is provided by our Java-based application named ERSN-OpenMC. The main feature of this application is to provide users with an easy-to-use and flexible graphical interface to build better and faster simulations, with less effort and great reliability. Additionally, this graphical tool was developed with several features, such as the ability to automate the building process of the OpenMC code and related libraries, and users are given the freedom to customize their installation of this Monte Carlo code. A full description of the ERSN-OpenMC application is presented in this paper.

  14. Polarization imaging of multiply-scattered radiation based on integral-vector Monte Carlo method

    International Nuclear Information System (INIS)

    A new integral-vector Monte Carlo method (IVMCM) is developed to analyze the transfer of polarized radiation in 3D multiple scattering particle-laden media. The method is based on a 'successive order of scattering series' expression of the integral formulation of the vector radiative transfer equation (VRTE) for application of efficient statistical tools to improve convergence of Monte Carlo calculations of integrals. After validation against reference results in plane-parallel layer backscattering configurations, the model is applied to a cubic container filled with uniformly distributed monodispersed particles and irradiated by a monochromatic narrow collimated beam. 2D lateral images of effective Mueller matrix elements are calculated in the case of spherical and fractal aggregate particles. Detailed analysis of multiple scattering regimes, which are very similar for unpolarized radiation transfer, allows identifying the sensitivity of polarization imaging to size and morphology.

  15. Monte Carlo Methods in Materials Science Based on FLUKA and ROOT

    Science.gov (United States)

    Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor

    2003-01-01

    A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlos can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade) to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which Monte Carlo codes are particularly suited is the study of secondary radiation produced as albedoes in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials employing the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although the possible improvement of the DPMJET event generator for energies 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy ion interactions below 3 GeV/A. The ROOT interface is being developed in conjunction with the

  16. Proton exchange in acid–base complexes induced by reaction coordinates with heavy atom motions

    International Nuclear Information System (INIS)

    Highlights: ► Proton exchange in acid–base complexes is studied. ► The structures, binding energies, and normal mode vibrations are calculated. ► Transition state structures of proton exchange mechanism are determined. ► In the complexes studied, the reaction coordinate involves heavy atom rocking. ► The reaction coordinate is not simply localized in the proton movements. - Abstract: We extend previous work on nitric acid–ammonia and nitric acid–alkylamine complexes to illustrate that proton exchange reaction coordinates involve the rocking motion of the base moiety in many double hydrogen-bonded gas phase strong acid–strong base complexes. The complexes studied involve the biologically and atmospherically relevant glycine, formic, acetic, propionic, and sulfuric acids with ammonia/alkylamine bases. In these complexes, the magnitude of the imaginary frequencies associated with the proton exchange transition states are −1. This contrasts with widely studied proton exchange reactions between symmetric carboxylic acid dimers or asymmetric DNA base pair and their analogs where the reaction coordinate is localized in proton motions and the magnitude of the imaginary frequencies for the transition states are >1100 cm−1. Calculations on complexes of these acids with water are performed for comparison. Variations of normal vibration modes along the reaction coordinate in the complexes are described.

  17. High accuracy modeling for advanced nuclear reactor core designs using Monte Carlo based coupled calculations

    Science.gov (United States)

    Espel, Federico Puente

    The main objective of this PhD research is to develop a high accuracy modeling tool using a Monte Carlo based coupled system. The presented research comprises the development of models to include the thermal-hydraulic feedback to the Monte Carlo method and speed-up mechanisms to accelerate the Monte Carlo criticality calculation. Presently, deterministic codes based on the diffusion approximation of the Boltzmann transport equation, coupled with channel-based (or sub-channel based) thermal-hydraulic codes, carry out the three-dimensional (3-D) reactor core calculations of the Light Water Reactors (LWRs). These deterministic codes utilize nuclear homogenized data (normally over large spatial zones, consisting of fuel assembly or parts of fuel assembly, and in the best case, over small spatial zones, consisting of pin cell), which is functionalized in terms of thermal-hydraulic feedback parameters (in the form of off-line pre-generated cross-section libraries). High accuracy modeling is required for advanced nuclear reactor core designs that present increased geometry complexity and material heterogeneity. Such high-fidelity methods take advantage of the recent progress in computation technology and coupled neutron transport solutions with thermal-hydraulic feedback models on pin or even on sub-pin level (in terms of spatial scale). The continuous energy Monte Carlo method is well suited for solving such core environments with the detailed representation of the complicated 3-D problem. The major advantages of the Monte Carlo method over the deterministic methods are the continuous energy treatment and the exact 3-D geometry modeling. However, the Monte Carlo method involves vast computational time. The interest in Monte Carlo methods has increased thanks to the improvements of the capabilities of high performance computers. Coupled Monte-Carlo calculations can serve as reference solutions for verifying high-fidelity coupled deterministic neutron transport methods

  18. Oxide-based protonic conductors: Point defects and transport properties

    DEFF Research Database (Denmark)

    Bonanos, N.

    , hydrogen pumps, fuel cells, etc. The extent to which protonic defects form depends mainly on the partial pressure of water vapour, temperature and basicity of the constituent oxides, while their mobility depends, among other factors, on the metal-oxygen bond length and bond energy. The defect equilibria...... that determine the protonic concentrations are considered, with emphasis on the regime of low oxygen partial pressure. The measurement of the thermoelectric power (TEP) and of the H+/D+ isotope effect in conductivity are discussed as a means of characterising the conduction process. (C) 2001 Elsevier...

  19. Oxide-based protonic conductors: Point defects and transport properties

    DEFF Research Database (Denmark)

    Bonanos, N.

    2001-01-01

    , hydrogen pumps, fuel cells, etc. The extent to which protonic defects form depends mainly on the partial pressure of water vapour, temperature and basicity of the constituent oxides, while their mobility depends, among other factors, on the metal-oxygen bond length and bond energy. The defect equilibria...... that determine the protonic concentrations are considered, with emphasis on the regime of low oxygen partial pressure. The measurement of the thermoelectric power (TEP) and of the H+/D+ isotope effect in conductivity are discussed as a means of characterising the conduction process. (C) 2001 Elsevier...

  20. Comparing analytical and Monte Carlo optical diffusion models in phosphor-based X-ray detectors

    Science.gov (United States)

    Kalyvas, N.; Liaparinos, P.

    2014-03-01

    Luminescent materials are employed as radiation-to-light converters in detectors of medical imaging systems, often referred to as phosphor screens. Several processes affect the light transfer properties of phosphors, among the most important of which is the interaction of light within the material. Light attenuation (absorption and scattering) can be described either through "diffusion" theory in theoretical models or "quantum" theory in Monte Carlo methods. Although analytical methods, based on photon diffusion equations, have been preferentially employed to investigate optical diffusion in the past, Monte Carlo simulation models can overcome several of the analytical modelling assumptions. The present study aimed to compare both methodologies and investigate the dependence of the analytical model optical parameters on particle size. It was found that the optical photon attenuation coefficients calculated by analytical modeling decrease with particle size (in the region 1-12 μm). In addition, for particle sizes smaller than 6 μm there is no simultaneous agreement between the theoretical modulation transfer function and light escape values with respect to the Monte Carlo data.

  1. Experimental observation of acoustic emissions generated by a pulsed proton beam from a hospital-based clinical cyclotron

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Kevin C.; Solberg, Timothy D.; Avery, Stephen, E-mail: Stephen.Avery@uphs.upenn.edu [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States); Vander Stappen, François; Janssens, Guillaume; Prieels, Damien [Ion Beam Applications SA, Louvain-la-Neuve 1348 (Belgium); Bawiec, Christopher R.; Lewin, Peter A. [School of Biomedical Engineering, Drexel University, Philadelphia, Pennsylvania 19104 (United States); Sehgal, Chandra M. [Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)

    2015-12-15

    Purpose: To measure the acoustic signal generated by a pulsed proton spill from a hospital-based clinical cyclotron. Methods: An electronic function generator modulated the IBA C230 isochronous cyclotron to create a pulsed proton beam. The acoustic emissions generated by the proton beam were measured in water using a hydrophone. The acoustic measurements were repeated with increasing proton current and increasing distance between detector and beam. Results: The cyclotron generated proton spills with rise times of 18 μs and a maximum measured instantaneous proton current of 790 nA. Acoustic emissions generated by the proton energy deposition were measured to be on the order of mPa. The origin of the acoustic wave was identified as the proton beam based on the correlation between acoustic emission arrival time and distance between the hydrophone and proton beam. The acoustic frequency spectrum peaked at 10 kHz, and the acoustic pressure amplitude increased monotonically with increasing proton current. Conclusions: The authors report the first observation of acoustic emissions generated by a proton beam from a hospital-based clinical cyclotron. When modulated by an electronic function generator, the cyclotron is capable of creating proton spills with fast rise times (18 μs) and high instantaneous currents (790 nA). Measurements of the proton-generated acoustic emissions in a clinical setting may provide a method for in vivo proton range verification and patient monitoring.

  2. Experimental observation of acoustic emissions generated by a pulsed proton beam from a hospital-based clinical cyclotron

    International Nuclear Information System (INIS)

    Purpose: To measure the acoustic signal generated by a pulsed proton spill from a hospital-based clinical cyclotron. Methods: An electronic function generator modulated the IBA C230 isochronous cyclotron to create a pulsed proton beam. The acoustic emissions generated by the proton beam were measured in water using a hydrophone. The acoustic measurements were repeated with increasing proton current and increasing distance between detector and beam. Results: The cyclotron generated proton spills with rise times of 18 μs and a maximum measured instantaneous proton current of 790 nA. Acoustic emissions generated by the proton energy deposition were measured to be on the order of mPa. The origin of the acoustic wave was identified as the proton beam based on the correlation between acoustic emission arrival time and distance between the hydrophone and proton beam. The acoustic frequency spectrum peaked at 10 kHz, and the acoustic pressure amplitude increased monotonically with increasing proton current. Conclusions: The authors report the first observation of acoustic emissions generated by a proton beam from a hospital-based clinical cyclotron. When modulated by an electronic function generator, the cyclotron is capable of creating proton spills with fast rise times (18 μs) and high instantaneous currents (790 nA). Measurements of the proton-generated acoustic emissions in a clinical setting may provide a method for in vivo proton range verification and patient monitoring

  3. Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy

    Science.gov (United States)

    Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe

    2015-07-01

    The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on Graphics Processing Units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implants in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions. These functions were compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated from bGPUMCD showed excellent agreement (differences within 1.3%) with TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less for 1 × 1 × 1 mm3 calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that makes it possible to envisage the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.
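    The radial dose and anisotropy functions validated above enter the standard AAPM TG-43 dose-rate formalism, against which brachytherapy codes of this kind are typically benchmarked; in its line-source form:

    $$ \dot{D}(r,\theta) = S_K\,\Lambda\,\frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\,g_L(r)\,F(r,\theta), $$

    with air-kerma strength S_K, dose-rate constant Λ, line-source geometry function G_L, radial dose function g_L(r) and 2D anisotropy function F(r,θ), evaluated relative to the reference point r_0 = 1 cm, θ_0 = 90°.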

  4. State-of-the-art in Comprehensive Cascade Control Approach through Monte-Carlo Based Representation

    Directory of Open Access Journals (Sweden)

    A.H. Mazinan

    2015-10-01

    Full Text Available The research develops a comprehensive cascade control approach for spacecraft, in which a Monte-Carlo based representation is taken into consideration with respect to the state of the art. Conventional methods do not have sufficient merit to deal with such a process under control when a number of system parameter variations are used to represent realistic situations. The new insights in this research area are valuable for improving the performance of a class of spacecraft, and the acquired results are applicable in both real-world and academic environments. In short, a double closed loop combining a quaternion-based control approach with an Euler-based control approach handles the three-axis rotational angles and their rates synchronously, in association with pulse modulation analysis and control allocation, while the dynamics and kinematics of the system under control are analyzed. A series of experiments is carried out to assess the performance of the approach, in which the aforementioned Monte-Carlo based representation is used to verify the investigated outcomes.

  5. Design study of Be-target for proton accelerator based neutron source with 13MeV cyclotron

    International Nuclear Information System (INIS)

    There is a cyclotron named KIRAMS-13 at Pusan National University, Busan, Korea, which has a proton energy of 13 MeV and a beam current of 0.05 mA. Originally, it was developed for producing medical radioisotopes and for nuclear physics research. To improve the utilization of the facility, we are considering the possibility of installing a neutron generation target in it. A beryllium target has been considered; neutrons can be generated by the 9Be(p,n)9B reaction above the threshold proton energy of 2.057 MeV. In this presentation, we suggest candidate materials, structures, thicknesses, metal layers and cooling systems for a target optimized for KIRAMS-13. We chose a beryllium thickness of 1.14 mm, calculated from the stopping power of beryllium based on PSTAR (NIST). As for the cooling system, we chose water as the coolant, which will also act as a moderator. As protons pass through the target, hydrogen ions continue to pile up in the material, which makes it brittle. To solve this problem, we chose vanadium, because it has a high hydrogen diffusion coefficient and forms only short half-life isotopes after being activated by neutrons. We simulated the neutron characteristics with the Monte Carlo simulation code Geant4 (CERN) and performed a thermal analysis of the target. The design of the target system is very important for producing neutrons for the desired purposes. There are several other existing facilities in Korea, in addition to the cyclotron facility considered in this study, where a new neutron target system could be installed and neutrons generated. Two prominent facilities are KOMAC, Gyeongju and RFT-30, Jeongeup, and we are planning to study the possibilities of utilizing these accelerators for neutron generation.
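    The quoted 1.14 mm beryllium thickness corresponds to the range of 13 MeV protons obtained by integrating the reciprocal stopping power; a minimal sketch of that calculation, assuming a PSTAR-style table supplied by the user (the two table values below are placeholders, not PSTAR data):

```python
import numpy as np
from scipy.integrate import quad
from scipy.interpolate import interp1d

# Placeholder PSTAR-style table of proton mass stopping power in beryllium:
# energy (MeV) vs. stopping power (MeV cm^2/g). Replace with real PSTAR data.
energy_mev = np.array([1.0, 13.0])
stopping_mev_cm2_g = np.array([180.0, 30.0])
rho_be = 1.848                                   # g/cm^3, density of beryllium

S = interp1d(energy_mev, stopping_mev_cm2_g)     # S(E) by interpolation

def csda_range_cm(e_max):
    """CSDA range R = integral of dE / (rho * S(E)), here from the lowest
    tabulated energy up to e_max (the contribution below 1 MeV is small)."""
    integrand = lambda e: 1.0 / (rho_be * S(e))
    r, _ = quad(integrand, energy_mev[0], e_max)
    return r

print(csda_range_cm(13.0) * 10.0, "mm (placeholder stopping powers)")
```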

  6. Proton conductive membranes based on doped sulfonated polytriazole

    Energy Technology Data Exchange (ETDEWEB)

    Boaventura, M.; Brandao, L.; Mendes, A. [Laboratorio de Engenharia de Processos, Ambiente e Energia (LEPAE), Faculdade de Engenharia da Universidade do Porto, Rua Roberto Frias, 4200-465 Porto (Portugal); Ponce, M.L.; Nunes, S.P. [GKSS Research Centre Geesthacht GmbH, Max Planck Str. 1, D-21502, Geesthacht (Germany)

    2010-11-15

    This work reports the preparation and characterization of proton conducting sulfonated polytriazole membranes doped with three different agents: 1H-benzimidazole-2-sulfonic acid, benzimidazole and phosphoric acid. The modified membranes were characterized by scanning electron microscopy (SEM), infrared spectra, thermogravimetric analysis (TGA), dynamical mechanical thermal analysis (DMTA) and electrochemical impedance spectroscopy (EIS). The addition of doping agents resulted in a decrease of the glass transition temperature. For membranes doped with 85 wt.% phosphoric acid solution, proton conductivity increased up to 2 × 10^-3 S cm^-1 at 120 °C and at 5% relative humidity. The performance of the phosphoric acid doped membranes was evaluated in a fuel cell set-up at 120 °C and 2.5% relative humidity. (author)

  7. Channel capacity study of underwater wireless optical communications links based on Monte Carlo simulation

    International Nuclear Information System (INIS)

    Channel capacity of ocean water is limited by propagation distance and optical properties. Previous studies of this problem are based on water-tank experiments with different amounts of Maalox antacid. However, the propagation distance is limited by the experimental set-up and the optical properties differ from those of ocean water. Therefore, the experimental results are not accurate enough for the physical design of underwater wireless communication links. This letter develops a Monte Carlo model to study the channel capacity of underwater optical communications. Moreover, this model can flexibly configure various parameters of the transmitter, receiver and channel, and is suitable for the physical design of underwater optical communication links. (paper)
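    A Monte Carlo channel model of this kind typically propagates individual photons over exponentially distributed interaction lengths and splits each interaction into absorption or scattering according to the water's inherent optical properties; a deliberately simplified on-axis sketch (the coefficients are placeholders and angular redistribution is ignored):

```python
import numpy as np

rng = np.random.default_rng(1)

def received_fraction(n_photons, link_m, a=0.05, b=0.02):
    """Estimate the fraction of photons surviving a link of length link_m (m),
    given an absorption coefficient a and scattering coefficient b (1/m).
    Scattering is treated as purely forward (the photon keeps travelling
    toward the receiver), so geometric losses are ignored in this sketch."""
    c = a + b                                    # beam attenuation coefficient
    received = 0
    for _ in range(n_photons):
        z, alive = 0.0, True
        while alive and z < link_m:
            z += rng.exponential(1.0 / c)        # distance to next interaction
            if z < link_m and rng.random() < a / c:
                alive = False                    # photon absorbed in the water
        received += alive
    return received / n_photons

# Survival over a 20 m link; a capacity estimate would then feed the received
# photon rate into a noise/bandwidth model.
print(received_fraction(100_000, link_m=20.0))
```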

  8. A new Monte-Carlo based simulation for the CryoEDM experiment

    OpenAIRE

    Raso-Barnett, Matthew

    2015-01-01

    This thesis presents a new Monte-Carlo based simulation of the physics of ultra-cold neutrons (UCN) in complex geometries and its application to the CryoEDM experiment. It includes a detailed description of the design and performance of this simulation along with its use in a project to study the magnetic depolarisation time of UCN within the apparatus due to magnetic impurities in the measurement cell, which is a crucial parameter in the sensitivity of a neutron electric dipole moment (nEDM) ...

  9. CARMEN: a Monte Carlo-based system using linear programming from direct apertures

    International Nuclear Information System (INIS)

    The use of Monte Carlo (MC) methods has shown an improvement in the accuracy of dose calculation compared to the analytical algorithms installed in commercial treatment planning systems, especially in the case of non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, called CARMEN, is based on full simulation of both the beam transport in the accelerator head and in the patient, with the simulation designed for efficient operation in terms of the accuracy of the estimate and the required computation times. (Author)

  10. Measurement of the proton light response of various LAB based scintillators and its implication for supernova neutrino detection via neutrino-proton scattering

    Energy Technology Data Exchange (ETDEWEB)

    Krosigk, B. von; Zuber, K. [Technische Universitaet Dresden, Institut fuer Kern- und Teilchenphysik, Dresden (Germany); Neumann, L. [Technische Universitaet Dresden, Institut fuer Kern- und Teilchenphysik, Dresden (Germany); Karlsruher Institut fuer Technologie, Institut fuer Experimentelle Kernphysik, Karlsruhe (Germany); Nolte, R.; Roettger, S. [Physikalisch-Technische Bundesanstalt (PTB), Braunschweig (Germany)

    2013-04-15

    The proton light output function in electron-equivalent energy of various scintillators based on linear alkylbenzene (LAB) has been measured in the energy range from 1 MeV to 17.15 MeV for the first time. The measurement was performed at the Physikalisch-Technische Bundesanstalt (PTB) using a neutron beam with continuous energy distribution. The proton light output data is extracted from proton recoil spectra originating from neutron-proton scattering in the scintillator. The functional behavior of the proton light output is described successfully by Birks' law with a Birks constant kB between (0.0094 ± 0.0002) cm MeV^-1 and (0.0098 ± 0.0003) cm MeV^-1 for the different LAB solutions. The constant C, parameterizing the quadratic term in the generalized Birks law, is consistent with zero for all investigated scintillators with an upper limit (95% CL) of about 10^-7 cm^2 MeV^-2. The resulting quenching factors are especially important for future planned supernova neutrino detection based on the elastic scattering of neutrinos on protons. The impact of proton quenching on the supernova event yield from neutrino-proton scattering is discussed. (orig.)
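    For reference, the generalized Birks relation fitted above, in the notation of the abstract:

    $$ L(E) = A \int_{0}^{E} \frac{\mathrm{d}E'}{1 + kB\,\frac{\mathrm{d}E'}{\mathrm{d}x} + C\left(\frac{\mathrm{d}E'}{\mathrm{d}x}\right)^{2}}, $$

    where A is the absolute scintillation efficiency, dE'/dx the proton stopping power, kB the Birks constant quoted above and C the quadratic coefficient found to be consistent with zero.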

  11. Monte Carlo simulation optimisation of zinc sulphide based fast-neutron detector for radiography using a 252Cf source

    Science.gov (United States)

    Meshkian, Mohsen

    2016-02-01

    Neutron radiography is rapidly extending as one of the methods for non-destructive screening of materials. There are various parameters to be studied for optimising imaging screens and image quality for different fast-neutron radiography systems. Herein, a Geant4 Monte Carlo simulation is employed to evaluate the response of a fast-neutron radiography system using a 252Cf neutron source. The neutron radiography system is comprised of a moderator as the neutron-to-proton converter with suspended silver-activated zinc sulphide (ZnS(Ag)) as the phosphor material. The neutron-induced protons deposit energy in the phosphor which consequently emits scintillation light. Further, radiographs are obtained by simulating the overall radiography system including source and sample. Two different standard samples are used to evaluate the quality of the radiographs.

  12. Laboratory Report (LR) to the paper Foundation of an analytical proton beamlet model for inclusion in a general proton dose calculation system [arXiv:1009.0832]

    OpenAIRE

    Ulmer, W.; Schaffner, B.

    2010-01-01

    We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher energy depth dose curves, is modeled by inclusion of two different origins: 1. S...

  13. Research on Reliability Modelling Method of Machining Center Based on Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Chuanhai Chen

    2013-03-01

    Full Text Available The aim of this study is to obtain the reliability of a series system and to analyze the reliability of a machining center, so a modified reliability modelling method based on Monte Carlo simulation for series systems is proposed. The reliability function, which is built by the classical statistical method based on the assumption that machine tools are repaired as good as new, may be biased in the real case. The reliability functions of the subsystems are established respectively, and the reliability model is then built according to the reliability block diagram. The fitted reliability function of the machine tools is then established using sample failure data generated by Monte Carlo simulation, and its inverse reliability function is solved by a linearization technique based on radial basis functions. Finally, an example of the machining center is presented using the proposed method to show its potential application. The analysis results show that the proposed method can provide an accurate reliability model compared with the conventional method.
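    Under the series-system assumption used here, the machining center fails as soon as any subsystem fails, so the system reliability is the product of the subsystem reliabilities. A minimal Monte Carlo sketch of generating sample failure data under that assumption (the Weibull parameters are illustrative placeholders, not fitted values):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative Weibull (shape, scale in hours) parameters for each subsystem.
subsystems = [(1.5, 4000.0), (1.1, 6000.0), (2.0, 9000.0)]

def sample_system_failures(n):
    """Series system: the machining center fails at the earliest subsystem
    failure, so each sampled system failure time is a minimum."""
    times = np.column_stack([scale * rng.weibull(shape, n)
                             for shape, scale in subsystems])
    return times.min(axis=1)

def empirical_reliability(failure_times, t):
    """R(t) estimated as the fraction of sampled machines still working at t."""
    return np.mean(failure_times > t)

samples = sample_system_failures(100_000)
print(empirical_reliability(samples, 1000.0))   # e.g. reliability at 1000 h
```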

  14. GPU-based Monte Carlo radiotherapy dose calculation using phase-space sources

    CERN Document Server

    Townson, Reid; Tian, Zhen; Graves, Yan Jiang; Zavgorodni, Sergei; Jiang, Steve B

    2013-01-01

    A novel phase-space source implementation has been designed for GPU-based Monte Carlo dose calculation engines. Due to the parallelized nature of GPU hardware, it is essential to simultaneously transport particles of the same type and similar energies but separated spatially to yield a high efficiency. We present three methods for phase-space implementation that have been integrated into the most recent version of the GPU-based Monte Carlo radiotherapy dose calculation package gDPM v3.0. The first method is to sequentially read particles from a patient-dependent phase-space and sort them on-the-fly based on particle type and energy. The second method supplements this with a simple secondary collimator model and fluence map implementation so that patient-independent phase-space sources can be used. Finally, as the third method (called the phase-space-let, or PSL, method) we introduce a novel strategy to pre-process patient-independent phase-spaces and bin particles by type, energy and position. Position bins l...

  15. PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation

    Energy Technology Data Exchange (ETDEWEB)

    Espana, S; Herraiz, J L; Vicente, E; Udias, J M [Grupo de Fisica Nuclear, Departmento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, Madrid (Spain); Vaquero, J J; Desco, M [Unidad de Medicina y CirugIa Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es

    2009-03-21

    Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.

  16. Proton therapy

    International Nuclear Information System (INIS)

    Proton therapy has become a subject of considerable interest in the radiation oncology community and it is expected that there will be a substantial growth in proton treatment facilities during the next decade. I was asked to write a historical review of proton therapy based on my personal experiences, which have all occurred in the United States, so therefore I have a somewhat parochial point of view. Space requirements did not permit me to mention all of the existing proton therapy facilities or the names of all of those who have contributed to proton therapy. (review)

  17. Monte Carlo simulation of a compact microbeam radiotherapy system based on carbon nanotube field emission technology

    International Nuclear Information System (INIS)

    Purpose: Microbeam radiation therapy (MRT) is an experimental radiotherapy technique that has shown potent antitumor effects with minimal damage to normal tissue in animal studies. This unique form of radiation is currently only produced in a few large synchrotron accelerator research facilities in the world. To promote widespread translational research on this promising treatment technology we have proposed and are in the initial development stages of a compact MRT system that is based on carbon nanotube field emission x-ray technology. We report on a Monte Carlo based feasibility study of the compact MRT system design. Methods: Monte Carlo calculations were performed using EGSnrc-based codes. The proposed small animal research MRT device design includes carbon nanotube cathodes shaped to match the corresponding MRT collimator apertures, a common reflection anode with filter, and a MRT collimator. Each collimator aperture is sized to deliver a beam width ranging from 30 to 200 μm at 18.6 cm source-to-axis distance. Design parameters studied with Monte Carlo include electron energy, cathode design, anode angle, filtration, and collimator design. Calculations were performed for single and multibeam configurations. Results: Increasing the energy from 100 kVp to 160 kVp increased the photon fluence through the collimator by a factor of 1.7. Both energies produced a largely uniform fluence along the long dimension of the microbeam, with 5% decreases in intensity near the edges. The isocentric dose rate for 160 kVp was calculated to be 700 Gy/min/A in the center of a 3 cm diameter target. Scatter contributions resulting from collimator size were found to produce only small (<7%) changes in the dose rate for field widths greater than 50 μm. Dose vs depth was weakly dependent on filtration material. The peak-to-valley ratio varied from 10 to 100 as the separation between adjacent microbeams varies from 150 to 1000 μm. Conclusions: Monte Carlo simulations demonstrate

  18. A Proposed Experimental Test of Proton-Driven Plasma Wakefield Acceleration Based on CERN SPS

    CERN Document Server

    Xia, G X; Lotov, K; Pukhov, A; Assmann, R; Zimmermann, F; Huang, C; Vieira, J; Lopes, N; Fonseca, RA; Silva, LO; An, W; Joshi, C; Mori, W; Lu, W; Muggli, P

    2011-01-01

    Proton-bunch driven plasma wakefield acceleration (PDPWA) has been proposed as an approach to accelerate an electron beam to the TeV energy regime in a single plasma section. An experimental test has recently been proposed to demonstrate the capability of PDPWA using proton beams from the CERN SPS. The layout of the experiment is introduced. Particle-in-cell simulation results based on realistic beam parameters are presented. Presented at PAC2011 New York, 28 March - 1 April 2011.

  19. Four-dimensional superquadric-based cardiac phantom for Monte Carlo simulation of radiological imaging systems

    International Nuclear Information System (INIS)

    A four-dimensional (x, y, z, t) composite superquadric-based object model of the human heart for Monte Carlo simulation of radiological imaging systems has been developed. The phantom models the real temporal geometric conditions of a beating heart for frame rates up to 32 per cardiac cycle. Phantom objects are described by Boolean combinations of superquadric ellipsoid sections. Moving spherical coordinate systems are chosen to model wall movement, whereby points of the ventricle and atria walls are assumed to move towards a moving center-of-gravity point. Due to the non-static coordinate systems, the atrial/ventricular valve plane of the mathematical heart phantom moves up and down along the left ventricular long axis, resulting in reciprocal emptying and filling of atria and ventricles. Compared to the base movement, the epicardial apex as well as the superior atrial area are almost fixed in space. Since geometric parameters of the objects are applied directly in intersection calculations of the photon ray with object boundaries during Monte Carlo simulation, no phantom discretization artifacts are involved
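    The superquadric ellipsoid sections mentioned above are defined by the usual superellipsoid inside-outside function; in a common parameterisation with semi-axes a, b, c and shape exponents ε_1, ε_2:

    $$ F(x,y,z) = \left(\left|\frac{x}{a}\right|^{2/\varepsilon_2} + \left|\frac{y}{b}\right|^{2/\varepsilon_2}\right)^{\varepsilon_2/\varepsilon_1} + \left|\frac{z}{c}\right|^{2/\varepsilon_1}, $$

    with the surface at F = 1 and the interior at F < 1; Boolean combinations of such primitives, truncated to sections and attached to the moving coordinate systems, build up the chambers, and ray-surface intersections can be computed analytically during photon tracking.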

  20. An intense neutron generator based on a proton accelerator

    International Nuclear Information System (INIS)

    A study has been made of the demand for a neutron facility with a thermal flux of ≥ 10^16 n cm^-2 s^-1 and of possible methods of producing such fluxes with existing or presently developing technology. Experimental projects proposed by neutron users requiring high fluxes call for neutrons of all energies from thermal to 100 MeV with both continuous-wave and pulsed output. Consideration of the heat generated in the source per useful neutron liberated shows that the (p,xn) reaction with 400-1000 MeV bombarding energies and heavy element targets (e.g. bismuth, lead) is capable of greater specific source strength than other possible methods realizable within the time scale. A preliminary parameter optimization, carried through for the accelerator currently promising the greatest economy (the separated orbit cyclotron or S.O.C.), reveals that a facility delivering a proton beam of about 65 mA at about 1 BeV would satisfy the flux requirement with a neutron cost significantly more favourable than that projected for a high flux reactor. It is suggested that a proton storage ring providing post-acceleration pulsing of the proton beam should be developed for the facility. With this elaboration, and by taking advantage of the intrinsic microscopic pulse structure provided by the radio frequency duty cycle, a very versatile source may be devised capable of producing multiple beams of continuous and pulsed neutrons with a wide range of energies and pulse widths. The source promises to be of great value for high flux irradiations and as a pilot facility for advanced reactor technology. The proposed proton accelerator also constitutes a meson source capable of producing beams of π and μ mesons and of neutrinos orders of magnitude more intense than those of any accelerator presently in use. These beams, which can be produced simultaneously with the neutron beams, open vast areas of new research in fundamental nuclear structure, elementary particle physics, and perhaps also in

  1. Avalanche proton-boron fusion based on elastic nuclear collisions

    Science.gov (United States)

    Eliezer, Shalom; Hora, Heinrich; Korn, Georg; Nissim, Noaz; Martinez Val, Josè Maria

    2016-05-01

    Recent experiments done at Prague with the 600 J/0.2 ns PALS laser interacting with a layer of boron dopants in a hydrogen-enriched target have produced around 10^9 alphas. We suggest that these unexpectedly high fusion reaction yields of protons with 11B indicate an avalanche multiplication behind the measured anomalously high nuclear reaction yields. This can be explained by elastic nuclear collisions in the broad 600 keV energy band, which is coincident with the high nuclear p-11B fusion cross section, by way of multiplication through generation of three secondary alpha particles from a single primarily produced alpha particle.

  2. Inverse treatment planning for radiation therapy based on fast Monte Carlo dose calculation

    International Nuclear Information System (INIS)

    An inverse treatment planning system based on fast Monte Carlo (MC) dose calculation is presented. It allows optimisation of intensity modulated dose distributions in 15 to 60 minutes on present-day personal computers. If a multi-processor machine is available, parallel simulation of particle histories is also possible, leading to further calculation time reductions. The optimisation process is divided into two stages. The first stage results in intensity profiles based on pencil beam (PB) dose calculation. The second stage starts with MC verification and post-optimisation of the PB dose and fluence distributions. Because of the potential to accurately model beam modifiers, MC based inverse planning systems are able to optimise compensator thicknesses and leaf trajectories instead of intensity profiles only. The corresponding techniques, whose implementation is the subject of future work, are also presented here. (orig.)
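    The first optimisation stage can be illustrated with a minimal fluence-optimisation sketch. This is not the authors' implementation: the dose-influence matrix below is random toy data standing in for a pencil-beam calculation, and simple projected gradient descent stands in for whatever optimiser the system actually uses.

```python
# Minimal sketch of the first optimisation stage: beamlet intensities are
# optimised against a prescribed dose using a dose-influence matrix. The
# matrix here is random toy data (not a pencil-beam calculation), and
# projected gradient descent is a stand-in optimiser.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_beamlets = 300, 40
D = rng.random((n_voxels, n_beamlets)) * 0.05     # toy dose-influence matrix
target = np.ones(n_voxels)                        # prescribed dose (arbitrary units)

w = np.ones(n_beamlets)                           # beamlet intensity profile
step = 0.5 / np.linalg.norm(D.T @ D, ord=2)       # conservative gradient step size
for _ in range(2000):
    grad = D.T @ (D @ w - target)                 # gradient of 0.5 * ||D w - target||^2
    w = np.maximum(w - step * grad, 0.0)          # keep intensities non-negative

dose = D @ w
print("mean dose %.3f, rms deviation %.3f"
      % (dose.mean(), np.sqrt(np.mean((dose - target) ** 2))))
```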

  3. GPU-based high performance Monte Carlo simulation in neutron transport

    Energy Technology Data Exchange (ETDEWEB)

    Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Inteligencia Artificial Aplicada], e-mail: cmnap@ien.gov.br

    2009-07-01

    Graphics Processing Units (GPU) are high performance co-processors originally intended to improve the use and quality of computer graphics applications. Since researchers and practitioners realized the potential of using GPUs for general-purpose computing, their application has been extended to other fields outside the scope of computer graphics. The main objective of this work is to evaluate the impact of using GPUs in neutron transport simulation by the Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple, but time-consuming, problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)

  4. IMPROVED ALGORITHM FOR ROAD REGION SEGMENTATION BASED ON SEQUENTIAL MONTE-CARLO ESTIMATION

    Directory of Open Access Journals (Sweden)

    Zdenek Prochazka

    2014-12-01

    In recent years, many researchers and car makers have put intensive effort into the development of autonomous driving systems. Since visual information is the main modality used by human drivers, a camera mounted on a moving platform is a very important kind of sensor, and various computer vision algorithms to handle the vehicle's surroundings are under intensive research. Our final goal is to develop a vision-based lane detection system able to handle various types of road shapes, working on both structured and unstructured roads, ideally in the presence of shadows. This paper presents a modified road region segmentation algorithm based on sequential Monte-Carlo estimation. A detailed description of the algorithm is given, and evaluation results show that the proposed algorithm outperforms the segmentation algorithm developed as part of our previous work, as well as a conventional algorithm based on colour histograms.
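    The sequential Monte-Carlo (particle filter) idea underlying the algorithm can be sketched in a few lines. This is not the paper's algorithm: the state is reduced to a single road-boundary coordinate, and the likelihood below is a hypothetical stand-in for the colour and edge cues a real segmentation method would use.

```python
# Sketch of sequential Monte-Carlo (particle filter) estimation of one
# road-boundary coordinate; the likelihood is a stand-in measurement model.
import numpy as np

rng = np.random.default_rng(1)
N = 500
particles = rng.uniform(0.0, 640.0, N)        # boundary-column hypotheses (pixels)
weights = np.full(N, 1.0 / N)

def likelihood(columns, observed_edge, sigma=25.0):
    # Hypothetical measurement model: agreement with an observed edge response.
    return np.exp(-0.5 * ((columns - observed_edge) / sigma) ** 2)

for observed_edge in (320.0, 330.0, 345.0):   # fake per-frame observations
    particles += rng.normal(0.0, 10.0, N)     # predict: random-walk motion model
    weights *= likelihood(particles, observed_edge)
    weights /= weights.sum()
    if 1.0 / np.sum(weights ** 2) < N / 2:    # resample on low effective sample size
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    print("estimated boundary column: %.1f" % np.average(particles, weights=weights))
```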

  5. Pattern-oriented Agent-based Monte Carlo simulation of Cellular Redox Environment

    DEFF Research Database (Denmark)

    Tang, Jiaowei; Holcombe, Mike; Boonen, Harrie C.M.

    In this project, agent-based Monte Carlo modeling is used to study the dynamic relationship between extracellular and intracellular redox and the complex networks of redox reactions. Pivotal redox-related reactions and their reactants are included in the model. Because the complex networks and dynamics of redox are still not completely understood, results of existing experiments will be used to validate the model, following the ideas of pattern-oriented agent-based modeling. The simulation of this model is computationally intensive; thus the application FLAME is used.

  6. GPU-based high performance Monte Carlo simulation in neutron transport

    International Nuclear Information System (INIS)

    Graphics Processing Units (GPU) are high performance co-processors originally intended to improve the use and quality of computer graphics applications. Since researchers and practitioners realized the potential of using GPUs for general-purpose computing, their application has been extended to other fields outside the scope of computer graphics. The main objective of this work is to evaluate the impact of using GPUs in neutron transport simulation by the Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple, but time-consuming, problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)

  7. Laboratory Report (LR) to the paper Foundation of an analytical proton beamlet model for inclusion in a general proton dose calculation system [arXiv:1009.0832]

    CERN Document Server

    Ulmer, W

    2010-01-01

    We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher energy depth dose curves, is modeled by inclusion of two different origins: 1. Secondary reaction protons with a contribution of ca. 65 % of the buildup (for monoenergetic protons). 2. Landau tails as well as Gaussian type of fluctuations for range straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked by them. Furthermore, there are a few parameters, which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline - the most important b...

  8. CARMEN: a Monte Carlo planning system based on linear programming from direct apertures

    Energy Technology Data Exchange (ETDEWEB)

    Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.

    2013-07-01

    The use of Monte Carlo (MC) methods has shown an improvement in the accuracy of dose calculation compared with other analytical algorithms implemented in commercial treatment planning systems, especially in the case of non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, called CARMEN, is based on full simulation of both the beam transport in the accelerator head and in the patient, and is designed for efficient operation in terms of the accuracy of the estimate and the required computation times. (Author)

  9. SQUID-based beam position monitoring for proton EDM experiment

    Science.gov (United States)

    Haciomeroglu, Selcuk

    2014-09-01

    One of the major systematic errors in the proton EDM experiment is the radial B-field, since it couples to the magnetic dipole moment and causes a vertical spin precession. For a proton with an EDM at the level of 10^-29 e·cm, 0.22 pG of B-field and 10.5 MV/m of E-field cause the same vertical spin precession. On the other hand, the radial B-field splits the counter-rotating beams depending on the vertical focusing strength in the ring. The magnetic field due to this split, modulated at a few kHz, can be measured by a SQUID magnetometer. This measurement requires the B-field to be kept below 1 nT everywhere around the ring using shields of mu-metal and aluminum layers. The SQUID measurements then involve noise from three sources: outside the shields, the shields themselves, and the beam. We study these three sources of noise using an electric circuit (mimicking the beam) inside a magnetic shielding room which consists of two mu-metal layers and an aluminum layer.

  10. Insight into proton transfer in phosphotungstic acid functionalized mesoporous silica-based proton exchange membrane fuel cells.

    Science.gov (United States)

    Zhou, Yuhua; Yang, Jing; Su, Haibin; Zeng, Jie; Jiang, San Ping; Goddard, William A

    2014-04-01

    We have developed for fuel cells a novel proton exchange membrane (PEM) using inorganic phosphotungstic acid (HPW) as proton carrier and mesoporous silica as matrix (HPW-meso-silica). The proton conductivity measured by electrochemical impedance spectroscopy is 0.11 S cm(-1) at 90 °C and 100% relative humidity (RH) with a low activation energy of ∼14 kJ mol(-1). In order to determine the energetics associated with proton migration within the HPW-meso-silica PEM and to determine the mechanism of proton hopping, we report density functional theory (DFT) calculations using the generalized gradient approximation (GGA). These DFT calculations revealed that the proton transfer process involves both intramolecular and intermolecular proton transfer pathways. When the adjacent HPWs are close (less than 17.0 Å apart), the calculated activation energy for intramolecular proton transfer within a HPW molecule is higher (29.1-18.8 kJ/mol) than the barrier for intermolecular proton transfer along the hydrogen bond. We find that the overall barrier for proton movement within the HPW-meso-silica membranes is determined by the intramolecular proton transfer pathway, which explains why the proton conductivity remains unchanged when the weight percentage of HPW on meso-silica is above 67 wt %. In contrast, the activation energy of proton transfer on a clean SiO2 (111) surface is computed to be as high as ∼40 kJ mol(-1), confirming the very low proton conductivity on clean silica surfaces observed experimentally. PMID:24628538

  11. Effects of 1-MeV proton irradiation in Hg-based cuprate thin films

    International Nuclear Information System (INIS)

    We have studied the effects of 1-MeV proton irradiation on both the superconducting properties and the normal state resistivity of high-quality HgBa2CaCu2O6+δ (Hg-1212) and HgBa2Ca2Cu3O8+δ (Hg-1223) thin films. At low proton doses, we observed a linear decrease of the superconducting transition temperature Tc and a linear increase of the extrapolated residual resistivity as the proton dose is increased. This is consistent with observations of other high-Tc superconductors, while a lower dose threshold for suppressing the superconductivity is found in Hg-1212 and Hg-1223 films. To explain the linear dose dependence of Tc, we propose a model based on the proximity effect. An enhancement of up to 90% in the critical current density at low fields has also been observed in these films at low proton fluences that do not significantly degrade Tc. copyright 1997 The American Physical Society

  12. Monte Carlo calculation based on hydrogen composition of the tissue for MV photon radiotherapy.

    Science.gov (United States)

    Demol, Benjamin; Viard, Romain; Reynaert, Nick

    2015-01-01

    The purpose of this study was to demonstrate that Monte Carlo treatment planning systems require tissue characterization (density and composition) as a function of CT number. A discrete set of tissue classes with a specific composition is introduced. In the current work we demonstrate that, for megavoltage photon radiotherapy, only the hydrogen content of the different tissues is of interest. This conclusion might have an impact on MRI-based dose calculations and on MVCT calibration using tissue substitutes. A stoichiometric calibration was performed, grouping tissues with similar atomic composition into 15 dosimetrically equivalent subsets. To demonstrate the importance of hydrogen, a new scheme was derived, with correct hydrogen content, complemented by oxygen (all elements differing from hydrogen are replaced by oxygen). Mass attenuation coefficients and mass stopping powers for this scheme were calculated and compared to the original scheme. Twenty-five CyberKnife treatment plans were recalculated by an in-house developed Monte Carlo system using tissue density and hydrogen content derived from the CT images. The results were compared to Monte Carlo simulations using the original stoichiometric calibration. Between 300 keV and 3 MeV, the relative difference of mass attenuation coefficients is under 1% within all subsets. Between 10 keV and 20 MeV, the relative difference of mass stopping powers goes up to 5% in hard bone and remains below 2% for all other tissue subsets. Dose-volume histograms (DVHs) of the treatment plans present no visual difference between the two schemes. Relative differences of dose indexes D98, D95, D50, D05, D02, and Dmean were analyzed and a distribution centered around zero and of standard deviation below 2% (3 σ) was established. On the other hand, once the hydrogen content is slightly modified, important dose differences are obtained. Monte Carlo dose planning in the field of megavoltage photon radiotherapy is fully achievable using
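    The comparison between the full stoichiometric composition and the simplified hydrogen-plus-oxygen scheme rests on the elemental mixture rule, which the following sketch illustrates. The elemental (μ/ρ) values and the tissue composition are placeholders, not NIST or ICRU data.

```python
# Sketch of the elemental mixture rule: a tissue's mass attenuation coefficient
# is the weight-fraction-weighted sum of elemental values. The (mu/rho) numbers
# below are placeholders, not NIST data.
def mixture_mu_rho(weight_fractions, mu_rho_elements):
    """Mass attenuation coefficient of a mixture: sum_i w_i * (mu/rho)_i."""
    return sum(w * mu_rho_elements[el] for el, w in weight_fractions.items())

mu_rho = {"H": 0.100, "C": 0.050, "O": 0.051, "Ca": 0.060}   # cm^2/g, illustrative

soft_tissue = {"H": 0.105, "C": 0.256, "O": 0.602, "Ca": 0.037}
# Simplified scheme: correct hydrogen content, everything else replaced by oxygen
simplified  = {"H": 0.105, "O": 0.895}

print("full composition :", mixture_mu_rho(soft_tissue, mu_rho))
print("hydrogen + oxygen:", mixture_mu_rho(simplified, mu_rho))
```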

  13. Review of dynamical models for external dose calculations based on Monte Carlo simulations in urbanised areas

    International Nuclear Information System (INIS)

    After an accidental release of radionuclides to the inhabited environment, the external gamma irradiation from deposited radioactivity contributes significantly to the radiation exposure of the population for extended periods. For evaluating this exposure pathway, three main modelling elements are required: (i) to calculate the air kerma value per photon emitted per unit source area, based on Monte Carlo (MC) simulations; (ii) to describe the distribution and dynamics of radionuclides on the diverse urban surfaces; and (iii) to combine all these elements in a relevant urban model to calculate the resulting doses according to the actual scenario. This paper provides an overview of the different approaches to calculate photon transport in urban areas and of several published dose calculation codes. Two types of Monte Carlo simulations are presented, using the global and the local approaches of photon transport. Moreover, two different philosophies of the dose calculation, the 'location factor method' and a combination of relative contamination of surfaces with air kerma values, are described. The main features of six codes (ECOSYS, EDEM2M, EXPURT, PARATI, TEMAS, URGENT) are highlighted together with a short model-to-model feature intercomparison

  14. A CNS calculation line based on a Monte-Carlo method

    International Nuclear Information System (INIS)

    The neutronic design of the moderator cell of a Cold Neutron Source (CNS) involves many different considerations regarding geometry, location, and materials. The decisions taken in this regard affect not only the neutron flux in the source neighbourhood, which can be evaluated by a standard deterministic method, but also the neutron flux values in experimental positions far away from the neutron source. At long distances from the CNS, very time-consuming 3D deterministic methods or Monte Carlo transport methods are necessary in order to get accurate figures of standard and typical magnitudes such as average neutron flux, neutron current, angular flux, and luminosity. The Monte Carlo method is a unique and powerful tool to calculate the transport of neutrons and photons. Its use in a bootstrap scheme appears to be an appropriate solution for this type of system. The use of MCNP as the main neutronic design tool leads to a fast and reliable method to perform calculations in a relatively short time with low statistical errors, if the proper scheme is applied. The design goal is to evaluate the performance of the CNS, its beam tubes and neutron guides, at specific experimental locations in the reactor hall and in the neutron or experimental hall. In this work, the calculation methodology used to design a CNS and its associated Neutron Beam Transport Systems (NBTS), based on the use of the MCNP code, is presented. (author)

  15. Fission yield calculation using toy model based on Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Jubaidah, E-mail: jubaidah@student.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia); Physics Department, Faculty of Mathematics and Natural Science – State University of Medan. Jl. Willem Iskandar Pasar V Medan Estate – North Sumatera, Indonesia 20221 (Indonesia); Kurniadi, Rizal, E-mail: rijalk@fi.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia)

    2015-09-30

    The toy model is a new approximation for predicting fission yield distributions. The toy model assumes the nucleus is an elastic toy consisting of marbles. The number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate the real nucleus properties. In this research, the toy nucleons are only influenced by the central force. A heavy toy nucleus induced by a toy nucleon will split into two fragments. These two fission fragments are called the fission yield. In this research, energy entanglement is neglected. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other. Five Gaussian parameters are used in this research: the scission point of the two curves (R{sub c}), the means of the left and right curves (μ{sub L} and μ{sub R}), and the deviations of the left and right curves (σ{sub L} and σ{sub R}). The fission yield distribution is analysed based on Monte Carlo simulation. The result shows that variation in σ or µ can significantly shift the average frequency of asymmetric fission yields. This also varies the range of the fission yield probability distribution. In addition, variation in the iteration coefficient only changes the frequency of fission yields. Monte Carlo simulation for fission yield calculation using the toy model successfully indicates the same tendency as experimental results, where the average light fission yield is in the range of 90
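    The two-intersecting-Gaussians picture lends itself to a short Monte Carlo sketch. This is not the paper's code: the means, deviations and the mass number below are arbitrary illustrative values, and fragments are simply paired so that nucleon number is conserved.

```python
# Illustrative sketch of the two-Gaussian picture: fragment masses are sampled
# from two intersecting Gaussian curves and paired so that A is conserved.
# Parameter values are arbitrary, not those of the paper.
import numpy as np

rng = np.random.default_rng(2)
A = 236                          # nucleons in the fissioning toy nucleus
mu_L, mu_R = 96.0, 140.0         # means of the left (light) and right (heavy) curves
sigma_L, sigma_R = 6.0, 6.0      # deviations of the two curves

n = 50_000
light = rng.normal(mu_L, sigma_L, n)
heavy = rng.normal(mu_R, sigma_R, n)
fragments = np.concatenate([light, A - light, heavy, A - heavy])

hist, edges = np.histogram(fragments, bins=np.arange(60, 181, 2))
light_peak = edges[np.argmax(hist[: len(hist) // 2])]
print("light-fragment yield peaks near A ≈ %.0f" % light_peak)
```

Changing sigma_L/sigma_R or mu_L/mu_R in the sketch shifts and broadens the asymmetric yield humps, mirroring the sensitivity to σ and µ described in the abstract.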

  16. Fission yield calculation using toy model based on Monte Carlo simulation

    International Nuclear Information System (INIS)

    The toy model is a new approximation for predicting fission yield distributions. The toy model assumes the nucleus is an elastic toy consisting of marbles. The number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate the real nucleus properties. In this research, the toy nucleons are only influenced by the central force. A heavy toy nucleus induced by a toy nucleon will split into two fragments. These two fission fragments are called the fission yield. In this research, energy entanglement is neglected. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other. Five Gaussian parameters are used in this research: the scission point of the two curves (Rc), the means of the left and right curves (μL and μR), and the deviations of the left and right curves (σL and σR). The fission yield distribution is analysed based on Monte Carlo simulation. The result shows that variation in σ or µ can significantly shift the average frequency of asymmetric fission yields. This also varies the range of the fission yield probability distribution. In addition, variation in the iteration coefficient only changes the frequency of fission yields. Monte Carlo simulation for fission yield calculation using the toy model successfully indicates the same tendency as experimental results, where the average light fission yield is in the range of 90

  17. Sign learning kink-based (SiLK) quantum Monte Carlo for molecular systems

    CERN Document Server

    Ma, Xiaoyao; Loffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana

    2015-01-01

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H$_{2}$O, N$_2$, and F$_2$ molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.

  18. Study of CANDU thorium-based fuel cycles by deterministic and Monte Carlo methods

    International Nuclear Information System (INIS)

    In the framework of the Generation IV forum, there is a renewal of interest in self-sustainable thorium fuel cycles applied to various concepts such as Molten Salt Reactors [1, 2] or High Temperature Reactors [3, 4]. Precise evaluations of the U-233 production potential relying on existing reactors such as PWRs [5] or CANDUs [6] are hence necessary. As a consequence of its design (online refueling and D2O moderator in a thermal spectrum), the CANDU reactor has moreover an excellent neutron economy and consequently a high fissile conversion ratio [7]. For these reasons, we try here, with a shorter term view, to re-evaluate the economic competitiveness of once-through thorium-based fuel cycles in CANDU [8]. Two simulation tools are used: the deterministic Canadian cell code DRAGON [9] and MURE [10], a C++ tool for reactor evolution calculations based on the Monte Carlo code MCNP [11]. (authors)

  19. Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems.

    Science.gov (United States)

    Ma, Xiaoyao; Hall, Randall W; Löffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana

    2016-01-01

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem. PMID:26747795

  20. Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems

    Science.gov (United States)

    Ma, Xiaoyao; Hall, Randall W.; Löffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana

    2016-01-01

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.

  1. Sign learning kink-based (SiLK) quantum Monte Carlo for molecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Xiaoyao; Hall, Randall W.; Loffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana

    2016-01-07

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.

  2. Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Xiaoyao [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Hall, Randall W. [Department of Natural Sciences and Mathematics, Dominican University of California, San Rafael, California 94901 (United States); Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Löffler, Frank [Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Kowalski, Karol [William R. Wiley Environmental Molecular Sciences Laboratory, Battelle, Pacific Northwest National Laboratory, Richland, Washington 99352 (United States); Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)

    2016-01-07

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H{sub 2}O, N{sub 2}, and F{sub 2} molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.

  3. Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems

    International Nuclear Information System (INIS)

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem

  4. Simulation of Cone Beam CT System Based on Monte Carlo Method

    CERN Document Server

    Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing

    2014-01-01

    Adaptive Radiation Therapy (ART) was developed based on Image-Guided Radiation Therapy (IGRT) and is the trend in photon radiation therapy. To make better use of Cone Beam CT (CBCT) images for ART, a CBCT system model was established based on a Monte Carlo program and validated against measurement. The BEAMnrc program was adopted to model the kV x-ray tube. Both ISOURCE-13 and ISOURCE-24 were chosen to simulate the path of beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles under 1 cm of water were compared with the dose calculated by the DOSXYZnrc program. The calculated PDD agreed to better than 1% within a depth of 10 cm. More than 85% of the points on the calculated lateral dose profiles were within 2%. The validated CBCT system model helps to improve CBCT image quality for dose verification in ART and to assess the concomitant dose risk of CBCT imaging.

  5. Monte Carlo dose calculation using a cell processor based PlayStation 3 system

    International Nuclear Information System (INIS)

    This study investigates the performance of the EGSnrc computer code coupled with Cell-based hardware in Monte Carlo simulation of radiation dose in radiotherapy. Performance evaluations of two processor-intensive functions in the EGSnrc code, namely HOWNEAR and RANMARGET, were carried out based on the 20-80 rule (Pareto principle). The execution speeds of the two functions were measured by the profiler gprof, recording the number of executions and the total time spent in the functions. A testing architecture designed for the Cell processor was implemented in the evaluation using a PlayStation 3 (PS3) system. The evaluation results show that the algorithms examined are readily parallelizable on the Cell platform, provided that an architectural change of EGSnrc is made. However, as the EGSnrc performance was limited by the PowerPC Processing Element in the PS3, a PC coupled with graphics processing units (GPGPU) may provide a more viable avenue for acceleration.

  6. Single event upset cross section calculation for secondary particles induced by proton using Geant4

    International Nuclear Information System (INIS)

    Based on the Monte-Carlo software Geant4, a model for calculating the proton single event upset (SEU) cross section of an SRAM cell was presented. The secondary particles induced by protons were considered and effective sensitive regions were determined according to the ranges of the secondary particles. The single event upset and multiple-bit upset (MBU) cross sections for protons of different energies were calculated. The results are in agreement with theoretical and experimental data. (authors)
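    The quantity reported by such a calculation is simply the upset count normalized to fluence and memory size. The sketch below shows that bookkeeping; the counts, fluence and SRAM size are illustrative, not values from the paper.

```python
# Sketch of the bookkeeping behind a per-bit SEU cross section:
# sigma = N_upsets / (fluence x N_bits). All numbers are illustrative.
def seu_cross_section(n_upsets, fluence_cm2, n_bits):
    """Per-bit single event upset cross section in cm^2/bit."""
    return n_upsets / (fluence_cm2 * n_bits)

n_upsets = 120             # upsets scored in the simulated irradiation
fluence = 1.0e10           # incident protons per cm^2
n_bits = 4 * 1024 * 1024   # a 4-Mbit SRAM

sigma = seu_cross_section(n_upsets, fluence, n_bits)
print(f"sigma_SEU = {sigma:.3e} cm^2/bit")
```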

  7. GPU-based Monte Carlo radiotherapy dose calculation using phase-space sources

    International Nuclear Information System (INIS)

    A novel phase-space source implementation has been designed for graphics processing unit (GPU)-based Monte Carlo dose calculation engines. Short of full simulation of the linac head, using a phase-space source is the most accurate method to model a clinical radiation beam in dose calculations. However, in GPU-based Monte Carlo dose calculations where the computation efficiency is very high, the time required to read and process a large phase-space file becomes comparable to the particle transport time. Moreover, due to the parallelized nature of GPU hardware, it is essential to simultaneously transport particles of the same type and similar energies but separated spatially to yield a high efficiency. We present three methods for phase-space implementation that have been integrated into the most recent version of the GPU-based Monte Carlo radiotherapy dose calculation package gDPM v3.0. The first method is to sequentially read particles from a patient-dependent phase-space and sort them on-the-fly based on particle type and energy. The second method supplements this with a simple secondary collimator model and fluence map implementation so that patient-independent phase-space sources can be used. Finally, as the third method (called the phase-space-let, or PSL, method) we introduce a novel source implementation utilizing pre-processed patient-independent phase-spaces that are sorted by particle type, energy and position. Position bins located outside a rectangular region of interest enclosing the treatment field are ignored, substantially decreasing simulation time with little effect on the final dose distribution. The three methods were validated in absolute dose against BEAMnrc/DOSXYZnrc and compared using gamma-index tests (2%/2 mm above the 10% isodose). It was found that the PSL method has the optimal balance between accuracy and efficiency and thus is used as the default method in gDPM v3.0. Using the PSL method, open fields of 4 × 4, 10 × 10 and 30 × 30 cm
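    The grouping idea behind the phase-space-let method can be sketched as a simple binning pass. The record layout, bin widths and region-of-interest logic below are assumptions for illustration, not the gDPM file format.

```python
# Hedged sketch of the phase-space-let idea: particles are grouped by type,
# energy bin and position bin so that similar particles can be transported
# together, and bins outside the region of interest are dropped.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(3)
n = 10_000
ptype = rng.integers(0, 2, n)              # 0 = photon, 1 = electron
energy = rng.uniform(0.1, 6.0, n)          # MeV
x = rng.uniform(-20.0, 20.0, n)            # cm, position at the scoring plane
y = rng.uniform(-20.0, 20.0, n)

field = 10.0                               # keep a 10 x 10 cm^2 region of interest
keep = (np.abs(x) < field / 2) & (np.abs(y) < field / 2)

psl = defaultdict(list)
for i in np.flatnonzero(keep):
    key = (int(ptype[i]), int(energy[i] // 0.5), int(x[i] // 2.0), int(y[i] // 2.0))
    psl[key].append(i)

print("particles kept:", int(keep.sum()), " phase-space-lets:", len(psl))
```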

  8. GPU-based Monte Carlo radiotherapy dose calculation using phase-space sources.

    Science.gov (United States)

    Townson, Reid W; Jia, Xun; Tian, Zhen; Graves, Yan Jiang; Zavgorodni, Sergei; Jiang, Steve B

    2013-06-21

    A novel phase-space source implementation has been designed for graphics processing unit (GPU)-based Monte Carlo dose calculation engines. Short of full simulation of the linac head, using a phase-space source is the most accurate method to model a clinical radiation beam in dose calculations. However, in GPU-based Monte Carlo dose calculations where the computation efficiency is very high, the time required to read and process a large phase-space file becomes comparable to the particle transport time. Moreover, due to the parallelized nature of GPU hardware, it is essential to simultaneously transport particles of the same type and similar energies but separated spatially to yield a high efficiency. We present three methods for phase-space implementation that have been integrated into the most recent version of the GPU-based Monte Carlo radiotherapy dose calculation package gDPM v3.0. The first method is to sequentially read particles from a patient-dependent phase-space and sort them on-the-fly based on particle type and energy. The second method supplements this with a simple secondary collimator model and fluence map implementation so that patient-independent phase-space sources can be used. Finally, as the third method (called the phase-space-let, or PSL, method) we introduce a novel source implementation utilizing pre-processed patient-independent phase-spaces that are sorted by particle type, energy and position. Position bins located outside a rectangular region of interest enclosing the treatment field are ignored, substantially decreasing simulation time with little effect on the final dose distribution. The three methods were validated in absolute dose against BEAMnrc/DOSXYZnrc and compared using gamma-index tests (2%/2 mm above the 10% isodose). It was found that the PSL method has the optimal balance between accuracy and efficiency and thus is used as the default method in gDPM v3.0. Using the PSL method, open fields of 4 × 4, 10 × 10 and 30 × 30 cm

  9. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    Science.gov (United States)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10^6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.

  10. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    International Nuclear Information System (INIS)

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10^6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications. (paper)

  11. Possible magnetism based on orbital motion of protons in ice

    CERN Document Server

    Yen, Fei; Liu, Yongsheng; Berlie, Adam

    2016-01-01

    A peak anomaly is observed in the magnetic susceptibility as a function of temperature in solid H2O near Tp=60 K. At external magnetic fields below 2 kOe, the susceptibility becomes positive in the temperature range between 45 and 66 K. The magnetic field dependence of the susceptibility in the same temperature range exhibits an inverted ferromagnetic hysteretic loop superimposed on top of the diamagnetic signature of ice at fields below 600 Oe. We suggest that a fraction of protons capable of undergoing correlated tunneling in a hexagonal path, without disrupting the stoichiometry of the lattice, create an induced magnetic field opposite to that created by the electrons upon application of an external field, which counters the overall diamagnetism of the material.

  12. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes.

    Science.gov (United States)

    Pinsky, L S; Wilson, T L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be useful in the design and analysis of experiments such as ACCESS (Advanced Cosmic-ray Composition Experiment for Space Station), which is an Office of Space Science payload currently under evaluation for deployment on the International Space Station (ISS). FLUKA will be significantly improved and tailored for use in simulating space radiation in four ways. First, the additional physics not presently within the code that is necessary to simulate the problems of interest, namely the heavy ion inelastic processes, will be incorporated. Second, the internal geometry package will be replaced with one that will substantially increase the calculation speed as well as simplify the data input task. Third, default incident flux packages that include all of the different space radiation sources of interest will be included. Finally, the user interface and internal data structure will be melded together with ROOT, the object-oriented data analysis infrastructure system. Beyond

  13. A filtering approach based on Gaussian-powerlaw convolutions for local PET verification of proton radiotherapy

    International Nuclear Information System (INIS)

    Because proton beams activate positron emitters in patients, positron emission tomography (PET) has the potential to play a unique role in the in vivo verification of proton radiotherapy. Unfortunately, the PET image is not directly proportional to the delivered radiation dose distribution. Current treatment verification strategies using PET therefore compare the actual PET image with full-blown Monte Carlo simulations of the PET signal. In this paper, we describe a simpler and more direct way to reconstruct the expected PET signal from the local radiation dose distribution near the distal fall-off region, which is calculated by the treatment planning programme. Under reasonable assumptions, the PET image can be described as a convolution of the dose distribution with a filter function. We develop a formalism to derive the filter function analytically. The main concept is the introduction of 'Q-tilde' functions defined as the convolution of a Gaussian with a powerlaw function. Special Q-tilde functions are the Gaussian itself and the error function. The convolution of two Q-tilde functions is another Q-tilde function. By fitting elementary dose distributions and their corresponding PET signals with Q-tilde functions, we derive the Q-tilde function approximation of the filter. The new filtering method has been validated through comparisons with Monte Carlo calculations and, in one case, with measured data. While the basic concept is developed under idealized conditions assuming that the absorbing medium is homogeneous near the distal fall-off region, a generalization to inhomogeneous situations is also described. As a result, the method can determine the distal fall-off region of the PET signal, and consequently the range of the proton beam, with millimetre accuracy. Quantification of the produced activity is possible. In conclusion, the PET activity resulting from a proton beam treatment can be determined by locally filtering the dose distribution as obtained from
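    The filtering idea admits a very small one-dimensional illustration: convolve a dose-depth curve with a filter and compare the distal fall-off positions. The sketch below uses a plain Gaussian as the filter, which the abstract names as the simplest special case of the Q-tilde functions; the toy dose curve and widths are purely illustrative and are not the derived filter of the paper.

```python
# 1-D illustration: expected PET signal ~ dose-depth curve convolved with a
# filter. A plain Gaussian stands in for the filter; all numbers are toy values.
import numpy as np

z = np.linspace(0.0, 20.0, 401)                      # depth (cm)
dz = z[1] - z[0]
dose = np.where(z < 15.0, 1.0 + 0.05 * z, 0.0)       # toy curve with a sharp distal fall-off

sigma = 0.4                                           # filter width (cm), illustrative
kz = np.arange(-3.0, 3.0 + 1e-9, dz)
kernel = np.exp(-0.5 * (kz / sigma) ** 2)
kernel /= kernel.sum()

pet = np.convolve(dose, kernel, mode="same")          # predicted PET signal shape

def fall_off_depth(curve):
    """Depth at which the curve first drops below 50% of its maximum."""
    imax = int(np.argmax(curve))
    below = np.flatnonzero(curve[imax:] < 0.5 * curve.max())
    return z[imax + below[0]]

print("dose fall-off %.2f cm, filtered-signal fall-off %.2f cm"
      % (fall_off_depth(dose), fall_off_depth(pet)))
```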

  14. Theoretical study of the influence of ribose on the proton transfer phenomenon of nucleic acid bases

    International Nuclear Information System (INIS)

    The first comprehensive theoretical study of the effects of ribose on the proton transfer behavior of nucleic acid bases is presented. The specific hydrogen bonding of the ribose hydroxyls plays a very important role in stabilizing the structure of the ribonucleoside. Nine stable uridine conformations have been reported. The intermolecular proton transfer of the isolated and monohydrated uridine complexes in three different regions was extensively explored on the basis of density functional theory at the B3LYP/6-31+G* level. With the introduction of the ribose, not only do the structural parameters of the nucleic acid bases change, but the energy barriers of the proton transfer process change as well. Furthermore, changes in the electron distributions of the molecular orbitals of the nucleic acid bases were also analyzed by NBO analysis. Consideration of the ribose's influence represents a much more realistic situation in the RNA

  15. Studies on anhydrous proton conducting membranes based on imidazole derivatives and sulfonated polyimide

    International Nuclear Information System (INIS)

    Anhydrous proton-conducting membranes based on sulfonated polyimide (sPI) and imidazole derivatives were prepared. The acid-base composite membranes show good chemical oxidation stability and high thermal stability. The addition of imidazole derivatives to sPIs can improve the chemical oxidation stability of the composite membranes enormously, even beyond that of pure sPI. The proton conductivity of a typical sPI/xUI (2-undecylimidazole) composite membrane can reach 10^-3 S cm^-1 at 180 deg. C under anhydrous conditions. The proton conductivity of the acid-base composite membranes increases significantly with increasing UI content. Moreover, the UI in the sPI/xUI composite membrane is difficult to carry away by vapor owing to its long hydrophobic moiety, which improves the stability and lifetime of the membranes in fuel cells

  16. Noninvasive spectral imaging of skin chromophores based on multiple regression analysis aided by Monte Carlo simulation

    Science.gov (United States)

    Nishidate, Izumi; Wiswadarma, Aditya; Hase, Yota; Tanaka, Noriyuki; Maeda, Takaaki; Niizeki, Kyuichi; Aizu, Yoshihisa

    2011-08-01

    In order to visualize melanin and blood concentrations and oxygen saturation in human skin tissue, a simple imaging technique based on multispectral diffuse reflectance images acquired at six wavelengths (500, 520, 540, 560, 580 and 600 nm) was developed. The technique utilizes multiple regression analysis aided by Monte Carlo simulation of diffuse reflectance spectra. Using the absorbance spectrum as a response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as predictor variables, multiple regression analysis provides regression coefficients. Concentrations of melanin and total blood are then determined from the regression coefficients using conversion vectors that are deduced numerically in advance, while oxygen saturation is obtained directly from the regression coefficients. Experiments with a tissue-like agar gel phantom validated the method. In vivo experiments with skin of the human hand during upper limb occlusion and of the inner forearm exposed to UV irradiation demonstrated the ability of the method to evaluate physiological reactions of human skin tissue.
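    The regression step itself is ordinary multiple linear regression of the absorbance spectrum on the chromophore extinction spectra. The sketch below shows the mechanics only; the extinction coefficients and the measured absorbance are placeholders, so the printed values illustrate the computation, not physiology.

```python
# Minimal sketch of the regression step: absorbance at six wavelengths is
# regressed on extinction coefficients of melanin, HbO2 and Hb (plus a
# constant baseline term). All numbers are placeholders.
import numpy as np

wavelengths = [500, 520, 540, 560, 580, 600]              # nm
# columns: melanin, HbO2, Hb -- illustrative extinction coefficients
E = np.array([
    [1.30, 0.55, 0.60],
    [1.15, 0.70, 0.65],
    [1.00, 1.10, 0.95],
    [0.90, 0.85, 1.05],
    [0.80, 1.00, 0.90],
    [0.70, 0.15, 0.20],
])
X = np.column_stack([E, np.ones(len(wavelengths))])        # + baseline/scattering term

absorbance = np.array([1.45, 1.42, 1.60, 1.48, 1.50, 0.75])  # toy -log(reflectance)

coeffs, *_ = np.linalg.lstsq(X, absorbance, rcond=None)
a_mel, a_hbo2, a_hb, baseline = coeffs
print("regression coefficients:", np.round(coeffs, 3))
print("oxygen saturation estimate: a_HbO2 / (a_HbO2 + a_Hb) = %.2f"
      % (a_hbo2 / (a_hbo2 + a_hb)))
```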

  17. MaGe - a Geant4-based Monte Carlo framework for low-background experiments

    CERN Document Server

    Chan, Yuen-Dat; Henning, Reyco; Gehman, Victor M; Johnson, Rob A; Jordan, David V; Kazkaz, Kareem; Knapp, Markus; Kroninger, Kevin; Lenz, Daniel; Liu, Jing; Liu, Xiang; Marino, Michael G; Mokhtarani, Akbar; Pandola, Luciano; Schubert, Alexis G; Tomei, Claudia

    2008-01-01

    A Monte Carlo framework, MaGe, has been developed based on the Geant4 simulation toolkit. Its purpose is to simulate physics processes in low-energy and low-background radiation detectors, specifically for the Majorana and Gerda $^{76}$Ge neutrinoless double-beta decay experiments. This jointly-developed tool is also used to verify the simulation of physics processes relevant to other low-background experiments in Geant4. The MaGe framework contains simulations of prototype experiments and test stands, and is easily extended to incorporate new geometries and configurations while still using the same verified physics processes, tunings, and code framework. This reduces duplication of efforts and improves the robustness of and confidence in the simulation output.

  18. A Monte Carlo simulation based inverse propagation method for stochastic model updating

    Science.gov (United States)

    Bao, Nuo; Wang, Chunjie

    2015-08-01

    This paper presents an efficient stochastic model updating method based on statistical theory. Significant parameters were selected by implementing F-test evaluation and design of experiments, and then an incomplete fourth-order polynomial response surface model (RSM) was developed. Exploiting the RSM combined with Monte Carlo simulation (MCS) reduces the calculation effort and makes rapid random sampling possible. The inverse uncertainty propagation is given by the equally weighted sum of the mean and covariance matrix objective functions. The mean and covariance of the parameters are estimated synchronously by minimizing the weighted objective function through a hybrid of particle-swarm and Nelder-Mead simplex optimization, thus achieving a better correlation between simulation and test. Numerical examples of a three degree-of-freedom mass-spring system under different conditions and the GARTEUR assembly structure validated the feasibility and effectiveness of the proposed method.
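    The forward part of this scheme, a polynomial surrogate fed by Monte Carlo samples, can be sketched briefly. For brevity the sketch uses one parameter and a full fourth-order polynomial, whereas the paper uses an incomplete fourth-order RSM over several significant parameters; the "expensive model" is a stand-in function.

```python
# Sketch of the response-surface + Monte Carlo idea: fit a cheap polynomial
# surrogate to a few expensive model evaluations, then propagate random
# parameter samples through the surrogate.
import numpy as np

rng = np.random.default_rng(4)

def expensive_model(k):
    # stand-in for, e.g., a finite-element eigenfrequency calculation
    return 12.0 + 3.0 * np.sqrt(k) - 0.02 * k

k_design = np.linspace(50.0, 150.0, 9)            # design-of-experiments points
f_design = expensive_model(k_design)

rsm = np.poly1d(np.polyfit(k_design, f_design, deg=4))   # response surface model

k_samples = rng.normal(100.0, 10.0, 100_000)      # Monte Carlo samples of the parameter
f_samples = rsm(k_samples)                        # fast propagation via the RSM

print("output mean %.3f, standard deviation %.3f" % (f_samples.mean(), f_samples.std()))
```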

  19. Electric conduction in semiconductors: a pedagogical model based on the Monte Carlo method

    International Nuclear Information System (INIS)

    We present a pedagogic approach aimed at modelling electric conduction in semiconductors in order to describe and explain some macroscopic properties, such as the characteristic behaviour of resistance as a function of temperature. A simple model of the band structure is adopted for the generation of electron-hole pairs as well as for the carrier transport in moderate electric fields. The semiconductor behaviour is described by substituting the traditional statistical approach (requiring a deep mathematical background) with microscopic models, based on the Monte Carlo method, in which simple rules applied to microscopic particles and quasi-particles determine the macroscopic properties. We compare measurements of electric properties of matter with 'virtual experiments' built by using some models where the physical concepts can be presented at different formalization levels
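    A purely pedagogical sketch of the "simple microscopic rules" idea is given below: at each step carriers are generated with a temperature-dependent probability and recombine at random, and the resistance is taken as inversely proportional to the carrier number. The gap value and rates are chosen only to keep the counts readable and are not taken from the article.

```python
# Pedagogical sketch: carrier generation/recombination by simple rules, with
# resistance ~ 1/(number of carriers). Constants are illustrative only.
import numpy as np

rng = np.random.default_rng(5)
k_B = 8.617e-5          # Boltzmann constant, eV/K
E_g = 0.3               # illustrative narrow gap (eV) so counts stay manageable
N_sites = 200_000       # lattice sites in the toy crystal

def equilibrium_carriers(T, steps=200, p_recomb=0.05):
    p_gen = np.exp(-E_g / (2.0 * k_B * T))       # pair-generation probability per site
    n = 0
    for _ in range(steps):
        n += rng.binomial(N_sites, p_gen)         # generation events this step
        n -= rng.binomial(n, p_recomb)            # recombination events this step
    return max(n, 1)

for T in (250.0, 300.0, 350.0, 400.0):
    n = equilibrium_carriers(T)
    print("T = %3.0f K   carriers ~ %6d   relative resistance ~ %8.1f"
          % (T, n, N_sites / n))
```

Running the sketch reproduces the qualitative macroscopic behaviour named in the abstract: the carrier population grows rapidly with temperature, so the effective resistance falls.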

  20. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    SuperMC is a Computer-Aided Design (CAD) based Monte Carlo (MC) program for integrated simulation of nuclear systems developed by the FDS Team (China), making use of a hybrid MC-deterministic method and advanced computer technologies. The design aims, architecture and main methodology of SuperMC are presented in this paper. The inclusion of multi-physics processes and the use of advanced computer technologies such as automatic geometry modeling, intelligent data analysis and visualization, high-performance parallel computing and cloud computing contribute to the efficiency of the code. SuperMC2.1, the latest version of the code for neutron, photon and coupled neutron-photon transport calculation, has been developed and validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model

  1. Calculation and analysis of heat source of PWR assemblies based on Monte Carlo method

    International Nuclear Information System (INIS)

    When fission occurs in the nuclear fuel of a reactor core, numerous neutrons and γ rays are released, which deposit energy in the fuel components and give rise to effects such as thermal stress and radiation damage that influence the safe operation of the reactor. The three-dimensional Monte Carlo transport code MCNP and a continuous cross-section database based on the ENDF/B series were used to calculate the heat rates in reference assemblies of a PWR loaded in an 18-month short refueling cycle mode, and to obtain precise values for the control rod, the thimble plug and the new Gd-bearing burnable poison rod, so as to provide a basis for reactor design and safety verification. (authors)

  2. Seabed radioactivity based on in situ measurements and Monte Carlo simulations

    International Nuclear Information System (INIS)

    Activity concentration measurements were carried out on the seabed, by implementing the underwater detection system KATERINA. The efficiency calibration was performed in the energy range 350–2600 keV, using in situ and laboratory measurements. The efficiency results were reproduced and extended in a broadened range of energies from 150 to 2600 keV, by Monte Carlo simulations, using the MCNP5 code. The concentrations of 40K, 214Bi and 208Tl were determined utilizing the present approach. The results were validated by laboratory measurements. - Highlights: • The KATERINA system was applied for marine sediments. • MC simulations using MCNP5 reproduced experimental energy spectra and efficiency. • The in-situ method provided quantitative measurements. • The measurements were validated with lab-based methods

  3. Monte Carlo-based Noise Compensation in Coil Intensity Corrected Endorectal MRI

    CERN Document Server

    Lui, Dorothy; Haider, Masoom; Wong, Alexander

    2015-01-01

    Background: Prostate cancer is one of the most common forms of cancer found in males making early diagnosis important. Magnetic resonance imaging (MRI) has been useful in visualizing and localizing tumor candidates and with the use of endorectal coils (ERC), the signal-to-noise ratio (SNR) can be improved. The coils introduce intensity inhomogeneities and the surface coil intensity correction built into MRI scanners is used to reduce these inhomogeneities. However, the correction typically performed at the MRI scanner level leads to noise amplification and noise level variations. Methods: In this study, we introduce a new Monte Carlo-based noise compensation approach for coil intensity corrected endorectal MRI which allows for effective noise compensation and preservation of details within the prostate. The approach accounts for the ERC SNR profile via a spatially-adaptive noise model for correcting non-stationary noise variations. Such a method is useful particularly for improving the image quality of coil i...

  4. A Monte Carlo-based treatment-planning tool for ion beam therapy

    CERN Document Server

    Böhlen, T T; Dosanjh, M; Ferrari, A; Haberer, T; Parodi, K; Patera, V; Mairan, A

    2013-01-01

    Ion beam therapy, as an emerging radiation therapy modality, requires continuous efforts to develop and improve tools for patient treatment planning (TP) and research applications. Dose and fluence computation algorithms using the Monte Carlo (MC) technique have served for decades as reference tools for accurate dose computations for radiotherapy. In this work, a novel MC-based treatment-planning (MCTP) tool for ion beam therapy using the pencil beam scanning technique is presented. It allows single-field and simultaneous multiple-field optimization for realistic patient treatment conditions and for dosimetric quality assurance for irradiation conditions at state-of-the-art ion beam therapy facilities. It employs iterative procedures that allow for the optimization of absorbed dose and relative biological effectiveness (RBE)-weighted dose using radiobiological input tables generated by external RBE models. Using a re-implementation of the local effect model (LEM), the MCTP tool is able to perform TP studies u...

  5. Simulation of nuclear material identification system based on Monte Carlo sampling method

    International Nuclear Information System (INIS)

    Background: Because of the hazards of radioactivity, nuclear material identification can be a difficult problem. Purpose: To reflect the particle transport processes in nuclear fission and demonstrate the effectiveness of the signatures of the Nuclear Materials Identification System (NMIS), based on physical principles and experimental statistical data. Methods: We established a Monte Carlo simulation model of a nuclear material identification system and acquired three channels of time-domain pulse signals. Results: Auto-Correlation Functions (AC), Cross-Correlation Functions (CC), Auto Power Spectral Densities (APSD) and Cross Power Spectral Densities (CPSD) between channels yield several signatures that reveal characteristics of the nuclear material. Conclusions: The simulation results indicate that this approach can help in further studying the features of the system. (authors)
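
    As a hedged illustration of how such signatures can be formed from three time-domain channels, the sketch below computes AC, CC, APSD and CPSD with NumPy/SciPy; the toy signals, sampling rate and window length are assumptions, not the simulation model of the paper:

```python
# Illustrative signature extraction from three simulated detector channels
# (toy data; not the authors' NMIS simulation).
import numpy as np
from scipy.signal import correlate, welch, csd

fs = 1.0e6                                   # assumed sampling rate [Hz]
rng = np.random.default_rng(0)
n = 2**16
common = rng.normal(size=n)                  # shared fission-chain component (toy)
ch1 = common + 0.5 * rng.normal(size=n)
ch2 = np.roll(common, 3) + 0.5 * rng.normal(size=n)   # small time lag between channels
ch3 = 0.8 * common + 0.5 * rng.normal(size=n)

def autocorr(x):
    """Auto-correlation function for non-negative lags, normalized to lag zero."""
    x = x - x.mean()
    ac = correlate(x, x, mode="full")[x.size - 1:]
    return ac / ac[0]

ac1 = autocorr(ch1)                                                 # AC signature
cc12 = correlate(ch1 - ch1.mean(), ch2 - ch2.mean(), mode="full")   # CC signature
f, apsd1 = welch(ch1, fs=fs, nperseg=4096)                          # APSD signature
f, cpsd12 = csd(ch1, ch2, fs=fs, nperseg=4096)                      # CPSD signature (complex)
f, cpsd13 = csd(ch1, ch3, fs=fs, nperseg=4096)

print(ac1[:3], np.abs(cpsd12[:3]))
```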

  6. GPU-based fast Monte Carlo simulation for radiotherapy dose calculation

    CERN Document Server

    Jia, Xun; Graves, Yan Jiang; Folkerts, Michael; Jiang, Steve B

    2011-01-01

    Monte Carlo (MC) simulation is commonly considered to be the most accurate dose calculation method in radiotherapy. However, its efficiency still requires improvement for many routine clinical applications. In this paper, we present our recent progress towards the development of a GPU-based MC dose calculation package, gDPM v2.0. It utilizes the parallel computation ability of a GPU to achieve high efficiency, while maintaining the same particle transport physics as in the original DPM code and hence the same level of simulation accuracy. In GPU computing, divergence of execution paths between threads can considerably reduce the efficiency. Since photons and electrons undergo different physics and hence follow different execution paths, we use a simulation scheme where photon transport and electron transport are separated to partially relieve the thread divergence issue. A high-performance random number generator and hardware linear interpolation are also utilized. We have also developed various components to hand...

  7. Earthquake Forecasting Based on Data Assimilation: Sequential Monte Carlo Methods for Renewal Processes

    CERN Document Server

    Werner, M J; Sornette, D

    2009-01-01

    In meteorology, engineering and computer sciences, data assimilation is routinely employed as the optimal way to combine noisy observations with prior model information for obtaining better estimates of a state, and thus better forecasts, than can be achieved by ignoring data uncertainties. Earthquake forecasting, too, suffers from measurement errors and partial model information and may thus gain significantly from data assimilation. We present perhaps the first fully implementable data assimilation method for earthquake forecasts generated by a point-process model of seismicity. We test the method on a synthetic and pedagogical example of a renewal process observed in noise, which is relevant to the seismic gap hypothesis, models of characteristic earthquakes and to recurrence statistics of large quakes inferred from paleoseismic data records. To address the non-Gaussian statistics of earthquakes, we use sequential Monte Carlo methods, a set of flexible simulation-based methods for recursively estimating ar...
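
    A minimal bootstrap particle filter of the kind referred to (sequential Monte Carlo for a renewal process observed in noise) can be sketched as follows; the lognormal recurrence model, its parameters and the Gaussian observation error are illustrative assumptions, not the paper's settings:

```python
# Bootstrap particle filter sketch: renewal process with lognormal inter-event
# times, each event time observed with Gaussian error (toy parameters).
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = np.log(100.0), 0.3        # assumed lognormal recurrence parameters
obs_std = 10.0                        # assumed observation (dating) error
n_events, n_particles = 20, 5000

true_times = np.cumsum(rng.lognormal(mu, sigma, size=n_events))
obs_times = true_times + rng.normal(0.0, obs_std, size=n_events)

particles = np.zeros(n_particles)     # latest true event time carried by each particle
estimates = []
for y in obs_times:
    # Propagate: draw the next inter-event interval for every particle.
    particles = particles + rng.lognormal(mu, sigma, size=n_particles)
    # Weight by the likelihood of the noisy observation (log-space for stability).
    logw = -0.5 * ((y - particles) / obs_std) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimates.append(float(np.sum(w * particles)))
    # Multinomial resampling to avoid weight degeneracy.
    particles = particles[rng.choice(n_particles, size=n_particles, p=w)]

print(np.round(estimates[:5], 1))
print(np.round(true_times[:5], 1))
```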

  8. Monte Carlo simulation of grating-based neutron phase contrast imaging at CPHS

    International Nuclear Information System (INIS)

    Since the launch of the Compact Pulsed Hadron Source (CPHS) project of Tsinghua University in 2009, work has begun on the design and engineering of an imaging/radiography instrument for the neutron source provided by CPHS. The instrument will perform basic tasks such as transmission imaging and computerized tomography. Additionally, we include in the design the utilization of coded-aperture and grating-based phase contrast methodology, as well as the options of prompt gamma-ray analysis and neutron-energy selective imaging. Previously, we had implemented the hardware and data-analysis software for grating-based X-ray phase contrast imaging. Here, we investigate Geant4-based Monte Carlo simulations of neutron refraction phenomena and then model the grating-based neutron phase contrast imaging system according to the classical-optics-based method. The simulated results of retrieving the phase-shift gradient information with a five-step phase-stepping approach indicate the feasibility of grating-based neutron phase contrast imaging as an option for the cold neutron imaging instrument at the CPHS.
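
    The phase-stepping retrieval itself is compact enough to illustrate; the toy sketch below (not the CPHS analysis code) recovers the fringe phase in a single detector pixel from five grating positions, the transverse gradient of this phase being the differential phase-contrast signal:

```python
# Illustrative five-step phase-stepping retrieval (toy values, single pixel):
# the intensity oscillates with grating position, I_k = a + b*cos(2*pi*k/N + phi);
# the first Fourier component over the steps gives phi and the fringe visibility.
import numpy as np

N = 5                                   # number of phase steps
k = np.arange(N)
a, b, phi_true = 100.0, 20.0, 0.7       # assumed mean intensity, amplitude, phase
rng = np.random.default_rng(2)
I = a + b * np.cos(2 * np.pi * k / N + phi_true) + rng.normal(0, 0.5, N)

c = np.fft.fft(I)                       # discrete Fourier transform over the steps
phi_est = np.angle(c[1])                # phase of the first harmonic
mean_est = c[0].real / N                # estimate of a
vis_est = 2 * np.abs(c[1]) / c[0].real  # fringe visibility b/a

print(round(phi_est, 3), round(phi_true, 3), round(mean_est, 1), round(vis_est, 3))
```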

  9. A CAD based automatic modeling method for primitive solid based Monte Carlo calculation geometry

    International Nuclear Information System (INIS)

    The Multi-Physics Coupling Analysis Modeling Program (MCAM), developed by the FDS Team, China, is an advanced modeling tool aimed at solving the modeling challenges of multi-physics coupling simulation. The automatic modeling method for SuperMC, the Super Monte Carlo Calculation Program for Nuclear and Radiation Process, was recently developed and integrated into MCAM5.2. This method converts bidirectionally between a CAD model and a SuperMC input file. When converting from a CAD model to a SuperMC model, the CAD model is decomposed into a set of convex solids, and the corresponding SuperMC convex basic solids are then generated and written out. When converting from a SuperMC model back to a CAD model, the basic primitive solids are created and the related operations are performed according to the SuperMC model. The method was benchmarked with the ITER benchmark model. The results showed that the method is correct and effective. (author)

  10. Monte Carlo simulation for internal radiation dosimetry based on the high resolution Visible Chinese Human

    International Nuclear Information System (INIS)

    Internal radiation dose calculations based on Chinese models are important in nuclear medicine. Most existing models are based on the physical and anatomical data of Caucasians, whose anatomical structure and physiological parameters differ considerably from those of the Chinese population, which may significantly affect internal radiation dose estimates. It is therefore necessary to establish a model based on Chinese ethnic characteristics and apply it to radiation dosimetry calculations. In this study, a voxel model was established based on the high-resolution Visible Chinese Human (VCH). The transport of photons and electrons was simulated using the MCNPX Monte Carlo code. Absorbed fractions (AF) and specific absorbed fractions (SAF) were calculated, and S-factors and mean absorbed doses for organs with 99mTc located in the liver were also obtained. In comparison with the VIP-Man and MIRD models, discrepancies were found to correlate with racial and anatomical differences in organ mass and inter-organ distance. Internal dosimetry data based on other models, previously applied to the Chinese adult population, are replaced with Chinese-specific data. The obtained results provide a reference for nuclear medicine, such as dose verification after surgery and potential radiation evaluation for radionuclides in preclinical research. (authors)
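
    For reference, the reported quantities follow the standard MIRD-style definitions (the formulas below are the generic definitions; all numerical values in the study come from the VCH voxel simulations):

```latex
% Standard MIRD-type definitions of AF, SAF and the S-factor:
\mathrm{AF}(r_T \leftarrow r_S) =
\frac{\text{energy absorbed in target region } r_T}{\text{energy emitted in source region } r_S},
\qquad
\mathrm{SAF}(r_T \leftarrow r_S) = \frac{\mathrm{AF}(r_T \leftarrow r_S)}{m_{r_T}},
\qquad
S(r_T \leftarrow r_S) = \sum_i \Delta_i \, \mathrm{SAF}_i(r_T \leftarrow r_S)
```

    Here m_{r_T} is the target-region mass and Δ_i the mean energy emitted per nuclear transformation for radiation type i.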

  11. TPSPET—A TPS-based approach for in vivo dose verification with PET in proton therapy

    International Nuclear Information System (INIS)

    Since the interest in ion-irradiation for tumour therapy has significantly increased over the last few decades, intensive investigations are performed to improve the accuracy of this form of patient treatment. One major goal is the development of methods for in vivo dose verification. In proton therapy, a PET (positron emission tomography)-based approach measuring the irradiation-induced tissue activation inside the patient has been already clinically implemented. The acquired PET images can be compared to an expectation, derived under the assumption of a correct treatment application, to validate the particle range and the lateral field position in vivo. In the context of this work, TPSPET is introduced as a new approach to predict proton-irradiation induced three-dimensional positron emitter distributions by means of the same algorithms of the clinical treatment planning system (TPS). In order to perform additional activity calculations, reaction-channel-dependent input positron emitter depth distributions are necessary, which are determined from the application of a modified filtering approach to the TPS reference depth dose profiles in water. This paper presents the implementation of TPSPET on the basis of the research treatment planning software treatment planning for particles. The results are validated in phantom and patient studies against Monte Carlo simulations, and compared to β+-emitter distributions obtained from a slightly modified version of the originally proposed one-dimensional filtering approach applied to three-dimensional dose distributions. In contrast to previously introduced methods, TPSPET provides a faster implementation, the results show no sensitivity to lateral field extension and the predicted β+-emitter densities are fully consistent to the planned treatment dose as they are calculated by the same pencil beam algorithms. These findings suggest a large potential of the application of TPSPET for in vivo dose verification in the daily

  12. Full modelling of the MOSAIC animal PET system based on the GATE Monte Carlo simulation code

    International Nuclear Information System (INIS)

    within 9%. For a 410-665 keV energy window, the measured sensitivity for a centred point source was 1.53% and mouse and rat scatter fractions were respectively 12.0% and 18.3%. The scattered photons produced outside the rat and mouse phantoms contributed to 24% and 36% of total simulated scattered coincidences. Simulated and measured single and prompt count rates agreed well for activities up to the electronic saturation at 110 MBq for the mouse and rat phantoms. Volumetric spatial resolution was 17.6 μL at the centre of the FOV with differences less than 6% between experimental and simulated spatial resolution values. The comprehensive evaluation of the Monte Carlo modelling of the Mosaic(TM) system demonstrates that the GATE package is adequately versatile and appropriate to accurately describe the response of an Anger logic based animal PET system

  13. Direct absorbed dose to water determination based on water calorimetry in scanning proton beam delivery

    International Nuclear Information System (INIS)

    Purpose: The aim of this manuscript is to describe the direct measurement of absolute absorbed dose to water in a scanned proton radiotherapy beam using a water calorimeter primary standard. Methods: The McGill water calorimeter, which has been validated in photon and electron beams as well as in HDR 192Ir brachytherapy, was used to measure the absorbed dose to water in double scattering and scanning proton irradiations. The measurements were made at the Massachusetts General Hospital proton radiotherapy facility. The correction factors in water calorimetry were numerically calculated and various parameters affecting their magnitude and uncertainty were studied. The absorbed dose to water was compared to that obtained using an Exradin T1 Chamber based on the IAEA TRS-398 protocol. Results: The overall 1-sigma uncertainty on absorbed dose to water amounts to 0.4% and 0.6% in scattered and scanned proton water calorimetry, respectively. This compares to an overall uncertainty of 1.9% for the currently accepted IAEA TRS-398 reference absorbed dose measurement protocol. The absorbed dose from water calorimetry agrees well with the results from TRS-398, to within the 1-sigma uncertainty. Conclusions: This work demonstrates that a primary absorbed dose standard based on water calorimetry is feasible in scattered and scanned proton beams.
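
    The measurement principle behind the quoted uncertainty budget can be summarized by the generic water-calorimetry relation (a sketch of the standard formalism, not the paper's specific correction set):

```latex
% Generic water-calorimetry dose equation:
D_w \;=\; c_w \,\Delta T \,\prod_i k_i
```

    Here c_w is the specific heat capacity of water, ΔT the radiation-induced temperature rise (typically measured with thermistor probes), and the k_i are correction factors (e.g., for the heat defect, heat conduction and vessel perturbations) whose combined effect sets the 0.4-0.6% uncertainty quoted above.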

  14. Pattern recognition and data mining software based on artificial neural networks applied to proton transfer in aqueous environments

    OpenAIRE

    Tahat, Amani; Martí Rabassa, Jordi; Khwaldeh, Ali; Tahat, Kaher

    2014-01-01

    In computational physics, proton transfer phenomena can be viewed as pattern classification problems based on a set of input features that allow the proton motion to be classified into two categories: transfer 'occurred' and transfer 'not occurred'. The goal of this paper is to evaluate the use of artificial neural networks in the classification of proton transfer events, based on the feed-forward back-propagation neural network used as a classifier to distinguish between the two transfer cases. In t...

  15. Proton radiography to improve proton therapy treatment

    NARCIS (Netherlands)

    Takatsu, J.; van der Graaf, E. R.; Van Goethem, M. -J.; van Beuzekom, M.; Klaver, T.; Visser, J.; Brandenburg, S.; Biegun, A. K.

    2016-01-01

    The quality of cancer treatment with protons critically depends on an accurate prediction of the proton stopping powers for the tissues traversed by the protons. Today, treatment planning in proton radiotherapy is based on stopping power calculations from densities of X-ray Computed Tomography (CT)

  16. Unfiltered Monte Carlo-based tungsten anode spectral model from 20 to 640 kV

    Science.gov (United States)

    Hernandez, A. M.; Boone, John M.

    2014-03-01

    A Monte Carlo-based tungsten anode spectral model, conceptually similar to the previously-developed TASMIP model, was developed. This new model provides essentially unfiltered x-ray spectra with better energy resolution and significantly extends the range of tube potentials for available spectra. MCNPX was used to simulate x-ray spectra as a function of tube potential for a conventional x-ray tube configuration with several anode compositions. Thirty five x-ray spectra were simulated and used as the basis of interpolating a complete set of tungsten x-ray spectra (at 1 kV intervals) from 20 to 640 kV. Additionally, Rh and Mo anode x-ray spectra were simulated from 20 to 60 kV. Cubic splines were used to construct piecewise polynomials that interpolate the photon fluence per energy bin as a function of tube potential for each anode material. The tungsten anode spectral model using interpolating cubic splines (TASMICS) generates minimally-filtered (0.8 mm Be) x-ray spectra from 20 to 640 kV with 1 keV energy bins. The rhodium and molybdenum anode spectral models (RASMICS and MASMICS, respectively) generate minimally-filtered x-ray spectra from 20 to 60 kV with 1 keV energy bins. TASMICS spectra showed no statistically significant differences when compared with the empirical TASMIP model, the semi-empirical Birch and Marshall model, and a Monte Carlo spectrum reported in AAPM TG 195. The RASMICS and MASMICS spectra showed no statistically significant differences when compared with their counterpart RASMIP and MASMIP models. Spectra from the TASMICS, MASMICS, and RASMICS models are available in spreadsheet format for interested users.
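
    The interpolation strategy is straightforward to illustrate; the sketch below (with random stand-in data, not the TASMICS tables) builds one cubic spline per 1 keV energy bin across a sparse grid of simulated tube potentials and evaluates it at an intermediate kV:

```python
# Toy illustration of the spline-interpolation strategy (not the TASMICS data):
# cubic splines give the photon fluence in each 1 keV energy bin as a smooth
# function of tube potential, so spectra can be generated at arbitrary kV.
import numpy as np
from scipy.interpolate import CubicSpline

simulated_kv = np.array([20.0, 40.0, 80.0, 160.0, 320.0, 640.0])  # sparse MC grid (illustrative)
rng = np.random.default_rng(3)
# fluence_table[i, j]: photon fluence in energy bin j for tube potential simulated_kv[i]
fluence_table = rng.random((simulated_kv.size, 640))

spline = CubicSpline(simulated_kv, fluence_table, axis=0)   # one spline per energy bin
spectrum_93kv = np.clip(spline(93.0), 0.0, None)            # interpolated spectrum at 93 kV
print(spectrum_93kv.shape)
```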

  17. Importance of precise positioning for proton beam therapy in the base of skull and cervical spine.

    Science.gov (United States)

    Tatsuzaki, H; Urie, M M

    1991-08-01

    Using proton beam therapy, high doses have been delivered to chordomas and chondrosarcomas of the base of skull and cervical spine. Dose inhomogeneity to the tumors has been accepted in order to maintain normal tissue tolerances, and detailed attention to patient immobilization and to precise positioning has minimized the margins necessary to ensure these dose constraints. This study examined the contribution of precise positioning to the better dose localization achieved in these treatments. Three patients whose tumors represented different anatomic geometries were studied. Treatment plans were developed which treated as much of the tumor as possible to 74 Cobalt-Gray-Equivalent (CGE) while maintaining the central brain stem and central spinal cord at less than or equal to 48 CGE, the surface of the brain stem, surface of the spinal cord, and optic structures at less than or equal to 60 CGE, and the temporal lobes at less than or equal to 5% likelihood of complication using a biophysical model of normal tissue complication probability. Two positioning accuracies were assumed: 3 mm and 10 mm. Both proton beam plans and 10 MV X ray beam plans were developed with these assumptions and dose constraints. In all cases with the same positioning uncertainties, the proton beam plans delivered more dose to a larger percentage of the tumor volume and the estimated tumor control probability was higher than with the X ray plans. However, without precise positioning both the proton plans and the X ray plans deteriorated, with a 12% to 25% decrease in estimated tumor control probability. In all but one case, the difference between protons with good positioning and poor positioning was greater than the difference between protons and X rays, both with good positioning. Hence in treating these tumors, which are in close proximity to critical normal tissues, attention to immobilization and precise positioning is essential. With good positioning, proton beam therapy permits higher

  18. SU-E-J-147: Monte Carlo Study of the Precision and Accuracy of Proton CT Reconstructed Relative Stopping Power Maps

    International Nuclear Information System (INIS)

    Purpose: To quantify the intrinsic performance of proton computed tomography (pCT) as a modality for treatment planning in proton therapy. The performance of an ideal pCT scanner is studied as a function of various parameters. Methods: Using GATE/Geant4, we simulated an ideal pCT scanner and scans of several cylindrical phantoms with various tissue equivalent inserts of different sizes. Insert materials were selected in order to be of clinical relevance. Tomographic images were reconstructed using a filtered backprojection algorithm taking into account the scattering of protons within the phantom. To quantify the performance of the ideal pCT scanner, we study the precision and the accuracy with respect to the theoretical relative stopping power ratio (RSP) values for different beam energies, imaging doses, insert sizes and detector positions. The planning range uncertainty resulting from the reconstructed RSP is also assessed by comparison with the range of the protons in the analytically simulated phantoms. Results: The results indicate that pCT can intrinsically achieve RSP resolution below 1%, for most examined tissues at beam energies below 300 MeV and for imaging doses around 1 mGy. An RSP map accuracy of better than 0.5% is observed for most tissue types within the studied dose range (0.2–1.5 mGy). Finally, the uncertainty in the proton range due to the accuracy of the reconstructed RSP map is well below 1%. Conclusion: This work explores the intrinsic performance of pCT as an imaging modality for proton treatment planning. The obtained results show that under ideal conditions, 3D RSP maps can be reconstructed with an accuracy better than 1%. Hence, pCT is a promising candidate for reducing the range uncertainties introduced by the use of X-ray CT along with a semiempirical calibration to RSP. Supported by the DFG Cluster of Excellence Munich-Centre for Advanced Photonics (MAP)
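
    For context, the reconstructed quantity and its link to range prediction follow the standard definitions (not restated in the abstract):

```latex
% Relative stopping power and water-equivalent path length (standard definitions):
\mathrm{RSP}(\mathbf{r}) \;=\; \frac{S_{\mathrm{medium}}(\mathbf{r},E)}{S_{\mathrm{water}}(E)},
\qquad
\mathrm{WEPL} \;=\; \int_{\mathrm{path}} \mathrm{RSP}(\mathbf{r})\,\mathrm{d}\ell
```

    A sub-1% error in the reconstructed RSP map therefore translates directly into a sub-1% water-equivalent range error, consistent with the result reported above.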

  19. A collision history-based approach to sensitivity/perturbation calculations in the continuous energy Monte Carlo code SERPENT

    International Nuclear Information System (INIS)

    Highlights: • We present a new Monte Carlo method to perform sensitivity/perturbation calculations. • Sensitivity of keff, reaction rates, point kinetics parameters to nuclear data. • Fully continuous implicitly constrained Monte Carlo sensitivities to scattering distributions. • Implementation of the method in the continuous energy Monte Carlo code SERPENT. • Verification against ERANOS and TSUNAMI generalized perturbation theory results. - Abstract: In this work, the implementation of a collision history-based approach to sensitivity/perturbation calculations in the Monte Carlo code SERPENT is discussed. The proposed methods allow the calculation of the effects of nuclear data perturbation on several response functions: the effective multiplication factor, reaction rate ratios and bilinear ratios (e.g., effective kinetics parameters). SERPENT results are compared to ERANOS and TSUNAMI Generalized Perturbation Theory calculations for two fast metallic systems and for a PWR pin-cell benchmark. New methods for the calculation of sensitivities to angular scattering distributions are also presented, which adopt fully continuous (in energy and angle) Monte Carlo estimators

  20. MRI-Based Computed Tomography Metal Artifact Correction Method for Improving Proton Range Calculation Accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Park, Peter C. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Schreibmann, Eduard; Roper, Justin; Elder, Eric; Crocker, Ian [Department of Radiation Oncology, Winship Cancer Institute of Emory University, Atlanta, Georgia (United States); Fox, Tim [Varian Medical Systems, Palo Alto, California (United States); Zhu, X. Ronald [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Dong, Lei [Scripps Proton Therapy Center, San Diego, California (United States); Dhabaan, Anees, E-mail: anees.dhabaan@emory.edu [Department of Radiation Oncology, Winship Cancer Institute of Emory University, Atlanta, Georgia (United States)

    2015-03-15

    Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU number) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensity of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On respective simulated brain and head and neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.

  2. Treatment of the Shrodinger equation through a Monte Carlo method based upon the generalized Feynman-Kac formula

    International Nuclear Information System (INIS)

    We present a new Monte Carlo method based upon the theoretical proposal of Claverie and Soto. By contrast with other Quantum Monte Carlo methods used so far, the present approach uses a pure diffusion process without any branching. The many-fermion problem (with the specific constraint due to the Pauli principle) receives a natural solution in the framework of this method: in particular, there is neither the fixed-node approximation nor the nodal release problem which occur in other approaches (see, e.g., Ref. 8 for a recent account). We give some numerical results concerning simple systems in order to illustrate the numerical feasibility of the proposed algorithm

  3. Development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport

    CERN Document Server

    Jia, Xun; Sempau, Josep; Choi, Dongju; Majumdar, Amitava; Jiang, Steve B

    2009-01-01

    Monte Carlo simulation is the most accurate method for absorbed dose calculations in radiotherapy. Its efficiency still requires improvement for routine clinical applications, especially for online adaptive radiotherapy. In this paper, we report our recent development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport. We have implemented the Dose Planning Method (DPM) Monte Carlo dose calculation package (Sempau et al, Phys. Med. Biol., 45(2000)2263-2291) on the GPU architecture under the CUDA platform. The implementation has been tested with respect to the original sequential DPM code on CPU in two cases. Our results demonstrate the adequate accuracy of the GPU implementation for both electron and photon beams in the radiotherapy energy range. Speed-up factors of 4.5 and 5.5 have been observed for the electron and photon test cases, respectively, using an NVIDIA Tesla C1060 GPU card against a 2.27 GHz Intel Xeon CPU processor.

  4. A Mechanism-Based Approach to Predict the Relative Biological Effectiveness of Protons and Carbon Ions in Radiation Therapy

    International Nuclear Information System (INIS)

    Purpose: The physical and potential biological advantages of proton and carbon ions have not been fully exploited in radiation therapy for the treatment of cancer. In this work, an approach to predict proton and carbon ion relative biological effectiveness (RBE) in a representative spread-out Bragg peak (SOBP) is derived using the repair-misrepair-fixation (RMF) model. Methods and Materials: Formulas linking dose-averaged linear-quadratic parameters to DSB induction and processing are derived from the RMF model. The Monte Carlo Damage Simulation (MCDS) software is used to quantify the effects of radiation quality on the induction of DNA double-strand breaks (DSB). Trends in parameters α and β for clinically relevant proton and carbon ion kinetic energies are determined. Results: Proton and carbon ion RBE are shown to increase as particle energy, dose, and tissue α/β ratios decrease. Entrance RBE is ∼1.0 and ∼1.3 for protons and carbon ions, respectively. For doses in the range of 0.5 to 10 Gy, proton RBE ranges from 1.02 (proximal edge) to 1.4 (distal edge). Over the same dose range, the RBE for carbon ions ranges from 1.5 on the proximal edge to 6.7 on the distal edge. Conclusions: The proposed approach is advantageous because the RBE for clinically relevant particle distributions is guided by well-established physical and biological (track structure) considerations. The use of an independently tested Monte Carlo model to predict the effects of radiation quality on DSB induction also minimizes the number of ad hoc biological parameters that must be determined to predict RBE. Large variations in predicted RBE across an SOBP may produce undesirable biological hot and cold spots. These results highlight the potential for the optimization of physical dose for a uniform biological effect.
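
    The dose and α/β dependence of the quoted RBE values follows from the general linear-quadratic iso-effect relation (shown here for orientation; the RMF-specific expressions linking α and β to DSB yields are not reproduced):

```latex
% LQ iso-effect definition of RBE and its low-/high-dose limits:
\alpha_{x} D_{x} + \beta_{x} D_{x}^{2} \;=\; \alpha_{\mathrm{ion}} D + \beta_{\mathrm{ion}} D^{2},
\qquad
\mathrm{RBE}(D) = \frac{D_{x}}{D},
\qquad
\mathrm{RBE}_{\max} = \frac{\alpha_{\mathrm{ion}}}{\alpha_{x}},
\quad
\mathrm{RBE}_{\min} = \sqrt{\frac{\beta_{\mathrm{ion}}}{\beta_{x}}}
```

    This makes explicit why the RBE rises as the dose D and the reference α_x/β_x ratio decrease.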

  5. Radiosurgery with photons or protons for benign and malignant tumours of the skull base: a review

    Directory of Open Access Journals (Sweden)

    Amichetti Maurizio

    2012-12-01

    Stereotactic radiosurgery (SRS) is an important treatment option for intracranial lesions. Many studies have shown the effectiveness of photon-SRS for the treatment of skull base (SB) tumours; however, limited data are available for proton-SRS. Several photon-SRS techniques, including Gamma Knife, modified linear accelerators (Linac) and CyberKnife, have been developed, and several studies have compared treatment plan characteristics between protons and photons. The principles of classical radiobiology are similar for protons and photons even though they differ in terms of physical properties and interaction with matter, resulting in different dose distributions. Protons have special characteristics that allow normal tissues to be spared better than with photons, although their potential clinical superiority remains to be demonstrated. A critical analysis of the fundamental radiobiological principles, dosimetric characteristics, clinical results, and toxicity of proton- and photon-SRS for SB tumours is provided and discussed in an attempt to define the advantages and limits of each radiosurgical technique.

  6. Studies on PVA based nanocomposite Proton Exchange Membrane for Direct methanol fuel cell (DMFC) applications

    Science.gov (United States)

    Bahavan Palani, P.; Kannan, R.; Rajashabala, S.; Rajendran, S.; Velraj, G.

    2015-02-01

    Proton exchange membranes (PEMs) based on different concentrations of poly(vinyl alcohol)/montmorillonite (PVA/MMT) have been prepared by the solution casting method. The structural and electrical properties of these composite membranes have been characterized using X-ray diffraction (XRD), Fourier transform infrared (FTIR) spectroscopy and AC impedance spectroscopy. The conductivity of the PEMs has been estimated for the different concentrations of MMT. Water and methanol uptake measurements were also performed for the prepared PEMs and are presented. The proton conductivity studies were carried out at room temperature at 100% humidity.
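
    The proton conductivity extracted from the AC impedance data is normally obtained from the standard relation (a generic formula, not specific to this study):

```latex
% Conductivity from impedance data:
\sigma \;=\; \frac{L}{R_{b}\,A}
```

    Here L is the membrane thickness, A the electrode contact area, and R_b the bulk (ohmic) resistance read from the impedance spectrum.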

  7. Catalyst Degradation in High Temperature Proton Exchange Membrane Fuel Cells Based on Acid Doped Polybenzimidazole Membranes

    DEFF Research Database (Denmark)

    Cleemann, Lars Nilausen; Buazar, F.; Li, Qingfeng; Jensen, Jens Oluf; Pan, Chao; Steenberg, T.; Dai, S.; Bjerrum, Niels J.

    2013-01-01

    Degradation of carbon supported platinum catalysts is a major failure mode for the long term durability of high temperature proton exchange membrane fuel cells based on phosphoric acid doped polybenzimidazole membranes. With Vulcan carbon black as a reference, thermally treated carbon black and...

  8. An UV photochromic memory effect in proton-based WO3 electrochromic devices

    International Nuclear Information System (INIS)

    We report a UV photochromic memory effect in a standard proton-based WO3 electrochromic device. It exhibits two memory states, associated with the colored and bleached states of the device, respectively. Such an effect can be used to enhance device performance (increasing the dynamic range), re-energize commercial electrochromic devices, and develop memory devices.

  9. Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT

    Directory of Open Access Journals (Sweden)

    Weinmann Martin

    2009-12-01

    Background: The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods: An fsPB and an MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase static CT, density overwrite one phase static CT, average CT of the same patient). Both 6 and 15 MV beam energies were used. The resulting treatment plans were compared by how well they fulfilled the prescribed optimization constraints both for the dose distributions calculated on the static patient models and for the accumulated dose, recalculated with MC on each of 8 CTs of a 4DCT set. Results: In the phantom measurements, the MC dose engine showed discrepancies Conclusions: It is feasible to employ the MC dose engine for optimization of lung IMSRT and the plans are superior to fsPB. Use of static patient models introduces a bias in the MC dose distribution compared to the 4D MC recalculated dose, but this bias is predictable and therefore MC based optimization on static patient models is considered safe.

  10. A new method for RGB to CIELAB color space transformation based on Markov chain Monte Carlo

    Science.gov (United States)

    Chen, Yajun; Liu, Ding; Liang, Junli

    2013-10-01

    In printing quality inspection, the assessment of color error is an important task. However, the RGB color space is device-dependent, so RGB colors captured by a CCD camera must usually be transformed into the CIELAB color space, which is perceptually uniform and device-independent. To cope with this problem, a Markov chain Monte Carlo (MCMC) based algorithm for the RGB to CIELAB color space transformation is proposed in this paper. First, modeling color targets and testing color targets are established, used respectively in the modeling and performance testing processes. Second, we derive a Bayesian model for estimating the coefficients of a polynomial that describes the relation between the RGB and CIELAB color spaces. Third, a Markov chain is set up based on the Gibbs sampling algorithm (one of the MCMC algorithms) to estimate the polynomial coefficients. Finally, the color difference of the testing color targets is computed to evaluate the performance of the proposed method. The experimental results show that the nonlinear polynomial regression based on the MCMC algorithm is effective; its performance is similar to that of the least-squares approach, and it can accurately model the RGB to CIELAB color space conversion and support color error evaluation for a printing quality inspection system.
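
    As a hedged sketch of the idea (the paper uses Gibbs sampling on a Bayesian polynomial model; the toy code below substitutes a simpler random-walk Metropolis sampler, synthetic data and a single CIELAB channel):

```python
# Toy MCMC polynomial colour-mapping sketch: estimate coefficients of a
# second-order polynomial from RGB features to one CIELAB channel using
# random-walk Metropolis (illustrative; not the paper's Gibbs sampler).
import numpy as np

rng = np.random.default_rng(4)

def features(rgb):
    """Second-order polynomial features of an (N, 3) array of RGB values."""
    r, g, b = rgb.T
    return np.column_stack([np.ones_like(r), r, g, b,
                            r * g, r * b, g * b, r * r, g * g, b * b])

# Synthetic "colour target" data: noisy L* values from known coefficients.
rgb = rng.random((200, 3))
X = features(rgb)
beta_true = rng.normal(0.0, 1.0, X.shape[1])
y = X @ beta_true + rng.normal(0.0, 0.05, X.shape[0])

def log_posterior(beta, noise_std=0.05, prior_std=10.0):
    resid = y - X @ beta
    return (-0.5 * np.sum((resid / noise_std) ** 2)
            - 0.5 * np.sum((beta / prior_std) ** 2))

beta = np.linalg.lstsq(X, y, rcond=None)[0]      # start near the posterior mode
logp = log_posterior(beta)
samples = []
for it in range(20000):
    proposal = beta + rng.normal(0.0, 0.005, beta.size)
    logp_prop = log_posterior(proposal)
    if np.log(rng.random()) < logp_prop - logp:  # Metropolis accept/reject
        beta, logp = proposal, logp_prop
    if it >= 5000:                               # discard burn-in
        samples.append(beta.copy())

beta_hat = np.mean(samples, axis=0)
print(np.round(beta_hat - beta_true, 3))         # posterior mean vs. true coefficients
```

    In the actual method the polynomial maps to all three CIELAB channels and the posterior is sampled with Gibbs updates, but the accept/reject structure above conveys the Monte Carlo estimation step.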

  11. Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation

    Science.gov (United States)

    Ziegenhein, Peter; Pirner, Sven; Kamerling, Cornelis Ph; Oelfke, Uwe

    2015-08-01

    Monte-Carlo (MC) simulations are considered to be the most accurate method for calculating dose distributions in radiotherapy. Their clinical application, however, is still limited by the long runtimes that conventional implementations of MC algorithms require to deliver sufficiently accurate results on high-resolution imaging data. In order to overcome this obstacle we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time-frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well verified dose planning method (DPM). We could demonstrate that PhiMC delivers dose distributions which are in excellent agreement with DPM. The multi-core implementation of PhiMC scales well between different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, we could show that our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on a NVIDIA Tesla C2050. Since CPUs can address several hundreds of GB of RAM, the typical GPU memory limitation does not apply to our implementation, and high-resolution clinical plans can be calculated.

  12. Proton linac for hospital-based fast neutron therapy and radioisotope production

    International Nuclear Information System (INIS)

    Recent developments in linac technology have led to the design of a hospital-based proton linac for fast neutron therapy. The 180 microamp average current allows the beam to be diverted for radioisotope production during treatments while maintaining an acceptable dose rate. During dedicated operation, dose rates greater than 280 neutron rads per minute are achievable at the depth of maximum dose, DMAX = 1.6 cm, with a source-to-axis distance SAD = 190 cm. The maximum machine energy is 70 MeV and several intermediate energies are available for optimizing production of isotopes for Positron Emission Tomography and other medical applications. The linac can be used to produce a horizontal beam, or a gantry can be added to the downstream end of the linac for conventional patient positioning. The 70 MeV protons can also be used for proton therapy for ocular melanomas. 17 refs., 1 fig., 1 tab

  13. A research plan based on high intensity proton accelerator Neutron Science Research Center

    Energy Technology Data Exchange (ETDEWEB)

    Mizumoto, Motoharu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    A plan called the Neutron Science Research Center (NSRC) has been proposed at JAERI. The center is a complex of research facilities based on a proton linac with an energy of 1.5 GeV and an average current of 10 mA. The research facilities will consist of a Thermal/Cold Neutron Facility, Neutron Irradiation Facility, Neutron Physics Facility, OMEGA/Nuclear Energy Facility, Spallation RI Beam Facility, Meson/Muon Facility and Medium Energy Experiment Facility, where the high-intensity proton beam and secondary particle beams such as neutron, pion, muon and unstable radioisotope (RI) beams generated from the proton beam will be utilized for innovative research in the fields of nuclear engineering and basic sciences. (author)

  14. Ultrafast cone-beam CT scatter correction with GPU-based Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Yuan Xu

    2014-03-01

    Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to finish the whole process, including both scatter correction and reconstruction, automatically within 30 seconds. Methods: The method consists of six steps: 1) FDK reconstruction using raw projection data; 2) rigid registration of the planning CT to the FDK results; 3) MC scatter calculation at sparse view angles using the planning CT; 4) interpolation of the calculated scatter signals to other angles; 5) removal of scatter from the raw projections; 6) FDK reconstruction using the scatter-corrected projections. In addition to using the GPU to accelerate MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce computation time. A novel denoising algorithm is used to eliminate MC noise from the simulated scatter images caused by low photon numbers. The method is validated on one simulated head-and-neck case with 364 projection angles. Results: We have examined the variation of the scatter signal among projection angles using Fourier analysis. It is found that scatter images at 31 angles are sufficient to restore those at all angles with < 0.1% error. For the simulated patient case with a resolution of 512 × 512 × 100, we simulated 5 × 10^6 photons per angle. The total computation time is 20.52 seconds on an Nvidia GTX Titan GPU, and the time at each step is 2.53, 0.64, 14.78, 0.13, 0.19, and 2.25 seconds, respectively. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. It accomplishes the whole procedure of scatter correction and reconstruction within 30 seconds.
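
    Step 4 of the workflow (angular interpolation of the sparse-angle MC scatter estimates) is simple enough to sketch; the array sizes and the smooth toy scatter signal below are illustrative assumptions:

```python
# Sketch of sparse-angle scatter interpolation (illustrative sizes and data):
# MC scatter is computed at ~31 of the 364 projection angles and interpolated,
# pixel by pixel, to all angles before subtraction from the raw projections.
import numpy as np

n_angles, n_u, n_v = 364, 64, 48
angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
sparse_idx = np.linspace(0, n_angles - 1, 31).astype(int)

rng = np.random.default_rng(5)
pixel_pattern = rng.random((n_u, n_v))
# Toy scatter: a slowly varying angular modulation of a fixed pixel pattern.
scatter_sparse = (1.0 + 0.3 * np.sin(angles[sparse_idx]))[:, None, None] * pixel_pattern

scatter_full = np.empty((n_angles, n_u, n_v))
for i in range(n_u):
    for j in range(n_v):
        scatter_full[:, i, j] = np.interp(angles, angles[sparse_idx],
                                          scatter_sparse[:, i, j],
                                          period=2.0 * np.pi)

# The corrected projections (raw minus scatter_full) would then go to FDK reconstruction.
print(scatter_full.shape)
```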

  15. Proton radiation therapy for chordomas and chondrosarcomas of the skull base.

    Science.gov (United States)

    Hug, E B; Slater, J D

    2000-10-01

    Most patients with conventional radiotherapy after surgery die with local disease progression. The superior local tumor control and overall survival achieved with fractionated proton RT can be attributed to improved dose localization characteristics of protons, resulting in higher doses delivered. Patients with base of skull neoplasms are increasingly considered for stereotactic radiosurgery. Recently, Muthukumar et al reported for the University of Pittsburgh group on cobalt-60 Gamma Knife (Elekta Instruments, Atlanta, GA) therapy for 15 patients with chordomas or chondrosarcomas of the base of the skull. With tumor volumes ranging between 0.98 and 10.3 mL (mean, 4.6 mL), doses to the tumor margin varying from 12 to 20 Gy (median, 18 Gy) were delivered. Two patients were treated without histologic tumor confirmation. After a median follow-up time of 40 months, 2 patients had died of disease, 2 patients had succumbed to intercurrent disease, and 1 patient surviving at the time of analysis had developed tumor progression. Neither actuarial local control nor actuarial survival data were presented. In the LLUMC series, most tumors exceeded sizes reportedly suitable for radiosurgery or were of a highly irregular configuration. Nevertheless, in 11 patients, tumors less than 15 mL in size remained locally controlled as did tumors sized between 15 and 25 mL in 11 additional patients; these patients were thus potential candidates for stereotactic radiosurgery. At present, too few reports on radiosurgery contain sufficient patient numbers and statistical analyses to permit one to draw conclusions about the feasibility of radiosurgery for chordomas and chondrosarcomas of the base of the skull. A principal difference between proton RT and radiosurgery as currently practiced in most centers concerns target definition. In proton RT, the GTV is treated. In addition, a clinical volume is defined, which is distinctly different from the GTV in size and shape, to include the

  16. Development of CAD-Based Geometry Processing Module for a Monte Carlo Particle Transport Analysis Code

    International Nuclear Information System (INIS)

    Monte Carlo (MC) particle transport analysis for a complex system such as a research reactor, accelerator, or fusion facility may require accurate modeling of complicated geometry. Manual modeling through the text interface of an MC code to define the geometrical objects is tedious, lengthy and error-prone. This problem can be overcome by taking advantage of the modeling capability of a computer aided design (CAD) system. There have been two kinds of approaches to developing MC code systems that utilize CAD data: external format conversion and CAD-kernel-embedded MC simulation. The first approach includes several interfacing programs such as McCAD, MCAM and GEOMIT, which were developed to automatically convert CAD data into MCNP geometry input data. This approach makes the most of existing MC codes without any modifications, but implies latent data inconsistency due to the difference between the geometry modeling systems. In the second approach, an MC code utilizes the CAD data for direct particle tracking or for conversion to an internal data structure of constructive solid geometry (CSG) and/or boundary representation (B-rep) modeling with the help of a CAD kernel. MCNP-BRL and OiNC have demonstrated their capabilities for CAD-based MC simulations. Recently, we have developed a CAD-based geometry processing module for MC particle simulation using the OpenCASCADE (OCC) library. In the developed module, CAD data can be used for particle tracking through primitive CAD surfaces (hereafter, CAD-based tracking) or for internal conversion to the CSG data structure. In this paper, the performances of the text-based model, the CAD-based tracking, and the internal CSG conversion are compared using an in-house MC code, McSIM, equipped with the developed CAD-based geometry processing module

  17. Proton therapy physics

    CERN Document Server

    2012-01-01

    Proton Therapy Physics goes beyond current books on proton therapy to provide an in-depth overview of the physics aspects of this radiation therapy modality, eliminating the need to dig through information scattered in the medical physics literature. After tracing the history of proton therapy, the book summarizes the atomic and nuclear physics background necessary for understanding proton interactions with tissue. It describes the physics of proton accelerators, the parameters of clinical proton beams, and the mechanisms to generate a conformal dose distribution in a patient. The text then covers detector systems and measuring techniques for reference dosimetry, outlines basic quality assurance and commissioning guidelines, and gives examples of Monte Carlo simulations in proton therapy. The book moves on to discussions of treatment planning for single- and multiple-field uniform doses, dose calculation concepts and algorithms, and precision and uncertainties for nonmoving and moving targets. It also exami...

  18. Monte Carlo based investigation of Berry phase for depth resolved characterization of biomedical scattering samples

    Energy Technology Data Exchange (ETDEWEB)

    Baba, Justin S [ORNL; John, Dwayne O [ORNL; Koju, Vijay [ORNL

    2015-01-01

    The propagation of light in turbid media is an active area of research with relevance to numerous investigational fields, e.g., biomedical diagnostics and therapeutics. The statistical random-walk nature of photon propagation through turbid media is ideal for computation-based modeling and simulation. Ready access to supercomputing resources provides a means for attaining brute-force solutions to stochastic light-matter interactions entailing scattering, by facilitating timely propagation of a sufficient number (>10 million) of photons while tracking characteristic parameters based on the incorporated physics of the problem. One such model that works well for isotropic but fails for anisotropic scatter, which is the case for many biomedical sample scattering problems, is the diffusion approximation. In this report, we address this by utilizing Berry phase (BP) evolution as a means for capturing anisotropic scattering characteristics of samples in the preceding depth where the diffusion approximation fails. We extend the polarization sensitive Monte Carlo method of Ramella-Roman, et al.,1 to include the computationally intensive tracking of photon trajectory in addition to polarization state at every scattering event. To speed up the computations, which entail the appropriate rotations of reference frames, the code was parallelized using OpenMP. The results presented reveal that BP is strongly correlated to the photon penetration depth, thus potentiating the possibility of polarimetric depth resolved characterization of highly scattering samples, e.g., biological tissues.

  19. Monte Carlo mesh tallies based on a Kernel Density Estimator approach using integrated particle tracks

    International Nuclear Information System (INIS)

    A new Monte Carlo mesh tally based on a Kernel Density Estimator (KDE) approach using integrated particle tracks is presented. We first derive the KDE integral-track estimator and present a brief overview of its implementation as an alternative to the MCNP fmesh tally. To facilitate a valid quantitative comparison between these two tallies for verification purposes, there are two key issues that must be addressed. The first of these issues involves selecting a good data transfer method to convert the nodal-based KDE results into their cell-averaged equivalents (or vice versa with the cell-averaged MCNP results). The second involves choosing an appropriate resolution of the mesh, since if it is too coarse this can introduce significant errors into the reference MCNP solution. After discussing both of these issues in some detail, we present the results of a convergence analysis that shows the KDE integral-track and MCNP fmesh tallies are indeed capable of producing equivalent results for some simple 3D transport problems. In all cases considered, there was clear convergence from the KDE results to the reference MCNP results as the number of particle histories was increased. (authors)
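
    For orientation, the underlying estimator is the standard kernel density estimate; in the integral-track variant the kernel is integrated along each particle track rather than evaluated only at discrete collision sites (generic form below, not the authors' exact notation):

```latex
% Generic (point-sample) kernel density estimator behind the mesh tally:
\hat{f}(\mathbf{x}) \;=\; \frac{1}{N}\sum_{i=1}^{N} w_{i}\,
\frac{1}{h^{3}}\, K\!\left(\frac{\mathbf{x}-\mathbf{x}_{i}}{h}\right)
```

    Here N is the number of histories, w_i the particle weights, h the bandwidth and K the kernel; the integral-track estimator replaces the point evaluation by an integral of K along each track segment.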

  1. The development of GPU-based parallel PRNG for Monte Carlo applications in CUDA Fortran

    Science.gov (United States)

    Kargaran, Hamed; Minuchehr, Abdolhamid; Zolfaghari, Ahmad

    2016-04-01

    The implementation of Monte Carlo simulation in CUDA Fortran requires fast random number generation with good statistical properties on the GPU. In this study, a GPU-based parallel pseudo random number generator (GPPRNG) has been proposed for use in high performance computing systems. According to the type of GPU memory usage, the GPU scheme is divided into two work modes, GLOBAL_MODE and SHARED_MODE. To generate parallel random numbers based on the independent sequence method, a combination of the middle-square method and a chaotic map, along with the Xorshift PRNG, has been employed. Implementation of our developed PPRNG on a single GPU showed speedups of 150x and 470x (with respect to the speed of the PRNG on a single CPU core) for GLOBAL_MODE and SHARED_MODE, respectively. To evaluate the accuracy of our developed GPPRNG, its performance was compared to that of some other commercially available PPRNGs such as MATLAB, FORTRAN and the Miller-Park algorithm through specific standard tests. The results of this comparison showed that the GPPRNG developed in this study can be used as a fast and accurate tool for computational science applications.
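
    A minimal 64-bit Xorshift generator of the kind combined in the GPPRNG can be sketched as below; the 13/7/17 shift triple is one of Marsaglia's published choices, and the paper's middle-square/chaotic-map seeding scheme is not reproduced:

```python
# Minimal 64-bit Xorshift generator (Marsaglia shift triple 13/7/17);
# illustrative only, not the GPPRNG implementation described above.
MASK64 = (1 << 64) - 1

def xorshift64(state):
    """Return (next_state, uniform float in [0, 1))."""
    x = state & MASK64
    x ^= (x << 13) & MASK64
    x ^= x >> 7
    x ^= (x << 17) & MASK64
    return x, x / 2.0**64

state = 0x9E3779B97F4A7C15          # arbitrary nonzero seed
samples = []
for _ in range(5):
    state, u = xorshift64(state)
    samples.append(u)
print(samples)
```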

  2. Development of an unstructured mesh based geometry model in the Serpent 2 Monte Carlo code

    International Nuclear Information System (INIS)

    This paper presents a new unstructured mesh based geometry type, developed in the Serpent 2 Monte Carlo code as a by-product of another study related to multi-physics applications and coupling to CFD codes. The new geometry type is intended for the modeling of complicated and irregular objects, which are not easily constructed using the conventional CSG based approach. The capability is put to test by modeling the 'Stanford Critical Bunny' – a variation of a well-known 3D test case for methods used in the world of computer graphics. The results show that the geometry routine in Serpent 2 can handle the unstructured mesh, and that the use of delta-tracking results in a considerable reduction in the overall calculation time as the geometry is refined. The methodology is still very much under development, with the final goal of implementing a geometry routine capable of reading standardized geometry formats used by 3D design and imaging tools in industry and medical physics. (author)

  3. Patient-specific stopping power calibration for proton therapy planning based on single-detector proton radiography

    International Nuclear Information System (INIS)

    A simple robust optimizer has been developed that can produce patient-specific calibration curves to convert x-ray computed tomography (CT) numbers to relative stopping powers (HU-RSP) for proton therapy treatment planning. The difference between a digitally reconstructed radiograph water-equivalent path length (DRRWEPL) map through the x-ray CT dataset and a proton radiograph (set as the ground truth) is minimized by optimizing the HU-RSP calibration curve. The function of the optimizer is validated with noise-free synthetic datasets, and its robustness against CT noise is demonstrated. Application of the procedure is then demonstrated on a plastic and a real tissue phantom, with proton radiographs produced using a single detector. The mean errors between the DRRWEPL map and the proton radiograph using the generic/optimized calibration curves were 1.8%/0.4% for the plastic phantom and −2.1%/−0.2% for the real tissue phantom. It was then demonstrated that these optimized calibration curves offer a better prediction of the water-equivalent path length at a therapeutic depth. We believe these promising results suggest that a single proton radiograph could be used to generate a patient-specific calibration curve as part of the current proton treatment planning workflow. (paper)
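
    The core of such a calibration-curve optimization can be summarized in a short sketch: adjust the RSP values at a few HU control points so that CT-derived water-equivalent path lengths match the proton radiograph. The Python snippet below is only an illustration with synthetic arrays; the curve parameterization and the data are assumptions, not the published optimizer.

        # Hedged sketch: fit HU->RSP control points so DRR-style WEPLs match a measured WEPL map.
        import numpy as np
        from scipy.optimize import minimize

        hu_knots = np.array([-1000.0, 0.0, 1000.0, 2000.0])        # HU control points (assumed)
        ct_paths = np.random.randint(-100, 1500, size=(50, 200))    # HU values along 50 proton paths (synthetic)
        voxel_mm = 1.0
        wepl_measured = np.random.uniform(50.0, 250.0, size=50)     # proton-radiograph WEPLs (synthetic)

        def wepl_from_curve(rsp_knots):
            """Water-equivalent path length per path: interpolated RSP summed along the path."""
            rsp = np.interp(ct_paths, hu_knots, rsp_knots)
            return rsp.sum(axis=1) * voxel_mm

        def objective(rsp_knots):
            return np.mean((wepl_from_curve(rsp_knots) - wepl_measured) ** 2)

        generic_curve = np.array([0.0, 1.0, 1.5, 2.0])               # generic starting calibration
        result = minimize(objective, generic_curve, method="Nelder-Mead")
        print("optimized RSP at the HU control points:", result.x)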

  4. TH-E-BRE-08: GPU-Monte Carlo Based Fast IMRT Plan Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Li, Y; Tian, Z; Shi, F; Jiang, S; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-15

    Purpose: Intensity-modulated radiation treatment (IMRT) plan optimization needs pre-calculated beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computation speed. However, inaccurate beamlet dose distributions, particularly in cases with high levels of inhomogeneity, may mislead the optimization and hinder the resulting plan quality. It is desirable to use Monte Carlo (MC) methods for beamlet dose calculations, yet the long computational time from repeated dose calculations for a large number of beamlets prevents this application. Our objective is to integrate a GPU-based MC dose engine in lung IMRT optimization using a novel two-step workflow. Methods: A GPU-based MC code, gDPM, is used. Each particle is tagged with the index of the beamlet from which the source particle originates. Deposited dose is stored separately for each beamlet based on this index. Due to the limited GPU memory size, a pyramid space is allocated for each beamlet, and dose outside this space is neglected. A two-step optimization workflow is proposed for fast MC-based optimization. In the first step, rough beamlet dose calculations are conducted with only a small number of particles per beamlet. Plan optimization then follows to obtain an approximate fluence map. In the second step, more accurate beamlet doses are calculated, where the number of particles sampled for a beamlet is proportional to the intensity determined previously. A second round of optimization is conducted, yielding the final result. Results: For a lung case with 5317 beamlets, 10^5 particles per beamlet in the first round and 10^8 particles per beam in the second round are enough to obtain good plan quality. The total simulation time is 96.4 sec. Conclusion: A fast GPU-based MC dose calculation method along with a novel two-step optimization workflow has been developed. The high efficiency allows the use of MC for IMRT optimization.

  5. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Highlights: • The newly developed CAD-based Monte Carlo program SuperMC for integrated simulation of nuclear systems makes use of hybrid MC-deterministic methods and advanced computer technologies. SuperMC is designed to perform transport calculations of various types of particles; depletion and activation calculations including isotope burn-up, material activation and shutdown dose; and multi-physics coupling calculations including thermo-hydraulics, fuel performance and structural mechanics. Bi-directional automatic conversion between general CAD models and calculation models with their physical settings is well supported. Results and the simulation process can be visualized with dynamic 3D datasets and geometry models. Continuous-energy cross section, burnup, activation, irradiation damage and material data are used to support the multi-process simulation. An advanced cloud computing framework makes computation- and storage-intensive simulations available as a network service to support design optimization and assessment. The modular design and generic interfaces promote flexible manipulation and coupling of external solvers. • The newly developed and incorporated advanced methods in SuperMC are introduced, including the hybrid MC-deterministic transport method, particle physical interaction treatment, multi-physics coupling calculation, automatic geometry modeling and processing, intelligent data analysis and visualization, elastic cloud computing technology and parallel calculation. • The functions of SuperMC 2.1, integrating automatic modeling, neutron and photon transport calculation, and results and process visualization, are introduced. It has been validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model. - Abstract: The Monte Carlo (MC) method has distinct advantages for simulating complicated nuclear systems and is envisioned as a routine

  6. Measurement of the proton light response of various LAB based scintillators and its implication for supernova neutrino detection via neutrino-proton scattering

    CERN Document Server

    von Krosigk, B; Nolte, R; Röttger, S; Zuber, K

    2013-01-01

    The proton light output function in electron-equivalent energy of various scintillators based on linear alkylbenzene (LAB) has been measured in the energy range from 1 MeV to 17.15 MeV for the first time. The measurement was performed at the Physikalisch-Technische Bundesanstalt (PTB) using a neutron beam with a continuous energy distribution. The proton light output data are extracted from proton recoil spectra originating from neutron-proton scattering in the scintillator. The functional behavior of the proton light output is described successfully by Birks' law with a Birks constant kB between (0.0094 +/- 0.0002) cm/MeV and (0.0098 +/- 0.0003) cm/MeV for the different LAB solutions. The constant C, parameterizing the quadratic term in the generalized Birks law, is consistent with zero for all investigated scintillators, with an upper limit (95% CL) of about 10^-7 cm^2/MeV^2. The resulting quenching factors are especially important for future planned supernova neutrino detection based on the elastic scattering...
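
    The generalized Birks relation quoted above can be written compactly: the light output follows from integrating dL/dE = S / (1 + kB*(dE/dx) + C*(dE/dx)^2) over the proton energy. The Python sketch below uses the reported kB and C values but a crude power-law stand-in for the proton stopping power, so it is illustrative only.

        # Hedged sketch of Birks-law quenching; the stopping-power curve is a rough placeholder.
        import numpy as np

        kB = 0.0096    # cm/MeV, mid-range of the values reported above
        C = 0.0        # cm^2/MeV^2, consistent with zero per the abstract
        S = 1.0        # arbitrary scintillation efficiency

        energy = np.linspace(0.1, 17.15, 500)            # MeV
        dEdx = 260.0 * energy ** (-0.8)                  # MeV/cm, crude power-law stand-in

        def light_output(E, stopping_power):
            """Cumulative integral of dL/dE = S / (1 + kB*S' + C*S'**2)."""
            dLdE = S / (1.0 + kB * stopping_power + C * stopping_power ** 2)
            increments = 0.5 * (dLdE[1:] + dLdE[:-1]) * np.diff(E)
            return np.concatenate(([0.0], np.cumsum(increments)))

        L = light_output(energy, dEdx)
        print("quenched light output at 10 MeV (arb. units):", np.interp(10.0, energy, L))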

  7. Music World: "European Music Month" in Basel. Leif Ove Andsnes in London. Competition prizes. Monte Pederson has passed away / Priit Kuusk

    Index Scriptorium Estoniae

    Kuusk, Priit, 1938-

    2001-01-01

    In November the Swiss city of Basel lives under the banner of the "European Music Month". The young Norwegian pianist Leif Ove Andsnes was invited to perform in London. Competition prizes from various competitions. The American singer Monte Pederson has died

  8. Monte-Carlo Simulation for PDC-Based Optical CDMA System

    OpenAIRE

    FAHIM AZIZ UMRANI; AHSAN AHMED URSANI; ABDUL WAHEED UMRANI

    2010-01-01

    This paper presents the Monte-Carlo simulation of Optical CDMA (Code Division Multiple Access) systems and analyses their performance in terms of the BER (Bit Error Rate). The spreading sequences chosen for CDMA are Perfect Difference Codes. Furthermore, this paper derives the expressions for the noise variances from first principles to calibrate the noise for both bipolar (electrical domain) and unipolar (optical domain) signalling required for the Monte-Carlo simulation. The simulated res...

  9. Fiber optic microprobes with rare-earth-based phosphor tips for proton beam characterization

    Science.gov (United States)

    Darafsheh, Arash; Kassaee, Alireza; Taleei, Reza; Dolney, Derek; Finlay, Jarod C.

    2016-03-01

    We investigated the feasibility of using fiber optic probes with rare-earth-based phosphor tips for proton beam radiation dosimetry. We designed and fabricated a fiber probe with submillimeter resolution based on TbF3 phosphors and evaluated its performance for measurement of proton beams, including profiles and range. The fiber optic probe, embedded in tissue-mimicking plastics, was irradiated with a clinical proton beam, and luminescence spectroscopy was performed with a CCD-coupled spectrograph to analyze the emission spectra of the fiber tip. Using a linear fitting algorithm, we extracted the contribution of the ionoluminescence signal to obtain the percentage depth dose in phantoms and compared it with measurements performed with a standard ion chamber. We observed a quenching effect in the spread-out Bragg peak region, manifested as an under-response of the signal due to the high linear energy transfer of the beam. However, the beam profile measurements were not affected by the quenching effect, indicating that the fiber probes can be used for high-resolution measurements of proton beam profiles.

  10. CT based treatment planning system of proton beam therapy for ocular melanoma

    Energy Technology Data Exchange (ETDEWEB)

    Nakano, Takashi E-mail: tnakano@med.gunma-u.ac.jp; Kanai, Tatsuaki; Furukawa, Shigeo; Shibayama, Kouichi; Sato, Sinichiro; Hiraoka, Takeshi; Morita, Shinroku; Tsujii, Hirohiko

    2003-09-01

    A computed tomography (CT) based treatment planning system for proton beam therapy was established specifically for ocular melanoma treatment. A technique using collimated proton beams with a maximum energy of 70 MeV is applied to the treatment of ocular melanoma. The vertical proton beam line has a range modulator for spreading out the beam, a multi-leaf collimator, an aperture, a light-beam localizer, a field light, and an X-ray verification system. The treatment planning program includes: an eye model, selection of the best direction of gaze, design of the aperture shape, determination of the proton range and range modulation necessary to encompass the target volume, and indication of the relative positions of the eyes and beam center together with creation of the beam aperture. Tumor contours are extracted from CT/MRI images of 1 mm slice thickness, assisted by information from fundus photography and ultrasonography. The CT image-based treatment planning system for ocular melanoma is useful for Japanese patients, who tend to have a thick choroid membrane, in terms of sparing dose to the skin and to normal organs in the eye. The characteristics of the system and its merits and demerits are reported.

  11. A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom

    Energy Technology Data Exchange (ETDEWEB)

    Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H., E-mail: mbellezzo@gmail.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

    As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered. The first is composed of a homogeneous water-based medium, the second of bone, the third of lung, and the fourth of a heterogeneous bone and vacuum geometry. Simulations were performed considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for the transport simulation. The first forces the photon to stop at every voxel boundary; the second is the Woodcock method, in which stopping at a boundary is considered only according to the material changes along the photon's travel line. Dose calculations using these methods are compared with the Penelope and MCNP5 codes for validation. Speed-up factors are compared using an NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)

  12. A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom

    International Nuclear Information System (INIS)

    As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered. The first is composed of a homogeneous water-based medium, the second of bone, the third of lung, and the fourth of a heterogeneous bone and vacuum geometry. Simulations were performed considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for the transport simulation. The first forces the photon to stop at every voxel boundary; the second is the Woodcock method, in which stopping at a boundary is considered only according to the material changes along the photon's travel line. Dose calculations using these methods are compared with the Penelope and MCNP5 codes for validation. Speed-up factors are compared using an NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)
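
    The Woodcock (delta-tracking) idea referenced in the two records above can be sketched in one dimension: free paths are sampled with the majorant attenuation coefficient, and a collision is accepted as real with probability mu(local)/mu_max, so the photon never needs to stop at voxel boundaries. The Python snippet below is a hedged toy example with invented attenuation coefficients, not the CUBMC implementation.

        # Toy 1D Woodcock tracking: sample with the majorant, accept real collisions by rejection.
        import numpy as np

        rng = np.random.default_rng(1)
        voxel_mu = np.array([0.05, 0.05, 0.30, 0.30, 0.05])   # cm^-1 (invented values)
        voxel_size = 1.0                                       # cm
        mu_max = voxel_mu.max()                                # majorant attenuation coefficient

        def first_real_collision(rng):
            """Depth (cm) of the first accepted collision, or None if the photon escapes."""
            x = 0.0
            while True:
                x += -np.log(rng.random()) / mu_max            # free path with the majorant
                voxel = int(x // voxel_size)
                if voxel >= len(voxel_mu):
                    return None                                # left the phantom
                if rng.random() < voxel_mu[voxel] / mu_max:    # accept as a real interaction
                    return x                                   # otherwise it was a virtual collision

        depths = [d for d in (first_real_collision(rng) for _ in range(100000)) if d is not None]
        print("mean depth of first real interaction (cm):", np.mean(depths))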

  13. Monte Carlo simulation of the response of a hadronic calorimeter to protons of momentum 3.5 to 200 GeV/c

    International Nuclear Information System (INIS)

    The response of scintillation counters in an ionization calorimeter to incident protons of momenta 3.5 to 200 GeV/c was simulated using the CALOR computer code system. Results of the simulation are compared with data taken at Brookhaven National Laboratory for 50-, 100-, and 278-GeV hadrons. Mechanisms which produce large pulse heights for low-energy incident particles are discussed. 14 references, 7 figures

  14. Analysis by Monte Carlo simulations of the sensitivity to single event upset of SRAM memories under spatial proton or terrestrial neutron environment

    International Nuclear Information System (INIS)

    Electronic systems in space and terrestrial environments are subjected to a flux of particles of natural origin, which can induce malfunctions. These particles can cause Single Event Upsets (SEU) in SRAM memories. Although non-destructive, SEUs can affect equipment operation in applications requiring high reliability (aircraft, satellites, launchers, medical devices, etc.). Thus, an evaluation of the sensitivity of the component technology is necessary to predict the reliability of a system. In the atmospheric environment, SEU sensitivity is mainly caused by the secondary ions resulting from nuclear reactions between neutrons and the atoms of the component. In the space environment, high-energy protons induce the same effects as atmospheric neutrons. In our work, a new SEU-rate prediction code (MC-DASIE) has been developed in order to quantify the sensitivity for a given environment and to explore the failure mechanisms according to technology. This code makes it possible to study various SRAM memory technologies (bulk and SOI) in neutron and proton environments between 1 MeV and 1 GeV. MC-DASIE was used with experimental data to study the effect of integration density on the sensitivity of the memories in the terrestrial environment, to compare neutron and proton irradiation, and to assess the influence of the modeling of the target component on the calculated SEU rate. (author)

  15. Application of the measurement-based Monte Carlo method in nasopharyngeal cancer patients for intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    This study aims to utilize a measurement-based Monte Carlo (MBMC) method to evaluate the accuracy of dose distributions calculated using the Eclipse radiotherapy treatment planning system (TPS) based on the anisotropic analytical algorithm. Dose distributions were calculated for nasopharyngeal carcinoma (NPC) patients treated with intensity modulated radiotherapy (IMRT). Ten NPC IMRT plans were evaluated by comparing their dose distributions with those obtained from the in-house MBMC programs for the same CT images and beam geometry. To reconstruct the fluence distribution of the IMRT field, an efficiency map was obtained by dividing the energy fluence of the intensity modulated field by that of the open field, both acquired from an aS1000 electronic portal imaging device. The integrated image of the non-gated mode was used to acquire the full dose distribution delivered during the IMRT treatment. This efficiency map redistributed the particle weightings of the open-field phase-space file for IMRT applications. Dose differences were observed at the boundary between the tumor and the air cavity. The mean difference between MBMC and TPS in terms of the planning target volume coverage was 0.6% (range: 0.0–2.3%). The mean difference for the conformity index was 0.01 (range: 0.0–0.01). In conclusion, the MBMC method serves as an independent IMRT dose verification tool in a clinical setting. - Highlights: ► The patient-based Monte Carlo method serves as a reference standard to verify IMRT doses. ► 3D Dose distributions for NPC patients have been verified by the Monte Carlo method. ► Doses predicted by the Monte Carlo method matched closely with those by the TPS. ► The Monte Carlo method predicted a higher mean dose to the middle ears than the TPS. ► Critical organ doses should be confirmed to avoid overdose to normal organs

  16. Foundation of an analytical proton beamlet model for inclusion in a general proton dose calculation system

    Science.gov (United States)

    Ulmer, W.; Schaffner, B.

    2011-03-01

    We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei, as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher-energy depth dose curves, is modeled by the inclusion of two different origins: (1) secondary reaction protons, contributing about 65% of the buildup (for monoenergetic protons); (2) Landau tails as well as Gaussian-type fluctuations for range straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked against them. Furthermore, there are a few parameters which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline, the most important being the initial energy spread. We find that the free parameters of the depth dose model can be predicted for any intermediate energy from a couple of measured curves.

  17. A Monte Carlo-based knee phantom for in vivo measurements of 241Am in bone

    International Nuclear Information System (INIS)

    Determination of internal contamination with 241Am can be done by direct counting of gamma emission using a Whole Body Counter. Due to the strong attenuation of the low-energy photons, it is advisable to perform the measurement on bones surrounded by only a thin layer of tissue. In vivo measurements are performed at CIEMAT using a system of four Low-Energy germanium (LE Ge) detectors calibrated with realistic anthropomorphic phantoms. As an alternative, Monte Carlo techniques are applied to voxel phantoms based on tomographic images to avoid the need for different physical phantoms for different radionuclides and organs. This technique is employed to study the suitability of americium measurements in the knee for evaluating the deposition in the whole skeleton. The spatial distribution of the photon fluence through a cylinder along the axis of the leg has been calculated to determine the best counting geometry. The detection efficiency is then calculated and the results are compared with those obtained using the physical phantom to validate the proposed method

  18. Monte Carlo based time-domain Hspice noise simulation for CSA-CRRC circuit

    International Nuclear Information System (INIS)

    We present a time-domain Monte Carlo based Hspice noise simulation for a charge-sensitive preamplifier-CRRC (CSA-CRRC) circuit with a random-amplitude piecewise noise waveform. The amplitude distribution of thermal noise is modeled with Gaussian random numbers. The amplitude distribution of 1/f noise is modeled with several low-pass filters driven by thermal noise generators. These time-domain noise sources are connected in parallel with the drain and source nodes of the CMOS input transistor of the CSA. The Hspice simulation of the CSA-CRRC circuit with these noise sources yielded ENC values at the output node of the shaper for thermal and 1/f noise of 47e- and 732e-, respectively. The corresponding ENC values calculated from the frequency-domain transfer function and its integration are 44e- and 882e-, respectively. The values from the Hspice simulation are similar to those from the frequency-domain calculation. A test chip was designed and fabricated for this study. The measured ENC value was 904 e-. This study shows that the time-domain noise modeling is valid and that transient Hspice noise simulation can be an effective tool for low-noise circuit design
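
    One common time-domain recipe for a 1/f-like source of the kind described above is to sum several white-noise generators, each passed through a first-order low-pass filter, with pole frequencies spread logarithmically. The Python sketch below illustrates that construction with invented numbers; it is not the authors' Hspice netlist.

        # Hedged sketch: approximate 1/f noise as a sum of low-pass filtered white-noise sources.
        import numpy as np

        rng = np.random.default_rng(0)
        fs = 1.0e6                          # sampling rate, Hz (assumed)
        n = 2 ** 16
        poles_hz = np.logspace(1, 5, 9)     # corner frequencies from 10 Hz to 100 kHz (assumed)

        def one_over_f_noise():
            total = np.zeros(n)
            for fc in poles_hz:
                a = np.exp(-2.0 * np.pi * fc / fs)                  # one-pole IIR coefficient
                white = rng.standard_normal(n)
                y = np.zeros(n)
                for i in range(1, n):
                    y[i] = a * y[i - 1] + (1.0 - a) * white[i]      # low-pass filtered white noise
                total += y / np.sqrt(fc)                            # equalize variances so the sum falls roughly as 1/f
            return total

        noise = one_over_f_noise()
        print("RMS of the synthetic 1/f-like waveform:", noise.std())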

  19. Monte Carlo simulation of novel breast imaging modalities based on coherent x-ray scattering

    International Nuclear Information System (INIS)

    We present upgraded versions of MC-GPU and penEasyImaging, two open-source Monte Carlo codes for the simulation of radiographic projections and CT, that have been extended and validated to account for the effect of molecular interference on coherent x-ray scatter. The codes were first validated by comparison between simulated and measured energy dispersive x-ray diffraction (EDXRD) spectra. A second validation was performed by evaluating the rejection factor of a focused anti-scatter grid. To exemplify the capabilities of the new codes, the modified MC-GPU code was used to examine the possibility of characterizing breast tissue composition and microcalcifications in a volume of interest inside a whole breast phantom using EDXRD, and to simulate a coherent scatter computed tomography (CSCT) system based on a first-generation CT acquisition geometry. It was confirmed that EDXRD and CSCT have the potential to characterize tissue composition inside a whole breast. The GPU-accelerated code was able to simulate, in just a few hours, a complete CSCT acquisition composed of 9758 independent pencil-beam projections. In summary, it has been shown that the presented software can be used for fast and accurate simulation of novel breast imaging modalities relying on scattering measurements and can therefore assist in the characterization and optimization of promising modalities currently under development. (paper)

  20. Adjoint-based deviational Monte Carlo methods for phonon transport calculations

    Science.gov (United States)

    Péraud, Jean-Philippe M.; Hadjiconstantinou, Nicolas G.

    2015-06-01

    In the field of linear transport, adjoint formulations exploit linearity to derive powerful reciprocity relations between a variety of quantities of interest. In this paper, we develop an adjoint formulation of the linearized Boltzmann transport equation for phonon transport. We use this formulation for accelerating deviational Monte Carlo simulations of complex, multiscale problems. Benefits include significant computational savings via direct variance reduction, or by enabling formulations which allow more efficient use of computational resources, such as formulations which provide high resolution in a particular phase-space dimension (e.g., spectral). We show that the proposed adjoint-based methods are particularly well suited to problems involving a wide range of length scales (e.g., nanometers to hundreds of microns) and lead to computational methods that can calculate quantities of interest with a cost that is independent of the system characteristic length scale, thus removing the traditional stiffness of kinetic descriptions. Applications to problems of current interest, such as simulation of transient thermoreflectance experiments or spectrally resolved calculation of the effective thermal conductivity of nanostructured materials, are presented and discussed in detail.

  1. IVF cycle cost estimation using Activity Based Costing and Monte Carlo simulation.

    Science.gov (United States)

    Cassettari, Lucia; Mosca, Marco; Mosca, Roberto; Rolando, Fabio; Costa, Mauro; Pisaturo, Valerio

    2016-03-01

    The authors present a new methodological approach, in a stochastic regime, to determine the actual costs of a healthcare process. The paper specifically shows the application of the methodology to the determination of the cost of an Assisted Reproductive Technology (ART) treatment in Italy. The motivation for this research is that a deterministic treatment is inadequate for an accurate estimate of the cost of this particular treatment, because the durations of the different activities involved are not fixed and are described by frequency distributions. Hence the need to determine, in addition to the mean value of the cost, the interval within which it varies with a known confidence level. Consequently, the cost obtained for each type of cycle investigated (in vitro fertilization and embryo transfer with or without intracytoplasmic sperm injection) shows tolerance intervals around the mean value sufficiently narrow to make the data statistically robust and therefore usable as a reference for benchmarking with other countries. From a methodological point of view the approach is rigorous: Activity Based Costing was used to determine the cost of the individual activities of the process, and Monte Carlo simulation, with control of the experimental error, was used to construct the tolerance intervals on the final result. PMID:24752546
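
    The combination of Activity Based Costing and Monte Carlo sampling can be condensed into a short sketch: draw each activity duration from its frequency distribution, price it with its ABC rate, and report the mean cycle cost with a confidence interval. All rates and distributions below are invented for illustration and are not the study's data.

        # Hedged sketch of an ABC + Monte Carlo cost estimate with invented numbers.
        import numpy as np

        rng = np.random.default_rng(42)
        n_runs = 20000

        # activity: (cost rate in EUR/minute, duration sampler in minutes) -- all assumed
        activities = {
            "ovarian stimulation monitoring": (3.0, lambda: rng.triangular(60, 90, 150)),
            "oocyte retrieval":               (8.0, lambda: rng.normal(45, 8)),
            "lab fertilization and culture":  (5.0, lambda: rng.triangular(120, 180, 260)),
            "embryo transfer":                (6.0, lambda: rng.normal(30, 5)),
        }

        costs = np.array([
            sum(rate * max(sample(), 0.0) for rate, sample in activities.values())
            for _ in range(n_runs)
        ])

        lo, hi = np.percentile(costs, [2.5, 97.5])
        print(f"mean cycle cost: {costs.mean():.0f} EUR, 95% interval: [{lo:.0f}, {hi:.0f}] EUR")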

  2. Monte Carlo efficiency calibration of a neutron generator-based total-body irradiator

    International Nuclear Information System (INIS)

    Many body composition measurement systems are calibrated against a single-sized reference phantom. Prompt-gamma neutron activation (PGNA) provides the only direct measure of total body nitrogen (TBN), an index of the body's lean tissue mass. In PGNA systems, body size influences neutron flux attenuation, induced gamma signal distribution, and counting efficiency. Thus, calibration based on a single-sized phantom could result in inaccurate TBN values. We used Monte Carlo simulations (MCNP-5; Los Alamos National Laboratory) in order to map a system's response over the range of body weights (65-160 kg) and body fat distributions (25-60%) found in obese humans. Calibration curves were constructed to derive body-size correction factors relative to a standard reference phantom, providing customized adjustments to account for differences in body habitus of obese adults. The use of MCNP-generated calibration curves should allow for a better estimate of the true changes in lean tissue mass that may occur during intervention programs focused only on weight loss. (author)

  3. Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT

    International Nuclear Information System (INIS)

    The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase static CT, density overwrite one phase static CT, average CT) of the same patient. Both 6 and 15 MV beam energies were used. The resulting treatment plans were compared by how well they fulfilled the prescribed optimization constraints both for the dose distributions calculated on the static patient models and for the accumulated dose, recalculated with MC on each of 8 CTs of a 4DCT set. In the phantom measurements, the MC dose engine showed discrepancies < 2%, while the fsPB dose engine showed discrepancies of up to 8% in the presence of lateral electron disequilibrium in the target. In the patient plan optimization, this translates into violations of organ at risk constraints and unpredictable target doses for the fsPB optimized plans. For the 4D MC recalculated dose distribution, MC optimized plans always underestimate the target doses, but the organ at risk doses were comparable. The results depend on the static patient model, and the smallest discrepancy was found for the MC optimized plan on the density overwrite one phase static CT model. It is feasible to employ the MC dose engine for optimization of lung IMSRT and the plans are superior to fsPB. Use of static patient models introduces a bias in the MC dose distribution compared to the 4D MC recalculated dose, but this bias is predictable and therefore MC based optimization on static patient models is considered safe

  4. Modeling parameterized geometry in GPU-based Monte Carlo particle transport simulation for radiotherapy.

    Science.gov (United States)

    Chi, Yujie; Tian, Zhen; Jia, Xun

    2016-08-01

    Monte Carlo (MC) particle transport simulation on a graphics-processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which limits their application scope. The purpose of this paper is to develop a module to model parametric geometry and integrate it into GPU-based MC simulations. In our module, each continuous region is defined by its bounding surfaces, which are parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested in two example problems: (1) low-energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry. The average dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data. When the data were stored in the GPU's shared memory, the highest computational speed was achieved. Incorporation of parameterized geometry yielded a computation time that was ~3 times that of the corresponding voxelized geometry. We also developed a strategy to use an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computational time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and in 0

  5. Modeling parameterized geometry in GPU-based Monte Carlo particle transport simulation for radiotherapy

    Science.gov (United States)

    Chi, Yujie; Tian, Zhen; Jia, Xun

    2016-08-01

    Monte Carlo (MC) particle transport simulation on a graphics-processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which limits their application scope. The purpose of this paper is to develop a module to model parametric geometry and integrate it into GPU-based MC simulations. In our module, each continuous region is defined by its bounding surfaces, which are parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested in two example problems: (1) low-energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry. The average dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data. When the data were stored in the GPU's shared memory, the highest computational speed was achieved. Incorporation of parameterized geometry yielded a computation time that was ~3 times that of the corresponding voxelized geometry. We also developed a strategy to use an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computational time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and in 0
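
    The navigation step for a region bounded by quadratic surfaces reduces to a ray-quadric intersection: substitute the ray r(t) = p + t*d into x^T A x + b·x + c = 0 and keep the smallest positive root. The Python sketch below illustrates this for a sphere written as a general quadric; it is an assumption-laden toy, not the packages' GPU kernels.

        # Hedged sketch of a ray-quadric distance computation (sphere as the example quadric).
        import numpy as np

        def distance_to_quadric(p, d, A, b, c):
            """Smallest positive t with (p+td)^T A (p+td) + b.(p+td) + c = 0, or inf if none."""
            qa = d @ A @ d
            qb = 2.0 * (p @ A @ d) + b @ d
            qc = p @ A @ p + b @ p + c
            if abs(qa) < 1e-12:                        # degenerate (linear) case
                t = -qc / qb if qb != 0.0 else np.inf
                return t if t > 0.0 else np.inf
            disc = qb * qb - 4.0 * qa * qc
            if disc < 0.0:
                return np.inf
            roots = [(-qb - np.sqrt(disc)) / (2.0 * qa), (-qb + np.sqrt(disc)) / (2.0 * qa)]
            positive = [t for t in roots if t > 1e-9]
            return min(positive) if positive else np.inf

        # Unit sphere centered at the origin: x^2 + y^2 + z^2 - 1 = 0
        A, b, c = np.eye(3), np.zeros(3), -1.0
        p = np.array([0.0, 0.0, -3.0])                 # particle position
        d = np.array([0.0, 0.0, 1.0])                  # unit direction
        print("distance to bounding surface:", distance_to_quadric(p, d, A, b, c))   # expect 2.0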

  6. Feasibility of a laser-driven, ground-based proton source to simulate the space environment for semiconductors

    International Nuclear Information System (INIS)

    Laser-driven proton beams for ground-based study of space radiation effects on semiconductor devices are considered. Laser irradiation focused onto thin foil targets can generate proton spectra at intensity and fluence levels that are adequate to make such laser-driven sources feasible for this space application. Technical areas for further development are also briefly discussed. (author)

  7. Validation of a personalized dosimetric evaluation tool (Oedipe) for targeted radiotherapy based on the Monte Carlo MCNPX code

    International Nuclear Information System (INIS)

    Dosimetric studies are necessary for all patients treated with targeted radiotherapy. In order to attain the precision required, we have developed Oedipe, a dosimetric tool based on the MCNPX Monte Carlo code. The anatomy of each patient is considered in the form of a voxel-based geometry created using computed tomography (CT) images or magnetic resonance imaging (MRI). Oedipe enables dosimetry studies to be carried out at the voxel scale. Validation of the results obtained by comparison with existing methods is complex because there are multiple sources of variation: calculation methods (different Monte Carlo codes, point kernel), patient representations (model or specific) and geometry definitions (mathematical or voxel-based). In this paper, we validate Oedipe by taking each of these parameters into account independently. Monte Carlo methodology requires long calculation times, particularly in the case of voxel-based geometries, and this is one of the limits of personalized dosimetric methods. However, our results show that the use of voxel-based geometry as opposed to a mathematically defined geometry decreases the calculation time two-fold, due to an optimization of the MCNPX2.5e code. It is therefore possible to envisage the use of Oedipe for personalized dosimetry in the clinical context of targeted radiotherapy

  8. Mechanisms and energetics for N-glycosidic bond cleavage of protonated adenine nucleosides: N3 protonation induces base rotation and enhances N-glycosidic bond stability.

    Science.gov (United States)

    Wu, R R; Rodgers, M T

    2016-06-21

    Our previous gas-phase infrared multiple photon dissociation action spectroscopy study of protonated 2'-deoxyadenosine and adenosine, [dAdo+H](+) and [Ado+H](+), found that both N3 and N1 protonated conformers are populated with the N3 protonated ground-state conformers predominant in the experiments. Therefore, N-glycosidic bond dissociation mechanisms of N3 and N1 protonated [dAdo+H](+) and [Ado+H](+) and the associated quantitative thermochemical values are investigated here using both experimental and theoretical approaches. Threshold collision-induced dissociation (TCID) of [dAdo+H](+) and [Ado+H](+) with Xe is studied using guided ion beam tandem mass spectrometry techniques. For both systems, N-glycosidic bond cleavage reactions are observed as the major dissociation pathways resulting in production of protonated adenine or elimination of neutral adenine. Electronic structure calculations are performed at the B3LYP/6-311+G(d,p) level of theory to probe the potential energy surfaces (PESs) for N-glycosidic bond cleavage of [dAdo+H](+) and [Ado+H](+). Relative energetics of the reactants, transition states, intermediates and products along the PESs for N-glycosidic bond cleavage are determined at the B3LYP/6-311+G(2d,2p), B3LYP-GD3BJ/6-311+G(2d,2p), and MP2(full)/6-311+G(2d,2p) levels of theory. The predicted N-glycosidic bond dissociation mechanisms for the N3 and N1 protonated species differ. Base rotation of the adenine residue enables formation of a strong N3H(+)O5' hydrogen-bonding interaction that stabilizes the N3 protonated species and its glycosidic bond. Comparison between experiment and theory indicates that the N3 protonated species determine the threshold energies, as excellent agreement between the measured and B3LYP computed activation energies (AEs) and reaction enthalpies (ΔHrxns) for N-glycosidic bond cleavage of the N3 protonated species is found. PMID:27240654

  9. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    International Nuclear Information System (INIS)

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions of the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant-probability contours associated with comparison-error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
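
    The Monte Carlo construction of the covariance matrix described above can be sketched directly: sample the random and systematic uncertainties of both experiment and model, form the comparison error for the quantities of interest, and take the sample covariance. The uncertainty magnitudes in the Python snippet below are invented placeholders.

        # Hedged sketch: Monte Carlo estimate of the comparison-error covariance matrix.
        import numpy as np

        rng = np.random.default_rng(7)
        n_samples = 50000

        nominal_data = np.array([10.0, 4.0])      # measured quantities of interest (invented)
        nominal_model = np.array([9.6, 4.3])      # model predictions of the same quantities (invented)
        sigma_random = np.array([0.2, 0.1])       # uncorrelated random uncertainties (invented)
        sigma_systematic = 0.03                   # 3% fully correlated scale uncertainty (invented)

        errors = np.empty((n_samples, 2))
        for i in range(n_samples):
            scale = 1.0 + sigma_systematic * rng.standard_normal()        # shared systematic draw
            data = nominal_data * scale + sigma_random * rng.standard_normal(2)
            errors[i] = data - nominal_model

        cov = np.cov(errors, rowvar=False)
        # A 95% constant-probability contour of a 2D normal has squared Mahalanobis radius 5.991.
        print("mean comparison error:", errors.mean(axis=0))
        print("covariance matrix:\n", cov)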

  10. A study of potential numerical pitfalls in GPU-based Monte Carlo dose calculation

    Science.gov (United States)

    Magnoux, Vincent; Ozell, Benoît; Bonenfant, Éric; Després, Philippe

    2015-07-01

    The purpose of this study was to evaluate the impact of numerical errors caused by the floating point representation of real numbers in a GPU-based Monte Carlo code used for dose calculation in radiation oncology, and to identify situations where this type of error arises. The program used as a benchmark was bGPUMCD. Three tests were performed on the code, which was divided into three functional components: energy accumulation, particle tracking and physical interactions. First, the impact of single-precision calculations was assessed for each functional component. Second, a GPU-specific compilation option that reduces execution time as well as precision was examined. Third, a specific function used for tracking and potentially more sensitive to precision errors was tested by comparing it to a very high-precision implementation. Numerical errors were found in two components of the program. Because of the energy accumulation process, a few voxels surrounding a radiation source end up with a lower computed dose than they should. The tracking system contained a series of operations that abnormally amplify rounding errors in some situations. This resulted in some rare instances (less than 0.1%) of computed distances that are exceedingly far from what they should have been. Most errors detected had no significant effects on the result of a simulation due to its random nature, either because they cancel each other out or because they only affect a small fraction of particles. The results of this work can be extended to other types of GPU-based programs and be used as guidelines to avoid numerical errors on the GPU computing platform.
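
    The energy-accumulation pitfall identified above is easy to reproduce outside the GPU: adding many small single-precision deposits to an already-large voxel dose silently loses contributions, whereas compensated (Kahan) summation or a double-precision accumulator does not. The Python illustration below uses arbitrary numbers and is independent of bGPUMCD.

        # Small illustration of lost updates with a float32 accumulator (values are arbitrary).
        import numpy as np

        deposits = np.full(200_000, 1e-4, dtype=np.float32)   # many tiny energy deposits
        start = np.float32(1e4)                               # voxel already holds a large dose

        naive = start
        for d in deposits:
            naive = np.float32(naive + d)                     # each addition rounds back to 1e4

        kahan, comp = np.float32(start), np.float32(0.0)
        for d in deposits:
            y = np.float32(d - comp)
            t = np.float32(kahan + y)
            comp = np.float32((t - kahan) - y)
            kahan = t

        exact = float(start) + float(deposits.sum(dtype=np.float64))
        print(f"naive float32: {naive:.3f}  Kahan float32: {kahan:.3f}  float64 reference: {exact:.3f}")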

  11. A voxel-based mouse for internal dose calculations using Monte Carlo simulations (MCNP)

    International Nuclear Information System (INIS)

    Murine models are useful for targeted radiotherapy pre-clinical experiments. These models can help to assess the potential interest of new radiopharmaceuticals. In this study, we developed a voxel-based mouse for dosimetric estimates. A female nude mouse (30 g) was frozen and cut into slices. High-resolution digital photographs were taken directly on the frozen block after each section. Images were segmented manually. Monoenergetic photon or electron sources were simulated using the MCNP4c2 Monte Carlo code for each source organ, in order to give tables of S-factors (in Gy Bq^-1 s^-1) for all target organs. Results obtained from monoenergetic particles were then used to generate S-factors for several radionuclides of potential interest in targeted radiotherapy. Thirteen source and 25 target regions were considered in this study. For each source region, 16 photon and 16 electron energies were simulated. Absorbed fractions, specific absorbed fractions and S-factors were calculated for 16 radionuclides of interest for targeted radiotherapy. The results obtained generally agree well with data published previously. For electron energies ranging from 0.1 to 2.5 MeV, the self-absorbed fraction varies from 0.98 to 0.376 for the liver, and from 0.89 to 0.04 for the thyroid. Electrons cannot be considered as 'non-penetrating' radiation for energies above 0.5 MeV for mouse organs. This observation can be generalized to radionuclides: for example, the beta self-absorbed fraction for the thyroid was 0.616 for I-131; absorbed fractions for Y-90 for left kidney-to-left kidney and for left kidney-to-spleen were 0.486 and 0.058, respectively. Our voxel-based mouse allowed us to generate a dosimetric database for use in preclinical targeted radiotherapy experiments. (author)

  12. SU-E-T-554: Monte Carlo Calculation of Source Terms and Attenuation Lengths for Neutrons Produced by 50–200 MeV Protons On Brass

    Energy Technology Data Exchange (ETDEWEB)

    Ramos-Mendez, J; Faddegon, B [University of California San Francisco, San Francisco, CA (United States); Paganetti, H [Massachusetts General Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: We used TOPAS (TOPAS wraps and extends Geant4 for medical physicists) to compare Geant4 physics models with published data for neutron shielding calculations. Subsequently, we calculated the source terms and attenuation lengths (shielding data) of the total ambient dose equivalent (TADE) in concrete for neutrons produced by protons in brass. Methods: Stage 1: The Bertini and Binary nuclear models available in Geant4 were compared with published attenuation at depth of the TADE in concrete and iron. Stage 2: Shielding data for the TADE in concrete were calculated for 50-200 MeV proton beams on brass. Stage 3: The shielding data from Stage 2 were extrapolated to 235 MeV proton beams. These data were used in a point-line-source analytical model to calculate the ambient dose per unit therapeutic dose at two locations inside one treatment room at the Francis H Burr Proton Therapy Center. Finally, we compared these results with experimental data and full TOPAS simulations. Results: At larger angles (~130°) the TADE in concrete calculated with the Bertini model was about 9 times larger than that calculated with the Binary model. The attenuation length in concrete calculated with the Binary model agreed with published data within 7%±0.4% (statistical uncertainty) for the deepest regions and 5%±0.1% for shallower regions. For iron the agreement was within 3%±0.1%. The ambient dose per therapeutic dose calculated with the Binary model, relative to the experimental data, was a ratio of 0.93±0.16 and 1.23±0.24 for the two locations. The analytical model overestimated the dose by four orders of magnitude; these differences are attributed to the complexity of the geometry. Conclusion: The Binary and Bertini models gave comparable results, with the Binary model giving the best agreement with published data at large angles. The shielding data we calculated using the Binary model are useful for fast shielding calculations with other analytical models. This work was supported by
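
    A point-source, line-of-sight shielding estimate of the kind referred to above is typically of the form H = H0(theta) * exp(-d/lambda(theta)) / r^2, with the source term H0 and attenuation length lambda taken from shielding data such as those computed here. The Python sketch below uses invented values, not the published shielding data.

        # Hedged sketch of a point-source, line-of-sight shielding estimate with invented numbers.
        import numpy as np

        # angle (deg): (source term H0 in Sv*m^2 per proton, attenuation length in g/cm^2) -- invented
        shielding_table = {
            0.0:   (2.0e-15, 120.0),
            45.0:  (8.0e-16, 100.0),
            90.0:  (3.0e-16, 85.0),
            130.0: (1.0e-16, 75.0),
        }
        concrete_density = 2.35   # g/cm^3

        def ambient_dose(theta_deg, slant_thickness_cm, distance_m):
            """Ambient dose equivalent per incident proton behind the shield (Sv/proton)."""
            angles = np.array(sorted(shielding_table))
            H0 = np.interp(theta_deg, angles, [shielding_table[a][0] for a in angles])
            lam = np.interp(theta_deg, angles, [shielding_table[a][1] for a in angles])
            depth_gcm2 = slant_thickness_cm * concrete_density
            return H0 * np.exp(-depth_gcm2 / lam) / distance_m ** 2

        print("dose per proton (90 deg, 200 cm concrete, 5 m):", ambient_dose(90.0, 200.0, 5.0))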

  13. A new Monte Carlo-based treatment plan optimization approach for intensity modulated radiation therapy

    Science.gov (United States)

    Li, Yongbao; Tian, Zhen; Shi, Feng; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2015-04-01

    Intensity-modulated radiation treatment (IMRT) plan optimization needs beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computational speed. However, inaccurate beamlet dose distributions may mislead the optimization process and hinder the resulting plan quality. To solve this problem, the Monte Carlo (MC) simulation method has been used to compute all beamlet doses prior to the optimization step. The conventional approach samples the same number of particles from each beamlet, yet this is not the optimal use of MC in this problem. In fact, there are beamlets that have very small intensities after solving the plan optimization problem. For those beamlets, it may be possible to use fewer particles in dose calculations to increase efficiency. Based on this idea, we have developed a new MC-based IMRT plan optimization framework that iteratively performs MC dose calculation and plan optimization. At each dose calculation step, the particle numbers for beamlets are adjusted based on the beamlet intensities obtained by solving the plan optimization problem in the previous iteration. We modified a GPU-based MC dose engine to allow simultaneous computation of a large number of beamlet doses. To test the accuracy of our modified dose engine, we compared the dose from a broad beam and the summed beamlet doses in this beam in an inhomogeneous phantom. Agreement within 1% for the maximum difference and 0.55% for the average difference was observed. We then validated the proposed MC-based optimization scheme in one lung IMRT case. It was found that the conventional scheme required 10^6 particles from each beamlet to achieve an optimization result within 3% difference in the fluence map and 1% difference in dose from the ground truth. In contrast, the proposed scheme achieved the same level of accuracy with on average 1.2 × 10^5 particles per beamlet. Correspondingly, the computation time
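
    The particle re-allocation step at the heart of this scheme can be sketched briefly: after each optimization pass, Monte Carlo histories for the next dose calculation are distributed among beamlets in proportion to their current intensities, with a small floor so that no beamlet is left without statistics. The Python snippet below uses a random stand-in fluence map and invented budgets.

        # Hedged sketch of intensity-proportional history allocation across beamlets.
        import numpy as np

        rng = np.random.default_rng(3)
        n_beamlets = 5317
        total_histories = 120_000 * n_beamlets           # average budget comparable to the figure above
        floor = 1000                                     # minimum histories per beamlet (assumed)

        intensities = rng.exponential(1.0, n_beamlets)   # stand-in for the optimized fluence map
        intensities[rng.random(n_beamlets) < 0.4] = 0.0  # many beamlets end up with zero weight

        def allocate(intensities, total, floor):
            """Distribute 'total' histories proportionally to intensity, with a per-beamlet floor."""
            weights = np.maximum(intensities, 0.0)
            if weights.sum() == 0.0:
                return np.full(len(weights), total // len(weights), dtype=int)
            spare = total - floor * len(weights)
            return (np.floor(weights / weights.sum() * spare) + floor).astype(int)

        histories = allocate(intensities, total_histories, floor)
        print("min/median/max histories per beamlet:",
              histories.min(), int(np.median(histories)), histories.max())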

  14. Development of a practical Monte Carlo based fuel management system for the Penn State University Breazeale Research Reactor (PSBR)

    International Nuclear Information System (INIS)

    A practical fuel management system for the Pennsylvania State University Breazeale Research Reactor (PSBR), based on an advanced Monte Carlo methodology, was developed in this research from the existing fuel management tool. Several modeling improvements were implemented over the old system. The improved fuel management system can now utilize burnup-dependent cross section libraries generated specifically for PSBR fuel, and it is also able to update the cross sections of these libraries automatically through the Monte Carlo calculation. Consideration was given to balancing the computation time and the accuracy of the cross section update. Thus, only a limited number of isotopes of certain types, which are considered 'important', are calculated and updated by the scheme. Moreover, the depletion algorithm of the existing fuel management tool was upgraded from a predictor-only to a predictor-corrector depletion scheme to account more accurately for burnup spectrum changes during the burnup step. An intermediate verification of the fuel management system was performed to assess the correctness of the newly implemented schemes against HELIOS. It was found that the agreement between both codes is good when the same energy released per fission (Q values) is used. Furthermore, to be able to model the reactor at various temperatures, the fuel management tool can automatically utilize continuous cross sections generated at different temperatures. Other useful capabilities were also added to the fuel management tool to make it easy to use and practical. As part of the development, a hybrid nodal diffusion/Monte Carlo calculation was devised to speed up the Monte Carlo calculation by providing a better-converged initial source distribution from the nodal diffusion calculation. Finally, the fuel management system was validated against measured data using several actual PSBR core loadings. The agreement of the predicted core

  15. Main Parameters of LCxFCC Based Electron-Proton Colliders

    CERN Document Server

    Acar, Y C; Oner, B B; Sultansoy, S

    2016-01-01

    Multi-TeV center-of-mass energy ep colliders based on the Future Circular Collider (FCC) and linear colliders (LC) are proposed and corresponding luminosity values are estimated. Parameters of upgraded versions of the FCC are determined to optimize the luminosity of electron-proton collisions, keeping beam-beam effects in mind. It is shown that L_ep ~ 10^32 cm^-2 s^-1 can be achieved with a moderate upgrade of the FCC parameters.

  16. Electrocatalytic Hydrogen Production by an Aluminum(III) Complex: Ligand-Based Proton and Electron Transfer.

    Science.gov (United States)

    Thompson, Emily J; Berben, Louise A

    2015-09-28

    Environmentally sustainable hydrogen-evolving electrocatalysts are key in a renewable fuel economy, and ligand-based proton and electron transfer could circumvent the need for precious metal ions in electrocatalytic H2 production. Herein, we show that electrocatalytic generation of H2 by a redox-active ligand complex of Al(3+) occurs at -1.16 V vs. SCE (500 mV overpotential). PMID:26249108

  17. High temperature proton exchange membranes based on polybenzimidazoles for fuel cells

    OpenAIRE

    Li, Qingfeng; Jensen, Jens Oluf; Savinell, Robert F; Bjerrum, Niels J.

    2009-01-01

    To achieve high temperature operation of proton exchange membrane fuel cells (PEMFC), preferably under ambient pressure, acid–base polymer membranes represent an effective approach. The phosphoric acid-doped polybenzimidazole membrane seems so far the most successful system in the field. It has in recent years motivated extensive research activities with great progress. This treatise is devoted to updating the development, covering polymer synthesis, membrane casting, physicochemical characte...

  18. Monte-Carlo simulation of an ultra small-angle neutron scattering instrument based on Soller slits

    Energy Technology Data Exchange (ETDEWEB)

    Rieker, T. [Univ. of New Mexico, Albuquerque, NM (United States); Hubbard, P. [Sandia National Labs., Albuquerque, NM (United States)

    1997-09-01

    Monte Carlo simulations are used to investigate an ultra small-angle neutron scattering instrument for use at a pulsed source based on a Soller slit collimator and analyzer. The simulations show that for a q_min of ~1e-4 Å^-1 (15 Å neutrons) a few tenths of a percent of the incident flux is transmitted through both collimators at q=0.

  19. Monte-Carlo Simulation for PDC-Based Optical CDMA System

    Directory of Open Access Journals (Sweden)

    FAHIM AZIZ UMRANI

    2010-10-01

    Full Text Available This paper presents the Monte-Carlo simulation of Optical CDMA (Code Division Multiple Access) systems and analyses their performance in terms of the BER (Bit Error Rate). The spreading sequences chosen for CDMA are Perfect Difference Codes. Furthermore, this paper derives the expressions for the noise variances from first principles to calibrate the noise for both bipolar (electrical domain) and unipolar (optical domain) signalling required for the Monte-Carlo simulation. The simulated results conform to the theory and show that receiver gain mismatch and splitter loss at the transceiver degrade the system performance.

  20. Developments of new proton conducting membranes based on different polybenzimidazole structures for fuel cells applications

    Energy Technology Data Exchange (ETDEWEB)

    Carollo, A.; Quartarone, E.; Tomasi, C.; Mustarelli, P.; Belotti, F.; Magistris, A. [Department of Physical Chemistry, IENI-CNR and INSTM, University of Pavia Via Taramelli 16, 27100 Pavia (Italy); Maestroni, F.; Parachini, M.; Garlaschelli, L.; Righetti, P.P. [Department of Organic Chemistry, University of Pavia Via Taramelli 12, 27100 Pavia (Italy)

    2006-09-29

    The current goal of PEMFC research points towards the optimization of devices working at temperatures above 100 °C and at low humidity levels. Acid-doped polybenzimidazoles are particularly appealing because of their high proton conductivity without humidification and promising fuel cell performance. In this paper we present the development of new proton conducting membranes based on different polybenzimidazole (PBI) structures. Phosphoric acid-doped membranes, synthesized from benzimidazole-based monomers with increased basicity and molecular weight, are presented and discussed. Tests of methanol crossover and diffusion were performed in order to check the membrane suitability for DMFCs. Both the acid doping level and the proton conductivity increase remarkably with the membrane molecular weight and basicity, which strictly depend on the amount of NH groups as well as on their position in the polymer backbone. In particular, a conductivity value exceeding 0.1 S cm^-1 at RH = 40% and 80 °C was reached in the case of the pyridine-based PBI. (author)

  1. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    Science.gov (United States)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2012-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than the actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than that of AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.
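
    A minimal sketch of the sampling procedure described above, assuming a generic two-parameter Weibull life distribution (the shape and scale below are placeholders, not the fitted AL6061 parameters): lives are drawn repeatedly for a given test-population size and the scatter of the resulting L10 estimates is examined.

```python
# Minimal sketch of the Monte Carlo procedure described above: draw fatigue
# lives from an assumed two-parameter Weibull distribution and examine how the
# estimated L10 life scatters with test-population size. The shape/scale
# values are illustrative, not the fitted AL6061 parameters from the study.
import numpy as np

rng = np.random.default_rng(42)
shape, scale = 2.0, 1.0e6      # assumed Weibull slope and characteristic life (cycles)

def sample_L10(n_specimens, n_trials=2000):
    lives = scale * rng.weibull(shape, size=(n_trials, n_specimens))
    return np.percentile(lives, 10, axis=1)   # L10 estimate from each simulated test series

for n in (10, 20, 35, 100):
    l10 = sample_L10(n)
    spread = (l10.max() - l10.min()) / np.median(l10)
    print(f"n={n:3d}  median L10={np.median(l10):.3e}  relative spread={spread:.2f}")
```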

  2. TH-C-17A-08: Monte Carlo Based Design of Efficient Scintillating Fiber Dosimeters

    International Nuclear Information System (INIS)

    Purpose: To accurately predict Cherenkov radiation generation in scintillating fiber dosimeters. Quantifying Cherenkov radiation provides a method for optimizing fiber dimensions, orientation, optical filters, and photodiode spectral sensitivity to achieve efficient real time imaging dosimeter designs. Methods: We develop in-house Monte Carlo simulation software to model polymer scintillation fibers' fluorescence and Cherenkov emission in megavoltage clinical beams. The model computes emissions using generation probabilities, wavelength sampling, fiber photon capture, and fiber transport efficiency and incorporates the fiber's index of refraction, optical attenuation in the Cherenkov and visible spectrum and fiber dimensions. Detector component selection based on parameters such as silicon photomultiplier efficiency and optical coupling filters separates Cherenkov radiation from the dose-proportional scintillating emissions. The computation uses spectral and geometrical separation of Cherenkov radiation, however other filtering techniques can expand the model. Results: We compute Cherenkov generation per electron and fiber capture and transmission of those photons toward the detector with incident electron beam angle dependence. The model accounts for beam obliquity and nonperpendicular electron fiber impingement, which increases Cherenkov emission and trapping. The rotational angle around square fibers shows trapping efficiency variation from the normally incident minimum to a maximum at 45 degrees rotation. For rotation in the plane formed by the fiber axis and its surface normal, trapping efficiency increases with angle from the normal. The Cherenkov spectrum follows the theoretical curve from 300nm to 800nm, the wavelength range of interest defined by silicon photomultiplier and photodiode spectral efficiency. Conclusion: We are able to compute Cherenkov generation in realistic real time scintillating fiber dosimeter geometries. Design parameters

  3. Monte Carlo-based searching as a tool to study carbohydrate structure.

    Science.gov (United States)

    Dowd, Michael K; Kiely, Donald E; Zhang, Jinsong

    2011-07-01

    A torsion angle-based Monte Carlo searching routine was developed and applied to several carbohydrate modeling problems. The routine was developed as a Unix shell script that calls several programs, which allows it to be interfaced with multiple potential functions and various utilities for evaluating conformers. In its current form, the program operates with several versions of the MM3 and MM4 molecular mechanics programs and has a module to calculate hydrogen-hydrogen coupling constants. The routine was used to study the low-energy exo-cyclic substituents of β-D-glucopyranose and the conformers of D-glucaramide, both of which had been previously studied with MM3 by full conformational searches. For these molecules, the program found all previously reported low-energy structures. The routine was also used to find favorable conformers of 2,3,4,5-tetra-O-acetyl-N,N'-dimethyl-D-glucaramide and D-glucitol, the latter of which is believed to have many low-energy forms. Finally, the technique was used to study the inter-ring conformations of β-gentiobiose, a β-(1→6)-linked disaccharide of D-glucopyranose. The program easily found conformers in the 10 previously identified low-energy regions for this disaccharide. In 6 of the 10 local regions, the same previously identified low-energy structures were found. In the remaining four regions, the search identified structures with slightly lower energies than those previously reported. The approach should be useful for extending modeling studies on acyclic monosaccharides and possibly oligosaccharides. PMID:21536262
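
    The sketch below illustrates the general torsion-angle Monte Carlo search idea (random torsion perturbation plus Metropolis acceptance) with a toy energy function standing in for the MM3/MM4 call; the routine in the paper is a Unix shell script around external programs, which is not reproduced here.

```python
# Toy torsion-angle Monte Carlo search: perturb a randomly chosen torsion,
# evaluate a (purely illustrative) energy function, accept by the Metropolis
# criterion, and keep track of distinct low-energy regions. In the real
# routine the energy call would be an MM3/MM4 run.
import math, random

N_TORSIONS = 6

def energy(torsions):
    # placeholder potential with several minima, standing in for MM3/MM4
    return sum(1.0 - math.cos(math.radians(3.0 * t)) for t in torsions)

def mc_search(n_steps=20000, max_move=30.0, kT=0.6, seed=7):
    random.seed(seed)
    current = [random.uniform(-180, 180) for _ in range(N_TORSIONS)]
    e_cur = energy(current)
    seen = {}                                   # coarse torsion bin -> lowest energy found
    for _ in range(n_steps):
        trial = current[:]
        i = random.randrange(N_TORSIONS)
        trial[i] = ((trial[i] + random.uniform(-max_move, max_move) + 180) % 360) - 180
        e_trial = energy(trial)
        if e_trial <= e_cur or random.random() < math.exp(-(e_trial - e_cur) / kT):
            current, e_cur = trial, e_trial     # Metropolis acceptance
        key = tuple(round(t / 30.0) * 30 for t in current)
        if key not in seen or e_cur < seen[key]:
            seen[key] = e_cur
    return sorted((e, k) for k, e in seen.items())[:5]

if __name__ == "__main__":
    for e, conf in mc_search():
        print(f"E={e:6.3f}  torsion bins={conf}")
```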

  4. The Enhancement on Proton Conductivity of Stable Polyoxometalate-Based Coordination Polymers by the Synergistic Effect of MultiProton Units.

    Science.gov (United States)

    Li, Jing; Cao, Xue-Li; Wang, Yuan-Yuan; Zhang, Shu-Ran; Du, Dong-Ying; Qin, Jun-Sheng; Li, Shun-Li; Su, Zhong-Min; Lan, Ya-Qian

    2016-06-27

    Two novel polyoxometalate (POM)-based coordination polymers, namely, [Co(bpz)(Hbpz)][Co(SO4)0.5(H2O)2(bpz)]4[PMo(VI)8Mo(V)4V(IV)4O42]⋅13H2O (NENU-530) and [Ni2(bpz)(Hbpz)3(H2O)2][PMo(VI)8Mo(V)4V(IV)4O44]⋅8H2O (NENU-531) (H2bpz=3,3',5,5'-tetramethyl-4,4'-bipyrazole), were isolated by hydrothermal methods; they represent 3D networks constructed from POM units, the protonated ligand and sulfate groups. In contrast with most POM-based coordination polymers, these two compounds exhibit exceptionally good chemical and thermal stability. More importantly, NENU-530 shows a high proton conductivity of 1.5×10(-3)  S cm(-1) at 75 °C and 98 % RH, which is one order of magnitude higher than that of NENU-531. Furthermore, structural analysis and functional measurements successfully demonstrated that the introduction of the sulfate group is favorable for proton conductivity. Herein, the syntheses, crystal structures, proton conductivity, and the relationship between structure and property are presented. PMID:27243145

  5. Monte Carlo-based multiphysics coupling analysis of x-ray pulsar telescope

    Science.gov (United States)

    Li, Liansheng; Deng, Loulou; Mei, Zhiwu; Zuo, Fuchang; Zhou, Hao

    2015-10-01

    X-ray pulsar telescope (XPT) is a complex optical payload, which involves optical, mechanical, electrical and thermal disciplines. The multiphysics coupling analysis (MCA) plays an important role in improving the in-orbit performance. However, conventional MCA methods encounter two serious problems in dealing with the XPT. One is that the energy and reflectivity information of the X-rays cannot be taken into consideration, which misrepresents the essence of the XPT. The other is that the coupling data cannot be transferred automatically among the different disciplines, leading to computational inefficiency and high design cost. Therefore, a new MCA method for the XPT is proposed based on the Monte Carlo method and total reflection theory. The main idea, procedures and operational steps of the proposed method are addressed in detail. Firstly, it takes both the energy and reflectivity information of the X-rays into consideration simultaneously, and formulates the thermal-structural coupling equation and the multiphysics coupling analysis model based on the finite element method. The thermal-structural coupling analysis under different working conditions is then implemented. Secondly, the mirror deformations are obtained using a construction geometry function, and a polynomial function is adopted to fit the deformed mirror and to evaluate the fitting error. Thirdly, the focusing performance of the XPT is evaluated by the RMS of the dispersion spot. Finally, a Wolter-I XPT is taken as an example to verify the proposed MCA method. The simulation results show that the thermal-structural coupling deformation is larger than the others, and the variation law of the deformation effect on the focusing performance has been obtained. The focusing performance under thermal-structural, thermal and structural deformations is degraded by 30.01%, 14.35% and 7.85%, respectively, with RMS dispersion spots of 2.9143 mm, 2.2038 mm and 2.1311 mm. As a result, the validity of the proposed method is verified through

  6. Monte Carlo-based QA for IMRT of head and neck cancers

    Science.gov (United States)

    Tang, F.; Sham, J.; Ma, C.-M.; Li, J.-S.

    2007-06-01

    It is well-known that the presence of a large air cavity in a dense medium (or patient) introduces significant electronic disequilibrium when irradiated with a megavoltage X-ray field. This condition may be worsened by the possible use of tiny beamlets in intensity-modulated radiation therapy (IMRT). Commercial treatment planning systems (TPSs), in particular those based on the pencil-beam method, do not provide accurate dose computation for the lungs and other cavity-laden body sites such as the head and neck. In this paper we present the use of the Monte Carlo (MC) technique for dose re-calculation of IMRT of head and neck cancers. In our clinic, a turn-key software system is set up for MC calculation and comparison with TPS-calculated treatment plans as part of the quality assurance (QA) programme for IMRT delivery. A set of 10 off-the-shelf PCs is employed as the MC calculation engine with treatment plan parameters imported from the TPS via a graphical user interface (GUI) which also provides a platform for launching remote MC simulation and subsequent dose comparison with the TPS. The TPS-segmented intensity maps are used as input for the simulation, hence skipping the time-consuming simulation of the multi-leaf collimator (MLC). The primary objective of this approach is to assess the accuracy of the TPS calculations in the presence of air cavities in the head and neck, whereas the accuracy of leaf segmentation is verified by fluence measurement using a fluoroscopic camera-based imaging device. This measurement can also validate the correct transfer of intensity maps to the record-and-verify system. Comparisons between TPS and MC calculations of 6 MV IMRT for typical head and neck treatments reveal regional consistency in dose distribution except at and around the sinuses, where our pencil-beam-based TPS sometimes over-predicts the dose by up to 10%, depending on the size of the cavities. In addition, dose re-buildup of up to 4% is observed at the posterior nasopharyngeal

  7. Monte Carlo-based QA for IMRT of head and neck cancers

    International Nuclear Information System (INIS)

    It is well-known that the presence of a large air cavity in a dense medium (or patient) introduces significant electronic disequilibrium when irradiated with a megavoltage X-ray field. This condition may be worsened by the possible use of tiny beamlets in intensity-modulated radiation therapy (IMRT). Commercial treatment planning systems (TPSs), in particular those based on the pencil-beam method, do not provide accurate dose computation for the lungs and other cavity-laden body sites such as the head and neck. In this paper we present the use of the Monte Carlo (MC) technique for dose re-calculation of IMRT of head and neck cancers. In our clinic, a turn-key software system is set up for MC calculation and comparison with TPS-calculated treatment plans as part of the quality assurance (QA) programme for IMRT delivery. A set of 10 off-the-shelf PCs is employed as the MC calculation engine with treatment plan parameters imported from the TPS via a graphical user interface (GUI) which also provides a platform for launching remote MC simulation and subsequent dose comparison with the TPS. The TPS-segmented intensity maps are used as input for the simulation, hence skipping the time-consuming simulation of the multi-leaf collimator (MLC). The primary objective of this approach is to assess the accuracy of the TPS calculations in the presence of air cavities in the head and neck, whereas the accuracy of leaf segmentation is verified by fluence measurement using a fluoroscopic camera-based imaging device. This measurement can also validate the correct transfer of intensity maps to the record-and-verify system. Comparisons between TPS and MC calculations of 6 MV IMRT for typical head and neck treatments reveal regional consistency in dose distribution except at and around the sinuses, where our pencil-beam-based TPS sometimes over-predicts the dose by up to 10%, depending on the size of the cavities. In addition, dose re-buildup of up to 4% is observed at the posterior nasopharyngeal

  8. First macro Monte Carlo based commercial dose calculation module for electron beam treatment planning—new issues for clinical consideration

    Science.gov (United States)

    Ding, George X.; Duggan, Dennis M.; Coffey, Charles W.; Shokrani, Parvaneh; Cygler, Joanna E.

    2006-06-01

    The purpose of this study is to present our experience of commissioning, testing and use of the first commercial macro Monte Carlo based dose calculation algorithm for electron beam treatment planning and to investigate new issues regarding dose reporting (dose-to-water versus dose-to-medium) as well as statistical uncertainties for the calculations arising when Monte Carlo based systems are used in patient dose calculations. All phantoms studied were obtained by CT scan. The calculated dose distributions and monitor units were validated against measurements with film and ionization chambers in phantoms containing two-dimensional (2D) and three-dimensional (3D) type low- and high-density inhomogeneities at different source-to-surface distances. Beam energies ranged from 6 to 18 MeV. New required experimental input data for commissioning are presented. The result of validation shows an excellent agreement between calculated and measured dose distributions. The calculated monitor units were within 2% of measured values except in the case of a 6 MeV beam and small cutout fields at extended SSDs (>110 cm). The investigation on the new issue of dose reporting demonstrates the differences up to 4% for lung and 12% for bone when 'dose-to-medium' is calculated and reported instead of 'dose-to-water' as done in a conventional system. The accuracy of the Monte Carlo calculation is shown to be clinically acceptable even for very complex 3D-type inhomogeneities. As Monte Carlo based treatment planning systems begin to enter clinical practice, new issues, such as dose reporting and statistical variations, may be clinically significant. Therefore it is imperative that a consistent approach to dose reporting is used.
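
    For readers unfamiliar with the dose-reporting issue raised above, the toy snippet below converts dose-to-medium to dose-to-water with an assumed water-to-medium stopping-power ratio per material; the numerical ratios are illustrative only and are not the values used by the commercial algorithm.

```python
# Back-of-the-envelope illustration of the dose-reporting issue: converting
# dose-to-medium to dose-to-water with an (assumed) water-to-medium mass
# collision stopping-power ratio, following the usual Bragg-Gray style
# relation D_w = D_m * s_{w,m}. The ratios below are rough illustrative
# numbers, not the values used by the commercial algorithm.
STOPPING_POWER_RATIO_W_M = {      # s_{w,m} for therapeutic electrons (illustrative)
    "soft tissue":   1.00,
    "lung":          1.00,
    "cortical bone": 1.11,        # of the order of the ~12% bone difference quoted above
}

def dose_to_water(dose_to_medium_gy, medium):
    return dose_to_medium_gy * STOPPING_POWER_RATIO_W_M[medium]

for medium in STOPPING_POWER_RATIO_W_M:
    print(medium, dose_to_water(2.00, medium), "Gy (from 2.00 Gy to medium)")
```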

  9. First macro Monte Carlo based commercial dose calculation module for electron beam treatment planning-new issues for clinical consideration

    International Nuclear Information System (INIS)

    The purpose of this study is to present our experience of commissioning, testing and use of the first commercial macro Monte Carlo based dose calculation algorithm for electron beam treatment planning and to investigate new issues regarding dose reporting (dose-to-water versus dose-to-medium) as well as statistical uncertainties for the calculations arising when Monte Carlo based systems are used in patient dose calculations. All phantoms studied were obtained by CT scan. The calculated dose distributions and monitor units were validated against measurements with film and ionization chambers in phantoms containing two-dimensional (2D) and three-dimensional (3D) type low- and high-density inhomogeneities at different source-to-surface distances. Beam energies ranged from 6 to 18 MeV. New required experimental input data for commissioning are presented. The result of validation shows an excellent agreement between calculated and measured dose distributions. The calculated monitor units were within 2% of measured values except in the case of a 6 MeV beam and small cutout fields at extended SSDs (>110 cm). The investigation on the new issue of dose reporting demonstrates the differences up to 4% for lung and 12% for bone when 'dose-to-medium' is calculated and reported instead of 'dose-to-water' as done in a conventional system. The accuracy of the Monte Carlo calculation is shown to be clinically acceptable even for very complex 3D-type inhomogeneities. As Monte Carlo based treatment planning systems begin to enter clinical practice, new issues, such as dose reporting and statistical variations, may be clinically significant. Therefore it is imperative that a consistent approach to dose reporting is used

  10. Monte Carlo based calibration of scintillation detectors for laboratory and in situ gamma ray measurements

    NARCIS (Netherlands)

    van der Graaf, E. R.; Limburg, J.; Koomans, R. L.; Tijs, M.

    2011-01-01

    The calibration of scintillation detectors for gamma radiation in a well characterized setup can be transferred to other geometries using Monte Carlo simulations to account for the differences between the calibration and the other geometry. In this study a calibration facility was used that is const

  11. Performance analysis based on a Monte Carlo simulation of a liquid xenon PET detector

    International Nuclear Information System (INIS)

    Liquid xenon is a very attractive medium for position-sensitive gamma-ray detectors for a very wide range of applications, namely, in medical radionuclide imaging. Recently, the authors have proposed a liquid xenon detector for positron emission tomography (PET). In this paper, some aspects of the performance of a liquid xenon PET detector prototype were studied by means of Monte Carlo simulation

  12. The information-based complexity of approximation problem by adaptive Monte Carlo methods

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this paper, we study the information-based complexity of the approximation problem on the multivariate Sobolev space with bounded mixed derivative MW^r_{p,α}(T^d), 1 < p < ∞, in the norm of L_q(T^d), 1 < q < ∞, by adaptive Monte Carlo methods. Applying the discretization technique and some properties of the pseudo-s-scale, we determine the exact asymptotic orders of this problem.

  13. Development of three-dimensional program based on Monte Carlo and discrete ordinates bidirectional coupling method

    International Nuclear Information System (INIS)

    The Monte Carlo (MC) and discrete ordinates (SN) methods are commonly used in the design of radiation shielding. The Monte Carlo method is able to treat the geometry exactly, but is time-consuming in dealing with deep penetration problems. The discrete ordinates method has great computational efficiency, but it is costly in computer memory and suffers from the ray effect. Neither the discrete ordinates method nor the Monte Carlo method alone is adequate for shielding calculations of large, complex nuclear facilities. In order to solve this problem, a Monte Carlo and discrete ordinates bidirectional coupling method is developed. The bidirectional coupling method is implemented in an interface program to transfer the particle probability distribution of the MC calculation and the angular flux of the discrete ordinates calculation. The coupling method combines the advantages of MC and SN. Test problems in Cartesian and cylindrical coordinates have been calculated with the coupling method. The calculated results are compared with MCNP and TORT, and satisfactory agreement is obtained, demonstrating the correctness of the program. (authors)

  14. Geometry navigation acceleration based on automatic neighbor search and oriented bounding box in Monte Carlo simulation

    International Nuclear Information System (INIS)

    Geometry navigation plays the most fundamental role in Monte Carlo particle transport simulation. It is mainly responsible for determining which geometry volume a particle is located in and for computing the distance to the volume boundary along the particle trajectory during each particle history. Geometry navigation directly affects the run-time performance of the Monte Carlo particle transport simulation, especially for large-scale, complicated systems. Two geometry acceleration algorithms, the automatic neighbor search algorithm and the oriented bounding box algorithm, are presented for improving geometry navigation performance. The algorithms have been implemented in the Super Monte Carlo Calculation Program for Nuclear and Radiation Process (SuperMC) version 2.0. The FDS-II and ITER benchmark models have been tested to highlight the efficiency gains that can be achieved by using the acceleration algorithms. The exact gains may be problem dependent, but testing results showed that the runtime of the Monte Carlo simulation can be reduced considerably, by 50%-60%, with the proposed acceleration algorithms. (author)
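
    The following sketch shows the kind of bounding-box test underlying the second acceleration algorithm: a cheap slab intersection against a volume's (here axis-aligned) bounding box either rejects the volume or returns a conservative distance bound before the expensive exact surface check. It is a generic illustration, not SuperMC code; an oriented box would add a rotation into the box frame first.

```python
# Sketch of the bounding-box acceleration idea: a slab test against a
# volume's axis-aligned bounding box rejects volumes the ray cannot hit and
# yields a safe distance bound for the distance-to-boundary computation.
import math

def ray_box_distance(origin, direction, box_min, box_max, eps=1e-12):
    """Return entry distance to the box along the ray, or None if missed."""
    t_near, t_far = -math.inf, math.inf
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < eps:                  # ray parallel to this pair of slabs
            if o < lo or o > hi:
                return None
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far or t_far < 0:
            return None
    return max(t_near, 0.0)

# Example: a particle heading along +x toward a unit box offset by 2 cm in x.
print(ray_box_distance((0, 0.5, 0.5), (1, 0, 0), (2, 0, 0), (3, 1, 1)))  # -> 2.0
```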

  15. A Nonvolatile MOSFET Memory Device Based on Mobile Protons in SiO(2) Thin Films

    Energy Technology Data Exchange (ETDEWEB)

    Vanheusden, K.; Warren, W.L.; Devine, R.A.B.; Fleetwood, D.M.; Draper, B.L.; Schwank, J.R.

    1999-03-02

    It is shown how mobile H{sup +} ions can be generated thermally inside the oxide layer of Si/SiO{sub 2}/Si structures. The technique involves only standard silicon processing steps: the nonvolatile field effect transistor (NVFET) is based on a standard MOSFET with thermally grown SiO{sub 2} capped with a poly-silicon layer. The capped thermal oxide receives an anneal at {approximately}1100 C that enables the incorporation of the mobile protons into the gate oxide. The introduction of the protons is achieved by a subsequent 500-800 C anneal in a hydrogen-containing ambient, such as forming gas (N{sub 2}:H{sub 2} 95:5). The mobile protons are stable and entrapped inside the oxide layer, and unlike alkali ions, their space-charge distribution can be controlled and rapidly rearranged at room temperature by an applied electric field. Using this principle, a standard MOS transistor can be converted into a nonvolatile memory transistor that can be switched between normally on and normally off. Switching speed, retention, endurance, and radiation tolerance data are presented showing that this non-volatile memory technology can be competitive with existing Si-based non-volatile memory technologies such as the floating gate technologies (e.g. Flash memory).

  16. Qualitative Simulation of Photon Transport in Free Space Based on Monte Carlo Method and Its Parallel Implementation

    Directory of Open Access Journals (Sweden)

    Jimin Liang

    2010-01-01

    Full Text Available During the past decade, the Monte Carlo method has found wide application in optical imaging for simulating the photon transport process inside tissues. However, this method has not yet been effectively extended to the simulation of free-space photon transport. In this paper, a uniform framework for noncontact optical imaging is proposed based on the Monte Carlo method, which consists of the simulation of photon transport both in tissues and in free space. Specifically, the simplification theory of the lens system is utilized to model the camera lens of the optical imaging system, and the Monte Carlo method is employed to describe the energy transformation from the tissue surface to the CCD camera. Also, the focusing effect of the camera lens is considered to establish the relationship of corresponding points between the tissue surface and the CCD camera. Furthermore, a parallel version of the framework is realized, making the simulation much more convenient and effective. The feasibility of the uniform framework and the effectiveness of the parallel version are demonstrated with a cylindrical phantom based on real experimental results.

  17. Dual-energy CT-based material extraction for tissue segmentation in Monte Carlo dose calculations

    Science.gov (United States)

    Bazalova, Magdalena; Carrier, Jean-François; Beaulieu, Luc; Verhaegen, Frank

    2008-05-01

    Monte Carlo (MC) dose calculations are performed on patient geometries derived from computed tomography (CT) images. For most available MC codes, the Hounsfield units (HU) in each voxel of a CT image have to be converted into mass density (ρ) and material type. This is typically done with a (HU; ρ) calibration curve which may lead to mis-assignment of media. In this work, an improved material segmentation using dual-energy CT-based material extraction is presented. For this purpose, the differences in extracted effective atomic numbers Z and the relative electron densities ρe of each voxel are used. Dual-energy CT material extraction based on parametrization of the linear attenuation coefficient for 17 tissue-equivalent inserts inside a solid water phantom was done. Scans of the phantom were acquired at 100 kVp and 140 kVp from which Z and ρe values of each insert were derived. The mean errors on Z and ρe extraction were 2.8% and 1.8%, respectively. Phantom dose calculations were performed for 250 kVp and 18 MV photon beams and an 18 MeV electron beam in the EGSnrc/DOSXYZnrc code. Two material assignments were used: the conventional (HU; ρ) and the novel (HU; ρ, Z) dual-energy CT tissue segmentation. The dose calculation errors using the conventional tissue segmentation were as high as 17% in a mis-assigned soft bone tissue-equivalent material for the 250 kVp photon beam. Similarly, the errors for the 18 MeV electron beam and the 18 MV photon beam were up to 6% and 3% in some mis-assigned media. The assignment of all tissue-equivalent inserts was accurate using the novel dual-energy CT material assignment. As a result, the dose calculation errors were below 1% in all beam arrangements. Comparable improvement in dose calculation accuracy is expected for human tissues. The dual-energy tissue segmentation offers a significantly higher accuracy compared to the conventional single-energy segmentation.
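
    A hedged sketch of the extraction step, assuming a simplified two-term parametrization of the linear attenuation coefficient (the exponent and basis coefficients below are invented calibration constants, not those of the paper): the ratio of the two measured attenuation values yields the effective atomic number, after which either scan gives the relative electron density.

```python
# Hedged sketch of the dual-energy extraction idea: assume the linear
# attenuation coefficient factorizes as mu(E) = rho_e * (alpha(E) + beta(E) * Z**m),
# a simplified parametrization in the spirit of the one used above. With two
# scans the ratio mu_low/mu_high gives Z, and either equation then gives rho_e.
# The alpha/beta/m values below are made-up calibration constants.
M = 3.62                                  # assumed exponent of the photoelectric-like term
ALPHA = {"low": 0.020, "high": 0.018}     # Compton-like basis coefficients (assumed)
BETA  = {"low": 3.0e-5, "high": 1.2e-5}   # photoelectric-like basis coefficients (assumed)

def extract_z_rhoe(mu_low, mu_high):
    r = mu_low / mu_high
    z_m = (ALPHA["low"] - r * ALPHA["high"]) / (r * BETA["high"] - BETA["low"])
    z_eff = z_m ** (1.0 / M)
    rho_e = mu_low / (ALPHA["low"] + BETA["low"] * z_m)
    return z_eff, rho_e

# Forward-model a water-like insert (Z ~ 7.4, rho_e = 1.0) and recover it.
z_true, rho_true = 7.4, 1.0
mu_l = rho_true * (ALPHA["low"] + BETA["low"] * z_true ** M)
mu_h = rho_true * (ALPHA["high"] + BETA["high"] * z_true ** M)
print(extract_z_rhoe(mu_l, mu_h))         # approximately (7.4, 1.0)
```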

  18. Monte Carlo simulation and radiometric characterization of proton irradiated [18O]H₂O for the treatment of the waste streams originated from [18F]FDG synthesis process.

    Science.gov (United States)

    Remetti, Romolo; Burgio, Nunzio T; Maciocco, Luca; Arcese, Manuele; Filannino, M Azzurra

    2011-07-01

    The aim of this work is to quantify the radionuclidic impurities of the irradiated [(18)O]water originating from the [(18)F]FDG synthesis process, and to characterize, from a radioprotection point of view, the waste streams produced. Two samples of 2.4 ml [(18)O]H(2)O, contained in two different target cells, were irradiated with a proton current of 37 μA in a PETtrace cyclotron for about one hour each; after irradiation, without performing any chemical purification process but waiting only for the (18)F decay, they were transferred into two vials and measured by HPGe gamma spectrometry and, subsequently, by Liquid Scintillation Counting. Previously, Monte Carlo calculations had been carried out in order to estimate the radionuclides generated within the target components ([(18)O]H(2)O, silver body and Havar® foil), with the aim of identifying the nuclides expected to be found in the irradiated water. Experimental results for the two samples, normalized to the same irradiation time, show practically the same value of tritium concentration (about 36 kBq/ml) while the activity concentrations of gamma emitters exhibit a greater spread. Considering that tritium derives from water activation while other pollutants are caused by activated cell materials released into water through erosion/corrosion mechanisms, such a spread is likely to be attributable to differences in the proton beam shape and position (production of different natural circulation patterns inside the target and different erosion mechanisms of the target cell walls). Both tritium and the other radioactive pollutants exhibit absolute values of activity and activity concentrations below the exemption limits set down in EURATOM Council Directive 96/29. PMID:21353574

  19. Biopolymer Materials Based Carboxymethyl Cellulose as a Proton Conducting Biopolymer Electrolyte for Application in Rechargeable Proton Battery

    International Nuclear Information System (INIS)

    This paper presents the discovery of a proton conducting biopolymer electrolyte (BPE) prepared by incorporating various NH4Br compositions (wt%) into the biopolymer material carboxymethyl cellulose (CMC) via a solution casting method. The biopolymer–salt complex formation has been analyzed through Fourier Transform Infrared (FTIR) spectroscopy, Thermo Gravimetric Analysis (TGA), impedance spectroscopy and transference number measurement (TNM). The highest ionic conductivity at ambient temperature is 1.12 × 10−4 S cm−1 for the sample containing 25 wt% NH4Br. It has been shown that the conducting species in this work is predominantly the proton (H+), which was confirmed via FTIR and TNM analysis. A rechargeable proton conducting BPE battery has been fabricated with the configuration Zn + ZnSO4.7H2O/BPE/MnO2, producing a maximum open circuit potential (OCP) of 1.36 V at ambient temperature and showing good rechargeability. This work implies the possible practical application of the present electrolytes in the fabrication of electrochemical devices

  20. Determining the incident electron fluence for Monte Carlo-based photon treatment planning using a standard measured data set

    International Nuclear Information System (INIS)

    An accurate dose calculation in phantom and patient geometries requires an accurate description of the radiation source. Errors in the radiation source description are propagated through the dose calculation. With the emergence of linear accelerators whose dosimetric characteristics are similar to within measurement uncertainty, the same radiation source description can be used as the input to dose calculation for treatment planning at many institutions with the same linear accelerator model. Our goal in the current research was to determine the initial electron fluence above the linear accelerator target for such an accelerator to allow a dose calculation in water to within 1% or 1 mm of the measured data supplied by the manufacturer. The method used for both the radiation source description and the patient transport was Monte Carlo. The linac geometry was input into the Monte Carlo code using the accelerator's manufacturer's specifications. Assumptions about the initial electron source above the target were made based on previous studies. The free parameters derived for the calculations were the mean energy and radial Gaussian width of the initial electron fluence and the target density. A combination of the free parameters yielded an initial electron fluence that, when transported through the linear accelerator and into the phantom, allowed a dose-calculation agreement to the experimental ion chamber data to within the specified criteria at both 6 and 18 MV nominal beam energies, except near the surface, particularly for the 18 MV beam. To save time during Monte Carlo treatment planning, the initial electron fluence was transported through part of the treatment head to a plane between the monitor chambers and the jaws and saved as phase-space files. These files are used for clinical Monte Carlo-based treatment planning and are freely available from the authors
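
    A minimal illustration of the source model described above (initial electrons with a radially Gaussian fluence and a chosen mean energy); the numerical values are placeholders rather than the tuned free parameters of the study.

```python
# Minimal sketch of sampling the initial electron fluence above the target:
# radially Gaussian intensity, fixed mean energy, incidence along -z. The
# mean energy and Gaussian width below are placeholder values.
import numpy as np

def sample_initial_electrons(n, mean_energy_mev=6.2, radial_sigma_mm=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, radial_sigma_mm, n)     # radial Gaussian is Gaussian in x and y
    y = rng.normal(0.0, radial_sigma_mm, n)
    energy = np.full(n, mean_energy_mev)        # monoenergetic beam assumed here
    direction = np.tile([0.0, 0.0, -1.0], (n, 1))   # incident along -z onto the target
    return x, y, energy, direction

x, y, e, d = sample_initial_electrons(5)
print(np.c_[x, y, e])
```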

  1. An easily sintered, chemically stable, barium zirconate-based proton conductor for high-performance proton-conducting solid oxide fuel cells

    KAUST Repository

    Sun, Wenping

    2014-07-25

    Yttrium and indium co-doped barium zirconate is investigated to develop a chemically stable and sintering active proton conductor for solid oxide fuel cells (SOFCs). BaZr0.8Y0.2-xInxO3- δ possesses a pure cubic perovskite structure. The sintering activity of BaZr0.8Y0.2-xInxO3- δ increases significantly with In concentration. BaZr0.8Y0.15In0.05O3- δ (BZYI5) exhibits the highest total electrical conductivity among the sintered oxides. BZYI5 also retains high chemical stability against CO2, vapor, and reduction of H2. The good sintering activity, high conductivity, and chemical stability of BZYI5 facilitate the fabrication of durable SOFCs based on a highly conductive BZYI5 electrolyte film by cost-effective ceramic processes. Fully dense BZYI5 electrolyte film is successfully prepared on the anode substrate by a facile drop-coating technique followed by co-firing at 1400 °C for 5 h in air. The BZYI5 film exhibits one of the highest conductivity among the BaZrO3-based electrolyte films with various sintering aids. BZYI5-based single cells output very encouraging and by far the highest peak power density for BaZrO3-based proton-conducting SOFCs, reaching as high as 379 mW cm-2 at 700 °C. The results demonstrate that Y and In co-doping is an effective strategy for exploring sintering active and chemically stable BaZrO3-based proton conductors for high performance proton-conducting SOFCs. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Future colliders based on a modulated proton bunch driven plasma wakefield acceleration

    CERN Document Server

    Xia, Guoxing; Muggli, Patric

    2012-01-01

    Recent simulation shows that a self-modulated high energy proton bunch can excite a large amplitude plasma wakefield and accelerate an externally injected electron bunch to the energy frontier in a single stage acceleration through a long plasma channel. Based on this scheme, future colliders, either an electron-positron linear collider (e+-e- collider) or an electron-hadron collider (e-p collider) can be conceived. In this paper, we discuss some key design issues for an e+-e- collider and a high energy e-p collider, based on the existing infrastructure of the CERN accelerator complex.

  3. Proton exchange membrane fuel cells modeling based on artificial neural networks

    Institute of Scientific and Technical Information of China (English)

    Yudong Tian; Xinjian Zhu; Guangyi Cao

    2005-01-01

    Given the complexity of the mathematical models of a proton exchange membrane fuel cell (PEMFC) and their shortcomings for practical PEMFC control, the complex PEMFC mechanism and the existing PEMFC models are analyzed, and artificial neural network based PEMFC modeling is advanced. The structure, algorithm, training and simulation of the PEMFC model based on improved BP networks are given in detail. Computer simulation and experiment verify that this model is fast and accurate, and can be used as a suitable operational model for PEMFC real-time control.
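
    As an illustration of the modelling idea, the sketch below fits a small feed-forward (back-propagation-style) network to synthetic polarization-curve data; the choice of input features, the toy voltage model and the use of scikit-learn's MLPRegressor are assumptions for demonstration, not the improved BP network of the paper.

```python
# Sketch of neural-network PEMFC modelling: a small feed-forward network maps
# operating conditions to cell voltage. Training data are synthetic points
# from a toy polarization curve; features and constants are assumed.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
current_density = rng.uniform(0.05, 1.2, 400)    # A/cm^2 (assumed feature)
temperature = rng.uniform(323.0, 353.0, 400)     # K (assumed feature)

# Toy polarization curve: open-circuit voltage minus activation and ohmic losses.
voltage = (1.0 - 0.06 * np.log(current_density / 0.01)
           - (0.25 - 0.002 * (temperature - 323.0)) * current_density
           + rng.normal(0.0, 0.005, 400))

X = np.column_stack([current_density, temperature])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X, voltage)

print("Predicted cell voltage at 0.6 A/cm^2, 343 K:",
      model.predict([[0.6, 343.0]])[0])
```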

  4. Uncertainties in Monte Carlo-based absorbed dose calculations for an experimental benchmark

    International Nuclear Information System (INIS)

    There is a need to verify the accuracy of general purpose Monte Carlo codes like EGSnrc, which are commonly employed for investigations of dosimetric problems in radiation therapy. A number of experimental benchmarks have been published to compare calculated values of absorbed dose to experimentally determined values. However, there is a lack of absolute benchmarks, i.e. benchmarks without involved normalization which may cause some quantities to be cancelled. Therefore, at the Physikalisch-Technische Bundesanstalt a benchmark experiment was performed, which aimed at the absolute verification of radiation transport calculations for dosimetry in radiation therapy. A thimble-type ionization chamber in a solid phantom was irradiated by high-energy bremsstrahlung and the mean absorbed dose in the sensitive volume was measured per incident electron of the target. The characteristics of the accelerator and experimental setup were precisely determined and the results of a corresponding Monte Carlo simulation with EGSnrc are presented within this study. For a meaningful comparison, an analysis of the uncertainty of the Monte Carlo simulation is necessary. In this study uncertainties with regard to the simulation geometry, the radiation source, transport options of the Monte Carlo code and specific interaction cross sections are investigated, applying the general methodology of the Guide to the expression of uncertainty in measurement. Besides studying the general influence of changes in transport options of the EGSnrc code, uncertainties are analyzed by estimating the sensitivity coefficients of various input quantities in a first step. Secondly, standard uncertainties are assigned to each quantity which are known from the experiment, e.g. uncertainties for geometric dimensions. Data for more fundamental quantities such as photon cross sections and the I-value of electron stopping powers are taken from literature. The significant uncertainty contributions are identified as

  5. Uncertainties in Monte Carlo-based absorbed dose calculations for an experimental benchmark

    Science.gov (United States)

    Renner, F.; Wulff, J.; Kapsch, R.-P.; Zink, K.

    2015-10-01

    There is a need to verify the accuracy of general purpose Monte Carlo codes like EGSnrc, which are commonly employed for investigations of dosimetric problems in radiation therapy. A number of experimental benchmarks have been published to compare calculated values of absorbed dose to experimentally determined values. However, there is a lack of absolute benchmarks, i.e. benchmarks without involved normalization which may cause some quantities to be cancelled. Therefore, at the Physikalisch-Technische Bundesanstalt a benchmark experiment was performed, which aimed at the absolute verification of radiation transport calculations for dosimetry in radiation therapy. A thimble-type ionization chamber in a solid phantom was irradiated by high-energy bremsstrahlung and the mean absorbed dose in the sensitive volume was measured per incident electron of the target. The characteristics of the accelerator and experimental setup were precisely determined and the results of a corresponding Monte Carlo simulation with EGSnrc are presented within this study. For a meaningful comparison, an analysis of the uncertainty of the Monte Carlo simulation is necessary. In this study uncertainties with regard to the simulation geometry, the radiation source, transport options of the Monte Carlo code and specific interaction cross sections are investigated, applying the general methodology of the Guide to the expression of uncertainty in measurement. Besides studying the general influence of changes in transport options of the EGSnrc code, uncertainties are analyzed by estimating the sensitivity coefficients of various input quantities in a first step. Secondly, standard uncertainties are assigned to each quantity which are known from the experiment, e.g. uncertainties for geometric dimensions. Data for more fundamental quantities such as photon cross sections and the I-value of electron stopping powers are taken from literature. The significant uncertainty contributions are identified as
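
    The uncertainty combination referred to above can be illustrated with a GUM-style sketch: sensitivity coefficients are estimated by finite differences of the (here mocked) simulation result with respect to each input quantity and combined in quadrature with the corresponding standard uncertainties. All functions and numbers below are placeholders standing in for the EGSnrc calculation.

```python
# GUM-style combination sketch: sensitivity coefficients c_i = dD/dx_i from
# finite differences of a toy "simulation", combined as
# u_c^2 = sum_i (c_i * u_i)^2. The model and all numbers are placeholders.
import math

def simulate_dose(beam_energy_mev, chamber_radius_mm, i_value_ev):
    # toy stand-in for the Monte Carlo result (dose per incident electron)
    return (1.0e-17 * beam_energy_mev
            * (1.0 + 0.02 * (chamber_radius_mm - 3.0))
            * (1.0 - 0.001 * (i_value_ev - 78.0)))

nominal = {"beam_energy_mev": 20.0, "chamber_radius_mm": 3.0, "i_value_ev": 78.0}
std_unc = {"beam_energy_mev": 0.1, "chamber_radius_mm": 0.02, "i_value_ev": 2.0}

d0 = simulate_dose(**nominal)
contributions = {}
for name, u in std_unc.items():
    shifted = dict(nominal)
    shifted[name] += u                        # one-sided finite difference of one u_i
    c_i = (simulate_dose(**shifted) - d0) / u
    contributions[name] = (c_i * u) ** 2

u_c = math.sqrt(sum(contributions.values()))
print(f"dose = {d0:.3e} Gy, combined standard uncertainty = {u_c:.2e} Gy "
      f"({100 * u_c / d0:.2f} %)")
```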

  6. Proton induced X-ray emission and proton induced gamma ray emission analysis in geochemical exploration for gold and base metal deposits

    Energy Technology Data Exchange (ETDEWEB)

    Pwa, Aung E-mail: a_pwa@postoffice.utas.edu.au; Siegele, R.; Cohen, D.D.; Stelcer, E.; Moort, J.C. van

    2002-05-01

    Proton induced X-ray emission (PIXE) and proton induced gamma ray emission (PIGME) analysis has been used in geochemical exploration to determine various elements in rocks and regolith in relation to gold and base metal mineralisation. Elements analysed by PIXE include K, Fe, Ca, Ti, Mn, Cl, Ga, Rb, Sr, Zr, Y, Nb, Cu, Zn, Pb, Ni, As, V and Mo, and those by PIGME are Al, Na, Mg, F and Li. One of our research areas is Cobar, northwest of New South Wales, Australia. The study areas include the McKinnons and Peak gold deposits, the Wagga Tank base metal deposit and Lower Tank prospect, northeast of the CSA mine. Au, Cu, Zn, Pb, As and Ni are elevated as ore indicators near and around the ore deposits while K, Al, Ca, Na, Ti, Rb, Sr, Ga and V are depleted due to feldspar and mica destruction during alteration.

  7. Proton induced X-ray emission and proton induced gamma ray emission analysis in geochemical exploration for gold and base metal deposits

    International Nuclear Information System (INIS)

    Proton induced X-ray emission (PIXE) and proton induced gamma ray emission (PIGME) analysis has been used in geochemical exploration to determine various elements in rocks and regolith in relation to gold and base metal mineralisation. Elements analysed by PIXE include K, Fe, Ca, Ti, Mn, Cl, Ga, Rb, Sr, Zr, Y, Nb, Cu, Zn, Pb, Ni, As, V and Mo, and those by PIGME are Al, Na, Mg, F and Li. One of our research areas is Cobar, northwest of New South Wales, Australia. The study areas include the McKinnons and Peak gold deposits, the Wagga Tank base metal deposit and Lower Tank prospect, northeast of the CSA mine. Au, Cu, Zn, Pb, As and Ni are elevated as ore indicators near and around the ore deposits while K, Al, Ca, Na, Ti, Rb, Sr, Ga and V are depleted due to feldspar and mica destruction during alteration

  8. SPL-based Proton Driver for a nu-Factory at CERN

    CERN Document Server

    Benedetto, E; Garoby, R; Meddahi, M

    2010-01-01

    The conceptual design and feasibility studies for a nu-Factory Proton Driver based on the CERN Superconducting Proton Linac (SPL) have been completed. In the proposed scenario, the 4 MW proton beam (H- beam) is accelerated with the upgraded High Power (HP)-SPL to 5 GeV, stored in an accumulator ring and finally transported to a compressor ring, where bunch rotation takes place, in order to achieve the specific time structure. We here summarize the choices in terms of lattice, magnet technology and RF manipulations in the two rings. The possible critical issues, such as heating of the foil for the charge-exchange injection, space-charge problems in the compressor and beam stability in the accumulator ring, have been addressed and are shown not to be show-stoppers. The analysis focuses on the baseline scenario, considering 6 bunches in the accumulator, and preliminary studies are discussed for the option of 3 or a single bunch per burst.

  9. Development of a hybrid multi-scale phantom for Monte-Carlo based internal dosimetry

    International Nuclear Information System (INIS)

    Full text of publication follows. Aim: in recent years several phantoms were developed for radiopharmaceutical dosimetry in clinical and preclinical settings. Voxel-based models (Zubal, Max/Fax, ICRP110) were developed to reach a level of realism that could not be achieved by mathematical models. In turn, 'hybrid' models (XCAT, MOBY/ROBY, Mash/Fash) allow a further degree of versatility by offering the possibility to finely tune each model according to various parameters. However, even 'hybrid' models require the generation of a voxel version for Monte-Carlo modeling of radiation transport. Since absorbed dose simulation time is strictly related to the geometry spatial sampling, a compromise must be made between phantom realism and simulation speed. This trade-off leads on one side to an overestimation of the size of small radiosensitive structures such as the skin or hollow organs' walls, and on the other hand to unnecessarily detailed voxelization of large, homogeneous structures. The aim of this work is to develop a hybrid multi-resolution phantom model for Geant4 and Gate, to better characterize energy deposition in small structures while preserving reasonable computation times. Materials and Methods: we have developed a pipeline for the conversion of preexisting phantoms into a multi-scale Geant4 model. Meshes of each organ are created from raw binary images of a phantom and then voxelized to the smallest spatial sampling required by the user. The user can then decide to re-sample the internal part of each organ, while leaving a layer of the smallest voxels at the edge of the organ. In this way, the realistic shape of the organ is maintained while reducing the voxel number in the inner part. For hollow organs, the wall is always modeled using the smallest voxel sampling. This approach allows choosing different voxel resolutions for each organ according to a specific application. Results: preliminary results show that it is possible to

  10. A comparison of Monte Carlo generators

    CERN Document Server

    Golan, Tomasz

    2014-01-01

    A comparison of the GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: proton multiplicity, total visible energy, most energetic proton momentum, and $\pi^+$ two-dimensional energy vs cosine distribution.

  11. Preparation and Conducting Behavior of Amphibious Organic/Inorganic Hybrid Proton Exchange Membranes Based on Benzyltetrazole

    Institute of Scientific and Technical Information of China (English)

    QIAO Li-gen; SHI Wen-fang

    2012-01-01

    A series of novel amphibious organic/inorganic hybrid proton exchange membranes doped with H3PO4, which can be used under both wet and dry conditions, was prepared through a sol-gel process based on acrylated triethoxysilane (A-TES) and benzyltetrazole-modified triethoxysilane (BT-TES). A dual-curing approach including UV-curing and thermal curing was used to obtain the crosslinked membranes. Polyethylene glycol(400) diacrylate (PEGDA) was used as an oligomer to form the polymeric matrix. The molecular structures of the precursors were characterized by 1H, 13C and 29Si NMR spectra. The thermogravimetric analysis (TGA) results show that the membranes exhibit acceptable thermal stability for application above 200 ℃. The differential scanning calorimetry (DSC) measurements indicate that the crosslinked membranes with mass ratios of BT-TES to A-TES below 1.6 and with the same mass of H3PO4 doped as that of A-TES exhibit glass transition temperatures (Tgs), the lowest Tg (-28.9 ℃) being found for the membrane with double the mass of H3PO4 doped. A high proton conductivity in the range of 9.4-17.3 mS/cm, with a corresponding water uptake of 19.1%-32.8%, was measured for the membranes at 90 ℃ under wet conditions. Meanwhile, the proton conductivity in a dry environment for the membrane with a mass ratio of 2.4 of BT-TES to A-TES and double H3PO4 loading increases from 4.89×10-2 mS/cm at 30 ℃ to 25.7 mS/cm at 140 ℃. The excellent proton transport ability under both hydrous and anhydrous conditions demonstrates a potential application in polymer electrolyte membrane fuel cells.

  12. Calculation of the dynamic component of the radiation dose in HDR brachytherapy based on Monte Carlo simulations

    International Nuclear Information System (INIS)

    A method for the calculation of the transit doses in HDR brachytherapy based on Monte Carlo simulations is presented. The transit dose resulting from a linear implant with seven dwell positions is simulated by performing calculations at all positions in which the moving 192Ir source instantaneously had its geometrical centre located exactly between two adjacent dwell positions. Discrete step sizes of 0.25 cm were used to calculate the dose rates and the total transit dose at any of the calculation points evaluated. By comparing this method to the results obtained from Sievert integrals, we observed dose calculation errors ranging from 32% to 21% for the examples considered. The errors could be much higher for longer treatment lengths, where contributions from points near the longitudinal axis of the source become more important. To date, the most accurate method of calculating doses in radiotherapy is Monte Carlo simulation, but the long computational times associated with it render its use in treatment planning impracticable. The Sievert integral algorithms, on the other hand, are simple, versatile and very easy to use, but their accuracy has been repeatedly called into question for low energy isotopes like iridium. We therefore advocate a modification of the Sievert integral algorithms by superimposing the output from Monte Carlo simulations on the Sievert integrals when dealing with low energy isotopes. In this way, we would be combining accuracy, simplicity and reasonable computational times (author)
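
    A toy version of the transit-dose bookkeeping described above: the moving source is replaced by static positions 0.25 cm apart, each weighted by the time the source spends on that step, with a bare inverse-square kernel standing in for the Monte Carlo dose-rate data. The source speed and constants below are assumed.

```python
# Simplified transit-dose accumulation: static source positions 0.25 cm apart,
# each occupied for (step / source speed) seconds, and a point-source
# inverse-square kernel (no anisotropy or scatter) standing in for the full
# Monte Carlo dose-rate table. All numbers are illustrative.
import numpy as np

STEP_CM = 0.25
SOURCE_SPEED_CM_S = 50.0          # assumed inter-dwell travel speed
DOSE_RATE_CONST = 1.0             # cGy*cm^2/s at 1 cm (arbitrary units)

def transit_dose(calc_point, start_cm=0.0, end_cm=1.5):
    positions = np.arange(start_cm, end_cm + 1e-9, STEP_CM)
    dwell_time = STEP_CM / SOURCE_SPEED_CM_S
    dose = 0.0
    for x in positions:
        r2 = (calc_point[0] - x) ** 2 + calc_point[1] ** 2
        dose += DOSE_RATE_CONST / r2 * dwell_time     # inverse-square kernel
    return dose

# Dose at a point 2 mm lateral to the middle of a 1.5 cm travel path.
print(f"transit dose ~ {transit_dose((0.75, 0.2)):.3f} cGy (toy units)")
```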

  13. New approach based on tetrahedral-mesh geometry for accurate 4D Monte Carlo patient-dose calculation

    International Nuclear Information System (INIS)

    In the present study, to achieve accurate 4D Monte Carlo dose calculation in radiation therapy, we devised a new approach that combines (1) modeling of the patient body using tetrahedral-mesh geometry based on the patient’s 4D CT data, (2) continuous movement/deformation of the tetrahedral patient model by interpolation of deformation vector fields acquired through deformable image registration, and (3) direct transportation of radiation particles during the movement and deformation of the tetrahedral patient model. The results of our feasibility study show that it is certainly possible to construct 4D patient models (= phantoms) with sufficient accuracy using the tetrahedral-mesh geometry and to directly transport radiation particles during continuous movement and deformation of the tetrahedral patient model. This new approach not only produces more accurate dose distribution in the patient but also replaces the current practice of using multiple 3D voxel phantoms and combining multiple dose distributions after Monte Carlo simulations. For routine clinical application of our new approach, the use of fast automatic segmentation algorithms is a must. In order to achieve, simultaneously, both dose accuracy and computation speed, the number of tetrahedrons for the lungs should be optimized. Although the current computation speed of our new 4D Monte Carlo simulation approach is slow (i.e. ∼40 times slower than that of the conventional dose accumulation approach), this problem is resolvable by developing, in Geant4, a dedicated navigation class optimized for particle transportation in tetrahedral-mesh geometry. (paper)
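
    Step (2) of the approach, continuous deformation of the tetrahedral model by interpolating deformation vector fields, can be pictured with the small sketch below; the vertex coordinates and displacement field are made-up numbers, since real DVFs come from deformable image registration.

```python
# Toy illustration of continuous deformation of a tetrahedral patient model:
# vertices are moved by linearly interpolating a deformation vector field
# (DVF) between two 4D CT phases. The DVF here is invented for illustration.
import numpy as np

vertices_phase0 = np.array([[0.0, 0.0, 0.0],
                            [1.0, 0.0, 0.0],
                            [0.0, 1.0, 0.0],
                            [0.0, 0.0, 1.0]])      # one tetrahedron (cm)
dvf_to_phase1 = np.array([[0.0, 0.0, 0.30],
                          [0.0, 0.0, 0.25],
                          [0.0, 0.0, 0.28],
                          [0.0, 0.0, 0.32]])       # vertex displacements (cm)

def vertices_at(t):
    """Vertex positions at normalized time t in [0, 1] between the two phases."""
    return vertices_phase0 + t * dvf_to_phase1

for t in (0.0, 0.5, 1.0):
    print(f"t={t:.1f}\n{vertices_at(t)}")
```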

  14. Cloud-based Monte Carlo modelling of BSSRDF for the rendering of human skin appearance (Conference Presentation)

    Science.gov (United States)

    Doronin, Alexander; Rushmeier, Holly E.; Meglinski, Igor; Bykov, Alexander V.

    2016-03-01

    We present a new Monte Carlo based approach for the modelling of Bidirectional Scattering-Surface Reflectance Distribution Function (BSSRDF) for accurate rendering of human skin appearance. The variations of both skin tissues structure and the major chromophores are taken into account correspondingly to the different ethnic and age groups. The computational solution utilizes HTML5, accelerated by the graphics processing units (GPUs), and therefore is convenient for the practical use at the most of modern computer-based devices and operating systems. The results of imitation of human skin reflectance spectra, corresponding skin colours and examples of 3D faces rendering are presented and compared with the results of phantom studies.

  15. Monte Carlo-Based Dosimetry of Beta-Emitters for Intravascular Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Choi, C.K.

    2002-06-25

    Monte Carlo simulations for radiation dosimetry and the experimental verifications of the simulations have been developed for the treatment geometry of intravascular brachytherapy, a form of radionuclide therapy for occluded coronary disease (restenosis). The Monte Carlo code MCNP4C has been used to calculate the radiation dose from the encapsulated array of beta-emitting seeds (Sr/Y source train). Solid water phantoms have been fabricated to measure the dose on radiochromic films that were exposed to the beta source train for both linear and curved coronary vessel geometries. While the dose difference for the 5-degree curved vessel at the prescription point of f+2.0 mm is within the 10% guideline set by the AAPM, the difference increases dramatically to 16.85% for the 10-degree case, which requires additional adjustment for acceptable dosimetry planning. The experimental dose measurements agree well with the simulation results

  16. Tetrahedral-mesh-based computational human phantom for fast Monte Carlo dose calculations

    International Nuclear Information System (INIS)

    Although polygonal-surface computational human phantoms can address several critical limitations of conventional voxel phantoms, their Monte Carlo simulation speeds are much slower than those of voxel phantoms. In this study, we sought to overcome this problem by developing a new type of computational human phantom, a tetrahedral mesh phantom, by converting a polygonal surface phantom to a tetrahedral mesh geometry. The constructed phantom was implemented in the Geant4 Monte Carlo code to calculate organ doses as well as to measure computation speed; the values were then compared with those for the original polygonal surface phantom. It was found that using the tetrahedral mesh phantom significantly improved the computation speed by factors of between 150 and 832 considering all of the particles and simulated energies other than the low-energy neutrons (0.01 and 1 MeV), for which the improvement was less significant (17.2 and 8.8 times, respectively). (paper)

  17. Proton scanner

    International Nuclear Information System (INIS)

    The scanner is based on the nuclear scattering of high energy protons by the nucleons (protons and neutrons) contained in atomic nuclei. Because of the wide scattering angle, the three spatial coordinates of the interaction point can be computed, directly giving three-dimensional radiographs. The volume resolution is of the order of a few cubic millimetres. Because the underlying interaction is the strong nuclear force, the atomic-number dependence of the information obtained is different from that of the X-ray scanner, for which the underlying interaction is the electromagnetic force. (orig./VJ)

  18. Evaluation of IMRT plans of prostate carcinoma from four treatment planning systems based on Monte Carlo

    International Nuclear Information System (INIS)

    Objective: To recalculate with the Monte Carlo method the IMRT dose distributions from four TPSs, to provide a platform for independent comparison and evaluation of plan quality. These results will help make a clinical decision as to which TPS will be used for prostate IMRT planning. Methods: Eleven prostate cancer cases were planned with the Corvus, Xio, Pinnacle and Eclipse TPS. The plans were recalculated by Monte Carlo using the leaf sequences and MUs for the individual plans. Dose-volume-histograms and isodose distributions were compared. Other quantities such as Dmin (the minimum dose received by 99% of CTV/PTV), Dmax (the maximum dose received by 1% of CTV/PTV), V110%, V105%, V95% (the volume of CTV/PTV receiving 110%, 105%, 95% of the prescription dose), the volume of rectum and bladder receiving >65 Gy and >40 Gy, and the volume of femur receiving >50 Gy were evaluated. Total segments and MUs were also compared. Results: The Monte Carlo results agreed with the dose distributions from the TPS to within 3%/3 mm. The Xio, Pinnacle and Eclipse plans show less target dose heterogeneity and lower V65 and V40 for the rectum and bladder compared to the Corvus plans. The PTV Dmin is about 2 Gy lower for Xio plans than others, while the Corvus plans have slightly lower femoral head V50 (0.03% and 0.58%) than others. The Corvus plans require significantly the most segments (187.8) and MUs (1264.7) to deliver, and the Pinnacle plans require the fewest segments (82.4) and MUs (703.6). Conclusions: We have tested an independent Monte Carlo dose calculation system for dose reconstruction and plan evaluation. This system provides a platform for the fair comparison and evaluation of treatment plans to facilitate clinical decision making in selecting a TPS and beam delivery system for particular treatment sites. (authors)
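
    The plan metrics quoted above (Dmin as the dose received by 99% of the PTV, Dmax as the dose to the hottest 1%, and the Vx volumes) can be computed from a dose grid as in the following sketch; the dose values are random placeholders rather than data from the eleven cases.

```python
# Small helper illustrating how the plan metrics above can be pulled from a
# dose grid and a structure mask: Dmin/Dmax as percentile doses and Vx as the
# fractional volume receiving at least x% of the prescription.
import numpy as np

rng = np.random.default_rng(3)
prescription_gy = 76.0
ptv_dose = rng.normal(loc=78.0, scale=2.0, size=5000)   # doses in PTV voxels (Gy), placeholder

def d_near_min(dose):            # dose received by 99% of the volume
    return np.percentile(dose, 1)

def d_near_max(dose):            # dose received by the hottest 1% of the volume
    return np.percentile(dose, 99)

def v_percent(dose, pct):        # volume fraction receiving >= pct% of prescription
    return np.mean(dose >= prescription_gy * pct / 100.0)

print(f"Dmin(99%) = {d_near_min(ptv_dose):.1f} Gy, Dmax(1%) = {d_near_max(ptv_dose):.1f} Gy")
print(f"V95% = {v_percent(ptv_dose, 95):.3f}, V105% = {v_percent(ptv_dose, 105):.3f}")
```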

  19. Monte Carlo tests of the Rasch model based on scalability coefficients

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Kreiner, Svend

    that summarizes the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model using p-values computed using Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence and unequal item discrimination, are discussed. The methods are illustrated and motivated using a simulation study and a real data example.

  20. Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo

    Science.gov (United States)

    Qin, Junsong; Liu, Bingyi; Niu, Dongxiao

    By analyzing the factors that influence the investment capacity of a power grid, an investment capacity analysis model is built taking depreciation cost, sales price, sales quantity, net profit, financing and the GDP of the secondary industry as its variables. After carrying out Kolmogorov-Smirnov tests, the probability distribution of each influencing factor is obtained. Finally, the uncertainty analysis results for the grid investment capacity are obtained by Monte Carlo simulation.
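
    The workflow sketched in this abstract (fit a probability distribution to each influencing factor, check the fit with a Kolmogorov-Smirnov test, then propagate the factors through the capacity model by Monte Carlo sampling) can be illustrated as follows. The factor names, the normal-distribution assumption and the toy capacity relation are placeholders, not the authors' model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic historical observations of each influencing factor (placeholders)
history = {
    "depreciation_cost": rng.normal(10.0, 1.0, 30),
    "sales_price":       rng.normal(0.5, 0.05, 30),
    "sales_quantity":    rng.normal(200.0, 20.0, 30),
}

fitted = {}
for name, sample in history.items():
    mu, sigma = stats.norm.fit(sample)                    # candidate distribution
    ks_stat, p_value = stats.kstest(sample, "norm", args=(mu, sigma))
    print(f"{name}: K-S p-value = {p_value:.2f}")         # large p-value -> keep the fit
    fitted[name] = (mu, sigma)

# Monte Carlo propagation through a toy investment-capacity relation
n = 100_000
draw = {k: rng.normal(mu, sigma, n) for k, (mu, sigma) in fitted.items()}
capacity = draw["sales_price"] * draw["sales_quantity"] - draw["depreciation_cost"]
lo, hi = np.percentile(capacity, [5, 95])
print(f"capacity: mean {capacity.mean():.1f}, 5th-95th percentile [{lo:.1f}, {hi:.1f}]")
```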

  1. A Monte Carlo method based on antithetic variates for network reliability computations

    OpenAIRE

    El Khadiri, Mohamed; Rubino, Gerardo

    1992-01-01

    The exact evaluation of the usual reliability measures of communication networks is severely limited by the excessive computational time needed to obtain them. In the general case, the computation of almost all the interesting reliability metrics is an NP-hard problem. An alternative approach is to estimate them by means of a Monte Carlo simulation, which makes it possible to deal with larger models than those that can be evaluated exactly. In this paper, we propose an algorithm much more per...
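
    As a concrete illustration of the antithetic-variates idea (a toy example, not the authors' algorithm for general networks), the sketch below estimates the unreliability of a small two-path network by pairing every uniform sample U with its antithetic partner 1-U; the two paired estimates are negatively correlated, so their average has lower variance than two independent samples:

```python
import numpy as np

# Toy 4-link network: source and terminal are connected if (link0 and link1) or (link2 and link3)
p_fail = np.array([0.1, 0.2, 0.15, 0.05])    # independent link failure probabilities

def network_failed(u):
    """Boolean per replication: True if the network is disconnected for uniform draws u."""
    up = u >= p_fail                          # a link works when its uniform exceeds p_fail
    path1 = up[:, 0] & up[:, 1]
    path2 = up[:, 2] & up[:, 3]
    return ~(path1 | path2)

rng = np.random.default_rng(2)
n = 50_000
u = rng.random((n, 4))

crude = network_failed(rng.random((2 * n, 4))).mean()                      # 2n independent samples
pairs = 0.5 * (network_failed(u).astype(float) + network_failed(1.0 - u))  # n antithetic pairs
print("crude MC estimate    :", crude)
print("antithetic estimate  :", pairs.mean())
print("antithetic std. error:", pairs.std(ddof=1) / np.sqrt(n))
```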

  2. Detailed spatial measurements and Monte Carlo analysis of the transportation phenomena of thermal and epithermal neutrons from the 12-GeV proton transport line to an access maze

    International Nuclear Information System (INIS)

    In order to investigate the neutron transport from a beam-line tunnel to an access maze at a 12-GeV proton accelerator, we measured in detail the spatial distribution of thermal and epithermal neutrons using the Au activation method. Gold foils were placed at about 70 positions in the maze, with a copper target of 1 mm thickness either inserted into or extracted from the beam axis in front of the maze. After the end of accelerator operation, the relative activities of the Au foils were measured simultaneously using an imaging plate technique, and the radioactivity of one reference foil was also measured with an HPGe detector to convert the relative values to absolute activities for all foils. It was found that neutrons reach the depth of the maze when the copper target is inserted. This result reflects the higher proportion of high-energy particles produced at the copper target compared with other beam-loss points; these high-energy particles become the subsequent source of low-energy neutrons. Furthermore, it was found that several features of the environment, such as door walls and electric wire cables, clearly affect the absorption of thermal neutrons. The reaction rates obtained in this study were also used as a benchmark for the Monte Carlo simulation code MARS15 (version of February 2008). The results of the MARS15 calculations precisely reproduced the experimental results, including the significant effects of the electric wire cables and door walls.

  3. Proton radiography in plasma

    Energy Technology Data Exchange (ETDEWEB)

    Volpe, L., E-mail: luca.volpe@mib.infn.it [Universita degli Studi di Milano-Bicocca, Piazza della scienza 3, Milano 20126 (Italy); Batani, D.; Morace, A. [Universita degli Studi di Milano-Bicocca, Piazza della scienza 3, Milano 20126 (Italy); Nicolai, Ph.; Regan, C. [CELIA, Universite de Bordeaux, CNRS, CEA, F33405 (France); Ravasio, A. [LULI, UMR 7605, CNRS, CEA, Universite Paris VI, Ecole Polytechnique, 91128 Palaiseau Cedex (France)

    2011-10-11

    The generation of high-intensity, well-collimated, multi-energetic proton beams from laser-matter interaction extends the possibility of using protons as a diagnostic tool to image imploding targets in Inertial Confinement Fusion (ICF) experiments. Due to the very large mass densities reached during implosion, protons traveling through the target undergo a very large number of collisions. Therefore the analysis of experimentally obtained proton images requires care and accurate numerical simulations using both hydrodynamic and Monte Carlo codes. The impact of multiple scattering needs to be carefully considered by taking into account the exact stopping power for dense matter and for the underdense plasma corona. In our paper, density, temperature and ionization degree profiles of the imploding target are obtained by 2D hydrodynamic simulations performed using the CHIC code. Proton radiography images are simulated using the Monte Carlo code MCNPX (adapted to correctly describe multiple scattering and plasma stopping power) in order to reconstruct the complete hydrodynamic history of the imploding target. Finally we develop a simple analytical model to study the performance of proton radiography as a function of the initial experimental parameters, and identify two different regimes for proton radiography in ICF.

  4. Design and evaluation of a Monte Carlo based model of an orthovoltage treatment system

    International Nuclear Information System (INIS)

    The aim of this study was to develop a flexible framework of an orthovoltage treatment system capable of calculating and visualizing dose distributions in different phantoms and CT datasets. The framework provides a complete set of various filters, applicators and X-ray energies and can therefore be adapted to varying studies or be used for educational purposes. A dedicated, user-friendly graphical interface was developed, allowing easy setup of the simulation parameters and visualization of the results. The EGSnrc Monte Carlo code package was used for the Monte Carlo simulations. Building the geometry was accomplished with the help of the EGSnrc C++ class library. The deposited dose was calculated according to the KERMA approximation using the track-length estimator. The validation against measurements showed good agreement, within 4-5% deviation, down to depths of 20% of the depth-dose maximum. Furthermore, to show its capabilities, the validated model was used to calculate the dose distribution on two CT datasets. The typical Monte Carlo calculation time for these simulations was about 10 minutes, achieving an average statistical uncertainty of 2% on a standard PC. However, this calculation time depends strongly on the CT dataset used, the tube potential, the filter material/thickness and the applicator size.

  5. Design and evaluation of a Monte Carlo based model of an orthovoltage treatment system

    Energy Technology Data Exchange (ETDEWEB)

    Penchev, Petar; Maeder, Ulf; Fiebich, Martin [IMPS University of Applied Sciences, Giessen (Germany). Inst. of Medical Physics and Radiation Protection; Zink, Klemens [IMPS University of Applied Sciences, Giessen (Germany). Inst. of Medical Physics and Radiation Protection; University Hospital Marburg (Germany). Dept. of Radiotherapy and Oncology

    2015-07-01

    The aim of this study was to develop a flexible framework of an orthovoltage treatment system capable of calculating and visualizing dose distributions in different phantoms and CT datasets. The framework provides a complete set of various filters, applicators and X-ray energies and can therefore be adapted to varying studies or be used for educational purposes. A dedicated, user-friendly graphical interface was developed, allowing easy setup of the simulation parameters and visualization of the results. The EGSnrc Monte Carlo code package was used for the Monte Carlo simulations. Building the geometry was accomplished with the help of the EGSnrc C++ class library. The deposited dose was calculated according to the KERMA approximation using the track-length estimator. The validation against measurements showed good agreement, within 4-5% deviation, down to depths of 20% of the depth-dose maximum. Furthermore, to show its capabilities, the validated model was used to calculate the dose distribution on two CT datasets. The typical Monte Carlo calculation time for these simulations was about 10 minutes, achieving an average statistical uncertainty of 2% on a standard PC. However, this calculation time depends strongly on the CT dataset used, the tube potential, the filter material/thickness and the applicator size.
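
    The KERMA approximation with a track-length estimator, as used in both records above, scores for every photon step crossing a voxel the step length times the photon energy times the mass energy-absorption coefficient, divided by the voxel volume. A minimal single-voxel sketch with made-up attenuation data (the coefficient, voxel size and photon sample are illustrative, not EGSnrc values):

```python
import numpy as np

# Placeholder data for one medium at one photon energy (illustrative numbers only)
MU_EN_OVER_RHO = 0.03      # mass energy-absorption coefficient, cm^2/g
VOXEL_VOLUME_CM3 = 1.0     # scoring voxel volume, cm^3

def track_length_kerma(track_lengths_cm, energies_mev):
    """Track-length KERMA estimate (MeV/g) from photon steps crossing one voxel."""
    # Each step contributes E * (mu_en/rho) * L / V, whether or not it collides in the voxel.
    return np.sum(energies_mev * MU_EN_OVER_RHO * track_lengths_cm) / VOXEL_VOLUME_CM3

rng = np.random.default_rng(3)
lengths = rng.exponential(scale=0.5, size=1000)   # cm, fake photon path lengths in the voxel
energies = np.full(1000, 0.1)                     # 100 keV photons
kerma_gy = track_length_kerma(lengths, energies) * 1.602e-10   # 1 MeV/g = 1.602e-10 Gy
print(f"KERMA estimate: {kerma_gy:.3e} Gy per 1000 scored steps")
```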

  6. Development and validation of MCNPX-based Monte Carlo treatment plan verification system

    Directory of Open Access Journals (Sweden)

    Iraj Jabbari

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in digital imaging and communications in medicine-radiation therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by MCTPV was compared with that of the TiGRT planning system. The results showed that the beam configurations and patient information were implemented correctly in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for all beams combined. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans.
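
    The gamma passing rate quoted above combines a dose-difference and a distance-to-agreement criterion. A brute-force sketch of a global 2D gamma analysis (3%/3 mm, both dose maps on the same grid, no sub-pixel interpolation; the grid spacing, low-dose threshold and test maps are illustrative):

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm, dd=0.03, dta_mm=3.0, threshold=0.10):
    """Global gamma passing rate for two 2D dose maps defined on the same grid."""
    ny, nx = ref.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    dmax = ref.max()
    search = int(np.ceil(dta_mm / spacing_mm)) + 1        # spatial search window in pixels
    passed = total = 0
    for i in range(ny):
        for j in range(nx):
            if ref[i, j] < threshold * dmax:              # skip the low-dose region
                continue
            ys, ye = max(0, i - search), min(ny, i + search + 1)
            xs, xe = max(0, j - search), min(nx, j + search + 1)
            r2 = ((yy[ys:ye, xs:xe] - i) ** 2 + (xx[ys:ye, xs:xe] - j) ** 2) * spacing_mm ** 2
            d2 = (evl[ys:ye, xs:xe] - ref[i, j]) ** 2
            gamma2 = r2 / dta_mm ** 2 + d2 / (dd * dmax) ** 2
            passed += gamma2.min() <= 1.0
            total += 1
    return passed / total

# Illustrative check: a Gaussian "dose" map against a copy with a 1% uniform difference
gy, gx = np.mgrid[0:60, 0:60]
reference = np.exp(-((gx - 30) ** 2 + (gy - 30) ** 2) / 200.0)
print("gamma pass rate:", gamma_pass_rate(reference, 1.01 * reference, spacing_mm=1.0))
```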

  7. MONTE-4 for Monte Carlo simulations with high performance

    International Nuclear Information System (INIS)

    The Monte Carlo machine MONTE-4 has been developed based on the architecture of an existing supercomputer, with the design philosophy of realizing high performance in vector-parallel processing of Monte Carlo codes for particle transport problems. The effective performance of this Monte Carlo machine is presented through practical applications of the multi-group criticality safety code KENO-IV and the continuous-energy neutron/photon transport code MCNP. A tenfold speedup has been obtained on MONTE-4 compared with the execution time in scalar processing. (K.A.)

  8. Development of a practical fuel management system for PSBR based on advanced three-dimensional Monte Carlo coupled depletion methodology

    Science.gov (United States)

    Tippayakul, Chanatip

    The main objective of this research is to develop a practical fuel management system for the Pennsylvania State University Breazeale research reactor (PSBR) based on several advanced Monte Carlo coupled depletion methodologies. Primarily, this research involved two major activities: model and method developments, and analyses and validations of the developed models and methods. The starting point of this research was the utilization of the earlier developed fuel management tool, TRIGSIM, to create the Monte Carlo model of core loading 51 (the end of the core loading). When the normalized power results of the Monte Carlo model were compared to those of the current fuel management system (using HELIOS/ADMARC-H), it was found that they agreed reasonably well (within 2-3% differences on average). Moreover, the reactivity of some fuel elements was calculated by the Monte Carlo model and compared with measured data. It was also found that the fuel element reactivity results of the Monte Carlo model were in good agreement with the measured data. However, the subsequent task of analyzing the conversion from core loading 51 to core loading 52 using TRIGSIM showed quite significant differences in each control rod worth between the Monte Carlo model and the current methodology model. The differences were mainly caused by inconsistent absorber atomic number densities between the two models. Hence, the model of the first operating core (core loading 2) was revised in light of new information about the absorber atomic densities to validate the Monte Carlo model against the measured data. With the revised Monte Carlo model, the results agreed better with the measured data. Although TRIGSIM showed good modeling capabilities, the accuracy of TRIGSIM could be further improved by adopting more advanced algorithms. Therefore, TRIGSIM was planned to be upgraded. The first task of upgrading TRIGSIM involved the improvement of the temperature modeling capability. The new TRIGSIM was

  9. Proton Conduction in a Phosphonate-Based Metal-Organic Framework Mediated by Intrinsic "Free Diffusion inside a Sphere".

    Science.gov (United States)

    Pili, Simona; Argent, Stephen P; Morris, Christopher G; Rought, Peter; García-Sakai, Victoria; Silverwood, Ian P; Easun, Timothy L; Li, Ming; Warren, Mark R; Murray, Claire A; Tang, Chiu C; Yang, Sihai; Schröder, Martin

    2016-05-25

    Understanding the molecular mechanism of proton conduction is crucial for the design of new materials with improved conductivity. Quasi-elastic neutron scattering (QENS) has been used to probe the mechanism of proton diffusion within a new phosphonate-based metal-organic framework (MOF) material, MFM-500(Ni). QENS suggests that the proton conductivity (4.5 × 10⁻⁴ S/cm at 98% relative humidity and 25 °C) of MFM-500(Ni) is mediated by intrinsic "free diffusion inside a sphere", representing the first example of such a mechanism observed in MOFs. PMID:27182787
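
    For context, "free diffusion inside a sphere" is usually analysed with the Volino-Dianoux model, in which the elastic incoherent structure factor extracted from QENS falls off with momentum transfer Q as below (R is the confining radius and j1 the first-order spherical Bessel function; quoted here as the standard form of that model, not taken from the paper):

```latex
\mathrm{EISF}(Q) \;=\; \left[\frac{3\,j_{1}(QR)}{QR}\right]^{2},
\qquad
j_{1}(x) \;=\; \frac{\sin x}{x^{2}} \;-\; \frac{\cos x}{x}
```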

  10. An analytic linear accelerator source model for GPU-based Monte Carlo dose calculations

    Science.gov (United States)

    Tian, Zhen; Li, Yongbao; Folkerts, Michael; Shi, Feng; Jiang, Steve B.; Jia, Xun

    2015-10-01

    Recently, there has been a lot of research interest in developing fast Monte Carlo (MC) dose calculation methods on graphics processing unit (GPU) platforms. A good linear accelerator (linac) source model is critical for both accuracy and efficiency considerations. In principle, an analytical source model is preferable to a phase-space file-based model for GPU-based MC dose engines, in that data loading and CPU-GPU data transfer can be avoided. In this paper, we present an analytical field-independent source model specifically developed for GPU-based MC dose calculations, associated with a GPU-friendly sampling scheme. A key concept called the phase-space ring (PSR) was proposed. Each PSR contains a group of particles that are of the same type, close in energy and reside in a narrow ring on the phase-space plane located just above the upper jaws. The model parameterizes the probability densities of particle location, direction and energy for each primary photon PSR, scattered photon PSR and electron PSR. Models of one 2D Gaussian distribution or multiple Gaussian components were employed to represent the particle direction distributions of these PSRs. A method was developed to analyze a reference phase-space file and derive the corresponding model parameters. To efficiently use our model in MC dose calculations on the GPU, we proposed a GPU-friendly sampling strategy, which ensures that the particles sampled and transported simultaneously are of the same type and close in energy, to alleviate GPU thread divergence. To test the accuracy of our model, dose distributions of a set of open fields in a water phantom were calculated using our source model and compared to those calculated using the reference phase-space files. For the high dose gradient regions, the average distance-to-agreement (DTA) was within 1 mm and the maximum DTA within 2 mm. For relatively low dose gradient regions, the root-mean-square (RMS) dose difference was within 1.1% and the maximum

  11. Monte Carlo Based Calibration and Uncertainty Analysis of a Coupled Plant Growth and Hydrological Model

    Science.gov (United States)

    Houska, Tobias; Multsch, Sebastian; Kraft, Philipp; Frede, Hans-Georg; Breuer, Lutz

    2014-05-01

    Computer simulations are widely used to support decision making and planning in the agriculture sector. On the one hand, many plant growth models use simplified hydrological processes and structures, e.g. by the use of a small number of soil layers or by the application of simple water flow approaches. On the other hand, in many hydrological models plant growth processes are poorly represented. Hence, fully coupled models with a high degree of process representation would allow a more detailed analysis of the dynamic behaviour of the soil-plant interface. We used the Python programming language to couple two such highly process-oriented independent models and to calibrate both models simultaneously. The Catchment Modelling Framework (CMF) simulated soil hydrology based on the Richards equation and the Van Genuchten-Mualem retention curve. CMF was coupled with the Plant growth Modelling Framework (PMF), which predicts plant growth on the basis of radiation use efficiency, degree days, water shortage and dynamic root biomass allocation. The Monte Carlo based Generalised Likelihood Uncertainty Estimation (GLUE) method was applied to parameterize the coupled model and to investigate the related uncertainty of the model predictions. Overall, 19 model parameters (4 for CMF and 15 for PMF) were analysed through 2 × 10⁶ model runs randomly drawn from a uniformly distributed parameter space. Three objective functions were used to evaluate the model performance, i.e. coefficient of determination (R²), bias and model efficiency according to Nash-Sutcliffe (NSE). The model was applied to three sites with different management in Muencheberg (Germany) for the simulation of winter wheat (Triticum aestivum L.) in a cross-validation experiment. Field observations for model evaluation included soil water content and the dry matter of roots, storage organs, stems and leaves. The best parameter sets resulted in an NSE of 0.57 for the simulation of soil moisture across all three sites. The shape
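
    The GLUE procedure used here (draw parameter sets from uniform prior ranges, run the coupled model, score each run with a likelihood measure such as NSE, and keep the "behavioural" sets for the predictive uncertainty band) can be illustrated with a toy simulator; the two-parameter model, prior ranges and acceptance threshold below are placeholders, not the CMF-PMF setup:

```python
import numpy as np

rng = np.random.default_rng(4)

def toy_model(theta, t):
    """Placeholder simulator standing in for the coupled CMF-PMF model."""
    a, b = theta
    return a * np.exp(-b * t)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

t = np.linspace(0.0, 10.0, 50)
observed = toy_model((2.0, 0.3), t) + rng.normal(0.0, 0.05, t.size)   # synthetic "field data"

n_runs = 20_000
thetas = np.column_stack([rng.uniform(0.5, 4.0, n_runs),    # prior range for parameter a
                          rng.uniform(0.05, 1.0, n_runs)])  # prior range for parameter b
scores = np.array([nse(observed, toy_model(th, t)) for th in thetas])

behavioural = scores > 0.0                                   # illustrative acceptance threshold
sims = np.array([toy_model(th, t) for th in thetas[behavioural]])
band_lo, band_hi = np.percentile(sims, [5, 95], axis=0)      # predictive uncertainty band
print("behavioural runs:", int(behavioural.sum()), " best NSE: %.2f" % scores.max())
print("mean 5-95%% band width: %.3f" % (band_hi - band_lo).mean())
```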

  12. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    International Nuclear Information System (INIS)

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at the

  13. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)

    2012-05-15

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at

  14. Generation of scintigraphic images in a virtual dosimetry trial based on Monte Carlo modelling

    International Nuclear Information System (INIS)

    Aim: The purpose of dosimetry calculations in therapeutic nuclear medicine is to maximize the tumour absorbed dose while minimizing normal tissue toxicities. However, a wide heterogeneity of dosimetric approaches is observed: there is no standardized dosimetric protocol to date. The DosiTest project (www.dositest.com) intends to identify critical steps in the dosimetry chain by implementing clinical dosimetry in different Nuclear Medicine departments, on scintigraphic images generated by Monte Carlo simulation from the same virtual patient. This study aims at presenting the different steps contributing to image generation, following the imaging protocol of a given participating centre, Milan's European Institute of Oncology (IEO). Material and methods: The chosen clinical application is that of 111In-pentetreotide (OctreoscanTM). Pharmacokinetic data from the literature are used to derive a compartmental model. The kinetic rates between 6 compartments (liver, spleen, kidneys, blood, urine, remainder of body) were obtained from WinSaam [3]: the activity in each compartment is known at any time point. The TestDose [1] software (the computing architecture of DosiTest) implements the NURBS-based phantom NCAT-WB [2] to generate anatomical data for the virtual patient. The IEO gamma-camera was modelled with GATE [4] v6.2. Scintigraphic images were simulated for each compartment and the resulting projections were weighted by the respective pharmacokinetics of each compartment. The final step consisted in aggregating each compartment to generate the resulting image. Results: Following IEO's imaging protocol, planar and tomographic image simulations were generated at various time points. Computation times (on a 480-virtual-core computing cluster) for 'step and shoot' whole body simulations (5 steps/time point) and acceptable statistics were: 10 days for extra-vascular fluid, 28 h for blood, 12 h for liver, 7 h for kidneys, and 1-2 h for

  15. Particle Swarm Optimization based predictive control of Proton Exchange Membrane Fuel Cell (PEMFC)

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Proton Exchange Membrane Fuel Cells (PEMFCs) are the main focus of current fuel cell development as power sources because they are capable of higher power density and faster start-up than other fuel cells. The humidification system and output performance of a PEMFC stack are briefly analyzed. Predictive control of the PEMFC based on a Support Vector Regression Machine (SVRM) is presented and the SVRM is constructed. The plant is modelled with the SVRM and the predictive control law is obtained by using Particle Swarm Optimization (PSO). The simulation results show that the SVRM model and the PSO receding-horizon optimization applied to PEMFC predictive control yield good performance.
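
    Particle Swarm Optimization, used above to solve the receding-horizon control problem, can be sketched generically as follows; the quadratic cost below is a placeholder for the SVRM-based predictive-control objective:

```python
import numpy as np

def pso_minimize(cost, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=5):
    """Minimal global-best PSO; returns the best position and cost found."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = lo.size
    x = rng.uniform(lo, hi, (n_particles, dim))                 # particle positions
    v = np.zeros_like(x)                                        # particle velocities
    pbest = x.copy()
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()                   # global best position
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

# Illustrative cost: a quadratic control-error surrogate, not the PEMFC objective
best_u, best_cost = pso_minimize(lambda u: float(np.sum((u - 0.3) ** 2)),
                                 bounds=[(-1.0, 1.0), (-1.0, 1.0)])
print("optimal control move:", best_u, " cost:", best_cost)
```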

  16. Time-resolved imaging of prompt-gamma rays for proton range verification using a knife-edge slit camera based on digital photon counters

    OpenAIRE

    Cambraia Lopes, P; Clementel, E; Crespo, P; Henrotin, S; Huizenga, J.; G. Janssens; Parodi, K.; Prieels, D.; Roellinghoff, F; Smeets, J.; Stichelbaut, F.; Schaart, D. R.

    2015-01-01

    Proton range monitoring may facilitate online adaptive proton therapy and improve treatment outcomes. Imaging of proton-induced prompt gamma (PG) rays using a knife-edge slit collimator is currently under investigation as a potential tool for real-time proton range monitoring. A major challenge in collimated PG imaging is the suppression of neutron-induced background counts. In this work, we present an initial performance test of two knife-edge slit camera prototypes based on arrays of digita...

  17. Fluence-based dosimetry of proton and heavier ion beams using single track detectors.

    Science.gov (United States)

    Klimpki, G; Mescher, H; Akselrod, M S; Jäkel, O; Greilich, S

    2016-02-01

    Due to their superior spatial resolution, small and biocompatible fluorescent nuclear track detectors (FNTDs) open up the possibility of characterizing swift heavy charged particle fields on a single-track level. Permanently stored spectroscopic information such as energy deposition and particle field composition is of particular importance in heavy ion radiotherapy, since radiation quality is one of the decisive predictors of clinical outcome. The findings presented within this paper aim towards single-track reconstruction and fluence-based dosimetry of proton and heavier ion fields. Three-dimensional information on individual ion trajectories through the detector volume is obtained using fully automated image processing software. Angular distributions of multidirectional fields can be measured accurately within ±2° uncertainty. This translates into less than 5% overall fluence deviation from the chosen irradiation reference. The combination of single ion tracking with an improved energy loss calibration curve, based on 90 FNTD irradiations with protons as well as helium, carbon and oxygen ions, enables spectroscopic analysis of a detector irradiated in the Bragg peak proximity of a 270 MeV u⁻¹ carbon ion field. Fluence-based dosimetry results agree with the treatment planning software reference. PMID:26757791

  18. Fluence-based dosimetry of proton and heavier ion beams using single track detectors

    Science.gov (United States)

    Klimpki, G.; Mescher, H.; Akselrod, M. S.; Jäkel, O.; Greilich, S.

    2016-02-01

    Due to their superior spatial resolution, small and biocompatible fluorescent nuclear track detectors (FNTDs) open up the possibility of characterizing swift heavy charged particle fields on a single track level. Permanently stored spectroscopic information such as energy deposition and particle field composition is of particular importance in heavy ion radiotherapy, since radiation quality is one of the decisive predictors for clinical outcome. Findings presented within this paper aim towards single track reconstruction and fluence-based dosimetry of proton and heavier ion fields. Three-dimensional information on individual ion trajectories through the detector volume is obtained using fully automated image processing software. Angular distributions of multidirectional fields can be measured accurately within  ±2° uncertainty. This translates into less than 5% overall fluence deviation from the chosen irradiation reference. The combination of single ion tracking with an improved energy loss calibration curve based on 90 FNTD irradiations with protons as well as helium, carbon and oxygen ions enables spectroscopic analysis of a detector irradiated in Bragg peak proximity of a 270 MeV u-1 carbon ion field. Fluence-based dosimetry results agree with treatment planning software reference.
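
    The fluence-based dose referred to in these two records follows, for each particle species and energy bin, the standard conversion from particle fluence and mass stopping power (quoted as the textbook relation, not as the authors' exact implementation):

```latex
D\,[\mathrm{Gy}] \;=\; 1.602\times10^{-10}\;\cdot\;
\Phi\,[\mathrm{cm^{-2}}]\;\cdot\;
\left(\frac{S}{\rho}\right)[\mathrm{MeV\,cm^{2}\,g^{-1}}]
```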

  19. Pattern recognition and data mining software based on artificial neural networks applied to proton transfer in aqueous environments

    International Nuclear Information System (INIS)

    In computational physics, proton transfer phenomena can be viewed as pattern classification problems based on a set of input features that allow classification of the proton motion into two categories: transfer 'occurred' and transfer 'not occurred'. The goal of this paper is to evaluate the use of artificial neural networks in the classification of proton transfer events, based on a feed-forward back-propagation neural network used as a classifier to distinguish between the two transfer cases. In this paper, we use a newly developed data mining and pattern recognition tool for automating, controlling, and drawing charts of the output data of an existing Empirical Valence Bond code. The study analyzes the need for pattern recognition in aqueous proton transfer processes and how the learning approach of error back-propagation (multilayer perceptron algorithms) can be satisfactorily employed in the present case. We present a tool for pattern recognition and validate the code, including a real physical case study. The results of applying the artificial neural network methodology to patterns based upon selected physical properties (e.g., temperature, density) show the ability of the network to learn proton transfer patterns corresponding to properties of the aqueous environments, which is in turn proved to be fully compatible with previous proton transfer studies. (condensed matter: structural, mechanical, and thermal properties)
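
    A feed-forward back-propagation (multilayer perceptron) classifier of the kind described, separating "transfer occurred" from "transfer not occurred" events from a handful of input features, can be sketched with scikit-learn; the three synthetic features and the labelling rule below are placeholders for the descriptors extracted from the Empirical Valence Bond trajectories:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(6)

# Synthetic stand-ins for per-frame descriptors (e.g. a distance, temperature, density)
n = 5000
X = rng.normal(size=(n, 3))
# Placeholder rule generating the "transfer occurred" label from the features
y = (X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(0.0, 0.5, n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), activation="logistic",
                    solver="adam", max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("accuracy on held-out events: %.2f" % clf.score(X_test, y_test))
```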

  20. Forward hadron production in ultraperipheral proton-heavy-ion collisions at the LHC and RHIC

    CERN Document Server

    Mitsuka, Gaku

    2015-01-01

    We discuss hadron production in the forward rapidity region in ultraperipheral proton-lead collisions at the LHC and proton-gold collisions at RHIC. Our discussion is based on the Monte Carlo simulations of the interactions of virtual photons emitted by a fast moving nucleus with a proton beam. We simulate the virtual photon flux with the STARLIGHT event generator and then particle production with the SOPHIA, DPMJET, and PYTHIA event generators. We show the rapidity distributions of charged and neutral particles, and the momentum distributions of neutral pions and neutrons at forward rapidities. According to the Monte Carlo simulations, we find large cross sections of ultraperipheral collisions for particle production especially in the very forward region, leading to substantial background contributions to investigations of collective nuclear effects and spin physics. Finally we can distinguish between proton-nucleus inelastic interactions and ultraperipheral collisions with additional requirements of either ...

  1. Unstructured mesh based multi-physics interface for CFD code coupling in the Serpent 2 Monte Carlo code

    International Nuclear Information System (INIS)

    This paper presents an unstructured mesh based multi-physics interface implemented in the Serpent 2 Monte Carlo code, for the purpose of coupling the neutronics solution to component-scale thermal hydraulics calculations, such as computational fluid dynamics (CFD). The work continues the development of a multi-physics coupling scheme, which relies on the separation of state-point information from the geometry input, and the capability to handle temperature and density distributions by a rejection sampling algorithm. The new interface type is demonstrated by a simplified molten-salt reactor test case, using a thermal hydraulics solution provided by the CFD solver in OpenFOAM. (author)
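
    The rejection-sampling treatment of continuous temperature and density distributions mentioned above can be illustrated with Woodcock (delta-) tracking: flight distances are sampled against a constant majorant cross section, and each tentative collision is accepted with probability equal to the local-to-majorant cross-section ratio. This is a generic sketch of the idea, not Serpent's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(7)

def sigma_t(x):
    """Total macroscopic cross section (1/cm), varying continuously with position."""
    return 0.2 + 0.15 * np.sin(x)            # placeholder spatial dependence

SIGMA_MAJ = 0.35                             # majorant: >= sigma_t(x) everywhere

def next_real_collision(x0, direction=1.0):
    """Sample the next accepted (non-virtual) collision site by delta tracking."""
    x = x0
    while True:
        x += direction * rng.exponential(1.0 / SIGMA_MAJ)   # flight against the majorant
        if rng.random() < sigma_t(x) / SIGMA_MAJ:            # accept as a real collision
            return x
        # otherwise it is a virtual collision: keep flying without changing direction

sites = np.array([next_real_collision(0.0) for _ in range(100_000)])
print("mean distance to the first real collision: %.2f cm" % sites.mean())
```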

  2. MONTE CARLO CALCULATION OF ENERGY DEPOSITION BY DELTA RAYS AROUND ION TRACKS

    Institute of Scientific and Technical Information of China (English)

    张纯祥; 刘小伟; et al.

    1994-01-01

    The radial distribution of dose around the path of a heavy ion has been studied by a Monte Carlo transport analysis of the delta rays produced along the track of the ion, based on classical binary collision dynamics and a single-scattering model for the electron transport process. Results from this work are compared with the semi-empirical-expression-based delta-ray theory of track structure, as well as with other Monte Carlo calculations, for 1,3 MeV protons and several heavy ions. The results of the Monte Carlo simulations for energetic heavy ions are in agreement with experimental data and with the results of different methods. The characteristic feature of this Monte Carlo calculation is that it provides a simulation counterpart of the delta-ray theory of track structure.

  3. Sampling-Based Nuclear Data Uncertainty Quantification for Continuous Energy Monte Carlo Codes

    OpenAIRE

    Zhu, Ting

    2015-01-01

    The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte Carlo (MC) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. The methodology developed during this PhD research is fundamentally ...

  4. Random vibration analysis of switching apparatus based on Monte Carlo method

    Institute of Scientific and Technical Information of China (English)

    ZHAI Guo-fu; CHEN Ying-hua; REN Wan-bin

    2007-01-01

    The performance of switching apparatus containing mechanical contacts in a vibration environment is an important element in judging the apparatus's reliability. A piecewise-linear two-degrees-of-freedom mathematical model considering contact loss was built in this work, and the vibration performance of the model under random external Gaussian white noise excitation was investigated using Monte Carlo simulation in Matlab/Simulink. The simulation showed that the spectral content and statistical characteristics of the contact force agree closely with reality. The random vibration behaviour of the contact system was solved using time-domain (numerical) simulation in this paper. The conclusions reached here are of great importance for the reliability design of switching apparatus.

  5. Microlens assembly error analysis for light field camera based on Monte Carlo method

    Science.gov (United States)

    Li, Sai; Yuan, Yuan; Zhang, Hao-Wei; Liu, Bin; Tan, He-Ping

    2016-08-01

    This paper describes a numerical analysis of microlens assembly errors in light field cameras using the Monte Carlo method. Assuming that there were no manufacturing errors, a home-built program was used to simulate images with the coupling distance errors, movement errors and rotation errors that could appear during microlens installation. By examining these images, the sub-aperture images and the refocused images, we found that the images present different degrees of fuzziness and deformation for different microlens assembly errors, while the sub-aperture images present aliasing, obscured images and other distortions that result in unclear refocused images.

  6. Monte Carlo calculations for design of An accelerator based PGNAA facility

    International Nuclear Information System (INIS)

    Monte Carlo calculations were carried out for the design of a setup for Prompt Gamma Ray Neutron Activation Analysis (PGNAA) with 14 MeV neutrons to analyze cement raw material samples. The calculations were carried out using the MCNP4B2 code. Various geometry parameters of the PGNAA experimental setup, such as sample thickness, moderator geometry and detector shielding, were optimized by maximizing the prompt gamma ray yield of the different elements of the sample material. Finally, calibration curves of the PGNAA setup were generated for various concentrations of calcium in the material sample. Results of this simulation are presented. (author)

  7. Monte Carlo calculations for design of An accelerator based PGNAA facility

    Energy Technology Data Exchange (ETDEWEB)

    Nagadi, M.M.; Naqvi, A.A. [King Fahd University of Petroleum and Minerals, Center for Applied Physical Sciences, Dhahran (Saudi Arabia); Rehman, Khateeb-ur; Kidwai, S. [King Fahd University of Petroleum and Minerals, Department of Physics, Dhahran (Saudi Arabia)

    2002-08-01

    Monte Carlo calculations were carried out for the design of a setup for Prompt Gamma Ray Neutron Activation Analysis (PGNAA) with 14 MeV neutrons to analyze cement raw material samples. The calculations were carried out using the MCNP4B2 code. Various geometry parameters of the PGNAA experimental setup, such as sample thickness, moderator geometry and detector shielding, were optimized by maximizing the prompt gamma ray yield of the different elements of the sample material. Finally, calibration curves of the PGNAA setup were generated for various concentrations of calcium in the material sample. Results of this simulation are presented. (author)

  8. Broad energy spectrum of laser-accelerated protons for spallation-related physics

    International Nuclear Information System (INIS)

    A beam of MeV protons, accelerated by ultraintense laser-pulse interactions with a thin target foil, is used to investigate nuclear reactions of interest for spallation physics. The measured laser-generated proton beam is shown to have a broad energy distribution, which closely resembles the expected energy spectrum of evaporative protons (below 50 MeV) produced in GeV-proton-induced spallation reactions. The protons are used to quantify the distribution of residual radioisotopes produced in a representative spallation target (Pb), and the results are compared with calculated predictions based on spectra modeled with nuclear Monte Carlo codes. Laser-plasma particle accelerators are thus shown to provide data relevant to the design and development of accelerator-driven systems.

  9. First tests for an online treatment monitoring system with in-beam PET for proton therapy

    CERN Document Server

    Kraan, Aafke C; Belcari, N; Camarlinghi, N; Cappucci, F; Ciocca, M; Ferrari, A; Ferretti, S; Mairani, A; Molinelli, S; Pullia, M; Retico, A; Sala, P; Sportelli, G; Del Guerra, A; Rosso, V

    2014-01-01

    PET imaging is a non-invasive technique for particle range verification in proton therapy. It is based on measuring the beta+ annihilations caused by nuclear interactions of the protons in the patient. In this work we present measurements for proton range verification in phantoms, performed at the CNAO particle therapy treatment center in Pavia, Italy, with our 10 x 10 cm^2 planar PET prototype DoPET. PMMA phantoms were irradiated with mono-energetic proton beams and clinical treatment plans, and PET data were acquired during and shortly after proton irradiation. We created 1-D profiles of the beta+ activity along the proton beam axis, and evaluated the difference between the proximal rise and the distal fall-off positions of the activity distribution. Good agreement with FLUKA Monte Carlo predictions was obtained. We also assessed the system response when the PMMA phantom contained an air cavity. The system was able to detect such cavities quickly after irradiation.
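
    The proximal-rise and distal fall-off positions used above can be extracted from a 1-D activity profile by locating, for instance, the 50%-of-maximum crossings on either side of the plateau; a minimal sketch on a synthetic profile (the 50% level and the error-function shape are illustrative choices, not the DoPET analysis):

```python
import numpy as np
from scipy.special import erf

def crossing_position(z, profile, level, rising=True):
    """Linearly interpolated position where the profile crosses `level`."""
    above = profile >= level
    if rising:
        i1 = int(np.argmax(above))                           # first point above the level
        i0 = i1 - 1
    else:
        i0 = len(profile) - int(np.argmax(above[::-1])) - 1  # last point above the level
        i1 = i0 + 1
    frac = (level - profile[i0]) / (profile[i1] - profile[i0])
    return z[i0] + frac * (z[i1] - z[i0])

# Synthetic beta+ activity profile along the beam axis (mm): rise near 10 mm, fall-off near 90 mm
z = np.linspace(0.0, 120.0, 600)
profile = 0.5 * (erf((z - 10.0) / 3.0) - erf((z - 90.0) / 3.0))

level = 0.5 * profile.max()
rise = crossing_position(z, profile, level, rising=True)
fall = crossing_position(z, profile, level, rising=False)
print("proximal rise: %.1f mm, distal fall-off: %.1f mm, difference: %.1f mm"
      % (rise, fall, fall - rise))
```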

  10. Evaluation of the interindividual human variation in bioactivation of methyleugenol using physiologically based kinetic modeling and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Al-Subeihi, Ala' A.A., E-mail: subeihi@yahoo.com [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); BEN-HAYYAN-Aqaba International Laboratories, Aqaba Special Economic Zone Authority (ASEZA), P. O. Box 2565, Aqaba 77110 (Jordan); Alhusainy, Wasma; Kiwamoto, Reiko; Spenkelink, Bert [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Bladeren, Peter J. van [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Nestec S.A., Avenue Nestlé 55, 1800 Vevey (Switzerland); Rietjens, Ivonne M.C.M.; Punt, Ans [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands)

    2015-03-01

    The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions or by combining kinetic constants obtained for specific isoenzymes with literature reported human variation in the activity of these enzymes. The results of the study indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions on interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment. - Highlights: • Interindividual human differences in methyleugenol bioactivation were simulated. • This was done using in vitro incubations, PBK modeling

  11. Evaluation of the interindividual human variation in bioactivation of methyleugenol using physiologically based kinetic modeling and Monte Carlo simulations

    International Nuclear Information System (INIS)

    The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions or by combining kinetic constants obtained for specific isoenzymes with literature reported human variation in the activity of these enzymes. The results of the study indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions on interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment. - Highlights: • Interindividual human differences in methyleugenol bioactivation were simulated. • This was done using in vitro incubations, PBK modeling
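
    The chemical-specific adjustment factor derivation described in the two records above reduces, once the Monte Carlo population distribution of the bioactivation endpoint is available, to a ratio of percentiles; a minimal sketch in which a lognormal sample stands in for the PBK-predicted 1'-sulfooxymethyleugenol formation:

```python
import numpy as np

rng = np.random.default_rng(8)

# Placeholder Monte Carlo population distribution of the bioactivation endpoint
formation = rng.lognormal(mean=0.0, sigma=0.6, size=100_000)

p50, p90, p99 = np.percentile(formation, [50, 90, 99])
print("CSAF covering 90%% of the population (P90/P50): %.1f" % (p90 / p50))
print("CSAF covering 99%% of the population (P99/P50): %.1f" % (p99 / p50))
```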

  12. Optimization of mass of plastic scintillator film for flow-cell based tritium monitoring: a Monte Carlo study

    International Nuclear Information System (INIS)

    Over the years, various types of tritium-in-air monitors have been designed and developed based on different principles. Ionization chambers, proportional counters and scintillation detector systems are a few among them. A plastic scintillator based, flow-cell type system was developed for online monitoring of tritium in air. The scintillator mass inside the cell volume that maximizes the response of the detector system should be determined in order to obtain maximum efficiency. The present study aims to optimize the mass of the plastic scintillator film for the flow-cell based tritium monitoring instrument so that maximum efficiency is achieved. The Monte Carlo based EGSnrc code system has been used for this purpose.

  13. Proton radiography to improve proton therapy treatment

    International Nuclear Information System (INIS)

    The quality of cancer treatment with protons critically depends on an accurate prediction of the proton stopping powers for the tissues traversed by the protons. Today, treatment planning in proton radiotherapy is based on stopping power calculations from densities of X-ray Computed Tomography (CT) images. This causes systematic uncertainties in the calculated proton range in a patient of typically 3-4%, which can reach 10% in bone regions [1,2,3,4,5,6,7,8]. This may lead to parts of the tumor receiving no dose and healthy tissues receiving too high a dose [1]. A direct measurement of proton stopping powers with high-energy protons will allow these uncertainties to be reduced and will improve the quality of the treatment. Several studies have shown that a sufficiently accurate radiograph can be obtained by tracking individual protons traversing a phantom (patient) [4,6,10]. Our studies benefit from the gas-filled time projection chambers based on GridPix technology [2], developed at Nikhef, capable of tracking a single proton. A BaF2 crystal measuring the residual energy of the protons was used. Proton radiographs of a phantom consisting of different tissue-like materials were measured with a 30×30 mm2 150 MeV proton beam. The measurements were simulated with the Geant4 toolkit. The first experimental and simulated energy radiographs are in very good agreement [3]. In this paper we focus on simulation studies of the proton scattering angle, as it affects the position resolution of the proton energy loss radiograph. By selecting protons with a small scattering angle, the image quality can be improved significantly

  14. Proton radiography to improve proton therapy treatment

    Science.gov (United States)

    Takatsu, J.; van der Graaf, E. R.; Van Goethem, M.-J.; van Beuzekom, M.; Klaver, T.; Visser, J.; Brandenburg, S.; Biegun, A. K.

    2016-01-01

    The quality of cancer treatment with protons critically depends on an accurate prediction of the proton stopping powers for the tissues traversed by the protons. Today, treatment planning in proton radiotherapy is based on stopping power calculations from densities of X-ray Computed Tomography (CT) images. This causes systematic uncertainties in the calculated proton range in a patient of typically 3-4%, which can reach 10% in bone regions [1,2,3,4,5,6,7,8]. This may lead to parts of the tumor receiving no dose and healthy tissues receiving too high a dose [1]. A direct measurement of proton stopping powers with high-energy protons will allow these uncertainties to be reduced and will improve the quality of the treatment. Several studies have shown that a sufficiently accurate radiograph can be obtained by tracking individual protons traversing a phantom (patient) [4,6,10]. Our studies benefit from the gas-filled time projection chambers based on GridPix technology [2], developed at Nikhef, capable of tracking a single proton. A BaF2 crystal measuring the residual energy of the protons was used. Proton radiographs of a phantom consisting of different tissue-like materials were measured with a 30×30 mm2 150 MeV proton beam. The measurements were simulated with the Geant4 toolkit. The first experimental and simulated energy radiographs are in very good agreement [3]. In this paper we focus on simulation studies of the proton scattering angle, as it affects the position resolution of the proton energy loss radiograph. By selecting protons with a small scattering angle, the image quality can be improved significantly.
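
    The scattering-angle selection studied in these two records is governed, for a thin slab, by the Highland parameterization of the multiple-Coulomb-scattering angle (quoted as the standard formula, with x/X0 the thickness in radiation lengths, p the proton momentum, βc its velocity and z its charge number):

```latex
\theta_{0} \;=\; \frac{13.6\ \mathrm{MeV}}{\beta c\, p}\; z\,
\sqrt{\frac{x}{X_{0}}}\,\left[\,1 + 0.038\,\ln\!\left(\frac{x}{X_{0}}\right)\right]
```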

  15. Commissioning of a compact laser-based proton beam line for high intensity bunches around 10 MeV

    Science.gov (United States)

    Busold, S.; Schumacher, D.; Deppert, O.; Brabetz, C.; Kroll, F.; Blažević, A.; Bagnoud, V.; Roth, M.

    2014-03-01

    We report on the first results of experiments with a new laser-based proton beam line at the GSI accelerator facility in Darmstadt. It delivers high current bunches at proton energies around 9.6 MeV, containing more than 10⁹ particles in less than 10 ns and with tunable energy spread down to 2.7% (ΔE/E0 at FWHM). A target normal sheath acceleration stage serves as a proton source and a pulsed solenoid provides for beam collimation and energy selection. Finally a synchronous radio frequency (rf) field is applied via a rf cavity for energy compression at a synchronous phase of -90 deg. The proton bunch is characterized at the end of the very compact beam line, only 3 m behind the laser matter interaction point, which defines the particle source.

  16. TH-C-BRD-05: Reducing Proton Beam Range Uncertainty with Patient-Specific CT HU to RSP Calibrations Based On Single-Detector Proton Radiography

    International Nuclear Information System (INIS)

    Purpose: Beam range uncertainty in proton treatment comes primarily from converting the patient's X-ray CT (xCT) dataset to relative stopping power (RSP). Current practice uses a single curve for this conversion, produced by a stoichiometric calibration based on tissue composition data for average, healthy, adult humans, but not for the individual in question. Proton radiographs produce water-equivalent path length (WEPL) maps, dependent on the RSP of the tissues within the specific patient. This work investigates the use of such WEPL maps to optimize patient-specific calibration curves for reducing beam range uncertainty. Methods: The optimization procedure works on the principle of minimizing the difference between the known WEPL map, obtained from a proton radiograph, and a digitally-reconstructed WEPL map (DRWM) through an RSP dataset, by altering the calibration curve that is used to convert the xCT into the RSP dataset. DRWMs were produced with Plastimatch, in-house developed software, and the optimization procedure was implemented in Matlab. Tests were made on a range of systems, including simulated datasets with computed WEPL maps and phantoms (anthropomorphic and real biological tissue) with WEPL maps measured by single-detector proton radiography. Results: For the simulated datasets, the optimizer showed excellent results. It was able to either completely eliminate or significantly reduce the root-mean-square error (RMSE) in the WEPL for the homogeneous phantoms (to zero for individual materials, or from 1.5% to 0.2% for the simultaneous optimization of multiple materials). For the heterogeneous phantom the RMSE was reduced from 1.9% to 0.3%. Conclusion: An optimization procedure has been designed to produce patient-specific calibration curves. Test results on a range of systems with different complexities and sizes have been promising for accurate beam range control in patients. This project was funded equally by the Engineering and Physical Sciences
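
    The optimization loop described in Methods (adjust the HU-to-RSP calibration curve until WEPL maps recomputed through the CT match the proton-radiography WEPL map) can be sketched in one dimension; the two-parameter piecewise-linear curve, the toy HU image and the synthetic "measured" map below are placeholders for the Plastimatch/Matlab pipeline:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)

# Toy CT: each of 200 "rays" crosses 100 voxels of 1 mm with random HU values
hu_image = rng.uniform(-100.0, 1200.0, size=(200, 100))
voxel_mm = 1.0

def hu_to_rsp(hu, params):
    """Illustrative two-segment calibration: one slope below 0 HU, another above."""
    a_soft, a_bone = params
    return np.where(hu < 0.0, 1.0 + a_soft * hu / 1000.0, 1.0 + a_bone * hu / 1000.0)

def drwm(params):
    """Digitally reconstructed WEPL map: line integral of RSP along each ray (mm)."""
    return hu_to_rsp(hu_image, params).sum(axis=1) * voxel_mm

# Synthetic "measured" WEPL map, generated with a known ground-truth curve plus noise
measured_wepl = drwm((1.0, 0.6)) + rng.normal(0.0, 0.3, 200)

rmse = lambda p: float(np.sqrt(np.mean((drwm(p) - measured_wepl) ** 2)))
result = minimize(rmse, x0=[0.8, 0.8], method="Nelder-Mead")
print("optimized calibration parameters:", result.x, " WEPL RMSE (mm): %.2f" % result.fun)
```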

  17. Heavy Ion SEU Cross Section Calculation Based on Proton Experimental Data, and Vice Versa

    CERN Document Server

    Wrobel, F; Pouget, V; Dilillo, L; Ecoffet, R; Lorfèvre, E; Bezerra, F; Brugger, M; Saigné, F

    2014-01-01

    The aim of this work is to provide a method to calculate single event upset (SEU) cross sections by using experimental data. Valuable tools such as PROFIT and SIMPA already focus on the calculation of the proton cross section by using heavy-ion cross-section experiments. However, there is no available tool that calculates heavy ion cross sections based on measured proton cross sections with no knowledge of the technology. We based our approach on the diffusion-collection model, with the aim of analyzing the characteristics of the transient currents that trigger SEUs. We show that experimental cross sections can be used to characterize the pulses that trigger an SEU. The experimental results also allow an empirical rule to be defined that identifies the transient currents responsible for an SEU. The SEU cross section can then be calculated for any kind of particle and any energy, with no need to know the Spice model of the cell. We applied our method to some technologies (250 nm, 90 nm and 65 nm bulk SRAMs) and we sho...

  18. Parameterization of brachytherapy source phase space file for Monte Carlo-based clinical brachytherapy dose calculation

    International Nuclear Information System (INIS)

    A common approach to implementing the Monte Carlo method for the calculation of brachytherapy radiation dose deposition is to use a phase space file containing information on the particles emitted from a brachytherapy source. However, loading the phase space file during the dose calculation consumes a large amount of computer random access memory, imposing a higher requirement on computer hardware. In this study, we propose a method to parameterize the information (e.g., particle location, direction and energy) stored in the phase space file by using several probability distributions. This method was implemented for dose calculations of a commercial Ir-192 high dose rate source. The dose calculation accuracy of the parameterized source was compared to the results obtained using the full phase space file in a simple water phantom and in a clinical breast cancer case. The results showed that the parameterized source, at a size of 200 kB, was as accurate as the source represented by the 1.1 GB phase space file. By using the parameterized source representation, a compact Monte Carlo job can be designed, which allows an easy setup for parallel computing in brachytherapy planning. (paper)

  19. Parameterization of brachytherapy source phase space file for Monte Carlo-based clinical brachytherapy dose calculation

    Science.gov (United States)

    Zhang, M.; Zou, W.; Chen, T.; Kim, L.; Khan, A.; Haffty, B.; Yue, N. J.

    2014-01-01

    A common approach to implementing the Monte Carlo method for the calculation of brachytherapy radiation dose deposition is to use a phase space file containing information on particles emitted from a brachytherapy source. However, the loading of the phase space file during the dose calculation consumes a large amount of computer random access memory, imposing a higher requirement for computer hardware. In this study, we propose a method to parameterize the information (e.g., particle location, direction and energy) stored in the phase space file by using several probability distributions. This method was implemented for dose calculations of a commercial Ir-192 high dose rate source. Dose calculation accuracy of the parameterized source was compared to the results observed using the full phase space file in a simple water phantom and in a clinical breast cancer case. The results showed that the parameterized source, at a size of 200 kB, was as accurate as the source represented by the full 1.1 GB phase space file. By using the parameterized source representation, a compact Monte Carlo job can be designed, which allows an easy setup for parallel computing in brachytherapy planning.

  20. Simulations of fast ions distribution in stellarators based on coupled Monte Carlo fuelling and orbit codes

    International Nuclear Information System (INIS)

    The numerical simulation of the dynamics of fast ions coming from neutral beam injection (NBI) heating is an important task in fusion devices, since these particles are used as sources to heat and fuel the plasma and their uncontrolled losses can damage the walls of the reactor. This paper presents a new application that simulates these dynamics on the grid: FastDEP. FastDEP plugs together two Monte Carlo codes used in fusion science, namely FAFNER2 and ISDEP, and adds new functionalities. Physically, FAFNER2 provides the fast ion initial state in the device while ISDEP calculates their evolution in time; as a result, the fast ion distribution function in the TJ-II stellarator has been estimated, but the code can be used on any other device. In this paper a comparison between the physics of the two NBI injectors in TJ-II is presented, together with the differences between fast ion confinement and the driven momentum in the two cases. The simulations have been obtained using Montera, a framework developed for achieving grid-efficient executions of Monte Carlo applications. (paper)

  1. A lattice-based Monte Carlo evaluation of Canada Deuterium Uranium-6 safety parameters

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Hee; Hartanto, Donny; Kim, Woo Song [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of)

    2016-06-15

    Important safety parameters such as the fuel temperature coefficient (FTC) and the power coefficient of reactivity (PCR) of the CANada Deuterium Uranium (CANDU-6) reactor have been evaluated using the Monte Carlo method. For accurate analysis of the parameters, the Doppler broadening rejection correction scheme was implemented in the MCNPX code to account for the thermal motion of the heavy uranium-238 nucleus in the neutron-U scattering reactions. In this work, a standard fuel lattice has been modeled and the fuel is depleted using MCNPX. The FTC value is evaluated for several burnup points including the mid-burnup representing a near-equilibrium core. The Doppler effect has been evaluated using several cross-section libraries such as ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1.1, and JENDL-4.0. The PCR value is also evaluated at mid-burnup conditions to characterize the safety features of an equilibrium CANDU-6 reactor. To improve the reliability of the Monte Carlo calculations, we considered a huge number of neutron histories in this work and the standard deviation of the k-infinity values is only 0.5-1 pcm.

  2. Investigation of SIBM driven recrystallization in alpha Zirconium based on EBSD data and Monte Carlo modeling

    Science.gov (United States)

    Jedrychowski, M.; Bacroix, B.; Salman, O. U.; Tarasiuk, J.; Wronski, S.

    2015-08-01

    The work focuses on the influence of moderate plastic deformation on subsequent partial recrystallization of hexagonal zirconium (Zr702). In the considered case, strain induced boundary migration (SIBM) is assumed to be the dominating recrystallization mechanism. This hypothesis is analyzed and tested in detail using experimental EBSD-OIM data and Monte Carlo computer simulations. An EBSD investigation is performed on zirconium samples, which were channel-die compressed in two perpendicular directions: normal direction (ND) and transverse direction (TD) of the initial material sheet. The maximal applied strain was below 17%. Then, samples were briefly annealed in order to achieve a partly recrystallized state. Obtained EBSD data were analyzed in terms of texture evolution associated with a microstructural characterization, including: kernel average misorientation (KAM), grain orientation spread (GOS), twinning, grain size distributions, description of grain boundary regions. In parallel, Monte Carlo Potts model combined with experimental microstructures was employed in order to verify two main recrystallization scenarios: SIBM driven growth from deformed sub-grains and classical growth of recrystallization nuclei. It is concluded that simulation results provided by the SIBM model are in a good agreement with experimental data in terms of texture as well as microstructural evolution.
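
    For orientation, the sketch below shows a single Metropolis update of a 2D Monte Carlo Potts model of the kind named above: each lattice site carries a grain orientation, unlike neighbours contribute boundary energy, and a per-orientation stored-energy term crudely stands in for the deformation energy that drives SIBM. The lattice size, number of orientations and energy values are illustrative assumptions, not the authors' calibrated model.

        # Minimal 2D Potts-model Monte Carlo sketch of grain-boundary / stored-energy driven growth.
        import numpy as np

        rng = np.random.default_rng(2)
        N, Q = 64, 32                                # lattice size, number of orientations
        spins = rng.integers(0, Q, size=(N, N))      # grain orientation at each site
        stored = rng.random(Q) * 0.5                 # hypothetical stored deformation energy per orientation
        stored[:4] = 0.0                             # a few strain-free "recrystallized" orientations

        def site_energy(s, i, j):
            nbrs = (spins[(i + 1) % N, j], spins[(i - 1) % N, j],
                    spins[i, (j + 1) % N], spins[i, (j - 1) % N])
            boundary = sum(1 for n in nbrs if n != s)   # unlike-neighbour (grain boundary) energy
            return boundary + stored[s]                 # plus stored energy of orientation s

        def potts_step(kT=0.3):
            i, j = rng.integers(0, N, size=2)
            old, new = spins[i, j], int(rng.integers(0, Q))
            dE = site_energy(new, i, j) - site_energy(old, i, j)
            if dE <= 0 or rng.random() < np.exp(-dE / kT):
                spins[i, j] = new                       # accept the re-orientation (Metropolis rule)

        for _ in range(200_000):
            potts_step()
        print("sites holding strain-free orientations:", int(np.isin(spins, range(4)).sum()))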

  3. GPU Acceleration of Mean Free Path Based Kernel Density Estimators for Monte Carlo Neutronics Simulations

    International Nuclear Information System (INIS)

    Kernel Density Estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. Kernel density estimators are an alternative to histogram tallies for obtaining global solutions in Monte Carlo tallies. With KDEs, a single event, either a collision or particle track, can contribute to the score at multiple tally points with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance when compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed source shielding applications. However, little work was done to obtain reaction rates using KDEs. This paper introduces a new form of the MFP KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies to the solution. An ad-hoc solution to these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.
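
    As a point of reference, the sketch below shows the basic kernel-density tally idea in one dimension: every collision scores at many fixed tally points through a finite-width kernel rather than into a single histogram bin. This is the generic collision-estimator KDE with an illustrative slab problem, not the mean-free-path form developed in the paper.

        # Generic 1D collision KDE tally sketch; cross sections and geometry are illustrative.
        import numpy as np

        tally_points = np.linspace(0.0, 10.0, 101)     # cm
        bandwidth = 0.25                               # kernel width h (cm), user-chosen
        scores = np.zeros_like(tally_points)

        def epanechnikov(u):
            return np.where(np.abs(u) < 1.0, 0.75 * (1.0 - u * u), 0.0)

        def score_collision(x, weight, sigma_t):
            # Collision estimate of flux, w / Sigma_t, smeared over nearby tally points.
            u = (tally_points - x) / bandwidth
            scores[:] += (weight / sigma_t) * epanechnikov(u) / bandwidth

        # Toy history stream: exponentially distributed first-collision sites in a slab.
        rng = np.random.default_rng(3)
        sigma_t = 0.5                                  # 1/cm
        n_hist = 100_000
        for _ in range(n_hist):
            x = rng.exponential(1.0 / sigma_t)
            if x < 10.0:
                score_collision(x, weight=1.0, sigma_t=sigma_t)

        flux_estimate = scores / n_hist
        print("flux near the source face:", round(float(flux_estimate[1]), 4))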

  4. A Monte Carlo and continuum study of mechanical properties of nanoparticle based films

    Energy Technology Data Exchange (ETDEWEB)

    Ogunsola, Oluwatosin; Ehrman, Sheryl [University of Maryland, Department of Chemical and Biomolecular Engineering, Chemical and Nuclear Engineering Building (United States)], E-mail: sehrman@eng.umd.edu

    2008-01-15

    A combination Monte Carlo and equivalent-continuum simulation approach was used to investigate the structure-mechanical property relationships of titania nanoparticle deposits. Films of titania composed of nanoparticle aggregates were simulated using a Monte Carlo approach with diffusion-limited aggregation. Each aggregate in the simulation is fractal-like and random in structure. In the film structure, it is assumed that bond strength is a function of distance with two limiting values for the bond strengths: one representing the strong chemical bond between the particles at closest proximity in the aggregate and the other representing the weak van der Waals bond between particles from different aggregates. The Young's modulus of the film is estimated using an equivalent-continuum modeling approach, and the influences of particle diameter (5-100 nm) and aggregate size (3-400 particles per aggregate) on predicted Young's modulus are investigated. The Young's modulus is observed to increase with a decrease in primary particle size and is independent of the size of the aggregates deposited. Decreasing porosity resulted in an increase in Young's modulus as expected from results reported previously in the literature.
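
    To make the structure-generation step concrete, the sketch below grows a single fractal-like aggregate with a minimal on-lattice diffusion-limited aggregation rule. The lattice size, walker-release rule and sticking criterion are deliberately simplified assumptions, and the equivalent-continuum mechanics step described above is omitted entirely.

        # Minimal on-lattice DLA sketch: random walkers stick when they touch the cluster.
        import numpy as np

        rng = np.random.default_rng(4)
        L = 101
        grid = np.zeros((L, L), dtype=bool)
        grid[L // 2, L // 2] = True                    # seed particle at the centre
        moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

        def has_occupied_neighbour(i, j):
            return any(grid[(i + di) % L, (j + dj) % L] for di, dj in moves)

        for _ in range(400):                           # particles added to the aggregate
            i, j = rng.integers(0, L, size=2)          # simplified release at a random site
            for _ in range(20_000):                    # random walk until it sticks (or give up)
                if not grid[i, j] and has_occupied_neighbour(i, j):
                    grid[i, j] = True
                    break
                di, dj = moves[rng.integers(0, 4)]
                i, j = (i + di) % L, (j + dj) % L

        print("aggregate size:", int(grid.sum()))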

  5. GPU Acceleration of Mean Free Path Based Kernel Density Estimators for Monte Carlo Neutronics Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Burke, TImothy P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Martin, William R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-11-19

    Kernel Density Estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. Kernel density estimators are an alternative to histogram tallies for obtaining global solutions in Monte Carlo tallies. With KDEs, a single event, either a collision or particle track, can contribute to the score at multiple tally points with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance when compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed source shielding applications. However, little work was done to obtain reaction rates using KDEs. This paper introduces a new form of the MFP KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies to the solution. An ad-hoc solution to these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.

  6. Proton Therapy

    Science.gov (United States)

    Proton therapy delivers radiation to tumor tissue ...

  7. Study of an intense proton beam profiler based on laser absorption

    International Nuclear Information System (INIS)

    Among the challenges of high current proton accelerators, the development of new beam diagnostics is of major importance. The main difficulty for these instruments is the beam power, which damages any instrument placed in the beam to intercept it. The chosen detectors are therefore non-interceptive systems. After an introduction concerning the characteristics of the accelerator used (chapter I), the parameters defining a beam of particles are presented (chapter II). Among these, the profile is an important beam characteristic for its transport. After a description of the different types of beam profilers, their problematic application to intense beams is discussed. New physical phenomena have to be used for such profilers. We have therefore investigated optical luminescence phenomena. The light produced during the interaction of protons with the residual gas and/or locally injected gas is a source of information on beam characteristics. Chapters III and IV present an experimental and theoretical analysis of this luminescence. Chapter V is a direct application of spectroscopic measurements to estimate the proton output with a non-interceptive technique. From the spectral analysis, the idea of a profiler based on laser absorption is developed. This presentation is both theoretical and experimental (chapters VI and VII). Laser absorption requires the use of metastable states, which are defined in chapter VI. The evolution of the metastable states in time and space has been studied rigorously in order to discuss the concept of an optical profiler. Chapter VII presents all the instrumentation necessary for the use of a laser and the first measurements with the beam. At the end of the thesis, the first recorded profile is presented, together with an experimental critique describing the different sources of error and the proposed remedies. (author)

  8. An assessment of radiation damage in space-based germanium detectors due to solar proton events

    International Nuclear Information System (INIS)

    Radiation effects caused by solar proton events will be a common problem for many types of sensors on missions to the inner solar system because of the long cruise phases coupled with the inverse square scaling of solar particle events. As part of a study in support of the BepiColombo mission to Mercury we have undertaken a comprehensive series of tests to assess these effects on a wide range of sensors. In this paper, we report on the measurements on a large volume coaxial Ge detector which was exposed to simulated solar proton spectra of integrated fluences 8×10⁸, 6×10⁹ and 6×10¹⁰ protons cm⁻². After each irradiation the detector's performance was assessed in terms of energy resolution, efficiency and activation. The detector was then annealed and the measurements repeated before the next irradiation. The minimum operational performance criteria were based on the resolution and efficiency requirements necessary to detect and separate specific radioisotope emission lines from a planetary regolith: specifically, that the energy resolution be restored to 5 keV FWHM at 1332 keV and that the detection efficiency be degraded by no more than 10% of its pre-irradiation value. The key conclusion of this study is that even after a modest solar proton event the detector requires extensive annealing. After exposure to an event of integral fluence ∼8×10⁸ protons cm⁻² this amounts to ∼1 week at 100 °C, whereas for a fluence of ∼6×10¹⁰ protons cm⁻² the detector requires 3.5 months of annealing to satisfy the minimum operational performance requirements and 4.5 months to return the energy resolution to <3 keV FWHM at 1332 keV. As a consequence such an instrument will require constant, planned and active management throughout its operational lifetime. The impact on spacecraft operations, including resource management, therefore needs careful consideration.

  9. Two proton-conductive hybrids based on 2-(3-pyridyl)benzimidazole molecules and Keggin-type heteropolyacids

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Mei-Lin, E-mail: weimeilinhd@163.com; Wang, Yu-Xia; Wang, Xin-Jun

    2014-01-15

    Two proton-conductive organic/inorganic complexes were constructed from Keggin-type heteropolyacids and 2-(3-pyridyl)benzimidazole molecules. Single-crystal X-ray diffraction analyses revealed that the two complexes crystallized in the monoclinic space group P2₁/c, exhibited different unit cell parameters, and presented different hydrogen-bonded networks constructed by 2-(3-pyridyl)benzimidazole molecules, [PMo₁₂O₄₀]³⁻ anions and solvent molecules. The results of thermogravimetric analyses suggest that the two supramolecular complexes have different thermal stability arising from their different hydrogen-bonded networks. At 100 °C under 35–98% relative humidity the two complexes showed a good proton conductivity of about 10⁻³ S cm⁻¹. The proton conductivities of the two complexes under 98% relative humidity both increase on a logarithmic scale with temperature over the range 25–100 °C. At 100 °C, both complexes showed poor proton conductivities of 10⁻⁸–10⁻⁹ S cm⁻¹ under acetonitrile or methanol vapor. - Graphical abstract: Two molecular hybrids constructed from Keggin-type heteropolyacids and 2-(3-pyridyl)benzimidazole molecules showed good proton conductivities of 10⁻³ S cm⁻¹ at 100 °C under 35–98% relative humidity. - Highlights: • 2-(3-Pyridyl)benzimidazole can form hydrogen bonds via its N–H groups. • Heteropolyacids have suitable characteristics to be used as excellent proton conductors. • Two proton-conductive hybrids based on Keggin HPAs and 3-PyBim were constructed. • The structures were determined using single-crystal X-ray diffraction data. • They showed good proton conductivities of 10⁻³ S cm⁻¹ at 100 °C under 35–98% RH.

  10. Accelerating parameter identification of proton exchange membrane fuel cell model with ranking-based differential evolution

    International Nuclear Information System (INIS)

    Parameter identification of the PEM (proton exchange membrane) fuel cell model is a very active area of research. Generally, it can be treated as a numerical optimization problem with complex nonlinear and multi-variable features. DE (differential evolution), which has been successfully used in various fields, is a simple yet efficient evolutionary algorithm for global numerical optimization. In this paper, with the objective of accelerating the process of parameter identification of PEM fuel cell models and reducing the necessary computational efforts, we first present a generic and simple ranking-based mutation operator for the DE algorithm. Then, the ranking-based mutation operator is incorporated into five highly-competitive DE variants to solve the PEM fuel cell model parameter identification problems. The main contributions of this work are the proposed ranking-based DE variants and their application to the parameter identification problems of PEM fuel cell models. Experiments have been conducted by using both the simulated voltage–current data and the data obtained from the literature to validate the performance of our approach. The results indicate that the ranking-based DE methods provide better results with respect to the solution quality, the convergence rate, and the success rate compared with their corresponding original DE methods. In addition, the voltage–current characteristics obtained by our approach are in good agreement with the original voltage–current curves in all cases. - Highlights: • A simple and generic ranking-based mutation operator is presented in this paper. • Several DE (differential evolution) variants are used to solve the parameter identification of the PEMFC (proton exchange membrane fuel cell) model. • Results show that our method accelerates the process of parameter identification. • The V–I characteristics are in very good agreement with experimental data.
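
    The sketch below illustrates the kind of ranking-based mutation described above: individuals entering the DE/rand/1 mutant are drawn with a probability proportional to their fitness rank, so better solutions are reused more often. Which terms are rank-selected, the selection probabilities and the toy objective standing in for the PEMFC voltage-current fitting error are all simplified assumptions relative to the paper.

        # Generic ranking-based DE sketch (minimisation); constants are illustrative only.
        import numpy as np

        rng = np.random.default_rng(5)

        def rank_probabilities(fitness):
            # Best individual (lowest fitness) gets the highest selection probability.
            order = np.argsort(np.argsort(fitness))      # 0 = best
            ranks = len(fitness) - order                 # best -> NP, worst -> 1
            return ranks / ranks.sum()

        def rank_select(prob, exclude):
            while True:
                idx = int(rng.choice(len(prob), p=prob))
                if idx not in exclude:
                    return idx

        def de_step(pop, fitness, objective, F=0.5, CR=0.9):
            NP, D = pop.shape
            prob = rank_probabilities(fitness)
            for i in range(NP):
                r1 = rank_select(prob, {i})              # rank-selected base vector
                r2 = rank_select(prob, {i, r1})          # rank-selected difference term
                while True:
                    r3 = int(rng.integers(NP))           # random difference term
                    if r3 not in (i, r1, r2):
                        break
                mutant = pop[r1] + F * (pop[r2] - pop[r3])
                cross = rng.random(D) < CR
                cross[rng.integers(D)] = True
                trial = np.where(cross, mutant, pop[i])
                f_trial = objective(trial)
                if f_trial <= fitness[i]:
                    pop[i], fitness[i] = trial, f_trial
            return pop, fitness

        objective = lambda x: float(np.sum((x - 1.0) ** 2))   # toy stand-in for the fitting error
        pop = rng.uniform(-5.0, 5.0, size=(30, 7))            # 7 hypothetical model parameters
        fitness = np.array([objective(x) for x in pop])
        for _ in range(200):
            pop, fitness = de_step(pop, fitness, objective)
        print("best fitting error:", round(float(fitness.min()), 6))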

  11. Monte Carlo application based on GEANT4 toolkit to simulate a laser–plasma electron beam line for radiobiological studies

    Energy Technology Data Exchange (ETDEWEB)

    Lamia, D., E-mail: debora.lamia@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Russo, G., E-mail: giorgio.russo@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Casarino, C.; Gagliano, L.; Candiano, G.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Labate, L. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Baffigi, F.; Fulgentini, L.; Giulietti, A.; Koester, P.; Palla, D. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); Gizzi, L.A. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Gilardi, M.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR, Segrate (Italy); University of Milano-Bicocca, Milano (Italy)

    2015-06-21

    We report on the development of a Monte Carlo application, based on the GEANT4 toolkit, for the characterization and optimization of electron beams for clinical applications produced by a laser-driven plasma source. The GEANT4 application is conceived so as to represent in the most general way the physical and geometrical features of a typical laser-driven accelerator. It is designed to provide standard dosimetric figures such as percentage dose depth curves, two-dimensional dose distributions and 3D dose profiles at different positions both inside and outside the interaction chamber. The application was validated by comparing its predictions to experimental measurements carried out on a real laser-driven accelerator. The work is aimed at optimizing the source, by using this novel application, for radiobiological studies and, in perspective, for medical applications. - Highlights: • Development of a Monte Carlo application based on GEANT4 toolkit. • Experimental measurements carried out with a laser-driven acceleration system. • Validation of Geant4 application comparing experimental data with the simulated ones. • Dosimetric characterization of the acceleration system.

  12. Monte Carlo application based on GEANT4 toolkit to simulate a laser–plasma electron beam line for radiobiological studies

    International Nuclear Information System (INIS)

    We report on the development of a Monte Carlo application, based on the GEANT4 toolkit, for the characterization and optimization of electron beams for clinical applications produced by a laser-driven plasma source. The GEANT4 application is conceived so as to represent in the most general way the physical and geometrical features of a typical laser-driven accelerator. It is designed to provide standard dosimetric figures such as percentage dose depth curves, two-dimensional dose distributions and 3D dose profiles at different positions both inside and outside the interaction chamber. The application was validated by comparing its predictions to experimental measurements carried out on a real laser-driven accelerator. The work is aimed at optimizing the source, by using this novel application, for radiobiological studies and, in perspective, for medical applications. - Highlights: • Development of a Monte Carlo application based on GEANT4 toolkit. • Experimental measurements carried out with a laser-driven acceleration system. • Validation of Geant4 application comparing experimental data with the simulated ones. • Dosimetric characterization of the acceleration system

  13. Proton recoil telescope based on diamond detectors for measurement of fusion neutrons

    CERN Document Server

    Caiffi, B; Ripani, M; Pillon, M; Taiuti, M

    2015-01-01

    Diamonds are very promising candidates for neutron diagnostics in harsh environments such as fusion reactors. This is primarily because of their radiation hardness, which exceeds that of silicon by an order of magnitude. Also, in comparison to the standard on-line neutron diagnostics (fission chambers, silicon-based detectors, scintillators), diamonds are less sensitive to γ rays, which represent a huge background in fusion devices. Finally, their low leakage current at high temperature suppresses the intrinsic detector noise. In this talk a CVD diamond based detector is proposed for the measurement of the 14 MeV neutrons from the D-T fusion reaction. The detector was arranged in a proton recoil telescope configuration, featuring a plastic converter in front of the sensitive volume in order to induce the (n,p) reaction. The segmentation of the sensitive volume, achieved by using two crystals, allowed measurements to be performed in coincidence, which suppressed the neutron elastic scattering backg...

  14. Monte Carlo simulation of primary reactions on HPLUS based on pluto event generator

    International Nuclear Information System (INIS)

    The Hadron Physics Lanzhou Spectrometer (HPLUS) is designed for the study of hadron production and decay from nucleon-nucleon interactions in the GeV region. The current configuration of HPLUS and the particle identification methods for its three polar angle regions are discussed. The Pluto event generator is applied to simulate the primary reactions on HPLUS, concerning four issues: the agreement in the pp elastic scattering angular distribution between Pluto samples and experimental data; the acceptance of charged K mesons in the strangeness production channels for the forward region of HPLUS; the dependence of the maximum photon energy and the minimum vertex angle of two photons on the polar angle; and the influence of different reconstruction methods on the mass spectrum of excited nucleon states with large resonance width. It is shown that the Pluto event generator satisfies the requirements of Monte Carlo simulation for HPLUS. (authors)

  15. Web-Based Parallel Monte Carlo Simulation Platform for Financial Computation

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Using Java, Java-enabled Web and object-oriented programming technologies, a framework is designed to quickly organize a multicomputer system on an Intranet for parallelizing Monte Carlo simulation. The high-performance computing environment is embedded in the Web server so it can be accessed more easily. Adaptive parallelism and an eager scheduling algorithm are used to realize load balancing, parallel processing and system fault tolerance. Independent-sequence pseudo-random number generator schemes keep the parallel simulation valid. Three kinds of stock option pricing models are used as test instances, and ideal speedup and pricing results were obtained on the test bed. As a Web service, a high-performance financial derivative security-pricing platform has now been set up for training and study. The framework can also be used to develop other SPMD (single procedure multiple data) applications. Robustness is still a major problem for further research.
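
    The sketch below shows the kind of embarrassingly parallel Monte Carlo pricing task such a platform distributes: each worker prices with an independent random stream and the partial estimates are averaged. Process-based parallelism and a plain Black-Scholes European call are illustrative stand-ins for the Java/Web layer and the three pricing models mentioned above.

        # Parallel Monte Carlo pricing sketch with one independent RNG stream per worker.
        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def price_european_call(seed, n_paths=200_000, s0=100.0, k=105.0,
                                r=0.03, sigma=0.2, t=1.0):
            # Risk-neutral Monte Carlo estimate of a European call under geometric Brownian motion.
            rng = np.random.default_rng(seed)
            z = rng.standard_normal(n_paths)
            st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
            return float(np.exp(-r * t) * np.maximum(st - k, 0.0).mean())

        if __name__ == "__main__":
            seeds = range(8)                       # one independent stream per worker
            with ProcessPoolExecutor() as pool:
                estimates = list(pool.map(price_european_call, seeds))
            print("price estimate:", round(sum(estimates) / len(estimates), 4))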

  16. GPU-based Monte Carlo dust radiative transfer scheme applied to AGN

    CERN Document Server

    Heymann, Frank

    2012-01-01

    A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computer power or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons (PAH). Anisotropic scattering is treated applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon counting procedure and at high spatial resolution by a vectorized ray-tracer. The latter allows computation of high signal-to-noise images of the objects at arbitrary frequencies and viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust...
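
    For reference, the sketch below samples a scattering angle from the Henyey-Greenstein phase function named above, using the standard inverse-CDF relation; g is the scattering asymmetry parameter and the value used here is illustrative.

        # Inverse-CDF sampling of cos(theta) from the Henyey-Greenstein phase function.
        import numpy as np

        def sample_hg_cos_theta(g, rng):
            u = rng.random()
            if abs(g) < 1e-6:
                return 2.0 * u - 1.0                         # isotropic limit
            frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
            return (1.0 + g * g - frac * frac) / (2.0 * g)

        rng = np.random.default_rng(6)
        samples = [sample_hg_cos_theta(0.6, rng) for _ in range(5)]
        print(np.round(samples, 3))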

  17. PC-based process distribution to solve iterative Monte Carlo simulations in physical dosimetry

    International Nuclear Information System (INIS)

    A distribution model to simulate physical dosimetry measurements with Monte Carlo (MC) techniques has been developed. This approach is suited to simulations in which the measurement conditions (and hence the input parameters) change continuously, such as a TPR curve or the estimation of the resolution limit of an optimal densitometer in the case of small-field profiles. As a comparison, a high-resolution scan for narrow beams with no iterative process is presented. The model has been installed on networked PCs without any resident software. The only requirements for these PCs are a small, temporary Linux partition on the hard disk and a network connection to our server PC. (orig.)

  18. Expression and functioning of retinal-based proton pumps in a saltern crystallizer brine.

    Science.gov (United States)

    Oren, Aharon; Abu-Ghosh, Said; Argov, Tal; Kara-Ivanov, Eliahu; Shitrit, Dror; Volpert, Adi; Horwitz, Rael

    2016-01-01

    We examined the presence of bacteriorhodopsin and other retinal protein pigments in the microbial community of the saltern crystallizer ponds in Eilat, Israel, and assessed the effect of the retinal-based proton pumps on the metabolic activity. The biota of the hypersaline (~309 g salts l⁻¹) brine consisted of ~2200 β-carotene-rich Dunaliella cells and ~3.5 × 10⁷ prokaryotes ml⁻¹, most of which were flat, square or rectangular Haloquadratum-like archaea. No indications were obtained for a massive presence of Salinibacter. We estimated a concentration of bacteriorhodopsin and bacteriorhodopsin-like pigments of 3.6 nmol l⁻¹. When illuminated, the community respiration activity of the brine samples in which oxygenic photosynthesis was inhibited by 3-(3,4-dichlorophenyl)-1,1-dimethylurea decreased by 40–43%. This effect was interpreted to be the result of competition between two energy-yielding systems: the bacteriorhodopsin proton pump and the respiratory chain. The results presented have important implications for the interpretation of many published data on photosynthetic and respiratory activities in hypersaline environments. PMID:26507954

  19. Proton and carbon ion radiotherapy for primary brain tumors and tumors of the skull base

    Energy Technology Data Exchange (ETDEWEB)

    Combs, Stephanie E.; Kessel, Kerstin; Habermehl, Daniel; Debus, Jurgen [Univ. Hospital of Heidelberg, Dept. of Radiation Oncology, Heidelberg (Germany)], e-mail: Stephanie.Combs@med.uni-heidelberg.de; Haberer, Thomas [Heidelberger Ionenstrahl Therapiezentrum (HIT), Heidelberg (Germany); Jaekel, Oliver [Univ. Hospital of Heidelberg, Dept. of Radiation Oncology, Heidelberg (Germany); Heidelberger Ionenstrahl Therapiezentrum (HIT), Heidelberg (Germany)

    2013-10-15

    To analyze clinical concepts, toxicity and treatment outcome in patients with brain and skull base tumors treated with photons and particle therapy. Material and methods: In total 260 patients with brain tumors and tumors of the skull base were treated at the Heidelberg Ion Therapy Center (HIT). Patients enrolled in and randomized within prospective clinical trials as well as bony or soft tissue tumors are not included in this analysis. Treatment was delivered as protons, carbon ions, or combinations of photons and a carbon ion boost. All patients are included in a tight follow-up program. The median follow-up time is 12 months (range 2-39 months). Results: Main histologies included meningioma (n = 107) for skull base lesions, pituitary adenomas (n = 14), low-grade gliomas (n = 51) as well as high-grade gliomas (n = 55) for brain tumors. In all patients treatment could be completed without any unexpected severe toxicities. No side effects > CTC Grade III were observed. To date, no severe late toxicities were observed, however, for endpoints such as secondary malignancies or neurocognitive side effects follow-up time still remains too short. Local recurrences were mainly seen in the group of high-grade gliomas or atypical meningiomas; for benign skull base meningiomas, to date, no recurrences were observed during follow-up. Conclusion: The specific benefit of particle therapy will potentially reduce the risk of secondary malignancies as well as improve neurocognitive outcome and quality of life (QOL); thus, longer follow-up will be necessary to confirm these endpoints. Indication-specific trials on meningiomas and gliomas are underway to elucidate the role of protons and carbon ions in these indications.

  20. Proton and carbon ion radiotherapy for primary brain tumors and tumors of the skull base

    International Nuclear Information System (INIS)

    To analyze clinical concepts, toxicity and treatment outcome in patients with brain and skull base tumors treated with photons and particle therapy. Material and methods: In total 260 patients with brain tumors and tumors of the skull base were treated at the Heidelberg Ion Therapy Center (HIT). Patients enrolled in and randomized within prospective clinical trials as well as bony or soft tissue tumors are not included in this analysis. Treatment was delivered as protons, carbon ions, or combinations of photons and a carbon ion boost. All patients are included in a tight follow-up program. The median follow-up time is 12 months (range 2-39 months). Results: Main histologies included meningioma (n = 107) for skull base lesions, pituitary adenomas (n = 14), low-grade gliomas (n = 51) as well as high-grade gliomas (n = 55) for brain tumors. In all patients treatment could be completed without any unexpected severe toxicities. No side effects > CTC Grade III were observed. To date, no severe late toxicities were observed, however, for endpoints such as secondary malignancies or neurocognitive side effects follow-up time still remains too short. Local recurrences were mainly seen in the group of high-grade gliomas or atypical meningiomas; for benign skull base meningiomas, to date, no recurrences were observed during follow-up. Conclusion: The specific benefit of particle therapy will potentially reduce the risk of secondary malignancies as well as improve neurocognitive outcome and quality of life (QOL); thus, longer follow-up will be necessary to confirm these endpoints. Indication-specific trials on meningiomas and gliomas are underway to elucidate the role of protons and carbon ions in these indications.

  1. Development and validation of a measurement-based source model for kilovoltage cone-beam CT Monte Carlo dosimetry simulations

    International Nuclear Information System (INIS)

    measurements by 1.35%–5.31% (mean difference = −3.42%, SD = 1.09%). Conclusions: This work demonstrates the feasibility of using a measurement-based kV CBCT source model to facilitate dose calculations with Monte Carlo methods for both the radiographic and CBCT modes of operation. While this initial work validates simulations against measurements for simple geometries, future work will involve utilizing the source model to investigate kV CBCT dosimetry with more complex anthropomorphic phantoms and patient-specific models.

  2. Evaluation and comparison of New 4DCT based strategies for proton treatment planning for lung tumors

    International Nuclear Information System (INIS)

    To evaluate different strategies for proton lung treatment planning based on four-dimensional CT (4DCT) scans. Twelve cases, involving only gross tumor volumes (GTV), were evaluated. Single image sets of (1) maximum intensity projection (MIP3) of end inhale (EI), middle exhale (ME) and end exhale (EE) images; (2) average intensity projection (AVG) of all phase images; and (3) EE images from 4DCT scans were selected as primary images for proton treatment planning. Internal target volumes (ITVs) outlined by a clinician were imported into MIP3, AVG, and EE images as planning targets. Initially, treatment uncertainties were not included in planning. Each plan was imported into phase images of 4DCT scans. Relative volumes of GTVs covered by 95% of prescribed dose and mean ipsilateral lung dose of a phase image obtained by averaging the dose in inspiration and expiration phases were used to evaluate the quality of a plan for a particular case. For comparing different planning strategies, the mean of the averaged relative volumes of GTVs covered by 95% of prescribed dose and its standard deviation for each planning strategy for all cases were used. Then, treatment uncertainties were included in planning. Each plan was recalculated in phase images of 4DCT scans. Same strategies were used for plan evaluation except dose-volume histograms of the planning target volumes (PTVs) instead of GTVs were used and the mean and standard deviation of the relative volumes of PTVs covered by 95% of prescribed dose and the ipsilateral lung dose were used to compare different planning strategies. MIP3 plans without treatment uncertainties yielded 96.7% of the mean relative GTV covered by 95% of prescribed dose (standard deviations of 5.7% for all cases). With treatment uncertainties, MIP3 plans yielded 99.5% of mean relative PTV covered by 95% of prescribed dose (standard deviations of 0.7%). Inclusion of treatment uncertainties improved PTV dose coverage but also increased the ipsilateral

  3. Applications of Monte Carlo methods in nuclear science and engineering

    International Nuclear Information System (INIS)

    With the advent of inexpensive computing power over the past two decades and the development of variance reduction techniques, applications of Monte Carlo radiation transport techniques have proliferated dramatically. The motivation for variance reduction techniques is computational efficiency. Typical variance reduction techniques worth mentioning here are importance sampling, implicit capture, energy and angular biasing, Russian roulette, the exponential transform, next-event estimators, weight window generators, the range rejection technique (only for charged particles), etc. Applications of Monte Carlo in radiation transport include nuclear safeguards, accelerator applications, homeland security, nuclear criticality, health physics, radiological safety, radiography, radiotherapy physics, radiation standards, nuclear medicine (dosimetry and imaging), etc. In health care, Monte Carlo particle transport techniques offer exciting tools for radiotherapy research (cancer treatments involving photons, electrons, neutrons, protons, pions and other heavy ions), where they play an increasingly important role. Research and applications of Monte Carlo techniques in radiotherapy span a very wide range, from fundamental studies of cross sections and development of particle transport algorithms to clinical evaluation of treatment plans for a variety of radiotherapy modalities. A recent development is voxel-based Monte Carlo radiotherapy treatment planning involving external electron beams and patient data in the form of DICOM (Digital Imaging and Communications in Medicine) images. Articles relevant to the INIS are indexed separately.
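
    To make two of the listed variance reduction devices concrete, the sketch below applies implicit capture and Russian roulette inside a bare-bones 1D neutron history. The cross sections, slab thickness and weight thresholds are illustrative assumptions only.

        # Implicit capture + Russian roulette in a toy 1D slab transmission problem.
        import numpy as np

        rng = np.random.default_rng(7)
        SIGMA_T, SIGMA_A = 1.0, 0.3          # total / absorption macroscopic cross sections (1/cm)
        SLAB = 10.0                          # slab thickness (cm)
        W_CUTOFF, W_SURVIVE = 0.1, 0.5       # roulette threshold and survival weight

        def history():
            x, mu, w = 0.0, 1.0, 1.0
            while True:
                x += mu * rng.exponential(1.0 / SIGMA_T)     # free flight to next collision
                if x < 0.0 or x > SLAB:
                    return w if x > SLAB else 0.0            # transmission tally (leak right)
                w *= 1.0 - SIGMA_A / SIGMA_T                 # implicit capture: reduce weight, no kill
                if w < W_CUTOFF:                             # Russian roulette on low-weight particles
                    if rng.random() < w / W_SURVIVE:
                        w = W_SURVIVE
                    else:
                        return 0.0
                mu = 2.0 * rng.random() - 1.0                # isotropic scatter

        transmission = np.mean([history() for _ in range(50_000)])
        print("transmitted weight fraction:", round(float(transmission), 4))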

  4. SU-E-T-214: Intensity Modulated Proton Therapy (IMPT) Based On Passively Scattered Protons and Multi-Leaf Collimation: Prototype TPS and Dosimetry Study

    International Nuclear Information System (INIS)

    Purpose. Intensity-modulated proton therapy is usually implemented with multi-field optimization of pencil-beam scanning (PBS) proton fields. However, in view of the experience with photon IMRT, proton facilities equipped with double-scattering (DS) delivery and multi-leaf collimation (MLC) could produce highly conformal dose distributions (and possibly eliminate the need for patient-specific compensators) with a clever use of their MLC field shaping, provided that an optimal inverse TPS is developed. Methods. A prototype TPS was developed in MATLAB. The dose calculation process was based on a fluence-dose algorithm on an adaptive divergent grid. A database of dose kernels was precalculated in order to allow for fast variations of the field range and modulation during optimization. The inverse planning process was based on the adaptive simulated annealing approach, with direct aperture optimization of the MLC leaves. A dosimetry study was performed on a phantom formed by three concentric semicylinders separated by 5 mm, of which the innermost and outermost were regarded as organs at risk (OARs), and the middle one as the PTV. We chose a concave target (which is not treatable with conventional DS fields) to show the potential of our technique. The optimizer was configured to minimize the mean dose to the OARs while keeping a good coverage of the target. Results. The plan produced by the prototype TPS achieved a conformity index of 1.34, with the mean doses to the OARs below 78% of the prescribed dose. This result is hardly achievable with the traditional conformal DS technique with compensators, and it compares to what can be obtained with PBS. Conclusion. It is certainly feasible to produce IMPT fields with MLC passive scattering fields. With a fully developed treatment planning system, the produced plans can be superior to traditional DS plans in terms of plan conformity and dose to organs at risk.
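
    The sketch below illustrates the direct-aperture-optimization idea in miniature: leaf positions of one MLC field are perturbed by simulated annealing and kept whenever they lower a weighted PTV/OAR objective. The unit-dose "pencil column" model, phantom geometry and cooling schedule are deliberately crude assumptions standing in for the adaptive-grid fluence-dose algorithm and dose-kernel database described above.

        # Toy direct aperture optimization of MLC leaf pairs by simulated annealing.
        import numpy as np

        rng = np.random.default_rng(8)
        N_PAIRS, N_COLS = 20, 20                         # leaf pairs x dose columns
        ptv = np.zeros((N_PAIRS, N_COLS), dtype=bool); ptv[6:14, 6:14] = True
        oar = np.zeros_like(ptv); oar[6:14, 15:] = True

        def dose(left, right):
            # Unit dose wherever the aperture is open (very crude pencil-column model).
            d = np.zeros((N_PAIRS, N_COLS))
            for row in range(N_PAIRS):
                d[row, left[row]:right[row]] = 1.0
            return d

        def objective(left, right):
            d = dose(left, right)
            return np.mean((d[ptv] - 1.0) ** 2) + 0.5 * np.mean(d[oar] ** 2)

        left, right = np.full(N_PAIRS, 2), np.full(N_PAIRS, 18)   # start wide open
        f, T = objective(left, right), 1.0
        for _ in range(20_000):
            row = rng.integers(N_PAIRS)
            new_left, new_right = left.copy(), right.copy()
            if rng.random() < 0.5:
                new_left[row] = np.clip(new_left[row] + rng.integers(-1, 2), 0, new_right[row])
            else:
                new_right[row] = np.clip(new_right[row] + rng.integers(-1, 2), new_left[row], N_COLS)
            f_new = objective(new_left, new_right)
            if f_new < f or rng.random() < np.exp((f - f_new) / T):   # Metropolis acceptance
                left, right, f = new_left, new_right, f_new
            T *= 0.9995                                               # annealing schedule

        print("final objective:", round(float(f), 4))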

  5. SU-E-T-214: Intensity Modulated Proton Therapy (IMPT) Based On Passively Scattered Protons and Multi-Leaf Collimation: Prototype TPS and Dosimetry Study

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Parcerisa, D; Carabe-Fernandez, A [Department of Radiation Oncology, Hospital of the University of Pennsylvania, Philadelphia, PA (United States)

    2014-06-01

    Purpose. Intensity-modulated proton therapy is usually implemented with multi-field optimization of pencil-beam scanning (PBS) proton fields. However, in view of the experience with photon IMRT, proton facilities equipped with double-scattering (DS) delivery and multi-leaf collimation (MLC) could produce highly conformal dose distributions (and possibly eliminate the need for patient-specific compensators) with a clever use of their MLC field shaping, provided that an optimal inverse TPS is developed. Methods. A prototype TPS was developed in MATLAB. The dose calculation process was based on a fluence-dose algorithm on an adaptive divergent grid. A database of dose kernels was precalculated in order to allow for fast variations of the field range and modulation during optimization. The inverse planning process was based on the adaptive simulated annealing approach, with direct aperture optimization of the MLC leaves. A dosimetry study was performed on a phantom formed by three concentric semicylinders separated by 5 mm, of which the innermost and outermost were regarded as organs at risk (OARs), and the middle one as the PTV. We chose a concave target (which is not treatable with conventional DS fields) to show the potential of our technique. The optimizer was configured to minimize the mean dose to the OARs while keeping a good coverage of the target. Results. The plan produced by the prototype TPS achieved a conformity index of 1.34, with the mean doses to the OARs below 78% of the prescribed dose. This result is hardly achievable with the traditional conformal DS technique with compensators, and it compares to what can be obtained with PBS. Conclusion. It is certainly feasible to produce IMPT fields with MLC passive scattering fields. With a fully developed treatment planning system, the produced plans can be superior to traditional DS plans in terms of plan conformity and dose to organs at risk.

  6. A phenomenological relative biological effectiveness (RBE) model for proton therapy based on all published in vitro cell survival data

    International Nuclear Information System (INIS)

    Proton therapy treatments are currently planned and delivered using the assumption that the proton relative biological effectiveness (RBE) relative to photons is 1.1. This assumption ignores strong experimental evidence that suggests the RBE varies along the treatment field, i.e. with linear energy transfer (LET) and with tissue type. A recent review study collected over 70 experimental reports on proton RBE, providing a comprehensive dataset for predicting RBE for cell survival. Using this dataset we developed a model to predict proton RBE based on dose, dose-averaged LET (LETd) and the ratio of the linear-quadratic model parameters for the reference radiation, (α/β)x, as the tissue-specific parameter. The proposed RBE model is based on the linear-quadratic model and was derived from a nonlinear regression fit to 287 experimental data points. The proposed model predicts that the RBE increases with increasing LETd and decreases with increasing (α/β)x. This agrees with previous theoretical predictions on the relationship between RBE, LETd and (α/β)x. The model additionally predicts a decrease in RBE with increasing dose and shows a relationship of both α and β with LETd. Our proposed phenomenological RBE model is derived using the most comprehensive collection of proton RBE experimental data to date. Previously published phenomenological models, based on a limited data set, may have to be revised. (paper)
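
    For context, the sketch below evaluates the generic linear-quadratic form of RBE that phenomenological models of this kind build on: equating photon and proton cell survival gives RBE as a function of dose, (α/β)x and the low- and high-dose limits RBEmax and RBEmin, which are in turn taken as simple functions of LETd. The linear coefficients below are placeholders, not the fitted values reported in the paper.

        # Generic LQ-based RBE(D, LET_d, (alpha/beta)_x); coefficients are placeholders only.
        import numpy as np

        def rbe_lq(dose, let_d, alpha_beta_x, p_max=(1.0, 0.35), p_min=(1.0, -0.004)):
            rbe_max = p_max[0] + p_max[1] * let_d / alpha_beta_x           # RBE limit as dose -> 0
            rbe_min = p_min[0] + p_min[1] * np.sqrt(alpha_beta_x) * let_d  # RBE limit at high dose
            ab = alpha_beta_x
            # From equating photon and proton survival in the LQ model:
            return (np.sqrt(ab**2 + 4.0 * ab * rbe_max * dose
                            + 4.0 * (rbe_min * dose) ** 2) - ab) / (2.0 * dose)

        # Example: 2 Gy per fraction, LET_d = 2.5 keV/um, late-responding tissue with (a/b)_x = 3 Gy.
        print(round(float(rbe_lq(2.0, 2.5, 3.0)), 3))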

  7. Bragg peak prediction from quantitative proton computed tomography using different path estimates

    OpenAIRE

    Dongxu WANG; Mackie, T. Rockwell; Wolfgang A. Tomé

    2011-01-01

    This paper characterizes the performance of the straight-line path (SLP) and cubic spline path (CSP) as path estimates used in reconstruction of proton computed tomography (pCT). The GEANT4 Monte Carlo simulation toolkit is employed to simulate the imaging phantom and proton projections. SLP, CSP and the most-probable path (MPP) are constructed based on the entrance and exit information of each proton. The physical deviations of SLP, CSP and MPP from the real path are calculated. Using a cond...

  8. Monte Carlo simulation methods in moment-based scale-bridging algorithms for thermal radiative-transfer problems

    International Nuclear Information System (INIS)

    We present a moment-based acceleration algorithm applied to Monte Carlo simulation of thermal radiative-transfer problems. Our acceleration algorithm employs a continuum system of moments to accelerate convergence of stiff absorption–emission physics. The combination of energy-conserving tallies and the use of an asymptotic approximation in optically thick regions remedy the difficulties of local energy conservation and mitigation of statistical noise in such regions. We demonstrate the efficiency and accuracy of the developed method. We also compare directly to the standard linearization-based method of Fleck and Cummings [1]. A factor of 40 reduction in total computational time is achieved with the new algorithm for an equivalent (or more accurate) solution as compared with the Fleck–Cummings algorithm