WorldWideScience

Sample records for based proton monte

  1. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    Science.gov (United States)

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

    Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the majority and the most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than that of passively scattered or synchrotron-based PBS systems. This is because degraders are used to decrease the energy from the cyclotron maximum to the desired energy, resulting in a spot size, divergence, and energy spread that depend on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the FLUKA MC code using the experimental commissioning data. The code was then validated against experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water phantoms show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are smaller for the MC, with a γ(3%-3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with respect to experimental data are smaller for the MC than for the TPS, implying that the created FLUKA beam model better describes the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
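
    The energy-dependent spot parameters described above can be captured by fitting the commissioning measurements so the MC source model can look them up at any energy. A minimal sketch with hypothetical numbers (not the Provision commissioning data):

```python
import numpy as np

# Hypothetical commissioning data: spot sigma (mm) at isocenter vs beam energy (MeV).
# Lower energies traverse more degrader material, so spots are wider.
energies = np.array([70.0, 100.0, 130.0, 160.0, 190.0, 220.0])
sigmas   = np.array([6.8,   5.1,   4.2,   3.6,   3.2,   3.0])

# Fit a low-order polynomial so the MC source definition can evaluate
# the spot size at any requested energy.
coeffs = np.polyfit(energies, sigmas, deg=2)

def spot_sigma(energy_mev: float) -> float:
    """Interpolated spot sigma (mm) for the MC beam source definition."""
    return float(np.polyval(coeffs, energy_mev))

print(round(spot_sigma(145.0), 2))
```

    Analogous fits would be built for divergence and energy spread, giving the three degradation-dependent quantities the abstract names.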

  2. A GPU-accelerated and Monte Carlo-based intensity modulated proton therapy optimization system.

    Science.gov (United States)

    Ma, Jiasen; Beltran, Chris; Seum Wan Chan Tseung, Hok; Herman, Michael G

    2014-12-01

    Conventional spot scanning intensity modulated proton therapy (IMPT) treatment planning systems (TPSs) optimize proton spot weights based on analytical dose calculations. These analytical dose calculations have been shown to have severe limitations in heterogeneous materials. Monte Carlo (MC) methods do not have these limitations; however, MC-based systems have been of limited clinical use due to the large number of beam spots in IMPT and the extremely long calculation time of traditional MC techniques. In this work, the authors present a clinically applicable IMPT TPS that utilizes a very fast MC calculation. An in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC-generated influence maps, a modified least-squares optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitations imposed by the large dose influence maps that resulted from maintaining the intrinsic CT resolution. The effects of tail cutoff and starting condition were studied and minimized in this work. For relatively large and complex three-field head and neck cases, i.e., >100,000 spots with a target volume of ∼1000 cm³ and multiple surrounding critical structures, the optimization together with the initial MC dose influence map calculation was done in a clinically viable time frame (less than 30 min) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The in-house MC TPS plans were comparable to commercial TPS plans based on DVH comparisons. An MC-based treatment planning system was developed. The treatment planning can be performed in a clinically viable time frame on a hardware system costing around 45,000 dollars. The fast calculation and optimization makes the system easily expandable to robust and multi-criteria optimization.
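
    The least-squares step can be sketched as projected gradient descent on a toy dose influence matrix (random numbers standing in for the GPU MC influence maps; the authors' actual "modified least-squares" solver is not specified in the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dose influence matrix: dose to 50 voxels from 8 proton spots.
# In the paper each column would come from the GPU MC engine.
A = rng.random((50, 8))
d_presc = np.full(50, 2.0)          # prescribed dose per voxel (Gy)

# Projected gradient descent on the objective ||A w - d||^2
# with the physical constraint w >= 0 (spot weights cannot be negative).
w = np.ones(8)
step = 1.0 / np.linalg.norm(A.T @ A, 2)
for _ in range(2000):
    grad = A.T @ (A @ w - d_presc)
    w = np.maximum(w - step * grad, 0.0)   # project onto w >= 0

print(round(float(np.linalg.norm(A @ w - d_presc)), 3))
```

    A clinical objective would of course weight targets and organs at risk differently; the sketch only shows the influence-matrix formulation that makes MC dose reusable across optimization iterations.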

  3. A GPU-accelerated and Monte Carlo-based intensity modulated proton therapy optimization system

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Jiasen, E-mail: ma.jiasen@mayo.edu; Beltran, Chris; Seum Wan Chan Tseung, Hok; Herman, Michael G. [Department of Radiation Oncology, Division of Medical Physics, Mayo Clinic, 200 First Street Southwest, Rochester, Minnesota 55905 (United States)

    2014-12-15

    Purpose: Conventional spot scanning intensity modulated proton therapy (IMPT) treatment planning systems (TPSs) optimize proton spot weights based on analytical dose calculations. These analytical dose calculations have been shown to have severe limitations in heterogeneous materials. Monte Carlo (MC) methods do not have these limitations; however, MC-based systems have been of limited clinical use due to the large number of beam spots in IMPT and the extremely long calculation time of traditional MC techniques. In this work, the authors present a clinically applicable IMPT TPS that utilizes a very fast MC calculation. Methods: An in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC-generated influence maps, a modified least-squares optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitations imposed by the large dose influence maps that resulted from maintaining the intrinsic CT resolution. The effects of tail cutoff and starting condition were studied and minimized in this work. Results: For relatively large and complex three-field head and neck cases, i.e., >100 000 spots with a target volume of ∼1000 cm³ and multiple surrounding critical structures, the optimization together with the initial MC dose influence map calculation was done in a clinically viable time frame (less than 30 min) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The in-house MC TPS plans were comparable to commercial TPS plans based on DVH comparisons. Conclusions: An MC-based treatment planning system was developed. The treatment planning can be performed in a clinically viable time frame on a hardware system costing around 45 000 dollars.

  4. MO-A-BRD-10: A Fast and Accurate GPU-Based Proton Transport Monte Carlo Simulation for Validating Proton Therapy Treatment Plans

    Energy Technology Data Exchange (ETDEWEB)

    Wan Chan Tseung, H; Ma, J; Beltran, C [Mayo Clinic, Rochester, MN (United States)

    2014-06-15

    Purpose: To build a GPU-based Monte Carlo (MC) simulation of proton transport with detailed modeling of elastic and non-elastic (NE) proton–nucleus interactions, for use in a very fast and cost-effective proton therapy treatment plan verification system. Methods: Using the CUDA framework, we implemented kernels for the following tasks: (1) simulation of beam spots from our possible scanning nozzle configurations, (2) proton propagation through CT geometry, taking into account nuclear elastic and multiple scattering, as well as energy straggling, (3) Bertini-style modeling of the intranuclear cascade stage of NE interactions, and (4) simulation of nuclear evaporation. To validate our MC, we performed: (1) secondary particle yield calculations in NE collisions with therapeutically relevant nuclei, (2) pencil-beam dose calculations in homogeneous phantoms, and (3) a large number of treatment plan dose recalculations, and compared with Geant4.9.6p2/TOPAS. A workflow was devised for calculating plans from a commercially available treatment planning system, with scripts for reading DICOM files and generating inputs for our MC. Results: Yields, energy and angular distributions of secondaries from NE collisions on various nuclei are in good agreement with the Geant4.9.6p2 Bertini and Binary cascade models. The 3D gamma pass rate at 2%–2 mm for 70–230 MeV pencil-beam dose distributions in water, soft tissue, bone and Ti phantoms is 100%. The pass rate at 2%–2 mm for treatment plan calculations is typically above 98%. The net computational time on an NVIDIA GTX680 card, including all CPU–GPU data transfers, is around 20 s for 1×10⁷ proton histories. Conclusion: Our GPU-based proton transport MC is the first of its kind to include a detailed nuclear model to handle NE interactions on any nucleus. Dosimetric calculations demonstrate very good agreement with Geant4.9.6p2/TOPAS. Our MC is being integrated into a framework to perform fast routine clinical QA of pencil
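
    The gamma pass rates quoted above combine a dose tolerance with a distance-to-agreement tolerance. A minimal 1D illustration of the criterion (the paper evaluates full 3D distributions):

```python
import numpy as np

def gamma_pass_rate(ref, eval_, dx_mm, dose_tol=0.02, dist_tol_mm=2.0):
    """1D global gamma index: fraction of reference points with gamma <= 1."""
    x = np.arange(len(ref)) * dx_mm
    norm = dose_tol * ref.max()          # global dose normalization
    passed = 0
    for xi, di in zip(x, ref):
        # gamma = min over evaluated points of
        # sqrt((distance/dist_tol)^2 + (dose_diff/dose_tol)^2)
        g2 = ((x - xi) / dist_tol_mm) ** 2 + ((eval_ - di) / norm) ** 2
        if np.sqrt(g2.min()) <= 1.0:
            passed += 1
    return passed / len(ref)

# Identical distributions must pass everywhere.
profile = np.exp(-0.5 * ((np.arange(100) - 60) / 8.0) ** 2)
print(gamma_pass_rate(profile, profile.copy(), dx_mm=1.0))  # → 1.0
```

    A 1 mm spatial shift of the evaluated profile still passes at 2 mm distance-to-agreement, which is what makes gamma more forgiving than a pointwise dose difference in steep gradients.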

  5. Clinical implementation of a GPU-based simplified Monte Carlo method for a treatment planning system of proton beam therapy

    International Nuclear Information System (INIS)

    Kohno, R; Hotta, K; Nishioka, S; Matsubara, K; Tansho, R; Suzuki, T

    2011-01-01

    We implemented the simplified Monte Carlo (SMC) method on graphics processing unit (GPU) architecture under the compute unified device architecture (CUDA) platform developed by NVIDIA. The GPU-based SMC was clinically applied for four patients with head and neck, lung, or prostate cancer. The results were compared to those obtained by a traditional CPU-based SMC with respect to computation time and discrepancy. In both the CPU- and GPU-based SMC calculations, the estimated mean statistical errors of the calculated doses in the planning target volume region were within 0.5% rms. The dose distributions calculated by the GPU- and CPU-based SMCs were similar, within statistical errors. The GPU-based SMC showed 12.30–16.00 times faster performance than the CPU-based SMC. The computation time per beam arrangement using the GPU-based SMC for the clinical cases ranged from 9 to 67 s. The results demonstrate the successful application of the GPU-based SMC to clinical proton treatment planning. (note)

  6. TU-EF-304-07: Monte Carlo-Based Inverse Treatment Plan Optimization for Intensity Modulated Proton Therapy

    International Nuclear Information System (INIS)

    Li, Y; Tian, Z; Jiang, S; Jia, X; Song, T; Wu, Z; Liu, Y

    2015-01-01

    Purpose: Intensity-modulated proton therapy (IMPT) is increasingly used in proton therapy. For IMPT optimization, Monte Carlo (MC) is desired for spot dose calculations because of its high accuracy, especially in cases with a high level of heterogeneity. It is also preferred in biological optimization problems due to its capability of computing quantities related to biological effects. However, MC simulation is typically too slow to be used for this purpose. Although GPU-based MC engines have become available, the achieved efficiency is still not ideal. The purpose of this work is to develop a new optimization scheme to include GPU-based MC in IMPT. Methods: A conventional approach using MC in IMPT simply calls the MC dose engine repeatedly for each spot's dose calculation. However, this is not optimal, because of the unnecessary computation spent on spots that turn out to have very small weights after solving the optimization problem. GPU memory writing conflicts occurring at small beam sizes also reduce computational efficiency. To solve these problems, we developed a new framework that iteratively performs MC dose calculations and plan optimizations. At each dose calculation step, particles were sampled from all spots together with a Metropolis algorithm, such that the particle number per spot is proportional to the latest optimized spot intensity. Simultaneously transporting particles from multiple spots also mitigated the memory writing conflict problem. Results: We validated the proposed MC-based optimization scheme on one prostate case. The total computation time of our method was ∼5–6 min on one NVIDIA GPU card, including both spot dose calculation and plan optimization, whereas a conventional method naively using the same GPU-based MC engine was ∼3 times slower. Conclusion: A fast GPU-based MC dose calculation method along with a novel optimization workflow is developed. The high efficiency makes it attractive for clinical use.
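
    The Metropolis sampling of spots can be illustrated with a toy chain whose stationary distribution is proportional to the spot weights (hypothetical weights; the paper does not detail its proposal kernel, so a symmetric uniform proposal is assumed here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Latest optimized spot intensities (hypothetical values; one spot has weight 0).
weights = np.array([0.1, 2.0, 0.5, 4.0, 0.0, 1.4])

def metropolis_spot_samples(n, weights, rng):
    """Draw spot indices whose long-run frequency is proportional to spot weight."""
    idx = int(np.argmax(weights))          # start from the heaviest spot
    out = np.empty(n, dtype=int)
    for k in range(n):
        prop = rng.integers(len(weights))  # symmetric proposal: any spot
        # Accept with probability min(1, w_proposed / w_current).
        if rng.random() < weights[prop] / weights[idx]:
            idx = prop
        out[k] = idx
    return out

samples = metropolis_spot_samples(200_000, weights, rng)
freq = np.bincount(samples, minlength=len(weights)) / len(samples)
print(np.round(freq, 2))
```

    Spots with negligible weight receive (almost) no particles, which is exactly the wasted computation the iterative scheme avoids.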

  7. TU-EF-304-07: Monte Carlo-Based Inverse Treatment Plan Optimization for Intensity Modulated Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Y [Tsinghua University, Beijing, Beijing (China); UT Southwestern Medical Center, Dallas, TX (United States); Tian, Z; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Song, T [Southern Medical University, Guangzhou, Guangdong (China); UT Southwestern Medical Center, Dallas, TX (United States); Wu, Z; Liu, Y [Tsinghua University, Beijing, Beijing (China)

    2015-06-15

    Purpose: Intensity-modulated proton therapy (IMPT) is increasingly used in proton therapy. For IMPT optimization, Monte Carlo (MC) is desired for spot dose calculations because of its high accuracy, especially in cases with a high level of heterogeneity. It is also preferred in biological optimization problems due to its capability of computing quantities related to biological effects. However, MC simulation is typically too slow to be used for this purpose. Although GPU-based MC engines have become available, the achieved efficiency is still not ideal. The purpose of this work is to develop a new optimization scheme to include GPU-based MC in IMPT. Methods: A conventional approach using MC in IMPT simply calls the MC dose engine repeatedly for each spot's dose calculation. However, this is not optimal, because of the unnecessary computation spent on spots that turn out to have very small weights after solving the optimization problem. GPU memory writing conflicts occurring at small beam sizes also reduce computational efficiency. To solve these problems, we developed a new framework that iteratively performs MC dose calculations and plan optimizations. At each dose calculation step, particles were sampled from all spots together with a Metropolis algorithm, such that the particle number per spot is proportional to the latest optimized spot intensity. Simultaneously transporting particles from multiple spots also mitigated the memory writing conflict problem. Results: We validated the proposed MC-based optimization scheme on one prostate case. The total computation time of our method was ∼5–6 min on one NVIDIA GPU card, including both spot dose calculation and plan optimization, whereas a conventional method naively using the same GPU-based MC engine was ∼3 times slower. Conclusion: A fast GPU-based MC dose calculation method along with a novel optimization workflow is developed. The high efficiency makes it attractive for clinical use.

  8. SU-F-T-193: Evaluation of a GPU-Based Fast Monte Carlo Code for Proton Therapy Biological Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Taleei, R; Qin, N; Jiang, S [UT Southwestern Medical Center, Dallas, TX (United States); Peeler, C [UT MD Anderson Cancer Center, Houston, TX (United States); Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2016-06-15

    Purpose: Biological treatment plan optimization is of great interest for proton therapy. It requires extensive Monte Carlo (MC) simulations to compute physical dose and biological quantities. Recently, the gPMC package was developed for rapid MC dose calculations on a GPU platform. This work investigated its suitability for proton therapy biological optimization in terms of accuracy and efficiency. Methods: We performed simulations of a proton pencil beam with energies of 75, 150 and 225 MeV in a homogeneous water phantom using gPMC and FLUKA. Physical dose and energy spectra for each ion type on the central beam axis were scored. Relative biological effectiveness (RBE) was calculated using the repair-misrepair-fixation (RMF) model. Microdosimetry calculations were performed using the Monte Carlo Damage Simulation (MCDS) code. Results: Ranges computed by the two codes agreed within 1 mm. The physical dose difference was less than 2.5% at the Bragg peak. RBE-weighted dose agreed within 5% at the Bragg peak. Differences in microdosimetric quantities such as dose-averaged lineal energy and specific energy were <10%. The simulation time per source particle with FLUKA was 0.0018 s, while gPMC was ∼600 times faster. Conclusion: Physical doses computed by FLUKA and gPMC were in good agreement. The RBE differences along the central axis were small, and the RBE-weighted dose difference was found to be acceptable. The combined accuracy and efficiency makes gPMC suitable for proton therapy biological optimization.

  9. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed such optimized lists, but those studies were performed with a simple setup, such as a water phantom alone. Since particle beams undergo transport, nuclear interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influence of the customized parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed different characteristics from the results obtained with the simple setup. This leads to the conclusion that the physical models, particle transport mechanics and geometry-based descriptions need accurate customization when planning computational experiments, to ensure artifact-free MC simulation.
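
    Comparisons like these typically reduce each PDD to a scalar such as the depth of the distal 80% dose level. A sketch of that extraction on a synthetic Bragg-like curve (illustrative only, not GATE/PHITS/FLUKA output):

```python
import numpy as np

def distal_r80(depth_mm, pdd):
    """Depth of the distal 80% dose level, by linear interpolation
    on the falling edge beyond the Bragg peak."""
    peak = int(np.argmax(pdd))
    level = 0.8 * pdd[peak]
    for i in range(peak, len(pdd) - 1):
        if pdd[i] >= level > pdd[i + 1]:
            frac = (pdd[i] - level) / (pdd[i] - pdd[i + 1])
            return depth_mm[i] + frac * (depth_mm[i + 1] - depth_mm[i])
    raise ValueError("80% level not crossed distal to the peak")

# Toy Bragg-like curve: plateau plus a peak with a sharp distal fall-off near 150 mm.
z = np.linspace(0.0, 160.0, 801)
pdd = (0.3 + 0.7 * np.exp(-0.5 * ((z - 150.0) / 2.5) ** 2)) * (z < 156.0)
print(round(distal_r80(z, pdd), 1))
```

    The range difference between two codes is then simply the difference of their R80 values on the same depth grid.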

  10. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-01-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed such optimized lists, but those studies were performed with a simple setup, such as a water phantom alone. Since particle beams undergo transport, nuclear interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influence of the customized parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed different characteristics from the results obtained with the simple setup. This leads to the conclusion that the physical models, particle transport mechanics and geometry-based descriptions need accurate customization when planning computational experiments, to ensure artifact-free MC simulation.

  11. SU-E-T-673: Recent Developments and Comprehensive Validations of a GPU-Based Proton Monte Carlo Simulation Package, GPMC

    International Nuclear Information System (INIS)

    Qin, N; Tian, Z; Pompos, A; Jiang, S; Jia, X; Giantsoudi, D; Schuemann, J; Paganetti, H

    2015-01-01

    Purpose: A GPU-based Monte Carlo (MC) simulation package, gPMC, has been previously developed, achieving high computational efficiency. This abstract reports our recent improvements to the package in terms of accuracy, functionality, and code portability. Methods: In the latest version of gPMC, the nuclear interaction cross section database was updated to include data from TOPAS/Geant4. The inelastic interaction model, particularly the proton scattering angle distribution, was updated to improve overall simulation accuracy. Calculation of dose-averaged LET (LETd) was implemented. gPMC was ported onto an OpenCL environment to enable portability across different computing devices (GPUs from different vendors as well as CPUs). We also performed comprehensive tests of the code's accuracy. Dose from the electromagnetic (EM) interaction channel, primary and secondary proton doses, and fluences were scored and compared with those computed by TOPAS. Results: In a homogeneous water phantom with 100 and 200 MeV beams, the mean dose differences in the EM channel computed by gPMC and by TOPAS were 0.28% and 0.65% of the corresponding maximum dose, respectively. With the Geant4 nuclear interaction cross section data, the mean difference in primary proton dose was 0.84% for the 200 MeV case and 0.78% for the 100 MeV case. After updating the inelastic interaction model, the maximum differences in secondary proton fluence and dose were 0.08% and 0.5% for the 200 MeV beam, and 0.04% and 0.2% for the 100 MeV beam. In a test case with a 150 MeV proton beam, the mean difference between LETd computed by gPMC and by TOPAS was 0.96% within the proton range. With the OpenCL implementation, gPMC is executable on AMD and Nvidia GPUs, as well as on Intel CPUs in single or multiple threads. Results on the different platforms agreed within statistical uncertainty. Conclusion: Several improvements have been implemented in the latest version of gPMC, enhancing its accuracy, functionality, and code portability.
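
    Dose-averaged LET scoring, as added to gPMC, amounts to a dose-weighted mean over the energy deposition events in a voxel. A minimal per-voxel sketch:

```python
import numpy as np

def let_d(dose_deposits, lets):
    """Dose-averaged LET in one voxel: sum(d_i * LET_i) / sum(d_i),
    accumulated over the energy deposition events scored in that voxel."""
    dose_deposits = np.asarray(dose_deposits, dtype=float)
    lets = np.asarray(lets, dtype=float)
    total = dose_deposits.sum()
    if total == 0:
        return 0.0
    return float((dose_deposits * lets).sum() / total)

# Two events: 1 Gy at 2 keV/um and 3 Gy at 6 keV/um → LETd = (2 + 18) / 4 = 5 keV/um.
print(let_d([1.0, 3.0], [2.0, 6.0]))  # → 5.0
```

    In a real engine the two sums are accumulated on the fly per voxel during transport and divided once at the end of the run.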

  12. SU-E-T-673: Recent Developments and Comprehensive Validations of a GPU-Based Proton Monte Carlo Simulation Package, GPMC

    Energy Technology Data Exchange (ETDEWEB)

    Qin, N; Tian, Z; Pompos, A; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Giantsoudi, D; Schuemann, J; Paganetti, H [Massachusetts General Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: A GPU-based Monte Carlo (MC) simulation package, gPMC, has been previously developed, achieving high computational efficiency. This abstract reports our recent improvements to the package in terms of accuracy, functionality, and code portability. Methods: In the latest version of gPMC, the nuclear interaction cross section database was updated to include data from TOPAS/Geant4. The inelastic interaction model, particularly the proton scattering angle distribution, was updated to improve overall simulation accuracy. Calculation of dose-averaged LET (LETd) was implemented. gPMC was ported onto an OpenCL environment to enable portability across different computing devices (GPUs from different vendors as well as CPUs). We also performed comprehensive tests of the code's accuracy. Dose from the electromagnetic (EM) interaction channel, primary and secondary proton doses, and fluences were scored and compared with those computed by TOPAS. Results: In a homogeneous water phantom with 100 and 200 MeV beams, the mean dose differences in the EM channel computed by gPMC and by TOPAS were 0.28% and 0.65% of the corresponding maximum dose, respectively. With the Geant4 nuclear interaction cross section data, the mean difference in primary proton dose was 0.84% for the 200 MeV case and 0.78% for the 100 MeV case. After updating the inelastic interaction model, the maximum differences in secondary proton fluence and dose were 0.08% and 0.5% for the 200 MeV beam, and 0.04% and 0.2% for the 100 MeV beam. In a test case with a 150 MeV proton beam, the mean difference between LETd computed by gPMC and by TOPAS was 0.96% within the proton range. With the OpenCL implementation, gPMC is executable on AMD and Nvidia GPUs, as well as on Intel CPUs in single or multiple threads. Results on the different platforms agreed within statistical uncertainty. Conclusion: Several improvements have been implemented in the latest version of gPMC, enhancing its accuracy, functionality, and code portability.

  13. Experimental validation of a Monte Carlo proton therapy nozzle model incorporating magnetically steered protons

    International Nuclear Information System (INIS)

    Peterson, S W; Polf, J; Archambault, L; Beddar, S; Bues, M; Ciangaru, G; Smith, A

    2009-01-01

    The purpose of this study is to validate the accuracy of a Monte Carlo model, developed using the Geant4 toolkit, of a proton magnetic beam scanning delivery nozzle. The Monte Carlo model was used to produce depth dose and lateral profiles, which were compared to data measured in the clinical scanning treatment nozzle at several energies. Comparisons were also made between measured and simulated off-axis profiles to test the accuracy of the model's magnetic steering. The 80% distal dose fall-off values of the measured and simulated depth dose profiles agreed to within 1 mm for the beam energies evaluated. The full width at half maximum values of the measured and simulated lateral fluence profiles agreed to within 1.3 mm for all energies. Measured and simulated spot positions for the magnetically steered beams agreed to within 0.7 mm. Based on these results, we found that the Geant4 Monte Carlo model of the beam scanning nozzle can accurately predict depth dose profiles, lateral profiles perpendicular to the beam axis, and magnetic steering of a proton beam during beam scanning proton therapy.
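
    The full width at half maximum agreement quoted above is computed per profile; a sketch of the FWHM extraction by linear interpolation on a synthetic Gaussian spot profile (illustrative data, not the nozzle measurements):

```python
import numpy as np

def fwhm(x, profile):
    """Full width at half maximum, interpolating both half-maximum crossings."""
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    i0, i1 = above[0], above[-1]

    def cross(a, b):
        # Linear interpolation of the half-maximum crossing between samples a and b.
        return x[a] + (half - profile[a]) * (x[b] - x[a]) / (profile[b] - profile[a])

    left = cross(i0 - 1, i0) if i0 > 0 else x[i0]
    right = cross(i1 + 1, i1) if i1 < len(x) - 1 else x[i1]
    return right - left

# Gaussian lateral spot profile with sigma = 4 mm → FWHM ≈ 2.355 * 4 ≈ 9.42 mm.
x = np.linspace(-30.0, 30.0, 1201)
prof = np.exp(-0.5 * (x / 4.0) ** 2)
print(round(fwhm(x, prof), 2))
```

    The measured-vs-simulated comparison is then just the difference of the two FWHM values per energy.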

  14. TH-A-19A-12: A GPU-Accelerated and Monte Carlo-Based Intensity Modulated Proton Therapy Optimization System

    Energy Technology Data Exchange (ETDEWEB)

    Ma, J; Wan Chan Tseung, H; Beltran, C [Mayo Clinic, Rochester, MN (United States)

    2014-06-15

    Purpose: To develop a clinically applicable intensity modulated proton therapy (IMPT) optimization system that utilizes more accurate Monte Carlo (MC) dose calculation rather than analytical dose calculation. Methods: A very fast in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC-generated influence maps, a modified gradient-based optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve the spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitations imposed by the large dose influence maps that result from maintaining the intrinsic CT resolution and the large number of proton spots. The dose effects were studied, particularly in cases with heterogeneous materials, in comparison with the commercial treatment planning system (TPS). Results: For a relatively large and complex three-field bilateral head and neck case (i.e., >100K spots with a target volume of ∼1000 cc and multiple surrounding critical structures), the optimization together with the initial MC dose influence map calculation can be done in a clinically viable time frame (i.e., less than 15 minutes) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The DVHs of the MC TPS plan compare favorably with those of a commercial treatment planning system. Conclusion: A GPU-accelerated and MC-based IMPT optimization system was developed. The dose calculation and plan optimization can be performed in less than 15 minutes on a hardware system costing less than 45,000 dollars. The fast calculation and optimization makes the system easily expandable to robust and multi-criteria optimization. This work was funded in part by a grant from Varian Medical Systems, Inc.

  15. TH-A-19A-12: A GPU-Accelerated and Monte Carlo-Based Intensity Modulated Proton Therapy Optimization System

    International Nuclear Information System (INIS)

    Ma, J; Wan Chan Tseung, H; Beltran, C

    2014-01-01

    Purpose: To develop a clinically applicable intensity modulated proton therapy (IMPT) optimization system that utilizes more accurate Monte Carlo (MC) dose calculation rather than analytical dose calculation. Methods: A very fast in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC-generated influence maps, a modified gradient-based optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve the spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitations imposed by the large dose influence maps that result from maintaining the intrinsic CT resolution and the large number of proton spots. The dose effects were studied, particularly in cases with heterogeneous materials, in comparison with the commercial treatment planning system (TPS). Results: For a relatively large and complex three-field bilateral head and neck case (i.e., >100K spots with a target volume of ∼1000 cc and multiple surrounding critical structures), the optimization together with the initial MC dose influence map calculation can be done in a clinically viable time frame (i.e., less than 15 minutes) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The DVHs of the MC TPS plan compare favorably with those of a commercial treatment planning system. Conclusion: A GPU-accelerated and MC-based IMPT optimization system was developed. The dose calculation and plan optimization can be performed in less than 15 minutes on a hardware system costing less than 45,000 dollars. The fast calculation and optimization makes the system easily expandable to robust and multi-criteria optimization. This work was funded in part by a grant from Varian Medical Systems, Inc.

  16. Proton therapy analysis using the Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Noshad, Houshyar [Center for Theoretical Physics and Mathematics, AEOI, P.O. Box 14155-1339, Tehran (Iran, Islamic Republic of)]. E-mail: hnoshad@aeoi.org.ir; Givechi, Nasim [Islamic Azad University, Science and Research Branch, Tehran (Iran, Islamic Republic of)

    2005-10-01

The range and straggling data obtained from the transport of ions in matter (TRIM) computer program were used to determine the trajectories of monoenergetic 60 MeV protons in muscle tissue by using the Monte Carlo technique. The appropriate profile for the shape of a proton pencil beam in proton therapy, as well as the dose deposited in the tissue, were computed. The good agreement between our results and the corresponding experimental values is presented here to show the reliability of our Monte Carlo method.
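The sampling idea — draw each proton's stopping depth from range and straggling data of the kind TRIM provides — can be illustrated with a short sketch. The mean range and straggling width below are illustrative placeholders, not TRIM output:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative values only: the paper takes range/straggling for 60 MeV protons
# in muscle from TRIM; a ~31 mm mean range with ~1% straggling is assumed here.
mean_range_mm, sigma_mm = 31.0, 0.31
n_protons = 100_000

# One sample per history: the depth at which the proton stops.
stop_depth = rng.normal(mean_range_mm, sigma_mm, n_protons)

# The histogram of stopping depths traces the distal fall-off of the Bragg
# peak (energy deposited along the track is ignored in this sketch).
hist, edges = np.histogram(stop_depth, bins=100, range=(28.0, 34.0))
peak_depth = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
print(round(float(peak_depth), 1))  # close to the assumed 31 mm range
```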

  17. Neutron shielding calculations in a proton therapy facility based on Monte Carlo simulations and analytical models: Criterion for selecting the method of choice

    International Nuclear Information System (INIS)

    Titt, U.; Newhauser, W. D.

    2005-01-01

Proton therapy facilities are shielded to limit the amount of secondary radiation to which patients, occupational workers and members of the general public are exposed. The most commonly applied shielding design methods for proton therapy facilities comprise semi-empirical and analytical methods to estimate the neutron dose equivalent. This study compares the results of these methods with a detailed simulation of a proton therapy facility by using the Monte Carlo technique. A comparison of neutron dose equivalent values predicted by the various methods reveals the superior accuracy of the Monte Carlo predictions in locations where the calculations converge. However, the reliability of the overall shielding design increases if simulation results for which solutions have not converged, e.g. owing to too few particle histories, can be excluded, and deterministic models are used at these locations. Criteria to accept or reject Monte Carlo calculations in such complex structures are not well understood. An optimum rejection criterion would allow all converging solutions of the Monte Carlo simulation to be taken into account and reject all solutions with uncertainties larger than the design safety margins. In this study, an optimum rejection criterion of 10% was found. The mean ratio was 26; 62% of all receptor locations showed a ratio between 0.9 and 10, and 92% were between 1 and 100. (authors)
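The rejection criterion amounts to a per-location selection rule: use the Monte Carlo estimate where its statistical uncertainty is acceptable, fall back to the analytical model elsewhere. A minimal sketch (the function name, units and values are assumptions, not the study's data):

```python
import numpy as np

REJECTION_CRITERION = 0.10  # accept MC only below 10% relative uncertainty

def select_dose(mc_dose, mc_rel_unc, analytical_dose):
    """Per receptor location: converged MC estimate, else analytical model."""
    use_mc = np.asarray(mc_rel_unc) < REJECTION_CRITERION
    return np.where(use_mc, mc_dose, analytical_dose), use_mc

mc = [1.2, 0.8, 2.5]          # MC neutron dose equivalents (made-up, mSv/h)
unc = [0.03, 0.40, 0.08]      # relative 1-sigma uncertainties per location
analytical = [1.5, 1.0, 3.0]  # semi-empirical model at the same locations

dose, use_mc = select_dose(mc, unc, analytical)
print(dose.tolist())  # → [1.2, 1.0, 2.5]
```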

  18. A Fano cavity test for Monte Carlo proton transport algorithms

    International Nuclear Information System (INIS)

    Sterpin, Edmond; Sorriaux, Jefferson; Souris, Kevin; Vynckier, Stefaan; Bouchard, Hugo

    2014-01-01

Purpose: In the scope of reference dosimetry of radiotherapy beams, Monte Carlo (MC) simulations are widely used to compute ionization chamber dose response accurately. Uncertainties related to the transport algorithm can be verified performing self-consistency tests, i.e., the so-called “Fano cavity test.” The Fano cavity test is based on the Fano theorem, which states that under charged particle equilibrium conditions, the charged particle fluence is independent of the mass density of the media as long as the cross-sections are uniform. Such tests have not been performed yet for MC codes simulating proton transport. The objectives of this study are to design a new Fano cavity test for proton MC and to implement the methodology in two MC codes: Geant4 and PENELOPE extended to protons (PENH). Methods: The new Fano test is designed to evaluate the accuracy of proton transport. Virtual particles with an energy of E₀ and a mass macroscopic cross section of Σ/ρ are transported, having the ability to generate protons with kinetic energy E₀ and to be restored after each interaction, thus providing proton equilibrium. To perform the test, the authors use a simplified simulation model and rigorously demonstrate that the computed cavity dose per incident fluence must equal ΣE₀/ρ, as expected in classic Fano tests. The implementation of the test is performed in Geant4 and PENH. The geometry used for testing is a 10 × 10 cm² parallel virtual field and a cavity (2 × 2 × 0.2 cm³ in size) in a water phantom with dimensions large enough to ensure proton equilibrium. Results: For conservative user-defined simulation parameters (leading to small step sizes), both Geant4 and PENH pass the Fano cavity test within 0.1%. However, differences of 0.6% and 0.7% were observed for PENH and Geant4, respectively, using larger step sizes. For PENH, the difference is attributed to the random-hinge method that introduces an artificial energy straggling if step size is not
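The pass criterion can be restated compactly. Each virtual particle generates, per unit path length, Σ protons of kinetic energy E₀ and is itself restored after every interaction, so under the resulting proton equilibrium all released energy is absorbed locally and the cavity dose per incident fluence must be

```latex
\frac{D}{\Phi} \;=\; \frac{\Sigma\, E_0}{\rho}
```

This is a one-line sketch of the expectation stated in the abstract, not the authors' rigorous demonstration; it is the value both codes are held to within 0.1%.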

  19. Clinically Applicable Monte Carlo–based Biological Dose Optimization for the Treatment of Head and Neck Cancers With Spot-Scanning Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Wan Chan Tseung, Hok Seum, E-mail: wanchantseung.hok@mayo.edu; Ma, Jiasen; Kreofsky, Cole R.; Ma, Daniel J.; Beltran, Chris

    2016-08-01

Purpose: Our aim is to demonstrate the feasibility of fast Monte Carlo (MC)–based inverse biological planning for the treatment of head and neck tumors in spot-scanning proton therapy. Methods and Materials: Recently, a fast and accurate graphics processor unit (GPU)–based MC simulation of proton transport was developed and used as the dose-calculation engine in a GPU-accelerated intensity modulated proton therapy (IMPT) optimizer. Besides dose, the MC can simultaneously score the dose-averaged linear energy transfer (LET_d), which makes biological dose (BD) optimization possible. To convert from LET_d to BD, a simple linear relation was assumed. By use of this novel optimizer, inverse biological planning was applied to 4 patients, including 2 small and 1 large thyroid tumor targets, as well as 1 glioma case. To create these plans, constraints were placed to maintain the physical dose (PD) within 1.25 times the prescription while maximizing target BD. For comparison, conventional intensity modulated radiation therapy (IMRT) and IMPT plans were also created using Eclipse (Varian Medical Systems) in each case. The same critical-structure PD constraints were used for the IMRT, IMPT, and biologically optimized plans. The BD distributions for the IMPT plans were obtained through MC recalculations. Results: Compared with standard IMPT, the biologically optimal plans for patients with small tumor targets displayed a BD escalation that was around twice the PD increase. Dose sparing to critical structures was improved compared with both IMRT and IMPT. No significant BD increase could be achieved for the large thyroid tumor case and when the presence of critical structures mitigated the contribution of additional fields. The calculation of the biologically optimized plans can be completed in a clinically viable time (<30 minutes) on a small 24-GPU system. Conclusions: By exploiting GPU acceleration, MC-based, biologically optimized plans were created for
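A simple linear LET_d weighting of the form BD = PD · (1 + c · LET_d) is consistent with the relation assumed in the abstract. A minimal per-voxel sketch (the coefficient c and all voxel values are illustrative; the authors' constant is not given here):

```python
import numpy as np

c = 0.04  # 1/(keV/um); illustrative coefficient, NOT the authors' value

def biological_dose(physical_dose, let_d):
    """Linear LET_d weighting: BD = PD * (1 + c * LET_d), voxel by voxel."""
    return np.asarray(physical_dose) * (1.0 + c * np.asarray(let_d))

pd_gy = np.array([2.0, 2.0, 2.0])   # physical dose per voxel (Gy)
letd = np.array([1.0, 3.0, 8.0])    # dose-averaged LET per voxel (keV/um)

bd = biological_dose(pd_gy, letd)
print(np.round(bd, 2).tolist())  # → [2.08, 2.24, 2.64]
```

The point of scoring LET_d in the same MC run is that both inputs to this relation come out of a single simulation, so the optimizer can trade physical dose against LET_d directly.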

  20. TH-CD-202-05: DECT Based Tissue Segmentation as Input to Monte Carlo Simulations for Proton Treatment Verification Using PET Imaging

    International Nuclear Information System (INIS)

    Berndt, B; Wuerl, M; Dedes, G; Landry, G; Parodi, K; Tessonnier, T; Schwarz, F; Kamp, F; Thieke, C; Belka, C; Reiser, M; Sommer, W; Bauer, J; Verhaegen, F

    2016-01-01

Purpose: To improve agreement of predicted and measured positron emitter yields in patients, after proton irradiation for PET-based treatment verification, using a novel dual energy CT (DECT) tissue segmentation approach, overcoming known deficiencies from single energy CT (SECT). Methods: DECT head scans of 5 trauma patients were segmented and compared to existing decomposition methods with a first focus on the brain. For validation purposes, three brain equivalent solutions [water, white matter (WM) and grey matter (GM) – equivalent with respect to their reference carbon and oxygen contents and CT numbers at 90 kVp and 150 kVp] were prepared from water, ethanol, sucrose and salt. The activities of all brain solutions, measured during a PET scan after uniform proton irradiation, were compared to Monte Carlo simulations. Simulation inputs were various solution compositions obtained from different segmentation approaches from DECT, SECT scans, and known reference composition. Virtual GM solution salt concentration corrections were applied based on DECT measurements of solutions with varying salt concentration. Results: The novel tissue segmentation showed qualitative improvements in %C for patient brain scans (ground truth unavailable). The activity simulations based on reference solution compositions agree with the measurement within 3–5% (4–8 Bq/ml). These reference simulations showed an absolute activity difference between WM (20%C) and GM (10%C) to H2O (0%C) of 43 Bq/ml and 22 Bq/ml, respectively. Activity differences between reference simulations and segmented ones varied from −6 to 1 Bq/ml for DECT and −79 to 8 Bq/ml for SECT. Conclusion: Compared to the conventionally used SECT segmentation, the DECT based segmentation indicates a qualitative and quantitative improvement. In controlled solutions, a MC input based on DECT segmentation leads to better agreement with the reference. Future work will address the anticipated improvement of quantification
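The underlying DECT idea — two CT numbers per voxel constrain a two-material mixture, from which an elemental composition such as %C follows — can be sketched as a small constrained least-squares problem. All HU values below are invented for illustration; they are not the paper's calibration data:

```python
import numpy as np

# Rows: CT number of the voxel model at 90 kVp and 150 kVp; columns: basis
# materials (water, a "GM-equivalent" solution). Invented calibration values.
A = np.array([[0.0, 38.0],
              [0.0, 34.0],
              [1.0, 1.0]])     # last row: volume fractions must sum to 1
b = np.array([19.0, 17.0, 1.0])  # measured voxel HU at both energies, plus 1

frac, *_ = np.linalg.lstsq(A, b, rcond=None)
print([round(float(f), 2) for f in frac])  # → [0.5, 0.5]
```

With the basis fractions in hand, the voxel's carbon content is the fraction-weighted mix of the basis compositions, which is what the Monte Carlo simulation then consumes.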

  1. TH-CD-202-05: DECT Based Tissue Segmentation as Input to Monte Carlo Simulations for Proton Treatment Verification Using PET Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Berndt, B; Wuerl, M; Dedes, G; Landry, G; Parodi, K [Ludwig-Maximilians-Universitaet Muenchen, Garching, DE (Germany); Tessonnier, T [Ludwig-Maximilians-Universitaet Muenchen, Garching, DE (Germany); Universitaetsklinikum Heidelberg, Heidelberg, DE (Germany); Schwarz, F; Kamp, F; Thieke, C; Belka, C; Reiser, M; Sommer, W [LMU Munich, Munich, DE (Germany); Bauer, J [Universitaetsklinikum Heidelberg, Heidelberg, DE (Germany); Heidelberg Ion-Beam Therapy Center, Heidelberg, DE (Germany); Verhaegen, F [Maastro Clinic, Maastricht (Netherlands)

    2016-06-15

Purpose: To improve agreement of predicted and measured positron emitter yields in patients, after proton irradiation for PET-based treatment verification, using a novel dual energy CT (DECT) tissue segmentation approach, overcoming known deficiencies from single energy CT (SECT). Methods: DECT head scans of 5 trauma patients were segmented and compared to existing decomposition methods with a first focus on the brain. For validation purposes, three brain equivalent solutions [water, white matter (WM) and grey matter (GM) – equivalent with respect to their reference carbon and oxygen contents and CT numbers at 90 kVp and 150 kVp] were prepared from water, ethanol, sucrose and salt. The activities of all brain solutions, measured during a PET scan after uniform proton irradiation, were compared to Monte Carlo simulations. Simulation inputs were various solution compositions obtained from different segmentation approaches from DECT, SECT scans, and known reference composition. Virtual GM solution salt concentration corrections were applied based on DECT measurements of solutions with varying salt concentration. Results: The novel tissue segmentation showed qualitative improvements in %C for patient brain scans (ground truth unavailable). The activity simulations based on reference solution compositions agree with the measurement within 3–5% (4–8 Bq/ml). These reference simulations showed an absolute activity difference between WM (20%C) and GM (10%C) to H2O (0%C) of 43 Bq/ml and 22 Bq/ml, respectively. Activity differences between reference simulations and segmented ones varied from −6 to 1 Bq/ml for DECT and −79 to 8 Bq/ml for SECT. Conclusion: Compared to the conventionally used SECT segmentation, the DECT based segmentation indicates a qualitative and quantitative improvement. In controlled solutions, a MC input based on DECT segmentation leads to better agreement with the reference. Future work will address the anticipated improvement of quantification

  2. Proton therapy Monte Carlo SRNA-VOX code

    Directory of Open Access Journals (Sweden)

    Ilić Radovan D.

    2012-01-01

The most powerful feature of the Monte Carlo method is the possibility of simulating all individual particle interactions in three dimensions and performing numerical experiments with a preset error. These facts were the motivation behind the development of a general-purpose Monte Carlo SRNA program for proton transport simulation in technical systems described by standard geometrical forms (plane, sphere, cone, cylinder, cube). Some of the possible applications of the SRNA program are: (a) a general code for proton transport modeling, (b) design of accelerator-driven systems, (c) simulation of proton scattering and degrading shapes and composition, (d) research on proton detectors; and (e) radiation protection at accelerator installations. This wide range of possible applications of the program demands the development of various versions of SRNA-VOX codes for proton transport modeling in voxelized geometries and has, finally, resulted in the ISTAR package for the calculation of deposited energy distribution in patients on the basis of CT data in radiotherapy. All of the said codes are capable of using 3-D proton sources with an arbitrary energy spectrum in an interval of 100 keV to 250 MeV.

  3. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiology, Osaka University Hospital, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, St. Jude Children’s Research Hospital, Memphis, TN 38105 (United States)

    2016-01-15

Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the inherent choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA that was established for the uniform scanning proton beam needs to be evaluated. This means that the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation

  4. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the inherent choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA that was established for the uniform scanning proton beam needs to be evaluated. This means that the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes by using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation technique

  5. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    Energy Technology Data Exchange (ETDEWEB)

Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)

    2014-06-01

Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; they are studied here for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show a good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDD results obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physics model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with the experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health
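The quoted ranges are R90 values, i.e. the depth on the distal edge of the PDD where the dose falls to 90% of its maximum. A sketch of extracting R90 from a sampled PDD by linear interpolation (the curve here is a synthetic Gaussian stand-in for a Bragg peak, not measured or simulated data):

```python
import numpy as np

depth = np.linspace(0.0, 300.0, 601)                 # mm, 0.5 mm grid
pdd = np.exp(-0.5 * ((depth - 265.0) / 5.0) ** 2)    # toy "Bragg peak"
pdd = 100.0 * pdd / pdd.max()                        # normalize to 100%

def r90(depth, pdd):
    """Distal depth where the PDD crosses 90%, by linear interpolation."""
    i_peak = int(np.argmax(pdd))
    distal_d, distal_p = depth[i_peak:], pdd[i_peak:]
    j = int(np.argmax(distal_p < 90.0))    # first distal sample below 90%
    x0, x1 = distal_d[j - 1], distal_d[j]
    y0, y1 = distal_p[j - 1], distal_p[j]
    return float(x0 + (90.0 - y0) * (x1 - x0) / (y1 - y0))

print(round(r90(depth, pdd), 2))  # → 267.28
```

Sub-millimetre differences in R90 between codes, as in the abstract, are exactly what such an interpolation makes visible on a 0.5 mm dose grid.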

  6. Clinical implementation of full Monte Carlo dose calculation in proton beam therapy

    International Nuclear Information System (INIS)

    Paganetti, Harald; Jiang, Hongyu; Parodi, Katia; Slopsema, Roelf; Engelsman, Martijn

    2008-01-01

The goal of this work was to facilitate the clinical use of Monte Carlo proton dose calculation to support routine treatment planning and delivery. The Monte Carlo code Geant4 was used to simulate the treatment head setup, including a time-dependent simulation of modulator wheels (for broad beam modulation) and magnetic field settings (for beam scanning). Any patient-field-specific setup can be modeled according to the treatment control system of the facility. The code was benchmarked against phantom measurements. Using a simulation of the ionization chamber reading in the treatment head allows the Monte Carlo dose to be specified in absolute units (Gy per ionization chamber reading). Next, the capability of reading CT data information was implemented into the Monte Carlo code to model patient anatomy. To allow time-efficient dose calculation, the standard Geant4 tracking algorithm was modified. Finally, a software link of the Monte Carlo dose engine to the patient database and the commercial planning system was established to allow data exchange, thus completing the implementation of the proton Monte Carlo dose calculation engine ('DoC++'). Monte Carlo re-calculated plans are a valuable tool to revisit decisions in the planning process. Identification of clinically significant differences between Monte Carlo and pencil-beam-based dose calculations may also drive improvements of current pencil-beam methods. As an example, four patients (29 fields in total) with tumors in the head and neck regions were analyzed. Differences between the pencil-beam algorithm and Monte Carlo were identified in particular near the end of range, both due to dose degradation and overall differences in range prediction due to bony anatomy in the beam path. Further, the Monte Carlo reports dose-to-tissue as compared to dose-to-water by the planning system. Our implementation is tailored to a specific Monte Carlo code and the treatment planning system XiO (Computerized Medical Systems Inc

  7. A Monte Carlo track structure code for low energy protons

    CERN Document Server

    Endo, S; Nikjoo, H; Uehara, S; Hoshi, M; Ishikawa, M; Shizuma, K

    2002-01-01

A code is described for the simulation of proton (100 eV to 10 MeV) track structure in water vapor. The code simulates, interaction by interaction, the transport of primary ions and secondary electrons in the form of ionizations and excitations. When a low velocity ion collides with the atoms or molecules of a target, the ion may also capture or lose electrons. The probabilities of these processes are described by cross-sections. Although proton track simulation at energies above the Bragg peak (>0.3 MeV) has been achieved to a high degree of precision, simulations at energies near or below the Bragg peak have only been attempted recently because of the lack of relevant cross-section data. As the hydrogen atom has a different ionization cross-section from that of a proton, charge exchange processes need to be considered in order to calculate the stopping power for low energy protons. In this paper, we have used state-of-the-art Monte Carlo track simulation techniques, in conjunction with the pub...
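The role the cross sections play above — each interaction type, including the capture/loss charge-exchange channels, is selected with probability proportional to its cross section — can be sketched as follows. The numerical values are placeholders, not evaluated cross-section data:

```python
import random

random.seed(2)

# Placeholder cross sections (arbitrary units) for a low-energy proton in
# water, including the charge-exchange channels discussed in the abstract.
cross_sections = {
    "ionization": 5.0,
    "excitation": 2.0,
    "electron_capture": 1.5,  # p + M -> H + M+ (projectile becomes neutral)
    "electron_loss": 0.5,     # H -> p + e-   (neutral projectile re-ionizes)
}

def sample_channel():
    """Pick the next interaction with probability proportional to its sigma."""
    r = random.uniform(0.0, sum(cross_sections.values()))
    cumulative = 0.0
    for channel, sigma in cross_sections.items():
        cumulative += sigma
        if r <= cumulative:
            return channel
    return channel  # guard against floating-point edge cases

counts = {ch: 0 for ch in cross_sections}
for _ in range(10_000):
    counts[sample_channel()] += 1

# Sampled frequencies track the cross-section ratios (5 : 2 : 1.5 : 0.5).
print(counts["ionization"] > counts["excitation"] > counts["electron_loss"])  # → True
```

In a real track-structure code the table is energy- and charge-state-dependent, which is precisely why the missing low-energy data held back sub-Bragg-peak simulation.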

  8. Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries

    Directory of Open Access Journals (Sweden)

    Ilić Radovan D.

    2002-01-01

This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of the proton beam characterization by means of the Multi-Layer Faraday Cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice.

  9. TOPAS: An innovative proton Monte Carlo platform for research and clinical applications

    Energy Technology Data Exchange (ETDEWEB)

    Perl, J.; Shin, J.; Schuemann, J.; Faddegon, B.; Paganetti, H. [SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); University of California San Francisco Comprehensive Cancer Center, 1600 Divisadero Street, San Francisco, California 94143-1708 (United States); Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States); University of California San Francisco Comprehensive Cancer Center, 1600 Divisadero Street, San Francisco, California 94143-1708 (United States); Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States)

    2012-11-15

Purpose: While Monte Carlo particle transport has proven useful in many areas (treatment head design, dose calculation, shielding design, and imaging studies) and has been particularly important for proton therapy (due to the conformal dose distributions and a finite beam range in the patient), the available general purpose Monte Carlo codes in proton therapy have been overly complex for most clinical medical physicists. The learning process has large costs not only in time but also in reliability. To address this issue, we developed an innovative proton Monte Carlo platform and tested the tool in a variety of proton therapy applications. Methods: Our approach was to take one of the already-established general purpose Monte Carlo codes and wrap and extend it to create a specialized user-friendly tool for proton therapy. The resulting tool, TOol for PArticle Simulation (TOPAS), should make Monte Carlo simulation more readily available for research and clinical physicists. TOPAS can model a passive scattering or scanning beam treatment head, model a patient geometry based on computed tomography (CT) images, score dose, fluence, etc., save and restart a phase space, provides advanced graphics, and is fully four-dimensional (4D) to handle variations in beam delivery and patient geometry during treatment. A custom-designed TOPAS parameter control system was placed at the heart of the code to meet requirements for ease of use, reliability, and repeatability without sacrificing flexibility. Results: We built and tested the TOPAS code. We have shown that the TOPAS parameter system provides easy yet flexible control over all key simulation areas such as geometry setup, particle source setup, scoring setup, etc. Through design consistency, we have ensured that user experience gained in configuring one component, scorer or filter applies equally well to configuring any other component, scorer or filter. We have incorporated key lessons from safety management, proactively

  10. TOPAS: An innovative proton Monte Carlo platform for research and clinical applications

    International Nuclear Information System (INIS)

    Perl, J.; Shin, J.; Schümann, J.; Faddegon, B.; Paganetti, H.

    2012-01-01

Purpose: While Monte Carlo particle transport has proven useful in many areas (treatment head design, dose calculation, shielding design, and imaging studies) and has been particularly important for proton therapy (due to the conformal dose distributions and a finite beam range in the patient), the available general purpose Monte Carlo codes in proton therapy have been overly complex for most clinical medical physicists. The learning process has large costs not only in time but also in reliability. To address this issue, we developed an innovative proton Monte Carlo platform and tested the tool in a variety of proton therapy applications. Methods: Our approach was to take one of the already-established general purpose Monte Carlo codes and wrap and extend it to create a specialized user-friendly tool for proton therapy. The resulting tool, TOol for PArticle Simulation (TOPAS), should make Monte Carlo simulation more readily available for research and clinical physicists. TOPAS can model a passive scattering or scanning beam treatment head, model a patient geometry based on computed tomography (CT) images, score dose, fluence, etc., save and restart a phase space, provides advanced graphics, and is fully four-dimensional (4D) to handle variations in beam delivery and patient geometry during treatment. A custom-designed TOPAS parameter control system was placed at the heart of the code to meet requirements for ease of use, reliability, and repeatability without sacrificing flexibility. Results: We built and tested the TOPAS code. We have shown that the TOPAS parameter system provides easy yet flexible control over all key simulation areas such as geometry setup, particle source setup, scoring setup, etc. Through design consistency, we have ensured that user experience gained in configuring one component, scorer or filter applies equally well to configuring any other component, scorer or filter. We have incorporated key lessons from safety management, proactively

  11. SRNA-2K5, Proton Transport Using 3-D by Monte Carlo Techniques

    International Nuclear Information System (INIS)

    Ilic, Radovan D.

    2005-01-01

    1 - Description of program or function: SRNA-2K5 performs Monte Carlo transport simulation of proton in 3D source and 3D geometry of arbitrary materials. The proton transport based on condensed history model, and on model of compound nuclei decays that creates in nonelastic nuclear interaction by proton absorption. 2 - Methods: The SRNA-2K5 package is developed for time independent simulation of proton transport by Monte Carlo techniques for numerical experiments in complex geometry, using PENGEOM from PENELOPE with different material compositions, and arbitrary spectrum of proton generated from the 3D source. This package developed for 3D proton dose distribution in proton therapy and dosimetry, and it was based on the theory of multiple scattering. The compound nuclei decay was simulated by our and Russian MSDM models using ICRU 49 and ICRU 63 data. If protons trajectory is divided on great number of steps, protons passage can be simulated according to Berger's Condensed Random Walk model. Conditions of angular distribution and fluctuation of energy loss determinate step length. Physical picture of these processes is described by stopping power, Moliere's angular distribution, Vavilov's distribution with Sulek's correction per all electron orbits, and Chadwick's cross sections for nonelastic nuclear interactions, obtained by his GNASH code. 
According to this physical picture of the proton's passage, and using the probabilities of proton transitions from one stage to the next prepared by the SRNADAT program, the simulation of proton transport in all SRNA codes runs according to the usual Monte Carlo scheme: (i) a proton is emitted from the source with energy, position and solid angle randomly chosen from the prepared spectrum; (ii) the proton loses the average energy over the step; (iii) along that step, the proton experiences a great number of collisions and changes its direction of movement, randomly chosen from the angular distribution; (iv) a random fluctuation is added to the average energy loss; (v
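
The scheme in steps (i)-(v) maps naturally onto a short condensed-history loop. The sketch below is illustrative only: it replaces SRNA's ICRU 49 stopping powers with a Bragg-Kleeman power-law fit, Moliere's angular distribution with a Gaussian of Highland width, and Vavilov straggling with a Gaussian; all constants are assumptions, not values from the code.

```python
import math
import random

M_P = 938.272        # proton rest mass [MeV]
X0_WATER = 36.08     # radiation length of water [cm]

def stopping_power(T):
    """Rough water stopping power [MeV/cm] from a Bragg-Kleeman range fit
    (R = alpha*T^p, alpha = 0.0022, p = 1.77), standing in for the
    ICRU 49 tables that SRNA uses."""
    return T ** (1.0 - 1.77) / (0.0022 * 1.77)

def highland_sigma(T, step_cm):
    """Gaussian width of the multiple-scattering angle per step (Highland
    formula), a stand-in for Moliere's distribution."""
    pv = T * (T + 2.0 * M_P) / (T + M_P)   # momentum x velocity [MeV]
    return (14.1 / pv) * math.sqrt(step_cm / X0_WATER)

def transport(T0, step_cm=0.1, cutoff=2.0, rng=random):
    """Steps (i)-(v): emit, lose the mean energy per step, deflect by a
    random multiple-scattering angle, add straggling, repeat to cutoff."""
    T, depth, theta = T0, 0.0, 0.0
    while T > cutoff:
        mean_loss = stopping_power(T) * step_cm                  # (ii)
        theta += rng.gauss(0.0, highland_sigma(T, step_cm))      # (iii)
        loss = mean_loss * (1.0 + rng.gauss(0.0, 0.05))          # (iv)
        depth += step_cm * math.cos(theta)
        T -= loss
    return depth

random.seed(1)
mean_range = sum(transport(60.0) for _ in range(200)) / 200
print(round(mean_range, 1))   # close to the ~3.1 cm range of 60 MeV protons in water
```

The per-step Gaussian straggling and the single accumulated deflection angle are crude simplifications; the point is only the structure of the random walk, not its accuracy.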

  12. Monte Carlo calculations supporting patient plan verification in proton therapy

    Directory of Open Access Journals (Sweden)

    Thiago Viana Miranda Lima

    2016-03-01

    Full Text Available Verification of the patient’s treatment plan accounts for a substantial amount of quality assurance (QA) resources; this is especially true for Intensity Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations to support QA has been widely discussed, and several methods have been proposed. In this paper we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalysed previously published data (Molinelli et al. 2013), in which 9 patient plans were investigated and the QA warning threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modelling (Treatment Planning System (TPS) vs MC), to limitations of the dose delivery system, or to detector mispositioning was originally explored, but other factors such as the geometric description of the detectors were not ruled out. For the purpose of this work we compared ionisation-chamber measurements with the results of different MC simulations. We also studied some physical effects introduced by this new approach, for example inter-detector interference and the delta-ray threshold. Simulations accounting for a detailed geometry are typically superior (statistically significant difference, p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, we observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases, while in other cases position uncertainty represents the dominant uncertainty. Inter-chamber disturbance was not detected at therapeutic proton energies, and the results from the current delta-ray threshold are

  13. Srna-Monte Carlo codes for proton transport simulation in combined and voxelized geometries

    CERN Document Server

    Ilic, R D; Stankovic, S J

    2002-01-01

    This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculations in proton therapy and dosimetry. The model underlying these codes is based on the theory of proton multiple scattering and on a simple model of compound-nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second uses a voxelized geometry of material zones and is specifically adapted to patient computed tomography data. Transition probabilities for both codes are provided by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtaine...

  14. MONTE CARLO SIMULATION MODEL OF ENERGETIC PROTON TRANSPORT THROUGH SELF-GENERATED ALFVEN WAVES

    Energy Technology Data Exchange (ETDEWEB)

    Afanasiev, A.; Vainio, R., E-mail: alexandr.afanasiev@helsinki.fi [Department of Physics, University of Helsinki (Finland)

    2013-08-15

    A new Monte Carlo simulation model for the transport of energetic protons through self-generated Alfven waves is presented. The key point of the model is that, unlike previous ones, it employs the full form (i.e., including the dependence on the pitch-angle cosine) of the resonance condition governing the scattering of particles off Alfven waves, the process that approximates the wave-particle interactions in the framework of quasilinear theory. This allows us to model the wave-particle interactions in weak turbulence more adequately, in particular to implement anisotropic particle scattering instead of the isotropic scattering on which the previous Monte Carlo models were based. The developed model is applied to study the transport of flare-accelerated protons in an open magnetic flux tube. Simulation results for the transport of monoenergetic protons through the spectrum of Alfven waves reveal that anisotropic scattering leads to spatially more distributed wave growth than isotropic scattering. This result can have important implications for diffusive shock acceleration, e.g., it can affect the scattering mean free path of the accelerated particles in, and the size of, the foreshock region.
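
To illustrate why keeping the pitch-angle cosine in the resonance condition makes the scattering anisotropic, here is a minimal pitch-angle diffusion step. The wave spectrum, units, and the scattering-rate normalisation are all invented for illustration, and the deterministic drift term of the full stochastic equation is omitted for brevity:

```python
import math
import random

OMEGA = 1.0   # proton gyrofrequency (arbitrary units)

def wave_intensity(k):
    """Assumed Kolmogorov-like Alfven-wave spectrum, I(k) ~ k^(-5/3); in
    the actual model this would be the self-generated, evolving spectrum."""
    return k ** (-5.0 / 3.0)

def pitch_angle_step(mu, v=1.0, dt=0.01, rng=random):
    """One diffusion step in the pitch-angle cosine mu. Keeping mu in the
    resonance condition, k_res = Omega / (v * |mu|), makes the scattering
    rate anisotropic: protons with mu near 0 resonate with high-k waves,
    where the assumed spectrum is weak."""
    if abs(mu) < 1e-3:
        return mu                                  # resonance gap at mu = 0
    k_res = OMEGA / (v * abs(mu))
    nu = k_res * wave_intensity(k_res)             # quasilinear scattering rate
    d_mumu = 0.5 * nu * (1.0 - mu * mu)            # pitch-angle diffusion coeff.
    mu += rng.gauss(0.0, math.sqrt(2.0 * d_mumu * dt))
    # crude clip away from the exact boundaries; the omitted drift term
    # would keep mu away from +/-1 in a proper integrator
    return max(-0.999, min(0.999, mu))

random.seed(2)
mu = 0.9
for _ in range(1000):
    mu = pitch_angle_step(mu)
print(-1.0 <= mu <= 1.0)   # True: mu remains a valid cosine
```

Replacing `k_res = OMEGA / (v * abs(mu))` with a mu-independent value recovers the isotropic-scattering behaviour of the earlier models the abstract contrasts against.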

  15. A Monte Carlo simulation of the possible use of Positron Emission Tomography in proton radiotherapy

    International Nuclear Information System (INIS)

    Del Guerra, Alberto; Di Domenico, Giovanni; Gambaccini, Mauro; Marziani, Michele

    1994-01-01

    We have used the Monte Carlo technique to evaluate the applicability of Positron Emission Tomography to in vivo dosimetry for proton radiotherapy. Fair agreement has been found between Monte Carlo results and experimental data. The simulation shows that PET can be useful especially for in vivo Bragg peak localization. ((orig.))

  16. The Monte Carlo SRNA-VOX code for 3D proton dose distribution in voxelized geometry using CT data

    International Nuclear Information System (INIS)

    Ilic, Radovan D; Spasic-Jokic, Vesna; Belicev, Petar; Dragovic, Milos

    2005-01-01

    This paper describes the application of the SRNA Monte Carlo package to proton transport simulations in complex geometry and different material compositions. The SRNA package was developed for 3D dose distribution calculations in proton therapy and dosimetry and is based on the theory of multiple scattering. The decay of proton-induced compound nuclei was simulated by the Russian MSDM model and our own model, using ICRU 63 data. The developed package consists of two codes: SRNA-2KG, which simulates proton transport in combinatorial geometry, and SRNA-VOX, which uses voxelized geometry based on CT data, converting Hounsfield numbers to tissue elemental compositions. Transition probabilities for both codes are prepared by the SRNADAT code. The simulation of proton beam characterization by a multi-layer Faraday cup and of the spatial distribution of positron emitters with the SRNA-2KG code, together with an intercomparison of computational codes in radiation dosimetry, indicate the immediate applicability of Monte Carlo techniques in clinical practice. In this paper, we briefly present the physical model implemented in the SRNA package, the ISTAR proton dose planning software, as well as the results of numerical experiments with proton beams to obtain 3D dose distributions in the eye and breast tumour
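
The CT conversion step mentioned above (Hounsfield numbers to material composition) is commonly implemented as a piecewise lookup. The bin edges, names, and densities below are illustrative placeholders, not the actual SRNA-VOX calibration:

```python
# Simplified Hounsfield-to-material lookup in the spirit of the CT
# conversion described; thresholds and densities are illustrative.
HU_BINS = [
    (-1000, -950, "air",         0.0012),   # (HU low, HU high, name, g/cm^3)
    (-950,  -120, "lung",        0.26),
    (-120,   100, "soft_tissue", 1.03),
    (100,   1600, "bone",        1.60),
]

def material_of(hu):
    """Map a Hounsfield unit to a (material name, mass density) pair."""
    for lo, hi, name, rho in HU_BINS:
        if lo <= hu < hi:
            return name, rho
    return "metal", 4.5   # fallback for implants / out-of-range values

print(material_of(40))    # ('soft_tissue', 1.03)
```

A production conversion (e.g. a Schneider-style stoichiometric calibration) would also interpolate elemental weights within each bin rather than assigning one fixed composition.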

  17. Monte Carlo characterisation of the Dose Magnifying Glass for proton therapy quality assurance

    Science.gov (United States)

    Merchant, A. H.; Guatelli, S.; Petesecca, M.; Jackson, M.; Rozenfeld, A. B.

    2017-01-01

    A Geant4 Monte Carlo simulation study was carried out to characterise a novel silicon strip detector, the Dose Magnifying Glass (DMG), for use in proton therapy quality assurance. We investigated the possibility of using the DMG to determine the energy of the incident proton beam. The advantages of the DMG are quick response, easy operation and high spatial resolution. In this work we showed theoretically that the DMG can be used for QA in the determination of the energy of the incident proton beam for ocular and prostate cancer therapy. The study was performed by means of Monte Carlo simulations. Experimental measurements are currently under way to confirm the results of this simulation study.

  18. Range uncertainties in proton therapy and the role of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Paganetti, Harald

    2012-01-01

    The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)
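
The "range uncertainty recipes" the review revisits are typically of percent-plus-fixed form. A minimal sketch, with the often-quoted 3.5% + 1 mm values as defaults (facility-specific numbers vary):

```python
def range_margin_mm(beam_range_mm, pct=0.035, fixed_mm=1.0):
    """Percent-plus-fixed range margin: the percentage term covers
    systematic range errors that scale with depth (CT conversion, dose
    calculation), the fixed term covers depth-independent ones."""
    return pct * beam_range_mm + fixed_mm

print(round(range_margin_mm(100.0), 2))   # 4.5 mm margin for a 10 cm beam range
```

Monte Carlo dose calculation aims to shrink the percentage term, since it reduces the dose-calculation component of the systematic range error in complex geometries.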

  19. Water equivalence of various materials for clinical proton dosimetry by experiment and Monte Carlo simulation

    International Nuclear Information System (INIS)

    Al-Sulaiti, Leena; Shipley, David; Thomas, Russell; Kacperek, Andrzej; Regan, Patrick; Palmans, Hugo

    2010-01-01

    The accurate conversion of dose to various materials used in clinical proton dosimetry to dose-to-water is based on fluence correction factors, accounting for attenuation of primary protons and production of secondary particles due to non-elastic nuclear interactions. This work aims to investigate the depth dose distribution and the fluence correction with respect to water or graphite at water equivalent depths (WED) in different target materials relevant for dosimetry, such as polymethyl methacrylate (PMMA), graphite, A-150, aluminium and copper, at 60 and 200 MeV. This was done through a comparison between Monte Carlo simulations using MCNPX 2.5.0, analytical model calculations and experimental measurements at the Clatterbridge Centre of Oncology (CCO) in a 60 MeV modulated and un-modulated proton beam. MCNPX simulations indicated small fluence corrections for all materials with respect to graphite and water at 60 and 200 MeV, except for aluminium. The analytical calculations showed an increase of the fluence correction factor to a few percent for all materials with respect to water at 200 MeV. The experimental measurements for the 60 MeV un-modulated beam showed good agreement with MCNPX. For the modulated beam, the fluence correction factor was found to decrease below unity by up to a few percent with depth for aluminium and copper, but was almost constant and close to unity for A-150.
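
The water-equivalent depth (WED) at which the materials above are compared scales the physical depth by density and the mass stopping-power ratio. The ratios below are illustrative round numbers, not ICRU values:

```python
# Relative mass stopping powers (material / water) and densities [g/cm^3];
# illustrative round numbers only.
RELATIVE_S = {"water": 1.0, "PMMA": 0.97, "graphite": 0.90, "aluminium": 0.80}
DENSITY    = {"water": 1.0, "PMMA": 1.19, "graphite": 1.70, "aluminium": 2.70}

def water_equivalent_depth(z_cm, material):
    """Depth z in the material mapped to the depth in water with the same
    residual range, neglecting nuclear interactions -- which is exactly
    what the fluence correction factor then has to account for."""
    return z_cm * DENSITY[material] * RELATIVE_S[material]

print(round(water_equivalent_depth(1.0, "PMMA"), 3))   # ~1.154 cm of water per cm of PMMA
```

The fluence correction factor is then the residual depth-dependent ratio between dose in water and dose in the material at equal WED.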

  20. Monte-Carlo simulation of proton radiotherapy for human eye

    International Nuclear Information System (INIS)

    Liu Yunpeng; Tang Xiaobin; Xie Qin; Chen Feida; Geng Changran; Chen Da

    2010-01-01

    A 62 MeV proton beam was selected to develop an MCNPX model of the human eye to approximate the dose delivered by proton therapy. Two treatment simulations were considered. The first was an ideal treatment scenario: the tumor dose was 50.03 Gy, which is at the level of an effective treatment, while the other organs remained within the range of acceptable dose. The second was a worst-case scenario simulating a patient gazing directly into the treatment beam during therapy. The bulk of the dose was deposited in the cornea, lens, and anterior chamber region, while the dose to the tumor area was zero. The calculated results agree with the relevant references, which confirms that the MCNPX code can simulate proton radiotherapy accurately and is a capable platform for patient planning. The data from the worst case can be used for dose reconstruction of clinical accidents. (authors)

  1. Monte Carlo simulation of prompt γ-ray emission in proton therapy using a specific track length estimator

    International Nuclear Information System (INIS)

    El Kanawati, W; Létang, J M; Sarrut, D; Freud, N; Dauvergne, D; Pinto, M; Testa, É

    2015-01-01

    A Monte Carlo (MC) variance reduction technique is developed for prompt-γ emitter calculations in proton therapy. Prompt-γ rays emitted through nuclear fragmentation reactions and exiting the patient during proton therapy could play an important role in monitoring the treatment. However, estimating the number and energy of prompt-γ rays emitted per primary proton with MC simulations is a slow process. In order to estimate the local distribution of prompt-γ emission in a volume of interest for a given proton beam of the treatment plan, an MC variance reduction technique based on a specific track length estimator (TLE) has been developed. First, an elemental database of prompt-γ emission spectra is established in the clinical energy range of incident protons for all elements in the composition of human tissues. This database of prompt-γ spectra is built offline with high statistics. In the prompt-γ TLE MC tally, each proton then deposits along its track the expectation of the prompt-γ spectra from the database, according to the proton kinetic energy and the local material composition. A detailed statistical study shows that the relative efficiency mainly depends on the geometrical distribution of the track length. Benchmarking of the proposed prompt-γ TLE MC technique against an analogous MC technique is carried out. A large relative efficiency gain is reported, ca. 10^5. (paper)
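
The core of a track length estimator is that every proton step scores its expected yield instead of waiting for a rare analogue emission. A toy version of that tally, with a placeholder two-element database (the paper's database holds full spectra at high statistics):

```python
import bisect
from collections import defaultdict

# Offline "database": expected prompt-gamma yield per cm of track, per
# element and proton-energy bin. All numbers are placeholders.
ENERGY_GRID = [10, 50, 100, 150, 200]          # proton kinetic energy [MeV]
YIELD = {                                       # gammas per cm of track
    "O": [0.008, 0.012, 0.010, 0.009, 0.008],
    "C": [0.006, 0.010, 0.009, 0.008, 0.007],
}

def tle_score(tally, voxel, element, T_proton, step_cm):
    """Deposit the expectation (track length x database yield) for one
    step, keyed by the local element and the proton energy bin."""
    i = min(bisect.bisect_left(ENERGY_GRID, T_proton), len(ENERGY_GRID) - 1)
    tally[voxel] += step_cm * YIELD[element][i]

tally = defaultdict(float)
tle_score(tally, (3, 4, 5), "O", 120.0, 0.1)    # 0.1 cm step in oxygen
print(round(tally[(3, 4, 5)], 6))               # 0.1 cm x 0.009 /cm = 0.0009
```

Because every step contributes a nonzero expectation, the tally variance drops by orders of magnitude relative to counting rare analogue emissions, which is the source of the reported efficiency gain.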

  2. Proton energy and scattering angle radiographs to improve proton treatment planning : a Monte Carlo study

    NARCIS (Netherlands)

    Biegun, Aleksandra; Takatsu, Jun; Nakaji, Taku; van Goethem, Marc-Jan; van der Graaf, Emiel; Koffeman, E.; Visser, Jan; Brandenburg, Sijtze

    2016-01-01

    The novel proton radiography imaging technique has a large potential to be used in direct measurement of the proton energy loss (proton stopping power, PSP) in various tissues in the patient. The uncertainty of PSPs, currently obtained from translation of X-ray Computed Tomography (xCT) images,

  3. Monte Carlo characterisation of the Dose Magnifying Glass for proton therapy quality assurance

    International Nuclear Information System (INIS)

    Merchant, A H; Guatelli, S; Petesecca, M; Jackson, M; Rozenfeld, A B

    2017-01-01

    A Geant4 Monte Carlo simulation study was carried out to characterise a novel silicon strip detector, the Dose Magnifying Glass (DMG), for use in proton therapy quality assurance. We investigated the possibility of using the DMG to determine the energy of the incident proton beam. The advantages of the DMG are quick response, easy operation and high spatial resolution. In this work we showed theoretically that the DMG can be used for QA in the determination of the energy of the incident proton beam for ocular and prostate cancer therapy. The study was performed by means of Monte Carlo simulations. Experimental measurements are currently under way to confirm the results of this simulation study. (paper)

  4. LCG Monte-Carlo Data Base

    CERN Document Server

    Bartalini, P.; Kryukov, A.; Selyuzhenkov, Ilya V.; Sherstnev, A.; Vologdin, A.

    2004-01-01

    We present the Monte-Carlo events Data Base (MCDB) project and its development plans. MCDB facilitates communication between authors of Monte-Carlo generators and experimental users. It also provides convenient book-keeping and easy access to generator-level samples. The first release of MCDB is now operational for the CMS collaboration. In this paper we review the main ideas behind MCDB and discuss future plans to develop this Data Base further within the CERN LCG framework.

  5. The proton therapy nozzles at Samsung Medical Center: A Monte Carlo simulation study using TOPAS

    Science.gov (United States)

    Chung, Kwangzoo; Kim, Jinsung; Kim, Dae-Hyun; Ahn, Sunghwan; Han, Youngyih

    2015-07-01

    To expedite the commissioning process of the proton therapy system at Samsung Medical Center (SMC), we have developed a Monte Carlo simulation model of the proton therapy nozzles by using the TOol for PArticle Simulation (TOPAS). At the SMC proton therapy center, we have two gantry rooms with different types of nozzles: a multi-purpose nozzle and a dedicated scanning nozzle. Each nozzle has been modeled in detail following the geometry information provided by the manufacturer, Sumitomo Heavy Industries, Ltd. For this purpose, novel features of TOPAS, such as the time feature and the ridge filter class, have been used, and the appropriate physics models for proton nozzle simulation have been defined. Dosimetric properties, such as the percent depth dose curve, the spread-out Bragg peak (SOBP), and the beam spot size, have been simulated and verified against measured beam data. Beyond the Monte Carlo nozzle modeling, we have developed an interface between TOPAS and the treatment planning system (TPS), RayStation. An exported radiotherapy (RT) plan from the TPS is interpreted by the interface and translated into TOPAS input text. The developed Monte Carlo nozzle model can be used to estimate non-beam performance, such as the neutron background of the nozzles. Furthermore, the nozzle model can be used to study mechanical optimization of the nozzle design.

  6. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    Energy Technology Data Exchange (ETDEWEB)

    Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and ICTEAM Institute, Université catholique de Louvain, Louvain-la-Neuve 1348 (Belgium); Sterpin, Edmond [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and Department of Oncology, Katholieke Universiteit Leuven, O& N I Herestraat 49, 3000 Leuven (Belgium)

    2016-04-15

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable for MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
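
The class-II condensed-history split described (hard ionizations sampled individually above a threshold, soft events condensed) can be sketched for a single step. The restricted stopping power, the hard-collision rate, and the knock-on spectrum bounds are invented placeholders, not MCsquare's physics tables:

```python
import math
import random

T_CUT = 0.1   # MeV: energy transfers above this are "hard" ionisations

def restricted_sp(T):
    """Toy restricted stopping power [MeV/cm]: the condensed, continuous
    part of the loss (soft collisions below T_CUT). Placeholder power law."""
    return 150.0 * T ** -0.77

def sample_poisson(lam, rng):
    """Knuth's method; adequate for the small means used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def step_loss(T, step_cm, rng):
    """Class-II split for one step: soft losses come from the restricted
    stopping power; hard delta rays are sampled individually from a
    ~1/E^2 knock-on spectrum between T_CUT and T/4 (rate per cm is an
    invented constant)."""
    loss = restricted_sp(T) * step_cm
    for _ in range(sample_poisson(0.5 * step_cm, rng)):
        u = rng.random()
        # inverse-CDF sampling of a 1/E^2 spectrum on [T_CUT, T/4]
        loss += T_CUT / (1.0 - u * (1.0 - 4.0 * T_CUT / T))
    return loss

rng = random.Random(7)
losses = [step_loss(200.0, 0.1, rng) for _ in range(1000)]
print(min(losses) >= restricted_sp(200.0) * 0.1)   # True: the soft part is a floor
```

Raising `T_CUT` moves more of the loss into the continuous term and speeds the simulation at the cost of detail, which is the accuracy/speed dial such codes expose.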

  7. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    International Nuclear Information System (INIS)

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-01-01

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable for MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  8. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures.

    Science.gov (United States)

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-04-01

    Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable for MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked with the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with GATE/GEANT4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  9. Proton recoil spectra in spherical proportional counters calculated with finite element and Monte Carlo methods

    Energy Technology Data Exchange (ETDEWEB)

    Benmosbah, M. [Laboratoire de Chimie Physique et Rayonnement Alain Chambaudet, UMR CEA E4, Universite de Franche-Comte, 16 route de Gray, 25030 Besancon Cedex (France); Groetz, J.E. [Laboratoire de Chimie Physique et Rayonnement Alain Chambaudet, UMR CEA E4, Universite de Franche-Comte, 16 route de Gray, 25030 Besancon Cedex (France)], E-mail: jegroetz@univ-fcomte.fr; Crovisier, P. [Service de Protection contre les Rayonnements, CEA Valduc, 21120 Is/Tille (France); Asselineau, B. [Laboratoire de Metrologie et de Dosimetrie des Neutrons, IRSN, Cadarache BP3, 13115 St Paul-lez-Durance (France); Truffert, H.; Cadiou, A. [AREVA NC, Etablissement de la Hague, DQSSE/PR/E/D, 50444 Beaumont-Hague Cedex (France)

    2008-08-11

    Proton recoil spectra were calculated for various spherical proportional counters using Monte Carlo simulation combined with the finite element method. Electric field lines and strength were calculated by defining an appropriate mesh and solving the Laplace equation with the associated boundary conditions, taking into account the geometry of every counter. Thus, different regions were defined in the counter with various coefficients for the energy deposition in the Monte Carlo transport code MCNPX. Results from the calculations are in good agreement with measurements for three different gas pressures at various neutron energies.

  10. Monte Carlo simulation of secondary neutron dose for scanning proton therapy using FLUKA.

    Directory of Open Access Journals (Sweden)

    Chaeyeong Lee

    Full Text Available Proton therapy is a rapidly progressing field for cancer treatment. Globally, many proton therapy facilities are being commissioned or are under construction. Secondary neutrons are an important issue during the commissioning process of a proton therapy facility. The purpose of this study is to model and validate the scanning nozzles of the proton therapy system at Samsung Medical Center (SMC) by Monte Carlo simulation for beam commissioning. After the commissioning, the secondary neutron ambient dose from a proton scanning nozzle (Gantry 1) was simulated and measured. The simulation was performed to evaluate beam properties such as the percent depth dose curve, Bragg peak, and distal fall-off, so that they could be verified against measured data. Using the validated beam nozzle, the secondary neutron ambient dose was simulated and then compared with the ambient dose measured from Gantry 1. We calculated the secondary neutron dose at several different points. We demonstrated the validity of modeling a proton scanning nozzle system to evaluate various parameters using FLUKA. The measured secondary neutron ambient dose showed a tendency similar to the simulation result. This work will increase the knowledge necessary for the development of radiation safety technology in medical particle accelerators.

  11. Monte Carlo and analytical model predictions of leakage neutron exposures from passively scattered proton therapy

    International Nuclear Information System (INIS)

    Pérez-Andújar, Angélica; Zhang, Rui; Newhauser, Wayne

    2013-01-01

    Purpose: Stray neutron radiation is of concern after radiation therapy, especially in children, because of the high risk it might carry for secondary cancers. Several previous studies predicted the stray neutron exposure from proton therapy, mostly using Monte Carlo simulations. Promising attempts to develop analytical models have also been reported, but these were limited to only a few proton beam energies. The purpose of this study was to develop an analytical model to predict leakage neutron equivalent dose from passively scattered proton beams in the 100-250 MeV interval. Methods: To develop and validate the analytical model, the authors used values of equivalent dose per therapeutic absorbed dose (H/D) predicted with Monte Carlo simulations. The authors also characterized the behavior of the mean neutron radiation-weighting factor, w_R, as a function of depth in a water phantom and distance from the beam central axis. Results: The simulated and analytical predictions agreed well. On average, the percentage difference between the analytical model and the Monte Carlo simulations was 10% for the energies and positions studied. The authors found that w_R was highest at the shallowest depth and decreased with depth until around 10 cm, where it started to increase slowly with depth. This was consistent among all energies. Conclusion: Simple analytical methods are promising alternatives to complex and slow Monte Carlo simulations to predict H/D values. The authors' results also provide improved understanding of the behavior of w_R, which strongly depends on depth but is nearly independent of lateral distance from the beam central axis
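
An analytical H/D model of the kind described is a smooth function of beam energy and position. The sketch below uses an invented exponential form with placeholder coefficients, purely to illustrate the shape of such a model; it is not the paper's fitted parameterisation:

```python
import math

def h_over_d(distance_cm, energy_mev, a=2.0e-4, b=0.04, c=6.0e-6):
    """Illustrative leakage-neutron equivalent dose per therapeutic dose
    [Sv/Gy]: the amplitude grows with beam energy, and the value falls
    off exponentially with distance. Coefficients are placeholders."""
    return (a + c * energy_mev) * math.exp(-b * distance_cm)

# H/D should fall with distance and rise with beam energy:
print(h_over_d(20.0, 250.0) < h_over_d(10.0, 250.0))  # True
print(h_over_d(10.0, 250.0) > h_over_d(10.0, 100.0))  # True
```

A validated model of this type is attractive precisely because evaluating a closed-form expression takes microseconds, versus hours for a shielded-geometry Monte Carlo run.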

  12. Intensity modulated radiation therapy using laser-accelerated protons: a Monte Carlo dosimetric study

    International Nuclear Information System (INIS)

    Fourkal, E; Li, J S; Xiong, W; Nahum, A; Ma, C-M

    2003-01-01

    In this paper we present Monte Carlo studies of intensity modulated radiation therapy using laser-accelerated proton beams. Laser-accelerated protons coming out of a solid high-density target have broad energy and angular spectra leading to dose distributions that cannot be directly used for therapeutic applications. Through the introduction of a spectrometer-like particle selection system that delivers small pencil beams of protons with desired energy spectra it is feasible to use laser-accelerated protons for intensity modulated radiotherapy. The method presented in this paper is a three-dimensional modulation in which the proton energy spectrum and intensity of each individual beamlet are modulated to yield a homogeneous dose in both the longitudinal and lateral directions. As an evaluation of the efficacy of this method, it has been applied to two prostate cases using a variety of beam arrangements. We have performed a comparison study between intensity modulated photon plans and those for laser-accelerated protons. For identical beam arrangements and the same optimization parameters, proton plans exhibit superior coverage of the target and sparing of neighbouring critical structures. Dose-volume histogram analysis of the resulting dose distributions shows up to 50% reduction of dose to the critical structures. As the number of fields is decreased, the proton modality exhibits a better preservation of the optimization requirements on the target and critical structures. It is shown that for a two-beam arrangement (parallel-opposed) it is possible to achieve both superior target coverage with 5% dose inhomogeneity within the target and excellent sparing of surrounding tissue
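
The energy-spectrum modulation described, weighting beamlets of different range to flatten the longitudinal dose, can be illustrated with a toy spread-out Bragg peak (SOBP) fit. The pristine-peak shape and all parameters below are invented for illustration; a real optimiser would also constrain weights to be non-negative and handle the lateral dimension:

```python
import numpy as np

def pristine_peak(z, R):
    """Toy pristine Bragg curve: a flat entrance plateau plus a Gaussian
    peak at range R (not a physical Bortfeld model)."""
    return np.where(z <= R, 0.3 + 0.7 * np.exp(-((z - R) / 0.2) ** 2), 0.0)

z = np.linspace(0.0, 12.0, 241)                       # depth grid [cm]
ranges = np.arange(7.0, 10.01, 0.25)                  # beamlet ranges [cm]
A = np.stack([pristine_peak(z, R) for R in ranges], axis=1)

# Solve for beamlet weights that flatten the dose over the target span
sobp_region = (z >= 7.0) & (z <= 10.0)
target = np.ones(int(sobp_region.sum()))
w, *_ = np.linalg.lstsq(A[sobp_region], target, rcond=None)
sobp = A @ w

mid = sobp[(z >= 7.5) & (z <= 9.5)]
print(round(float(mid.mean()), 2))   # close to the target dose of 1.0
```

Modulating the per-beamlet energy spectrum, as the selection system in the paper does, is the continuous analogue of choosing these discrete range/weight pairs.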

  13. SU-F-T-122: 4D and 5D Proton Dose Evaluation with Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Titt, U; Mirkovic, D; Yepes, P; Liu, A; Peeler, C; Randenyia, S; Mohan, R [UT MD Anderson Cancer Center, Houston, TX (United States)]

    2016-06-15

    Purpose: We evaluated uncertainties in therapeutic proton doses of a lung treatment, taking into account intra-fractional geometry changes, such as breathing, and inter-fractional changes, such as tumor shrinkage and weight loss. Methods: A Monte Carlo study was performed using four dimensional CT image sets (4DCTs) and weekly repeat imaging (5DCTs) to compute fixed RBE (1.1) and variable RBE weighted dose in an actual lung treatment geometry. The MC2 Monte Carlo system was employed to simulate proton energy deposition and LET distributions according to a thoracic cancer treatment plan developed with a 3D-CT in a commercial treatment planning system, as well as in each of the phases of 4DCT sets which were recorded weekly throughout the course of the treatment. A cumulative dose distribution in relevant structures was computed and compared to the predictions of the treatment planning system. Results: Using the Monte Carlo method, dose deposition estimates with the lowest possible uncertainties were produced. Comparison with treatment planning predictions indicates that significant uncertainties may be associated with therapeutic lung dose prediction from treatment planning systems, depending on the magnitude of inter- and intra-fractional geometry changes. Conclusion: As this is just a case study, a more systematic investigation accounting for a cohort of patients is warranted; however, this is less practical because Monte Carlo simulations of such cases require enormous computational resources. Hence our study and any future case studies may serve as validation/benchmarking data for faster dose prediction engines, such as the track repeating algorithm, FDC.
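
    The gap between fixed and variable RBE weighting can be sketched in a few lines. The linear RBE(LET) relation below is an illustrative placeholder only, not the variable-RBE model used in the study, and the voxel values are invented.

```python
import numpy as np

dose = np.array([1.8, 2.0, 2.0, 1.9])     # physical dose per voxel (Gy), invented
let_d = np.array([2.5, 4.0, 7.5, 11.0])   # dose-averaged LET (keV/um), invented

fixed_rbe_dose = 1.1 * dose                       # constant clinical RBE of 1.1
variable_rbe_dose = (1.0 + 0.04 * let_d) * dose   # placeholder linear RBE(LET)

ratio = variable_rbe_dose / fixed_rbe_dose
print(np.round(ratio, 3))  # grows where LET is high, e.g. near the distal edge
```

    Because LET rises toward the end of range, any LET-dependent model predicts extra biological dose exactly where fixed-RBE plans assume none, which is why the two weightings are compared voxel by voxel.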

  14. Coevolution Based Adaptive Monte Carlo Localization (CEAMCL)

    Directory of Open Access Journals (Sweden)

    Luo Ronghua

    2008-11-01

    An adaptive Monte Carlo localization algorithm based on the coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between species ensures that multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments is solved. The sample size is adjusted adaptively over time according to the uncertainty of the robot's pose by using a population growth model. In addition, by using the crossover and mutation operators of evolutionary computation, intra-species evolution drives the samples toward regions where the desired posterior density is large, so a small set of samples can represent the desired density well enough for precise localization. The new algorithm is termed coevolution-based adaptive Monte Carlo localization (CEAMCL). Experiments have been carried out to demonstrate the efficiency of the new localization algorithm.
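
    The population-growth idea for adapting the sample size can be sketched as follows. A logistic growth law whose carrying capacity scales with pose uncertainty is assumed here purely for illustration; the paper's exact model may differ.

```python
def adaptive_sample_size(n_current, uncertainty, n_min=50, n_max=2000, rate=0.5):
    """Logistic population-growth update of the particle count; the carrying
    capacity scales with pose uncertainty in [0, 1] (illustrative assumption)."""
    u = min(max(uncertainty, 0.0), 1.0)
    capacity = n_min + (n_max - n_min) * u
    growth = rate * n_current * (1.0 - n_current / capacity)
    return int(max(n_min, min(n_max, round(n_current + growth))))

# High uncertainty grows the sample set; low uncertainty shrinks it.
n = 500
for u in (1.0, 1.0, 0.1, 0.1):
    n = adaptive_sample_size(n, u)
    print(n)
```

    The same update naturally shrinks the particle count once the filter converges, which is the computational saving the abstract claims.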

  15. Monte Carlo-based tail exponent estimator

    Science.gov (United States)

    Barunik, Jozef; Vacha, Lukas

    2010-11-01

    In this paper we propose a new approach to estimating the tail exponent in financial stock markets. We begin the study with the finite-sample behavior of the Hill estimator under α-stable distributions. Using large Monte Carlo simulations, we show that the Hill estimator overestimates the true tail exponent and can hardly be used on small samples. Building on these results, we introduce a Monte Carlo-based method of estimation for the tail exponent. Our proposed method is not sensitive to the choice of tail size and works well even on small data samples. The new estimator also gives unbiased results with symmetrical confidence intervals. Finally, we demonstrate the power of our estimator on international stock market indices, estimating the tail exponent for the two separate periods 2002-2005 and 2006-2009.
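
    The Hill estimator at the center of this study can be written down directly. The sketch below checks it on exact Pareto samples, where it is consistent; the paper's point is that under α-stable data the finite-sample bias becomes severe, and the same harness could be rerun with α-stable draws to see that.

```python
import numpy as np

def hill_estimator(data, k):
    """Hill estimate of the tail exponent alpha from the k largest observations."""
    x = np.sort(np.asarray(data))[::-1]            # descending order statistics
    gamma = np.mean(np.log(x[:k])) - np.log(x[k])  # mean log-spacing in the tail
    return 1.0 / gamma

rng = np.random.default_rng(7)
alpha_true = 1.5
estimates = []
for _ in range(200):
    # Pareto(x_min=1, alpha): numpy's pareto draws X with 1 + X ~ Pareto(alpha)
    sample = 1.0 + rng.pareto(alpha_true, size=5_000)
    estimates.append(hill_estimator(sample, k=250))
mean_est = float(np.mean(estimates))
print(f"mean Hill estimate over 200 replications: {mean_est:.3f}")
```

    The choice of k (the tail size) is the estimator's known weak point; the paper's Monte Carlo-based alternative is designed to remove that sensitivity.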

  16. Monte Carlo calculation of energy deposition and ionization yield for high energy protons

    International Nuclear Information System (INIS)

    Wilson, W.E.; McDonald, J.C.; Coyne, J.J.; Paretzke, H.G.

    1985-01-01

    Recent calculations of event size spectra for neutrons use a continuous slowing down approximation model for the energy losses experienced by secondary charged particles (protons and alphas) and thus do not allow for straggling effects. Discrepancies between the calculations and experimental measurements are thought to be, in part, due to the neglect of straggling. A tractable way of including stochastics in radiation transport calculations is via the Monte Carlo method and a number of efforts directed toward simulating positive ion track structure have been initiated employing this technique. Recent results obtained with our updated and extended MOCA code for charged particle track structure are presented here. Major emphasis has been on calculating energy deposition and ionization yield spectra for recoil proton crossers since they are the most prevalent event type at high energies (>99% at 14 MeV) for small volumes. Neutron event-size spectra can be obtained from them by numerical summing and folding techniques. Data for ionization yield spectra are presented for simulated recoil protons up to 20 MeV in sites of diameters 2-1000 nm

  17. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Koch, Nicholas C; Newhauser, Wayne D

    2010-01-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  18. A Monte Carlo pencil beam scanning model for proton treatment plan simulation using GATE/GEANT4

    Energy Technology Data Exchange (ETDEWEB)

    Grevillot, L; Freud, N; Sarrut, D [Universite de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Universite Lyon 1, Centre Leon Berard, Lyon (France); Bertrand, D; Dessy, F, E-mail: loic.grevillot@creatis.insa-lyon.fr [IBA, B-1348, Louvain-la Neuve (Belgium)]

    2011-08-21

    This work proposes a generic method for modeling scanned ion beam delivery systems, without simulation of the treatment nozzle and based exclusively on beam data library (BDL) measurements required for treatment planning systems (TPS). To this aim, new tools dedicated to treatment plan simulation were implemented in the Gate Monte Carlo platform. The method was applied to a dedicated nozzle from IBA for proton pencil beam scanning delivery. Optical and energy parameters of the system were modeled using a set of proton depth-dose profiles and spot sizes measured at 27 therapeutic energies. For further validation of the beam model, specific 2D and 3D plans were produced and then measured with appropriate dosimetric tools. Dose contributions from secondary particles produced by nuclear interactions were also investigated using field size factor experiments. Pristine Bragg peaks were reproduced with 0.7 mm range and 0.2 mm spot size accuracy. A 32 cm range spread-out Bragg peak with 10 cm modulation was reproduced with 0.8 mm range accuracy and a maximum point-to-point dose difference of less than 2%. A 2D test pattern consisting of a combination of homogeneous and high-gradient dose regions passed a 2%/2 mm gamma index comparison for 97% of the points. In conclusion, the generic modeling method proposed for scanned ion beam delivery systems was applicable to an IBA proton therapy system. The key advantage of the method is that it only requires BDL measurements of the system. The validation tests performed so far demonstrated that the beam model achieves clinical performance, paving the way for further studies toward TPS benchmarking. The method involves new sources that are available in the new Gate release V6.1 and could be further applied to other particle therapy systems delivering protons or other types of ions like carbon.
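
    The 2%/2 mm gamma-index comparison reported above can be sketched in one dimension: for every reference point, take the minimum of the combined dose-difference/distance-to-agreement metric over the evaluated curve. This is a brute-force illustration on synthetic curves, not the evaluation tooling used in the study.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.02, dta_mm=2.0):
    """Global 1D gamma index: minimum combined dose/distance metric per point."""
    norm = dose_tol * d_ref.max()          # global dose criterion (2% of max)
    out = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        metric = ((x_eval - xr) / dta_mm) ** 2 + ((d_eval - dr) / norm) ** 2
        out[i] = np.sqrt(metric.min())
    return out

x = np.linspace(0.0, 100.0, 501)                          # position (mm)
reference = np.exp(-((x - 60.0) ** 2) / (2 * 15.0 ** 2))  # synthetic dose curve
evaluated = np.exp(-((x - 60.5) ** 2) / (2 * 15.0 ** 2))  # 0.5 mm shifted copy
gamma = gamma_1d(x, reference, x, evaluated)
pass_rate = 100.0 * float(np.mean(gamma <= 1.0))
print(f"gamma(2%/2mm) pass rate: {pass_rate:.1f}%")
```

    A pure 0.5 mm shift passes everywhere because the distance criterion absorbs it; a 3% dose scaling in a flat region would fail, which is why both criteria are combined.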

  19. Transport calculation of medium-energy protons and neutrons by Monte Carlo method

    International Nuclear Information System (INIS)

    Ban, Syuuichi; Hirayama, Hideo; Katoh, Kazuaki.

    1978-09-01

    A Monte Carlo transport code, ARIES, has been developed for protons and neutrons at medium energy (25-500 MeV). Nuclear data provided by R.G. Alsmiller, Jr. were used for the calculation. To simulate the cascade development in the medium, each generation was represented by a single weighted particle, with the average number of emitted particles used as the weight. Neutron fluxes were scored with the collision-density method. The cutoff energy was set to 25 MeV. Neutrons below the cutoff were stored for use as the source for a low-energy neutron transport calculation based on the discrete ordinates method. Transport calculations were then performed for both low-energy neutrons (thermal-25 MeV) and secondary gamma rays. Energy spectra of emitted neutrons were calculated and compared with published experimental and calculated results. The agreement was good for incident particle energies between 100 and 500 MeV. (author)
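
    The single-weighted-particle trick described above can be sketched for a 1D slab: at each collision the particle's weight is multiplied by the mean secondary multiplicity, and the flux is tallied with a collision-density estimator. The cross section, multiplicity, and slab thickness are arbitrary placeholders, not ARIES data.

```python
import math
import random

def simulate_flux(n_histories=20_000, sigma_t=0.2, mean_multiplicity=0.8,
                  thickness=30.0, seed=1, w_cutoff=1e-4):
    """Weighted single-particle cascade in a 1D slab (illustrative sketch).
    Each collision multiplies the weight by the mean secondary multiplicity;
    the scalar flux is tallied with the collision-density estimator w/sigma_t."""
    rng = random.Random(seed)
    flux = 0.0
    for _ in range(n_histories):
        x, w = 0.0, 1.0
        while w > w_cutoff:
            x += -math.log(1.0 - rng.random()) / sigma_t  # exponential free flight
            if x > thickness:
                break                                     # leaks out of the slab
            flux += w / sigma_t                           # collision-density tally
            w *= mean_multiplicity                        # next weighted generation
    return flux / n_histories

flux = simulate_flux()
print(f"volume-integrated flux per source particle: {flux:.2f}")
```

    Carrying one weighted particle per generation keeps the history count fixed while still reproducing the mean cascade population, at the cost of losing generation-to-generation fluctuations.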

  20. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z [Reading Hospital, West Reading, PA (United States); Gao, M [ProCure Treatment Centers, Warrenville, IL (United States)]

    2014-06-01

    Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to a few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4 based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bits, Amazon EC2). Single spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of StarCluster software developed at MIT, a Linux cluster with 2–100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm{sup 2}, 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and worker nodes were created as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.
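
    The cost and statistics arithmetic behind such a cluster is simple to sketch. The instance prices below are illustrative placeholders chosen to land near the reported $0.63/h figure; they are not quoted AWS rates.

```python
def cluster_cost_per_hour(n_nodes, on_demand=0.087, spot=0.0137):
    """One on-demand master plus (n_nodes - 1) spot workers (placeholder prices)."""
    return on_demand + (n_nodes - 1) * spot

def events_for_uncertainty(rel_uncertainty):
    """Events needed for a target Poisson-type relative uncertainty ~ 1/sqrt(N)."""
    return int(round(1.0 / rel_uncertainty ** 2))

print(f"40 nodes: ${cluster_cost_per_hour(40):.2f}/h, "
      f"100 nodes: ${cluster_cost_per_hour(100):.2f}/h")
print(f"events per scoring region for 2% statistics: {events_for_uncertainty(0.02)}")
```

    The 1/sqrt(N) scaling is why the job is split into many 500k-event chunks: uncertainty is set by the total event count, so embarrassingly parallel workers cut wall-clock time without changing the statistics.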

  1. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Wang, Z; Gao, M

    2014-01-01

    Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to a few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4 based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bits, Amazon EC2). Single spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of StarCluster software developed at MIT, a Linux cluster with 2–100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm{sup 2}, 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and worker nodes were created as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.

  2. Experimental validation of the TOPAS Monte Carlo system for passive scattering proton therapy

    International Nuclear Information System (INIS)

    Testa, M.; Schümann, J.; Lu, H.-M.; Paganetti, H.; Shin, J.; Faddegon, B.; Perl, J.

    2013-01-01

    Purpose: TOPAS (TOol for PArticle Simulation) is a particle simulation code recently developed with the specific aim of making Monte Carlo simulations user-friendly for research and clinical physicists in the particle therapy community. The authors present a thorough and extensive experimental validation of Monte Carlo simulations performed with TOPAS in a variety of setups relevant for proton therapy applications. The set of validation measurements performed in this work represents an overall end-to-end testing strategy recommended for all clinical centers planning to rely on TOPAS for quality assurance or patient dose calculation and, more generally, for all the institutions using passive-scattering proton therapy systems. Methods: The authors systematically compared TOPAS simulations with measurements that are performed routinely within the quality assurance (QA) program at the authors' institution as well as experiments specifically designed for this validation study. First, the authors compared TOPAS simulations with measurements of depth-dose curves for spread-out Bragg peak (SOBP) fields. Second, absolute dosimetry simulations were benchmarked against measured machine output factors (OFs). Third, the authors simulated and measured 2D dose profiles and analyzed the differences in terms of field flatness and symmetry and usable field size. Fourth, the authors designed a simple experiment using a half-beam shifter to assess the effects of multiple Coulomb scattering, beam divergence, and inverse square attenuation on lateral and longitudinal dose profiles measured and simulated in a water phantom. Fifth, TOPAS' capability to simulate time-dependent beam delivery was benchmarked against dose rate functions (i.e., dose per unit time vs time) measured at different depths inside an SOBP field. Sixth, simulations of the charge deposited by protons fully stopping in two different types of multilayer Faraday cups (MLFCs) were compared with measurements to benchmark the

  3. The impact of different Monte Carlo models on the cross section measurement of top-pair production at 7 TeV proton-proton collisions

    International Nuclear Information System (INIS)

    Krause, Claudius

    2010-09-01

    High energy proton-proton collisions lead to a large number of secondary particles to be measured in a detector. A final state containing top quarks is of particular interest, but top quarks are produced in only a small fraction of the collisions. Hence, criteria must be defined to separate events containing top quarks from the background. From detectors, we record signals, for example hits in the tracker system or deposits in the calorimeters. In order to obtain the momentum of the particles, we apply algorithms to reconstruct tracks in space. More sophisticated algorithms are needed to identify the flavour of quarks, such as b-tagging. Several steps are needed to test these algorithms. Collision products of proton-proton events are generated using Monte Carlo techniques and their passage through the detector is simulated. After that, the algorithms are applied and the signal efficiency and the mistagging rate can be obtained. There are, however, many different approaches and algorithms realized in programs, so the question arises whether the choice of the Monte Carlo generator influences the measured quantities. In this thesis, two commonly used Monte Carlo generators, SHERPA and MadGraph/MadEvent, are compared and the differences in the selection efficiency of semimuonic tt events are estimated. In addition, the distributions of kinematic variables are shown. A special chapter about the matching of matrix elements with parton showers is included. The main algorithms, CKKW for SHERPA and MLM for MadGraph/MadEvent, are introduced.

  4. The impact of different Monte Carlo models on the cross section measurement of top-pair production at 7 TeV proton-proton collisions

    Energy Technology Data Exchange (ETDEWEB)

    Krause, Claudius

    2012-04-15

    High energy proton-proton collisions lead to a large number of secondary particles to be measured in a detector. A final state containing top quarks is of particular interest, but top quarks are produced in only a small fraction of the collisions. Hence, criteria must be defined to separate events containing top quarks from the background. From detectors, we record signals, for example hits in the tracker system or deposits in the calorimeters. In order to obtain the momentum of the particles, we apply algorithms to reconstruct tracks in space. More sophisticated algorithms are needed to identify the flavour of quarks, such as b-tagging. Several steps are needed to test these algorithms. Collision products of proton-proton events are generated using Monte Carlo techniques and their passage through the detector is simulated. After that, the algorithms are applied and the signal efficiency and the mistagging rate can be obtained. There are, however, many different approaches and algorithms realized in programs, so the question arises whether the choice of the Monte Carlo generator influences the measured quantities. In this thesis, two commonly used Monte Carlo generators, SHERPA and MadGraph/MadEvent, are compared and the differences in the selection efficiency of semimuonic tt events are estimated. In addition, the distributions of kinematic variables are shown. A special chapter about the matching of matrix elements with parton showers is included. The main algorithms, CKKW for SHERPA and MLM for MadGraph/MadEvent, are introduced.

  5. Use of Monte Carlo software to aid design of a proton therapy nozzle

    Energy Technology Data Exchange (ETDEWEB)

    Swanepoel, M.W. [Medical Radiation Group, iThemba LABS, P.O. Box 22, Somerset West 7129 (South Africa)], E-mail: mark@tlabs.ac.za; Jones, D.T.L. [Medical Radiation Group, iThemba LABS, P.O. Box 22, Somerset West 7129 (South Africa)]

    2007-09-21

    A second proton therapy nozzle is being developed at iThemba LABS to irradiate lesions in the body, thus complementing an existing facility for head and neck treatments. A passive scattering system is being developed, the complexity of which necessitates Monte Carlo simulations. We have used MCNPX to set the apertures and spacing of collimators, to model dose distributions in water, to check and modify beam scattering and energy modulating components, and to check radiation shields. The comprehensive shielding model was adapted for other problems by reducing the types of particles transported, limiting the extent and complexity of the geometry, and where possible killing particles by setting their importance to zero. Our results appear to indicate that the Rossi and Greisen description of multiple Coulomb scattering as used in MCNPX predicts high-Z, large angle scattering acceptably well for modeling proton therapy nozzles. MCNPX is easy to learn and implement, but has disadvantages when used to model therapy nozzles: (1) it does not yet offer a true capability to model electromagnetic interactions, (2) it cannot model moving components, and (3) it uses energy rather than range cut-offs for particles. Hence a GEANT4 model of the new nozzle is also being implemented.

  6. Geant4 Monte Carlo simulation of absorbed dose and radiolysis yields enhancement from a gold nanoparticle under MeV proton irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Tran, H.N., E-mail: tranngochoang@tdt.edu.vn [Division of Nuclear Physics, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Karamitros, M. [Notre Dame Radiation Laboratory, University of Notre-Dame, IN 46556 (United States); Ivanchenko, V.N. [Geant4 Associates International Ltd, Hebden Bridge (United Kingdom); Guatelli, S.; McKinnon, S. [Centre For Medical Radiation Physics, University of Wollongong (Australia); Illawarra Health and Medical Research, University of Wollongong, NSW (Australia); Murakami, K.; Sasaki, T.; Okada, S. [Computing Research Center, High Energy Accelerator Organization, KEK, Tsukuba City (Japan); Bordage, M.C. [INSERM, UMR 1037, CRCT, F-31000 Toulouse (France); Univ. Toulouse III-Paul Sabatier, UMR 1037, CRCT, F-31000 Toulouse (France); Francis, Z. [Saint Joseph University, Faculty of Sciences, Department of Physics, Beirut (Lebanon); El Bitar, Z. [Institut Pluridisciplinaire Hubert Curien/IN2P3/CNRS, Strasbourg (France); Bernal, M.A. [Instituto de Física Gleb Wataghin, Universidade Estadual de Campinas, SP (Brazil); Shin, J.I. [Division of Heavy Ion Clinical Research, Korea Institute of Radiological and Medical Science, 75, Nowon-ro, Nowon-gu, Seoul (Korea, Republic of); Lee, S.B. [Proton Therapy Center, National Cancer Center, 323, Ilsan-ro, Ilsandong-gu, Goyang-si, Gyeonggi-do (Korea, Republic of); Barberet, Ph. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Tran, T.T. [VNUHCM-University of Science (Viet Nam); Brown, J.M.C. [School of Mathematics and Physics, Queen’s University Belfast, Belfast, Northern Ireland (United Kingdom); and others

    2016-04-15

    Gold nanoparticles have been reported as a possible radio-sensitizer agent in radiation therapy due to their ability to increase energy deposition and subsequent direct damage to cells and DNA within their local vicinity. Moreover, this increase in energy deposition also results in an increase in the radiochemical yields. In this work we present, for the first time, an in silico investigation, based on the general purpose Monte Carlo simulation toolkit Geant4, into energy deposition and radical species production around a spherical gold nanoparticle 50 nm in diameter via proton irradiation. Simulations were performed for incident proton energies ranging from 2 to 170 MeV, which are of interest for clinical proton therapy.

  7. Reference dosimetry of proton pencil beams based on dose-area product: a proof of concept.

    Science.gov (United States)

    Gomà, Carles; Safai, Sairos; Vörös, Sándor

    2017-06-21

    This paper describes a novel approach to the reference dosimetry of proton pencil beams based on dose-area product (DAP). It details the calibration of a large-diameter plane-parallel ionization chamber in terms of dose-area product in a 60Co beam, the Monte Carlo calculation of beam quality correction factors (in terms of dose-area product) in proton beams, the Monte Carlo calculation of nuclear halo correction factors, and the experimental determination of the DAP of a single proton pencil beam. This new approach to reference dosimetry proves to be feasible, as it yields DAP values in agreement with the standard and well-established approach of determining the absorbed dose to water at the centre of a broad homogeneous field generated by the superposition of regularly spaced proton pencil beams.
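
    The dose-area product itself is just the dose integrated over the plane perpendicular to the beam; for an idealized Gaussian pencil beam it has the closed form D0 * 2 * pi * sigma^2. The Gaussian shape below is an assumption for the sketch (the real beam carries a nuclear halo, which is exactly why the paper introduces halo correction factors).

```python
import numpy as np

sigma_mm = 5.0   # pencil-beam sigma (mm), illustrative
d0 = 1.0         # central-axis dose (Gy), illustrative

# Numerical dose-area product on a grid wide enough (+-10 sigma) for the tails.
axis = np.linspace(-50.0, 50.0, 1001)
xx, yy = np.meshgrid(axis, axis)
dose = d0 * np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma_mm ** 2))
cell = (axis[1] - axis[0]) ** 2
dap_numeric = float(dose.sum() * cell)

dap_analytic = d0 * 2.0 * np.pi * sigma_mm ** 2   # closed form for a Gaussian
print(f"DAP numeric = {dap_numeric:.2f}, analytic = {dap_analytic:.2f} Gy*mm^2")
```

    Because the DAP integrates the whole lateral profile, a chamber only needs to be large enough to capture the tails, rather than to sample the sharply peaked central-axis dose, which is the motivation for the large-diameter chamber.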

  8. Monte Carlo investigation of the low-dose envelope from scanned proton pencil beams

    International Nuclear Information System (INIS)

    Sawakuchi, Gabriel O; Titt, Uwe; Mirkovic, Dragan; Ciangaru, George; Zhu, X Ronald; Sahoo, Narayan; Gillin, Michael T; Mohan, Radhe

    2010-01-01

    Scanned proton pencil beams carry a low-dose envelope that extends several centimeters from the individual beam's central axis. Thus, the total delivered dose depends on the size of the target volume and the corresponding number and intensity of beams necessary to cover the target volume uniformly. This dependence must be considered in dose calculation algorithms used by treatment planning systems. In this work, we investigated the sources of particles contributing to the low-dose envelope using the Monte Carlo technique. We used a validated model of our institution's scanning beam line to determine the contributions to the low-dose envelope from secondary particles created in a water phantom and particles scattered in beam line components. Our results suggested that, for high-energy beams, secondary particles produced by nuclear interactions in the water phantom are the major contributors to the low-dose envelope. For low-energy beams, the low-dose envelope is dominated by particles undergoing multiple Coulomb scattering in the beam line components and water phantom. Clearly, in the latter situation, the low-dose envelope depends directly on beam line design features. Finally, we investigated the dosimetric consequences of the low-dose envelope. Our results showed that if not modeled properly the low-dose envelope may cause clinically relevant dose disturbance in the target volume. This work suggested that this low-dose envelope is beam line specific for low-energy beams, should be thoroughly experimentally characterized and validated during commissioning of the treatment planning system, and therefore is of great concern for accurate delivery of proton scanning beam doses.

  9. Monte Carlo simulations of the dosimetric impact of radiopaque fiducial markers for proton radiotherapy of the prostate

    Science.gov (United States)

    Newhauser, Wayne; Fontenot, Jonas; Koch, Nicholas; Dong, Lei; Lee, Andrew; Zheng, Yuanshui; Waters, Laurie; Mohan, Radhe

    2007-06-01

    Many clinical studies have demonstrated that implanted radiopaque fiducial markers improve targeting accuracy in external-beam radiotherapy, but little is known about the dose perturbations these markers may cause in patients receiving proton radiotherapy. The objective of this study was to determine what types of implantable markers are visible in setup radiographs and, at the same time, perturb the therapeutic proton dose to the prostate by less than 10%. The radiographic visibility of the markers was assessed by visual inspection of lateral setup radiographs of a pelvic phantom using a kilovoltage x-ray imaging system. The fiducial-induced perturbations in the proton dose were estimated with Monte Carlo simulations. The influence of marker material, size, placement depth and orientation within the pelvis was examined. The radiographic tests confirmed that gold and stainless steel markers were clearly visible and that titanium markers were not. The Monte Carlo simulations revealed that titanium and stainless steel markers minimally perturbed the proton beam, but gold markers cast unacceptably large dose shadows. A 0.9 mm diameter, 3.1 mm long cylindrical stainless steel marker provides good radiographic visibility yet perturbs the proton dose distribution in the prostate by less than 8% when using a parallel opposed lateral beam arrangement.

  10. SU-F-T-146: Comparing Monte Carlo Simulations with Commissioning Beam Data for Mevion S250 Proton Therapy System

    Energy Technology Data Exchange (ETDEWEB)

    Prusator, M; Jin, H; Ahmad, S; Chen, Y [University of Oklahoma Health Sciences Center, Oklahoma City, OK (United States)]

    2016-06-15

    Purpose: To evaluate the Monte Carlo simulated beam data against the measured commissioning data for the Mevion S250 proton therapy system. Method: The Mevion S250 proton therapy system utilizes a passive double-scattering technique with a unique gantry-mounted superconducting accelerator and offers effective proton therapy in a compact design. The field shaping system (FSS) includes a first scattering foil, range modulator wheel (RMW), second scattering foil and post absorber, and offers two field sizes and a total of 24 treatment options covering proton ranges from 5 cm to 32 cm. The treatment nozzle was modeled in detail using the TOPAS (TOol for PArticle Simulation) Monte Carlo code. The timing features of the moving modulator wheels were also implemented to generate the Spread Out Bragg Peak (SOBP). The simulation results, including pristine Bragg peaks, SOBPs and dose profiles, were compared with the data measured during beam commissioning. Results: The comparison between the measured data and the simulation data shows excellent agreement. For pristine Bragg peaks, the simulated proton range (depth of distal 90%) values agreed with the measured range values to within 1 mm. The differences in the distal falloffs (depth from distal 80% to 20%) between simulations and measurements were also found to be less than 1 mm. For the SOBPs, the modulation widths (depth of proximal 95% to distal 90%) were found to agree with the measurements within 1 mm. The flatness of the simulated and measured lateral profiles was found to be 0.6% and 1.1%, respectively. Conclusion: The agreement between simulations and measurements demonstrates that TOPAS can be used as a viable platform for proton therapy applications. The matched simulation results offer a valuable tool and open opportunities for a variety of applications.
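
    The range and falloff metrics quoted above (depth of distal 90%, distal 80% to 20% falloff) are straightforward to extract from a depth-dose curve by interpolation on the distal edge. The Bragg-like curve below is an analytic stand-in, not Mevion commissioning data.

```python
import numpy as np

def distal_depth(depth, dose, level):
    """Depth on the distal falloff where the dose crosses `level` * max dose,
    found by linear interpolation beyond the dose maximum."""
    i_max = int(np.argmax(dose))
    d, z = dose[i_max:], depth[i_max:]
    target = level * dose.max()
    j = int(np.argmax(d < target))          # first distal sample below target
    frac = (d[j - 1] - target) / (d[j - 1] - d[j])
    return z[j - 1] + frac * (z[j] - z[j - 1])

z = np.linspace(0.0, 200.0, 2001)           # depth (mm), 0.1 mm grid
# Idealized Bragg-like curve: rising proximal plateau, Gaussian distal edge.
dose = np.where(z <= 150.0,
                0.3 + 0.7 * np.exp(-((z - 150.0) ** 2) / (2 * 30.0 ** 2)),
                np.exp(-((z - 150.0) ** 2) / (2 * 4.0 ** 2)))

r90 = distal_depth(z, dose, 0.90)
falloff_80_20 = distal_depth(z, dose, 0.20) - distal_depth(z, dose, 0.80)
print(f"range (distal 90%) = {r90:.2f} mm, 80-20% falloff = {falloff_80_20:.2f} mm")
```

    Applying the same extraction to both the simulated and the measured curves turns the "within 1 mm" statements above into a direct numerical comparison.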

  11. Prediction of production of {sup 22}Na in a gas-cell target irradiated by protons using Monte Carlo tracking

    Energy Technology Data Exchange (ETDEWEB)

    Eslami, M., E-mail: mohammad.eslami25@yahoo.com [Department of Physics, Faculty of Science, University of Zanjan, Zengan (Zanjan) (Iran, Islamic Republic of); Kakavand, T. [Department of Physics, Faculty of Science, University of Zanjan, Zengan (Zanjan) (Iran, Islamic Republic of); Department of Physics, Faculty of Science, Imam Khomeini International University, Qazvin (Iran, Islamic Republic of); Mirzaii, M.; Rajabifar, S. [Agricultural, Medical and Industrial Research School, Nuclear Science and Technology Research Institute, AEOI, Karaj (Iran, Islamic Republic of)

    2015-01-01

    Highlights: • Angular distribution of the proton beam in a gaseous environment. • Particle energy distribution profile and proton flux within the gas-cell target with MCNPX. • Detection of the residual nuclei produced during the nuclear reactions. • Estimation of the production yields of the {sup 22,nat}Ne(p,x){sup 22}Na reactions. - Abstract: The {sup 22}Ne(p,n){sup 22}Na reaction is an optimal route for the cyclotron production of {sup 22}Na. This work models the proton-induced production of {sup 22}Na in a gas-cell target, containing natural and enriched neon gas, using the Monte Carlo method. The excitation functions of the reactions are calculated with both the TALYS-1.6 and ALICE/ASH codes, and the optimum projectile energy range for high-yield production is then selected. A free gaseous neon environment at a given pressure and temperature is defined, and the proton beam is transported through it using the Monte Carlo codes MCNPX and SRIM. The beam monitoring performed with each of these codes indicates that the gas-cell has to be designed as a conical frustum to achieve the desired interactions. MCNPX is also employed to calculate the proton energy distribution in the designed target and to estimate the residual nuclei produced during irradiation. The production yields of {sup 22}Na in the {sup 22}Ne(p,n){sup 22}Na and {sup nat}Ne(p,x){sup 22}Na reactions are estimated and show good agreement with experimental results. The results demonstrate that the Monte Carlo method provides a useful means of designing and optimizing gas targets, as well as calibrating detectors, for radionuclide production purposes.
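The production yield estimated above follows from integrating the excitation function over the proton's slowing-down path in the target. A hedged numeric sketch, with an invented cross-section shape and stopping-power curve standing in for the TALYS/ALICE and SRIM outputs:

```python
import numpy as np

N_A = 6.022e23        # atoms/mol
A_NE22 = 22.0         # g/mol, for {sup 22}Ne

# Hypothetical tabulated data (illustrative only, not code output):
E = np.linspace(4.0, 12.0, 81)                    # proton energy, MeV
sigma = 0.2 * np.exp(-((E - 8.0) / 2.0) ** 2)     # cross section, barn
S = 40.0 * (E / 8.0) ** -0.8                      # mass stopping power, MeV cm^2/g

# Thick-target yield per incident proton:
#   Y = (N_A / A) * integral of sigma(E) / S(E) dE, with sigma in cm^2
integrand = sigma * 1e-24 / S                     # g/MeV
Y = (N_A / A_NE22) * float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(E)))
print(f"~{Y:.2e} 22Na atoms per incident proton")
```

Multiplying by the beam current (protons per second) and the decay-saturation factor converts this to an activity yield.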

  12. Research on perturbation based Monte Carlo reactor criticality search

    International Nuclear Information System (INIS)

    Li Zeguang; Wang Kan; Li Yangliu; Deng Jingkang

    2013-01-01

    Criticality search is a very important aspect of reactor physics analysis. Due to the advantages of the Monte Carlo method and the development of computer technologies, Monte Carlo criticality search is becoming more and more necessary and feasible. The traditional Monte Carlo criticality search method suffers from the large number of individual criticality runs required and from the uncertainty and fluctuation of Monte Carlo results. A new Monte Carlo criticality search method based on perturbation calculation is put forward in this paper to overcome the disadvantages of the traditional method. Using only one criticality run to obtain the initial k_eff and the differential coefficients of the concerned parameter, a polynomial estimator of the k_eff response function is solved to obtain the critical value of the concerned parameter. The feasibility of this method was tested. The results show that the accuracy and efficiency of the perturbation-based criticality search method are quite promising and that the method overcomes the disadvantages of the traditional one. (authors)
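The search described here reduces to solving a polynomial estimator of k_eff for the parameter value giving k_eff = 1. A minimal illustration with invented tally output, taking a hypothetical soluble-boron concentration as the concerned parameter:

```python
import numpy as np

# Hypothetical output of a single criticality run: k_eff at the reference
# concentration c0, plus first and second derivatives of k_eff w.r.t. c
# obtained from perturbation tallies.
c0, k0 = 600.0, 1.0250           # ppm, reference k_eff
dk_dc = -8.0e-5                   # 1/ppm
d2k_dc2 = 1.0e-8                  # 1/ppm^2

# Second-order estimator k(c) = k0 + k'(c-c0) + k''(c-c0)^2 / 2;
# solve k(c) = 1 and keep the root closest to the reference point.
coeffs = [0.5 * d2k_dc2, dk_dc, k0 - 1.0]        # polynomial in (c - c0)
roots = np.roots(coeffs)
dc = min((r.real for r in roots if abs(r.imag) < 1e-12), key=abs)
print(f"critical boron concentration ~ {c0 + dc:.0f} ppm")
```

A single follow-up criticality run at the predicted value can then confirm (or refine) the estimate, instead of the long trial-and-error sequence of the traditional method.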

  13. Computational details of the Monte Carlo simulation of proton and electron tracks

    International Nuclear Information System (INIS)

    Zaider, M.; Brenner, D.J.

    1983-01-01

    The code PROTON simulates the elastic and nonelastic interactions of protons and electrons in water vapor. In this paper, the treatment of elastic angular scattering of electrons as utilized in PROTON is described and compared with alternate formalisms. The sensitivity of the calculation to different treatments of this process is examined in terms of proximity functions of energy deposition. 5 figures

  14. Monte Carlo based diffusion coefficients for LMFBR analysis

    International Nuclear Information System (INIS)

    Van Rooijen, Willem F.G.; Takeda, Toshikazu; Hazama, Taira

    2010-01-01

    A method based on Monte Carlo calculations is developed to estimate the diffusion coefficient of unit cells. The method uses a geometrical model similar to that used in lattice theory, but does not rely on the assumption of a separable fundamental mode used in lattice theory. The method uses standard Monte Carlo flux and current tallies, and the continuous energy Monte Carlo code MVP was used without modifications. Four models are presented to derive the diffusion coefficient from tally results of flux and partial currents. In this paper the method is applied to the calculation of a plate cell of the fast-spectrum critical facility ZEBRA. Conventional calculations of the diffusion coefficient diverge in the presence of planar voids in the lattice, but our Monte Carlo method can treat this situation without any problem. The Monte Carlo method was used to investigate the influence of geometrical modeling as well as the directional dependence of the diffusion coefficient. The method can be used to estimate the diffusion coefficient of complicated unit cells, the limitation being the capabilities of the Monte Carlo code. The method will be used in the future to confirm results for the diffusion coefficient obtained with deterministic codes. (author)
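The simplest of the flux/partial-current models amounts to applying Fick's law directly to the tallies. A one-group sketch with invented tally values (not MVP output):

```python
import numpy as np

# Hypothetical Monte Carlo tallies on two neighbouring surfaces of a unit
# cell: scalar flux at each surface and the net current (J+ minus J-)
# between them, in one energy group.
x = np.array([0.0, 1.0])             # surface positions, cm
phi = np.array([1.00, 0.92])         # flux tallies (arbitrary normalization)
J_net = 0.021                         # net current tally between the surfaces

# Fick's law:  J = -D dphi/dx  =>  D = -J / (dphi/dx)
dphi_dx = (phi[1] - phi[0]) / (x[1] - x[0])
D = -J_net / dphi_dx
print(f"D ~ {D:.3f} cm")
```

Evaluating D separately along each lattice axis exposes the directional dependence discussed above; in a voided planar channel the flux gradient tends to zero while the net current does not, which is where conventional definitions diverge.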

  15. Multigroup and coupled forward-adjoint Monte Carlo calculation efficiencies for secondary neutron doses from proton beams

    International Nuclear Information System (INIS)

    Kelsey IV, Charles T.; Prinja, Anil K.

    2011-01-01

    We evaluate the Monte Carlo calculation efficiency for multigroup transport relative to continuous energy transport using the MCNPX code system to evaluate secondary neutron doses from a proton beam. We consider both fully forward simulation and application of a midway forward adjoint coupling method to the problem. Previously we developed tools for building coupled multigroup proton/neutron cross section libraries and showed consistent results for continuous energy and multigroup proton/neutron transport calculations. We observed that forward multigroup transport could be more efficient than continuous energy. Here we quantify solution efficiency differences for a secondary radiation dose problem characteristic of proton beam therapy problems. We begin by comparing figures of merit for forward multigroup and continuous energy MCNPX transport and find that multigroup is 30 times more efficient. Next we evaluate efficiency gains for coupling out-of-beam adjoint solutions with forward in-beam solutions. We use a variation of a midway forward-adjoint coupling method developed by others for neutral particle transport. Our implementation makes use of the surface source feature in MCNPX and we use spherical harmonic expansions for coupling in angle rather than solid angle binning. The adjoint out-of-beam transport for organs of concern in a phantom or patient can be coupled with numerous forward, continuous energy or multigroup, in-beam perturbations of a therapy beam line configuration. Out-of-beam dose solutions are provided without repeating out-of-beam transport. (author)
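The efficiency comparison above rests on the standard Monte Carlo figure of merit, FOM = 1/(R^2 T). A tiny sketch with invented run statistics chosen to reproduce the 30x gain reported:

```python
# Figure of merit: FOM = 1 / (R^2 * T), with R the relative error of the
# tally and T the computing time. Numbers below are invented; only the
# ratio matters.
def fom(rel_err, minutes):
    return 1.0 / (rel_err ** 2 * minutes)

fom_ce = fom(0.010, 120.0)   # continuous energy run
fom_mg = fom(0.010, 4.0)     # multigroup run, same error in less time
print(f"efficiency gain ~ {fom_mg / fom_ce:.0f}x")
```

Because R^2 scales as 1/T for a well-behaved tally, the FOM is roughly machine-time independent, which is why it is the fair basis for comparing the two transport modes.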

  16. Evaluation of ion chamber dependent correction factors for ionisation chamber dosimetry in proton beams using a Monte Carlo method

    International Nuclear Information System (INIS)

    Palmans, H.; Verhaegen, F.

    1995-01-01

    In the last decade, several clinical proton beam therapy facilities have been developed. To satisfy the demand for uniformity in clinical (routine) proton beam dosimetry two dosimetry protocols (ECHED and AAPM) have been published. Both protocols neglect the influence of ion chamber dependent parameters on dose determination in proton beams because of the scatter properties of these beams, although the problem has not yet been studied thoroughly. A comparison between water calorimetry and ionisation chamber dosimetry showed a discrepancy of 2.6% between the former method and ionometry following the ECHED protocol. Possibly, a small part of this difference can be attributed to chamber dependent correction factors. Indications for this possibility are found in ionometry measurements. To allow the simulation of complex geometries with different media, necessary for the study of those corrections, an existing proton Monte Carlo code (PTRAN, Berger) has been modified. The original code, which applies Molière's multiple scattering theory and Vavilov's energy straggling theory, calculates depth-dose profiles, energy distributions and radial distributions for pencil beams in water. Comparisons with measurements and calculations reported in the literature are done to test the program's accuracy. Preliminary results of the influence of chamber design and chamber materials on dose-to-water determination are presented.

  17. Evaluation of ion chamber dependent correction factors for ionisation chamber dosimetry in proton beams using a Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Palmans, H [Ghent Univ. (Belgium). Dept. of Biomedical Physics; Verhaegen, F

    1995-12-01

    In the last decade, several clinical proton beam therapy facilities have been developed. To satisfy the demand for uniformity in clinical (routine) proton beam dosimetry two dosimetry protocols (ECHED and AAPM) have been published. Both protocols neglect the influence of ion chamber dependent parameters on dose determination in proton beams because of the scatter properties of these beams, although the problem has not yet been studied thoroughly. A comparison between water calorimetry and ionisation chamber dosimetry showed a discrepancy of 2.6% between the former method and ionometry following the ECHED protocol. Possibly, a small part of this difference can be attributed to chamber dependent correction factors. Indications for this possibility are found in ionometry measurements. To allow the simulation of complex geometries with different media, necessary for the study of those corrections, an existing proton Monte Carlo code (PTRAN, Berger) has been modified. The original code, which applies Molière's multiple scattering theory and Vavilov's energy straggling theory, calculates depth-dose profiles, energy distributions and radial distributions for pencil beams in water. Comparisons with measurements and calculations reported in the literature are done to test the program's accuracy. Preliminary results of the influence of chamber design and chamber materials on dose-to-water determination are presented.

  18. Experimental and Monte Carlo studies of fluence corrections for graphite calorimetry in low- and high-energy clinical proton beams

    International Nuclear Information System (INIS)

    Lourenço, Ana; Thomas, Russell; Bouchard, Hugo; Kacperek, Andrzej; Vondracek, Vladimir; Royle, Gary; Palmans, Hugo

    2016-01-01

    Purpose: The aim of this study was to determine fluence corrections necessary to convert absorbed dose to graphite, measured by graphite calorimetry, to absorbed dose to water. Fluence corrections were obtained from experiments and Monte Carlo simulations in low- and high-energy proton beams. Methods: Fluence corrections were calculated to account for the difference in fluence between water and graphite at equivalent depths. Measurements were performed with narrow proton beams. Plane-parallel-plate ionization chambers with a large collecting area compared to the beam diameter were used to intercept the whole beam. High- and low-energy proton beams were provided by a scanning and double scattering delivery system, respectively. A mathematical formalism was established to relate fluence corrections derived from Monte Carlo simulations, using the FLUKA code [A. Ferrari et al., “FLUKA: A multi-particle transport code,” in CERN 2005-10, INFN/TC 05/11, SLAC-R-773 (2005) and T. T. Böhlen et al., “The FLUKA Code: Developments and challenges for high energy and medical applications,” Nucl. Data Sheets 120, 211–214 (2014)], to partial fluence corrections measured experimentally. Results: A good agreement was found between the partial fluence corrections derived by Monte Carlo simulations and those determined experimentally. For a high-energy beam of 180 MeV, the fluence corrections from Monte Carlo simulations were found to increase from 0.99 to 1.04 with depth. In the case of a low-energy beam of 60 MeV, the magnitude of fluence corrections was approximately 0.99 at all depths when calculated in the sensitive area of the chamber used in the experiments. Fluence correction calculations were also performed for a larger area and found to increase from 0.99 at the surface to 1.01 at greater depths. Conclusions: Fluence corrections obtained experimentally are partial fluence corrections because they account for differences in the primary and part of the secondary
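The fluence correction compares particle fluence in water and graphite at equivalent depths. A minimal numeric sketch of that ratio, with invented fluence-vs-depth curves and an assumed water-equivalent ratio (not the FLUKA results of this study):

```python
import numpy as np

# Hypothetical primary-fluence curves in water and graphite (arbitrary
# units); graphite depth is mapped to water-equivalent depth with an
# assumed range ratio.
depth_w = np.linspace(0.0, 20.0, 201)            # cm in water
phi_w = 1.0 - 0.002 * depth_w                     # fluence in water
depth_g = np.linspace(0.0, 12.0, 121)            # cm in graphite
phi_g = 1.0 - 0.0035 * depth_g                    # fluence in graphite
wer = 1.70                                        # water-equivalent ratio (assumed)

# Fluence correction at equivalent depths:
#   k_fl(d) = phi_water(d) / phi_graphite(d / wer)
d = np.linspace(0.0, 11.0, 50)                    # water-equivalent depths, cm
k_fl = np.interp(d, depth_w, phi_w) / np.interp(d / wer, depth_g, phi_g)
print(f"k_fl ranges from {k_fl.min():.3f} to {k_fl.max():.3f}")
```

A depth-dependent correction of this form is what converts calorimeter-measured dose to graphite into dose to water at the equivalent depth.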

  19. Experimental and Monte Carlo studies of fluence corrections for graphite calorimetry in low- and high-energy clinical proton beams

    Energy Technology Data Exchange (ETDEWEB)

    Lourenço, Ana, E-mail: am.lourenco@ucl.ac.uk [Department of Medical Physics and Biomedical Engineering, University College London, London WC1E 6BT, United Kingdom and Division of Acoustics and Ionising Radiation, National Physical Laboratory, Teddington TW11 0LW (United Kingdom); Thomas, Russell; Bouchard, Hugo [Division of Acoustics and Ionising Radiation, National Physical Laboratory, Teddington TW11 0LW (United Kingdom); Kacperek, Andrzej [National Eye Proton Therapy Centre, Clatterbridge Cancer Centre, Wirral CH63 4JY (United Kingdom); Vondracek, Vladimir [Proton Therapy Center, Budinova 1a, Prague 8 CZ-180 00 (Czech Republic); Royle, Gary [Department of Medical Physics and Biomedical Engineering, University College London, London WC1E 6BT (United Kingdom); Palmans, Hugo [Division of Acoustics and Ionising Radiation, National Physical Laboratory, Teddington TW11 0LW, United Kingdom and Medical Physics Group, EBG MedAustron GmbH, A-2700 Wiener Neustadt (Austria)

    2016-07-15

    Purpose: The aim of this study was to determine fluence corrections necessary to convert absorbed dose to graphite, measured by graphite calorimetry, to absorbed dose to water. Fluence corrections were obtained from experiments and Monte Carlo simulations in low- and high-energy proton beams. Methods: Fluence corrections were calculated to account for the difference in fluence between water and graphite at equivalent depths. Measurements were performed with narrow proton beams. Plane-parallel-plate ionization chambers with a large collecting area compared to the beam diameter were used to intercept the whole beam. High- and low-energy proton beams were provided by a scanning and double scattering delivery system, respectively. A mathematical formalism was established to relate fluence corrections derived from Monte Carlo simulations, using the FLUKA code [A. Ferrari et al., “FLUKA: A multi-particle transport code,” in CERN 2005-10, INFN/TC 05/11, SLAC-R-773 (2005) and T. T. Böhlen et al., “The FLUKA Code: Developments and challenges for high energy and medical applications,” Nucl. Data Sheets 120, 211–214 (2014)], to partial fluence corrections measured experimentally. Results: A good agreement was found between the partial fluence corrections derived by Monte Carlo simulations and those determined experimentally. For a high-energy beam of 180 MeV, the fluence corrections from Monte Carlo simulations were found to increase from 0.99 to 1.04 with depth. In the case of a low-energy beam of 60 MeV, the magnitude of fluence corrections was approximately 0.99 at all depths when calculated in the sensitive area of the chamber used in the experiments. Fluence correction calculations were also performed for a larger area and found to increase from 0.99 at the surface to 1.01 at greater depths. Conclusions: Fluence corrections obtained experimentally are partial fluence corrections because they account for differences in the primary and part of the secondary

  20. Comparison of Monte Carlo simulations with proton experiment for a thick Au absorber

    International Nuclear Information System (INIS)

    Yevseyeva, Olga; Assis, Joaquim T. de; Diaz, Katherin S.; Hormaza, Joel M.; Lopes, Ricardo T.

    2009-01-01

    Proton therapy applications deal with relatively thick targets like the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 could lead to significant disagreement in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents a comparison of proton energy spectra for 49.1 MeV protons passing through Au absorbers of different thicknesses, obtained from GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models. The comparison was made with the experimental data of Tschalaer, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution of the transport equation in the Fokker-Planck approximation. It is shown that the simulations reproduce the experimental spectra with some detectable contradictions. It should be noted that all the spectra lie at proton energies significantly above 2 MeV, i.e. in the so-called 'Bethe-Bloch region'. Therefore, the observed disagreements among the GEANT4 results simulated with different models are somewhat unexpected. Further studies are necessary for a better understanding and to reach definitive conclusions. (author)

  1. New estimation of secondary particle multiplicity of nuclear interactions in proton therapy using multicollisional plus evaporation Monte Carlo calculations

    International Nuclear Information System (INIS)

    Mesa, J.; Rodrigues, T. E.; Garcia-Trapaga, C. E.; Arruda-Neto, J. D. T.; Shtejer, K., E-mail: jmesa@ibb.unesp.br

    2007-01-01

    Secondary particles contribute to the dose deposited in critical organs outside the irradiated target volume. However, the literature specifically addressing neutron dose and other secondary particles from proton therapy is limited. This issue is of special relevance for young patients, particularly when life expectancy is long, fundamentally if we consider that the art of cancer treatment is finding the right balance between tumor control and injury to normal tissues. In this work we have obtained spectra and multiplicities for neutrons and other secondary particles emitted in the proton reactions p + {sup 12}C, p + {sup 16}O, p + {sup 40}Ca and p + {sup 14}N, for proton energies from 100 to 200 MeV. For this purpose, we have used a quite sophisticated multicollisional Monte Carlo code (MCMC) for pre-equilibrium emission, plus de-excitation of the residual nucleus through two channels: evaporation of particles (mainly nucleons, but also composites) and possibly fission in the case of heavy residues. The code was developed in our group, with very recent improvements that take Pauli-blocking effects into account in a novel and more precise way, as well as a more rigorous energy balance, an energy stopping-time criterion for pre-equilibrium emission, and the inclusion of deuteron, triton and {sup 3}He emissions in the evaporation step

  2. Calculation of primary and secondary dose in proton therapy of brain tumors using Monte Carlo method

    International Nuclear Information System (INIS)

    Moghbel Esfahani, F.; Alamatsaz, M.; Karimian, A.

    2012-01-01

    High-energy proton beams offer significant advantages for the treatment of deep-seated local tumors. Their physical depth-dose distribution in tissue is characterized by a small entrance dose and a distinct maximum - Bragg peak - near the end of range with a sharp falloff at the distal edge. Therefore, research must be done to investigate the possible negative and positive effects of using proton therapy as a treatment modality. In proton therapy, protons account for the vast majority of the dose. However, when protons travel through matter, secondary particles are created by the interactions of protons with matter en route to and within the patient. It is believed that secondary dose can lead to secondary cancer, especially in pediatric cases. Therefore, the focus of this work is determining both primary and secondary dose. Dose calculations were performed with MCNPX in tumoral and healthy parts of the brain. The brain tumor has a 10 mm diameter and is located 16 cm under the skin surface. The brain was simulated by a cylindrical water phantom with dimensions of 19 x 19 cm{sup 2} (length x diameter) and a 0.5 cm thickness of plexiglass (C{sub 4}H{sub 6}O{sub 2}). Beam characteristics were then investigated to ensure the accuracy of the model. Simulations were initially validated against packages such as SRIM/TRIM. Dose calculations were performed using different configurations to evaluate depth-dose profiles and 2D dose distributions. The results of the simulation show that the best proton energy interval to completely cover the brain tumor is from 152 to 154 MeV. (authors)
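The reported 152-154 MeV window for a tumor 16 cm deep is consistent with the Bragg-Kleeman range rule for protons in water. A quick check, using commonly quoted fit constants (assumed here, not taken from this study):

```python
# Bragg-Kleeman rule R = alpha * E^p for protons in water, with commonly
# quoted fit values alpha ~ 0.0022 cm/MeV^p and p ~ 1.77 (assumed).
ALPHA, P = 0.0022, 1.77

def csda_range_cm(E_MeV):
    return ALPHA * E_MeV ** P

for E in (150, 152, 154, 156):
    print(f"{E} MeV -> R ~ {csda_range_cm(E):.1f} cm")
```

With these constants, 152 MeV lands the Bragg peak at roughly 16 cm and 154 MeV about 4 mm deeper, bracketing a 10 mm tumor centered at that depth.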

  3. A combined molecular dynamics and Monte Carlo simulation of the spatial distribution of energy deposition by proton beams in liquid water

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Molina, Rafael [Departamento de Fisica, Centro de Investigacion en Optica y Nanofisica (CIOyN), Universidad de Murcia, E-30100 Murcia (Spain); Abril, Isabel [Departament de Fisica Aplicada, Universitat d' Alacant, E-03080 Alacant (Spain); Heredia-Avalos, Santiago [Departament de Fisica, Enginyeria de Sistemes i Teoria del Senyal, Universitat d' Alacant, E-03080 Alacant (Spain); Kyriakou, Ioanna; Emfietzoglou, Dimitris, E-mail: rgm@um.es [Medical Physics Laboratory, University of Ioannina Medical School, GR-45110 Ioannina (Greece)

    2011-10-07

    We have evaluated the spatial distribution of energy deposition by proton beams in liquid water using the simulation code SEICS (Simulation of Energetic Ions and Clusters through Solids), which combines molecular dynamics and Monte Carlo techniques and includes the main interaction phenomena between the projectile and the target constituents: (i) the electronic stopping force due to energy loss to target electronic excitations, including fluctuations due to the energy-loss straggling, (ii) the elastic scattering with the target nuclei, with their corresponding energy loss and (iii) the dynamical changes in projectile charge state due to electronic capture and loss processes. An important feature of SEICS is the accurate account of the excitation spectrum of liquid water, based on a consistent solid-state description of its energy-loss-function over the whole energy and momentum space. We analyse how the above-mentioned interactions affect the depth distribution of the energy delivered in liquid water by proton beams with incident energies of the order of several MeV. Our simulations show that the position of the Bragg peak is determined mainly by the stopping power, whereas its width can be attributed to the energy-loss straggling. Multiple elastic scattering processes contribute slightly only at the distal part of the Bragg peak. The charge state of the projectiles only changes when approaching the end of their trajectories, i.e. near the Bragg peak. We have also simulated the proton-beam energy distribution at several depths in the liquid water target, and found that it is determined mainly by the fluctuation in the energy loss of the projectile, evaluated through the energy-loss straggling. We conclude that a proper description of the target excitation spectrum as well as the inclusion of the energy-loss straggling is essential in the calculation of the proton beam depth-dose distribution.
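The role that energy-loss straggling plays in the Bragg-peak width, as described above, can be seen even in a drastically simplified condensed-history transport loop. A hedged sketch (not SEICS; stopping power from an assumed Bragg-Kleeman fit, straggling as a per-step Gaussian):

```python
import math
import random

ALPHA, P = 0.0022, 1.77                 # Bragg-Kleeman fit for water, assumed

def stopping_power(E):
    """Approximate stopping power in MeV/cm from dE/dR of R = ALPHA * E^P."""
    return 1.0 / (ALPHA * P * E ** (P - 1.0))

def stop_depth(E0, dx=0.05, strag=0.05):
    """Transport one proton in fixed steps with Gaussian energy-loss straggling."""
    E, x = E0, 0.0
    while E > 1.0:                       # crude low-energy cutoff
        dE = stopping_power(E) * dx
        E -= random.gauss(dE, strag * dE)  # straggling around the mean loss
        x += dx
    return x

random.seed(1)
depths = [stop_depth(50.0) for _ in range(200)]   # 50 MeV protons
mean = sum(depths) / len(depths)
sd = math.sqrt(sum((d - mean) ** 2 for d in depths) / len(depths))
print(f"mean range {mean:.2f} cm, spread {sd:.2f} cm")
```

The mean stopping depth tracks the stopping power (the Bragg-peak position), while the spread of stopping depths, driven entirely by the straggling term, is what broadens the peak, mirroring the conclusions of the abstract.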

  4. A combined molecular dynamics and Monte Carlo simulation of the spatial distribution of energy deposition by proton beams in liquid water

    International Nuclear Information System (INIS)

    Garcia-Molina, Rafael; Abril, Isabel; Heredia-Avalos, Santiago; Kyriakou, Ioanna; Emfietzoglou, Dimitris

    2011-01-01

    We have evaluated the spatial distribution of energy deposition by proton beams in liquid water using the simulation code SEICS (Simulation of Energetic Ions and Clusters through Solids), which combines molecular dynamics and Monte Carlo techniques and includes the main interaction phenomena between the projectile and the target constituents: (i) the electronic stopping force due to energy loss to target electronic excitations, including fluctuations due to the energy-loss straggling, (ii) the elastic scattering with the target nuclei, with their corresponding energy loss and (iii) the dynamical changes in projectile charge state due to electronic capture and loss processes. An important feature of SEICS is the accurate account of the excitation spectrum of liquid water, based on a consistent solid-state description of its energy-loss-function over the whole energy and momentum space. We analyse how the above-mentioned interactions affect the depth distribution of the energy delivered in liquid water by proton beams with incident energies of the order of several MeV. Our simulations show that the position of the Bragg peak is determined mainly by the stopping power, whereas its width can be attributed to the energy-loss straggling. Multiple elastic scattering processes contribute slightly only at the distal part of the Bragg peak. The charge state of the projectiles only changes when approaching the end of their trajectories, i.e. near the Bragg peak. We have also simulated the proton-beam energy distribution at several depths in the liquid water target, and found that it is determined mainly by the fluctuation in the energy loss of the projectile, evaluated through the energy-loss straggling. We conclude that a proper description of the target excitation spectrum as well as the inclusion of the energy-loss straggling is essential in the calculation of the proton beam depth-dose distribution.

  5. Integration and evaluation of automated Monte Carlo simulations in the clinical practice of scanned proton and carbon ion beam therapy.

    Science.gov (United States)

    Bauer, J; Sommerer, F; Mairani, A; Unholtz, D; Farook, R; Handrack, J; Frey, K; Marcelos, T; Tessonnier, T; Ecker, S; Ackermann, B; Ellerbrock, M; Debus, J; Parodi, K

    2014-08-21

    Monte Carlo (MC) simulations of beam interaction and transport in matter are increasingly considered essential tools to support several aspects of radiation therapy. Despite the vast application of MC to photon therapy and scattered proton therapy, clinical experience in scanned ion beam therapy is still scarce. This is especially the case for ions heavier than protons, which pose additional issues like nuclear fragmentation and varying biological effectiveness. In this work, we present the evaluation of a dedicated framework which has been developed at the Heidelberg Ion Beam Therapy Center to provide automated FLUKA MC simulations of clinical patient treatments with scanned proton and carbon ion beams. Investigations on the number of transported primaries and the dimension of the geometry and scoring grids have been performed for a representative class of patient cases in order to provide recommendations on the simulation settings, showing that recommendations derived from the experience in proton therapy cannot be directly translated to the case of carbon ion beams. The MC results with the optimized settings have been compared to the calculations of the analytical treatment planning system (TPS), showing that, regardless of the consistency of the two systems (in terms of beam model in water and range calculation in different materials), relevant differences can be found in dosimetric quantities and range, especially in the case of heterogeneous and deep-seated treatment sites, depending on the ion beam species and energies, homogeneity of the traversed tissue and size of the treated volume. The analysis of typical TPS speed-up approximations highlighted effects which deserve accurate treatment, in contrast to adequate beam model simplifications for scanned ion beam therapy. In terms of biological dose calculations, the investigation of the mixed field components in realistic anatomical situations confirmed the findings of previous groups so far reported only in

  6. submitter Proton therapy treatment monitoring with the DoPET system: activity range, positron emitters evaluation and comparison with Monte Carlo predictions

    CERN Document Server

    Muraro, S; Belcari, N; Bisogni, M G; Camarlinghi, N; Cristoforetti, L; Guerra, A Del; Ferrari, A; Fracchiolla, F; Morrocchi, M; Righetto, R; Sala, P; Schwarz, M; Sportelli, G; Topi, A; Rosso, V

    2017-01-01

    Ion beam irradiations can deliver conformal dose distributions minimizing damage to healthy tissues thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainties: a critical issue is the treatment verification. During the treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used to perform the treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate the radiation transport and interaction with matter. In this work, FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered on phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two plana...
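The activity signal acquired by the PET detector decays between beam-off and acquisition according to the half-lives of the beta+ emitters produced. A small sketch of that time dependence, using standard half-lives for the two dominant emitters in tissue; the initial emitter populations are invented for illustration:

```python
import math

# Main beta+ emitters produced in tissue: half-lives are standard data
# ({sup 11}C ~ 20.36 min, {sup 15}O ~ 2.04 min); initial numbers are invented.
emitters = {"C-11": (20.36 * 60.0, 1.0e8), "O-15": (2.04 * 60.0, 4.0e7)}

def activity(t_s):
    """Total activity in Bq at time t_s seconds after the end of irradiation."""
    a = 0.0
    for half_life, n0 in emitters.values():
        lam = math.log(2.0) / half_life
        a += lam * n0 * math.exp(-lam * t_s)
    return a

print(f"t=0: {activity(0.0):.3e} Bq, t=5 min: {activity(300.0):.3e} Bq")
```

Because the short-lived oxygen component washes out within minutes, the acquisition start time changes which isotopes dominate the measured distribution, and a faithful expected-activity map from the MC simulation must account for this.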

  7. A Simulation Study for Radiation Treatment Planning Based on the Atomic Physics of the Proton-Boron Fusion Reaction

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sunmi; Yoon, Do-Kun; Shin, Han-Back; Jung, Joo-Young; Kim, Moo-Sub; Kim, Kyeong-Hyeon; Jang, Hong-Seok; Suh, Tae Suk [the Catholic University of Korea, Seoul (Korea, Republic of)

    2017-03-15

    The purpose of this research is to demonstrate, based on a Monte Carlo simulation code, the radiation treatment planning procedure for proton-boron fusion therapy (PBFT). A discrete proton beam (60 - 120 MeV) relevant to the Bragg peak was simulated using the Monte Carlo N-Particle eXtended (MCNPX, Ver. 2.6.0, Los Alamos National Laboratory, Los Alamos, NM, USA) simulation code. After computed tomography (CT) scanning of a virtual water phantom including air cavities, the acquired CT images were converted for use with the simulation source code. We set boron uptake regions (BURs) in the simulated water phantom to achieve the proton-boron fusion reaction. Proton sources irradiated the BURs in the phantom. The acquired dose maps were overlaid on the original CT image of the phantom to analyze the dose-volume histogram (DVH). We successfully confirmed amplification of the proton dose (average: 130%) at the target regions. From the DVH result for each simulation, we acquired a relatively accurate dose map for the treatment. A simulation was conducted to characterize the dose distribution and verify the feasibility of PBFT. We observed the variation in proton range and developed a tumor-targeting technique for treatment that is more accurate and powerful than both conventional proton therapy and boron neutron capture therapy.
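A cumulative DVH of the kind analyzed above is simply the fraction of target voxels receiving at least each dose level. A minimal sketch on an invented dose array (not the MCNPX dose maps of this study):

```python
import numpy as np

# Hypothetical per-voxel dose for a boron-uptake target region.
rng = np.random.default_rng(0)
target_dose = rng.normal(60.0, 3.0, size=10_000)   # Gy-equivalent, invented

# Cumulative DVH: % of volume receiving at least dose b.
bins = np.linspace(0.0, 80.0, 161)
dvh = np.array([(target_dose >= b).mean() * 100.0 for b in bins])

# D95: the dose covering 95% of the target volume (first bin where the
# covered fraction drops to or below 95%).
d95 = bins[np.searchsorted(-dvh, -95.0)]
print(f"D95 ~ {d95:.1f} Gy-eq")
```

Coverage statistics such as D95, and the dose amplification inside the BURs relative to the surrounding tissue, are the quantities one would read off such a curve.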

  8. Monte Carlo FLUKA code simulation for study of {sup 68}Ga production by direct proton-induced reaction

    Energy Technology Data Exchange (ETDEWEB)

    Mokhtari Oranj, Leila; Kakavand, Tayeb [Physics Faculty, Zanjan University, P.O. Box 451-313, Zanjan (Iran, Islamic Republic of); Sadeghi, Mahdi, E-mail: msadeghi@nrcam.org [Agricultural, Medical and Industrial Research School, Nuclear Science and Technology Research Institute, P.O. Box 31485-498, Karaj (Iran, Islamic Republic of); Aboudzadeh Rovias, Mohammadreza [Agricultural, Medical and Industrial Research School, Nuclear Science and Technology Research Institute, P.O. Box 31485-498, Karaj (Iran, Islamic Republic of)

    2012-06-11

    {sup 68}Ga is an important radionuclide for positron emission tomography. {sup 68}Ga can be produced by the {sup 68}Zn(p,n){sup 68}Ga reaction in common biomedical cyclotrons. To facilitate optimization of target design and to study activation of materials, a Monte Carlo code can be used to simulate the irradiation of the target materials with charged hadrons. In this paper, a FLUKA simulation was employed to prototype a Zn target for the production of {sup 68}Ga by proton irradiation. Furthermore, the experimental data were compared with the values estimated for the thick-target yield produced in the irradiation time according to the FLUKA code. In conclusion, the FLUKA code can be used for estimation of the production yield.
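The thick-target yield mentioned above follows from the standard formula Y = N ∫ σ(E)/S(E) dE, with N the target atom density, σ the excitation function and S the stopping power. The sketch below is illustrative only; the cross-section and stopping-power curves are toy stand-ins, not evaluated nuclear data:

```python
import numpy as np

# Hypothetical tabulated data standing in for 68Zn(p,n)68Ga values:
# excitation function sigma(E) [cm^2] and mass stopping power S(E) [MeV cm^2/g].
E = np.linspace(10.0, 30.0, 201)                 # proton energy grid (MeV)
sigma = 8e-25 * np.exp(-((E - 12.0) / 4.0) ** 2) # toy Gaussian excitation function
S = 30.0 * (E / 10.0) ** -0.8                    # toy stopping power

def thick_target_yield(E, sigma, S, atoms_per_gram):
    """Atoms produced per incident proton slowing down through the target:
    Y = N * integral of sigma(E)/S(E) dE, evaluated with the trapezoidal rule."""
    f = sigma / S
    dE = np.diff(E)
    return atoms_per_gram * np.sum(0.5 * (f[1:] + f[:-1]) * dE)

N = 6.022e23 / 68.0   # atoms per gram for a pure (toy) 68Zn target
y = thick_target_yield(E, sigma, S, N)
```

A FLUKA-style calculation effectively performs this integral by transporting protons through the target geometry; the analytic form is useful as a cross-check.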

  9. Continuous energy Monte Carlo method based lattice homogenization

    International Nuclear Information System (INIS)

    Li Mancang; Yao Dong; Wang Kan

    2014-01-01

    Based on the Monte Carlo code MCNP, the continuous energy Monte Carlo multi-group constants generation code MCMC has been developed. The track length scheme has been used as the foundation of cross section generation. The scattering matrix and Legendre components require special techniques, and the scattering event method has been proposed to solve this problem. Three methods have been developed to calculate the diffusion coefficients for diffusion reactor core codes, and the Legendre method has been applied in MCMC. To satisfy equivalence theory, the general equivalence theory (GET) and the superhomogenization method (SPH) have been applied to the Monte Carlo based group constants. The super equivalence method (SPE) has been proposed to improve the equivalence. GET, SPH and SPE have been implemented in MCMC. The numerical results showed that generating the homogenized multi-group constants via the Monte Carlo method overcomes the difficulties in geometry and treats energy as a continuum, thus providing more accurate parameters. Besides, the same code and data library can be used for a wide range of applications due to its versatility. The MCMC scheme can be seen as a potential alternative to the widely used deterministic lattice codes. (authors)
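The track-length scheme for collapsing continuous-energy cross sections to multi-group constants can be illustrated as follows. This is a minimal sketch, not the MCMC implementation; the cross-section function, track data, and group structure are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy continuous-energy track segments: (energy in MeV, track length in cm)
energies = rng.uniform(1e-3, 10.0, 10000)
lengths = rng.exponential(1.0, 10000)

def sigma_of_E(E):
    """Toy continuous-energy cross section (barns), falling with energy."""
    return 2.0 + 1.0 / np.sqrt(E)

group_edges = np.array([1e-3, 0.1, 1.0, 10.0])  # 3 energy groups

# Track-length estimator: the group constant is the track-length-weighted
# average of the continuous-energy cross section over tracks in each group,
# since track length is proportional to the flux contribution.
g = np.digitize(energies, group_edges) - 1
sigma_g = np.array([
    np.sum(lengths[g == i] * sigma_of_E(energies[g == i])) / np.sum(lengths[g == i])
    for i in range(len(group_edges) - 1)
])
```

The real code would accumulate these sums during transport rather than from stored tracks, but the flux-weighted collapse is the same.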

  10. Fluence correction factors for graphite calorimetry in a low-energy clinical proton beam: I. Analytical and Monte Carlo simulations.

    Science.gov (United States)

    Palmans, H; Al-Sulaiti, L; Andreo, P; Shipley, D; Lühr, A; Bassler, N; Martinkovič, J; Dobrovodský, J; Rossomme, S; Thomas, R A S; Kacperek, A

    2013-05-21

    The conversion of absorbed dose-to-graphite in a graphite phantom to absorbed dose-to-water in a water phantom is performed by water to graphite stopping power ratios. If, however, the charged particle fluence is not equal at equivalent depths in graphite and water, a fluence correction factor, kfl, is required as well. This is particularly relevant to the derivation of absorbed dose-to-water, the quantity of interest in radiotherapy, from a measurement of absorbed dose-to-graphite obtained with a graphite calorimeter. In this work, fluence correction factors for the conversion from dose-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general the fluence correction factors are found to be close to unity and the analytical and Monte Carlo codes give consistent values when considering the differences in secondary particle transport. When considering only protons the fluence correction factors are unity at the surface and increase with depth by 0.5% to 1.5% depending on the code. When the fluence of all charged particles is considered, the fluence correction factor is about 0.5% lower than unity at shallow depths predominantly due to the contributions from alpha particles and increases to values above unity near the Bragg peak. Fluence correction factors directly derived from the fluence distributions differential in energy at equivalent depths in water and graphite can be described by kfl = 0.9964 + 0.0024·zw-eq with a relative standard uncertainty of 0.2%. Fluence correction factors derived from a ratio of calculated doses at equivalent depths in water and graphite can be described by kfl = 0.9947 + 0.0024·zw-eq with a relative standard uncertainty of 0.3%. These results are of direct relevance to graphite calorimetry in low-energy protons but given that the fluence
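The two linear fits reported in the abstract above can be written down directly as a worked example (depths in water-equivalent units as given there; validity limited to the 60 MeV conditions studied):

```python
# Fluence correction factor fits quoted in the abstract, as functions of
# water-equivalent depth z_weq. Both share the same slope, so they differ
# by a constant offset of 0.0017.
def kfl_from_fluence(z_weq):
    return 0.9964 + 0.0024 * z_weq

def kfl_from_dose_ratio(z_weq):
    return 0.9947 + 0.0024 * z_weq

for z in (0.0, 1.0, 2.0, 3.0):
    print(z, kfl_from_fluence(z), kfl_from_dose_ratio(z))
```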

  11. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: Predicted PET images on the basis of an analytical filtering approach for proton range verification have been successfully developed and validated using the FLUKA Monte Carlo (MC) code and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against the GATE/GEANT4 Monte Carlo simulation code. Methods: In this study, we performed two experiments for validation of the β+-isotopes predicted by the analytical model with GATE/GEANT4 simulations. The first experiment evaluated the accuracy of predicting β+-yields as a function of irradiated proton energy. In the second experiment, we simulated homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The β+-yield distributions filtered by the analytical model are compared with the MC simulated β+-yields in the proximal and distal fall-off ranges. Results: We compared the filtered and MC simulated β+-yield distributions under different conditions. First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range difference between filtered β+-yields and MC simulated β+-yields at the distal fall-off region is within 1.5 mm for all materials used. The findings validated the usefulness of the analytical filtering model for range verification of proton therapy with GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC simulated β+-yields using the GATE code, especially in the proximal region. This discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies of the distributions between MC-simulated and predicted β+-yields were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using
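The filtering idea — predicting a β+-yield profile by convolving a depth profile with an analytical kernel — can be sketched as follows. The kernel shape, width, and shift used here are invented for illustration; they are not the model's actual filter functions:

```python
import numpy as np

def predict_beta_plus_profile(depth_dose, dz, sigma_mm, shift_mm):
    """Filtering-type prediction sketch: convolve a depth-dose profile with a
    shifted, normalized Gaussian kernel to mimic a dose-to-activity mapping.
    sigma_mm and shift_mm are free parameters, not values from the paper."""
    z = np.arange(-30.0, 30.0 + dz, dz)
    kernel = np.exp(-0.5 * ((z - shift_mm) / sigma_mm) ** 2)
    kernel /= kernel.sum()
    return np.convolve(depth_dose, kernel, mode="same")

dz = 1.0                              # depth bin width (mm)
depth = np.arange(0.0, 200.0, dz)
# Toy depth-dose profile: slowly rising, then a sharp distal falloff
dose = 1.0 + 0.02 * depth
dose[depth > 150] = 0.0
activity = predict_beta_plus_profile(dose, dz, sigma_mm=8.0, shift_mm=-3.0)
```

Comparing the distal falloff position of `activity` against an MC-simulated yield is then the range-verification step described in the abstract.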

  12. An experimental approach to improve the Monte Carlo modelling of offline PET/CT-imaging of positron emitters induced by scanned proton beams

    Science.gov (United States)

    Bauer, J.; Unholtz, D.; Kurz, C.; Parodi, K.

    2013-08-01

    We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation-induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. Therefore, five different materials have been irradiated by monoenergetic proton pencil beams at various energies and the induced β+ activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow modelling of the absolute level of measured β+ activity induced in the investigated

  13. Development of Monte Carlo input code for proton, alpha and heavy ion microdosimetric track structure simulations

    International Nuclear Information System (INIS)

    Douglass, M.; Bezak, E.

    2010-01-01

    Full text: Radiobiology science is important for cancer treatment as it improves our understanding of radiation-induced cell death. Monte Carlo simulations play a crucial role in developing improved knowledge of cellular processes. By modelling the cell response to radiation damage and verifying with experimental data, understanding of cell death through direct radiation hits and bystander effects can be obtained. A Monte Carlo input code was developed using 'Geant4' to simulate cellular level radiation interactions. A physics list which enables physically accurate interactions of heavy ions at energies below 100 eV was implemented. A simple biological cell model was also implemented. Each cell consists of three concentric spheres representing the nucleus, cytoplasm and the membrane. This will enable all critical cell death channels to be investigated (i.e. membrane damage, nucleus/DNA). The current simulation has the ability to predict the positions of ionization events within the individual cell components on a 1 micron scale. We have developed a Geant4 simulation for investigation of radiation damage to cells on a sub-cellular scale (∼1 micron). This code currently records the positions of the ionisation events within the individual components of the cell, enabling a more complete picture of cell death to be developed. The next stage will include expansion of the code to utilise a non-regular cell lattice. (author)
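Binning ionisation event positions into the three-concentric-sphere cell model can be done with a simple radial classification. This is an illustrative sketch only; the radii and event positions are hypothetical, not the paper's geometry:

```python
import numpy as np

def classify_events(points, r_nucleus, r_cytoplasm, r_membrane):
    """Assign ionisation event positions (N x 3 array, microns) to the
    compartment of a three-concentric-sphere cell model at the origin:
    nucleus, cytoplasm, membrane shell, or outside the cell."""
    r = np.linalg.norm(points, axis=1)
    labels = np.full(len(points), "outside", dtype=object)
    labels[r <= r_membrane] = "membrane"
    labels[r <= r_cytoplasm] = "cytoplasm"
    labels[r <= r_nucleus] = "nucleus"
    return labels

# Toy cloud of ionisation events around one cell (positions in microns)
rng = np.random.default_rng(1)
events = rng.normal(0.0, 4.0, size=(1000, 3))
labels = classify_events(events, r_nucleus=3.0, r_cytoplasm=5.0, r_membrane=5.2)
```

Counting labels per compartment then gives the per-channel hit statistics (membrane vs. nucleus/DNA damage) the abstract refers to.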

  14. GPU based Monte Carlo for PET image reconstruction: detector modeling

    International Nuclear Information System (INIS)

    Légrády; Cserkaszky, Á; Lantos, J.; Patay, G.; Bükki, T.

    2011-01-01

    Given the similarities between visible light transport and neutral particle trajectories, Graphical Processing Units (GPUs) are almost dedicated hardware for Monte Carlo (MC) particle transport calculations. A GPU based MC gamma transport code has been developed for iterative Positron Emission Tomography image reconstruction, calculating the projection from unknowns to data at each iteration step while taking into account the full physics of the system. This paper describes the simplified scintillation detector modeling and its effect on convergence. (author)

  15. Monte Carlo evaluation of derivative-based global sensitivity measures

    Energy Technology Data Exchange (ETDEWEB)

    Kucherenko, S. [Centre for Process Systems Engineering, Imperial College London, London SW7 2AZ (United Kingdom)], E-mail: s.kucherenko@ic.ac.uk; Rodriguez-Fernandez, M. [Process Engineering Group, Instituto de Investigaciones Marinas, Spanish Council for Scientific Research (C.S.I.C.), C/ Eduardo Cabello, 6, 36208 Vigo (Spain); Pantelides, C.; Shah, N. [Centre for Process Systems Engineering, Imperial College London, London SW7 2AZ (United Kingdom)

    2009-07-15

    A novel approach for evaluation of derivative-based global sensitivity measures (DGSM) is presented. It is compared with the Morris and the Sobol' sensitivity indices methods. It is shown that there is a link between DGSM and Sobol' sensitivity indices. DGSM are very easy to implement and evaluate numerically. The computational time required for numerical evaluation of DGSM is many orders of magnitude lower than that for estimation of the Sobol' sensitivity indices. It is also lower than that for the Morris method. Efficiencies of Monte Carlo (MC) and quasi-Monte Carlo (QMC) sampling methods for calculation of DGSM are compared. It is shown that the superiority of QMC over MC depends on the problem's effective dimension, which can also be estimated using DGSM.
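A Monte Carlo estimator for DGSM, ν_i = E[(∂f/∂x_i)²] over the unit hypercube, can be sketched with central finite differences. This is illustrative only; the paper's QMC variant and test functions are not reproduced, and the test function here is a toy choice:

```python
import numpy as np

def dgsm(f, dim, n_samples=4096, h=1e-5, rng=None):
    """Monte Carlo estimate of derivative-based global sensitivity measures
    nu_i = E[(df/dx_i)^2], sampling uniformly on [0,1]^dim and approximating
    partial derivatives by central differences with step h."""
    rng = rng or np.random.default_rng(0)
    x = rng.random((n_samples, dim))
    nu = np.zeros(dim)
    for i in range(dim):
        xp, xm = x.copy(), x.copy()
        xp[:, i] += h
        xm[:, i] -= h
        nu[i] = np.mean(((f(xp) - f(xm)) / (2 * h)) ** 2)
    return nu

# Toy additive test function f = x0 + 2*x1 + 3*x2: the derivatives are
# constant, so nu should come out as [1, 4, 9].
f = lambda x: x[:, 0] + 2 * x[:, 1] + 3 * x[:, 2]
nu = dgsm(f, 3)
```

Replacing the pseudo-random sampler with a low-discrepancy sequence gives the QMC variant whose efficiency the paper compares against plain MC.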

  16. Monte Carlo evaluation of derivative-based global sensitivity measures

    International Nuclear Information System (INIS)

    Kucherenko, S.; Rodriguez-Fernandez, M.; Pantelides, C.; Shah, N.

    2009-01-01

    A novel approach for evaluation of derivative-based global sensitivity measures (DGSM) is presented. It is compared with the Morris and the Sobol' sensitivity indices methods. It is shown that there is a link between DGSM and Sobol' sensitivity indices. DGSM are very easy to implement and evaluate numerically. The computational time required for numerical evaluation of DGSM is many orders of magnitude lower than that for estimation of the Sobol' sensitivity indices. It is also lower than that for the Morris method. Efficiencies of Monte Carlo (MC) and quasi-Monte Carlo (QMC) sampling methods for calculation of DGSM are compared. It is shown that the superiority of QMC over MC depends on the problem's effective dimension, which can also be estimated using DGSM.

  17. Parallel proton transfer pathways in aqueous acid-base reactions

    NARCIS (Netherlands)

    Cox, M.J.; Bakker, H.J.

    2008-01-01

    We study the mechanism of proton transfer (PT) between the photoacid 8-hydroxy-1,3, 6-pyrenetrisulfonic acid (HPTS) and the base chloroacetate in aqueous solution. We investigate both proton and deuteron transfer reactions in solutions with base concentrations ranging from 0.25M to 4M. Using

  18. SU-E-T-553: Monte Carlo Calculation of Proton Bragg Peak Displacements in the Presence of Al2O3:C Dosimeters

    Energy Technology Data Exchange (ETDEWEB)

    Young, L; Yang, F [Univ Washington, Seattle, WA (United States)

    2015-06-15

    Purpose: The application of optically stimulated luminescence dosimeters (OSLDs) may be extended to clinical investigations verifying irradiated doses in small animal models. In proton beams, accurate positioning of the Bragg peak is essential for tumor targeting. The purpose of this study was to estimate the displacement of a pristine Bragg peak when an Al2O3:C nanodot (Landauer, Inc.) is placed on the surface of a water phantom and to evaluate corresponding changes in dose. Methods: Clinical proton pencil beam simulations were carried out using TOPAS, a Monte Carlo platform layered on top of GEANT4. Point-shaped beams with no energy spread were modeled for energies of 100 MeV, 150 MeV, 200 MeV, and 250 MeV. Dose scoring for 100,000 particle histories was conducted within a water phantom (20cm × 20cm irradiated area, 40cm depth) with its surface placed 214.5cm away from the source. The modeled nanodot had a 4mm radius and 0.2mm thickness. Results: A comparative analysis of the Monte Carlo depth dose profiles modeled for these proton pencil beams did not demonstrate an energy dependence in the Bragg peak shift. The shifts in Bragg peak depth for water phantoms modeled with a nanodot on the phantom surface ranged from 2.7 to 3.2 mm. In all cases, the Bragg peaks were shifted closer to the irradiation source. The peak dose in phantoms with an OSLD remained unchanged, with percent dose differences less than 0.55% when compared to phantom doses without the nanodot. Conclusion: Monte Carlo calculations show that the presence of OSLD nanodots in proton beam therapy will not change the position of a pristine Bragg peak by more than 3 mm. Although the 3.0 mm shift will not have a detrimental effect in patients receiving proton therapy, this effect may not be negligible in dose verification measurements for mouse models at lower proton beam energies.
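Extracting a Bragg-peak depth shift from two depth-dose curves, as done in this comparison, can be sketched as follows. Toy Gaussian peaks stand in for the TOPAS profiles; the 3 mm shift is chosen only to mirror the reported magnitude:

```python
import numpy as np

def bragg_peak_depth(depths, dose):
    """Depth of maximum dose, refined with a parabolic fit through the peak
    bin and its neighbours to get sub-grid resolution."""
    i = int(np.argmax(dose))
    pos = float(i)
    if 0 < i < len(dose) - 1:
        y0, y1, y2 = dose[i - 1], dose[i], dose[i + 1]
        denom = y0 - 2 * y1 + y2
        if denom != 0:
            pos = i + 0.5 * (y0 - y2) / denom
    return np.interp(pos, np.arange(len(depths)), depths)

# Toy profiles: an absorber on the phantom surface shifts the peak proximally
z = np.linspace(0.0, 100.0, 1001)                      # depth (mm)
pristine = np.exp(-0.5 * ((z - 76.0) / 3.0) ** 2)
with_osld = np.exp(-0.5 * ((z - 73.0) / 3.0) ** 2)     # 3 mm proximal shift
shift = bragg_peak_depth(z, pristine) - bragg_peak_depth(z, with_osld)
```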

  19. Monte Carlo patient study on the comparison of prompt gamma and PET imaging for range verification in proton therapy

    Science.gov (United States)

    Moteabbed, M.; España, S.; Paganetti, H.

    2011-02-01

    The purpose of this work was to compare the clinical adaptation of prompt gamma (PG) imaging and positron emission tomography (PET) as independent tools for non-invasive proton beam range verification and treatment validation. The PG range correlation and its differences with PET have been modeled for the first time in a highly heterogeneous tissue environment, using different field sizes and configurations. Four patients with different tumor locations (head and neck, prostate, spine and abdomen) were chosen to compare the site-specific behaviors of the PG and PET images, using both passive scattered and pencil beam fields. Accurate reconstruction of dose, PG and PET distributions was achieved by using the planning computed tomography (CT) image in a validated GEANT4-based Monte Carlo code capable of modeling the treatment nozzle and patient anatomy in detail. The physical and biological washout phenomenon and decay half-lives for PET activity for the most abundant isotopes such as 11C, 15O, 13N, 30P and 38K were taken into account in the data analysis. The attenuation of the gamma signal after traversing the patient geometry and respective detection efficiencies were estimated for both methods to ensure proper comparison. The projected dose, PG and PET profiles along many lines in the beam direction were analyzed to investigate the correlation consistency across the beam width. For all subjects, the PG method showed on average approximately 10 times higher gamma production rates than the PET method before, and 60 to 80 times higher production after including the washout correction and acquisition time delay. This rate strongly depended on tissue density and elemental composition. For broad passive scattered fields, it was demonstrated that large differences exist between PG and PET signal falloff positions and the correlation with the dose distribution for different lines in the beam direction. These variations also depended on the treatment site and the

  20. SU-F-T-155: Validation of a Commercial Monte Carlo Dose Calculation Algorithm for Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Saini, J; Wong, T [SCCA Proton Therapy Center, Seattle, WA (United States); St James, S; Stewart, R; Bloch, C [University of Washington, Seattle, WA (United States); Traneus, E [Raysearch Laboratories AB, Stockholm. (Sweden)

    2016-06-15

    Purpose: Compare proton pencil beam scanning dose measurements to GATE/GEANT4 (GMC) and RayStation™ Monte Carlo (RMC) simulations. Methods: Proton pencil beam models of the IBA gantry at the Seattle Proton Therapy Center were developed in the GMC code system and a research build of the RMC. For RMC, a preliminary beam model that does not account for the upstream halo was used. Depth dose and lateral profiles are compared for the RMC, GMC and a RayStation™ pencil beam dose (RPB) model for three spread-out Bragg peaks (SOBPs) in a homogeneous water phantom. SOBP comparisons were also made among the three models for a phantom with (i) a 2 cm bone insert and (ii) a 0.5 cm titanium insert. Results: Measurements and GMC estimates of R80 range agree to within 1 mm, and the mean point-to-point dose difference is within 1.2% for all integrated depth dose (IDD) profiles. The dose differences at the peak are 1 to 2%. All of the simulated spot sigmas are within 0.15 mm of the measured values. For the three SOBPs considered, the maximum R80 deviation from measurement was −0.35 mm for GMC, 0.5 mm for RMC, and −0.1 mm for RPB. The minimum gamma pass rate using the 3%/3mm criterion for all the profiles was 94%. The dose comparison for heterogeneous inserts in low dose gradient regions showed dose differences greater than 10% between RPB and GMC at the distal edge of the interface. The RMC showed improvement and agreed with GMC to within 7%. Conclusion: The RPB dosimetry shows clinically significant differences (> 10%) from GMC and RMC estimates. The RMC algorithm is superior to the RPB dosimetry in heterogeneous media. We suspect modelling of the beam's halo may be responsible for a portion of the remaining discrepancy and that RayStation will reduce this discrepancy as they finalize the release. Erik Traneus is employed as a Research Scientist at RaySearch Laboratories. The research build of the RayStation TPS used in the study was made available to the SCCA free of charge. RaySearch did not provide
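The 3%/3mm gamma pass rate quoted above can be computed, in 1D and in its simplest global form, like this. This is an illustrative sketch, not the evaluation software used in the study; the profiles are toy Gaussians:

```python
import numpy as np

def gamma_pass_rate(z, ref, eval_, dose_tol=0.03, dist_tol=3.0):
    """1D global gamma analysis: for each reference point, minimise the
    combined dose-difference/distance-to-agreement metric over all
    evaluated points; a point passes if its gamma is <= 1."""
    d_norm = dose_tol * ref.max()     # global dose normalization
    gammas = np.empty(len(z))
    for k, (zk, dk) in enumerate(zip(z, ref)):
        g2 = ((z - zk) / dist_tol) ** 2 + ((eval_ - dk) / d_norm) ** 2
        gammas[k] = np.sqrt(g2.min())
    return np.mean(gammas <= 1.0)

z = np.linspace(0.0, 100.0, 501)                       # position (mm)
ref = np.exp(-0.5 * ((z - 60.0) / 10.0) ** 2)          # reference profile
shifted = np.exp(-0.5 * ((z - 61.0) / 10.0) ** 2)      # 1 mm offset: passes
rate = gamma_pass_rate(z, ref, shifted)
```

Clinical tools add interpolation, local-dose options, and low-dose thresholds, but the pass/fail criterion is the one shown here.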

  1. SU-E-T-243: Monte Carlo Simulation Study of Polymer and Radiochromic Gel for Three-Dimensional Proton Dose Distribution

    International Nuclear Information System (INIS)

    Park, M; Jung, H; Kim, G; Ji, Y; Kim, K; Park, S

    2014-01-01

    Purpose: To estimate the three-dimensional dose distributions in a polymer gel and a radiochromic gel exposed to proton beams, by comparison with a virtual water phantom, applying Monte Carlo simulation. Methods: The polymer gel dosimeter is a composite material of gelatin, methacrylic acid, hydroquinone, tetrakis, and distilled water. The radiochromic gel is a PRESAGE product. The densities of the polymer and radiochromic gels were 1.040 and 1.0005 g/cm3, respectively. The water phantom was a hexahedron with the size of 13 × 13 × 15 cm3. Proton beam energies of 72 and 116 MeV were used in the simulation. The proton beam was directed at the top of the phantom along the Z-axis, and the beam cross section was a 10 × 10 cm2 square. The percent depth dose and the dose distribution were evaluated to estimate the dose distribution of protons in the two gel dosimeters, and compared with the virtual water phantom. Results: The Bragg peak for protons in the two gel dosimeters was similar to that in the virtual water phantom. The Bragg peak regions of the polymer gel, radiochromic gel, and virtual water phantom coincided at the same depth (4.3 cm) for the 72 MeV proton beam. For the 116 MeV proton beam, the Bragg peak regions of the polymer gel, radiochromic gel, and virtual water phantom were at 9.9, 9.9 and 9.7 cm, respectively. The dose distribution of protons in the polymer gel, radiochromic gel, and virtual water phantom was approximately identical for both the 72 and 116 MeV energies. The errors for the simulation were under 10%. Conclusion: This work evaluates three-dimensional dose distributions from proton irradiation of polymer and radiochromic gel dosimeters by comparison with a water phantom. The polymer gel and the radiochromic gel dosimeter show similar dose distributions for the proton beams

  2. SU-E-J-82: Intra-Fraction Proton Beam-Range Verification with PET Imaging: Feasibility Studies with Monte Carlo Simulations and Statistical Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Lou, K [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Rice University, Houston, TX (United States); Mirkovic, D; Sun, X; Zhu, X; Poenisch, F; Grosshans, D; Shao, Y [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Clark, J [Rice University, Houston, TX (United States)

    2014-06-01

    Purpose: To study the feasibility of intra-fraction proton beam-range verification with PET imaging. Methods: Two homogeneous cylindrical PMMA phantoms (290 mm axial length; 38 mm and 200 mm diameter, respectively) were studied using PET imaging: a small phantom using a mouse-sized PET scanner (61 mm diameter field of view (FOV)) and a larger phantom using a human brain-sized PET scanner (300 mm FOV). Monte Carlo (MC) simulations (MCNPX and GATE) were used to simulate 179.2 MeV proton pencil beams irradiating the two phantoms and imaged by the two PET systems. A total of 50 simulations were conducted to generate 50 positron activity distributions and, correspondingly, 50 measured activity-ranges. The accuracy and precision of these activity-ranges were calculated under different conditions (including count statistics and other factors, such as crystal cross-section). Separate from the MC simulations, an activity distribution measured from a simulated PET image was modeled as a noiseless positron activity distribution corrupted by Poisson counting noise. The results from these two approaches were compared to assess the impact of count statistics on the accuracy and precision of activity-range calculations. Results: MC simulations show that the accuracy and precision of an activity-range are dominated by the number (N) of coincidence events of the reconstructed image, with uncertainties decreasing as 1/sqrt(N), which can be understood from the statistical modeling. MC simulations also indicate that the coincidence events acquired within the first 60 seconds with 10{sup 9} protons (small phantom) and 10{sup 10} protons (large phantom) are sufficient to achieve both sub-millimeter accuracy and precision. Conclusion: Under the current MC simulation conditions, the initial study indicates that the accuracy and precision of beam-range verification are dominated by count statistics, and intra-fraction PET image-based beam-range verification is
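The statistical-modeling part of the record above — corrupting a noiseless activity profile with Poisson counting noise and watching the range precision improve with the total number of counts — can be sketched as follows. The profile shape, range metric, and count levels are all hypothetical stand-ins:

```python
import numpy as np

rng = np.random.default_rng(2)

def activity_range(z, profile, level=0.5):
    """Depth at which the distal falloff first drops to `level` * maximum."""
    peak = int(np.argmax(profile))
    distal = profile[peak:]
    idx = int(np.argmax(distal <= level * profile.max()))
    return z[peak + idx]

z = np.linspace(0.0, 200.0, 401)                       # depth (mm)
true_activity = np.where(z < 150.0, 1.0 + z / 150.0, 0.0)  # toy activity profile

def range_spread(n_counts, trials=200):
    """Std of the estimated activity-range when the noiseless profile is
    replaced by Poisson draws totalling ~n_counts events."""
    expected = true_activity / true_activity.sum() * n_counts
    ranges = [activity_range(z, rng.poisson(expected).astype(float))
              for _ in range(trials)]
    return float(np.std(ranges))

spread_lo, spread_hi = range_spread(1e3), range_spread(1e5)
```

With two orders of magnitude more counts the range estimate tightens sharply, consistent with the 1/sqrt(N) behaviour the abstract reports.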

  3. SU-F-T-157: Physics Considerations Regarding Dosimetric Accuracy of Analytical Dose Calculations for Small Field Proton Therapy: A Monte Carlo Study

    Energy Technology Data Exchange (ETDEWEB)

    Geng, C [Massachusetts General Hospital, Boston, MA (United States); Nanjing University of Aeronautics and Astronautics, Nanjing (China); Daartz, J; Cheung, K; Bussiere, M; Shih, H; Paganetti, H; Schuemann, J [Massachusetts General Hospital, Boston, MA (United States)

    2016-06-15

    Purpose: To evaluate the accuracy of dose calculations by analytical dose calculation methods (ADC) for small field proton therapy in a gantry based passive scattering facility. Methods: 50 patients with intra-cranial disease were evaluated in the study. Treatment plans followed standard prescription and optimization procedures of proton stereotactic radiosurgery. Dose distributions calculated with the Monte Carlo (MC) toolkit TOPAS were used to represent delivered treatments. The MC dose was first adjusted using the output factor (OF) applied clinically. This factor is determined from the field size and the prescribed range. We then introduced a normalization factor to measure the difference in mean dose between the delivered dose (MC dose with OF) and the dose calculated by ADC for each beam. The normalization was determined from the mean dose of the center voxels of the target area. We compared delivered dose distributions and those calculated by ADC in terms of dose volume histogram parameters and beam range distributions. Results: The mean target dose for a whole treatment generally agrees within 5% between the delivered dose (MC dose with OF) and the ADC dose. However, the differences can be as great as 11% for a shallow and small target treated with a thick range compensator. Applying the normalization factor to the MC dose with OF can reduce the mean dose difference to less than 3%. Considering range uncertainties, the generally applied margins (3.5% of the prescribed range + 1mm) to cover uncertainties in range might not be sufficient to guarantee tumor coverage. The range difference for R90 (90% distal dose falloff) is affected by multiple factors, such as the heterogeneity index. Conclusion: This study indicates insufficient accuracy when calculating proton doses using ADC. Our results suggest that uncertainties of target doses are reduced using MC techniques, improving the dosimetric accuracy for proton stereotactic radiosurgery.
The work was supported by NIH/NCI under CA
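The per-beam normalization described in Methods — a ratio of mean doses over the central target voxels — can be sketched as follows. The dose grids and the 8% offset are invented for illustration; this is not the authors' code:

```python
import numpy as np

def beam_normalization_factor(mc_dose, adc_dose, center_mask):
    """Per-beam normalization: ratio of the mean ADC dose to the mean
    delivered (MC x output-factor) dose over the central target voxels."""
    return np.mean(adc_dose[center_mask]) / np.mean(mc_dose[center_mask])

# Toy 3D dose grids where the delivered MC dose is 8% hotter than ADC
# in the target centre
adc = np.ones((32, 32, 32))
mc = np.ones((32, 32, 32))
center = np.zeros(adc.shape, dtype=bool)
center[12:20, 12:20, 12:20] = True
mc[center] = 1.08

factor = beam_normalization_factor(mc, adc, center)
renormalized = mc * factor   # brings the central mean doses into agreement
```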

  4. Proton conduction based on intracrystalline chemical reaction

    International Nuclear Information System (INIS)

    Schuck, G.; Lechner, R.E.; Langer, K.

    2002-01-01

    Proton conductivity in M{sub 3}H(SeO{sub 4}){sub 2} crystals (M=K, Rb, Cs) is shown to be due to a dynamic disorder in the form of an intracrystalline chemical equilibrium reaction: alternation between the association of the monomers [HSeO{sub 4}]{sup 1-} and [SeO{sub 4}]{sup 2-} resulting in the dimer [H(SeO{sub 4}){sub 2}]{sup 3-} (H-bond formation) and the dissociation of the latter into the two monomers (H-bond breaking). By a combination of quasielastic neutron scattering and FTIR spectroscopy, reaction rates were obtained, as well as rates of proton exchange between selenate ions, leading to diffusion. The results demonstrate that this reaction plays a central role in the mechanism of proton transport in these solid-state protonic conductors. (orig.)

  5. Simulation study of proton inelastic interaction with nuclei in the 50 to 350 MeV range by Monte Carlo Method

    International Nuclear Information System (INIS)

    Peres, J.C.

    1982-11-01

    This study constitutes a contribution to the simulation of proton-nucleus inelastic interactions. Experimental evidence of deuteron, triton, {sup 3}He and alpha clusters in the nucleus led us to include them in the intranuclear cascade. We use a Fermi-type distribution of nucleons; knowledge of each fundamental phenomenon allowed us to follow every particle moving in the medium. The inelastic interaction simulation was performed using a Monte Carlo method. [fr]

  6. Monte Carlo based radial shield design of typical PWR reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gul, Anas; Khan, Rustam; Qureshi, M. Ayub; Azeem, Muhammad Waqar; Raza, S.A. [Pakistan Institute of Engineering and Applied Sciences, Islamabad (Pakistan). Dept. of Nuclear Engineering; Stummer, Thomas [Technische Univ. Wien (Austria). Atominst.

    2016-11-15

    Neutron and gamma flux and dose equivalent rate distributions are analysed in the radial shields of a typical PWR type reactor based on the Monte Carlo radiation transport computer code MCNP5. The ENDF/B-VI continuous energy cross-section library has been employed for the criticality and shielding analysis. The computed results are in good agreement with the reference results (maximum difference is less than 56 %). This implies that MCNP5 is a good tool for accurate prediction of neutron and gamma flux and dose rates in the radial shield around the core of PWR type reactors.

  7. The optimal balance between quality and efficiency in proton radiography imaging technique at various proton beam energies : A Monte Carlo study

    NARCIS (Netherlands)

    Biegun, A K; van Goethem, M-J; van der Graaf, E R; van Beuzekom, M; Koffeman, E N; Nakaji, T; Takatsu, J; Visser, J; Brandenburg, S

    Proton radiography is a novel imaging modality that allows direct measurement of the proton energy loss in various tissues. Currently, due to the conversion of so-called Hounsfield units from X-ray Computed Tomography (CT) into relative proton stopping powers (RPSP), the uncertainties of RPSP are

  8. SU-F-T-152: Experimental Validation and Calculation Benchmark for a Commercial Monte Carlo Pencil Beam Scanning Proton Therapy Treatment Planning System in Heterogeneous Media

    Energy Technology Data Exchange (ETDEWEB)

    Lin, L; Huang, S; Kang, M; Ainsley, C; Simone, C; McDonough, J; Solberg, T [University of Pennsylvania, Philadelphia, PA (United States)

    2016-06-15

    Purpose: Eclipse AcurosPT 13.7, the first commercial Monte Carlo pencil beam scanning (PBS) proton therapy treatment planning system (TPS), was experimentally validated for an IBA dedicated PBS nozzle in the CIRS 002LFC thoracic phantom. Methods: A two-stage procedure involving the use of TOPAS 1.3 simulations was performed. First, Geant4-based TOPAS simulations in this phantom were experimentally validated for single- and multi-spot profiles at several depths for 100, 115, 150, 180, 210 and 225 MeV proton beams, using the combination of a Lynx scintillation detector and a MatriXXPT ionization chamber array. Second, benchmark calculations were performed with both AcurosPT and TOPAS in a phantom identical to the CIRS 002LFC, except that the CIRS bone/mediastinum/lung tissues were replaced with similar tissues predefined in AcurosPT (a limitation of this system that necessitates the two-stage procedure). Results: Spot sigmas measured in tissue agreed within 0.2 mm with TOPAS simulations for all six energies, while AcurosPT was consistently found to have larger spot sigmas (<0.7 mm) than TOPAS. Using absolute dose calibration by MatriXXPT, the agreement between profile measurements, TOPAS simulations, and the calculation benchmarks was over 97%, except near the end of range, using 2 mm/2% gamma criteria. Overdosing and underdosing were observed at the low- and high-density sides of tissue interfaces, respectively, and these increased with increasing depth and decreasing energy. Near the mediastinum/lung interface, the magnitude can exceed 5 mm/10%. Furthermore, we observed a >5% quenching effect in the conversion of Lynx measurements to dose. Conclusion: We recommend the use of an ionization chamber array in combination with the scintillation detector to measure absolute dose and relative PBS spot characteristics. We also recommend the use of an independent Monte Carlo calculation benchmark for the commissioning of a commercial TPS. Partially

  9. Proton irradiation effects on gallium nitride-based devices

    Science.gov (United States)

    Karmarkar, Aditya P.

    Proton radiation effects on state-of-the-art gallium nitride-based devices were studied using Schottky diodes and high electron-mobility transistors. The device degradation was studied over a wide range of proton fluences. This study allowed for a correlation between proton irradiation effects between different types of devices and enhanced the understanding of the mechanisms responsible for radiation damage in GaN-based devices. Proton irradiation causes reduced carrier concentration and increased series resistance and ideality factor in Schottky diodes. 1.0-MeV protons cause greater degradation than 1.8-MeV protons because of their higher non-ionizing energy loss. The displacement damage in Schottky diodes recovers during annealing. High electron-mobility transistors exhibit extremely high radiation tolerance, continuing to perform up to a fluence of ~10^14 cm^-2 of 1.8-MeV protons. Proton irradiation creates defect complexes in the thin-film structure. Decreased sheet carrier mobility due to increased carrier scattering and decreased sheet carrier density due to carrier removal by the defect centers are the primary damage mechanisms. Interface disorder at either the Schottky or the Ohmic contact plays a relatively unimportant part in overall device degradation in both Schottky diodes and high electron-mobility transistors.

  10. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    Science.gov (United States)

    Kostyuchenko, V. I.; Makarova, A. S.; Ryazantsev, O. B.; Samarin, S. I.; Uglov, A. S.

    2014-06-01

    A great breakthrough in proton therapy has happened in the new century: several tens of dedicated centers are now operated throughout the world and their number increases every year. An important component of proton therapy is a treatment planning system. To make calculations faster, these systems usually use analytical methods whose reliability and accuracy do not allow the advantages of this treatment modality to be exploited to the full extent. Predictions by the Monte Carlo (MC) method are the "gold" standard for the verification of calculations with these systems. At the Institute of Experimental and Theoretical Physics (ITEP), which is one of the oldest proton therapy centers in the world, an MC code is an integral part of the treatment planning system. This code, called IThMC, was developed by scientists from RFNC-VNIITF (Snezhinsk) under ISTC Project 3563.

  11. Improvements in pencil beam scanning proton therapy dose calculation accuracy in brain tumor cases with a commercial Monte Carlo algorithm.

    Science.gov (United States)

    Widesott, Lamberto; Lorentini, Stefano; Fracchiolla, Francesco; Farace, Paolo; Schwarz, Marco

    2018-05-04

    Purpose: Validation of a commercial Monte Carlo (MC) algorithm (RayStation ver6.0.024) for the treatment of brain tumours with pencil beam scanning (PBS) proton therapy, comparing it via measurements and analytical calculations in clinically realistic scenarios. Methods: For the measurements, a 2D ion chamber array detector (MatriXX PT) was placed underneath the following targets: 1) an anthropomorphic head phantom (with two different thicknesses) and 2) a biological sample (i.e. half a lamb's head). In addition, we compared the MC dose engine against the RayStation pencil beam (PB) algorithm clinically implemented so far, in critical conditions such as superficial targets (i.e. in need of a range shifter), different air gaps and gantry angles to simulate both orthogonal and tangential beam arrangements. For every plan, the PB and MC dose calculations were compared to measurements using a gamma analysis metric (3%, 3 mm). Results: Regarding the head phantom, the gamma passing rate (GPR) was always >96% and on average >99% for the MC algorithm; the PB algorithm had a GPR ≤90% for all the delivery configurations with a single slab (apart from a 95% GPR at gantry 0° with a small air gap), and in the case of two slabs of the head phantom the GPR was >95% only for small air gaps for all three (0°, 45°, and 70°) simulated beam gantry angles. Overall the PB algorithm tends to overestimate the dose to the target (up to 25%) and underestimate the dose to the organs at risk (up to 30%). We found similar results (but slightly worse for the PB algorithm) for the two targets of the lamb's head, where only two beam gantry angles were simulated. Conclusions: Our results suggest that in PBS proton therapy a range shifter (RS) needs to be used with extreme caution when planning the treatment with an analytical algorithm, due to potentially great discrepancies between the planned dose and the dose delivered to the patients, also in the case of brain tumours, where this issue could be underestimated. Our results also
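    Several records in this listing report agreement as a gamma (3%, 3 mm) passing rate. Purely as an illustration (not code from any of the cited works), a minimal 1D global gamma pass rate can be sketched as below; the function name, the 10% low-dose cutoff, and the use of a global dose normalization are assumptions of this sketch:

```python
import numpy as np

def gamma_pass_rate(ref_pos, ref_dose, eval_pos, eval_dose,
                    dose_tol=0.03, dist_tol=3.0, low_cut=0.10):
    """1D global gamma pass rate (percentage of reference points with gamma <= 1).

    dose_tol : dose criterion as a fraction of the maximum reference dose (3%)
    dist_tol : distance-to-agreement criterion, same units as positions (3 mm)
    low_cut  : reference points below this fraction of the max dose are ignored
    """
    d_max = ref_dose.max()
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        if rd < low_cut * d_max:          # skip the low-dose region
            continue
        dose_term = (eval_dose - rd) / (dose_tol * d_max)
        dist_term = (eval_pos - rp) / dist_tol
        # gamma at this reference point: minimum over all evaluated points
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    gammas = np.asarray(gammas)
    return 100.0 * np.mean(gammas <= 1.0)
```

    A profile compared against itself passes at 100%, while a sub-millimetre shift of a smooth profile stays well within the 3 mm criterion; clinical implementations add interpolation between grid points and 2D/3D search, which this sketch omits.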

  12. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
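    The cutpoint method mentioned in this record precomputes a coarse index into the cumulative distribution so that each draw needs only a short linear scan instead of a full sequential search. A minimal illustrative Python sketch follows (function names and the number of cutpoints are assumptions, not taken from DeMarco et al.):

```python
import bisect
import random

def build_cutpoints(probs, m):
    """Precompute the CDF and m cutpoints for cutpoint sampling."""
    cdf, total = [], 0.0
    for p in probs:
        total += p
        cdf.append(total)
    # cut[j] = first index i with cdf[i] > j/m, i.e. where a uniform
    # variate in [j/m, (j+1)/m) could possibly land
    cut = [bisect.bisect_right(cdf, j / m) for j in range(m)]
    return cdf, cut

def sample(cdf, cut, m):
    """Draw one index from the discrete distribution."""
    u = random.random()
    i = cut[int(u * m)]      # jump close to the answer via the cutpoint table
    while cdf[i] < u:        # finish with a (usually very short) linear scan
        i += 1
    return i
```

    With m comparable to the number of outcomes, the expected scan length per draw approaches one step, which is the source of the order-of-magnitude speedup over sequential CDF search reported above.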

  13. Proton therapy treatment monitoring with the DoPET system: activity range, positron emitters evaluation and comparison with Monte Carlo predictions

    Science.gov (United States)

    Muraro, S.; Battistoni, G.; Belcari, N.; Bisogni, M. G.; Camarlinghi, N.; Cristoforetti, L.; Del Guerra, A.; Ferrari, A.; Fracchiolla, F.; Morrocchi, M.; Righetto, R.; Sala, P.; Schwarz, M.; Sportelli, G.; Topi, A.; Rosso, V.

    2017-12-01

    Ion beam irradiations can deliver conformal dose distributions minimizing damage to healthy tissues thanks to their characteristic dose profiles. Nevertheless, the location of the Bragg peak can be affected by different sources of range uncertainties: a critical issue is the treatment verification. During the treatment delivery, nuclear interactions between the ions and the irradiated tissues generate β+ emitters: the detection of this activity signal can be used to perform the treatment monitoring if an expected activity distribution is available for comparison. Monte Carlo (MC) codes are widely used in the particle therapy community to evaluate the radiation transport and interaction with matter. In this work, FLUKA MC code was used to simulate the experimental conditions of irradiations performed at the Proton Therapy Center in Trento (IT). Several mono-energetic pencil beams were delivered on phantoms mimicking human tissues. The activity signals were acquired with a PET system (DoPET) based on two planar heads, and designed to be installed along the beam line to acquire data also during the irradiation. Different acquisitions are analyzed and compared with the MC predictions, with a special focus on validating the PET detectors response for activity range verification.

  14. Clinical dosimetry in photon radiotherapy. A Monte Carlo based investigation

    International Nuclear Information System (INIS)

    Wulff, Joerg

    2010-01-01

    Practical clinical dosimetry is a fundamental step within the radiation therapy process and aims at quantifying the absorbed radiation dose within a 1-2% uncertainty. To achieve this level of accuracy, corrections are needed for the calibrated, air-filled ionization chambers used for dose measurement. The correction procedures are based on the Spencer-Attix cavity theory and are defined in current dosimetry protocols. Energy-dependent corrections for deviations from the calibration beam account for the changed ionization chamber response in the treatment beam. The corrections applied are usually based on semi-analytical models or measurements and are generally hard to determine, since their magnitude is only a few percent or even less. Furthermore, the corrections are defined for fixed geometrical reference conditions and do not apply to the non-reference conditions of modern radiotherapy applications. The stochastic Monte Carlo method for the simulation of radiation transport is becoming a valuable tool in the field of Medical Physics. As a suitable tool for calculating these corrections with high accuracy, the simulations enable the investigation of ionization chambers under various conditions. The aim of this work is the consistent investigation of ionization chamber dosimetry in photon radiation therapy with the use of Monte Carlo methods. Monte Carlo systems now exist which, in principle, enable the accurate calculation of ionization chamber response. Still, their direct use for studies of this type is limited by the long calculation times needed for a meaningful result with a small statistical uncertainty, inherent to every result of a Monte Carlo simulation. Besides heavy use of computer hardware, variance-reduction techniques can be applied to reduce the needed calculation time.
Methods for increasing the efficiency in the results of simulation were developed and incorporated in a modern and established Monte Carlo simulation environment

  15. SU-E-T-521: Investigation of the Uncertainties Involved in Secondary Neutron/gamma Production in Geant4/MCNP6 Monte Carlo Codes for Proton Therapy Application

    International Nuclear Information System (INIS)

    Mirzakhanian, L; Enger, S; Giusti, V

    2015-01-01

    Purpose: A major concern in proton therapy is the production of secondary neutrons causing secondary cancers, especially in young adults and children. The most utilized Monte Carlo codes in proton therapy are Geant4 and MCNP. However, the default versions of Geant4 and MCNP6 do not have suitable cross sections or physical models to properly handle secondary particle production in the proton energy ranges used for therapy. In this study, the default versions of Geant4 and MCNP6 were modified to better handle the production of secondaries by adding the TENDL-2012 cross-section library. Methods: In-water proton depth-dose was measured at The Svedberg Laboratory in Uppsala (Sweden). The proton beam was mono-energetic with a mean energy of 178.25±0.2 MeV. The measurement set-up was simulated with Geant4 version 10.00 (default and modified versions) and MCNP6. Proton depth-dose, primary and secondary particle fluence and neutron equivalent dose were calculated. In the case of Geant4, the secondary particle fluence was filtered by all the physics processes to identify the main process responsible for the difference between the default and modified versions. Results: The proton depth-dose curves and primary proton fluence show a good agreement between both Geant4 versions and MCNP6. With respect to the modified version, default Geant4 underestimates the production of secondary neutrons while overestimating that of gammas. The "ProtonInElastic" process was identified as the process mainly responsible for the difference between the two versions. MCNP6 shows higher neutron production and lower gamma production than both Geant4 versions. Conclusion: Despite the good agreement on the proton depth-dose curve and primary proton fluence, there is a significant discrepancy in secondary neutron production between MCNP6 and both versions of Geant4. Further studies are thus in order to find the possible cause of this discrepancy or more accurate cross-sections/models to handle the nuclear

  16. SU-F-T-682: In-Vivo Simulation of the Relative Biological Effectiveness in Proton Therapy Using a Monte Carlo Method

    International Nuclear Information System (INIS)

    Oesten, H; Loeck, S; Wohlfahrt, P; Helmbrecht, S; Tillner, F; Schuemann, J; Luehr, A

    2016-01-01

    Purpose: In proton therapy, the relative biological effectiveness (RBE) – compared with conventional photon therapy – is routinely set to 1.1. However, experimental in vitro studies indicate evidence for the variability of the RBE. To clarify the impact on patient treatment, investigation of the RBE in a preclinical case study should be performed. Methods: The Monte Carlo software TOPAS was used to simulate the radiation field of an irradiation setup at the experimental beamline of the proton therapy facility (OncoRay) in Dresden, Germany. Simulations were performed on cone beam CT-data (CBCT) of a xenogeneous mouse with an orthotopic lung carcinoma obtained by an in-house developed small animal image-guided radiotherapy device. A homogeneous physical fraction dose of 1.8Gy was prescribed for the contoured tumor volume. Simulated dose and linear energy transfer distributions were used to estimate RBE values in the mouse based on an RBE model by Wedenberg et al. To characterize radiation sensitivity of normal and tumor tissue, α/β-ratios were taken from the literature for NB1RGB (10.1Gy) and human squamous lung cancer (6.2Gy) cell lines, respectively. Results: Good dose coverage of the target volume was achieved with a spread-out Bragg peak (SOBP). The contra-lateral lung was completely spared from receiving radiation. An increase in RBE towards the distal end of the SOBP from 1.07 to 1.35 and from 1.05 to 1.3 was observed when considering normal tissue and tumor, respectively, with the highest RBE values located distal to the target volume. Conclusion: Modeled RBE values simulated on CBCT for experimental preclinical proton therapy varied with tissue type and depth in a mouse and differed therefore from a constant value of 1.1. Further translational work will include, first, conducting preclinical experiments and, second, analogous RBE studies in patients using experimentally verified simulation settings for our clinically used patient-specific beam

  17. SU-F-T-682: In-Vivo Simulation of the Relative Biological Effectiveness in Proton Therapy Using a Monte Carlo Method

    Energy Technology Data Exchange (ETDEWEB)

    Oesten, H [OncoRay - National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universitaet Dresden (Germany); Massachusetts General Hospital, Boston, MA (Germany); Loeck, S; Wohlfahrt, P [OncoRay - National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universitaet Dresden (Germany); Helmbrecht, S [OncoRay - National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universitaet Dresden (Germany); Institute of Radiation Physics, Helmholtz-Zentrum Dresden-Rossendorf (Germany); Tillner, F [OncoRay - National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universitaet Dresden (Germany); Department of Radiation Oncology, University Hospital Carl Gustav Carus, Technische Universitaet Dresden (Germany); Schuemann, J [Massachusetts General Hospital, Boston, MA (United States); Luehr, A [OncoRay - National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universitaet Dresden (Germany); German Cancer Consortium (DKTK), Dresden (Germany); German Cancer Research Center (DKFZ), Heidelberg (Germany)

    2016-06-15

    Purpose: In proton therapy, the relative biological effectiveness (RBE) – compared with conventional photon therapy – is routinely set to 1.1. However, experimental in vitro studies indicate evidence for the variability of the RBE. To clarify the impact on patient treatment, investigation of the RBE in a preclinical case study should be performed. Methods: The Monte Carlo software TOPAS was used to simulate the radiation field of an irradiation setup at the experimental beamline of the proton therapy facility (OncoRay) in Dresden, Germany. Simulations were performed on cone beam CT-data (CBCT) of a xenogeneous mouse with an orthotopic lung carcinoma obtained by an in-house developed small animal image-guided radiotherapy device. A homogeneous physical fraction dose of 1.8Gy was prescribed for the contoured tumor volume. Simulated dose and linear energy transfer distributions were used to estimate RBE values in the mouse based on an RBE model by Wedenberg et al. To characterize radiation sensitivity of normal and tumor tissue, α/β-ratios were taken from the literature for NB1RGB (10.1Gy) and human squamous lung cancer (6.2Gy) cell lines, respectively. Results: Good dose coverage of the target volume was achieved with a spread-out Bragg peak (SOBP). The contra-lateral lung was completely spared from receiving radiation. An increase in RBE towards the distal end of the SOBP from 1.07 to 1.35 and from 1.05 to 1.3 was observed when considering normal tissue and tumor, respectively, with the highest RBE values located distal to the target volume. Conclusion: Modeled RBE values simulated on CBCT for experimental preclinical proton therapy varied with tissue type and depth in a mouse and differed therefore from a constant value of 1.1. Further translational work will include, first, conducting preclinical experiments and, second, analogous RBE studies in patients using experimentally verified simulation settings for our clinically used patient-specific beam
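    The RBE model by Wedenberg et al. referenced in the two records above assumes a linear increase of the proton alpha term with dose-averaged LET while keeping beta unchanged. As a hedged illustration only (the model form alpha_p/alpha_x = 1 + q·LET_d/(α/β)_x with β_p = β_x, and the fitted constant q ≈ 0.434 Gy·µm/keV, are taken from the literature and should be verified against the paper; all names are assumptions of this sketch):

```python
import math

def rbe_wedenberg(dose, let_d, alpha_beta_x, q=0.434):
    """RBE per fraction following a Wedenberg-type model.

    dose         : proton physical dose per fraction [Gy]
    let_d        : dose-averaged LET [keV/um]
    alpha_beta_x : photon (alpha/beta)_x of the tissue [Gy]
    q            : model slope [Gy*um/keV] (assumed fitted value)
    """
    ab = alpha_beta_x
    alpha_ratio = 1.0 + q * let_d / ab            # alpha_p / alpha_x
    # Photon dose D_x with the same biological effect as the proton dose D:
    #   alpha_x*D_x + beta*D_x^2 = alpha_p*D + beta*D^2
    # => D_x^2 + ab*D_x - (ab*alpha_ratio*D + D^2) = 0
    disc = ab * ab + 4.0 * dose * (ab * alpha_ratio + dose)
    d_x = 0.5 * (math.sqrt(disc) - ab)
    return d_x / dose
```

    As a sanity check, the model reduces to RBE = 1 at zero LET, and RBE grows with LET and with decreasing (α/β)_x, consistent with the depth trend (1.07 to 1.35) reported in the abstract.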

  18. An experimental approach to improve the Monte Carlo modelling of offline PET/CT-imaging of positron emitters induced by scanned proton beams

    International Nuclear Information System (INIS)

    Bauer, J; Unholtz, D; Kurz, C; Parodi, K

    2013-01-01

    We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. Therefore, five different materials have been irradiated by monoenergetic proton pencil beams at various energies and the induced β + activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow to model the absolute level of measured β + activity induced in the investigated

  19. Monte-Carlo Modeling of Parameters of a Subcritical Cascade Reactor Based on MSBR and LMFBR Technologies

    CERN Document Server

    Bznuni, S A; Zhamkochyan, V M; Polanski, A; Sosnin, A N; Khudaverdyan, A H

    2001-01-01

    Parameters are investigated of a subcritical cascade reactor driven by a proton accelerator and based on a primary lead-bismuth target, a main reactor constructed analogously to the molten salt breeder (MSBR) reactor core, and a booster-reactor analogous to the core of the BN-350 liquid metal cooled fast breeder reactor (LMFBR). It is shown by means of Monte-Carlo modeling that the reactor under study provides safe operation modes (k_{eff}=0.94-0.98), is capable of effectively transmuting radioactive nuclear waste, and reduces by an order of magnitude the requirements on the accelerator beam current. Calculations show that the maximal neutron flux in the thermal zone is 10^{14} cm^{-2}·s^{-1}, and in the fast booster zone 5.12·10^{15} cm^{-2}·s^{-1}, at k_{eff}=0.98 and proton beam current I=2.1 mA.

  20. Monte-Carlo modeling of parameters of a subcritical cascade reactor based on MSBR and LMFBR technologies

    International Nuclear Information System (INIS)

    Bznuni, S.A.; Zhamkochyan, V.M.; Khudaverdyan, A.G.; Barashenkov, V.S.; Sosnin, A.N.; Polanski, A.

    2001-01-01

    Parameters are investigated of a subcritical cascade reactor driven by a proton accelerator and based on a primary lead-bismuth target, main reactor constructed analogously to the molten salt breeder (MSBR) reactor core and a booster-reactor analogous to the core of the BN-350 liquid metal cooled fast breeder reactor (LMFBR). It is shown by means of Monte-Carlo modeling that the reactor under study provides safe operation modes (k_eff = 0.94 - 0.98), is capable to transmute effectively radioactive nuclear waste and reduces by an order of magnitude the requirements on the accelerator beam current. Calculations show that the maximal neutron flux in the thermal zone is 10^14 cm^-2 · s^-1, in the fast booster zone is 5.12 · 10^15 cm^-2 · s^-1 at k_eff = 0.98 and proton beam current I = 2.1 mA. (author)

  1. GPU-Monte Carlo based fast IMRT plan optimization

    Directory of Open Access Journals (Sweden)

    Yongbao Li

    2014-03-01

    Full Text Available Purpose: Intensity-modulated radiation treatment (IMRT) plan optimization needs pre-calculated beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computation speed. However, inaccurate beamlet dose distributions, particularly in cases with high levels of inhomogeneity, may mislead the optimization and hinder the resulting plan quality. It is desirable to use Monte Carlo (MC) methods for beamlet dose calculations, yet the long computational time required for repeated dose calculations over a large number of beamlets prevents this application. Our objective is to integrate a GPU-based MC dose engine in lung IMRT optimization using a novel two-step workflow. Methods: A GPU-based MC code, gDPM, is used. Each particle is tagged with the index of the beamlet its source particle comes from, and deposited dose is stored separately per beamlet based on this index. Due to limited GPU memory size, a pyramid-shaped space is allocated for each beamlet, and dose outside the space is neglected. A two-step optimization workflow is proposed for fast MC-based optimization. In the first step, a rough dose calculation is conducted with only a small number of particles per beamlet, and plan optimization follows to obtain an approximate fluence map. In the second step, more accurate beamlet doses are calculated, where the number of particles sampled for a beamlet is proportional to the intensity determined previously. A second round of optimization is conducted, yielding the final result. Results: For a lung case with 5317 beamlets, 10^5 particles per beamlet in the first round and 10^8 particles per beam in the second round are enough to get a good plan quality. The total simulation time is 96.4 sec. Conclusion: A fast GPU-based MC dose calculation method along with a novel two-step optimization workflow were developed. The high efficiency allows the use of MC for IMRT optimizations.
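    The second-round sampling rule described in this record (histories per beamlet proportional to the previously optimized intensity) can be sketched as follows. A minimal illustration only: the function name, the per-beamlet floor, and the rounding choice are assumptions, not details from the cited work:

```python
import numpy as np

def second_round_histories(intensities, total_histories, min_per_beamlet=1000):
    """Allocate MC histories to beamlets proportional to first-round intensities.

    A small floor keeps low-weight beamlets from getting so few histories
    that their dose estimate is dominated by statistical noise.
    """
    w = np.asarray(intensities, dtype=float)
    w = w / w.sum()                                   # normalize weights
    n = np.round(w * total_histories).astype(int)     # proportional allocation
    return np.maximum(n, min_per_beamlet)
```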

  2. Dosimetric evaluation of a commercial proton spot scanning Monte-Carlo dose algorithm: comparisons against measurements and simulations.

    Science.gov (United States)

    Saini, Jatinder; Maes, Dominic; Egan, Alexander; Bowen, Stephen R; St James, Sara; Janson, Martin; Wong, Tony; Bloch, Charles

    2017-09-12

    RaySearch Americas Inc. (NY) has introduced a commercial Monte Carlo dose algorithm (RS-MC) for routine clinical use in proton spot scanning. In this report, we provide a validation of this algorithm against phantom measurements and simulations in the GATE software package. We also compared the performance of the RayStation analytical algorithm (RS-PBA) against the RS-MC algorithm. A beam model (G-MC) for a spot scanning gantry at our proton center was implemented in the GATE software package. The model was validated against measurements in a water phantom and was used for benchmarking the RS-MC. Validation of the RS-MC was performed in a water phantom by measuring depth doses and profiles for three spread-out Bragg peak (SOBP) beams with normal incidence, an SOBP with oblique incidence, and an SOBP with a range shifter and large air gap. The RS-MC was also validated against measurements and simulations in heterogeneous phantoms created by placing lung or bone slabs in a water phantom. Lateral dose profiles near the distal end of the beam were measured with a microDiamond detector and compared to the G-MC simulations, RS-MC and RS-PBA. Finally, the RS-MC and RS-PBA were validated against measured dose distributions in an Alderson-Rando (AR) phantom. Measurements were made using Gafchromic film in the AR phantom and compared to doses using the RS-PBA and RS-MC algorithms. For SOBP depth doses in a water phantom, all three algorithms matched the measurements to within  ±3% at all points and a range within 1 mm. The RS-PBA algorithm showed up to a 10% difference in dose at the entrance for the beam with a range shifter and  >30 cm air gap, while the RS-MC and G-MC were always within 3% of the measurement. For an oblique beam incident at 45°, the RS-PBA algorithm showed up to 6% local dose differences and broadening of distal fall-off by 5 mm. Both the RS-MC and G-MC accurately predicted the depth dose to within  ±3% and distal fall-off to within 2

  3. Dosimetric evaluation of a commercial proton spot scanning Monte-Carlo dose algorithm: comparisons against measurements and simulations

    Science.gov (United States)

    Saini, Jatinder; Maes, Dominic; Egan, Alexander; Bowen, Stephen R.; St. James, Sara; Janson, Martin; Wong, Tony; Bloch, Charles

    2017-10-01

    RaySearch Americas Inc. (NY) has introduced a commercial Monte Carlo dose algorithm (RS-MC) for routine clinical use in proton spot scanning. In this report, we provide a validation of this algorithm against phantom measurements and simulations in the GATE software package. We also compared the performance of the RayStation analytical algorithm (RS-PBA) against the RS-MC algorithm. A beam model (G-MC) for a spot scanning gantry at our proton center was implemented in the GATE software package. The model was validated against measurements in a water phantom and was used for benchmarking the RS-MC. Validation of the RS-MC was performed in a water phantom by measuring depth doses and profiles for three spread-out Bragg peak (SOBP) beams with normal incidence, an SOBP with oblique incidence, and an SOBP with a range shifter and large air gap. The RS-MC was also validated against measurements and simulations in heterogeneous phantoms created by placing lung or bone slabs in a water phantom. Lateral dose profiles near the distal end of the beam were measured with a microDiamond detector and compared to the G-MC simulations, RS-MC and RS-PBA. Finally, the RS-MC and RS-PBA were validated against measured dose distributions in an Alderson-Rando (AR) phantom. Measurements were made using Gafchromic film in the AR phantom and compared to doses using the RS-PBA and RS-MC algorithms. For SOBP depth doses in a water phantom, all three algorithms matched the measurements to within  ±3% at all points and a range within 1 mm. The RS-PBA algorithm showed up to a 10% difference in dose at the entrance for the beam with a range shifter and  >30 cm air gap, while the RS-MC and G-MC were always within 3% of the measurement. For an oblique beam incident at 45°, the RS-PBA algorithm showed up to 6% local dose differences and broadening of distal fall-off by 5 mm. Both the RS-MC and G-MC accurately predicted the depth dose to within  ±3% and distal fall-off to within 2

  4. GPU based Monte Carlo for PET image reconstruction: parameter optimization

    International Nuclear Information System (INIS)

    Cserkaszky, Á; Légrády, D.; Wirth, A.; Bükki, T.; Patay, G.

    2011-01-01

    This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method, all the physical effects in a PET system are taken into account, and thus superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphical Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required in the iterative reconstruction algorithm, so, to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data; this allows achieving the best image quality with the least possible computational effort. Based on this research, recommendations can be given for effective partitioning of computational effort into the iterations in limited-time reconstructions. (author)

  5. Proton Therapy for Skull Base Chordomas: An Outcome Study from the University of Florida Proton Therapy Institute

    OpenAIRE

    Deraniyagala, Rohan L.; Yeung, Daniel; Mendenhall, William M.; Li, Zuofeng; Morris, Christopher G.; Mendenhall, Nancy P.; Okunieff, Paul; Malyapa, Robert S.

    2013-01-01

    Objectives Skull base chordoma is a rare, locally aggressive tumor located adjacent to critical structures. Gross total resection is difficult to achieve, and proton therapy has the conformal advantage of delivering a high postoperative dose to the tumor bed. We present our experience using proton therapy to treat 33 patients with skull base chordomas.

  6. Cell death following BNCT: A theoretical approach based on Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ballarini, F., E-mail: francesca.ballarini@pv.infn.it [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Bakeine, J. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy); Bortolussi, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Bruschi, P. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy); Cansolino, L.; Clerici, A.M.; Ferrari, C. [University of Pavia, Department of Surgery, Experimental Surgery Laboratory, Pavia (Italy); Protti, N.; Stella, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy); Zonta, A.; Zonta, C. [University of Pavia, Department of Surgery, Experimental Surgery Laboratory, Pavia (Italy); Altieri, S. [University of Pavia, Department of Nuclear and Theoretical Physics, via Bassi 6, Pavia (Italy)] [INFN (National Institute of Nuclear Physics)-Sezione di Pavia, via Bassi 6, Pavia (Italy)

    2011-12-15

    In parallel to boron measurements and animal studies, investigations of radiation-induced cell death are also in progress in Pavia, with the aim of better characterising the effects of a BNCT treatment down to the cellular level. These studies are being carried out not only experimentally but also theoretically, based on a mechanistic model and a Monte Carlo code. The model assumes that: (1) only clustered DNA strand breaks can lead to chromosome aberrations; (2) only chromosome fragments within a certain threshold distance can undergo misrejoining; (3) the so-called 'lethal aberrations' (dicentrics, rings and large deletions) lead to cell death. After applying the model to normal cells exposed to monochromatic fields of different radiation types, the irradiation section of the code was purposely extended to mimic cell exposure to a mixed radiation field produced by the ¹⁰B(n,α)⁷Li reaction, which gives rise to alpha particles and Li ions of short range and high biological effectiveness, and by the ¹⁴N(n,p)¹⁴C reaction, which produces 0.58 MeV protons. Very good agreement between model predictions and literature data was found for human and animal cells exposed to X- or gamma-rays, protons and alpha particles, which validated the model for cell death induced by monochromatic radiation fields. The model predictions also agreed well with experimental data obtained by our group exposing DHD cells to thermal neutrons in the TRIGA Mark II reactor of the University of Pavia; this validated the model for a BNCT exposure scenario as well, providing a useful predictive tool to bridge the gap between irradiation and cell death.

  7. Anhydrous proton conductor based on composites of PEO and ATMP

    Energy Technology Data Exchange (ETDEWEB)

    Sun Baoying [Key Lab of Organic Optoelectronics and Molecular Engineering, Department of Chemistry, Tsinghua University, Beijing 100084 (China); Qiu Xinping, E-mail: qiuxp@tsinghua.edu.c [Key Lab of Organic Optoelectronics and Molecular Engineering, Department of Chemistry, Tsinghua University, Beijing 100084 (China); Zhu Wentao [Key Lab of Organic Optoelectronics and Molecular Engineering, Department of Chemistry, Tsinghua University, Beijing 100084 (China)

    2011-04-15

    A new type of anhydrous PEM material based on a poly(ethylene oxide) (PEO)/amino trimethylene phosphonic acid (ATMP) composite was prepared. In this study, PEO is assumed to 'grab' protons via hydrogen bonds between PEO and ATMP. On this basis, PEO/ATMP composites were first prepared as a preliminary study to verify this proton-conducting system. PVDF was then added to enhance the membrane's stability. The PVDF/PEO/ATMP composite membranes were thermally stable up to 200 °C in the studied composition ranges. SEM images showed that the membranes had a relatively compact structure. The proton conductivity of 59% PVDF/29% PEO/12% ATMP reached 6.71 × 10⁻³ S cm⁻¹ at 86 °C after doping with 7.9 wt% phosphoric acid, without extra humidification.

  8. Anhydrous proton conductor based on composites of PEO and ATMP

    International Nuclear Information System (INIS)

    Sun Baoying; Qiu Xinping; Zhu Wentao

    2011-01-01

    A new type of anhydrous PEM material based on a poly(ethylene oxide) (PEO)/amino trimethylene phosphonic acid (ATMP) composite was prepared. In this study, PEO is assumed to 'grab' protons via hydrogen bonds between PEO and ATMP. On this basis, PEO/ATMP composites were first prepared as a preliminary study to verify this proton-conducting system. PVDF was then added to enhance the membrane's stability. The PVDF/PEO/ATMP composite membranes were thermally stable up to 200 °C in the studied composition ranges. SEM images showed that the membranes had a relatively compact structure. The proton conductivity of 59% PVDF/29% PEO/12% ATMP reached 6.71 × 10⁻³ S cm⁻¹ at 86 °C after doping with 7.9 wt% phosphoric acid, without extra humidification.

  9. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. Evaluating the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps avoid unnecessary radiation and decrease harm to the human body. To accurately evaluate the dose to the human body, it is necessary to construct more realistic computational phantoms. However, manual description and verification of models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can automatically convert CT/segmented sectioned images into computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female called Rad-HUMAN was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in a treatment planning system (TPS), as well as radiation exposure of the human body in radiation protection. (authors)

  10. Time-resolved diode dosimetry calibration through Monte Carlo modeling for in vivo passive scattered proton therapy range verification.

    Science.gov (United States)

    Toltz, Allison; Hoesl, Michaela; Schuemann, Jan; Seuntjens, Jan; Lu, Hsiao-Ming; Paganetti, Harald

    2017-11-01

    Our group previously introduced an in vivo proton range verification methodology in which a silicon diode array system is used to correlate the dose rate profile per range modulation wheel cycle of the detector signal to the water-equivalent path length (WEPL) for passively scattered proton beam delivery. The implementation of this system requires a set of calibration data to establish a beam-specific response to WEPL fit for the selected 'scout' beam (a 1 cm overshoot of the predicted detector depth with a dose of 4 cGy) in water-equivalent plastic. This necessitates a separate set of measurements for every 'scout' beam that may be appropriate to the clinical case. The current study demonstrates the use of Monte Carlo simulations for calibration of the time-resolved diode dosimetry technique. Measurements for three 'scout' beams were compared against simulated detector response with Monte Carlo methods using the Tool for Particle Simulation (TOPAS). The 'scout' beams were then applied in the simulation environment to simulated water-equivalent plastic, a CT of water-equivalent plastic, and a patient CT data set to assess uncertainty. Simulated detector response in water-equivalent plastic was validated against measurements for 'scout' spread out Bragg peaks of range 10 cm, 15 cm, and 21 cm (168 MeV, 177 MeV, and 210 MeV) to within 3.4 mm for all beams, and to within 1 mm in the region where the detector is expected to lie. Feasibility has been shown for performing the calibration of the detector response for three 'scout' beams through simulation for the time-resolved diode dosimetry technique in passive scattered proton delivery. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
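The calibration step in this record (mapping a detector-signal feature to water-equivalent path length) amounts to inverting a monotone response curve, whether that curve comes from measurement or, as here, from TOPAS simulation. A minimal sketch with made-up calibration points:

```python
import numpy as np

# hypothetical calibration table: normalized detector-signal feature vs WEPL
wepl_mm = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
signal  = np.array([0.95, 0.80, 0.55, 0.30, 0.10])   # toy monotone response

def wepl_from_signal(s):
    """Invert the calibration curve by linear interpolation
    (np.interp requires the x-grid in ascending order, hence the reversal)."""
    return float(np.interp(s, signal[::-1], wepl_mm[::-1]))

print(wepl_from_signal(0.55))   # → 100.0
```

Simulating this table per 'scout' beam, as the paper proposes, replaces a separate set of physical calibration measurements for each clinically relevant beam.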

  11. Monte Carlo study of radial energy deposition from primary and secondary particles for narrow and large proton beamlet source models

    International Nuclear Information System (INIS)

    Peeler, Christopher R; Titt, Uwe

    2012-01-01

    In spot-scanning intensity-modulated proton therapy, numerous unmodulated proton beam spots are delivered over a target volume to produce a prescribed dose distribution. To accurately model field size-dependent output factors for beam spots, the energy deposition at positions radial to the central axis of the beam must be characterized. In this study, we determined the difference in the central axis dose for spot-scanned fields that results from secondary particle doses by investigating energy deposition radial to the proton beam central axis resulting from primary protons and secondary particles for mathematical point source and distributed source models. The largest difference in the central axis dose from secondary particles resulting from the use of a mathematical point source and a distributed source model was approximately 0.43%. Thus, we conclude that the central axis dose for a spot-scanned field is effectively independent of the source model used to calculate the secondary particle dose. (paper)

  12. Proton-beam writing channel based on an electrostatic accelerator

    Science.gov (United States)

    Lapin, A. S.; Rebrov, V. A.; Kolin'ko, S. V.; Salivon, V. F.; Ponomarev, A. G.

    2016-09-01

    We have described the structure of a proton-beam writing channel built as an extension of a nuclear scanning microprobe channel. The problem of probe-positioning accuracy has been solved by constructing a new high-frequency electrostatic scanning system. Special attention has been paid to the design of the probe-forming system, and its various configurations have been considered. The probe-forming system that best matches the conditions of the lithographic process has been found by solving the optimization problem of proton beam formation. A system for controlling beam scanning, based on a multifunctional programmable-logic module, has been developed.

  13. Proton Conductivity and Operational Features Of PBI-Based Membranes

    DEFF Research Database (Denmark)

    Qingfeng, Li; Jensen, Jens Oluf; Precht Noyé, Pernille

    2005-01-01

    As an approach to high temperature operation of PEMFCs, acid-doped PBI membranes are under active development. The membrane exhibits high proton conductivity under low water contents at temperatures up to 200°C. Mechanisms of proton conduction for the membranes have been proposed. Based on the me...... on the membranes fuel cell tests have been demonstrated. Operating features of the PBI cell include no humidification, high CO tolerance, better heat utilization and possible integration with fuel processing units. Issues for further development are also discussed....

  14. Proton irradiation of liquid crystal based adaptive optical devices

    International Nuclear Information System (INIS)

    Buis, E.J.; Berkhout, G.C.G.; Love, G.D.; Kirby, A.K.; Taylor, J.M.; Hannemann, S.; Collon, M.J.

    2012-01-01

    To assess its radiation hardness, a liquid crystal based adaptive optical element has been irradiated using a 60 MeV proton beam. The device, which functions as an optical beam steerer, was characterised before, during and after the irradiation. A systematic set of measurements of the transmission and beam deflection angles was carried out. The measurements showed that the transmission decreased only marginally and that the optical performance degraded only after a very high proton fluence (10¹⁰ p/cm²). The device showed complete annealing of its beam-steering functionality, which leads to the conclusion that liquid crystal technology for optical devices is not vulnerable to proton irradiation at the levels expected in space.

  15. Proton irradiation of liquid crystal based adaptive optical devices

    Energy Technology Data Exchange (ETDEWEB)

    Buis, E.J., E-mail: ernst-jan.buis@tno.nl [cosine Science and Computing BV, Niels Bohrweg 11, 2333 CA Leiden (Netherlands); Berkhout, G.C.G. [cosine Science and Computing BV, Niels Bohrweg 11, 2333 CA Leiden (Netherlands); Huygens Laboratory, Leiden University, P.O. Box 9504, 2300 RA Leiden (Netherlands); Love, G.D.; Kirby, A.K.; Taylor, J.M. [Department of Physics, Durham University, South Road, Durham DH1 3LE (United Kingdom); Hannemann, S.; Collon, M.J. [cosine Research BV, Niels Bohrweg 11, 2333 CA Leiden (Netherlands)

    2012-01-01

    To assess its radiation hardness, a liquid crystal based adaptive optical element has been irradiated using a 60 MeV proton beam. The device, which functions as an optical beam steerer, was characterised before, during and after the irradiation. A systematic set of measurements of the transmission and beam deflection angles was carried out. The measurements showed that the transmission decreased only marginally and that the optical performance degraded only after a very high proton fluence (10¹⁰ p/cm²). The device showed complete annealing of its beam-steering functionality, which leads to the conclusion that liquid crystal technology for optical devices is not vulnerable to proton irradiation at the levels expected in space.

  16. Implementation and verification of nuclear interactions in a Monte-Carlo code for the Procom-ProGam proton therapy planning system

    International Nuclear Information System (INIS)

    Kostyuchenko, V.I.; Makarova, A.S.; Ryazantsev, O.B.; Samarin, S.I.; Uglov, A.S.

    2013-01-01

    Proton interaction with the material of an exposed object needs to be modeled with account taken of three basic processes: electromagnetic stopping of protons in matter, multiple Coulomb scattering, and nuclear interactions. The last of these processes is the topic of this paper. Monte Carlo codes are often used to simulate high-energy particle interactions with matter. However, the nuclear interaction models implemented in these codes are rather extensive, and their use in treatment planning systems requires huge computational resources. We selected the IThMC code for its ability to reproduce experiments that measure the distribution of the projected ranges of nuclear secondary particles generated by proton beams in a multi-layer Faraday cup. Multi-layer Faraday cup detectors measure charge rather than dose and allow electromagnetic and nuclear interactions to be distinguished. The event generator used in the IThMC code is faster, but less accurate, than any other used in testing. Our model of nuclear reactions demonstrates quite good agreement with experiment in terms of their effect on the Bragg peak in therapeutic applications.

  17. Clinical results of proton beam therapy for skull base chordoma

    International Nuclear Information System (INIS)

    Igaki, Hiroshi; Tokuuye, Koichi; Okumura, Toshiyuki; Sugahara, Shinji; Kagei, Kenji; Hata, Masaharu; Ohara, Kiyoshi; Hashimoto, Takayuki; Tsuboi, Koji; Takano, Shingo; Matsumura, Akira; Akine, Yasuyuki

    2004-01-01

    Purpose: To evaluate clinical results of proton beam therapy for patients with skull base chordoma. Methods and materials: Thirteen patients with skull base chordoma who were treated with proton beams with or without X-rays at the University of Tsukuba between 1989 and 2000 were retrospectively reviewed. A median total tumor dose of 72.0 Gy (range, 63.0-95.0 Gy) was delivered. The patients were followed for a median period of 69.3 months (range, 14.6-123.4 months). Results: The 5-year local control rate was 46.0%. Cause-specific, overall, and disease-free survival rates at 5 years were 72.2%, 66.7%, and 42.2%, respectively. The local control rate was higher, without statistical significance, for those with preoperative tumors <30 mL. Partial or subtotal tumor removal did not yield better local control rates than for patients who underwent biopsy only as the latest surgery. Conclusion: Proton beam therapy is effective for patients with skull base chordoma, especially for those with small tumors. For a patient with a tumor of <30 mL with no prior treatment, biopsy without tumor removal seems to be appropriate before proton beam therapy

  18. Monte Carlo Based Framework to Support HAZOP Study

    DEFF Research Database (Denmark)

    Danko, Matej; Frutiger, Jerome; Jelemenský, Ľudovít

    2017-01-01

    deviations in process parameters simultaneously, thereby bringing an improvement to the Hazard and Operability study (HAZOP), which normally considers only one at a time deviation in process parameters. Furthermore, Monte Carlo filtering was then used to identify operability and hazard issues including...
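Monte Carlo filtering of the kind this record describes (sampling simultaneous deviations in process parameters, then comparing the parameter distributions of hazardous and safe runs) can be sketched as follows; the parameter names, ranges and hazard criterion are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# sample deviations in two process parameters simultaneously
T = rng.uniform(300.0, 400.0, n)          # temperature [K] (toy range)
P = rng.uniform(1.0, 10.0, n)             # pressure [bar] (toy range)

# flag runs that violate a (toy) safe-operation criterion
hazard = (T > 380.0) & (P > 8.0)

# a large shift between the conditional means of a parameter in the
# "hazard" vs "safe" subsets marks it as a hazard-driving parameter
shift_T = T[hazard].mean() - T[~hazard].mean()
print(round(shift_T, 1))
```

Unlike a one-deviation-at-a-time HAZOP pass, the joint sampling also exposes hazards that only appear when two parameters deviate together.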

  19. Ceramic membrane fuel cells based on solid proton electrolytes

    Energy Technology Data Exchange (ETDEWEB)

    Meng, Guangyao; Ma, Qianli; Peng, Ranran; Liu, Xingqin [USTC Lab. for Solid State Chemistry and Inorganic Membranes, Department of Materials Science and Engineering, University of Science and Technology of China, Hefei 230026 (China); Ma, Guilin [School of Chemistry and Chemical Engineering, Suzhou University, Suzhou 215123 (China)

    2007-04-15

    The development of solid oxide fuel cells (SOFCs) has reached a new stage characterized by thin electrolytes on porous electrode supports. Almost all of the most important fabrication techniques developed involve inorganic membranes, so such cells can be called ceramic membrane fuel cells (CMFCs). CMFCs based on proton electrolytes (CMFC-H) may offer more advantages than CMFCs based on oxygen-ion electrolytes (CMFC-O) in many respects, such as energy efficiency and avoidance of carbon deposition. An ammonia-fuelled CMFC with a proton-conducting BaCe₀.₈Gd₀.₂O₂.₉ (BCGO) electrolyte (50 μm in thickness) is reported in this work; it showed open-circuit voltage (OCV) values close to theoretical ones and rather high power density. We have also found that the well-known superior oxide-ion conductor La₀.₉Sr₀.₁Ga₀.₈Mg₀.₂O₃₋α (LSGM) is a pure proton conductor in H₂ and a mixed proton and oxide-ion conductor in wet air, while it is a pure oxide-ion conductor in oxygen or dry air. To demonstrate the CMFC-H concept and obtain high-performance fuel cells, thin-membrane techniques such as chemical vapor deposition (CVD), particularly novel CVD techniques, should be given more attention because of their many advantages. (author)

  20. Monte Carlo-based simulation of dynamic jaws tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S. [Department of Molecular Imaging, Radiotherapy and Oncology, Universite Catholique de Louvain, 54 Avenue Hippocrate, 1200 Brussels, Belgium and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States); 21 Century Oncology., 1240 D' onofrio, Madison, Wisconsin 53719 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); Department of Radiotherapy and Oncology, Universite Catholique de Louvain, St-Luc University Hospital, 10 Avenue Hippocrate, 1200 Brussels (Belgium)

    2011-09-15

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For the RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is

  1. Monte Carlo-based simulation of dynamic jaws tomotherapy

    International Nuclear Information System (INIS)

    Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S.

    2011-01-01

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For the RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is greater than 30% of the prescription dose (gamma analysis

  2. Neutron production in spallation reactions of 0.9 and 1.5 GeV protons on a thick lead target. Comparison between experimental data and Monte-Carlo simulations

    International Nuclear Information System (INIS)

    Krasa, A.; Krizek, F.; Wagner, V.; Kugler, A.; Henzl, V.; Henzlova, D.; Majerle, M.; Adam, J.; Caloun, P.; Bradnova, V.; Chultem, D.; Kalinnikov, V.G.; Krivopustov, M.I.; Solnyshkin, A.A.; Stegajlov, V.I.; Tsupko-Sitnikov, V.M.; Tumehndehlgehr, Ts.; Vasil'ev, S.I.

    2005-01-01

    This paper reports on two experiments performed at the Synchrophasotron/Nuclotron accelerator complex at JINR. Relativistic protons with energies 885 MeV and 1.5 GeV hit a massive cylindrical lead target. The spatial and energetic distributions of the neutron field produced by the spallation reactions were measured by the activation of Al, Au, Bi, Co, and Cu foils placed on the surface of the target and close to it. The yields of the radioactive nuclei produced by threshold reactions in these foils were determined by the analyses of their γ spectra. The comparison with Monte-Carlo based simulations was performed both with the LAHET+MCNP code and the MCNPX code

  3. Neutron Production in Spallation Reactions of 0.9 and 1.5 GeV Protons on a Thick Lead Target. Comparison between Experimental Data and Monte-Carlo Simulations

    CERN Document Server

    Krasa, A; Bradnova, V; Caloun, P; Chultem, D; Henzl, V; Henzlová, D; Kalinnikov, V G; Krivopustov, M I; Krízek, F; Kugler, A; Majerle, M; Solnyshkin, A A; Stegailov, V I; Tsoupko-Sitnikov, V M; Tumendelger, T; Vasilev, S I; Wagner, V; Nuclear Physics Institute of Academy of Sciences of Czech Republic, Rez, Czech Republic

    2005-01-01

    This paper reports on two experiments performed at the Synchrophasotron/Nuclotron accelerator complex at JINR. Relativistic protons with energies 885 MeV and 1.5 GeV hit a massive cylindrical lead target. The spatial and energetic distributions of the neutron field produced by the spallation reactions were measured by the activation of Al, Au, Bi, Co, and Cu foils placed on the surface of the target and close to it. The yields of the radioactive nuclei produced by threshold reactions in these foils were determined by the analyses of their γ spectra. The comparison with Monte-Carlo based simulations was performed both with the LAHET+MCNP code and the MCNPX code.

  4. Range verification for eye proton therapy based on proton-induced x-ray emissions from implanted metal markers

    Science.gov (United States)

    La Rosa, Vanessa; Kacperek, Andrzej; Royle, Gary; Gibson, Adam

    2014-06-01

    Metal fiducial markers are often implanted on the back of the eye before proton therapy to improve target localization and reduce patient setup errors. We aim to detect characteristic x-ray emissions from metal targets during proton therapy to verify the treatment range accuracy. Initially gold was chosen for its biocompatibility. Proton-induced x-ray emissions (PIXE) from a 15 mm diameter gold marker were detected at different penetration depths of a 59 MeV proton beam at the CATANA proton facility at INFN-LNS (Italy). The Monte Carlo code Geant4 was used to reproduce the experiment and to investigate the effect of different marker sizes and materials, and the response to both mono-energetic and fully modulated beams. The intensity of the emitted x-rays decreases with decreasing proton energy and thus decreases with depth. If we take the range to be the depth at which the dose falls to 10% of its maximum value and define the residual range as the distance between the marker and the range of the beam, then the minimum residual range that can be detected at the 95% confidence level is the depth at which the PIXE peak equals 1.96 σbkg, the standard deviation of the background noise. With our system and experimental setup this value is 3 mm when 20 GyE are delivered to a gold marker of 15 mm diameter. Results from silver are more promising: even when a 5 mm diameter silver marker is placed at a depth equal to the range, the PIXE peak is 2.1 σbkg. Although these quantitative results depend on the experimental setup used in this research study, they demonstrate that real-time analysis of the PIXE emitted by fiducial metal markers can be used to derive the beam range. Further analyses are needed to demonstrate the feasibility of the technique in a clinical setup.
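The 95%-confidence detection criterion used in this record (PIXE peak above 1.96 standard deviations of the background) reduces to a one-line Poisson-statistics check; the counts below are illustrative, not the paper's data:

```python
import math

def peak_detectable(peak_counts, bkg_counts, z=1.96):
    """True if the net PIXE peak exceeds z standard deviations of the
    background; for Poisson counts, sigma ≈ sqrt(mean background counts)."""
    return peak_counts > z * math.sqrt(bkg_counts)

print(peak_detectable(100, 400))   # 100 > 1.96 * 20 = 39.2 → True
print(peak_detectable(30, 400))    # 30 < 39.2             → False
```

Because the x-ray yield drops with proton energy, the peak shrinks with depth, and the last depth at which this check still passes sets the paper's minimum detectable residual range.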

  5. Perturbation based Monte Carlo criticality search in density, enrichment and concentration

    International Nuclear Information System (INIS)

    Li, Zeguang; Wang, Kan; Deng, Jingkang

    2015-01-01

    Highlights: • A new perturbation-based Monte Carlo criticality search method is proposed. • The method can obtain accurate results with only one individual criticality run. • The method is used to solve density, enrichment and concentration search problems. • Results show the feasibility and good performance of this method. • The relationship between the results’ accuracy and the perturbation order is discussed. - Abstract: Criticality search is a very important aspect of reactor physics analysis. Due to the advantages of the Monte Carlo method and the development of computer technologies, Monte Carlo criticality search is becoming more necessary and feasible. Existing Monte Carlo criticality search methods need a large number of individual criticality runs and may give unstable results because of the uncertainties of criticality results. In this paper, a new perturbation-based Monte Carlo criticality search method is proposed and discussed. This method needs only one individual criticality calculation with perturbation tallies: it estimates keff as a function of the search parameter from the initial keff and the differential-coefficient results, and solves polynomial equations to obtain the criticality search result. The new perturbation-based Monte Carlo criticality search method is implemented in the Monte Carlo code RMC, and criticality search problems in density, enrichment and concentration are carried out. Results show that this method is quite promising in accuracy and efficiency, and has advantages compared with other criticality search methods.
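The solve step this record describes (Taylor-expand keff in the perturbed parameter from perturbation-tally derivatives, then solve the polynomial keff = 1) can be sketched with invented coefficients; the numbers below are not from the paper:

```python
import numpy as np

# hypothetical perturbation-tally results at the nominal state x0:
k0  = 1.02      # keff at x0
dk  = -0.015    # d(keff)/dx, e.g. per unit absorber concentration
d2k = 0.0004    # d²(keff)/dx²

# keff(x0 + Δx) ≈ k0 + dk·Δx + ½·d2k·Δx² ; solve keff = 1 for Δx
roots = np.roots([0.5 * d2k, dk, k0 - 1.0])
dx = min((r.real for r in roots if abs(r.imag) < 1e-12), key=abs)
print(round(dx, 3))   # → 1.358
```

Taking the real root closest to the nominal state keeps the answer inside the region where the truncated Taylor expansion (and hence the perturbation tallies) is trustworthy, which is also why the paper discusses accuracy versus perturbation order.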

  6. Mesh-based weight window approach for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Liu, L.; Gardner, R.P.

    1997-01-01

    The Monte Carlo method has been increasingly used to solve particle transport problems. Statistical fluctuation from random sampling is the major limiting factor of its application. To obtain the desired precision, variance reduction techniques are indispensable for most practical problems. Among various variance reduction techniques, the weight window method proves to be one of the most general, powerful, and robust. The method is implemented in the current MCNP code. An importance map is estimated during a regular Monte Carlo run, and then the map is used in the subsequent run for splitting and Russian roulette games. The major drawback of this weight window method is lack of user-friendliness. It normally requires that users divide the large geometric cells into smaller ones by introducing additional surfaces to ensure an acceptable spatial resolution of the importance map. In this paper, we present a new weight window approach to overcome this drawback
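    The splitting and Russian roulette games driven by such an importance map can be sketched as below. This is a generic illustration of the weight-window technique, not MCNP's implementation; the window bounds, survival weight and split cap are arbitrary.

```python
import random

# Weight-window game applied to a single particle history. If the weight is
# above the window it is split (preserving total weight); below the window it
# plays Russian roulette; inside the window it is left alone.

def apply_weight_window(weight, w_low, w_up, w_s, rng=random.random):
    """Return the list of particle weights after the weight-window game."""
    if weight > w_up:                        # split into n particles
        n = min(int(weight / w_up) + 1, 10)  # arbitrary cap on splitting
        return [weight / n] * n
    if weight < w_low:                       # Russian roulette
        if rng() < weight / w_s:
            return [w_s]                     # survives with increased weight
        return []                            # killed
    return [weight]                          # inside the window: unchanged

print(apply_weight_window(2.5, 0.5, 1.0, 0.75))  # splits into 3 equal weights
```

    Splitting keeps the total weight unbiased (three particles of weight 2.5/3 here), while roulette trades many low-weight histories for a few at the survival weight.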

  7. Monte Carlo based radial shield design of typical PWR reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gul, Anas; Khan, Rustam; Qureshi, M. Ayub; Azeem, Muhammad Waqar; Raza, S.A. [Pakistan Institute of Engineering and Applied Sciences, Islamabad (Pakistan). Dept. of Nuclear Engineering; Stummer, Thomas [Technische Univ. Wien (Austria). Atominst.

    2017-04-15

    This paper presents the radiation shielding model of a typical PWR (CNPP-II) at Chashma, Pakistan. The model was developed using the Monte Carlo N-Particle (MCNP) code [2], equipped with ENDF/B-VI continuous-energy cross section libraries. It was applied to calculate the neutron and gamma fluxes and dose rates in the radial direction at the core mid-plane. The simulated results were compared with the reference results of the Shanghai Nuclear Engineering Research and Design Institute (SNERDI).

  8. Systematic uncertainties on Monte Carlo simulation of lead based ADS

    International Nuclear Information System (INIS)

    Embid, M.; Fernandez, R.; Garcia-Sanz, J.M.; Gonzalez, E.

    1999-01-01

    Computer simulations of the neutronic behaviour of ADS systems foreseen for actinide and fission product transmutation are affected by many sources of systematic uncertainty, arising both from the nuclear data and from the methodology chosen when applying the codes. Several actual ADS Monte Carlo simulations are presented, comparing different options for both the data and the methodology and evaluating the relevance of the different uncertainties. (author)

  9. SELF-ABSORPTION CORRECTIONS BASED ON MONTE CARLO SIMULATIONS

    Directory of Open Access Journals (Sweden)

    Kamila Johnová

    2016-12-01

    The main aim of this article is to demonstrate how Monte Carlo simulations are implemented in our gamma spectrometry laboratory at the Department of Dosimetry and Application of Ionizing Radiation in order to calculate the self-absorption within samples. A model of a real HPGe detector created for MCNP simulations is presented in this paper. All parameters that may influence the self-absorption are first discussed theoretically and then described using the calculated results.

  10. TH-A-19A-08: Intel Xeon Phi Implementation of a Fast Multi-Purpose Monte Carlo Simulation for Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Souris, K; Lee, J; Sterpin, E [Universite catholique de Louvain, Brussels (Belgium)

    2014-06-15

    Purpose: Recent studies have demonstrated the capability of graphics processing units (GPUs) to compute dose distributions using Monte Carlo (MC) methods within clinical time constraints. However, GPUs have a rigid vectorial architecture that favors the implementation of simplified particle transport algorithms, adapted to specific tasks. Our new, fast, and multipurpose MC code, named MCsquare, runs on Intel Xeon Phi coprocessors. This technology offers 60 independent cores, and therefore more flexibility to implement fast and yet generic MC functionalities, such as prompt gamma simulations. Methods: MCsquare implements several models and hence allows users to make their own tradeoff between speed and accuracy. A 200 MeV proton beam is simulated in a heterogeneous phantom using Geant4 and two configurations of MCsquare. The first is the most conservative and accurate: the method of fictitious interactions handles the interfaces, and secondary charged particles emitted in nuclear interactions are fully simulated. The second, faster configuration simplifies interface crossings and simulates only secondary protons after nuclear interaction events. Integral depth-dose and transversal profiles are compared to those of Geant4. Moreover, the production profile of prompt gammas is compared to PENH results. Results: Integral depth-dose and transversal profiles computed by MCsquare and Geant4 agree within 3%. The production of secondaries from nuclear interactions is slightly inaccurate at interfaces for the fastest configuration of MCsquare, but this is unlikely to have any clinical impact. The computation time ranges from 90 seconds with the most conservative settings down to 59 seconds in the fastest configuration. Finally, prompt gamma profiles are also in very good agreement with PENH results.
Conclusion: Our new, fast, and multi-purpose Monte Carlo code simulates prompt gammas and calculates dose distributions in less than a minute, which complies with clinical time constraints.

  11. TH-A-19A-08: Intel Xeon Phi Implementation of a Fast Multi-Purpose Monte Carlo Simulation for Proton Therapy

    International Nuclear Information System (INIS)

    Souris, K; Lee, J; Sterpin, E

    2014-01-01

    Purpose: Recent studies have demonstrated the capability of graphics processing units (GPUs) to compute dose distributions using Monte Carlo (MC) methods within clinical time constraints. However, GPUs have a rigid vectorial architecture that favors the implementation of simplified particle transport algorithms, adapted to specific tasks. Our new, fast, and multipurpose MC code, named MCsquare, runs on Intel Xeon Phi coprocessors. This technology offers 60 independent cores, and therefore more flexibility to implement fast and yet generic MC functionalities, such as prompt gamma simulations. Methods: MCsquare implements several models and hence allows users to make their own tradeoff between speed and accuracy. A 200 MeV proton beam is simulated in a heterogeneous phantom using Geant4 and two configurations of MCsquare. The first is the most conservative and accurate: the method of fictitious interactions handles the interfaces, and secondary charged particles emitted in nuclear interactions are fully simulated. The second, faster configuration simplifies interface crossings and simulates only secondary protons after nuclear interaction events. Integral depth-dose and transversal profiles are compared to those of Geant4. Moreover, the production profile of prompt gammas is compared to PENH results. Results: Integral depth-dose and transversal profiles computed by MCsquare and Geant4 agree within 3%. The production of secondaries from nuclear interactions is slightly inaccurate at interfaces for the fastest configuration of MCsquare, but this is unlikely to have any clinical impact. The computation time ranges from 90 seconds with the most conservative settings down to 59 seconds in the fastest configuration. Finally, prompt gamma profiles are also in very good agreement with PENH results.
Conclusion: Our new, fast, and multi-purpose Monte Carlo code simulates prompt gammas and calculates dose distributions in less than a minute, which complies with clinical time constraints.

  12. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    International Nuclear Information System (INIS)

    Schuemann, J; Grassberger, C; Paganetti, H; Dowdell, S

    2014-01-01

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), average range difference (ARD) and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80-R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations, considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain). However, we recommend

  13. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Schuemann, J; Grassberger, C; Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Dowdell, S [Illawarra Shoalhaven Local Health District, Wollongong (Australia)

    2014-06-15

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), average range difference (ARD) and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80-R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations, considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain). However, we recommend
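    The range metrics used in this record (R90, R50, and the R80/R20 distal falloff) can be extracted from a depth-dose curve by interpolating on the distal side of the Bragg peak. The sketch below uses a crude made-up depth-dose array, not TOPAS or planning-system output.

```python
import numpy as np

def distal_depth(depth, dose, level):
    """Depth on the distal side of the peak where dose drops to level*max."""
    dose = np.asarray(dose, dtype=float) / np.max(dose)
    z = np.asarray(depth, dtype=float)
    i_peak = int(np.argmax(dose))
    d, z = dose[i_peak:], z[i_peak:]            # distal side only
    j = int(np.argmax(d <= level))              # first distal sample below level
    # linear interpolation between the two bracketing samples
    return z[j - 1] + (d[j - 1] - level) * (z[j] - z[j - 1]) / (d[j - 1] - d[j])

depth = [0, 1, 2, 3, 4, 5, 6]                   # cm, made up
dose = [30, 35, 40, 100, 60, 20, 5]             # crude Bragg-like curve
r90 = distal_depth(depth, dose, 0.90)
r80 = distal_depth(depth, dose, 0.80)
r20 = distal_depth(depth, dose, 0.20)
print(r90, r20 - r80)                           # distal falloff = R20 - R80
```

    Comparing R90 between two algorithms on the same curve pair gives the range difference analyzed in the abstract; R20 − R80 quantifies the distal dose degradation.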

  14. 'Hard' effects in Monte Carlo proton-(anti) proton events of soft two-string dual parton model, e+e- annihilation and cascade scaling break of string and the theory of the open string

    International Nuclear Information System (INIS)

    Lugovoj, V.V.

    1998-01-01

    In proton-(anti)proton scattering within the two-string Dual Parton Model, semihard parton-parton interactions can excite valence (anti)(di)quarks, producing up to four fast leading hadrons, while soft colour interactions between the constituents form two primary strings. These strings decay into secondary hadrons according to a new cascade model of string breaking, which corresponds to the fundamental interaction of the theory of the open string. Recent results from the theory of the QCD open string (small deviations of the string stretch direction from the longitudinal direction) are therefore used in the string-breaking algorithm. For the fitted values of the free parameters, in the decay of a mother string into two daughter strings the energy (momentum) distributions of the first and second daughter strings resemble the momentum distributions of the valence quark and antiquark in a meson. This Monte Carlo model, with 9 free parameters, agrees well with the multiplicity, pseudorapidity and transverse-momentum (up to p_T = 4 GeV) distributions, and with the correlations between average transverse momentum and multiplicity of secondary particles, measured in the ISR, SS and Tevatron experiments (√s = 27 to 1800 GeV). It provides a quantitative (and qualitative) explanation of the correlations between average transverse momentum and multiplicity for different types of secondary particles (antiprotons, kaons, pions) at √s = 1800 GeV. The cascade model of string breaking is a new Monte Carlo hadronization model that agrees well with the experimental multiplicity, rapidity and transverse-momentum distributions of secondary particles produced in e+e- annihilation at E_c.m. = 3 GeV. (author)

  15. Sci—Fri PM: Topics — 07: Monte Carlo Simulation of Primary Dose and PET Isotope Production for the TRIUMF Proton Therapy Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lindsay, C; Jirasek, A [University of Victoria (Canada)]; Blackmore, E; Hoehr, C; Schaffer, P; Trinczek, M [TRIUMF (Canada)]; Sossi, V [University of British Columbia (Canada)]

    2014-08-15

    Uveal melanoma is a rare and deadly tumour of the eye, with primary metastases in the liver resulting in an 8% 2-year survival rate upon detection. Large growths, or those in close proximity to the optic nerve, pose a particular challenge to the commonly employed eye-sparing technique of eye-plaque brachytherapy. In these cases, external beam charged particle therapy offers improved odds of avoiding catastrophic side effects such as neuropathy or blindness. Since 1995, the British Columbia Cancer Agency, in partnership with the TRIUMF national laboratory, has offered proton therapy in the treatment of difficult ocular tumors. Having seen 175 patients, yielding 80% globe preservation and 82% metastasis-free survival as of 2010, this modality has proven highly effective. Despite this success, there have been few studies into the use of the world's largest cyclotron in patient care. Here we describe first efforts at modeling the TRIUMF dose delivery system using the FLUKA Monte Carlo package. Details on geometry, estimation of beam parameters, measurement of primary dose and simulation of PET isotope production are discussed. Proton depth dose in both modulated and pristine beams is successfully simulated to sub-millimeter precision in range (within limits of measurement) and to 2% agreement with measurement within a treatment volume. With the goal of using PET signals for in vivo dosimetry (alignment), a first look at the PET isotope depth distribution is presented, comparing favourably to a naive method of approximating simulated PET slice activity in a Lucite phantom.

  16. Skin fluorescence model based on the Monte Carlo technique

    Science.gov (United States)

    Churmakov, Dmitry Y.; Meglinski, Igor V.; Piletsky, Sergey A.; Greenhalgh, Douglas A.

    2003-10-01

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores, which follows the packing of collagen fibers, whereas in the epidermis and stratum corneum the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the NIR spectral region, while the fluorescence of a sensor layer embedded in the epidermis is localized at the adjusted depth. The model is also able to simulate the skin fluorescence spectra.

  17. Machine learning-based patient specific prompt-gamma dose monitoring in proton therapy

    Science.gov (United States)

    Gueth, P.; Dauvergne, D.; Freud, N.; Létang, J. M.; Ray, C.; Testa, E.; Sarrut, D.

    2013-07-01

    Online dose monitoring in proton therapy is currently being investigated with prompt-gamma (PG) devices. PG emission was shown to be correlated with dose deposition. This relationship is mostly unknown under real conditions. We propose a machine learning approach based on simulations to create optimized treatment-specific classifiers that detect discrepancies between planned and delivered dose. Simulations were performed with the Monte-Carlo platform Gate/Geant4 for a spot-scanning proton therapy treatment and a PG camera prototype currently under investigation. The method first builds a learning set of perturbed situations corresponding to a range of patient translation. This set is then used to train a combined classifier using distal falloff and registered correlation measures. Classifier performances were evaluated using receiver operating characteristic curves and maximum associated specificity and sensitivity. A leave-one-out study showed that it is possible to detect discrepancies of 5 mm with specificity and sensitivity of 85% whereas using only distal falloff decreases the sensitivity down to 77% on the same data set. The proposed method could help to evaluate performance and to optimize the design of PG monitoring devices. It is generic: other learning sets of deviations, other measures and other types of classifiers could be studied to potentially reach better performance. At the moment, the main limitation lies in the computation time needed to perform the simulations.

  18. Machine learning-based patient specific prompt-gamma dose monitoring in proton therapy

    International Nuclear Information System (INIS)

    Gueth, P; Freud, N; Létang, J M; Sarrut, D; Dauvergne, D; Ray, C; Testa, E

    2013-01-01

    Online dose monitoring in proton therapy is currently being investigated with prompt-gamma (PG) devices. PG emission was shown to be correlated with dose deposition. This relationship is mostly unknown under real conditions. We propose a machine learning approach based on simulations to create optimized treatment-specific classifiers that detect discrepancies between planned and delivered dose. Simulations were performed with the Monte-Carlo platform Gate/Geant4 for a spot-scanning proton therapy treatment and a PG camera prototype currently under investigation. The method first builds a learning set of perturbed situations corresponding to a range of patient translation. This set is then used to train a combined classifier using distal falloff and registered correlation measures. Classifier performances were evaluated using receiver operating characteristic curves and maximum associated specificity and sensitivity. A leave-one-out study showed that it is possible to detect discrepancies of 5 mm with specificity and sensitivity of 85% whereas using only distal falloff decreases the sensitivity down to 77% on the same data set. The proposed method could help to evaluate performance and to optimize the design of PG monitoring devices. It is generic: other learning sets of deviations, other measures and other types of classifiers could be studied to potentially reach better performance. At the moment, the main limitation lies in the computation time needed to perform the simulations. (paper)
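    For a single scalar discrepancy measure, the sensitivity/specificity evaluation described above reduces to thresholding scores and counting outcomes. The following sketch uses synthetic scores and labels; it illustrates the metrics, not the Gate/Geant4 study or its combined classifier.

```python
import numpy as np

# Toy evaluation of a threshold classifier: a scalar discrepancy score
# (e.g. a distal-falloff shift) is thresholded to flag delivery errors, and
# scored by sensitivity and specificity. Data below are synthetic.

def sensitivity_specificity(scores, labels, threshold):
    """labels: 1 = true discrepancy, 0 = nominal delivery."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    pred = scores >= threshold
    tp = np.sum(pred & (labels == 1)); fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0)); fp = np.sum(pred & (labels == 0))
    return tp / (tp + fn), tn / (tn + fp)

scores = [0.1, 0.2, 0.5, 0.3, 0.8, 0.9, 0.7, 0.05]
labels = [0,   0,   0,   1,   1,   1,   1,   0]
sens, spec = sensitivity_specificity(scores, labels, 0.4)
print(sens, spec)
```

    Sweeping the threshold and recording (sensitivity, 1 − specificity) pairs traces the receiver operating characteristic curve used in the study.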

  19. Construction of boundary-surface-based Chinese female astronaut computational phantom and proton dose estimation

    International Nuclear Information System (INIS)

    Sun Wenjuan; Xie Tianwu; Liu Qian; Jia Xianghong; Xu Feng

    2013-01-01

    With the rapid development of China's space industry, the importance of radiation protection is increasingly prominent. To provide relevant dose data, we first developed the Visible Chinese Human adult Female (VCH-F) phantom, and performed further modifications to generate the VCH-F Astronaut (VCH-FA) phantom, incorporating statistical body characteristics data from the first batch of Chinese female astronauts as well as reference organ mass data from the International Commission on Radiological Protection (ICRP; both within 1% relative error). Based on cryosection images, the original phantom was constructed via Non-Uniform Rational B-Spline (NURBS) boundary surfaces to strengthen the deformability for fitting the body parameters of Chinese female astronauts. The VCH-FA phantom was voxelized at a resolution of 2 × 2 × 4 mm³ for radioactive particle transport simulations of isotropic protons with energies of 5000-10 000 MeV in the Monte Carlo N-Particle eXtended (MCNPX) code. To investigate discrepancies caused by anatomical variations and other factors, the obtained doses were compared with corresponding values from other phantoms and with sex-averaged doses. Dose differences were observed among phantom calculation results, especially for effective dose with low-energy protons. Local skin thickness shifts the breast dose curve toward high energy, but has little impact on inner organs. Under a shielding layer, organ dose reduction is greater for skin than for other organs. The calculated skin dose per day closely approximates measurement data obtained in low-Earth orbit (LEO). (author)

  20. Monte Carlo modelling and comparison with experiment of the nuclide production in thick stony targets isotropically irradiated with 600 MeV protons

    International Nuclear Information System (INIS)

    Aylmer, D.; Herzog, G.F.; Kruse, T.H.; Cloth, P.; Filges, D.; Moniot, R.K.; Signer, P.; Wieler, R.; Tuniz, C.

    1987-05-01

    Depth profiles for the production of stable and radioactive nuclides have been measured for a large variety of target elements in three thick spherical stony targets, with radii of 5, 15 and 26 cm, isotropically irradiated with 600 MeV protons at the CERN synchrocyclotron. These irradiation experiments (CERN SC96) were intended to simulate the irradiation of meteoroids by galactic cosmic ray protons. To combine this experimental approach with a theoretical one, the intra- and internuclear cascades were calculated using Monte Carlo techniques via the high-energy transport code HET/KFA 1. Together with transport calculations for low-energy neutrons by the MORSE-CG code, the depth-dependent spectra of primary and secondary protons and of secondary neutrons were derived. On the basis of these spectra, a set of evaluated experimental excitation functions for p-induced reactions, and theoretical ones for n-induced reactions calculated by the code ALICE LIVERMORE 82, theoretical depth profiles for the production of stable and radioactive nuclides in the three thick targets were calculated. This report is a comprehensive survey of all target/product combinations for which both experimental and theoretical data are available. It provides the basis for a detailed discussion of the various production modes of residual nuclides and of the depth and size dependence of their production rates in thick stony targets, serving as a simulation of the galactic cosmic ray irradiation of meteoroids in space. Furthermore, the comparison of the experimental and theoretical depth profiles validates the high-energy transport calculations, making them a promising tool for further model calculations of the interactions of cosmic rays with matter. (orig.)

  1. SU-D-BRC-01: An Automatic Beam Model Commissioning Method for Monte Carlo Simulations in Pencil-Beam Scanning Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Qin, N; Shen, C; Tian, Z; Jiang, S; Jia, X [UT Southwestern Medical Ctr, Dallas, TX (United States)

    2016-06-15

    Purpose: Monte Carlo (MC) simulation is typically regarded as the most accurate dose calculation method for proton therapy. Yet for real clinical cases, the overall accuracy also depends on that of the MC beam model. Commissioning a beam model to faithfully represent a real beam requires finely tuning a set of model parameters, which can be tedious given the large number of pencil beams to commission. This abstract reports an automatic beam-model commissioning method for pencil-beam scanning proton therapy via an optimization approach. Methods: We modeled a real pencil beam with energy and spatial spread following Gaussian distributions. Mean energy, energy spread and spatial spread are the model parameters. To commission against a real beam, we first performed MC simulations to calculate dose distributions for a set of ideal (monoenergetic, zero-size) pencil beams. The dose distribution of a real pencil beam is hence a linear superposition of the doses of those ideal pencil beams, with weights in Gaussian form. We formulated the commissioning task as an optimization problem, such that the calculated central-axis depth dose and lateral profiles at several depths match the corresponding measurements. An iterative algorithm combining the conjugate gradient method and parameter fitting was employed to solve the optimization problem. We validated our method in simulation studies. Results: We calculated dose distributions for three real pencil beams with nominal energies of 83, 147 and 199 MeV using realistic beam parameters. These data were regarded as measurements and used for commissioning. After commissioning, the average differences in energy and beam spread between the determined values and the ground truth were 4.6% and 0.2%. With the commissioned model, we recomputed dose. Mean dose differences from measurements were 0.64%, 0.20% and 0.25%. Conclusion: The developed automatic MC beam-model commissioning method for pencil-beam scanning proton therapy can determine beam model parameters with
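    The superposition idea at the heart of the method can be sketched numerically: the real beam's depth dose is a Gaussian-weighted sum of ideal-beam doses, and the mean energy and energy spread are recovered by minimizing the mismatch to a "measured" curve. The sketch below substitutes a brute-force grid search for the paper's conjugate-gradient/parameter-fitting algorithm, and all curves are synthetic.

```python
import numpy as np

def real_beam_dose(ideal_doses, energies, mu, sigma):
    """Gaussian-weighted superposition of ideal (monoenergetic) depth doses."""
    w = np.exp(-0.5 * ((energies - mu) / sigma) ** 2)
    w /= w.sum()
    return w @ ideal_doses                  # (n_energies,) @ (n_energies, n_depth)

def commission(ideal_doses, energies, measured, mus, sigmas):
    """Brute-force search for (mu, sigma) minimizing sum-of-squares mismatch."""
    best = min(
        (np.sum((real_beam_dose(ideal_doses, energies, m, s) - measured) ** 2), m, s)
        for m in mus for s in sigmas
    )
    return best[1], best[2]

energies = np.linspace(140, 154, 15)
depth = np.linspace(0, 20, 50)
# crude synthetic ideal depth doses: each energy peaks at depth ~ E/10
ideal = np.array([np.exp(-0.5 * ((depth - e / 10.0) / 0.8) ** 2)
                  for e in energies])
measured = real_beam_dose(ideal, energies, 147.0, 2.0)  # ground truth

mu_fit, sigma_fit = commission(ideal, energies, measured,
                               np.arange(145.0, 150.0, 0.5),
                               np.arange(1.0, 3.5, 0.5))
print(mu_fit, sigma_fit)
```

    Because the ground-truth parameters sit on the search grid, the fit recovers them exactly here; a gradient-based solver, as in the abstract, avoids the grid and also fits the spatial spread.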

  2. Reoptimization of Intensity Modulated Proton Therapy Plans Based on Linear Energy Transfer

    Energy Technology Data Exchange (ETDEWEB)

    Unkelbach, Jan, E-mail: junkelbach@mgh.harvard.edu [Department of Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts (United States); Botas, Pablo [Department of Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts (United States); Faculty of Physics, Ruprecht-Karls-Universität Heidelberg, Heidelberg (Germany); Giantsoudi, Drosoula; Gorissen, Bram L.; Paganetti, Harald [Department of Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts (United States)

    2016-12-01

    Purpose: We describe a treatment plan optimization method for intensity modulated proton therapy (IMPT) that avoids high values of linear energy transfer (LET) in critical structures located within or near the target volume while limiting degradation of the best possible physical dose distribution. Methods and Materials: To allow fast optimization based on dose and LET, a GPU-based Monte Carlo code was extended to provide dose-averaged LET in addition to dose for all pencil beams. After optimizing an initial IMPT plan based on physical dose, a prioritized optimization scheme is used to modify the LET distribution while constraining the physical dose objectives to values close to the initial plan. The LET optimization step is performed based on objective functions evaluated for the product of LET and physical dose (LET×D). To first approximation, LET×D represents a measure of the additional biological dose that is caused by high LET. Results: The method is effective for treatments where serial critical structures with maximum dose constraints are located within or near the target. We report on 5 patients with intracranial tumors (high-grade meningiomas, base-of-skull chordomas, ependymomas) in whom the target volume overlaps with the brainstem and optic structures. In all cases, high LET×D in critical structures could be avoided while minimally compromising physical dose planning objectives. Conclusion: LET-based reoptimization of IMPT plans represents a pragmatic approach to bridge the gap between purely physical dose-based and relative biological effectiveness (RBE)-based planning. The method makes IMPT treatments safer by mitigating a potentially increased risk of side effects resulting from elevated RBE of proton beams near the end of range.
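    As a toy illustration of the LET×D surrogate, a maximum-LET×D objective for a critical structure can be written as a quadratic penalty on voxels exceeding a cap. The threshold and voxel values below are invented, and the GPU Monte Carlo and prioritized-optimization machinery are omitted entirely.

```python
import numpy as np

# Quadratic overdose-style penalty applied to LET*dose (LETxD) rather than
# to dose alone. All values are synthetic and purely illustrative.

def letxd_overdose_penalty(dose, let, max_letxd):
    """Mean squared excess of LET*dose above max_letxd over the voxels."""
    letxd = np.asarray(dose) * np.asarray(let)
    excess = np.maximum(letxd - max_letxd, 0.0)
    return float(np.mean(excess ** 2))

dose = [1.8, 2.0, 2.1]       # physical dose per voxel (made up)
let  = [3.0, 6.0, 9.0]       # dose-averaged LET, elevated near end of range
print(letxd_overdose_penalty(dose, let, 12.0))
```

    Only the last voxel exceeds the cap, so the penalty is driven by the high-LET voxel near the end of range, which is the situation the reoptimization targets.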

  3. GPU-based fast pencil beam algorithm for proton therapy

    International Nuclear Information System (INIS)

    Fujimoto, Rintaro; Nagamine, Yoshihiko; Kurihara, Tsuneya

    2011-01-01

    Performance of a treatment planning system is an essential factor in making sophisticated plans. Dose calculation is a major time-consuming process in planning operations. The standard algorithm for proton dose calculations is the pencil beam algorithm, which produces relatively accurate results but is time consuming. In order to shorten the computational time, we have developed a GPU (graphics processing unit)-based pencil beam algorithm. We implemented this algorithm, calculated dose distributions for a water phantom, and compared the results to those obtained by a traditional method with respect to computational time and the discrepancy between the two methods. The new algorithm runs 5-20 times faster on an NVIDIA GeForce GTX 480 card than on an Intel Core i7-920 processor. The maximum discrepancy of the dose distribution is within 0.2%. Our results show that GPUs are effective for proton dose calculations.
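    The structure of a pencil-beam dose calculation, which is what a GPU version parallelizes over beamlets and voxels, can be sketched in NumPy: each beamlet contributes a central-axis depth-dose term times a lateral Gaussian whose width grows with depth. The kernels below are invented, not the authors' beam data.

```python
import numpy as np

def pencil_beam_dose(x_grid, z_grid, beamlet_x, idd, sigma0=0.3, growth=0.05):
    """Dose on an (x, z) grid from beamlets at lateral positions beamlet_x.

    idd: integral depth dose sampled on z_grid (shared by all beamlets here).
    Lateral spread model (made up): sigma(z) = sigma0 + growth * z.
    """
    x = np.asarray(x_grid)[:, None]          # (nx, 1)
    z = np.asarray(z_grid)[None, :]          # (1, nz)
    sigma = sigma0 + growth * z
    dose = np.zeros((x.size, z.size))
    for bx in beamlet_x:
        lateral = (np.exp(-0.5 * ((x - bx) / sigma) ** 2)
                   / (sigma * np.sqrt(2 * np.pi)))
        dose += lateral * np.asarray(idd)[None, :]
    return dose

x_grid = np.linspace(-2, 2, 81)
z_grid = np.linspace(0, 10, 50)
idd = np.exp(-0.5 * ((z_grid - 8.0) / 0.7) ** 2)   # crude Bragg-like curve
dose = pencil_beam_dose(x_grid, z_grid, beamlet_x=[-0.5, 0.0, 0.5], idd=idd)
print(dose.shape)
```

    The loop over beamlets and the broadcasted voxel grid are exactly the axes a GPU implementation maps onto threads.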

  4. Implementation of a Monte Carlo based inverse planning model for clinical IMRT with MCNP code

    International Nuclear Information System (INIS)

    He, Tongming Tony

    2003-01-01

    Inaccurate dose calculations and limitations of optimization algorithms in inverse planning introduce systematic and convergence errors into treatment plans. This work implemented a Monte Carlo based inverse planning model for clinical IMRT, aiming to minimize the aforementioned errors. The strategy was to precalculate the dose matrices of beamlets with a Monte Carlo based method, followed by optimization of the beamlet intensities. The MCNP 4B (Monte Carlo N-Particle version 4B) code was modified to implement selective particle transport and dose tallying in voxels, and efficient estimation of statistical uncertainties. The resulting performance gain was over eleven-thousand-fold. Owing to concurrent calculation of multiple beamlets of individual ports, the hundreds of beamlets in an IMRT plan could be calculated within a practical length of time. A finite-sized point source model provided simple and accurate modeling of the treatment beams. The dose matrix calculations were validated through measurements in phantoms; agreement was better than 1.5% or 0.2 cm. The beamlet intensities were optimized using a parallel-platform-based optimization algorithm capable of escaping local minima and preventing premature convergence. The Monte Carlo based inverse planning model was applied to clinical cases, demonstrating the feasibility and capability of Monte Carlo based inverse planning for clinical IMRT. Systematic errors in treatment plans of a commercial inverse planning system were assessed in comparison with the Monte Carlo based calculations. Discrepancies in tumor doses and critical structure doses were up to 12% and 17%, respectively. The clinical importance of Monte Carlo based inverse planning for IMRT was demonstrated.
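    The second stage of such a model, optimizing beamlet intensities against precomputed dose matrices, can be sketched as nonnegative least squares solved by projected gradient descent. This is a generic stand-in for the paper's parallel optimization algorithm; the dose matrix here is random, not MCNP output.

```python
import numpy as np

def optimize_intensities(D, d, iters=2000):
    """Minimize ||D w - d||**2 subject to w >= 0 via projected gradient."""
    w = np.zeros(D.shape[1])
    step = 1.0 / np.linalg.norm(D, 2) ** 2      # 1/L, L = Lipschitz const of grad
    for _ in range(iters):
        grad = D.T @ (D @ w - d)                # gradient of the quadratic
        w = np.maximum(w - step * grad, 0.0)    # project onto nonnegative weights
    return w

rng = np.random.default_rng(0)
D = rng.random((40, 8))                         # voxels x beamlets dose matrix
w_true = np.array([1.0, 0.0, 2.0, 0.5, 0.0, 1.5, 0.0, 0.8])
d = D @ w_true                                  # a consistent "prescription"
w = optimize_intensities(D, d)
print(np.round(w, 3))
```

    Because the prescription is consistent by construction, the recovered intensities approach the generating weights; clinical objectives replace the plain least-squares term, but the precompute-then-optimize split is the same.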

  5. Proton radiotherapy in management of pediatric base of skull tumors

    International Nuclear Information System (INIS)

    Hug, Eugen B.; Sweeney, Reinhart A.; Nurre, Pamela M.; Holloway, Kitty C.; Slater, Jerry D.; Munzenrider, John E.

    2002-01-01

    Purpose: Primary skull base tumors of the developing child are rare and present a formidable challenge to both surgeons and radiation oncologists. Gross total resection with negative margins is rarely achieved, and the risks of functional, structural, and cosmetic deficits limit the radiation dose using conventional radiation techniques. Twenty-nine children and adolescents treated with conformal proton radiotherapy (proton RT) were analyzed to assess treatment efficacy and safety. Methods and Materials: Between July 1992 and April 1999, 29 patients with mesenchymal tumors underwent fractionated proton (13 patients) or fractionated combined proton and photon (16 patients) irradiation. The age at treatment ranged from 1 to 19 years (median 12); 14 patients were male and 15 female. Tumors were grouped as malignant or benign. Twenty patients had malignant histologic findings, including chordoma (n=10), chondrosarcoma (n=3), rhabdomyosarcoma (n=4), and other sarcomas (n=3). Target doses ranged between 50.4 and 78.6 Gy/cobalt Gray equivalent (CGE), delivered at doses of 1.8-2.0 Gy/CGE per fraction. The benign histologic findings included giant cell tumors (n=6), angiofibromas (n=2), and chondroblastoma (n=1). RT doses for this group ranged from 45.0 to 71.8 Gy/CGE. Despite maximal surgical resection, 28 (97%) of 29 patients had gross disease at the time of proton RT. Follow-up after proton RT ranged from 13 to 92 months (mean 40). Results: Of the 20 patients with malignant tumors, 5 (25%) had local failure; 1 patient had failure in the surgical access route and 3 patients developed distant metastases. Seven patients had died of progressive disease at the time of analysis. Local tumor control was maintained in 6 (60%) of 10 patients with chordoma, 3 (100%) of 3 with chondrosarcoma, 4 (100%) of 4 with rhabdomyosarcoma, and 2 (66%) of 3 with other sarcomas. 
The actuarial 5-year local control and overall survival rate was 72% and 56%, respectively, and the overall survival

  6. SU-E-T-180: Fano Cavity Test of Proton Transport in Monte Carlo Codes Running On GPU and Xeon Phi

    International Nuclear Information System (INIS)

    Sterpin, E; Sorriaux, J; Souris, K; Lee, J; Vynckier, S; Schuemann, J; Paganetti, H; Jia, X; Jiang, S

    2014-01-01

    Purpose: In proton dose calculation, clinically compatible speeds are now achieved with Monte Carlo codes (MC) that combine 1) adequate simplifications in the physics of transport and 2) the use of hardware architectures enabling massive parallel computing (like GPUs). However, the uncertainties related to the transport algorithms used in these codes must be kept minimal. Such algorithms can be checked with the so-called “Fano cavity test”. We implemented the test in two codes that run on specific hardware: gPMC on an nVidia GPU and MCsquare on an Intel Xeon Phi (60 cores). Methods: gPMC and MCsquare are designed for transporting protons in CT geometries. Both codes use the method of fictitious interaction to sample the step-length for each transport step. The considered geometry is a water cavity (2×2×0.2 cm³, 0.001 g/cm³) in a 10×10×50 cm³ water phantom (1 g/cm³). CPE in the cavity is established by generating protons over the phantom volume with a uniform momentum (energy E) and a uniform intensity per unit mass I. Assuming no nuclear reactions and no generation of other secondaries, the computed cavity dose should equal IE, according to Fano's theorem. Both codes were tested for initial proton energies of 50, 100, and 200 MeV. Results: For all energies, gPMC and MCsquare are within 0.3% and 0.2% of the theoretical value IE, respectively (0.1% standard deviation). Single-precision computations (instead of double) increased the error by about 0.1% in MCsquare. Conclusion: Despite the simplifications in the physics of transport, both gPMC and MCsquare successfully pass the Fano test. This ensures optimal accuracy of the codes for clinical applications within the uncertainties on the underlying physical models. It also opens the path to other applications of these codes, like the simulation of ion chamber response

  7. SU-E-T-180: Fano Cavity Test of Proton Transport in Monte Carlo Codes Running On GPU and Xeon Phi

    Energy Technology Data Exchange (ETDEWEB)

    Sterpin, E; Sorriaux, J; Souris, K; Lee, J; Vynckier, S [Universite catholique de Louvain, Brussels, Brussels (Belgium); Schuemann, J; Paganetti, H [Massachusetts General Hospital, Boston, MA (United States); Jia, X; Jiang, S [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-01

    Purpose: In proton dose calculation, clinically compatible speeds are now achieved with Monte Carlo codes (MC) that combine 1) adequate simplifications in the physics of transport and 2) the use of hardware architectures enabling massive parallel computing (like GPUs). However, the uncertainties related to the transport algorithms used in these codes must be kept minimal. Such algorithms can be checked with the so-called “Fano cavity test”. We implemented the test in two codes that run on specific hardware: gPMC on an nVidia GPU and MCsquare on an Intel Xeon Phi (60 cores). Methods: gPMC and MCsquare are designed for transporting protons in CT geometries. Both codes use the method of fictitious interaction to sample the step-length for each transport step. The considered geometry is a water cavity (2×2×0.2 cm³, 0.001 g/cm³) in a 10×10×50 cm³ water phantom (1 g/cm³). CPE in the cavity is established by generating protons over the phantom volume with a uniform momentum (energy E) and a uniform intensity per unit mass I. Assuming no nuclear reactions and no generation of other secondaries, the computed cavity dose should equal IE, according to Fano's theorem. Both codes were tested for initial proton energies of 50, 100, and 200 MeV. Results: For all energies, gPMC and MCsquare are within 0.3% and 0.2% of the theoretical value IE, respectively (0.1% standard deviation). Single-precision computations (instead of double) increased the error by about 0.1% in MCsquare. Conclusion: Despite the simplifications in the physics of transport, both gPMC and MCsquare successfully pass the Fano test. This ensures optimal accuracy of the codes for clinical applications within the uncertainties on the underlying physical models. It also opens the path to other applications of these codes, like the simulation of ion chamber response.
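The fictitious-interaction method both codes use to sample step lengths can be demonstrated in miniature: tentative collisions are sampled with a single majorant cross section and accepted with probability σ(x)/σ_maj, so material boundaries never have to be intersected. The two-slab pure-absorber geometry below is an invented test case, chosen because its transmission is known analytically:

```python
import math
import random

def woodcock_transmission(n, sigma1, sigma2, t1, t2, seed=1):
    """Fictitious-interaction (Woodcock/delta) tracking through two slabs of
    pure absorber with total cross sections sigma1, sigma2 and thicknesses
    t1, t2. Steps are drawn from the majorant cross section; each tentative
    collision is accepted as real with probability sigma(x)/sigma_maj,
    otherwise the flight simply continues."""
    rng = random.Random(seed)
    sigma_maj = max(sigma1, sigma2)
    total = t1 + t2
    transmitted = 0
    for _ in range(n):
        x = 0.0
        while True:
            x += -math.log(1.0 - rng.random()) / sigma_maj
            if x >= total:
                transmitted += 1
                break
            sigma = sigma1 if x < t1 else sigma2
            if rng.random() < sigma / sigma_maj:
                break                          # real collision: particle absorbed
    return transmitted / n

est = woodcock_transmission(100000, sigma1=1.0, sigma2=0.1, t1=1.0, t2=2.0)
# analytic transmission for pure absorbers: exp(-(1.0*1.0 + 0.1*2.0)) = exp(-1.2)
```

The estimate converging to the analytic Beer-Lambert value is the kind of internal consistency the Fano test checks at a much deeper level.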

  8. Comparison of depth-dose distributions of proton therapeutic beams calculated by means of logical detectors and ionization chamber modeled in Monte Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    Pietrzak, Robert [Department of Nuclear Physics and Its Applications, Institute of Physics, University of Silesia, Katowice (Poland); Konefał, Adam, E-mail: adam.konefal@us.edu.pl [Department of Nuclear Physics and Its Applications, Institute of Physics, University of Silesia, Katowice (Poland); Sokół, Maria; Orlef, Andrzej [Department of Medical Physics, Maria Sklodowska-Curie Memorial Cancer Center, Institute of Oncology, Gliwice (Poland)

    2016-08-01

    The success of proton therapy depends strongly on the precision of treatment planning. Dose distribution in biological tissue may be obtained from Monte Carlo simulations using various scientific codes making it possible to perform very accurate calculations. However, there are many factors affecting the accuracy of modeling. One of them is a structure of objects called bins registering a dose. In this work the influence of bin structure on the dose distributions was examined. The MCNPX code calculations of Bragg curve for the 60 MeV proton beam were done in two ways: using simple logical detectors being the volumes determined in water, and using a precise model of ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in the water phantom with Marcus ionization chamber. The average local dose difference between the measured relative doses in the water phantom and those calculated by means of the logical detectors was 1.4% at first 25 mm, whereas in the full depth range this difference was 1.6% for the maximum uncertainty in the calculations less than 2.4% and for the maximum measuring error of 1%. In case of the relative doses calculated with the use of the ionization chamber model this average difference was somewhat greater, being 2.3% at depths up to 25 mm and 2.4% in the full range of depths for the maximum uncertainty in the calculations of 3%. In the dose calculations the ionization chamber model does not offer any additional advantages over the logical detectors. The results provided by both models are similar and in good agreement with the measurements, however, the logical detector approach is a more time-effective method. - Highlights: • Influence of the bin structure on the proton dose distributions was examined for the MC simulations. • The considered relative proton dose distributions in water correspond to the clinical application. • MC simulations performed with the logical detectors and the

  9. Comparative evaluations of the Monte Carlo-based light propagation simulation packages for optical imaging

    Directory of Open Access Journals (Sweden)

    Lin Wang

    2018-01-01

    Monte Carlo simulation of light propagation in turbid media has been studied for years, and a number of software packages have been developed to handle this problem. However, it is hard to compare these simulation packages, especially for tissues with complex heterogeneous structures. Here, we first designed a group of mesh datasets generated with the Iso2Mesh software and used them to cross-validate the accuracy and evaluate the performance of four Monte Carlo-based simulation packages: Monte Carlo model of steady-state light transport in multi-layered tissues (MCML), tetrahedron-based inhomogeneous Monte Carlo optical simulator (TIMOS), Molecular Optical Simulation Environment (MOSE), and Mesh-based Monte Carlo (MMC). The performance of each package was evaluated on the designed mesh datasets, and the merits and demerits of each package were discussed. Comparative results showed that the TIMOS package provided the best performance, proving to be a reliable, efficient, and stable MC simulation package for users.
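The common core of all four packages, a weighted photon random walk, can be reduced to a one-dimensional sketch. The optical coefficients, the isotropic (rather than Henyey-Greenstein) phase function, and the crude weight cutoff are simplifying assumptions, not the model of any package listed:

```python
import math
import random

def mc_photon_slab(n, mu_a, mu_s, thickness, seed=7):
    """Minimal MCML-style weighted photon walk in a 1-D homogeneous slab:
    step lengths are exponential in the total coefficient mu_t, a fraction
    mu_a/mu_t of the weight is deposited at each interaction, and scattering
    is isotropic (only the z-direction cosine matters in 1-D)."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    absorbed = escaped = 0.0
    for _ in range(n):
        z, uz, w = 0.0, 1.0, 1.0
        while w > 1e-3:
            z += uz * (-math.log(1.0 - rng.random()) / mu_t)
            if z < 0.0 or z > thickness:
                escaped += w                  # photon left through a face
                break
            absorbed += w * (mu_a / mu_t)     # deposit the absorbed fraction
            w *= mu_s / mu_t
            uz = 2.0 * rng.random() - 1.0     # isotropic: uniform direction cosine
        else:
            absorbed += w  # dump residual weight (real codes use Russian roulette)
    return absorbed, escaped

absorbed, escaped = mc_photon_slab(5000, mu_a=0.5, mu_s=5.0, thickness=1.0)
```

Weight conservation (absorbed plus escaped weight equals the number of launched photons) gives a cheap internal consistency check before comparing packages against each other.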

  10. The influence of lateral beam profile modifications in scanned proton and carbon ion therapy: a Monte Carlo study

    CERN Document Server

    Parodi, K; Kraemer, M; Sommerer, F; Naumann, J; Mairani, A; Brons, S

    2010-01-01

    Scanned ion beam delivery promises superior flexibility and accuracy for highly conformal tumour therapy in comparison to passive beam shaping systems. The attainable precision demands correct overlapping of the pencil-like beams which build up the entire dose distribution in the treatment field. In particular, improper dose application due to deviations of the lateral beam profiles from the nominal planning conditions must be prevented via appropriate beam monitoring in the beamline, prior to the entrance into the patient. To assess the necessary tolerance thresholds of the beam monitoring system at the Heidelberg Ion Beam Therapy Center, Germany, this study investigated several worst-case scenarios for a sensitive treatment plan, namely scanned proton and carbon ion delivery to a small target volume at a shallow depth. Deviations from the nominal lateral beam profiles were simulated, which may occur because of misaligned elements or changes of the beam optics in the beamline. Data have been an...

  11. Pentanol-based target material with polarized protons

    International Nuclear Information System (INIS)

    Bunyatova, E.I.

    1992-01-01

    1-pentanol is a promising material for a target with polarized protons owing to its high resistance to radiation damage. To develop the target, solutions of 1-pentanol or 2-pentanol with complexes of pentavalent chromium were investigated. A material based on an EHBA-Cr(V) solution in a glass-like matrix consisting of 1-pentanol, 3-pentanol and 1,2-propanediol was proposed as the target material. It was investigated by electron paramagnetic resonance and differential scanning calorimetry methods. 24 refs.; 3 figs.; 1 tab

  12. Range verification for eye proton therapy based on proton-induced x-ray emissions from implanted metal markers

    International Nuclear Information System (INIS)

    Rosa, Vanessa La; Royle, Gary; Gibson, Adam; Kacperek, Andrzej

    2014-01-01

    Metal fiducial markers are often implanted on the back of the eye before proton therapy to improve target localization and reduce patient setup errors. We aim to detect characteristic x-ray emissions from metal targets during proton therapy to verify the treatment range accuracy. Initially gold was chosen for its biocompatibility properties. Proton-induced x-ray emissions (PIXE) from a 15 mm diameter gold marker were detected at different penetration depths of a 59 MeV proton beam at the CATANA proton facility at INFN-LNS (Italy). The Monte Carlo code Geant4 was used to reproduce the experiment and to investigate the effect of different marker sizes, materials, and the response to both mono-energetic and fully modulated beams. The intensity of the emitted x-rays decreases with decreasing proton energy and thus decreases with depth. If we assume the range to be the depth at which the dose is reduced to 10% of its maximum value and define the residual range as the distance between the marker and the range of the beam, then the minimum residual range which can be detected with a 95% confidence level is the depth at which the PIXE peak equals 1.96σ_bkg, where σ_bkg is the standard deviation of the background noise. With our system and experimental setup this value is 3 mm when 20 GyE are delivered to a gold marker of 15 mm diameter. Results from silver are more promising: even when a 5 mm diameter silver marker is placed at a depth equal to the range, the PIXE peak is 2.1σ_bkg. Although these quantitative results depend on the experimental setup used in this research study, they demonstrate that real-time analysis of the PIXE emitted by fiducial metal markers can be used to derive the beam range. Further analyses are needed to demonstrate the feasibility of the technique in a clinical setup. (paper)
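The 95% detection criterion used here, requiring the PIXE peak to exceed 1.96σ_bkg, translates directly into a minimum detectable net peak. The sketch below assumes a Poisson background so that σ_bkg = √B; that is our assumption, whereas the paper estimates σ_bkg from the measured spectrum:

```python
import math

def min_detectable_peak(background_counts, z=1.96):
    """Smallest net PIXE peak distinguishable from background at ~95%
    confidence: the peak must exceed z * sigma_bkg, with sigma_bkg taken as
    sqrt(B) under the Poisson-background assumption."""
    return z * math.sqrt(background_counts)

limit = min_detectable_peak(400.0)   # 1.96 * sqrt(400) = 39.2 net counts
```

A peak below this limit cannot be distinguished from a background fluctuation at the stated confidence level, which is what sets the 3 mm minimum residual range quoted above.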

  13. TU-EF-304-10: Efficient Multiscale Simulation of the Proton Relative Biological Effectiveness (RBE) for DNA Double Strand Break (DSB) Induction and Bio-Effective Dose in the FLUKA Monte Carlo Radiation Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Moskvin, V; Tsiamas, P; Axente, M; Farr, J [St. Jude Children’s Research Hospital, Memphis, TN (United States); Stewart, R [University of Washington, Seattle, WA. (United States)

    2015-06-15

    Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587–602, 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3 and helium-4 ions, and delta-electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent ⁶⁰Co γ-ray dose for representative proton beams incident on cells in aerobic and anoxic environments. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases lateral to the beam axis in the region of the Bragg peak. At the distal edge, the RBE is in the range 1.3–1.4 for cells irradiated under aerobic conditions and may be as large as 1.5–1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
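Folding dose-weighted RBE coefficients into an equivalent ⁶⁰Co dose reduces, per voxel, to a weighted sum over the dose components. The dose values and RBE factors below are invented for illustration; the paper's coefficients come from the MCDS model:

```python
def equivalent_dose(doses_gy, rbes):
    """Co-60-equivalent dose for one voxel: each physical dose contribution
    d_i (primary protons, secondary ions, delta electrons, ...) is scaled by
    its RBE for DSB induction and the products are summed."""
    if len(doses_gy) != len(rbes):
        raise ValueError("one RBE per dose component")
    return sum(d * r for d, r in zip(doses_gy, rbes))

# plateau-region voxel: mostly primaries near RBE ~1.02, small heavier-ion terms
d_eq = equivalent_dose([1.8, 0.15, 0.05], [1.02, 1.2, 1.5])
```

Because secondary heavy ions carry larger RBE values, their shrinking dose share with depth is what drives the depth dependence described in the abstract.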

  14. Composite proton exchange membrane based on sulfonated organic nanoparticles

    Science.gov (United States)

    Pitia, Emmanuel Sokiri

    exchange was characterized with solid-state ¹³C NMR spectroscopy, FTIR spectroscopy, TGA, elemental analysis, and titration. The results indicate the extent of ion exchange was ~70-80%. Due to the mass of QAA, the remaining QAA reduced the IEC of the nanoparticles to < 2.2 meq/g. In fabricating the composite membranes, the nanoparticles and polystyrene were solution cast in a continuous process with and without an electric field. The electric field had no effect on the water uptake. Based on the morphology and the proton conductivity, it appears orientation of the nanoparticles did not occur. We hypothesize the lack of orientation was caused by swelling of the particles with the solvent: the solvent inside the particle minimized polarizability and thus prevented orientation. The composite membranes were limited to a low proton conductivity of ~10⁻⁵ S/cm due to the low IEC of the nanoparticles, but good dispersion of the nanoparticles was achieved. Future work should look into eliminating the QAA during synthesis and developing a rigid core for the nanoparticles.

  15. Optimization of a neutron production target based on the ⁷Li(p,n)⁷Be reaction with the Monte Carlo Method

    International Nuclear Information System (INIS)

    Burlon, Alejandro A.; Kreiner, Andres J.; Minsky, Daniel; Valda, Alejandro A.; Somacal, Hector R.

    2003-01-01

    In order to optimize a neutron production target for accelerator-based boron neutron capture therapy (AB-BNCT), a Monte Carlo Neutron and Photon (MCNP) investigation has been performed. Neutron fields from a LiF thick target (with both a D₂O-graphite and an Al/AlF₃-graphite moderator/reflector assembly) were evaluated along the centerline in a head phantom. The target neutron beam was simulated from the ⁷Li(p,n)⁷Be nuclear reaction for 1.89, 2.0 and 2.3 MeV protons. The results show that it is more advantageous to irradiate the target with near-resonance-energy protons (2.3 MeV) because of the high neutron yield at this energy. On the other hand, the Al/AlF₃-graphite assembly exhibits a more efficient performance than D₂O. (author)

  16. SU-F-T-139: Meeting the Challenges of Quality Control in the TOPAS Monte Carlo Simulation Toolkit for Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Hall, D; Schuemann, J; Paganetti, H [Massachusetts General Hospital, Boston, MA (United States); Perl, J [Stanford Linear Accelerator Center, Menlo Park, CA (United States); Faddegon, B [UC San Francisco, San Francisco, CA (United States)

    2016-06-15

    Purpose: Monte Carlo particle transport simulation (MC) codes have become important tools in proton therapy and biology, both for research and practice. TOPAS is an MC toolkit serving users worldwide (213 licensed users at 95 institutions in 21 countries). It provides unprecedented ease in 4D placement of geometry components, beam sources and scoring through its user-friendly and reproducible parameter file interface. Quality control (QC) of stochastic simulation software is inherently difficult, and the versatility of TOPAS introduces additional challenges. But QC is vital as the TOPAS development team implements new features, addresses user feedback and reacts to upgrades of underlying software (i.e. Geant4). Methods: Whenever code is committed to our repository, over 50 separate module tests are automatically triggered via a continuous integration service. They check that the various module options execute successfully and that their results are statistically consistent with previous reference values. Prior to each software release, longer end-to-end tests automatically validate TOPAS against experimental data and a TOPAS benchmark. These include simulating multiple properties of spread-out Bragg peaks, validating nuclear models, and searching for differences in patient simulations. Results: Continuous integration has proven effective in catching regressions at the time they are introduced, particularly when implementing new features that involve refactoring code (e.g. multithreading and ntuple output). Code coverage statistics highlight untested portions of code and guide development of new tests. The various end-to-end tests demonstrate that TOPAS accurately describes the physics of proton therapy within clinical tolerances. Conclusion: The TOPAS QC strategy of frequent short tests and pre-release long tests has led to a more reliable tool. However, the versatility of TOPAS makes it difficult to predict how users might combine different modules, and so QC
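The statistical-consistency check at the heart of the module tests can be sketched as follows. The 3σ criterion and the quadrature combination of uncertainties are our assumptions about how such a test might look, not the actual TOPAS harness:

```python
def consistent_with_reference(value, sigma, ref_value, ref_sigma, k=3.0):
    """Regression test for a stochastic result: pass when the difference
    from the stored reference is within k combined standard deviations."""
    combined = (sigma ** 2 + ref_sigma ** 2) ** 0.5
    return abs(value - ref_value) <= k * combined

# a new simulation result checked against its stored reference
ok = consistent_with_reference(100.4, sigma=0.3, ref_value=100.0, ref_sigma=0.2)
```

A tolerance tied to the combined statistical uncertainty is what lets such tests catch genuine regressions without flagging ordinary Monte Carlo noise.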

  17. VIP-Man: An image-based whole-body adult male model constructed from color photographs of the visible human project for multi-particle Monte Carlo calculations

    International Nuclear Information System (INIS)

    Xu, X.G.; Chao, T.C.; Bozkurt, A.

    2000-01-01

    Human anatomical models have been indispensable to radiation protection dosimetry using Monte Carlo calculations. Existing MIRD-based mathematical models are easy to compute and standardize, but they are simplified and crude compared to human anatomy. This article describes the development of an image-based whole-body model, called VIP-Man, using transversal color photographic images obtained from the National Library of Medicine's Visible Human Project, for Monte Carlo organ dose calculations involving photons, electrons, neutrons, and protons. As the first of a series of papers on dose calculations based on VIP-Man, this article provides detailed information about how to construct an image-based model and how to adapt it into well-tested Monte Carlo codes: EGS4, MCNP4B, and MCNPX.

  18. Acceptance and implementation of a system of planning computerized based on Monte Carlo

    International Nuclear Information System (INIS)

    Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.

    2013-01-01

    The Monaco computerized treatment planning system has been accepted for clinical use. The system is based on a virtual model of the energy yield of the linear electron accelerator head and calculates the dose with an x-ray voxel Monte Carlo (XVMC) algorithm. (Author)

  19. Continuous energy Monte Carlo method based homogenization multi-group constants calculation

    International Nuclear Information System (INIS)

    Li Mancang; Wang Kan; Yao Dong

    2012-01-01

    The efficiency of the standard two-step reactor physics calculation relies on the accuracy of the multi-group constants from the assembly-level homogenization process. In contrast to traditional deterministic methods, generating the homogenized cross sections via the Monte Carlo method overcomes the difficulties in geometry and treats energy as a continuum, thus providing more accurate parameters. Besides, the same code and data bank can be used for a wide range of applications, making Monte Carlo codes versatile tools for homogenization. As the first stage in realizing Monte Carlo based lattice homogenization, the track-length scheme is used as the foundation of cross-section generation, which is straightforward. The scattering matrix and Legendre components, however, require special techniques; the Scattering Event method was proposed to solve this problem. There are no continuous-energy counterparts in the Monte Carlo calculation for neutron diffusion coefficients, so P₁ cross sections were used to calculate the diffusion coefficients for diffusion reactor simulator codes. B_N theory is applied to take the leakage effect into account when an infinite lattice of identical symmetric motives is assumed. The MCMC code was developed and applied to four assembly configurations to assess its accuracy and applicability. At the core level, a PWR prototype core was examined. The results show that the Monte Carlo based multi-group constants behave well on average. The method could be applied to nuclear reactor cores with complicated configurations to gain higher accuracy. (authors)
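The track-length scheme used as the foundation of cross-section generation amounts to a flux-weighted average over track segments: for each group g, Σ_g = Σ_i l_i Σ(E_i) / Σ_i l_i over segments whose energy falls in g, with the track length acting as the flux estimator. A minimal sketch with invented inputs (not the MCMC code's implementation):

```python
import numpy as np

def collapse_to_groups(track_lengths, energies, sigma_of_e, group_edges):
    """Track-length estimate of multigroup cross sections: a flux-weighted
    average of the continuous-energy cross section, with the track length of
    each segment serving as the flux estimator for its energy group."""
    track_lengths = np.asarray(track_lengths, dtype=float)
    energies = np.asarray(energies, dtype=float)
    sigmas = sigma_of_e(energies)
    groups = np.digitize(energies, group_edges) - 1   # group index per segment
    n_groups = len(group_edges) - 1
    out = np.zeros(n_groups)
    for g in range(n_groups):
        mask = groups == g
        if mask.any():
            out[g] = np.sum(track_lengths[mask] * sigmas[mask]) / np.sum(track_lengths[mask])
    return out

# sanity case: a constant cross section must collapse to itself in every group
sig = collapse_to_groups([1.0, 2.0, 0.5], [0.1, 5.0, 0.2],
                         lambda e: np.full_like(e, 3.0), [0.0, 1.0, 10.0])
```

In a real code the segments would come from the random walk itself; the constant-cross-section case above is just the simplest correctness check.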

  20. Evaluation of tomographic-image based geometries with PENELOPE Monte Carlo

    International Nuclear Information System (INIS)

    Kakoi, A.A.Y.; Galina, A.C.; Nicolucci, P.

    2009-01-01

    The Monte Carlo method can be used to evaluate treatment planning systems or to determine dose distributions in radiotherapy planning owing to its accuracy and precision. In the Monte Carlo simulation packages typically used in radiotherapy, however, a realistic representation of the patient geometry cannot be used, which compromises the accuracy of the results. In this work, an algorithm for describing geometries based on CT images of patients, developed for use with the Monte Carlo simulation package PENELOPE, is tested by simulating the dose distribution produced by a 10 MV photon beam. The simulated geometry was based on CT images from a prostate cancer treatment plan. The volumes of interest in the treatment were adequately represented in the simulation geometry, allowing the algorithm to be used for dose verification in radiotherapy treatments. (author)

  1. Monte Carlo simulations and experimental results on neutron production in the spallation target QUINTA irradiated with 660 MeV protons

    International Nuclear Information System (INIS)

    Khushvaktov, J.H.; Yuldashev, B.S.; Adam, J.; Vrzalova, J.; Baldin, A.A.; Furman, W.I.; Gustov, S.A.; Kish, Yu.V.; Solnyshkin, A.A.; Stegailov, V.I.; Tichy, P.; Tsoupko-Sitnikov, V.M.; Tyutyunnikov, S.I.; Zavorka, L.; Svoboda, J.; Zeman, M.; Vespalec, R.; Wagner, V.

    2017-01-01

    The activation experiment was performed using the accelerated beam of the Phasotron accelerator at the Joint Institute for Nuclear Research (JINR). The natural uranium spallation target QUINTA was irradiated with protons of energy 660 MeV. Monte Carlo simulations were performed using the FLUKA and Geant4 codes. The number of leakage neutrons from the sections of the uranium target surrounded by the lead shielding and the number of leakage neutrons from the lead shield were determined, as was the total number of fissions in the QUINTA setup. Experimental values of reaction rates for the nuclei produced in the ¹²⁷I sample were obtained, and several of the reaction rates were compared with the results of simulations with the FLUKA and Geant4 codes. The experimentally determined fluence of neutrons in the energy range of 10-200 MeV, obtained using the (n,xn) reactions in the ¹²⁷I(NaI) sample, was compared with the results of simulations. The possibility of transmutation of the long-lived radionuclide ¹²⁹I in the QUINTA setup was estimated.

  2. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    Science.gov (United States)

    Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.

    2015-09-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.
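The dependence on the field-to-spot size ratio has a simple geometric core: the dose at the centre of a square field assembled from densely packed Gaussian spots falls short of the broad-field value by a factor of erf(L/(2√2σ)) per lateral axis. The sketch below shows that factor alone; it is a geometry-only illustration, not the TPS or MC model from the abstract:

```python
import math

def central_dose_fraction(field_side_mm, spot_sigma_mm):
    """Fraction of the broad-field central-axis dose reached at the centre of
    a square field of side L built from densely packed Gaussian pencil beams
    of lateral spread sigma: erf(L / (2*sqrt(2)*sigma)) in each of the two
    lateral directions."""
    a = field_side_mm / (2.0 * math.sqrt(2.0) * spot_sigma_mm)
    return math.erf(a) ** 2

small = central_dose_fraction(5.0, 6.0)    # 5 mm field, broad deep-seated spots
large = central_dose_fraction(30.0, 6.0)   # 30 mm field, same spot width
```

When the field side is comparable to or smaller than the spot width, most of each spot's lateral profile falls outside the field, which is why small, shallow targets stress the analytical lateral-spread model hardest.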

  3. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    International Nuclear Information System (INIS)

    Magro, G; Molinelli, S; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Valvo, F; Fossati, P; Ciocca, M; Ferrari, A

    2015-01-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5–30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification. (paper)

  4. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    CERN Document Server

    Magro, G; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Ferrari, A; Valvo, F; Fossati, P; Ciocca, M

    2015-01-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS), in optimising proton pencil beam dose distributions for small targets of different sizes (5–30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS calculated and MC simulated values were mainly due to the different lateral spread modeling and resulted in being related to the field-to-spot size r...

  5. Acceptance and commissioning of a Monte Carlo-based computerized planning system; Aceptacion y puesta en marcha de un sistema de planificacion comutarizada basado en Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.

    2013-07-01

    Acceptance testing for clinical use of the Monaco computerized planning system has been carried out. The system is based on a virtual model of the energy yield of the linear electron accelerator head and performs the dose calculation with an X-ray algorithm (XVMC) based on the Monte Carlo method. (Author)

  6. Response matrix Monte Carlo based on a general geometry local calculation for electron transport

    International Nuclear Information System (INIS)

    Ballinger, C.T.; Rathkopf, J.A.; Martin, W.R.

    1991-01-01

    A Response Matrix Monte Carlo (RMMC) method has been developed for solving electron transport problems. This method was born of the need to have a reliable, computationally efficient transport method for low energy electrons (below a few hundred keV) in all materials. Today, condensed history methods are used which reduce the computation time by modeling the combined effect of many collisions but fail at low energy because of the assumptions required to characterize the electron scattering. Analog Monte Carlo simulations are prohibitively expensive since electrons undergo Coulombic scattering with little state change after a collision. The RMMC method attempts to combine the accuracy of an analog Monte Carlo simulation with the speed of the condensed history methods. Like condensed history, the RMMC method uses probability distribution functions (PDFs) to describe the energy and direction of the electron after several collisions. However, unlike the condensed history method, the PDFs are based on an analog Monte Carlo simulation over a small region. Condensed history theories require assumptions about the electron scattering to derive the PDFs for direction and energy. Thus the RMMC method samples from PDFs which more accurately represent the electron random walk. Results show good agreement between the RMMC method and analog Monte Carlo. 13 refs., 8 figs
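The core idea of record 6, sampling a many-collision "macro-step" from pre-tabulated PDFs instead of tracking each Coulombic collision, can be sketched as follows. The RESPONSE table (energy bins, outcome pairs, weights) is invented for illustration; in the RMMC method it would be filled from a local analog Monte Carlo run over a small region.

```python
import random

# Hypothetical response-matrix data: for each incident-energy bin, a discrete
# PDF over (fractional energy loss, deflection angle in degrees) outcomes.
# These numbers are illustrative only, not from the paper.
RESPONSE = {
    "low":  [((0.05, 5.0), 0.6), ((0.15, 15.0), 0.3), ((0.30, 40.0), 0.1)],
    "high": [((0.02, 2.0), 0.7), ((0.08, 10.0), 0.2), ((0.20, 25.0), 0.1)],
}

def macro_step(energy_kev, rng):
    """Sample the electron state after many collisions at once from the
    tabulated PDF, instead of simulating each small-angle collision."""
    outcomes, weights = zip(*RESPONSE["high" if energy_kev > 100.0 else "low"])
    frac_loss, angle_deg = rng.choices(outcomes, weights=weights, k=1)[0]
    return energy_kev * (1.0 - frac_loss), angle_deg

def transport(energy_kev, cutoff_kev=1.0, seed=42):
    """Follow one electron down to the cutoff energy; return macro-step count."""
    rng = random.Random(seed)
    steps = 0
    while energy_kev > cutoff_kev:
        energy_kev, _ = macro_step(energy_kev, rng)
        steps += 1
    return steps

print(transport(300.0))
```

Each call to `macro_step` replaces what would be thousands of analog collision samples, which is where the claimed speed advantage over analog Monte Carlo comes from.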

  7. Proton therapy for tumors of the skull base

    Energy Technology Data Exchange (ETDEWEB)

    Munzenrider, J.E.; Liebsch, N.J. [Dept. of Radiation Oncology, Harvard Univ. Medical School, Boston, MA (United States)

    1999-06-01

    Charged particle beams are ideal for treating skull base and cervical spine tumors: dose can be focused in the target, while achieving significant sparing of the brain, brain stem, cervical cord, and optic nerves and chiasm. For skull base tumors, 10-year local control rates with combined proton-photon therapy are highest for chondrosarcomas, intermediate for chordomas in males, and lowest for chordomas in females (94%, 65%, and 42%, respectively). For cervical spine tumors, 10-year local control rates are not significantly different for chordomas and chondrosarcomas (54% and 48%, respectively), nor is there any difference in local control between males and females. Observed treatment-related morbidity has been judged acceptable, in view of the major morbidity and mortality which accompany uncontrolled tumor growth. (orig.)

  8. Proton therapy for tumors of the skull base

    International Nuclear Information System (INIS)

    Munzenrider, J.E.; Liebsch, N.J.

    1999-01-01

    Charged particle beams are ideal for treating skull base and cervical spine tumors: dose can be focused in the target, while achieving significant sparing of the brain, brain stem, cervical cord, and optic nerves and chiasm. For skull base tumors, 10-year local control rates with combined proton-photon therapy are highest for chondrosarcomas, intermediate for chordomas in males, and lowest for chordomas in females (94%, 65%, and 42%, respectively). For cervical spine tumors, 10-year local control rates are not significantly different for chordomas and chondrosarcomas (54% and 48%, respectively), nor is there any difference in local control between males and females. Observed treatment-related morbidity has been judged acceptable, in view of the major morbidity and mortality which accompany uncontrolled tumor growth. (orig.)

  9. Determination of the spatial response of neutron based analysers using a Monte Carlo based method

    International Nuclear Information System (INIS)

    Tickner, James

    2000-01-01

    One of the principal advantages of using thermal neutron capture (TNC, also called prompt gamma neutron activation analysis or PGNAA) or neutron inelastic scattering (NIS) techniques for measuring elemental composition is the high penetrating power of both the incident neutrons and the resultant gamma-rays, which means that large sample volumes can be interrogated. Gauges based on these techniques are widely used in the mineral industry for on-line determination of the composition of bulk samples. However, attenuation of both neutrons and gamma-rays in the sample and geometric (source/detector distance) effects typically result in certain parts of the sample contributing more to the measured composition than others. In turn, this introduces errors in the determination of the composition of inhomogeneous samples. This paper discusses a combined Monte Carlo/analytical method for estimating the spatial response of a neutron gauge. Neutron propagation is handled using a Monte Carlo technique which allows an arbitrarily complex neutron source and gauge geometry to be specified. Gamma-ray production and detection is calculated analytically which leads to a dramatic increase in the efficiency of the method. As an example, the method is used to study ways of reducing the spatial sensitivity of on-belt composition measurements of cement raw meal
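The hybrid estimator described in record 9, Monte Carlo for neutron propagation combined with an analytical gamma-ray detection score, can be sketched as below. The cross sections, single-flight neutron model, and detector position are invented for illustration; the paper's gauge geometry is arbitrarily complex.

```python
import math
import random

# Assumed illustrative constants (not from the paper):
MU_GAMMA = 0.05      # gamma attenuation coefficient of the sample (1/cm)
SIGMA_N = 0.2        # macroscopic neutron capture cross section (1/cm)
DETECTOR = (0.0, 0.0, 50.0)   # detector position (cm)

def sample_capture_site(rng):
    """Crude neutron Monte Carlo: exponential path length along a random
    direction from a point source at the origin (single-flight sketch)."""
    d = rng.expovariate(SIGMA_N)
    cos_t = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    return (d * sin_t * math.cos(phi), d * sin_t * math.sin(phi), d * cos_t)

def analytic_gamma_weight(site):
    """Analytical gamma score: solid-angle fall-off times straight-line
    attenuation from the capture site to the detector. This replaces a
    full gamma transport simulation, which is the efficiency gain."""
    r = math.dist(site, DETECTOR)
    return math.exp(-MU_GAMMA * r) / (4.0 * math.pi * r * r)

def spatial_response(n=20000, seed=1):
    rng = random.Random(seed)
    return sum(analytic_gamma_weight(sample_capture_site(rng)) for _ in range(n)) / n

print(spatial_response())
```

Scoring each capture site analytically means every neutron history contributes to the detector estimate, rather than only the rare histories whose gamma happens to reach the detector.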

  10. The future of new calculation concepts in dosimetry based on the Monte Carlo Methods; Avenir des nouveaux concepts des calculs dosimetriques bases sur les methodes de Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Makovicka, L.; Vasseur, A.; Sauget, M.; Martin, E.; Gschwind, R.; Henriet, J. [Universite de Franche-Comte, Equipe IRMA/ENISYS/FEMTO-ST, UMR6174 CNRS, 25 - Montbeliard (France); Vasseur, A.; Sauget, M.; Martin, E.; Gschwind, R.; Henriet, J.; Salomon, M. [Universite de Franche-Comte, Equipe AND/LIFC, 90 - Belfort (France)

    2009-01-15

    Monte Carlo codes, precise but slow, are very important tools in the vast majority of specialities connected to Radiation Physics, Radiation Protection and Dosimetry. A discussion about some other computing solutions is carried out; solutions not only based on the enhancement of computer power, or on the 'biasing' used for relative acceleration of these codes (in the case of photons), but on more efficient methods (A.N.N. - artificial neural network, C.B.R. - case-based reasoning - or other computer science techniques) already and successfully used for a long time in other scientific or industrial applications, and not only in Radiation Protection or Medical Dosimetry. (authors)

  11. Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation

    NARCIS (Netherlands)

    Minasny, B.; Vrugt, J.A.; McBratney, A.B.

    2011-01-01

    This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior
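Record 11 uses the DREAM algorithm; as a minimal stand-in, the sketch below is a plain random-walk Metropolis sampler for a single parameter with a Gaussian likelihood. The "observations", prior, and noise level are all invented, and DREAM's adaptive multi-chain proposals are deliberately omitted.

```python
import math
import random

# Invented observations of one soil property; the target is the posterior
# of their common mean under a Gaussian likelihood with sigma = 0.1.
OBS = [2.1, 1.9, 2.3, 2.0, 2.2]

def log_post(theta):
    """Log-posterior: flat prior on theta > 0, Gaussian likelihood."""
    if theta <= 0:
        return -math.inf
    return -sum((y - theta) ** 2 for y in OBS) / (2 * 0.1 ** 2)

def metropolis(n=5000, step=0.05, seed=7):
    """Random-walk Metropolis: propose, then accept with prob min(1, ratio)."""
    rng = random.Random(seed)
    theta, lp = 2.0, log_post(2.0)
    chain = []
    for _ in range(n):
        prop = theta + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta)
    return chain

chain = metropolis()
print(round(sum(chain) / len(chain), 2))
```

The posterior summaries the paper reports (means, credible intervals) are read off the chain in exactly this way, once convergence has been checked.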

  12. Construction of boundary-surface-based Chinese female astronaut computational phantom and proton dose estimation

    Science.gov (United States)

    Sun, Wenjuan; JIA, Xianghong; XIE, Tianwu; XU, Feng; LIU, Qian

    2013-01-01

    With the rapid development of China's space industry, the importance of radiation protection is increasingly prominent. To provide relevant dose data, we first developed the Visible Chinese Human adult Female (VCH-F) phantom, and performed further modifications to generate the VCH-F Astronaut (VCH-FA) phantom, incorporating statistical body characteristics data from the first batch of Chinese female astronauts as well as reference organ mass data from the International Commission on Radiological Protection (ICRP; both within 1% relative error). Based on cryosection images, the original phantom was constructed via Non-Uniform Rational B-Spline (NURBS) boundary surfaces to strengthen the deformability for fitting the body parameters of Chinese female astronauts. The VCH-FA phantom was voxelized at a resolution of 2 × 2 × 4 mm³ for radioactive particle transport simulations from isotropic protons with energies of 5000–10 000 MeV in Monte Carlo N-Particle eXtended (MCNPX) code. To investigate discrepancies caused by anatomical variations and other factors, the obtained doses were compared with corresponding values from other phantoms and sex-averaged doses. Dose differences were observed among phantom calculation results, especially for effective dose with low-energy protons. Local skin thickness shifts the breast dose curve toward high energy, but has little impact on inner organs. Under a shielding layer, organ dose reduction is greater for skin than for other organs. The calculated skin dose per day closely approximates measurement data obtained in low-Earth orbit (LEO). PMID:23135158

  13. Convex-based void filling method for CAD-based Monte Carlo geometry modeling

    International Nuclear Information System (INIS)

    Yu, Shengpeng; Cheng, Mengyun; Song, Jing; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • We present a new void filling method named CVF for CAD-based MC geometry modeling. • We describe convex-based void description and quality-based space subdivision. • The results showed improvements provided by CVF for both modeling and MC calculation efficiency. - Abstract: CAD-based automatic geometry modeling tools have been widely applied to generate Monte Carlo (MC) calculation geometry for complex systems according to CAD models. Automatic void filling is one of the main functions in CAD-based MC geometry modeling tools, because the void space between parts in CAD models is traditionally not modeled, while MC codes such as MCNP need all the problem space to be described. A dedicated void filling method, named Convex-based Void Filling (CVF), is proposed in this study for efficient void filling and concise void descriptions. The method subdivides all the problem space into disjointed regions using Quality-based Subdivision (QS) and describes the void space in each region with complementary descriptions of the convex volumes intersecting with that region. It has been implemented in SuperMC/MCAM, the Multiple-Physics Coupling Analysis Modeling Program, and tested on the International Thermonuclear Experimental Reactor (ITER) Alite model. The results showed that the new method reduced both automatic modeling time and MC calculation time

  14. New memory devices based on the proton transfer process

    International Nuclear Information System (INIS)

    Wierzbowska, Małgorzata

    2016-01-01

    Memory devices operating due to the fast proton transfer (PT) process are proposed by means of first-principles calculations. Writing information is performed using the electrostatic potential of scanning tunneling microscopy (STM). Reading information is based on the effect of the local magnetization induced at the zigzag graphene nanoribbon (Z-GNR) edge—saturated with oxygen or the hydroxy group—and can be realized with the use of giant magnetoresistance (GMR), a magnetic tunnel junction or spin-transfer torque devices. The energetic barriers for the hop forward and backward processes can be tuned by the distance and potential of the STM tip; this thus enables us to tailor the non-volatile logic states. The proposed system enables very dense packing of the logic cells and could be used in random access and flash memory devices. (paper)

  15. New memory devices based on the proton transfer process

    Science.gov (United States)

    Wierzbowska, Małgorzata

    2016-01-01

    Memory devices operating due to the fast proton transfer (PT) process are proposed by means of first-principles calculations. Writing information is performed using the electrostatic potential of scanning tunneling microscopy (STM). Reading information is based on the effect of the local magnetization induced at the zigzag graphene nanoribbon (Z-GNR) edge—saturated with oxygen or the hydroxy group—and can be realized with the use of giant magnetoresistance (GMR), a magnetic tunnel junction or spin-transfer torque devices. The energetic barriers for the hop forward and backward processes can be tuned by the distance and potential of the STM tip; this thus enables us to tailor the non-volatile logic states. The proposed system enables very dense packing of the logic cells and could be used in random access and flash memory devices.

  16. Development of Monte Carlo-based pebble bed reactor fuel management code

    International Nuclear Information System (INIS)

    Setiadipura, Topan; Obara, Toru

    2014-01-01

    Highlights: • A new Monte Carlo-based fuel management code for OTTO cycle pebble bed reactor was developed. • The double-heterogeneity was modeled using statistical method in MVP-BURN code. • The code can perform analysis of equilibrium and non-equilibrium phase. • Code-to-code comparisons for Once-Through-Then-Out case were investigated. • Ability of the code to accommodate the void cavity was confirmed. - Abstract: A fuel management code for pebble bed reactors (PBRs) based on the Monte Carlo method has been developed in this study. The code, named Monte Carlo burnup analysis code for PBR (MCPBR), enables a simulation of the Once-Through-Then-Out (OTTO) cycle of a PBR from the running-in phase to the equilibrium condition. In MCPBR, a burnup calculation based on a continuous-energy Monte Carlo code, MVP-BURN, is coupled with an additional utility code to be able to simulate the OTTO cycle of PBR. MCPBR has several advantages in modeling PBRs, namely its Monte Carlo neutron transport modeling, its capability of explicitly modeling the double heterogeneity of the PBR core, and its ability to model different axial fuel speeds in the PBR core. Analysis at the equilibrium condition of the simplified PBR was used as the validation test of MCPBR. The calculation results of the code were compared with the results of diffusion-based fuel management PBR codes, namely the VSOP and PEBBED codes. Using the JENDL-4.0 nuclide library, MCPBR gave a 4.15% and 3.32% lower k-eff value compared to VSOP and PEBBED, respectively. While using JENDL-3.3, MCPBR gave a 2.22% and 3.11% higher k-eff value compared to VSOP and PEBBED, respectively. The ability of MCPBR to analyze neutron transport in the top void of the PBR core and its effects was also confirmed

  17. Interplay effects in proton scanning for lung: a 4D Monte Carlo study assessing the impact of tumor and beam delivery parameters

    International Nuclear Information System (INIS)

    Dowdell, S; Grassberger, C; Sharp, G C; Paganetti, H

    2013-01-01

    Relative motion between a tumor and a scanning proton beam results in a degradation of the dose distribution (interplay effect). This study investigates the relationship between beam scanning parameters and the interplay effect, with the goal of finding parameters that minimize interplay. 4D Monte Carlo simulations of pencil beam scanning proton therapy treatments were performed using the 4DCT geometry of five lung cancer patients of varying tumor size (50.4–167.1 cc) and motion amplitude (2.9–30.1 mm). Treatments were planned assuming delivery in 35 × 2.5 Gy(RBE) fractions. The spot size, time to change the beam energy (τ_es), time required for magnet settling (τ_ss), initial breathing phase, spot spacing, scanning direction, scanning speed, beam current and patient breathing period were varied for each of the five patients. Simulations were performed for a single fraction and an approximation of conventional fractionation. For the patients considered, the interplay effect could not be predicted using the superior–inferior motion amplitude alone. Larger spot sizes (σ ∼ 9–16 mm) were less susceptible to interplay, giving an equivalent uniform dose (EUD) of 99.0 ± 4.4% (1 standard deviation) in a single fraction compared to 86.1 ± 13.1% for smaller spots (σ ∼ 2–4 mm). The smaller spot sizes gave EUD values as low as 65.3% of the prescription dose in a single fraction. Reducing the spot spacing improved the target dose homogeneity. The initial breathing phase can have a significant effect on the interplay, particularly for shorter delivery times. No clear benefit was evident when scanning either parallel or perpendicular to the predominant axis of motion. Longer breathing periods decreased the EUD. In general, longer delivery times led to lower interplay effects. Conventional fractionation showed significant improvement in terms of interplay, giving a EUD of at least 84.7% and 100.0% of the prescription dose for the small and larger spot sizes
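The EUD figure of merit used in records 17 and 18 is the generalized mean of the voxel doses; with a tumour-like negative exponent, cold spots dominate the score. A minimal sketch, assuming an illustrative exponent a = -10 and made-up voxel doses (the papers do not state their exponent here):

```python
def eud(doses_pct, a=-10.0):
    """Generalized-mean equivalent uniform dose over voxel doses given in
    percent of prescription: EUD = (mean(d_i ** a)) ** (1/a).
    With a < 0 (tumour-like), a single cold voxel drags the EUD down."""
    n = len(doses_pct)
    return (sum(d ** a for d in doses_pct) / n) ** (1.0 / a)

uniform = [100.0] * 5                               # perfectly homogeneous target
cold_spot = [100.0, 100.0, 100.0, 100.0, 70.0]     # one underdosed voxel
print(round(eud(uniform), 1))      # → 100.0 (uniform dose: EUD equals the dose)
print(round(eud(cold_spot), 1))    # well below the arithmetic mean of 94.0
```

This asymmetry is why the papers report EUD rather than mean dose when quantifying interplay-induced underdose.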

  18. Interplay effects in proton scanning for lung: a 4D Monte Carlo study assessing the impact of tumor and beam delivery parameters.

    Science.gov (United States)

    Dowdell, S; Grassberger, C; Sharp, G C; Paganetti, H

    2013-06-21

    Relative motion between a tumor and a scanning proton beam results in a degradation of the dose distribution (interplay effect). This study investigates the relationship between beam scanning parameters and the interplay effect, with the goal of finding parameters that minimize interplay. 4D Monte Carlo simulations of pencil beam scanning proton therapy treatments were performed using the 4DCT geometry of five lung cancer patients of varying tumor size (50.4-167.1 cc) and motion amplitude (2.9-30.1 mm). Treatments were planned assuming delivery in 35 × 2.5 Gy(RBE) fractions. The spot size, time to change the beam energy (τes), time required for magnet settling (τss), initial breathing phase, spot spacing, scanning direction, scanning speed, beam current and patient breathing period were varied for each of the five patients. Simulations were performed for a single fraction and an approximation of conventional fractionation. For the patients considered, the interplay effect could not be predicted using the superior-inferior motion amplitude alone. Larger spot sizes (σ ~ 9-16 mm) were less susceptible to interplay, giving an equivalent uniform dose (EUD) of 99.0 ± 4.4% (1 standard deviation) in a single fraction compared to 86.1 ± 13.1% for smaller spots (σ ~ 2-4 mm). The smaller spot sizes gave EUD values as low as 65.3% of the prescription dose in a single fraction. Reducing the spot spacing improved the target dose homogeneity. The initial breathing phase can have a significant effect on the interplay, particularly for shorter delivery times. No clear benefit was evident when scanning either parallel or perpendicular to the predominant axis of motion. Longer breathing periods decreased the EUD. In general, longer delivery times led to lower interplay effects. Conventional fractionation showed significant improvement in terms of interplay, giving a EUD of at least 84.7% and 100.0% of the prescription dose for the small and larger spot sizes respectively. The

  19. Accelerated Monte Carlo system reliability analysis through machine-learning-based surrogate models of network connectivity

    International Nuclear Information System (INIS)

    Stern, R.E.; Song, J.; Work, D.B.

    2017-01-01

    The two-terminal reliability problem in system reliability analysis is known to be computationally intractable for large infrastructure graphs. Monte Carlo techniques can estimate the probability of a disconnection between two points in a network by selecting a representative sample of network component failure realizations and determining the source-terminal connectivity of each realization. To reduce the runtime required for the Monte Carlo approximation, this article proposes an approximate framework in which the connectivity check of each sample is estimated using a machine-learning-based classifier. The framework is implemented using both a support vector machine (SVM) and a logistic regression based surrogate model. Numerical experiments are performed on the California gas distribution network using the epicenter and magnitude of the 1989 Loma Prieta earthquake as well as randomly-generated earthquakes. It is shown that the SVM and logistic regression surrogate models are able to predict network connectivity with accuracies of 99% for both methods, and are 1–2 orders of magnitude faster than using a Monte Carlo method with an exact connectivity check. - Highlights: • Surrogate models of network connectivity are developed by machine-learning algorithms. • Developed surrogate models can reduce the runtime required for Monte Carlo simulations. • Support vector machines and logistic regressions are employed to develop surrogate models. • A numerical example on the California gas distribution network demonstrates the proposed approach. • The developed models have accuracies of 99%, and are 1–2 orders of magnitude faster than MCS.
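The baseline that record 19 accelerates, Monte Carlo two-terminal reliability with an exact connectivity check per sample, can be sketched on a toy graph. The edge list and failure probability below are invented; in the paper, the BFS check is the step that gets replaced by an SVM or logistic-regression surrogate.

```python
import random
from collections import deque

# Invented 4-node network and per-edge failure probability (not from the paper).
EDGES = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
P_FAIL = 0.3

def connected(alive_edges, source=0, terminal=3):
    """Exact source-terminal connectivity check via BFS on surviving edges.
    This is the per-sample step a trained surrogate classifier would replace."""
    adj = {}
    for u, v in alive_edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return terminal in seen

def reliability(n=20000, seed=3):
    """Monte Carlo estimate: fraction of failure realizations still connected."""
    rng = random.Random(seed)
    hits = sum(
        connected([e for e in EDGES if rng.random() > P_FAIL])
        for _ in range(n)
    )
    return hits / n

print(reliability())
```

For a large infrastructure graph the BFS cost per sample dominates, which is why swapping it for a cheap classifier yields the 1–2 orders of magnitude speedup reported.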

  20. Energy dependent track structure parametrizations for protons and carbon ions based on nano-metric simulations

    International Nuclear Information System (INIS)

    Frauke, A.; Wilkens, J.J.; Villagrasa, C.; Rabus, H.

    2015-01-01

    The BioQuaRT project within the European Metrology Research Programme aims at correlating ion track structure characteristics with the biological effects of radiation and develops measurement and simulation techniques for determining ion track structure on different length scales from about 2 nm to about 10 μm. Within this framework, we investigate methods to translate track-structure quantities derived on a nanometer scale to macroscopic dimensions. Input data sets were generated by simulations of ion tracks of protons and carbon ions in liquid water using the Geant4 Monte Carlo toolkit with the Geant4-DNA processes. Based on the energy transfer points - recorded with nanometer resolution - we investigated parametrizations of overall properties of ion track structure. Three different track structure parametrizations have been developed using the distances to the 10 next neighbouring ionizations, the radial energy distribution and ionisation cluster size distributions. These parametrizations of nanometer-scale track structure form a basis for deriving biologically relevant mean values, which are essential in the clinical situation where each voxel is exposed to a mixed radiation field. (authors)
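The first of the three parametrizations in record 20, the distances from each ionization to its 10 next neighbouring ionizations, can be sketched with a brute-force nearest-neighbour pass. The 3D energy transfer points below are randomly generated stand-ins; in the study they come from Geant4-DNA track simulations.

```python
import math
import random

# Stand-in "energy transfer points" (nm): a loose cylinder along z,
# invented for illustration rather than taken from a simulated track.
rng = random.Random(0)
points = [(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0), 10.0 * rng.random())
          for _ in range(50)]

def knn_distances(pts, k=10):
    """For each ionization, the sorted distances to its k nearest neighbours
    (O(n^2) brute force, adequate for a single track segment)."""
    out = []
    for i, p in enumerate(pts):
        d = sorted(math.dist(p, q) for j, q in enumerate(pts) if j != i)
        out.append(d[:k])
    return out

dists = knn_distances(points)
mean_nn1 = sum(d[0] for d in dists) / len(dists)   # mean 1st-neighbour distance
print(round(mean_nn1, 2))
```

Histograms of these per-point distance lists are the kind of summary that can then be averaged over a mixed radiation field, as the abstract describes.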

  1. Four-dimensional Monte Carlo simulations demonstrating how the extent of intensity-modulation impacts motion effects in proton therapy lung treatments

    International Nuclear Information System (INIS)

    Dowdell, Stephen; Paganetti, Harald; Grassberger, Clemens

    2013-01-01

    Purpose: To compare motion effects in intensity modulated proton therapy (IMPT) lung treatments with different levels of intensity modulation. Methods: Spot scanning IMPT treatment plans were generated for ten lung cancer patients for 2.5 Gy(RBE) and 12 Gy(RBE) fractions and two distinct energy-dependent spot sizes (σ ∼ 8–17 mm and ∼2–4 mm). IMPT plans were generated with the target homogeneity of each individual field restricted to 20% (IMPT_20%). These plans were compared to full IMPT (IMPT_full), which had no restriction on the single field homogeneity. 4D Monte Carlo simulations were performed upon the patient 4DCT geometry, including deformable image registration and incorporating the detailed timing structure of the proton delivery system. Motion effects were quantified via comparison of the results of the 4D simulations (4D-IMPT_20%, 4D-IMPT_full) with those of a 3D Monte Carlo simulation (3D-IMPT_20%, 3D-IMPT_full) upon the planning CT using the equivalent uniform dose (EUD), V_95 and D_1–D_99. The effects in normal lung were quantified using mean lung dose (MLD) and V_90%. Results: For 2.5 Gy(RBE), the mean EUD for the large spot size is 99.9% ± 2.8% for 4D-IMPT_20% compared to 100.1% ± 2.9% for 4D-IMPT_full. The corresponding values are 88.6% ± 8.7% (4D-IMPT_20%) and 91.0% ± 9.3% (4D-IMPT_full) for the smaller spot size. The EUD value is higher in 69.7% of the considered deliveries for 4D-IMPT_full. The V_95 is also higher in 74.7% of the plans for 4D-IMPT_full, implying that IMPT_full plans experience less underdose compared to IMPT_20%. However, the target dose homogeneity is improved in the majority (67.8%) of plans for 4D-IMPT_20%. The higher EUD and V_95 suggest that the degraded homogeneity in IMPT_full is actually due to the introduction of hot spots in the target volume, perhaps resulting from the sharper in-target dose gradients. The greatest variations between the IMPT_20% and IMPT_full deliveries are observed for patients with the

  2. Abstract ID: 240 A probabilistic-based nuclear reaction model for Monte Carlo ion transport in particle therapy.

    Science.gov (United States)

    Maria Jose, Gonzalez Torres; Jürgen, Henniger

    2018-01-01

    In order to expand the Monte Carlo transport program AMOS to particle therapy applications, the ion module is being developed in the radiation physics group (ASP) at the TU Dresden. This module simulates the three main interactions of ions in matter for the therapy energy range: elastic scattering, inelastic collisions and nuclear reactions. The simulation of the elastic scattering is based on the Binary Collision Approximation and the inelastic collisions on the Bethe-Bloch theory. The nuclear reactions, which are the focus of the module, are implemented according to a probabilistic-based model developed in the group. The developed model uses probability density functions to sample the occurrence of a nuclear reaction given the initial energy of the projectile particle as well as the energy at which this reaction will take place. The particle is transported until the reaction energy is reached and then the nuclear reaction is simulated. This approach allows a fast evaluation of the nuclear reactions. The theory and application of the proposed model will be addressed in this presentation. The results of the simulation of a proton beam colliding with tissue will also be presented. Copyright © 2017.
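The probabilistic reaction model of record 2 samples, from tabulated probability density functions, the energy at which a nuclear reaction occurs; the ion is then transported down to that energy before the reaction is simulated. A minimal inverse-CDF sketch, with an invented piecewise-linear CDF table (the actual AMOS tables are not published in the abstract):

```python
import bisect
import random

# Invented tabulated CDF of the reaction energy for one projectile/target pair.
ENERGY_GRID = [0.0, 50.0, 100.0, 150.0, 200.0]   # MeV
REACTION_CDF = [0.0, 0.15, 0.45, 0.80, 1.0]      # cumulative probability

def cdf(e):
    """Piecewise-linear interpolation of the tabulated reaction-energy CDF."""
    i = bisect.bisect_right(ENERGY_GRID, e) - 1
    if i >= len(ENERGY_GRID) - 1:
        return 1.0
    t = (e - ENERGY_GRID[i]) / (ENERGY_GRID[i + 1] - ENERGY_GRID[i])
    return REACTION_CDF[i] + t * (REACTION_CDF[i + 1] - REACTION_CDF[i])

def sample_reaction_energy(e_init, rng):
    """Inverse-transform sample of the reaction energy, conditioned on the
    reaction happening below the projectile's initial energy e_init."""
    u = rng.random() * cdf(e_init)               # restrict to reachable range
    i = min(bisect.bisect_right(REACTION_CDF, u) - 1, len(REACTION_CDF) - 2)
    t = (u - REACTION_CDF[i]) / (REACTION_CDF[i + 1] - REACTION_CDF[i])
    return ENERGY_GRID[i] + t * (ENERGY_GRID[i + 1] - ENERGY_GRID[i])

rng = random.Random(5)
samples = [sample_reaction_energy(180.0, rng) for _ in range(1000)]
print(all(0.0 <= s <= 180.0 for s in samples))   # → True
```

Deciding the reaction energy up front, rather than testing for a reaction at every transport step, is what makes the evaluation fast in the described approach.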

  3. Proton diffraction

    International Nuclear Information System (INIS)

    Den Besten, J.L.; Jamieson, D.N.; Allen, L.J.

    1998-01-01

    The Lindhard theory on ion channeling in crystals has been widely accepted throughout ion beam analysis for use in simulating such experiments. The simulations use a Monte Carlo method developed by Barrett, which utilises the classical 'billiard ball' theory of ions 'bouncing' between planes or tubes of atoms in the crystal. This theory is not valid for 'thin' crystals where the planes or strings of atoms can no longer be assumed to be of infinite proportions. We propose that a theory similar to that used for high energy electron diffraction can be applied to MeV ions, especially protons, in thin crystals to simulate the intensities of transmission channeling and of RBS spectra. The diffraction theory is based on a Bloch wave solution of the Schrödinger equation for an ion passing through the periodic crystal potential. The widely used universal potential for proton-nucleus scattering is used to construct the crystal potential. Absorption due to thermal diffuse scattering is included. Experimental parameters such as convergence angle, beam tilt and scanning directions are considered in our calculations. Comparison between theory and experiment is encouraging and suggests that further work is justified. (authors)

  4. SU-G-TeP1-06: Fast GPU Framework for Four-Dimensional Monte Carlo in Adaptive Intensity Modulated Proton Therapy (IMPT) for Mobile Tumors

    Energy Technology Data Exchange (ETDEWEB)

    Botas, P [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Heidelberg University, Heidelberg (Germany); Grassberger, C; Sharp, G; Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Qin, N; Jia, X; Jiang, S [UT Southwestern Medical Center, Dallas, TX (United States)

    2016-06-15

    Purpose: To demonstrate the feasibility of fast Monte Carlo (MC) treatment planning and verification using four-dimensional CT (4DCT) for adaptive IMPT for lung cancer patients. Methods: A validated GPU MC code, gPMC, has been linked to the patient database at our institution and employed to compute the dose-influence matrices (Dij) on the planning CT (pCT). The pCT is an average of the respiratory motion of the patient. The Dijs and patient structures were fed to the optimizer to calculate a treatment plan. To validate the plan against motion, a 4D dose distribution averaged over the possible starting phases is calculated using the 4DCT and a model of the time structure of the delivered spot map. The dose is accumulated using vector maps created by a GPU-accelerated deformable image registration program (DIR) from each phase of the 4DCT to the reference phase using the B-spline method. Calculation of the Dij matrices and the DIR are performed on a cluster, with each field and vector map calculated in parallel. Results: The Dij production takes ∼3.5 s per beamlet for 10e6 protons, depending on the energy and the CT size. Generating a plan with 4D simulation of 1000 spots in 4 fields takes approximately 1 h. To test the framework, IMPT plans for 10 lung cancer patients were generated for validation. Differences between the planned and the delivered dose of 19% in dose to some organs at risk and 1.4/21.1% in target mean dose/homogeneity with respect to the plan were observed, suggesting potential for improvement if adaptation is considered. Conclusion: A fast MC treatment planning framework has been developed that allows reliable plan design and verification for mobile targets and adaptation of treatment plans. This will significantly impact treatments for lung tumors, as 4D-MC dose calculations can now become part of planning strategies.

  5. Configuration and validation of an analytical model predicting secondary neutron radiation in proton therapy using Monte Carlo simulations and experimental measurements.

    Science.gov (United States)

    Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I

    2015-05-01

    This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  6. Proton beam micromachining on strippable aqueous base developable negative resist

    International Nuclear Information System (INIS)

    Rajta, I.; Uzonyi, I.; Baradacs, E.; Chatzichristidi, M.; Raptis, I.; Valamontes, E.S.

    2004-01-01

    Complete text of publication follows. Proton Beam Micromachining (PBM, also known as p-beam writing), a novel direct-write process for the production of 3D microstructures, can be used to make multilevel structures in a single layer of resist by varying the ion energy. The interaction between the bombarding ions and the target material is mainly ionization, and very few ions suffer high-angle nuclear collisions, so structures made with PBM have smooth, near-vertical side walls. The most commonly applied resists in PBM are the positive, conventional polymethyl methacrylate (PMMA) and the negative, chemically amplified SU-8 (MicroChem Corp.). SU-8 is an epoxy-based resist also suitable for LIGA and UV-LIGA processes; it offers good sensitivity, good process latitude and very high aspect ratios, and therefore dominates high-aspect-ratio micromachining applications. SU-8 requires a fluence of 30 nC/mm² for PBM irradiations with 2 MeV protons. Its crosslinking chemistry is based on the eight epoxy rings in the polymer chain, which form a very dense three-dimensional network in the presence of suitably activated photoacid generators (PAGs), a network that is very difficult to strip away after development. Thus, stripping has to be assisted with plasma processes or with special liquid removers. Moreover, the SU-8 developer, propylene glycol methyl ether acetate (PGMEA), is organic and thus environmentally unfriendly. To overcome the SU-8 stripping limitations, a negative resist system is desirable in which the solubility change is based not solely on cross-linking but also on the differentiation of hydrophilicity between exposed and non-exposed areas. A new resist formulation fulfilling the above specifications has been developed recently [1]. This formulation is based on a specific grade of epoxy novolac (EP) polymer, a partially hydrogenated poly-4-hydroxystyrene (PHS) polymer, and an onium salt as photoacid generator (PAG), and has been successfully

  7. A measurement-based generalized source model for Monte Carlo dose simulations of CT scans.

    Science.gov (United States)

    Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun

    2017-03-07

    The goal of this study is to develop a generalized source model for accurate Monte Carlo dose simulations of CT scans, based solely on measurement data and without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at the x-ray target level, with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central-axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution, respectively, with a Levenberg-Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model were tested on GE LightSpeed and Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and of the dose profiles along the lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameter) indicated better than 5% agreement between the Monte Carlo-simulated and ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed from a set of measurement data alone and used for accurate Monte Carlo dose simulations of patients' CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in diagnostic and therapeutic radiology.
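
    The spectrum-derivation step can be sketched as an unfolding problem: the measured PDD is modeled as a weighted sum of mono-energetic basis depth-dose curves, and the weights (the "spectrum") are recovered by least squares. The exponential basis shapes and the weights below are invented placeholders; the study itself used a Levenberg-Marquardt fit to measured PDDs.

```python
import numpy as np

# Illustrative spectrum unfolding: PDD(z) = sum_i w_i * basis_i(z).
# Basis shapes and "hidden" weights are made up for demonstration.

z = np.linspace(0.0, 10.0, 50)                       # depth (cm)
basis = np.stack([np.exp(-z / L) for L in (2.0, 4.0, 8.0)], axis=1)

w_true = np.array([0.2, 0.5, 0.3])                   # hidden spectral weights
pdd_measured = basis @ w_true                        # synthetic "measurement"

# Recover the weights by linear least squares
w_fit, *_ = np.linalg.lstsq(basis, pdd_measured, rcond=None)
print("recovered weights:", np.round(w_fit, 3))
```

    A real unfolding would constrain the weights to be non-negative and use measured, noisy PDDs, which is where an iterative Levenberg-Marquardt fit earns its keep.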

  9. Research on reactor physics analysis method based on Monte Carlo homogenization

    International Nuclear Information System (INIS)

    Ye Zhimin; Zhang Peng

    2014-01-01

    In order to meet the demands of the nuclear energy market in the future, many new concepts for nuclear energy systems have been put forward. The traditional deterministic neutronics analysis method has been challenged in two aspects: one is the ability to handle generic geometry; the other is the multi-spectrum applicability of multigroup cross section libraries. Due to its strong geometry modeling capability and its use of continuous-energy cross section libraries, the Monte Carlo method has been widely used in reactor physics calculations, and more and more research on the Monte Carlo method has been carried out. Neutronics-thermal hydraulics coupling analysis based on the Monte Carlo method has been realized. However, it still faces the problems of long computation times and slow convergence, which make it inapplicable to reactor core fuel management simulations. Drawing from the deterministic core analysis method, a new two-step core analysis scheme is proposed in this work. Firstly, Monte Carlo simulations are performed for the assembly, and the assembly-homogenized multigroup cross sections are tallied at the same time. Secondly, core diffusion calculations are done with these multigroup cross sections. The new scheme achieves high efficiency while maintaining acceptable precision, so it can be used as an effective tool for the design and analysis of innovative nuclear energy systems. Numerical tests have been done in this work to verify the new scheme. (authors)
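
    The two-step idea can be illustrated with a toy calculation: pretend the two-group constants below were tallied from a Monte Carlo assembly run, then evaluate a deterministic quantity from them (here the infinite-medium multiplication factor, standing in for a full core diffusion solve). All cross-section values are illustrative, not from any real assembly.

```python
# Step 1 (pretend): assembly-homogenized two-group constants tallied by MC.
nu_sf1, nu_sf2 = 0.005, 0.10   # nu*Sigma_f, fast/thermal groups (1/cm)
sa1, sa2 = 0.010, 0.08         # absorption cross sections (1/cm)
s12 = 0.015                    # fast-to-thermal scattering (1/cm)

# Step 2: deterministic evaluation from the homogenized constants,
# here the two-group infinite-medium multiplication factor:
#   k_inf = (nuSf1 + nuSf2 * S12/Sa2) / (Sa1 + S12)
k_inf = (nu_sf1 + nu_sf2 * s12 / sa2) / (sa1 + s12)
print(f"k_inf = {k_inf:.4f}")
```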

  10. A Monte-Carlo study to assess the effect of 1.5 T magnetic fields on the overall robustness of pencil-beam scanning proton radiotherapy plans for prostate cancer

    Science.gov (United States)

    Kurz, Christopher; Landry, Guillaume; Resch, Andreas F.; Dedes, George; Kamp, Florian; Ganswindt, Ute; Belka, Claus; Raaymakers, Bas W.; Parodi, Katia

    2017-11-01

    Combining magnetic-resonance imaging (MRI) and proton therapy (PT) using pencil-beam scanning (PBS) may improve image-guided radiotherapy. We aimed to assess the impact of a magnetic field on PBS-PT plan quality and robustness. Specifically, the robustness against anatomical changes and positioning errors in an MRI-guided scenario with a 30 cm radius, 1.5 T magnetic field was studied for prostate PT. Five prostate cancer patients with three consecutive CT images (CT1-3) were considered. Single-field uniform dose PBS-PT plans were generated on the segmented CT1 with Monte-Carlo-based treatment planning software for inverse optimization. Plans were optimized at 90° gantry angle without a B-field (no B) and with a ±1.5 T B-field (B and minus B), as well as at 81° gantry angle and +1.5 T (B G81). Plans were re-calculated on the aligned CT2 and CT3 to study the impact of anatomical changes. Dose distributions were compared in terms of changes in DVH parameters, proton range and gamma-index pass-rates. To assess the impact of positioning errors, DVH parameters were compared for ±5 mm CT1 patient shifts in the anterior-posterior (AP) and left-right (LR) directions. Proton beam deflection considerably reduced robustness against inter-fractional changes for the B scenario: range agreement, gamma-index pass-rates and PTV V95% were significantly lower compared to no B. Improved robustness was obtained for minus B and B G81, the latter showing only minor differences to no B. The magnetic field introduced slight dosimetric changes under LR shifts. The impact of AP shifts was considerably larger, and equivalent for the scenarios with and without B-field. The results suggest that robustness equivalent to PT without a magnetic field can be achieved by adapting treatment parameters, such as the B-field orientation with respect to the patient (minus B) and/or the gantry angle (B G81). MRI-guided PT for prostate cancer might thus be implemented without compromising robustness.
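
    The gamma-index pass-rate quoted above can be made concrete with a minimal 1D implementation of the global 3%/3 mm criterion, applied here to two synthetic profiles (a reference and a 1 mm shifted recalculation). The profiles and parameters are illustrative only.

```python
import numpy as np

# Minimal 1D gamma-index evaluation (global 3%/3 mm criterion).
def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
    """Return the gamma value at each reference point (global dose norm)."""
    d_max = d_ref.max()
    g = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dose_term = (d_eval - dr) / (dd * d_max)   # dose difference term
        dist_term = (x_eval - xr) / dta            # distance-to-agreement term
        g[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return g

x = np.linspace(-20, 20, 81)              # position (mm)
ref = np.exp(-x**2 / 150.0)               # synthetic reference profile
shifted = np.exp(-(x - 1.0)**2 / 150.0)   # recalculation shifted by 1 mm

gamma = gamma_1d(x, ref, x, shifted)
pass_rate = np.mean(gamma <= 1.0)
print(f"gamma pass rate: {100 * pass_rate:.1f}%")
```

    A 1 mm shift is well inside the 3 mm distance-to-agreement, so every point passes; a shift beyond 3 mm would start to fail points in the high-gradient region.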

  11. SU-F-J-57: Effectiveness of Daily CT-Based Three-Dimensional Image Guided and Adaptive Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Moriya, S [University of Tsukuba, Tsukuba, Ibaraki (Japan); National Cancer Center, Kashiwa, Chiba (Japan); Tachibana, H; Hotta, K; Baba, H; Kohno, R; Akimoto, T [National Cancer Center, Kashiwa, Chiba (Japan); Nakamura, N [National Cancer Center Hospital East, Kashiwa, Chiba (Japan); Miyakawa, S; Kurosawa, T [Komazawa University, Setagaya, Tokyo (Japan)

    2016-06-15

    Purpose: A daily CT-based three-dimensional image-guided and adaptive proton therapy (CTIGRT-ART) system was designed and developed, and its effectiveness was evaluated. Methods: A retrospective analysis was performed on three lung cancer patients. Proton treatment planning was performed using CT image datasets acquired with a Toshiba Aquilion ONE. The planning target volume and surrounding organs were contoured by a well-trained radiation oncologist. The dose distribution was optimized using two fields (180° and 270°) in passive-scattering proton therapy, with a well-commissioned simplified Monte Carlo algorithm as the dose calculation engine. Daily consecutive CT image datasets were acquired with an in-room CT (Toshiba Aquilion LB). Using our in-house program, image registrations to bone and to the tumor were performed to shift the isocenter on each treatment CT image dataset, after which the dose was recalculated. When the dose distribution after tumor registration showed a change in the dosimetric parameter CTV D90% relative to the initial plan, an additional adaptive step was performed in which the range shifter thickness was re-optimized. CTV D90% for bone registration, tumor registration only, and the adaptive plan with tumor registration was compared to the initial plan. Results: With bone registration, tumor dose coverage decreased by 16% on average (maximum: 56%). Tumor registration gave better coverage than bone registration, but coverage still decreased by 9% on average (maximum: 22%). The adaptive plan showed dose coverage similar to the initial plan (average: 2%, maximum: 7%). Conclusion: Image registration to bone or tumor alone may substantially reduce tumor coverage. Our proposed methodology of image guidance and adaptive planning, using range adaptation after tumor registration, would therefore be effective for proton therapy.
This research is partially supported
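
    The dosimetric check driving the adaptation decision can be sketched as follows: compute CTV D90% (the dose received by at least 90% of the volume) for the initial and the recalculated plan, and flag the fraction when coverage drops past an action level. The dose values and the 2% action level below are invented for illustration.

```python
import numpy as np

# Sketch of a D90%-based adaptation trigger; all numbers are invented.
def d90(dose_voxels):
    """D90%: the dose exceeded by 90% of the voxels (10th percentile)."""
    return np.percentile(dose_voxels, 10.0)

rng = np.random.default_rng(0)
initial = rng.normal(70.0, 1.0, 10_000)  # initial-plan CTV voxel doses (Gy)
recalc = initial - 6.0                   # recalculated after anatomy change

drop = 100.0 * (d90(initial) - d90(recalc)) / d90(initial)
needs_adaptation = drop > 2.0            # hypothetical action level
print(f"D90 drop: {drop:.1f}% -> adapt: {needs_adaptation}")
```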

  12. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA-funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With funding only in place since the end of May 2000, the study is still in an early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages: the radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end product will be a Monte Carlo-based code which will complement existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  13. Molecular modeling of protonic acid doping of emeraldine base polyaniline for chemical sensors

    NARCIS (Netherlands)

    Chen, X.; Yuan, C.A.; Wong, C.K.Y.; Ye, H.; Leung, S.Y.Y.; Zhang, G.

    2012-01-01

    We proposed a molecular modeling methodology to study the protonic acid doping of emeraldine base polyaniline, which can be used in gas detection. The commercial forcefield COMPASS was used for the polymer and protonic acid molecules. The molecular model, which is capable of representing the polyaniline

  14. MBR Monte Carlo Simulation in PYTHIA8

    Science.gov (United States)

    Ciesielski, R.

    We present the MBR (Minimum Bias Rockefeller) Monte Carlo simulation of (anti)proton-proton interactions and its implementation in the PYTHIA8 event generator. We discuss the total, elastic, and total-inelastic cross sections, and the three diffraction-dissociation contributions to the latter: single diffraction, double diffraction, and central diffraction or double-Pomeron exchange. The event generation follows a renormalized-Regge-theory model, successfully tested using CDF data. Based on the MBR-enhanced PYTHIA8 simulation, we present cross-section predictions for the LHC and beyond, up to collision energies of 50 TeV.

  15. Intrinsic fluorescence of protein in turbid media using empirical relation based on Monte Carlo lookup table

    Science.gov (United States)

    Einstein, Gnanatheepam; Udayakumar, Kanniyappan; Aruna, Prakasarao; Ganesan, Singaravelu

    2017-03-01

    Protein fluorescence has been widely used in diagnostic oncology for characterizing cellular metabolism. However, the intensity of fluorescence emission is affected by the absorbers and scatterers in tissue, which may lead to errors in estimating the exact protein content. Extraction of intrinsic fluorescence from measured fluorescence has been achieved by different methods; among them, Monte Carlo-based methods yield the highest accuracy. In this work, we have generated a lookup table from Monte Carlo simulations of fluorescence emission by protein and fitted the generated lookup table with an empirical relation. The empirical relation between measured and intrinsic fluorescence is validated using tissue phantom experiments. The proposed relation can be used for estimating the intrinsic fluorescence of protein in real-time diagnostic applications, thereby improving the clinical interpretation of fluorescence spectroscopic data.
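
    A toy version of the lookup-table idea: a "Monte Carlo" table gives the measured-to-intrinsic fluorescence ratio as a function of the absorption coefficient, an empirical curve is fitted to the table, and the fit is inverted to recover the intrinsic fluorescence. The table here is generated from a simple Beer-Lambert stand-in rather than a real Monte Carlo simulation, and all coefficients are invented.

```python
import numpy as np

# Toy lookup table: ratio F_measured / F_intrinsic versus absorption mu_a.
mu_a = np.linspace(0.1, 2.0, 20)        # absorption coefficient (1/cm)
ratio_table = np.exp(-1.5 * mu_a)       # stand-in for MC-simulated ratios

# Empirical relation ln(ratio) = -k * mu_a, fitted to the table.
k = -np.polyfit(mu_a, np.log(ratio_table), 1)[0]

# Invert the fitted relation to recover intrinsic fluorescence.
f_measured, sample_mu_a = 0.30, 0.8
f_intrinsic = f_measured / np.exp(-k * sample_mu_a)
print(f"k = {k:.3f}, intrinsic fluorescence = {f_intrinsic:.3f}")
```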

  16. Visual improvement for bad handwriting based on Monte-Carlo method

    Science.gov (United States)

    Shi, Cao; Xiao, Jianguo; Xu, Canhui; Jia, Wenhua

    2014-03-01

    A visual improvement algorithm based on Monte Carlo simulation is proposed in this paper to enhance the visual effect of bad handwriting. The improvement process uses a well-designed typeface to optimize the bad-handwriting image. A series of linear operators for image transformation is defined to bring the typeface image closer to the handwriting image, and the specific parameters of these linear operators are estimated by the Monte Carlo method. Visual improvement experiments illustrate that the proposed algorithm can effectively enhance the visual effect of a handwriting image while maintaining the original handwriting features, such as tilt, stroke order and drawing direction. The proposed algorithm has considerable potential for application on tablet computers and the mobile Internet, to improve the user experience of handwriting.
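
    A minimal Monte Carlo parameter search in the spirit of the abstract: a linear operator (here a 1D shift, standing in for tilt or shear of a typeface image) is tuned by random sampling to best match a "handwriting" signal. The signals, the operator and the sample count are all illustrative assumptions.

```python
import numpy as np

# Monte Carlo estimation of a transform parameter by random sampling.
rng = np.random.default_rng(42)
x = np.linspace(-5, 5, 201)
handwriting = np.exp(-(x - 1.3)**2)     # target with unknown offset 1.3

def typeface(shift):
    """Typeface signal under the candidate linear transform (a shift)."""
    return np.exp(-(x - shift)**2)

best_shift, best_err = None, np.inf
for _ in range(2000):                   # Monte Carlo sampling of the parameter
    s = rng.uniform(-3.0, 3.0)
    err = np.sum((typeface(s) - handwriting)**2)
    if err < best_err:
        best_shift, best_err = s, err

print(f"estimated shift = {best_shift:.2f}")
```

    With enough samples the best candidate lands arbitrarily close to the true offset; a real implementation would search several operator parameters jointly.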

  17. ERSN-OpenMC, a Java-based GUI for OpenMC Monte Carlo code

    Directory of Open Access Journals (Sweden)

    Jaafar EL Bakkali

    2016-07-01

    OpenMC is a new Monte Carlo particle transport simulation code focused on solving two types of neutronics problems: k-eigenvalue criticality fission source problems and external fixed-source problems. OpenMC does not have a graphical user interface, and our Java-based application, ERSN-OpenMC, provides one. Its main feature is an easy-to-use and flexible graphical interface that lets users build better and faster simulations with less effort and greater reliability. In addition, the tool offers several other features, such as automating the build process of the OpenMC code and related libraries, while giving users the freedom to customize their installation of this Monte Carlo code. A full description of the ERSN-OpenMC application is presented in this paper.

  18. Fault Risk Assessment of Underwater Vehicle Steering System Based on Virtual Prototyping and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    He Deyu

    2016-09-01

    Assessing the risks of steering system faults in underwater vehicles is a human-machine-environment (HME) systemic safety problem that covers faults in the steering system itself, the driver's human reliability (HR) and various environmental conditions. This paper proposes a fault risk assessment method for an underwater vehicle steering system based on virtual prototyping and Monte Carlo simulation. A virtual steering system prototype was established and validated to compensate for a lack of historical fault data. Fault injection and simulation were conducted to acquire fault simulation data, and a Monte Carlo simulation was adopted that integrated the randomness due to the human operator and the environment. The randomness and uncertainty of the human, machine and environment were thus combined to obtain a probabilistic risk indicator. To verify the proposed method, a case of stuck rudder fault (SRF) risk assessment was studied. This method may provide a novel solution for fault risk assessment of a vehicle or other general HME system.
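
    A hedged sketch of the Monte Carlo risk integration: each trial samples a machine-fault indicator, a human-error indicator and an environment severity factor, and the risk is the fraction of trials in which the combination is hazardous. All probabilities and the hazard model are invented for illustration; they are not the paper's numbers.

```python
import numpy as np

# Monte Carlo HME risk integration with invented probabilities.
rng = np.random.default_rng(1)
n = 100_000
machine_fault = rng.random(n) < 0.02      # component fault per mission
human_error = rng.random(n) < 0.05        # operator fails to recover
env_severity = rng.uniform(0.0, 1.0, n)   # sea-state severity factor

# Hazard: fault occurs, operator misses it, and the environment is severe.
hazardous = machine_fault & human_error & (env_severity > 0.5)
risk = hazardous.mean()
print(f"estimated mission risk: {risk:.5f}")
```

    The expected value here is 0.02 × 0.05 × 0.5 = 5 × 10⁻⁴; the Monte Carlo estimate fluctuates around it with the usual 1/√n statistical error.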

  19. Present status of transport code development based on Monte Carlo method

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki

    1985-01-01

    The present status of development of Monte Carlo codes is briefly reviewed. The main items are the following: application fields; methods used in Monte Carlo codes (geometry specification, nuclear data, estimators and variance reduction techniques) and unfinished work; typical Monte Carlo codes; and the merits of continuous-energy Monte Carlo codes. (author)

  20. Using a knowledge-based planning solution to select patients for proton therapy.

    Science.gov (United States)

    Delaney, Alexander R; Dahele, Max; Tol, Jim P; Kuijper, Ingrid T; Slotman, Ben J; Verbakel, Wilko F A R

    2017-08-01

    Patient selection for proton therapy by comparing proton/photon treatment plans is time-consuming and prone to bias. RapidPlan™, a knowledge-based planning solution, uses plan libraries to model and predict organ-at-risk (OAR) dose-volume histograms (DVHs). We investigated whether RapidPlan, utilizing an algorithm based only on photon beam characteristics, could generate proton DVH predictions and whether these could correctly identify patients for proton therapy. Model-PROT and Model-PHOT comprised 30 head-and-neck cancer proton and photon plans, respectively. Proton and photon knowledge-based plans (KBPs) were made for ten evaluation patients. DVH-prediction accuracy was analyzed by comparing predicted versus achieved mean OAR doses. KBPs and manual plans were compared using salivary gland and swallowing muscle mean doses. For illustration, patients were selected for protons if the predicted Model-PHOT mean dose minus the predicted Model-PROT mean dose (ΔPrediction) for the combined OARs was ≥6 Gy, and the selection was benchmarked using achieved KBP doses. The R² between achieved and predicted Model-PROT/Model-PHOT mean doses was 0.95/0.98. Generally, achieved mean doses for Model-PHOT/Model-PROT KBPs were respectively lower/higher than predicted. Comparing Model-PROT/Model-PHOT KBPs with manual plans, salivary and swallowing mean doses increased/decreased by <2 Gy, on average. ΔPrediction ≥6 Gy correctly selected 4 of 5 patients for protons. Knowledge-based DVH predictions can provide efficient, patient-specific selection for protons. A proton-specific RapidPlan solution could improve results. Copyright © 2017 Elsevier B.V. All rights reserved.
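
    The selection rule described above reduces to a simple threshold on predicted dose differences. The sketch below applies the 6 Gy criterion from the study to invented per-patient predictions; the patient IDs and dose numbers are placeholders, not data from the paper.

```python
# Threshold-based proton selection on predicted mean OAR doses.
THRESHOLD_GY = 6.0

def select_for_protons(pred_photon_mean, pred_proton_mean, threshold=THRESHOLD_GY):
    """True when predicted combined-OAR sparing favors protons."""
    return (pred_photon_mean - pred_proton_mean) >= threshold

patients = {                 # {id: (photon prediction, proton prediction) in Gy}
    "pt1": (32.0, 24.0),     # 8.0 Gy predicted sparing -> select
    "pt2": (28.0, 25.5),     # 2.5 Gy predicted sparing -> do not select
}
selected = [pid for pid, (ph, pr) in patients.items() if select_for_protons(ph, pr)]
print("selected:", selected)
```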

  1. Proton exchange in acid–base complexes induced by reaction coordinates with heavy atom motions

    International Nuclear Information System (INIS)

    Alavi, Saman; Taghikhani, Mahdi

    2012-01-01

    Highlights: ► Proton exchange in acid–base complexes is studied. ► The structures, binding energies, and normal mode vibrations are calculated. ► Transition state structures of the proton exchange mechanism are determined. ► In the complexes studied, the reaction coordinate involves heavy-atom rocking. ► The reaction coordinate is not simply localized in the proton movements. - Abstract: We extend previous work on nitric acid–ammonia and nitric acid–alkylamine complexes to illustrate that proton exchange reaction coordinates involve the rocking motion of the base moiety in many double hydrogen-bonded gas phase strong acid–strong base complexes. The complexes studied involve the biologically and atmospherically relevant glycine, formic, acetic, propionic, and sulfuric acids with ammonia/alkylamine bases. In these complexes, the magnitudes of the imaginary frequencies associated with the proton exchange transition states are low. This contrasts with widely studied proton exchange reactions between symmetric carboxylic acid dimers or asymmetric DNA base pairs and their analogs, where the reaction coordinate is localized in proton motions and the magnitudes of the imaginary frequencies for the transition states are >1100 cm⁻¹. Calculations on complexes of these acids with water are performed for comparison. Variations of the normal vibration modes along the reaction coordinate in the complexes are described.

  2. Continuous energy Monte Carlo calculations for randomly distributed spherical fuels based on statistical geometry model

    Energy Technology Data Exchange (ETDEWEB)

    Murata, Isao [Osaka Univ., Suita (Japan); Mori, Takamasa; Nakagawa, Masayuki; Itakura, Hirofumi

    1996-03-01

    A method to calculate the neutronics parameters of a core composed of randomly distributed spherical fuels has been developed, based on a statistical geometry model with a continuous-energy Monte Carlo method. The method was implemented in the general purpose Monte Carlo code MCNP, yielding a new code, MCNP-CFP. This paper describes the model and method, how to use them, and the validation results. In the Monte Carlo calculation, the location of a spherical fuel is sampled probabilistically along the particle flight path from the spatial probability distribution of spherical fuels, called the nearest neighbor distribution (NND). This sampling method was validated through the following two comparisons: (1) calculations of the inventory of coated fuel particles (CFPs) in a fuel compact by both a track length estimator and a direct evaluation method, and (2) criticality calculations for ordered packed geometries. The method was also confirmed by applying it to an analysis of the critical assembly experiment at VHTRC. The method established in the present study is quite unique in that it provides a probabilistic model of a geometry containing a great number of randomly distributed spherical fuels. With future speed-ups from vector or parallel computation, it is expected to be widely used in reactor core calculations, especially for HTGR cores. (author).
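
    Sampling from a nearest-neighbor distribution can be illustrated for the simplest case: for sphere centers forming an ideal Poisson point process of density n, the distance to the nearest center has CDF F(r) = 1 − exp(−(4/3)πn·r³), which can be sampled by inversion. A real packed-bed NND, as tallied for coated fuel particles, would replace this analytic form; the density value is illustrative.

```python
import numpy as np

# Inverse-CDF sampling of the Poisson nearest-neighbor distance.
rng = np.random.default_rng(7)
n_density = 0.01  # sphere centers per cm^3 (illustrative)

def sample_nnd(size):
    """Sample nearest-neighbor distances: r = (-ln(1-u) * 3/(4 pi n))^(1/3)."""
    u = rng.random(size)
    return (-np.log(1.0 - u) * 3.0 / (4.0 * np.pi * n_density)) ** (1.0 / 3.0)

samples = sample_nnd(200_000)
median_analytic = (np.log(2.0) * 3.0 / (4.0 * np.pi * n_density)) ** (1.0 / 3.0)
print(f"sampled median = {np.median(samples):.3f} cm, "
      f"analytic = {median_analytic:.3f} cm")
```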

  3. Monte Carlo-based investigation of water-equivalence of solid phantoms at 137Cs energy

    International Nuclear Information System (INIS)

    Vishwakarma, Ramkrushna S.; Palani Selvam, T.; Sahoo, Sridhar; Mishra, Subhalaxmi; Chourasiya, Ghanshyam

    2013-01-01

    The equivalence to liquid water of solid phantom materials such as solid water, virtual water, plastic water, RW1, polystyrene, and polymethylmethacrylate (PMMA) at 137Cs energy (photon energy of 662 keV) under full scatter conditions is investigated using the EGSnrc Monte Carlo code system. The EGSnrc code was used to calculate distance-dependent phantom scatter corrections, and the study also includes the separation of primary and scattered dose components. Monte Carlo simulations were carried out with up to 5 × 10⁹ primary particle histories to attain statistical uncertainties of less than 0.3% in the estimated dose. The investigation reveals that the solid water, virtual water, and RW1 phantoms are water equivalent up to 15 cm from the source, while plastic water, PMMA, and polystyrene are water equivalent up to 10 cm. At 15 cm from the source, the phantom scatter corrections are 1.035, 1.050, and 0.949 for PMMA, plastic water, and polystyrene, respectively. (author)

  4. Experimental observation of acoustic emissions generated by a pulsed proton beam from a hospital-based clinical cyclotron

    International Nuclear Information System (INIS)

    Jones, Kevin C.; Solberg, Timothy D.; Avery, Stephen; Vander Stappen, François; Janssens, Guillaume; Prieels, Damien; Bawiec, Christopher R.; Lewin, Peter A.; Sehgal, Chandra M.

    2015-01-01

    Purpose: To measure the acoustic signal generated by a pulsed proton spill from a hospital-based clinical cyclotron. Methods: An electronic function generator modulated the IBA C230 isochronous cyclotron to create a pulsed proton beam. The acoustic emissions generated by the proton beam were measured in water using a hydrophone, and the measurements were repeated with increasing proton current and increasing distance between detector and beam. Results: The cyclotron generated proton spills with rise times of 18 μs and a maximum measured instantaneous proton current of 790 nA. Acoustic emissions generated by the proton energy deposition were measured to be on the order of mPa. The origin of the acoustic wave was identified as the proton beam, based on the correlation between the acoustic emission arrival time and the distance between the hydrophone and the proton beam. The acoustic frequency spectrum peaked at 10 kHz, and the acoustic pressure amplitude increased monotonically with increasing proton current. Conclusions: The authors report the first observation of acoustic emissions generated by a proton beam from a hospital-based clinical cyclotron. When modulated by an electronic function generator, the cyclotron is capable of creating proton spills with fast rise times (18 μs) and high instantaneous currents (790 nA). Measurements of the proton-generated acoustic emissions in a clinical setting may provide a method for in vivo proton range verification and patient monitoring.
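
    The origin check described above, correlating arrival time with hydrophone-to-beam distance, amounts to a linear fit whose slope is the reciprocal of the speed of sound in water (about 1480 m/s). The "measurements" below are synthetic, with a small invented trigger offset.

```python
import numpy as np

# Fit arrival time vs distance; slope = 1/c identifies the acoustic source.
c_water = 1480.0                                # speed of sound in water (m/s)
distance = np.array([0.02, 0.04, 0.06, 0.08])   # hydrophone-to-beam distance (m)
arrival = distance / c_water + 5e-6             # synthetic times with 5 us offset

slope, offset = np.polyfit(distance, arrival, 1)
c_fit = 1.0 / slope
print(f"fitted sound speed: {c_fit:.0f} m/s")
```

    A fitted speed consistent with sound in water supports the proton beam, rather than electrical pickup, as the acoustic source.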

  5. The future of new calculation concepts in dosimetry based on the Monte Carlo Methods

    International Nuclear Information System (INIS)

    Makovicka, L.; Vasseur, A.; Sauget, M.; Martin, E.; Gschwind, R.; Henriet, J.; Vasseur, A.; Sauget, M.; Martin, E.; Gschwind, R.; Henriet, J.; Salomon, M.

    2009-01-01

    Monte Carlo codes, precise but slow, are very important tools in the vast majority of specialities connected to radiation physics, radiation protection and dosimetry. A discussion of some other computing solutions is carried out: solutions based not only on the enhancement of computer power, or on the 'biasing' used for the relative acceleration of these codes (in the case of photons), but on more efficient methods (artificial neural networks, A.N.N.; case-based reasoning, C.B.R.; and other computer science techniques) already and successfully used for a long time in other scientific and industrial applications, and not only in radiation protection or medical dosimetry. (authors)

  6. Fast GPU-based Monte Carlo simulations for LDR prostate brachytherapy

    Science.gov (United States)

    Bonenfant, Éric; Magnoux, Vincent; Hissoiny, Sami; Ozell, Benoît; Beaulieu, Luc; Després, Philippe

    2015-07-01

    The aim of this study was to evaluate the potential of bGPUMCD, a Monte Carlo algorithm executed on graphics processing units (GPUs), for fast dose calculations in permanent prostate implant dosimetry. It also aimed to validate a low dose rate brachytherapy source in terms of TG-43 metrics and to use this source to compute dose distributions for permanent prostate implants in very short times. The physics of bGPUMCD was reviewed and extended to include Rayleigh scattering and fluorescence from photoelectric interactions for all materials involved. The radial and anisotropy functions were obtained for the Nucletron SelectSeed in TG-43 conditions and compared to those found in the MD Anderson Imaging and Radiation Oncology Core brachytherapy source registry, which are considered the TG-43 reference values. After appropriate calibration of the source, permanent prostate implant dose distributions were calculated for four patients and compared to an already validated Geant4 algorithm. The radial function calculated with bGPUMCD showed excellent agreement (differences within 1.3%) with the TG-43 accepted values. The anisotropy functions at r = 1 cm and r = 4 cm were within 2% of TG-43 values for angles over 17.5°. For permanent prostate implants, Monte Carlo-based dose distributions with a statistical uncertainty of 1% or less for the target volume were obtained in 30 s or less on 1 × 1 × 1 mm³ calculation grids. Dosimetric indices were very similar (within 2.7%) to those obtained with a validated, independent Monte Carlo code (Geant4) performing the calculations for the same cases in a much longer time (tens of minutes to more than an hour). bGPUMCD is a promising code that makes it possible to envision the use of Monte Carlo techniques in a clinical environment, with sub-minute execution times on a standard workstation. Future work will explore the use of this code with an inverse planning method to provide a complete Monte Carlo-based planning solution.
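
    The TG-43 radial dose function mentioned above can be illustrated with a short computation: g_L(r) = [D(r)/D(r₀)] · [G_L(r₀)/G_L(r)] with r₀ = 1 cm, using the line-source geometry function at θ = 90°, G_L(r) = 2·atan(L/2r)/(L·r). The active length and the dose-rate samples below are synthetic placeholders, not SelectSeed data.

```python
import numpy as np

# TG-43 radial dose function from synthetic transverse-axis dose rates.
L_SRC = 0.35  # active source length (cm), illustrative

def g_line(r):
    """Line-source geometry function G_L(r, 90 deg) = 2*atan(L/2r)/(L*r)."""
    return 2.0 * np.arctan(L_SRC / (2.0 * r)) / (L_SRC * r)

r = np.array([0.5, 1.0, 2.0, 4.0])                   # radial distances (cm)
dose_rate = g_line(r) * np.exp(-0.03 * (r - 1.0))    # synthetic D(r, 90 deg)

i0 = 1  # index of the reference distance r0 = 1 cm
g = (dose_rate / dose_rate[i0]) * (g_line(r[i0]) / g_line(r))
print("g_L(r):", np.round(g, 4))
```

    Dividing out the geometry function isolates the attenuation-and-scatter falloff, which is exactly what the radial dose function is meant to capture.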

  7. Proton therapy

    International Nuclear Information System (INIS)

    Smith, Alfred R

    2006-01-01

    Proton therapy has become a subject of considerable interest in the radiation oncology community and it is expected that there will be a substantial growth in proton treatment facilities during the next decade. I was asked to write a historical review of proton therapy based on my personal experiences, which have all occurred in the United States, so therefore I have a somewhat parochial point of view. Space requirements did not permit me to mention all of the existing proton therapy facilities or the names of all of those who have contributed to proton therapy. (review)

  8. Thermal transport in nanocrystalline Si and SiGe by ab initio based Monte Carlo simulation.

    Science.gov (United States)

    Yang, Lina; Minnich, Austin J

    2017-03-14

    Nanocrystalline thermoelectric materials based on Si have long been of interest because Si is earth-abundant, inexpensive, and non-toxic. However, a poor understanding of phonon grain boundary scattering and its effect on thermal conductivity has impeded efforts to improve the thermoelectric figure of merit. Here, we report an ab-initio based computational study of thermal transport in nanocrystalline Si-based materials using a variance-reduced Monte Carlo method with the full phonon dispersion and intrinsic lifetimes from first-principles as input. By fitting the transmission profile of grain boundaries, we obtain excellent agreement with experimental thermal conductivity of nanocrystalline Si [Wang et al. Nano Letters 11, 2206 (2011)]. Based on these calculations, we examine phonon transport in nanocrystalline SiGe alloys with ab-initio electron-phonon scattering rates. Our calculations show that low energy phonons still transport substantial amounts of heat in these materials, despite scattering by electron-phonon interactions, due to the high transmission of phonons at grain boundaries, and thus improvements in ZT are still possible by disrupting these modes. This work demonstrates the important insights into phonon transport that can be obtained using ab-initio based Monte Carlo simulations in complex nanostructured materials.
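
The role of the fitted grain-boundary transmission can be illustrated with a toy gray-medium random walk: a phonon free path ends either at an intrinsic scattering event or at a partially transmitting boundary. All values are illustrative, not the ab initio inputs used in the paper.

```python
import random

def effective_mfp(lam_i, d, T, n=200_000, seed=1):
    """Mean distance travelled before a terminating event, for intrinsic
    MFP lam_i and boundaries spaced d apart with transmission T."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = 0.0
        while True:
            step = rng.expovariate(1.0 / lam_i)   # intrinsic free flight
            boundary = d - (x % d)                 # distance to next boundary
            if step < boundary:
                x += step
                break                              # intrinsic event ends path
            x += boundary
            if rng.random() > T:                   # reflected at boundary
                break
        total += x
    return total / n

lam = effective_mfp(lam_i=100.0, d=20.0, T=0.5)
# lam falls well below lam_i because boundaries terminate paths early
```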

  9. SU-F-J-194: Development of Dose-Based Image Guided Proton Therapy Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Pham, R; Sun, B; Zhao, T; Li, H; Yang, D; Grantham, K; Goddu, S; Santanam, L; Bradley, J; Mutic, S; Kandlakunta, P; Zhang, T [Washington University School of Medicine, Saint Louis, MO (United States)

    2016-06-15

    Purpose: To implement image-guided proton therapy (IGPT) based on daily proton dose distributions. Methods: Unlike in x-ray therapy, simple alignment based on anatomy cannot ensure proper dose coverage in proton therapy. Anatomy changes along the beam path may lead to underdosing the target or overdosing the organ-at-risk (OAR). With an in-room mobile computed tomography (CT) system, we are developing a dose-based IGPT software tool that allows patient positioning and treatment adaptation based on daily dose distributions. During an IGPT treatment, daily CT images are acquired in treatment position. After initial positioning based on rigid image registration, the proton dose distribution is calculated on the daily CT images. The target and OARs are automatically delineated via deformable image registration. Dose distributions are evaluated to decide whether repositioning or plan adaptation is necessary in order to achieve proper coverage of the target and sparing of OARs. Besides online dose-based image guidance, the software tool can also map daily treatment doses to the treatment planning CT images for offline adaptive treatment. Results: An in-room helical CT system was commissioned for IGPT purposes. It produces accurate CT numbers that allow proton dose calculation. GPU-based deformable image registration algorithms were developed and evaluated for automatic ROI delineation and dose mapping. The online and offline IGPT functionalities were evaluated with daily CT images of proton patients. Conclusion: The online and offline IGPT software tool may improve the safety and quality of proton treatment by allowing dose-based IGPT and adaptive proton treatments. Research is partially supported by Mevion Medical Systems.
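
A hypothetical sketch of the dose-based decision step such a workflow performs: recompute DVH metrics on the daily dose and flag the fraction for adaptation when coverage or an OAR limit is violated. Metric names, thresholds, and the dose samples are illustrative, not the tool's actual criteria.

```python
import numpy as np

def d95(dose_in_roi):
    """Dose received by at least 95% of the ROI voxels (Gy)."""
    return np.percentile(dose_in_roi, 5)

def needs_adaptation(target_dose, oar_dose, prescription, oar_limit):
    # Flag if target coverage drops below 95% of prescription
    # or the OAR maximum dose exceeds its limit.
    return d95(target_dose) < 0.95 * prescription or oar_dose.max() > oar_limit

rng = np.random.default_rng(0)
target = rng.normal(60.0, 1.0, 10_000)   # daily target dose voxels, made up
oar = rng.normal(20.0, 2.0, 10_000)
flag = needs_adaptation(target, oar, prescription=60.0, oar_limit=45.0)
```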

  10. Monte Carlo tests of the Rasch model based on scalability coefficients

    DEFF Research Database (Denmark)

    Christensen, Karl Bang; Kreiner, Svend

    2010-01-01

    For item responses fitting the Rasch model, the assumptions underlying the Mokken model of double monotonicity are met. This makes non-parametric item response theory a natural starting-point for Rasch item analysis. This paper studies scalability coefficients based on Loevinger's H coefficient that summarizes the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model using p-values computed using Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence...
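
Monte Carlo p-values of the kind used here follow a generic recipe: simulate the test statistic under the model and count how often the simulated value is at least as extreme as the observed one. A sketch with a toy statistic (not the paper's Guttman-error coefficient):

```python
import random

def mc_p_value(observed, simulate, n=999, seed=42):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if simulate(rng) >= observed)
    return (hits + 1) / (n + 1)       # add-one correction keeps p > 0

# Toy null model: statistic = number of heads in 20 fair coin flips.
simulate = lambda rng: sum(rng.random() < 0.5 for _ in range(20))
p = mc_p_value(observed=15, simulate=simulate, n=999)
```

The same skeleton applies when `simulate` draws data matrices from the Rasch model (e.g. via MCMC) and returns a scalability coefficient.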

  11. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    Science.gov (United States)

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

    According to the working principle of the binocular photoelectric instrument optical axis parallelism digital calibration instrument, and taking all components of the instrument into account, the various factors that affect system precision are analyzed and a precision analysis model is established. Based on the error distributions, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the variation of the center coordinate of the circular target image. The method can further guide the error distribution, optimize and control the factors that have the greatest influence on the comprehensive error, and improve the measurement accuracy of the optical axis parallelism digital calibration instrument.
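
The Monte Carlo tolerance analysis described above can be sketched as follows: sample each component error from its assumed distribution and accumulate the scatter of the measured target-image centre. The error sources and magnitudes below are invented for illustration.

```python
import random, math

def simulate_centre_error(n=100_000, seed=7):
    """1-sigma combined centre error (arcsec) from assumed component errors."""
    rng = random.Random(seed)
    errs = []
    for _ in range(n):
        tilt = rng.gauss(0.0, 10.0)       # collimator tilt error, arcsec
        sensor = rng.uniform(-5.0, 5.0)   # centroiding error, arcsec
        mount = rng.gauss(0.0, 4.0)       # mounting error, arcsec
        errs.append(tilt + sensor + mount)
    mean = sum(errs) / n
    var = sum((e - mean) ** 2 for e in errs) / (n - 1)
    return math.sqrt(var)

sigma = simulate_centre_error()
# Root-sum-square prediction: sqrt(10^2 + (10/sqrt(12))^2 + 4^2) ~ 11.1 arcsec
```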

  12. Monte Carlo simulation based reliability evaluation in a multi-bilateral contracts market

    International Nuclear Information System (INIS)

    Goel, L.; Viswanath, P.A.; Wang, P.

    2004-01-01

    This paper presents a time sequential Monte Carlo simulation technique to evaluate customer load point reliability in multi-bilateral contracts market. The effects of bilateral transactions, reserve agreements, and the priority commitments of generating companies on customer load point reliability have been investigated. A generating company with bilateral contracts is modelled as an equivalent time varying multi-state generation (ETMG). A procedure to determine load point reliability based on ETMG has been developed. The developed procedure is applied to a reliability test system to illustrate the technique. Representing each bilateral contract by an ETMG provides flexibility in determining the reliability at various customer load points. (authors)
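
The core of a time-sequential simulation is sampling alternating up/down durations for each unit. A minimal sketch for a single two-state generating unit, with illustrative rates (a multi-bilateral-contracts study would layer the ETMG and contract logic on top of this):

```python
import random

def simulated_unavailability(mttf, mttr, horizon, seed=3):
    """Fraction of the horizon the unit spends in the down state."""
    rng = random.Random(seed)
    t, down_time, up = 0.0, 0.0, True
    while t < horizon:
        dur = rng.expovariate(1.0 / (mttf if up else mttr))
        dur = min(dur, horizon - t)        # clip last interval at horizon
        if not up:
            down_time += dur
        t += dur
        up = not up
    return down_time / horizon

u = simulated_unavailability(mttf=1000.0, mttr=50.0, horizon=2_000_000.0)
# Analytic forced outage rate: mttr / (mttf + mttr) = 50/1050 ~ 0.048
```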

  13. CARMEN: a Monte Carlo planning system based on linear programming from direct apertures

    International Nuclear Information System (INIS)

    Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.

    2013-01-01

    The use of Monte Carlo (MC) has shown an improvement in the accuracy of dose calculation compared with the analytical algorithms installed in commercial treatment planning systems, especially in the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, CARMEN, is based on full simulation of both the beam transport in the accelerator head and in the patient, with the simulation designed for efficient operation in terms of the accuracy of the estimate and the required computation times. (Author)

  14. Monte: A compact and versatile multidetector system based on monolithic telescopes

    International Nuclear Information System (INIS)

    Amorini, F.; Bonanno, A.; Cardella, G.; Di Pietro, A.; Fallica, G.; Figuera, P.; Morea, A.; Musumarra, A.; Papa, M.; Pappalardo, G.; Pinto, A.; Rizzo, F.; Tian, W.; Tudisco, S.; Valvo, G.

    2005-01-01

    We present the characteristics of a new multidetector based on monolithic silicon telescopes: MONTE. By using high-energy ion implantation techniques, the ΔE and residual energy stages of such telescopes have been integrated on the same silicon chip, obtaining extremely thin ΔE stages of the order of 1μm. This allowed one to obtain a very low charge identification energy threshold and a very good β background suppression in reactions induced by radioactive ion beams. The multidetector has a modular structure and can be assembled in different geometrical configurations according to experimental needs

  15. A new anhydrous proton conductor based on polybenzimidazole and tridecyl phosphate

    International Nuclear Information System (INIS)

    Jiang Fengjing; Pu Hongting; Meyer, Wolfgang H.; Guan Yisi; Wan Decheng

    2008-01-01

    Most of the anhydrous proton conducting membranes are based on inorganic or partially inorganic materials, like SrCeO 3 membranes or polybenzimidazole (PBI)/H 3 PO 4 composite membranes. In the present work, a new kind of anhydrous proton conducting membrane based on the fully organic components PBI and tridecyl phosphate (TP) was prepared. The interaction between PBI and TP is discussed. The temperature dependence of the proton conductivity of the composite membranes can be modeled by an Arrhenius relation. Thermogravimetric analysis (TGA) shows that these composite membranes are chemically stable up to 145 deg. C. The weight loss appearing at 145 deg. C is attributed to the self-condensation of phosphate, which results in the drop of the proton conductivity of the membranes occurring at the same temperature. The DC conductivity of the composite membranes can reach ∼10 -4 S/cm for PBI/1.8TP at 140 deg. C and increases with increasing TP content. The proton conductivity of PBI/TP and PBI/H 3 PO 4 composite membranes is compared. The former have higher proton conductivity; however, the proton conductivity of the PBI/H 3 PO 4 membranes increases more significantly with temperature. Compared with PBI/H 3 PO 4 membranes, the migration stability of TP in PBI/TP membranes is improved significantly
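
The Arrhenius relation mentioned above, sigma(T) = sigma0 * exp(-Ea / (kB * T)), can be fitted from two conductivity points. The numbers below are illustrative, not the PBI/TP measurements.

```python
import math

KB = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_params(t1, s1, t2, s2):
    """Return (sigma0, Ea in eV) from conductivities s1 at T1 and s2 at T2 (K)."""
    ea = KB * math.log(s1 / s2) / (1.0 / t2 - 1.0 / t1)
    sigma0 = s1 * math.exp(ea / (KB * t1))
    return sigma0, ea

# Hypothetical pair of points: 1e-5 S/cm at 80 deg. C, 1e-4 S/cm at 140 deg. C.
sigma0, ea = arrhenius_params(353.0, 1.0e-5, 413.0, 1.0e-4)
```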

  16. PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation

    Energy Technology Data Exchange (ETDEWEB)

    Espana, S; Herraiz, J L; Vicente, E; Udias, J M [Grupo de Fisica Nuclear, Departmento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, Madrid (Spain); Vaquero, J J; Desco, M [Unidad de Medicina y CirugIa Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es

    2009-03-21

    Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.

  17. PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation

    International Nuclear Information System (INIS)

    Espana, S; Herraiz, J L; Vicente, E; Udias, J M; Vaquero, J J; Desco, M

    2009-01-01

    Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.

  18. Monte Carlo based treatment planning for modulated electron beam radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Michael C. [Radiation Physics Division, Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States)]. E-mail: mclee@reyes.stanford.edu; Deng Jun; Li Jinsheng; Jiang, Steve B.; Ma, C.-M. [Radiation Physics Division, Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA (United States)

    2001-08-01

    A Monte Carlo based treatment planning system for modulated electron radiation therapy (MERT) is presented. This new variation of intensity modulated radiation therapy (IMRT) utilizes an electron multileaf collimator (eMLC) to deliver non-uniform intensity maps at several electron energies. In this way, conformal dose distributions are delivered to irregular targets located a few centimetres below the surface while sparing deeper-lying normal anatomy. Planning for MERT begins with Monte Carlo generation of electron beamlets. Electrons are transported with proper in-air scattering and the dose is tallied in the phantom for each beamlet. An optimized beamlet plan may be calculated using inverse-planning methods. Step-and-shoot leaf sequences are generated for the intensity maps and dose distributions recalculated using Monte Carlo simulations. Here, scatter and leakage from the leaves are properly accounted for by transporting electrons through the eMLC geometry. The weights for the segments of the plan are re-optimized with the leaf positions fixed and bremsstrahlung leakage and electron scatter doses included. This optimization gives the final optimized plan. It is shown that a significant portion of the calculation time is spent transporting particles in the leaves. However, this is necessary since optimizing segment weights based on a model in which leaf transport is ignored results in an improperly optimized plan with overdosing of target and critical structures. A method of rapidly calculating the bremsstrahlung contribution is presented and shown to be an efficient solution to this problem. A homogeneous model target and a 2D breast plan are presented. The potential use of this tool in clinical planning is discussed. (author)
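
The final segment-weight re-optimization described above can be sketched as a small non-negative least-squares problem solved by projected gradient descent. The 3-segment, 4-voxel dose-influence matrix below is made up, not a MERT dataset.

```python
import numpy as np

def optimize_weights(D, prescribed, iters=5000, lr=0.01):
    """D[i, j]: dose to voxel i per unit weight of segment j (leaf positions
    fixed, so D already includes leakage and scatter contributions)."""
    w = np.ones(D.shape[1])
    for _ in range(iters):
        grad = D.T @ (D @ w - prescribed)     # gradient of 0.5*||Dw - p||^2
        w = np.maximum(w - lr * grad, 0.0)    # project onto w >= 0
    return w

D = np.array([[1.0, 0.2, 0.0],
              [0.5, 1.0, 0.1],
              [0.0, 0.6, 1.0],
              [0.1, 0.0, 0.8]])
prescribed = np.array([1.0, 1.4, 1.5, 0.9])
w = optimize_weights(D, prescribed)
```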

  19. Proton exchange membranes based on PVDF/SEBS blends

    Energy Technology Data Exchange (ETDEWEB)

    Mokrini, A.; Huneault, M.A. [Industrial Materials Institute, National Research Council of Canada, 75 de Mortagne Blvd., Boucherville, Que. (Canada J4B 6Y4)

    2006-03-09

    Proton-conductive polymer membranes are used as an electrolyte in the so-called proton exchange membrane fuel cells. Current commercially available membranes are perfluorosulfonic acid polymers, a class of high-cost ionomers. This paper examines the potential of polymer blends, namely those of styrene-(ethylene-butylene)-styrene block copolymer (SEBS) and polyvinylidene fluoride (PVDF), in the proton exchange membrane application. SEBS/PVDF blends were prepared by twin-screw extrusion and the membranes were formed by calendering. SEBS is a phase-segregated material whose polystyrene blocks can be selectively functionalized, offering high ionic conductivity, while PVDF ensures good dimensional stability and chemical resistance of the films. Proton conductivity of the films was obtained by solid-state grafting of sulfonic acid moieties. The obtained membranes were characterized in terms of conductivity, ionic exchange capacity and water uptake. In addition, the membranes were characterized in terms of morphology, microstructure and thermo-mechanical properties to establish the blends' morphology-property relationships. Modification of the interfacial properties between SEBS and PVDF was found to be key to optimizing the blends' performance. Addition of a methyl methacrylate-butyl acrylate-methyl methacrylate block copolymer (MMA-BA-MMA) was found to compatibilize the blend by reducing the segregation scale and improving the blend homogeneity. Mechanical resistance of the membranes was also improved through the addition of this compatibilizer. As little as 2wt.% compatibilizer was sufficient for complete interfacial coverage and led to improved mechanical properties. Compatibilized blend membranes also showed higher conductivities, 1.9x10{sup -2} to 5.5x10{sup -3}Scm{sup -1}, and improved water management. (author)

  20. Proton-conducting polymer electrolytes based on methacrylates

    Czech Academy of Sciences Publication Activity Database

    Reiter, Jakub; Velická, Jana; Míka, M.

    2008-01-01

    Roč. 53, č. 26 (2008), s. 7769-7774 ISSN 0013-4686 R&D Projects: GA ČR GA106/04/1279; GA AV ČR KJB400320701; GA MŠk LC523; GA ČR(CZ) GA104/06/1471 Institutional research plan: CEZ:AV0Z40320502 Keywords: polymer electrolyte * proton conductivity * phosphoric acid Subject RIV: CA - Inorganic Chemistry Impact factor: 3.078, year: 2008

  1. Measurements and Monte Carlo calculations of forward-angle secondary-neutron-production cross-sections for 137 and 200 MeV proton-induced reactions in carbon

    Science.gov (United States)

    Iwamoto, Yosuke; Hagiwara, Masayuki; Matsumoto, Tetsuro; Masuda, Akihiko; Iwase, Hiroshi; Yashima, Hiroshi; Shima, Tatsushi; Tamii, Atsushi; Nakamura, Takashi

    2012-10-01

    Secondary neutron-production double-differential cross-sections (DDXs) have been measured from interactions of 137 MeV and 200 MeV protons in a natural carbon target. The data were measured between 0° and 25° in the laboratory. DDXs were obtained with high energy resolution in the energy region from 3 MeV up to the maximum energy. The experimental data of 137 MeV protons at 10° and 25° were in good agreement with those of 113 MeV protons at 7.5° and 30° at LANSCE/WNR in the energy region below 80 MeV. Benchmark calculations were carried out with the PHITS code using the evaluated nuclear data files of JENDL/HE-2007 and ENDF/B-VII, and the theoretical models of Bertini-GEM and ISOBAR-GEM. For the 137 MeV proton incidence, calculations using JENDL/HE-2007 generally reproduced the shape and the intensity of the experimental spectra well, including the ground state of 12N produced by the 12C(p,n)12N reaction. For the 200 MeV proton incidence, all calculated results underestimated the experimental data by a factor of two, except for the calculation using the ISOBAR model. ISOBAR predicts nucleon emission at forward angles qualitatively better than the Bertini model. These experimental data will be useful for evaluating the carbon data and as benchmark data for investigating the validity of Monte Carlo simulation for the shielding design of accelerator facilities.

  2. Detection of protonated non-Watson-Crick base pairs using electrospray ionization mass spectrometry.

    Science.gov (United States)

    Ishida, Riyoko; Iwahashi, Hideo

    2018-03-01

    Many studies have shown that protonated nucleic acid base pairs are involved in a wide variety of nucleic acid structures. However, little information is available on the relative stability of hemiprotonated self- and non-self-dimers at the monomer level. We used electrospray ionization mass spectrometry (ESI-MS) to evaluate their relative stability under various hydrogen ion concentrations. These measurements make it possible to infer the formation of protonated non-Watson-Crick base pairs from DNA and RNA base sequences. In the present study, we observed ESI-MS peaks corresponding to the respective self-dimers of all examined nucleosides except adenosine. Peak heights depended on the concentration of hydrogen ion. The ESI-MS peak heights of the hemiprotonated cytidine dimers and the hemiprotonated thymidine dimer increased sharply with increasing hydrogen ion concentration, suggesting direct participation of hydrogen ions in dimer formation. In ESI-MS measurements of solutions containing adenosine, cytidine, thymidine and guanosine, we observed the protonated cytidine-guanosine dimer (CH+-G) and the protonated cytidine-thymidine dimer (CH+-T) in addition to the hemiprotonated cytidine-cytidine dimer (CH+-C), with the following order of relative peak heights: (CH+-C) > (CH+-G) ≈ (CH+-T) > (CH+-A). Additionally, in ESI-MS measurements of solutions containing adenosine, thymidine and guanosine, we observed considerable amounts of the protonated adenosine-guanosine dimer (AH+-G) and the protonated adenosine-thymidine dimer (AH+-T).

  3. The Quest for Evidence for Proton Therapy: Model-Based Approach and Precision Medicine

    Energy Technology Data Exchange (ETDEWEB)

    Widder, Joachim, E-mail: j.widder@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Lambin, Philippe [Department of Radiation Oncology, School for Oncology and Developmental Biology (GROW), Maastricht University Medical Center, Maastricht (Netherlands); Marijnen, Corrie A.M. [Department of Radiation Oncology, Leiden University Medical Center, Leiden (Netherlands); Pignol, Jean-Philippe [Department of Radiation Oncology, Erasmus Medical Center Cancer Institute, Rotterdam (Netherlands); Rasch, Coen R. [Department of Radiation Oncology, Academic Medical Center, Amsterdam (Netherlands); Slotman, Ben J. [Department of Radiation Oncology, VU Medical Center, Amsterdam (Netherlands); Verheij, Marcel [Department of Radiation Oncology, Netherlands Cancer Institute, Amsterdam (Netherlands); Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2016-05-01

    Purpose: Reducing dose to normal tissues is the advantage of protons versus photons. We aimed to describe a method for translating this reduction into a clinically relevant benefit. Methods and Materials: Dutch scientific and health care governance bodies have recently issued landmark reports regarding generation of relevant evidence for new technologies in health care including proton therapy. An approach based on normal tissue complication probability (NTCP) models has been adopted to select patients who are most likely to experience fewer (serious) adverse events achievable by state-of-the-art proton treatment. Results: By analogy with biologically targeted therapies, the technology needs to be tested in enriched cohorts of patients exhibiting the decisive predictive marker: difference in normal tissue dosimetric signatures between proton and photon treatment plans. Expected clinical benefit is then estimated by virtue of multifactorial NTCP models. In this sense, high-tech radiation therapy falls under precision medicine. As a consequence, randomizing nonenriched populations between photons and protons is predictably inefficient and likely to produce confusing results. Conclusions: Validating NTCP models in appropriately composed cohorts treated with protons should be the primary research agenda leading to urgently needed evidence for proton therapy.
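
A sketch of how an NTCP-model comparison between competing plans might look, using the commonly used Lyman-Kutcher-Burman (LKB) form with placeholder parameters; this is an illustration of the delta-NTCP idea, not a validated model from the paper.

```python
import math

def lkb_ntcp(eud, td50=40.0, m=0.2):
    """LKB NTCP: probit of (EUD - TD50) / (m * TD50). Parameters are
    hypothetical placeholders for a generic organ at risk."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Dosimetric signatures of two plans for the same patient (illustrative EUDs):
ntcp_photon = lkb_ntcp(eud=35.0)
ntcp_proton = lkb_ntcp(eud=25.0)
delta_ntcp = ntcp_photon - ntcp_proton   # expected toxicity reduction
```

Patients with a large delta-NTCP are the "enriched cohort" the model-based approach would select for proton treatment.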

  4. Interaction of 14 MeV neutrons with hydrogenated target proton emission calculation

    International Nuclear Information System (INIS)

    Martin, G.; Perez, N.; Desdin.

    1996-01-01

    Using the neutron emission data of a 14 MeV neutron generator and a paraffin target, and based on the n + 1 H → n' + p reaction, the characteristics of the proton emission in a mixed proton-neutron field have been obtained. Monte Carlo simulation was used to obtain the proton output as a function of converter thickness, together with the proton energy spectra for different converter thicknesses. Between 0.07 and 0.2 cm there is a maximum zone for the proton emission. The energy spectra agree with those obtained in previous papers. Figures showing these results are provided
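
The converter-thickness trade-off (more conversions in a thicker converter versus proton self-absorption) can be reproduced qualitatively with a toy Monte Carlo. The neutron mean free path and proton range below are rough placeholders, not the paper's values.

```python
import random

def proton_yield(thickness_cm, n=400_000, mfp_cm=5.0, p_range_cm=0.22, seed=5):
    """Fraction of incident neutrons yielding a proton that escapes the
    exit face: the (n,p) conversion point is sampled exponentially, and the
    recoil proton escapes only if it is born within one range of the face."""
    rng = random.Random(seed)
    out = 0
    for _ in range(n):
        depth = rng.expovariate(1.0 / mfp_cm)    # conversion point
        if depth < thickness_cm and (thickness_cm - depth) < p_range_cm:
            out += 1
    return out / n

yields = {t: proton_yield(t) for t in (0.05, 0.1, 0.2, 0.5, 1.0)}
# The yield rises with thickness, then flattens and declines once the
# thickness exceeds the proton range, giving a maximum-yield zone.
```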

  5. CARMEN: a Monte Carlo planning system based on linear programming from direct apertures; CARMEN: Un sistema de planficiacion Monte Carlo basado en programacion lineal a partir de aberturas directas

    Energy Technology Data Exchange (ETDEWEB)

    Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.

    2013-07-01

    The use of Monte Carlo (MC) has shown an improvement in the accuracy of dose calculation compared with the analytical algorithms installed in commercial treatment planning systems, especially in the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, CARMEN, is based on full simulation of both the beam transport in the accelerator head and in the patient, with the simulation designed for efficient operation in terms of the accuracy of the estimate and the required computation times. (Author)

  6. Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz Ram model

    Science.gov (United States)

    Morin, Mario A.; Ficarazzo, Francesco

    2006-04-01

    Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications. These technologies include increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables such as rock mass properties, site geology, in situ fracturing and blasting parameters, and as such has no complete theoretical solution for its prediction. However, empirical models for the estimation of the size distribution of rock fragments have been developed. In this study, a blast fragmentation Monte Carlo-based simulator, based on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact rock and joint properties, the type and properties of explosives and the drilling pattern. Results produced by this simulator were quite favorable when compared with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as increase our confidence in these empirical models. This understanding will translate into improvements in blasting operations, their corresponding costs and the overall economics of open pit mines and rock quarries.
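
The simulator's idea can be sketched as follows: sample uncertain blast inputs, push each draw through the Kuznetsov and Rosin-Rammler relations, and tally a fragmentation statistic. The input ranges, the fixed uniformity index, and the simplified Kuznetsov formula (ANFO-equivalent explosive, strength term omitted) are illustrative.

```python
import math, random

def kuz_ram_p80(n=20_000, seed=11):
    """Mean 80%-passing fragment size (cm) over sampled blast inputs."""
    rng = random.Random(seed)
    p80s = []
    for _ in range(n):
        A = rng.uniform(6.0, 8.0)        # rock factor (uncertain input)
        K = rng.uniform(0.5, 0.7)        # powder factor, kg/m^3
        Q = 120.0                         # charge per hole, kg
        x50 = A * K ** -0.8 * Q ** (1.0 / 6.0)   # Kuznetsov mean size, cm
        u = 1.2                           # Rosin-Rammler uniformity index
        xc = x50 / math.log(2.0) ** (1.0 / u)    # characteristic size
        # Size at 80% passing: 1 - exp(-(x/xc)^u) = 0.8
        p80s.append(xc * (-math.log(0.2)) ** (1.0 / u))
    return sum(p80s) / n

p80 = kuz_ram_p80()
```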

  7. CAD-based Monte Carlo automatic modeling method based on primitive solid

    International Nuclear Information System (INIS)

    Wang, Dong; Song, Jing; Yu, Shengpeng; Long, Pengcheng; Wang, Yongliang

    2016-01-01

    Highlights: • We develop a method for bi-directional conversion between CAD models and primitive solids. • The method improves on an earlier conversion method between CAD models and half-spaces. • The method was tested with the ITER model, validating its correctness and efficiency. • The method was integrated into SuperMC and can build models for SuperMC and Geant4. - Abstract: The Monte Carlo method has been widely used in nuclear design and analysis, where geometries are described with primitive solids. However, it is time consuming and error prone to describe a primitive-solid geometry, especially for a complicated model. To reuse the abundant existing CAD models and to model conveniently with CAD tools, an automatic method for accurate and prompt conversion between CAD models and primitive solids is needed. An automatic modeling method for Monte Carlo geometry described by primitive solids was developed which can convert bi-directionally between CAD models and Monte Carlo geometry represented by primitive solids. When converting from a CAD model to a primitive-solid model, the CAD model is decomposed into several convex solid sets, and the corresponding primitive solids are then generated and exported. When converting from a primitive-solid model to a CAD model, the basic primitive solids are created and the related operations are performed. The method was integrated in SuperMC and was benchmarked with the ITER benchmark model. The correctness and efficiency of the method were demonstrated.
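
In miniature, "geometry described with primitive solids" means cell membership built from Boolean combinations of simple shapes, queried by the point classification every particle-tracking step needs. The cell below (a unit box minus a spherical void) is made up for illustration.

```python
import math

def in_box(p, lo, hi):
    return all(l <= c <= h for c, l, h in zip(p, lo, hi))

def in_sphere(p, centre, r):
    return math.dist(p, centre) <= r

def in_cell(p):
    """Cell = unit box minus a central spherical void of radius 0.3."""
    return in_box(p, (0, 0, 0), (1, 1, 1)) and not in_sphere(p, (0.5, 0.5, 0.5), 0.3)

inside = in_cell((0.1, 0.1, 0.1))   # inside the box, outside the void
```

The conversion problem the paper addresses is generating such primitive-solid descriptions automatically from CAD boundary representations, and back.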

  8. GPU-based high performance Monte Carlo simulation in neutron transport

    Energy Technology Data Exchange (ETDEWEB)

    Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Inteligencia Artificial Aplicada], e-mail: cmnap@ien.gov.br

    2009-07-01

    Graphics Processing Units (GPU) are high performance co-processors intended, originally, to improve the use and quality of computer graphics applications. Since researchers and practitioners realized the potential of using GPU for general purpose, their application has been extended to other fields out of computer graphics scope. The main objective of this work is to evaluate the impact of using GPU in neutron transport simulation by Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple, but time-consuming problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)

  9. GPU-based high performance Monte Carlo simulation in neutron transport

    International Nuclear Information System (INIS)

    Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A.

    2009-01-01

    Graphics Processing Units (GPU) are high performance co-processors originally intended to improve the use and quality of computer graphics applications. Since researchers and practitioners realized the potential of using GPUs for general-purpose computing, their application has been extended to fields outside the scope of computer graphics. The main objective of this work is to evaluate the impact of using GPUs in neutron transport simulation by the Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple but time-consuming problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)

  10. Density-based Monte Carlo filter and its applications in nonlinear stochastic differential equation models.

    Science.gov (United States)

    Huang, Guanghui; Wan, Jianping; Chen, Hui

    2013-02-01

    Nonlinear stochastic differential equation models with unobservable state variables are now widely used in the analysis of PK/PD data. Unobservable state variables are usually estimated with the extended Kalman filter (EKF), and the unknown pharmacokinetic parameters are usually estimated by maximum likelihood. However, the EKF is inadequate for nonlinear PK/PD models, and the maximum likelihood estimator (MLE) is known to be biased downwards. In this paper, a density-based Monte Carlo filter (DMF) is proposed to estimate the unobservable state variables, and a simulation-based M-estimator is proposed to estimate the unknown parameters, with a genetic algorithm designed to search for the optimal values of the pharmacokinetic parameters. The performances of the EKF and the DMF are compared through simulations for discrete-time and continuous-time systems respectively, and the results based on the DMF are found to be more accurate than those given by the EKF with respect to mean absolute error. Copyright © 2012 Elsevier Ltd. All rights reserved.
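
The filtering idea above can be sketched as a minimal sequential Monte Carlo (bootstrap particle) filter for a discretised one-dimensional SDE; the paper's DMF and PK/PD model specifics are not given here, so the dynamics, noise levels, and names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(obs, n_particles=500, dt=0.1, k=0.5, sig_x=0.2, sig_y=0.3):
    """Estimate the hidden state of dx = -k*x dt + sig_x dW from noisy obs."""
    x = rng.normal(1.0, 0.5, n_particles)       # initial particle cloud
    estimates = []
    for y in obs:
        # propagate particles through the Euler-discretised SDE
        x = x - k * x * dt + sig_x * np.sqrt(dt) * rng.normal(size=n_particles)
        # weight each particle by the observation density p(y | x)
        w = np.exp(-0.5 * ((y - x) / sig_y) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * x))         # weighted posterior-mean estimate
        # multinomial resampling to avoid weight degeneracy
        x = x[rng.choice(n_particles, n_particles, p=w)]
    return np.array(estimates)

# synthetic data: the true state decays from 1.0 and is observed with noise
true_x, ys = 1.0, []
for _ in range(50):
    true_x += -0.5 * true_x * 0.1 + 0.2 * np.sqrt(0.1) * rng.normal()
    ys.append(true_x + 0.3 * rng.normal())

est = particle_filter(ys)
```

Unlike the EKF, this approach makes no linearisation of the dynamics, which is the motivation the abstract gives for a density-based filter.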

  11. An intense neutron generator based on a proton accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Bartholomew, G A; Milton, J C.D.; Vogt, E W

    1964-07-01

    A study has been made of the demand for a neutron facility with a thermal flux of {>=} 10{sup 16} n cm{sup -2} sec{sup -1} and of possible methods of producing such fluxes with existing or presently developing technology. Experimental projects proposed by neutron users requiring high fluxes call for neutrons of all energies from thermal to 100 MeV with both continuous-wave and pulsed output. Consideration of the heat generated in the source per useful neutron liberated shows that the (p,xn) reaction with 400-1000 MeV bombarding energies and heavy element targets (e.g. bismuth, lead) is capable of greater specific source strength than other possible methods realizable within the time scale. A preliminary parameter optimization, carried through for the accelerator currently promising the greatest economy (the separated orbit cyclotron or S.O.C.), reveals that a facility delivering a proton beam of about 65 mA at about 1 BeV would satisfy the flux requirement with a neutron cost significantly more favourable than that projected for a high flux reactor. It is suggested that a proton storage ring providing post-acceleration pulsing of the proton beam should be developed for the facility. With this elaboration, and by taking advantage of the intrinsic microscopic pulse structure provided by the radio frequency duty cycle, a very versatile source may be devised capable of producing multiple beams of continuous and pulsed neutrons with a wide range of energies and pulse widths. The source promises to be of great value for high flux irradiations and as a pilot facility for advanced reactor technology. The proposed proton accelerator also constitutes a meson source capable of producing beams of {pi} and {mu} mesons and of neutrinos orders of magnitude more intense than those of any accelerator presently in use. These beams, which can be produced simultaneously with the neutron beams, open vast areas of new research in fundamental nuclear structure, elementary particle physics

  12. An intense neutron generator based on a proton accelerator

    International Nuclear Information System (INIS)

    Bartholomew, G.A.; Milton, J.C.D.; Vogt, E.W.

    1964-01-01

    A study has been made of the demand for a neutron facility with a thermal flux of ≥ 10^16 n cm^-2 sec^-1 and of possible methods of producing such fluxes with existing or presently developing technology. Experimental projects proposed by neutron users requiring high fluxes call for neutrons of all energies from thermal to 100 MeV with both continuous-wave and pulsed output. Consideration of the heat generated in the source per useful neutron liberated shows that the (p,xn) reaction with 400-1000 MeV bombarding energies and heavy element targets (e.g. bismuth, lead) is capable of greater specific source strength than other possible methods realizable within the time scale. A preliminary parameter optimization, carried through for the accelerator currently promising the greatest economy (the separated orbit cyclotron or S.O.C.), reveals that a facility delivering a proton beam of about 65 mA at about 1 BeV would satisfy the flux requirement with a neutron cost significantly more favourable than that projected for a high flux reactor. It is suggested that a proton storage ring providing post-acceleration pulsing of the proton beam should be developed for the facility. With this elaboration, and by taking advantage of the intrinsic microscopic pulse structure provided by the radio frequency duty cycle, a very versatile source may be devised capable of producing multiple beams of continuous and pulsed neutrons with a wide range of energies and pulse widths. The source promises to be of great value for high flux irradiations and as a pilot facility for advanced reactor technology. The proposed proton accelerator also constitutes a meson source capable of producing beams of π and μ mesons and of neutrinos orders of magnitude more intense than those of any accelerator presently in use. These beams, which can be produced simultaneously with the neutron beams, open vast areas of new research in fundamental nuclear structure, elementary particle physics, and perhaps also in

  13. The first proton sponge-based amino acids: synthesis, acid-base properties and some reactivity.

    Science.gov (United States)

    Ozeryanskii, Valery A; Gorbacheva, Anastasia Yu; Pozharskii, Alexander F; Vlasenko, Marina P; Tereznikov, Alexander Yu; Chernov'yants, Margarita S

    2015-08-21

    The first hybrid base constructed from 1,8-bis(dimethylamino)naphthalene (proton sponge or DMAN) and glycine, N-methyl-N-(8-dimethylamino-1-naphthyl)aminoacetic acid, was synthesised in high yield and its hydrobromide was structurally characterised and used to determine the acid-base properties via potentiometric titration. It was found that the basic strength of the DMAN-glycine base (pKa = 11.57, H2O) is on the level of amidine amino acids like arginine and creatine and its structure, zwitterionic vs. neutral, based on the spectroscopic (IR, NMR, mass) and theoretical (DFT) approaches has a strong preference to the zwitterionic form. Unlike glycine, the DMAN-glycine zwitterion is N-chiral and is hydrolytically cleaved with the loss of glycolic acid on heating in DMSO. This reaction together with the mild decarboxylative conversion of proton sponge-based amino acids into 2,3-dihydroperimidinium salts under air-oxygen was monitored with the help of the DMAN-alanine amino acid. The newly devised amino acids are unique as they combine fluorescence, strongly basic and redox-active properties.

  14. Monte Carlo based dosimetry and treatment planning for neutron capture therapy of brain tumors

    International Nuclear Information System (INIS)

    Zamenhof, R.G.; Brenner, J.F.; Wazer, D.E.; Madoc-Jones, H.; Clement, S.D.; Harling, O.K.; Yanch, J.C.

    1990-01-01

    Monte Carlo based dosimetry and computer-aided treatment planning for neutron capture therapy have been developed to provide the necessary link between physical dosimetric measurements performed on the MITR-II epithermal-neutron beams and the need of the radiation oncologist to synthesize large amounts of dosimetric data into a clinically meaningful treatment plan for each individual patient. Monte Carlo simulation has been employed to characterize the spatial dose distributions within a skull/brain model irradiated by an epithermal-neutron beam designed for neutron capture therapy applications. The geometry and elemental composition employed for the mathematical skull/brain model and the neutron and photon fluence-to-dose conversion formalism are presented. A treatment planning program, NCTPLAN, developed specifically for neutron capture therapy, is described. Examples are presented illustrating both one and two-dimensional dose distributions obtainable within the brain with an experimental epithermal-neutron beam, together with beam quality and treatment plan efficacy criteria which have been formulated for neutron capture therapy. The incorporation of three-dimensional computed tomographic image data into the treatment planning procedure is illustrated

  15. Levy-Lieb-Based Monte Carlo Study of the Dimensionality Behaviour of the Electronic Kinetic Functional

    Directory of Open Access Journals (Sweden)

    Seshaditya A.

    2017-06-01

    We consider a gas of interacting electrons in the limit of nearly uniform density and treat the one-dimensional (1D), two-dimensional (2D) and three-dimensional (3D) cases. We focus on the determination of the correlation part of the kinetic functional by employing a Monte Carlo sampling technique of electrons in space based on an analytic derivation via the Levy-Lieb constrained search principle. Of particular interest is the question of the behaviour of the functional as one passes from 1D to 3D; according to the basic principles of Density Functional Theory (DFT), the form of the universal functional should be independent of the dimensionality. However, in practice the straightforward use of current approximate functionals in different dimensions is problematic. Here, we show that going from the 3D to the 2D case the functional form is consistent (a concave function), but in 1D it becomes convex; such a drastic difference is peculiar to 1D electron systems, as it is for other quantities. Given the interesting behaviour of the functional, this study represents a basic first-principles approach to the problem and suggests further investigations using highly accurate (though expensive) many-electron computational techniques, such as Quantum Monte Carlo.

  16. Monte-Carlo-based uncertainty propagation with hierarchical models—a case study in dynamic torque

    Science.gov (United States)

    Klaus, Leonard; Eichstädt, Sascha

    2018-04-01

    For a dynamic calibration, a torque transducer is described by a mechanical model, and the corresponding model parameters are to be identified from measurement data. A measuring device for the primary calibration of dynamic torque, and a corresponding model-based calibration approach, have recently been developed at PTB. The complete mechanical model of the calibration set-up is very complex, and involves several calibration steps—making a straightforward implementation of a Monte Carlo uncertainty evaluation tedious. With this in mind, we here propose to separate the complete model into sub-models, with each sub-model being treated with individual experiments and analysis. The uncertainty evaluation for the overall model then has to combine the information from the sub-models in line with Supplement 2 of the Guide to the Expression of Uncertainty in Measurement. In this contribution, we demonstrate how to carry this out using the Monte Carlo method. The uncertainty evaluation involves various input quantities of different origin and the solution of a numerical optimisation problem.
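
The sub-model combination described above can be sketched as a GUM Supplement 2 style Monte Carlo propagation; the two sub-models, the overall model (a spring-mass resonance), and every parameter value below are illustrative assumptions, not PTB's actual calibration chain:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 100_000                         # number of Monte Carlo trials

# sub-model 1: torsional stiffness c from its own calibration (N*m/rad)
c = rng.normal(2.0e3, 2.0e1, M)
# sub-model 2: mass moment of inertia J of the head mass (kg*m^2)
J = rng.normal(5.0e-3, 5.0e-5, M)

# overall model combining the sub-model samples: angular resonance
# frequency of the resulting spring-mass system (rad/s)
omega0 = np.sqrt(c / J)

mean = omega0.mean()
u = omega0.std(ddof=1)                        # standard uncertainty
lo, hi = np.percentile(omega0, [2.5, 97.5])   # 95% coverage interval
```

Because each sub-model is sampled from its own posterior, correlations within a sub-model are preserved while independent sub-models can simply be sampled separately, which is the point of the hierarchical split.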

  17. Fission yield calculation using toy model based on Monte Carlo simulation

    International Nuclear Information System (INIS)

    Jubaidah; Kurniadi, Rizal

    2015-01-01

    The toy model is a new approximation for predicting fission yield distributions. The toy model treats the nucleus as an elastic toy consisting of marbles, where the number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate real nuclear properties. In this research, the toy nucleons are influenced only by a central force. A heavy toy nucleus induced by a toy nucleon splits into two fragments; these two fission fragments are called the fission yield. In this research, energy entanglement is neglected. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other. Five Gaussian parameters are used in this research: the scission point of the two curves (R c ), the mean of the left curve (μ L ) and the mean of the right curve (μ R ), and the deviation of the left curve (σ L ) and the deviation of the right curve (σ R ). The fission yield distribution is analyzed based on Monte Carlo simulation. The results show that variation in σ or µ can significantly shift the average frequency of asymmetric fission yields and also varies the range of the fission yield probability distribution. In addition, variation in the iteration coefficient only changes the frequency of fission yields. Monte Carlo simulation of the fission yield calculation using the toy model successfully indicates the same tendency as experimental results, where the average light fission yield is in the range of 90
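
Sampling fission yields from the two intersecting Gaussians described above can be sketched as follows; the mass number and the Gaussian means and widths are illustrative guesses, not the paper's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
A = 236                        # mass number of the fissioning toy nucleus
mu_L, mu_R = 96.0, 140.0       # means of the light/heavy fragment curves
sig_L = 6.0                    # width of the light-fragment curve

def sample_yields(n):
    """Return (light, heavy) fragment mass pairs; each pair conserves A."""
    light = rng.normal(mu_L, sig_L, n)
    heavy = A - light          # complementary fragment conserves the nucleons
    return light, heavy

light, heavy = sample_yields(10_000)
```

Histogramming `light` and `heavy` together reproduces the familiar double-humped asymmetric yield curve, and shifting mu or sigma moves the humps just as the abstract describes.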

  18. TREEDE, Point Fluxes and Currents Based on Track Rotation Estimator by Monte-Carlo Method

    International Nuclear Information System (INIS)

    Dubi, A.

    1985-01-01

    1 - Description of problem or function: TREEDE is a Monte Carlo transport code based on the Track Rotation estimator, used, in general, to calculate fluxes and currents at a point. This code served as a test code in the development of the concept of the Track Rotation estimator, and therefore analogue Monte Carlo is used (i.e. no importance biasing). 2 - Method of solution: The basic idea is to follow the particle's track in the medium and then to rotate it such that it passes through the detector point. That is, rotational symmetry considerations (even in non-spherically symmetric configurations) are applied to every history, so that a very large fraction of the track histories can be rotated and made to pass through the point of interest; in this manner the 1/r^2 singularity in the un-collided flux estimator (next event estimator) is avoided. TREEDE, being a test code, is used to estimate leakage or in-medium fluxes at given points in a 3-dimensional finite box, where the source is an isotropic point source at the centre of the z = 0 surface. However, many of the constraints of geometry and source can be easily removed. The medium is assumed homogeneous with isotropic scattering, and one energy group only is considered. 3 - Restrictions on the complexity of the problem: One energy group, a homogeneous medium, isotropic scattering

  19. Fission yield calculation using toy model based on Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Jubaidah, E-mail: jubaidah@student.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia); Physics Department, Faculty of Mathematics and Natural Science – State University of Medan. Jl. Willem Iskandar Pasar V Medan Estate – North Sumatera, Indonesia 20221 (Indonesia); Kurniadi, Rizal, E-mail: rijalk@fi.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia)

    2015-09-30

    The toy model is a new approximation for predicting fission yield distributions. The toy model treats the nucleus as an elastic toy consisting of marbles, where the number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate real nuclear properties. In this research, the toy nucleons are influenced only by a central force. A heavy toy nucleus induced by a toy nucleon splits into two fragments; these two fission fragments are called the fission yield. In this research, energy entanglement is neglected. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other. Five Gaussian parameters are used in this research: the scission point of the two curves (R{sub c}), the mean of the left curve (μ{sub L}) and the mean of the right curve (μ{sub R}), and the deviation of the left curve (σ{sub L}) and the deviation of the right curve (σ{sub R}). The fission yield distribution is analyzed based on Monte Carlo simulation. The results show that variation in σ or µ can significantly shift the average frequency of asymmetric fission yields and also varies the range of the fission yield probability distribution. In addition, variation in the iteration coefficient only changes the frequency of fission yields. Monte Carlo simulation of the fission yield calculation using the toy model successfully indicates the same tendency as experimental results, where the average light fission yield is in the range of 90

  20. Development of a shield based on Monte-Carlo studies for the COBRA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Heidrich, Nadine [Institut fuer Experimentalphysik, 22761 Hamburg (Germany); Collaboration: COBRA-Collaboration

    2013-07-01

    COBRA is a next-generation experiment searching for neutrinoless double beta decay using CdZnTe semiconductor detectors. The main focus is on {sup 116}Cd, with a decay energy of 2813.5 keV well above the highest naturally occurring gamma lines. The concept for a large scale set-up consists of an array of CdZnTe detectors with a total mass of 420 kg enriched in {sup 116}Cd up to 90 %. With a background rate in the order of 10{sup -3} counts/keV/kg/year, the experiment would be sensitive to a half-life larger than 10{sup 26} years, corresponding to a Majorana mass term m{sub ββ} smaller than 50 meV. To achieve the background level, an appropriate shield is necessary. The shield is developed based on Monte-Carlo simulations. For that, different materials and configurations are tested. In the talk the current status of the Monte-Carlo survey is presented and discussed.

  1. Monte Carlo closure for moment-based transport schemes in general relativistic radiation hydrodynamic simulations

    Science.gov (United States)

    Foucart, Francois

    2018-04-01

    General relativistic radiation hydrodynamic simulations are necessary to accurately model a number of astrophysical systems involving black holes and neutron stars. Photon transport plays a crucial role in radiatively dominated accretion discs, while neutrino transport is critical to core-collapse supernovae and to the modelling of electromagnetic transients and nucleosynthesis in neutron star mergers. However, evolving the full Boltzmann equations of radiative transport is extremely expensive. Here, we describe the implementation in the general relativistic SPEC code of a cheaper radiation hydrodynamic method that theoretically converges to a solution of Boltzmann's equation in the limit of infinite numerical resources. The algorithm is based on a grey two-moment scheme, in which we evolve the energy density and momentum density of the radiation. Two-moment schemes require a closure that fills in missing information about the energy spectrum and higher order moments of the radiation. Instead of the approximate analytical closure currently used in core-collapse and merger simulations, we complement the two-moment scheme with a low-accuracy Monte Carlo evolution. The Monte Carlo results can provide any or all of the missing information in the evolution of the moments, as desired by the user. As a first test of our methods, we study a set of idealized problems demonstrating that our algorithm performs significantly better than existing analytical closures. We also discuss the current limitations of our method, in particular open questions regarding the stability of the fully coupled scheme.

  2. Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Xiaoyao [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Hall, Randall W. [Department of Natural Sciences and Mathematics, Dominican University of California, San Rafael, California 94901 (United States); Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Löffler, Frank [Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Kowalski, Karol [William R. Wiley Environmental Molecular Sciences Laboratory, Battelle, Pacific Northwest National Laboratory, Richland, Washington 99352 (United States); Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)

    2016-01-07

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H{sub 2}O, N{sub 2}, and F{sub 2} molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.

  3. Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Xiaoyao [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803, USA; Hall, Randall W. [Department of Natural Sciences and Mathematics, Dominican University of California, San Rafael, California 94901, USA; Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803, USA; Löffler, Frank [Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803, USA; Kowalski, Karol [William R. Wiley Environmental Molecular Sciences Laboratory, Battelle, Pacific Northwest National Laboratory, Richland, Washington 99352, USA; Bhaskaran-Nair, Kiran [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803, USA; Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803, USA; Jarrell, Mark [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803, USA; Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803, USA; Moreno, Juana [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803, USA; Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803, USA

    2016-01-07

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.

  4. Integrated layout based Monte-Carlo simulation for design arc optimization

    Science.gov (United States)

    Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James

    2016-03-01

    Design rules are created considering a wafer fail mechanism with the relevant design levels under various design cases, and the values are set to cover the worst scenario. Because of this simplification and generalization, design rules hinder, rather than help, dense device scaling. As an example, SRAM designs always need extensive ground-rule waivers. Furthermore, dense design also often involves a "design arc", a collection of design rules the sum of which equals the critical pitch defined by the technology. In a design arc, a single rule change can lead to a chain reaction of other rule violations. In this talk we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with multiple integrated ground-rule checks. We apply this methodology to the SRAM word-line contact, and the result is a layout that has balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Div, Semiconductor Research and Development Center, Hopewell Junction, NY 12533

  5. Comparison of nonstationary generalized logistic models based on Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    S. Kim

    2015-06-01

    Recently, evidence of climate change has been observed in hydrologic data such as rainfall and flow data. The time-dependent characteristics of statistics in hydrologic data are widely defined as nonstationarity. Accordingly, various nonstationary GEV and generalized Pareto models have been suggested for frequency analysis of nonstationary annual maximum and POT (peak-over-threshold) data, respectively. However, alternative models are required for nonstationary frequency analysis to capture the complex characteristics of nonstationary data under climate change. This study proposes a nonstationary generalized logistic model including time-dependent parameters. The parameters of the proposed model are estimated using the method of maximum likelihood based on the Newton-Raphson method. In addition, the proposed model is compared with existing models through Monte Carlo simulation to investigate the characteristics and applicability of the models.
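
Fitting a time-dependent location parameter by maximum likelihood can be sketched as below. For brevity the shape parameter is set to zero (the logistic special case of the generalized logistic), a general-purpose optimiser stands in for the paper's Newton-Raphson scheme, and all data are synthetic; the trend form xi(t) = xi0 + xi1*t is one common nonstationary choice, not necessarily the authors':

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import logistic

rng = np.random.default_rng(3)
t = np.arange(50.0)                            # years since start of record
# synthetic annual maxima with a linear trend in the location parameter
x = logistic.rvs(loc=100.0 + 0.8 * t, scale=10.0, size=t.size,
                 random_state=rng)

def nll(params):
    """Negative log-likelihood with a linear trend in the location."""
    xi0, xi1, log_alpha = params
    alpha = np.exp(log_alpha)                  # keeps the scale positive
    return -logistic.logpdf(x, loc=xi0 + xi1 * t, scale=alpha).sum()

b1, b0 = np.polyfit(t, x, 1)                   # crude trend as starting point
fit = minimize(nll, x0=[b0, b1, np.log(x.std())], method="Nelder-Mead")
xi0_hat, xi1_hat, alpha_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
```

The recovered trend `xi1_hat` should be close to the 0.8 units/year used to generate the data, which is the kind of check a Monte Carlo comparison study performs many times over.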

  6. Monte Carlo dose calculation using a cell processor based PlayStation 3 system

    International Nuclear Information System (INIS)

    Chow, James C L; Lam, Phil; Jaffray, David A

    2012-01-01

    This study investigates the performance of the EGSnrc computer code coupled with Cell-based hardware in Monte Carlo simulation of radiation dose in radiotherapy. Performance evaluations of two processor-intensive functions, namely HOWNEAR and RANMAR_GET in the EGSnrc code, were carried out based on the 20-80 rule (Pareto principle). The execution speeds of the two functions were measured with the profiler gprof, recording the number of executions and the total time spent in each function. A testing architecture designed for the Cell processor was implemented in the evaluation using a PlayStation3 (PS3) system. The evaluation results show that the algorithms examined are readily parallelizable on the Cell platform, provided that an architectural change to EGSnrc is made. However, as EGSnrc performance was limited by the PowerPC Processing Element in the PS3, a PC coupled with graphics processing units (GPGPU) may provide a more viable avenue for acceleration.

  7. Monte Carlo dose calculation using a cell processor based PlayStation 3 system

    Science.gov (United States)

    Chow, James C. L.; Lam, Phil; Jaffray, David A.

    2012-02-01

    This study investigates the performance of the EGSnrc computer code coupled with Cell-based hardware in Monte Carlo simulation of radiation dose in radiotherapy. Performance evaluations of two processor-intensive functions, namely HOWNEAR and RANMAR_GET in the EGSnrc code, were carried out based on the 20-80 rule (Pareto principle). The execution speeds of the two functions were measured with the profiler gprof, recording the number of executions and the total time spent in each function. A testing architecture designed for the Cell processor was implemented in the evaluation using a PlayStation3 (PS3) system. The evaluation results show that the algorithms examined are readily parallelizable on the Cell platform, provided that an architectural change to EGSnrc is made. However, as EGSnrc performance was limited by the PowerPC Processing Element in the PS3, a PC coupled with graphics processing units (GPGPU) may provide a more viable avenue for acceleration.

  8. Prediction of betavoltaic battery output parameters based on SEM measurements and Monte Carlo simulation

    International Nuclear Information System (INIS)

    Yakimov, Eugene B.

    2016-01-01

    An approach for predicting the output parameters of a 63Ni-based betavoltaic battery is described. It consists of multilayer Monte Carlo simulation to obtain the depth dependence of the excess carrier generation rate inside the semiconductor converter, a determination of the collection probability based on electron beam induced current (EBIC) measurements, a calculation of the current induced in the semiconductor converter by beta radiation, and SEM measurements of output parameters using the calculated induced current value. This approach allows the betavoltaic battery parameters to be predicted and the converter design to be optimized for any real semiconductor structure and any thickness and specific activity of the beta-radiation source. - Highlights: • A new procedure for predicting betavoltaic battery output parameters is described. • The depth dependence of beta-particle energy deposition for Si and SiC is calculated. • Electron trajectories are assumed isotropic and uniformly started in the simulation.
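
The induced-current step above can be sketched as a depth integral of the beta-generated carrier rate times the depth-dependent collection probability; the exponential profiles and parameter values below are illustrative stand-ins for the Monte Carlo generation curve and the EBIC-derived collection probability, not the paper's data:

```python
import numpy as np

Q = 1.602e-19                           # elementary charge (C)
z = np.linspace(0.0, 5.0e-6, 500)       # depth into the converter (m)
gen = 1.0e18 * np.exp(-z / 1.5e-6)      # e-h pair generation rate (1/(m*s))
collect = np.exp(-z / 3.0e-6)           # collection probability vs depth

# induced (short-circuit) current: charge times the depth integral of
# generation rate weighted by collection probability, via trapezoid rule
f = gen * collect
i_sc = Q * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(z))   # amperes
```

With these made-up profiles the integral lands in the sub-microampere range typical of betavoltaic converters, and substituting simulated generation and measured collection curves turns the same two lines into the paper's prediction step.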

  9. Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems

    International Nuclear Information System (INIS)

    Ma, Xiaoyao; Hall, Randall W.; Löffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana

    2016-01-01

    The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem

  10. SU-C-207A-04: Accuracy of Acoustic-Based Proton Range Verification in Water

    International Nuclear Information System (INIS)

    Jones, KC; Sehgal, CM; Avery, S; Vander Stappen, F

    2016-01-01

    Purpose: To determine the accuracy and dose required for acoustic-based proton range verification (protoacoustics) in water. Methods: Proton pulses with 17 µs FWHM and instantaneous currents of 480 nA (5.6 × 10^7 protons/pulse, 8.9 cGy/pulse) were generated by a clinical, hospital-based cyclotron at the University of Pennsylvania. The protoacoustic signal generated in a water phantom by the 190 MeV proton pulses was measured with a hydrophone placed at multiple known positions surrounding the dose deposition. The background random noise was measured. The protoacoustic signal was simulated to compare to the experiments. Results: The maximum protoacoustic signal amplitude at 5 cm distance was 5.2 mPa per 1 × 10^7 protons (1.6 cGy at the Bragg peak). The background random noise of the measurement was 27 mPa. Comparison between simulation and experiment indicates that the hydrophone introduced a delay of 2.4 µs. For acoustic data collected with a signal-to-noise ratio (SNR) of 21, deconvolution of the protoacoustic signal with the proton pulse provided the most precise time-of-flight range measurement (standard deviation of 2.0 mm), but a systematic error (−4.5 mm) was observed. Conclusion: Based on water phantom measurements at a clinical hospital-based cyclotron, protoacoustics is a potential technique for measuring the proton Bragg peak range with 2.0 mm standard deviation. Simultaneous use of multiple detectors is expected to reduce the standard deviation, but calibration is required to remove systematic error. Based on the measured background noise and protoacoustic amplitude, an SNR of 5.3 is projected for a deposited dose of 2 Gy.
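
The time-of-flight conversion at the heart of this measurement can be sketched in a few lines: the hydrophone-to-Bragg-peak distance is the acoustic arrival time, corrected for the 2.4 µs hydrophone delay reported above, multiplied by the speed of sound in water. The example arrival time below is illustrative, not measured data:

```python
# Assumed constants: the sound speed is a textbook value for water near
# room temperature; the delay is the systematic offset quoted above.
V_SOUND = 1482.0                 # m/s, speed of sound in water (~20 C)
HYDROPHONE_DELAY = 2.4e-6        # s, systematic detector delay

def bragg_peak_distance(arrival_time_s):
    """Distance (m) from the hydrophone to the Bragg peak."""
    return V_SOUND * (arrival_time_s - HYDROPHONE_DELAY)

# e.g. a protoacoustic wave arriving 36.1 us after the proton pulse
d = bragg_peak_distance(36.1e-6)  # roughly 0.05 m, i.e. about 5 cm
```

Because range accuracy scales directly with timing accuracy (1 µs of uncorrected delay is about 1.5 mm of water), the calibration requirement the conclusion mentions follows immediately from this relation.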

  11. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    Science.gov (United States)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy to fully utilize the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing the normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge collimator) to image prompt gamma (PG) emitted along the proton tracks in the patient have been proposed for range verification. The aim of the work is to compare the performance of the multi-slit collimator and the knife-edge collimator for non-invasive proton beam range verification. PG imaging was simulated by a validated GATE/GEANT4 Monte Carlo code to model the spot-scanning proton therapy and a cylindrical PMMA phantom in detail. For each spot, 10⁸ protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff regions of the PG profiles were fitted with a 3-line-segment curve function as the range estimate. Factors including the energy window setting, proton energy, phantom size, and phantom shift that may influence the accuracy of detecting the range were studied. Results indicated that both collimator systems achieve reasonable accuracy and a good response to the phantom shift. The accuracy of the range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system can achieve a higher detection efficiency that leads to a smaller deviation in predicting the range. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of the two systems, especially the multi-slit system. Therefore, a neutron reduction technique for improving the accuracy of range verification in proton therapy is needed.
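
    The falloff fit described above can be sketched as a brute-force least-squares search for a 3-line-segment curve (plateau, linear falloff, background). The profile below is synthetic, not the GATE/GEANT4 data of the study, and the breakpoint search is a minimal stand-in for whatever fitting routine the authors used.

```python
import numpy as np

# Minimal 3-line-segment fit of a prompt-gamma depth profile: flat plateau,
# linear falloff, flat background.  The two breakpoints are found by
# exhaustive least-squares search; the falloff region is the range estimate.

def fit_three_segments(depth, counts):
    """Return (plateau_end, falloff_end) minimizing the squared residual."""
    best = (None, None, np.inf)
    n = len(depth)
    for i in range(1, n - 3):          # plateau/falloff breakpoint
        for j in range(i + 2, n - 1):  # falloff/background breakpoint
            plateau = counts[:i].mean()
            background = counts[j:].mean()
            # linear ramp between the two flat levels
            ramp = np.interp(depth[i:j], [depth[i], depth[j - 1]],
                             [plateau, background])
            resid = (np.concatenate([counts[:i] - plateau,
                                     counts[i:j] - ramp,
                                     counts[j:] - background]) ** 2).sum()
            if resid < best[2]:
                best = (depth[i], depth[j - 1], resid)
    return best[0], best[1]

# Synthetic profile: plateau to 120 mm, falloff to 150 mm, then background.
depth = np.arange(0.0, 200.0, 2.0)
counts = np.interp(depth, [0, 120, 150, 200], [100, 100, 10, 10])
p_end, f_end = fit_three_segments(depth, counts)
```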

  12. Modelling of scintillator based flat-panel detectors with Monte-Carlo simulations

    International Nuclear Information System (INIS)

    Reims, N; Sukowski, F; Uhlmann, N

    2011-01-01

    Scintillator based flat panel detectors are state of the art in the field of industrial X-ray imaging applications. Choosing the proper system and setup parameters for the vast range of different applications can be a time consuming task, especially when developing new detector systems. Since the system behaviour cannot always be foreseen easily, Monte-Carlo (MC) simulations are key to gaining further knowledge of system components and their behaviour under different imaging conditions. In this work we used two Monte-Carlo based models to examine an indirect converting flat panel detector, specifically the Hamamatsu C9312SK. We focused on the signal generation in the scintillation layer and its influence on the spatial resolution of the whole system. The models differ significantly in their level of complexity. The first model gives a global description of the detector based on different parameters characterizing the spatial resolution. With relatively small effort a simulation model can be developed which matches the real detector's signal transfer behaviour. The second model allows a more detailed insight into the system. It is based on the well established cascade theory, i.e. describing the detector as a cascade of elementary gain and scattering stages, which represent the built-in components and their signal transfer behaviour. In comparison to the first model, the influence of single components, especially the important light spread behaviour in the scintillator, can be analysed in a more differentiated way. Although the implementation of the second model is more time consuming, both models have in common that a relatively small number of manufacturer-supplied system parameters is needed. The results of both models were in good agreement with the measured parameters of the real system.
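
    The cascaded-systems idea behind the second model can be sketched as stage-by-stage propagation of the mean and variance of the quanta (the variance update below is the standard Burgess form for an amplification stage). The stage gains and variances are illustrative, not Hamamatsu C9312SK data.

```python
# Cascade-theory sketch: a detector as a chain of gain stages.  For each
# stage with mean gain g and gain variance var_g, the quanta statistics
# propagate as  mean_out = g * mean_in  and
# var_out = g^2 * var_in + var_g * mean_in  (Burgess variance theorem).

def propagate(stages, mean_in, var_in):
    """Propagate (mean, variance) of quanta through (gain, gain_var) stages."""
    m, v = mean_in, var_in
    for gain, gain_var in stages:
        v = gain ** 2 * v + gain_var * m   # update variance before the mean
        m = gain * m
    return m, v

# X-ray quanta -> optical photons -> optical coupling -> detected photons.
# Binomial selection stages use gain_var = g * (1 - g).  Poisson input of
# 1000 quanta per pixel (variance = mean).
stages = [(150.0, 400.0), (0.6, 0.24), (0.8, 0.16)]
mean, var = propagate(stages, 1000.0, 1000.0)
```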

  13. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    International Nuclear Information System (INIS)

    Lemaréchal, Yannick; Bert, Julien; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris; Falconnet, Claire; Després, Philippe; Valeri, Antoine

    2015-01-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled ¹²⁵I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10⁶ simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications. (paper)
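
    The track-length-estimator (TLE) scoring mentioned above can be sketched in one dimension: every voxel a photon track crosses is credited with energy × coefficient × track length, which is why TLE converges much faster than tallying only discrete interaction sites. The geometry and the attenuation-style coefficients below are made up for illustration, not GGEMS data.

```python
import numpy as np

# 1D track-length-estimator sketch: deposit energy * mu * overlap-length
# in every voxel crossed by a photon track from x0 to x1.

def tle_score(x0, x1, voxel_edges, mu_en, energy):
    """Energy scored per voxel for one straight track segment."""
    dose = np.zeros(len(mu_en))
    for i in range(len(mu_en)):
        lo, hi = voxel_edges[i], voxel_edges[i + 1]
        overlap = max(0.0, min(x1, hi) - max(x0, lo))  # track length in voxel i
        dose[i] = energy * mu_en[i] * overlap
    return dose

edges = np.arange(0.0, 6.0)                    # five 1 cm voxels
mu = np.array([0.03, 0.03, 0.2, 0.2, 0.03])    # cm^-1, illustrative contrast
dose = tle_score(0.5, 4.5, edges, mu, energy=0.0355)  # one ~35.5 keV photon (MeV)
```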

  14. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    Science.gov (United States)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled ¹²⁵I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10⁶ simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.

  15. Proton exchange between oxymethyl radical and acids and bases: semiempirical quantum-chemical study

    Directory of Open Access Journals (Sweden)

    Irina Pustolaikina

    2016-12-01

    Full Text Available Reactions involving proton transfer are widespread in analytical, technological, and biological chemistry. Quantum-chemical study of exchange processes in hydrogen-bonded complexes allows progress in understanding the elementary mechanism of proton transfer along hydrogen-bond chains and the nature of acid-base interactions. The oxymethyl radical •CH2OH is small and convenient as a model particle that reproduces the protolytic properties of structurally more complex paramagnetic acids. Quantum-chemical modeling of the proton exchange reaction of the oxymethyl radical •CH2OH and its diamagnetic analog CH3OH with amines, carboxylic acids, and water was carried out using the UAM1 method with the Gaussian-2009 program. The QST2 method was used to locate transition states, and the IRC procedure was applied to follow descents along the reaction coordinate. The difference in structure between the transition states of •CH2OH/CH3OH with bases and with acids has been shown: with bases, a consecutive proton exchange mechanism was found, whereas in complexes with carboxylic acids a parallel proton exchange mechanism was found. Paramagnetic and diamagnetic systems were found to behave similarly in proton exchange. It is suggested that the mechanism of the proton exchange reaction is determined by the structure of the cyclic hydrogen-bonded complex, which in turn depends on the nature of the acid-base interaction partners.

  16. WE-H-BRA-08: A Monte Carlo Cell Nucleus Model for Assessing Cell Survival Probability Based On Particle Track Structure Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B [Northwestern Memorial Hospital, Chicago, IL (United States); Georgia Institute of Technology, Atlanta, GA (Georgia); Wang, C [Georgia Institute of Technology, Atlanta, GA (Georgia)

    2016-06-15

    Purpose: To correlate the damage produced by particles of different types and qualities to cell survival on the basis of nanodosimetric analysis and advanced DNA structures in the cell nucleus. Methods: A Monte Carlo code was developed to simulate subnuclear DNA chromatin fibers (CFs) of 30 nm utilizing a mean-free-path approach common to radiation transport. The cell nucleus was modeled as a spherical region containing 6000 chromatin-dense domains (CDs) of 400 nm diameter, with additional CFs modeled in a sparser interchromatin region. The Geant4-DNA code was utilized to produce a particle track database representing various particles at different energies and dose quantities. These tracks were used to stochastically position the DNA structures based on their mean free path to interaction with CFs. Excitation and ionization events intersecting CFs were analyzed using the DBSCAN clustering algorithm for assessment of the likelihood of producing DSBs. Simulated DSBs were then assessed based on their proximity to one another for a probability of inducing cell death. Results: Variations in energy deposition to chromatin fibers match expectations based on differences in particle track structure. The quality of damage to CFs for different particle types indicates more severe damage by high-LET radiation than by low-LET radiation of the same particle species. In addition, the model indicates more severe damage by protons than by alpha particles of the same LET, which is consistent with differences in their track structure. Cell survival curves have been produced showing the L-Q behavior of sparsely ionizing radiation. Conclusion: Initial results indicate the feasibility of producing cell survival curves based on the Monte Carlo cell nucleus method. Accurate correlation between simulated DNA damage to cell survival on the basis of nanodosimetric analysis can provide insight into the biological responses to various radiation types. Current efforts are directed at producing cell
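
    The clustering step described above can be sketched for events along one fiber. This is a simplified run-based 1D variant of DBSCAN, not the Geant4-DNA analysis chain, and the eps/min_samples values are illustrative only.

```python
import numpy as np

# Group 1D ionization-event positions (nm) into dense clusters, candidate
# double-strand break (DSB) sites.  A maximal run of events whose adjacent
# gaps are all <= eps forms a cluster if it holds >= min_samples events;
# everything else is labeled -1 (noise).

def dbscan_1d(positions, eps=3.2, min_samples=3):
    """Return sorted positions and per-event cluster labels (-1 = noise)."""
    positions = np.sort(np.asarray(positions, dtype=float))
    labels = np.full(len(positions), -1)
    cluster = -1
    run_start = 0
    for i in range(1, len(positions) + 1):
        if i == len(positions) or positions[i] - positions[i - 1] > eps:
            if i - run_start >= min_samples:   # dense run -> new cluster
                cluster += 1
                labels[run_start:i] = cluster
            run_start = i
    return positions, labels

# Two dense event clusters (potential DSB sites) plus isolated noise events.
events = [1.0, 2.0, 3.5, 50.0, 80.0, 81.5, 82.0, 83.0, 140.0]
pos, labels = dbscan_1d(events)
```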

  17. Isotope effects for base-promoted, gas-phase proton transfer reactions

    International Nuclear Information System (INIS)

    Grabowski, J.J.; Cheng, Xueheng

    1991-01-01

    Proton transfer reactions are among the most basic, the most common and the most important of chemical transformations; despite their apparent simplicity, much is unknown about this most fundamental of all chemical processes. Active interest in understanding the underlying principles of organic proton transfer reactions continues because of efforts being made to develop the theory of elementary chemical processes, because of the resurgence of interest in mechanistic organic chemistry, and because of the dynamic role played by proton transfers in biochemical transformations. As organic chemists, the authors have used the flowing afterglow technique to gain an appreciation of the fundamental issues involved in reaction mechanisms by examining such processes in a solvent-free environment under thermally-equilibrated (300 K) conditions. Recent characterization of the facile production of both acetate and the monoenolate anion from the interaction of hydroxide or fluoride with acetic acid reinforces the idea that much yet must be learned about proton transfers/proton abstractions in general. Earlier work by Riveros and co-workers on competitive H vs D abstraction from α-d₁-toluenes and by Noest and Nibbering on competitive H vs D abstraction from α,α,α-d₃-acetone, in combination with the acetic acid results, challenged the authors to assemble a comprehensive picture of the competitive nature of proton transfer reactions for anionic base-promoted processes

  18. Proton radiography to improve proton therapy treatment

    NARCIS (Netherlands)

    Takatsu, J.; van der Graaf, E. R.; van Goethem, Marc-Jan; van Beuzekom, M.; Klaver, T.; Visser, Jan; Brandenburg, S.; Biegun, A. K.

    The quality of cancer treatment with protons critically depends on an accurate prediction of the proton stopping powers for the tissues traversed by the protons. Today, treatment planning in proton radiotherapy is based on stopping power calculations from densities of X-ray Computed Tomography (CT)

  19. PELE:  Protein Energy Landscape Exploration. A Novel Monte Carlo Based Technique.

    Science.gov (United States)

    Borrelli, Kenneth W; Vitalis, Andreas; Alcantara, Raul; Guallar, Victor

    2005-11-01

    Combining protein structure prediction algorithms and Metropolis Monte Carlo techniques, we provide a novel method to explore all-atom energy landscapes. The core of the technique is based on a steered localized perturbation followed by side-chain sampling as well as minimization cycles. The algorithm and its application to ligand diffusion are presented here. Ligand exit pathways are successfully modeled for different systems containing ligands of various sizes: carbon monoxide in myoglobin, camphor in cytochrome P450cam, and palmitic acid in the intestinal fatty-acid-binding protein. These initial applications reveal the potential of this new technique in mapping millisecond-time-scale processes. The computational cost associated with the exploration is significantly less than that of conventional MD simulations.
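
    The Metropolis acceptance rule at the heart of the outer loop sketched above (perturb, sample, minimize, accept/reject) can be illustrated in isolation. Only the accept/reject step is shown; the energies come from a toy 1D potential, not an all-atom force field.

```python
import math
import random

# Minimal Metropolis step: propose a local perturbation, accept downhill
# moves always and uphill moves with Boltzmann probability exp(-dE/kT).

def metropolis_step(x, energy, step=0.5, kT=1.0, rng=random):
    x_new = x + rng.uniform(-step, step)
    dE = energy(x_new) - energy(x)
    if dE <= 0 or rng.random() < math.exp(-dE / kT):
        return x_new        # accepted
    return x                # rejected: keep the old configuration

energy = lambda x: (x * x - 1.0) ** 2   # toy double-well potential

rng = random.Random(7)
x = 1.5
for _ in range(1000):
    x = metropolis_step(x, energy, rng=rng)
# x now samples the Boltzmann distribution over the two wells
```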

  20. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Y.; Song, J.; Zheng, H.; Sun, G.; Hao, L.; Long, P.; Hu, L.

    2013-01-01

    SuperMC is a Computer-Aided-Design (CAD) based Monte Carlo (MC) program for integrated simulation of nuclear systems developed by FDS Team (China), making use of a hybrid MC-deterministic method and advanced computer technologies. The design aim, architecture and main methodology of SuperMC are presented in this paper. Accounting for multi-physics processes and using advanced computer technologies such as automatic geometry modeling, intelligent data analysis and visualization, high performance parallel computing and cloud computing contribute to the efficiency of the code. SuperMC2.1, the latest version of the code for neutron, photon and coupled neutron and photon transport calculation, has been developed and validated by using a series of benchmarking cases such as the fusion reactor ITER model and the fast reactor BN-600 model

  1. Monte Carlo based statistical power analysis for mediation models: methods and software.

    Science.gov (United States)

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
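
    The approach described above can be sketched end to end: simulate data from a simple mediation model, test the indirect effect a·b with a percentile-bootstrap confidence interval, and take the detection rate as the power estimate. This is a hedged sketch, not the bmem package; effect sizes, sample size, and replication counts are illustrative, and simple (rather than partial) regressions are used for brevity.

```python
import numpy as np

# Monte Carlo power estimate for the indirect effect a*b in X -> M -> Y:
# a study "detects" mediation when the 95% percentile-bootstrap CI for
# a*b excludes zero; power = detection fraction over simulated studies.

def bootstrap_power(a=0.4, b=0.4, n=100, n_sim=200, n_boot=200, seed=0):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        ab = np.empty(n_boot)
        for k in range(n_boot):
            idx = rng.integers(0, n, size=n)
            xb, mb, yb = x[idx], m[idx], y[idx]
            a_hat = np.cov(xb, mb)[0, 1] / xb.var(ddof=1)  # slope of M on X
            b_hat = np.cov(mb, yb)[0, 1] / mb.var(ddof=1)  # slope of Y on M
            ab[k] = a_hat * b_hat
        lo, hi = np.percentile(ab, [2.5, 97.5])
        hits += (lo > 0) or (hi < 0)
    return hits / n_sim

power = bootstrap_power()   # fraction of simulated studies detecting a*b
```

    Because the bootstrap makes no normality assumption, the same loop works unchanged when the `rng.normal` draws are replaced by skewed or heavy-tailed error distributions, which is the scenario the study emphasizes.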

  2. An Application of Monte-Carlo-Based Sensitivity Analysis on the Overlap in Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    S. Razmyan

    2012-01-01

    Full Text Available Discriminant analysis (DA) is used to estimate a discriminant function by minimizing group misclassifications when predicting the group membership of newly sampled data. A major source of misclassification in DA is the overlap between groups. The uncertainty in the input variables and model parameters needs to be properly characterized in decision making. This study combines DEA-DA with a sensitivity analysis approach to assess the influence of banks’ variables on the overall variance of the overlap in DA and to determine which variables are most significant. A Monte-Carlo-based sensitivity analysis is used to compute the set of first-order sensitivity indices of the variables and thereby estimate the contribution of each uncertain variable. The results show that the uncertainties in the loans granted and in the different deposit variables are more significant for decision making than the uncertainties in the other banks’ variables.
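
    A Monte Carlo first-order sensitivity (Sobol) index, the quantity this study uses to rank uncertain inputs, can be sketched with the standard pick-freeze estimator. The linear test model below stands in for the DEA-DA overlap measure; its analytic indices (variance shares 16:4:1) make the sketch checkable.

```python
import numpy as np

# Pick-freeze estimator of first-order Sobol indices:
# S_i = Cov(Y_A, Y_Ci) / Var(Y), where C_i equals matrix B with column i
# copied from matrix A (input i "frozen", all others resampled).

def first_order_indices(model, n_inputs, n_samples=200_000, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(n_samples, n_inputs))
    B = rng.normal(size=(n_samples, n_inputs))
    yA = model(A)
    var = yA.var()
    S = []
    for i in range(n_inputs):
        C = B.copy()
        C[:, i] = A[:, i]              # freeze input i, resample the rest
        yC = model(C)
        S.append(np.mean(yA * yC) - yA.mean() * yC.mean())
    return np.array(S) / var

# Linear model with variance contributions 16:4:1 -> S ~ (0.762, 0.190, 0.048).
model = lambda x: 4 * x[:, 0] + 2 * x[:, 1] + x[:, 2]
S = first_order_indices(model, 3)
```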

  3. A Monte Carlo-based treatment-planning tool for ion beam therapy

    CERN Document Server

    Böhlen, T T; Dosanjh, M; Ferrari, A; Haberer, T; Parodi, K; Patera, V; Mairan, A

    2013-01-01

    Ion beam therapy, as an emerging radiation therapy modality, requires continuous efforts to develop and improve tools for patient treatment planning (TP) and research applications. Dose and fluence computation algorithms using the Monte Carlo (MC) technique have served for decades as reference tools for accurate dose computations for radiotherapy. In this work, a novel MC-based treatment-planning (MCTP) tool for ion beam therapy using the pencil beam scanning technique is presented. It allows single-field and simultaneous multiple-fields optimization for realistic patient treatment conditions and for dosimetric quality assurance for irradiation conditions at state-of-the-art ion beam therapy facilities. It employs iterative procedures that allow for the optimization of absorbed dose and relative biological effectiveness (RBE)-weighted dose using radiobiological input tables generated by external RBE models. Using a re-implementation of the local effect model (LEM), the MCTP tool is able to perform TP studies u...

  4. Noninvasive spectral imaging of skin chromophores based on multiple regression analysis aided by Monte Carlo simulation

    Science.gov (United States)

    Nishidate, Izumi; Wiswadarma, Aditya; Hase, Yota; Tanaka, Noriyuki; Maeda, Takaaki; Niizeki, Kyuichi; Aizu, Yoshihisa

    2011-08-01

    In order to visualize melanin and blood concentrations and oxygen saturation in human skin tissue, a simple imaging technique based on multispectral diffuse reflectance images acquired at six wavelengths (500, 520, 540, 560, 580 and 600 nm) was developed. The technique utilizes multiple regression analysis aided by Monte Carlo simulation for diffuse reflectance spectra. Using the absorbance spectrum as a response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as predictor variables, multiple regression analysis provides regression coefficients. Concentrations of melanin and total blood are then determined from the regression coefficients using conversion vectors that are deduced numerically in advance, while oxygen saturation is obtained directly from the regression coefficients. Experiments with a tissue-like agar gel phantom validated the method. In vivo experiments with human skin of the human hand during upper limb occlusion and of the inner forearm exposed to UV irradiation demonstrated the ability of the method to evaluate physiological reactions of human skin tissue.
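
    The regression step described above can be sketched as an ordinary least-squares fit of the absorbance spectrum on the chromophore extinction spectra. The six wavelengths match the record; the extinction values below are made-up placeholders, not literature data, and the offset term is a crude stand-in for the scattering baseline.

```python
import numpy as np

# At each pixel, regress absorbance A(lambda) on the extinction spectra of
# melanin, HbO2 and Hb; the regression coefficients track chromophore
# contents.  Extinction values here are illustrative placeholders.

wavelengths = [500, 520, 540, 560, 580, 600]   # nm
E = np.array([                                 # columns: melanin, HbO2, Hb
    [1.9, 0.55, 0.50],
    [1.7, 0.62, 0.48],
    [1.5, 0.90, 0.60],
    [1.3, 0.70, 0.85],
    [1.1, 0.95, 0.65],
    [0.9, 0.30, 0.32],
])

def fit_chromophores(absorbance):
    """Least-squares coefficients (melanin, HbO2, Hb); offset absorbs baseline."""
    X = np.column_stack([E, np.ones(len(E))])
    coef, *_ = np.linalg.lstsq(X, absorbance, rcond=None)
    return coef[:3]

# Round trip: synthesize a spectrum from known contents, then recover them.
contents = np.array([0.8, 0.3, 0.2])
spectrum = E @ contents + 0.05
recovered = fit_chromophores(spectrum)
```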

  5. Fast online Monte Carlo-based IMRT planning for the MRI linear accelerator

    Science.gov (United States)

    Bol, G. H.; Hissoiny, S.; Lagendijk, J. J. W.; Raaymakers, B. W.

    2012-03-01

    The MRI accelerator, a combination of a 6 MV linear accelerator with a 1.5 T MRI, facilitates continuous patient anatomy updates regarding translations, rotations and deformations of targets and organs at risk. Accounting for these demands high speed, online intensity-modulated radiotherapy (IMRT) re-optimization. In this paper, a fast IMRT optimization system is described which combines a GPU-based Monte Carlo dose calculation engine for online beamlet generation and a fast inverse dose optimization algorithm. Tightly conformal IMRT plans are generated for four phantom cases and two clinical cases (cervix and kidney) in the presence of the magnetic fields of 0 and 1.5 T. We show that for the presented cases the beamlet generation and optimization routines are fast enough for online IMRT planning. Furthermore, there is no influence of the magnetic field on plan quality and complexity, and equal optimization constraints at 0 and 1.5 T lead to almost identical dose distributions.
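
    The inverse-optimization step described above can be sketched as projected gradient descent on the squared dose error: given a beamlet dose-influence matrix D (computed by the GPU Monte Carlo engine in the paper; random numbers here), find non-negative beamlet weights w so that D·w approaches the prescription. The real system's objective and constraints are richer; this is a minimal formulation.

```python
import numpy as np

# Projected gradient descent for min_w 0.5*||D w - p||^2 subject to w >= 0.
# The step size 1/||D||_2^2 is a safe choice for this quadratic objective.

def optimize_weights(D, prescription, iters=5000, lr=None):
    if lr is None:
        lr = 1.0 / np.linalg.norm(D, ord=2) ** 2
    w = np.zeros(D.shape[1])
    for _ in range(iters):
        grad = D.T @ (D @ w - prescription)
        w = np.maximum(w - lr * grad, 0.0)     # project onto w >= 0
    return w

rng = np.random.default_rng(3)
D = rng.uniform(0.0, 1.0, size=(40, 10))       # toy dose-influence matrix
w_true = rng.uniform(0.5, 1.5, size=10)
prescription = D @ w_true                      # a feasible prescription
w = optimize_weights(D, prescription)          # D @ w approaches prescription
```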

  6. Monte Carlo MCNP-4B-based absorbed dose distribution estimates for patient-specific dosimetry.

    Science.gov (United States)

    Yoriyaz, H; Stabin, M G; dos Santos, A

    2001-04-01

    This study was intended to verify the capability of the Monte Carlo MCNP-4B code to evaluate spatial dose distribution based on information gathered from CT or SPECT. A new three-dimensional (3D) dose calculation approach for internal emitter use in radioimmunotherapy (RIT) was developed using the Monte Carlo MCNP-4B code as the photon and electron transport engine. It was shown that the MCNP-4B computer code can be used with voxel-based anatomic and physiologic data to provide 3D dose distributions. This study showed that the MCNP-4B code can be used to develop a treatment planning system that will provide such information in a timely manner, if dose reporting is suitably optimized. If each organ is divided into small regions where the average energy deposition is calculated with a typical volume of 0.4 cm(3), regional dose distributions can be provided with reasonable central processing unit times (on the order of 12-24 h on a 200-MHz personal computer or modest workstation). Further efforts to provide semiautomated region identification (segmentation) and improvement of marrow dose calculations are needed to supply a complete system for RIT. It is envisioned that all such efforts will continue to develop and that internal dose calculations may soon be brought to a similar level of accuracy, detail, and robustness as is commonly expected in external dose treatment planning. For this study we developed a code with a user-friendly interface that works on several nuclear medicine imaging platforms and provides timely patient-specific dose information to the physician and medical physicist. Future therapy with internal emitters should use a 3D dose calculation approach, which represents a significant advance over dose information provided by the standard geometric phantoms used for more than 20 y (which permit reporting of only average organ doses for certain standardized individuals).

  7. A Monte Carlo-based model for simulation of digital chest tomo-synthesis

    International Nuclear Information System (INIS)

    Ullman, G.; Dance, D. R.; Sandborg, M.; Carlsson, G. A.; Svalkvist, A.; Baath, M.

    2010-01-01

    The aim of this work was to calculate synthetic digital chest tomo-synthesis projections using a computer simulation model based on the Monte Carlo method. An anthropomorphic chest phantom was scanned in a computed tomography scanner, segmented and included in the computer model to allow for simulation of realistic high-resolution X-ray images. The input parameters to the model were adapted to correspond to the VolumeRAD chest tomo-synthesis system from GE Healthcare. Sixty tomo-synthesis projections were calculated with projection angles ranging from +15 to −15 deg. The images from primary photons were calculated using an analytical model of the anti-scatter grid and a pre-calculated detector response function. The contributions from scattered photons were calculated using an in-house Monte Carlo-based model employing a number of variance reduction techniques such as the collision density estimator. Tomographic section images were reconstructed by transferring the simulated projections into the VolumeRAD system. The reconstruction was performed for three types of images using: (i) noise-free primary projections, (ii) primary projections including contributions from scattered photons and (iii) projections as in (ii) with added correlated noise. The simulated section images were compared with corresponding section images from projections taken with the real, anthropomorphic phantom from which the digital voxel phantom was originally created. The present article describes a work in progress aiming towards developing a model intended for optimisation of chest tomo-synthesis, allowing for simulation of both existing and future chest tomo-synthesis systems. (authors)

  8. Monte Carlo simulation of grating-based neutron phase contrast imaging at CPHS

    International Nuclear Information System (INIS)

    Zhang Ran; Chen Zhiqiang; Huang Zhifeng; Xiao Yongshun; Wang Xuewu; Wie Jie; Loong, C.-K.

    2011-01-01

    Since the launching of the Compact Pulsed Hadron Source (CPHS) project of Tsinghua University in 2009, works have begun on the design and engineering of an imaging/radiography instrument for the neutron source provided by CPHS. The instrument will perform basic tasks such as transmission imaging and computerized tomography. Additionally, we include in the design the utilization of coded-aperture and grating-based phase contrast methodology, as well as the options of prompt gamma-ray analysis and neutron-energy selective imaging. Previously, we had implemented the hardware and data-analysis software for grating-based X-ray phase contrast imaging. Here, we investigate Geant4-based Monte Carlo simulations of neutron refraction phenomena and then model the grating-based neutron phase contrast imaging system according to the classic-optics-based method. The simulated experimental results of the retrieving phase shift gradient information by five-step phase-stepping approach indicate the feasibility of grating-based neutron phase contrast imaging as an option for the cold neutron imaging instrument at the CPHS.
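
    The five-step phase-stepping retrieval mentioned above can be sketched as follows: as one grating is stepped through a full period in N = 5 steps, each detector pixel sees I_k = a0 + a1·cos(2πk/N + φ), and the differential phase φ (proportional to the refraction angle) is recovered from the first Fourier component of the stepping curve. The curve below is synthetic, not CPHS data.

```python
import numpy as np

# Phase-stepping retrieval: project the N-step intensity curve onto
# cos/sin of the stepping phase and take the arctangent.

def retrieve_phase(intensities):
    """Phase of the stepping curve from N equally spaced phase steps."""
    I = np.asarray(intensities, dtype=float)
    N = len(I)
    k = np.arange(N)
    c = np.sum(I * np.cos(2 * np.pi * k / N))
    s = np.sum(I * np.sin(2 * np.pi * k / N))
    return np.arctan2(-s, c)

# Round trip with a synthetic stepping curve, phi = 0.7 rad.
k = np.arange(5)
curve = 10.0 + 3.0 * np.cos(2 * np.pi * k / 5 + 0.7)
phi = retrieve_phase(curve)   # ~0.7
```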

  9. Model-based fault detection for proton exchange membrane fuel cell ...

    African Journals Online (AJOL)

    In this paper, an intelligent model-based fault detection (FD) scheme is developed for proton exchange membrane fuel cell (PEMFC) dynamic systems using independent radial basis function (RBF) networks. The novelty is that these RBF networks are used to model the PEMFC dynamic systems and residuals are generated based ...

  10. Full modelling of the MOSAIC animal PET system based on the GATE Monte Carlo simulation code

    International Nuclear Information System (INIS)

    Merheb, C; Petegnief, Y; Talbot, J N

    2007-01-01

    within 9%. For a 410-665 keV energy window, the measured sensitivity for a centred point source was 1.53% and mouse and rat scatter fractions were respectively 12.0% and 18.3%. The scattered photons produced outside the rat and mouse phantoms contributed to 24% and 36% of total simulated scattered coincidences. Simulated and measured single and prompt count rates agreed well for activities up to the electronic saturation at 110 MBq for the mouse and rat phantoms. Volumetric spatial resolution was 17.6 μL at the centre of the FOV with differences less than 6% between experimental and simulated spatial resolution values. The comprehensive evaluation of the Monte Carlo modelling of the Mosaic(TM) system demonstrates that the GATE package is adequately versatile and appropriate to accurately describe the response of an Anger logic based animal PET system

  11. SU-F-T-185: Study of the Robustness of a Proton Arc Technique Based On PBS Beams

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z [Reading Hospital, West Reading, PA (United States); Zheng, Y [Procure Proton Therapy Center, Oklahoma City, OK (United States)

    2016-06-15

    Purpose: One potential technique to realize proton arc is through using PBS beams from many directions to form overlaid Bragg peak (OBP) spots and placing these OBP spots throughout the target volume to achieve the desired dose distribution. In this study, we analyzed the robustness of this proton arc technique. Methods: We used a cylindrical water phantom of 20 cm in radius in our robustness analysis. To study the range uncertainty effect, we changed the density of the phantom by ±3%. To study the setup uncertainty effect, we shifted the phantom by 3 & 5 mm. We also combined the range and setup uncertainties (3 mm/±3%). For each test plan, we performed dose calculation for the nominal and 6 disturbed scenarios. Two test plans were used, one with a single OBP spot and the other consisting of 121 OBP spots covering a 10×10 cm² area. We compared the dose profiles between the nominal and disturbed scenarios to estimate the impact of the uncertainties. Dose calculation was performed with Gate/GEANT based Monte Carlo software in a cloud computing environment. Results: For each of the 7 scenarios, we simulated 100k & 10M events for the plans consisting of a single OBP spot and 121 OBP spots, respectively. For the single OBP spot, the setup uncertainty had minimal impact on the spot’s dose profile while the range uncertainty had a significant impact on the dose profile. For the plan consisting of 121 OBP spots, a similar effect was observed but the extent of the disturbance was much smaller compared to the single OBP spot. Conclusion: For the PBS arc technique, range uncertainty has significantly more impact than setup uncertainty. Although a single OBP spot can be severely disturbed by the range uncertainty, the overall effect is much smaller when a large number of OBP spots are used. Robustness optimization for the PBS arc technique should prioritize range uncertainty.
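
    The two disturbance types compared above differ in kind: a setup error rigidly shifts the dose profile, while a density error rescales the radiological depth, moving the Bragg peak of every OBP spot coherently. A back-of-the-envelope sketch, with illustrative numbers and the first-order scaling R ∝ 1/ρ:

```python
# Bragg-peak displacement when phantom density is scaled: to first order
# the residual range scales as R ~ 1/rho, so a density error of a few
# percent displaces the peak by a few percent of the full range.

def range_shift_mm(nominal_range_mm, density_scale):
    """Peak displacement (mm) for a multiplicative density error."""
    return nominal_range_mm / density_scale - nominal_range_mm

# A +3% density error on a 200 mm range moves the peak by about -5.8 mm,
# comparable to or larger than the 3-5 mm setup shifts studied, and it
# moves every spot the same way -- which is why range uncertainty dominates.
shift = range_shift_mm(200.0, 1.03)
```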

  12. SU-F-T-185: Study of the Robustness of a Proton Arc Technique Based On PBS Beams

    International Nuclear Information System (INIS)

    Wang, Z; Zheng, Y

    2016-01-01

Purpose: One potential technique for realizing proton arc therapy is to use PBS beams from many directions to form overlaid Bragg peak (OBP) spots and to place these OBP spots throughout the target volume to achieve the desired dose distribution. In this study, we analyzed the robustness of this proton arc technique. Methods: We used a cylindrical water phantom of 20 cm radius in our robustness analysis. To study the range uncertainty effect, we changed the density of the phantom by ±3%. To study the setup uncertainty effect, we shifted the phantom by 3 and 5 mm. We also combined the range and setup uncertainties (3 mm/±3%). For each test plan, we performed dose calculations for the nominal and 6 disturbed scenarios. Two test plans were used, one with a single OBP spot and the other consisting of 121 OBP spots covering a 10×10 cm² area. We compared the dose profiles between the nominal and disturbed scenarios to estimate the impact of the uncertainties. Dose calculation was performed with Gate/GEANT-based Monte Carlo software in a cloud computing environment. Results: For each of the 7 scenarios, we simulated 100k and 10M events for the plans consisting of a single OBP spot and 121 OBP spots, respectively. For the single OBP spot, the setup uncertainty had minimal impact on the spot's dose profile, while the range uncertainty had a significant impact. For the plan consisting of 121 OBP spots, a similar effect was observed, but the extent of the disturbance was much smaller than for the single OBP spot. Conclusion: For the PBS arc technique, range uncertainty has significantly more impact than setup uncertainty. Although a single OBP spot can be severely disturbed by range uncertainty, the overall effect is much smaller when a large number of OBP spots are used. Robustness optimization for the PBS arc technique should prioritize range uncertainty.
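The seven-scenario comparison described above can be sketched in miniature. The depth-dose curve, nominal range, and deviation metric below are all invented illustrations, not the Gate/GEANT Monte Carlo setup of the abstract:

```python
import numpy as np

def bragg_profile(depth_cm, range_cm, sigma=0.35):
    """Toy depth-dose curve: low entrance plateau plus a Gaussian Bragg peak."""
    peak = np.exp(-0.5 * ((depth_cm - range_cm) / sigma) ** 2)
    plateau = 0.25 * (depth_cm < range_cm)
    return peak + plateau

nominal_range = 16.0                      # hypothetical beam range in the phantom (cm)
depth = np.linspace(0.0, 20.0, 401)

# Nominal case plus the six disturbed scenarios of the abstract: +-3% density
# (modeled as range scaling), 3 and 5 mm setup shifts, and combined 3 mm / +-3%.
scenarios = {
    "nominal":      bragg_profile(depth, nominal_range),
    "density +3%":  bragg_profile(depth, nominal_range / 1.03),
    "density -3%":  bragg_profile(depth, nominal_range / 0.97),
    "shift 3 mm":   bragg_profile(depth - 0.3, nominal_range),
    "shift 5 mm":   bragg_profile(depth - 0.5, nominal_range),
    "combined +3%": bragg_profile(depth - 0.3, nominal_range / 1.03),
    "combined -3%": bragg_profile(depth - 0.3, nominal_range / 0.97),
}

# Compare each disturbed depth-dose profile against the nominal one
for name, dose in scenarios.items():
    max_dev = np.max(np.abs(dose - scenarios["nominal"]))
    print(f"{name:12s} max |dose deviation| = {max_dev:.3f}")
```

Even this toy reproduces the qualitative point: near the distal edge, small range or position perturbations produce large local dose deviations for a single peak.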

  13. Proton therapy physics

    CERN Document Server

    2012-01-01

    Proton Therapy Physics goes beyond current books on proton therapy to provide an in-depth overview of the physics aspects of this radiation therapy modality, eliminating the need to dig through information scattered in the medical physics literature. After tracing the history of proton therapy, the book summarizes the atomic and nuclear physics background necessary for understanding proton interactions with tissue. It describes the physics of proton accelerators, the parameters of clinical proton beams, and the mechanisms to generate a conformal dose distribution in a patient. The text then covers detector systems and measuring techniques for reference dosimetry, outlines basic quality assurance and commissioning guidelines, and gives examples of Monte Carlo simulations in proton therapy. The book moves on to discussions of treatment planning for single- and multiple-field uniform doses, dose calculation concepts and algorithms, and precision and uncertainties for nonmoving and moving targets. It also exami...

  14. Monte Carlo based dosimetry and treatment planning for neutron capture therapy of brain tumors

    International Nuclear Information System (INIS)

    Zamenhof, R.G.; Clement, S.D.; Harling, O.K.; Brenner, J.F.; Wazer, D.E.; Madoc-Jones, H.; Yanch, J.C.

    1990-01-01

Monte Carlo based dosimetry and computer-aided treatment planning for neutron capture therapy have been developed to provide the necessary link between physical dosimetric measurements performed on the MITR-II epithermal-neutron beams and the needs of the radiation oncologist to synthesize large amounts of dosimetric data into a clinically meaningful treatment plan for each individual patient. Monte Carlo simulation has been employed to characterize the spatial dose distributions within a skull/brain model irradiated by an epithermal-neutron beam designed for neutron capture therapy applications. The geometry and elemental composition employed for the mathematical skull/brain model and the neutron and photon fluence-to-dose conversion formalism are presented. A treatment planning program, NCTPLAN, developed specifically for neutron capture therapy, is described. Examples are presented illustrating both one- and two-dimensional dose distributions obtainable within the brain with an experimental epithermal-neutron beam, together with beam quality and treatment plan efficacy criteria which have been formulated for neutron capture therapy. The incorporation of three-dimensional computed tomographic image data into the treatment planning procedure is illustrated. The experimental epithermal-neutron beam has a maximum usable circular diameter of 20 cm, and with 30 ppm of B-10 in tumor and 3 ppm of B-10 in blood, it produces a beam-axis advantage depth of 7.4 cm, a beam-axis advantage ratio of 1.83, a global advantage ratio of 1.70, and an advantage depth RBE-dose rate to tumor of 20.6 RBE-cGy/min (cJ/kg-min). These characteristics make this beam well suited for clinical applications, enabling an RBE-dose of 2,000 RBE-cGy/min (cJ/kg-min) to be delivered to tumor at brain midline in six fractions with a treatment time of approximately 16 minutes per fraction.
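The advantage-depth and advantage-ratio figures of merit quoted above can be illustrated with toy beam-axis dose curves. The exponential tumor and Gaussian normal-tissue profiles below are invented stand-ins, not the MITR-II beam data:

```python
import numpy as np

depth = np.linspace(0.0, 12.0, 241)                      # cm, uniform 0.05 cm grid
tumor_dose = 25.0 * np.exp(-depth / 5.0)                 # falls off with depth
normal_dose = 9.0 * np.exp(-((depth - 2.0) / 3.0) ** 2)  # peaks near the surface

# Advantage depth: deepest point at which the tumor dose still exceeds the
# maximum normal-tissue dose anywhere along the beam axis.
max_normal = normal_dose.max()
advantage_depth = depth[tumor_dose >= max_normal].max()

# Beam-axis advantage ratio: integrated tumor dose over integrated normal-tissue
# dose from the surface down to the advantage depth (uniform grid, so sums suffice).
mask = depth <= advantage_depth
advantage_ratio = tumor_dose[mask].sum() / normal_dose[mask].sum()
print(f"advantage depth ~ {advantage_depth:.2f} cm, advantage ratio ~ {advantage_ratio:.2f}")
```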

  15. Microalgae dewatering based on forward osmosis employing proton exchange membrane.

    Science.gov (United States)

    Son, Jieun; Sung, Mina; Ryu, Hoyoung; Oh, You-Kwan; Han, Jong-In

    2017-11-01

In this study, an electrically facilitated forward osmosis (FO) process employing a proton exchange membrane (PEM) was established for the purpose of microalgae dewatering. An increase in water flux was observed when an external voltage was applied to the FO system equipped with the PEM; as expected, the trend became more pronounced as both the draw solution concentration and the applied voltage were raised. When this FO process was used for microalgae dewatering, a 247% increase in flux and an 86% increase in final biomass concentration were observed. In addition to improving flux, the electrically facilitated FO removed chlorophyll from the dewatered biomass, down to 0.021±0.015 mg/g cell. All these results suggest that the newly proposed electrically facilitated FO process, particularly one employing a PEM, can indeed offer a workable way of dewatering microalgae, in part because it simultaneously removes the ever-problematic chlorophyll from the extracted lipids. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Girsanov's transformation based variance reduced Monte Carlo simulation schemes for reliability estimation in nonlinear stochastic dynamics

    Science.gov (United States)

    Kanjilal, Oindrila; Manohar, C. S.

    2017-07-01

The study considers the problem of simulation-based time-variant reliability analysis of nonlinear, randomly excited dynamical systems. Attention is focused on importance sampling strategies based on the application of Girsanov's transformation method. Controls which minimize the distance function, as in the first-order reliability method (FORM), are shown to minimize a bound on the sampling variance of the estimator for the probability of failure. Two schemes based on the application of the calculus of variations for selecting control signals are proposed: the first obtains the control force as the solution of a two-point nonlinear boundary value problem, and the second explores the application of the Volterra series in characterizing the controls. The relative merits of these schemes, vis-à-vis the method based on ideas from FORM, are discussed. Illustrative examples, involving archetypal single-degree-of-freedom (dof) nonlinear oscillators and a multi-degree-of-freedom nonlinear dynamical system, are presented. The credentials of the proposed procedures are established by comparing the solutions with pertinent results from direct Monte Carlo simulations.
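A minimal discrete-time analogue of the Girsanov idea is to sample under a mean-shifted excitation and correct the estimator with the Gaussian likelihood ratio. The constant shift below is a crude stand-in for the FORM-derived controls discussed in the abstract, and the target probability is a plain Gaussian tail so the estimate can be checked against the exact value:

```python
import numpy as np

rng = np.random.default_rng(1)

# Estimate P(S > a) for the scaled terminal sum S = sum_k w_k / sqrt(n) of
# standard Gaussian increments, by shifting the mean of each increment (the
# "control") and re-weighting each sample with the Radon-Nikodym derivative.
n, a, n_samples = 64, 3.0, 50_000
u = (a / np.sqrt(n)) * np.ones(n)          # constant control: drives E[S] up to a

w = rng.standard_normal((n_samples, n))    # increments under the sampling measure
z = w + u                                  # shifted excitation paths
S = z.sum(axis=1) / np.sqrt(n)

# Likelihood ratio for a Gaussian mean shift: dP/dQ = exp(-u.w - 0.5*u.u)
lr = np.exp(-(w @ u) - 0.5 * np.dot(u, u))
p_is = np.mean(lr * (S > a))

# Exact reference value: the standard normal tail P(N(0,1) > 3) ~ 1.35e-3
print(f"importance-sampling estimate: {p_is:.3e}")
```

With the shift, roughly half the samples land in the failure region, so the estimator's relative error is far smaller than direct Monte Carlo would achieve with the same sample count.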

  17. Online advertising and marketing claims by providers of proton beam therapy: are they guideline-based?

    Science.gov (United States)

    Corkum, Mark T; Liu, Wei; Palma, David A; Bauman, Glenn S; Dinniwell, Robert E; Warner, Andrew; Mishra, Mark V; Louie, Alexander V

    2018-03-15

Cancer patients frequently search the Internet for treatment options, and hospital websites are seen as reliable sources of knowledge. Guidelines support the use of proton radiotherapy in specific disease sites or on clinical trials. This study aims to evaluate direct-to-consumer advertising content and claims made by proton therapy centre (PTC) websites worldwide. Operational PTC websites in English were identified through the Particle Therapy Co-Operative Group website. Data abstraction of website content was performed independently by two investigators. Eight international guidelines were consulted to determine guideline-based indications for proton radiotherapy. Univariate and multivariate logistic regression models were used to determine the characteristics of PTC websites that indicated proton radiotherapy offered greater disease control or cure rates. Forty-eight PTCs with 46 English websites were identified. 60·9% of PTC websites claimed proton therapy provided improved disease control or cure. U.S. websites listed more indications than international websites (15·5 ± 5·4 vs. 10·4 ± 5·8, p = 0·004). The most common disease sites advertised were prostate (87·0%), head and neck (87·0%) and pediatrics (82·6%), all of which were indicated in at least one international guideline. Several disease sites advertised were not present in any consensus guidelines, including pancreatobiliary (52·2%), breast (50·0%), and esophageal (43·5%) cancers. Multivariate analysis found that advertising a greater number of disease sites and claiming that the centre was a local or regional leader in proton radiotherapy were associated with indicating that proton radiotherapy offers greater disease control or cure. Information from PTC websites often differs from recommendations found in international consensus guidelines. As online marketing information may have significant influence on patient decision-making, alignment of such information with accepted guidelines and consensus

  18. Measurement and Simulation of the Variation in Proton-Induced Energy Deposition in Large Silicon Diode Arrays

    Science.gov (United States)

    Howe, Christina L.; Weller, Robert A.; Reed, Robert A.; Sierawski, Brian D.; Marshall, Paul W.; Marshall, Cheryl J.; Mendenhall, Marcus H.; Schrimpf, Ronald D.

    2007-01-01

The proton-induced charge deposition in a well-characterized silicon P-i-N focal plane array is analyzed with Monte Carlo-based simulations. These simulations include all physical processes, together with pile-up, to accurately describe the experimental data. Simulation results reveal important high-energy events not easily detected through experiment due to low statistics. The effect of each physical mechanism on the device response is shown for a single proton energy as well as for a full space proton flux.

  19. Analysis by Monte Carlo simulations of the sensitivity to single event upset of SRAM memories under spatial proton or terrestrial neutron environment

    International Nuclear Information System (INIS)

    Lambert, D.

    2006-07-01

Electronic systems in space and terrestrial environments are subjected to a flux of particles of natural origin that can induce malfunctions. These particles can cause Single Event Upsets (SEU) in SRAM memories. Although non-destructive, SEUs can have consequences for equipment operating in applications requiring high reliability (aircraft, satellites, launchers, medical devices, etc.). Thus, an evaluation of the sensitivity of a component technology is necessary to predict the reliability of a system. In the atmospheric environment, SEU sensitivity is mainly caused by secondary ions resulting from nuclear reactions between neutrons and the atoms of the component. In the space environment, high-energy protons induce the same effects as atmospheric neutrons. In our work, a new SEU rate prediction code (MC-DASIE) has been developed in order to quantify the sensitivity for a given environment and to explore the failure mechanisms as a function of technology. This code makes it possible to study various SRAM memory technologies (bulk and SOI) in neutron and proton environments between 1 MeV and 1 GeV. Thus, MC-DASIE was used with experimental data to study the effect of integration on the sensitivity of the memories in the terrestrial environment, to compare neutron and proton irradiation, and to examine the influence of target-component modeling on the calculated SEU rate. (author)

  20. MRI-Based Computed Tomography Metal Artifact Correction Method for Improving Proton Range Calculation Accuracy

    International Nuclear Information System (INIS)

    Park, Peter C.; Schreibmann, Eduard; Roper, Justin; Elder, Eric; Crocker, Ian; Fox, Tim; Zhu, X. Ronald; Dong, Lei; Dhabaan, Anees

    2015-01-01

Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensities of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using the reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On the respective simulated brain and head-and-neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.
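The core idea, predicting HU in a corrupted region from co-registered MRI intensities calibrated on an artifact-free slice, can be sketched with synthetic data. The linear intensity-to-HU map below is an assumption for illustration only; the paper's analysis is more elaborate and relies on deformable registration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "artifact-free slice": paired MRI intensities and HU values related
# by an assumed linear map plus noise (the real relationship is learned per case).
mri_clean = rng.uniform(0.0, 1.0, 500)
hu_clean = 1000.0 * mri_clean - 200.0 + rng.normal(0.0, 20.0, 500)

# Fit the intensity -> HU map by least squares on the clean paired data
A = np.vstack([mri_clean, np.ones_like(mri_clean)]).T
slope, intercept = np.linalg.lstsq(A, hu_clean, rcond=None)[0]

# Predict replacement HU for the corrupted region from its MRI intensities
mri_corrupted = rng.uniform(0.0, 1.0, 100)
hu_predicted = slope * mri_corrupted + intercept
print(f"fitted map: HU ~ {slope:.0f} * I_MRI + {intercept:.0f}")
```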

  1. SQERTSS: Dynamic rank based throttling of transition probabilities in kinetic Monte Carlo simulations

    International Nuclear Information System (INIS)

    Danielson, Thomas; Sutton, Jonathan E.; Hin, Céline; Virginia Polytechnic Institute and State University; Savara, Aditya

    2017-01-01

Lattice-based kinetic Monte Carlo (KMC) simulations offer a powerful simulation technique for investigating large reaction networks while retaining spatial configuration information, unlike ordinary differential equations. However, large chemical reaction networks can contain reaction processes with rates spanning multiple orders of magnitude. This can lead to the problem of “KMC stiffness” (similar to stiffness in differential equations), where the computational expense has the potential to be overwhelming because of very short time-steps, with the simulation spending an inordinate number of KMC steps (and CPU time) simulating fast frivolous processes (FFPs) without progressing the system (reaction network). In order to achieve simulation times that are experimentally relevant or desired for predictions, a dynamic throttling algorithm involving separation of the processes into speed ranks based on event frequencies has been designed and implemented, with the intent of decreasing the probability of FFP events and increasing the probability of slow-process events, allowing rate-limiting events to become more likely to be observed in KMC simulations. This Staggered Quasi-Equilibrium Rank-based Throttling for Steady-state (SQERTSS) algorithm is designed for use in achieving and simulating steady-state conditions in KMC simulations. Lastly, as shown in this work, the SQERTSS algorithm also works for transient conditions: the correct configuration space and final state will still be achieved if the required assumptions are not violated, with the caveat that the sizes of the time-steps may be distorted during the transient period.
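A toy version of rank-based throttling (not the published SQERTSS algorithm) might bin processes into speed ranks by observed event frequency and damp each rank by a power of a base factor; the process names and counts below are invented:

```python
import math

# Observed event counts per process over a sampling window (invented numbers,
# spanning several orders of magnitude as in a stiff reaction network).
observed_counts = {"fast_hop": 9_000_000, "desorption": 120_000,
                   "reaction_A": 1_500, "reaction_B": 40}

base = 10.0
slowest = min(observed_counts.values())

# Rank each process by how many orders of magnitude faster it fires than the
# slowest process, then throttle its rate by base**rank so that slow,
# potentially rate-limiting events become more likely to be selected.
throttle = {}
for name, count in observed_counts.items():
    rank = int(math.log10(count / slowest))   # speed rank in orders of magnitude
    throttle[name] = base ** rank             # divide this process's rate by this

for name, factor in sorted(throttle.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} throttled by a factor of {factor:g}")
```

The slowest process is left untouched (factor 1), while the fastest frivolous process is damped by several orders of magnitude; a real implementation would also re-inflate the simulated clock to compensate.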

  2. Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation

    Science.gov (United States)

    Ziegenhein, Peter; Pirner, Sven; Kamerling, Cornelis Ph; Oelfke, Uwe

    2015-08-01

Monte Carlo (MC) simulations are considered to be the most accurate method for calculating dose distributions in radiotherapy. Their clinical application, however, is still limited by the long runtimes that conventional implementations of MC algorithms require to deliver sufficiently accurate results on high-resolution imaging data. In order to overcome this obstacle, we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well-verified dose planning method (DPM). We could demonstrate that PhiMC delivers dose distributions which are in excellent agreement with DPM. The multi-core implementation of PhiMC scales well across different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, we could show that our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on an NVIDIA Tesla C2050. Since CPUs can work with several hundred GB of RAM, the typical GPU memory limitation does not apply to our implementation, and high-resolution clinical plans can be calculated.

  3. Ultrafast cone-beam CT scatter correction with GPU-based Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Yuan Xu

    2014-03-01

Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to automatically finish the whole process, including both scatter correction and reconstruction, within 30 seconds. Methods: The method consists of six steps: (1) FDK reconstruction using raw projection data; (2) rigid registration of the planning CT to the FDK results; (3) MC scatter calculation at sparse view angles using the planning CT; (4) interpolation of the calculated scatter signals to other angles; (5) removal of scatter from the raw projections; (6) FDK reconstruction using the scatter-corrected projections. In addition to using a GPU to accelerate the MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce computation time. A novel denoising algorithm is used to eliminate MC noise, caused by the low photon numbers, from the simulated scatter images. The method is validated on a simulated head-and-neck case with 364 projection angles. Results: We examined the variation of the scatter signal among projection angles using Fourier analysis. It is found that scatter images at 31 angles are sufficient to restore those at all angles with < 0.1% error. For the simulated patient case with a resolution of 512 × 512 × 100, we simulated 5 × 10⁶ photons per angle. The total computation time is 20.52 seconds on an Nvidia GTX Titan GPU, and the time at each step is 2.53, 0.64, 14.78, 0.13, 0.19, and 2.25 seconds, respectively. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. It accomplishes the whole procedure of scatter correction and reconstruction within 30 seconds.
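Steps 3 and 4, MC scatter estimation at sparse angles followed by angular interpolation, work because scatter varies smoothly with gantry angle. The sketch below interpolates from roughly 31 sparse angles to all 364; the smooth function standing in for the per-angle MC scatter estimate is invented:

```python
import numpy as np

n_angles = 364
all_angles = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
sparse_idx = np.arange(0, n_angles, 12)          # every 12th view: 31 MC angles
sparse_angles = all_angles[sparse_idx]

def mc_scatter(theta):
    """Stand-in for the per-angle MC scatter estimate (smooth in angle)."""
    return 10.0 + 3.0 * np.cos(theta) + np.sin(2 * theta)

# Wrap one point past 2*pi so linear interpolation handles the periodic seam
xs = np.concatenate([sparse_angles, [sparse_angles[0] + 2 * np.pi]])
ys = np.concatenate([mc_scatter(sparse_angles), [mc_scatter(sparse_angles[0])]])
scatter_all = np.interp(all_angles, xs, ys)

max_err = np.max(np.abs(scatter_all - mc_scatter(all_angles)))
print(f"{len(sparse_angles)} sparse angles -> max interpolation error {max_err:.3f}")
```

The interpolated scatter estimate would then be subtracted from the raw projections (step 5) before the final FDK reconstruction.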

  4. Proton linac for hospital-based fast neutron therapy and radioisotope production

    International Nuclear Information System (INIS)

    Lennox, A.J.; Hendrickson, F.R.; Swenson, D.A.; Winje, R.A.; Young, D.E.

    1989-09-01

Recent developments in linac technology have led to the design of a hospital-based proton linac for fast neutron therapy. The 180 microamp average current allows beam to be diverted for radioisotope production during treatments while maintaining an acceptable dose rate. During dedicated operation, dose rates greater than 280 neutron rads per minute are achievable at the depth of maximum dose, DMAX = 1.6 cm, with a source-to-axis distance SAD = 190 cm. The maximum machine energy is 70 MeV, and several intermediate energies are available for optimizing the production of isotopes for positron emission tomography and other medical applications. The linac can be used to produce a horizontal beam, or a gantry can be added to the downstream end of the linac for conventional patient positioning. The 70 MeV protons can also be used for proton therapy of ocular melanomas. 17 refs., 1 fig., 1 tab

  5. A research plan based on high intensity proton accelerator Neutron Science Research Center

    International Nuclear Information System (INIS)

    Mizumoto, Motoharu

    1997-01-01

A plan called the Neutron Science Research Center (NSRC) has been proposed at JAERI. The center is a complex of research facilities based on a proton linac with an energy of 1.5 GeV and an average current of 10 mA. The research facilities will consist of a Thermal/Cold Neutron Facility, Neutron Irradiation Facility, Neutron Physics Facility, OMEGA/Nuclear Energy Facility, Spallation RI Beam Facility, Meson/Muon Facility, and Medium Energy Experiment Facility, where the high-intensity proton beam and secondary particle beams such as neutron, pion, muon, and unstable radioisotope (RI) beams generated from the proton beam will be utilized for innovative research in the fields of nuclear engineering and basic science. (author)

  6. A research plan based on high intensity proton accelerator Neutron Science Research Center

    Energy Technology Data Exchange (ETDEWEB)

    Mizumoto, Motoharu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

A plan called the Neutron Science Research Center (NSRC) has been proposed at JAERI. The center is a complex of research facilities based on a proton linac with an energy of 1.5 GeV and an average current of 10 mA. The research facilities will consist of a Thermal/Cold Neutron Facility, Neutron Irradiation Facility, Neutron Physics Facility, OMEGA/Nuclear Energy Facility, Spallation RI Beam Facility, Meson/Muon Facility, and Medium Energy Experiment Facility, where the high-intensity proton beam and secondary particle beams such as neutron, pion, muon, and unstable radioisotope (RI) beams generated from the proton beam will be utilized for innovative research in the fields of nuclear engineering and basic science. (author)

  7. Radiation sensors based on the generation of mobile protons in organic dielectrics.

    Science.gov (United States)

    Kapetanakis, Eleftherios; Douvas, Antonios M; Argitis, Panagiotis; Normand, Pascal

    2013-06-26

A sensing scheme based on mobile protons generated by radiation, including ionizing radiation (IonR), in organic gate dielectrics is investigated for the development of metal-insulator-semiconductor (MIS)-type dosimeters. Application of an electric field to the gate dielectric moves the protons and thereby alters the flat-band voltage (VFB) of the MIS device. The shift in VFB is proportional to the number of IonR-generated protons and, therefore, to the IonR total dose. Poly(methyl methacrylate) (PMMA) polymeric films containing the triphenylsulfonium nonaflate (TPSNF) photoacid generator (PAG) were selected as the radiation-sensitive gate dielectric. The effects of UV (249 nm) and gamma (Co-60) irradiation on the high-frequency capacitance versus gate voltage (C-VG) curves of the MIS devices were investigated for different total dose values. Systematic improvements in sensitivity can be accomplished by increasing the concentration of the TPSNF molecules embedded in the polymeric matrix.

  8. Structure and functionality of PVdF/PAN based, composite proton conducting membranes

    International Nuclear Information System (INIS)

    Martinelli, A.; Navarra, M.A.; Matic, A.; Panero, S.; Jacobsson, P.; Boerjesson, L.; Scrosati, B.

    2005-01-01

We have investigated new poly(vinylidene fluoride)/poly(acrylonitrile) (PVdF/PAN) based proton conducting membranes by means of vibrational spectroscopy. We find that a complete phase inversion occurs during the preparation procedure, when the gelling solvents are replaced by an acidic solution, providing the proton conducting property. The uptake of acid is promoted both by the presence of PAN and by the ceramic filler, Al₂O₃. No particular interaction between the polymer matrix and the acidic solution could be detected, supporting the picture of an inert matrix entrapping a liquid component. However, the dissociation degree of the acid is decreased due to the spatial confinement in the membrane. By comparing the dissociation degree and the actual amount of acid in the membrane to the conductivity, we conclude that the limiting factor for the conductivity is the long-range mobility of the protons, which is governed by the morphology of the membrane.

  9. Ionomeric membranes based on partially sulfonated poly(styrene) : synthesis, proton conduction and methanol permeation

    NARCIS (Netherlands)

    Picchioni, F.; Tricoli, V.; Carretta, N.

    2000-01-01

Homogeneously sulfonated poly(styrene) (SPS) was prepared with various concentrations of sulfonic acid groups in the base polymer. Membranes cast from these materials were investigated in relation to proton conductivity and methanol permeability in the temperature range from 20°C to 60°C. It was

  10. Molecular modeling of the conductivity changes of the emeraldine base polyaniline due to protonic acid doping

    NARCIS (Netherlands)

    Chen, X.; Yuan, C.A.; Wong, C.K.Y.; Zhang, G.

    2012-01-01

We propose a molecular modeling strategy capable of predicting the conductivity change of the emeraldine base polyaniline polymer due to different degrees of protonic acid doping. The method is comprised of two key steps: (1) generating the amorphous unit cells with given number of polymer

  11. An UV photochromic memory effect in proton-based WO3 electrochromic devices

    International Nuclear Information System (INIS)

    Zhang Yong; Lee, S.-H.; Mascarenhas, A.; Deb, S. K.

    2008-01-01

We report a UV photochromic memory effect in a standard proton-based WO₃ electrochromic device. It exhibits two memory states, associated with the colored and bleached states of the device, respectively. Such an effect can be used to enhance device performance (by increasing the dynamic range), re-energize commercial electrochromic devices, and develop memory devices.

  12. An UV photochromic memory effect in proton-based WO3 electrochromic devices

    Science.gov (United States)

    Zhang, Yong; Lee, S.-H.; Mascarenhas, A.; Deb, S. K.

    2008-11-01

We report a UV photochromic memory effect in a standard proton-based WO3 electrochromic device. It exhibits two memory states, associated with the colored and bleached states of the device, respectively. Such an effect can be used to enhance device performance (by increasing the dynamic range), re-energize commercial electrochromic devices, and develop memory devices.

  13. Proton affinities of anionic bases: Trends across the periodic table, structural effects, and DFT validation

    NARCIS (Netherlands)

    Swart, M.; Bickelhaupt, F.M.

    2006-01-01

    We have carried out an extensive exploration of the gas-phase basicity of archetypal anionic bases across the periodic system using the generalized gradient approximation of density functional theory (DFT) at BP86/QZ4P//BP86/TZ2P. First, we validate DFT as a reliable tool for computing proton

  14. Ionomeric membranes based on partially sulfonated poly(styrene): synthesis, proton conduction and methanol permeation

    NARCIS (Netherlands)

    Carretta, N.; Tricoli, V.; Picchioni, F.

    2000-01-01

Homogeneously sulfonated poly(styrene) (SPS) was prepared with various concentrations of sulfonic acid groups in the base polymer. Membranes cast from these materials were investigated in relation to proton conductivity and methanol permeability in the temperature range from 20°C to 60°C. It was

  15. SU-E-T-610: Phosphor-Based Fiber Optic Probes for Proton Beam Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Darafsheh, A; Soldner, A; Liu, H; Kassaee, A; Zhu, T; Finlay, J [Univ Pennsylvania, Philadelphia, PA (United States)

    2015-06-15

Purpose: To investigate the feasibility of using fiber-optic probes with rare-earth-based phosphor tips for proton beam radiation dosimetry. We designed and fabricated a fiber probe with submillimeter resolution (<0.5 mm³) based on TbF3 phosphors and evaluated its performance for measurement of proton beam profiles and range. Methods: The fiber-optic probe with a TbF3 phosphor tip, embedded in tissue-mimicking phantoms, was irradiated with a double-scattering proton beam with an energy of 180 MeV. Luminescence spectroscopy was performed with a CCD-coupled spectrograph to analyze the emission spectra of the fiber tip. In order to measure the spatial beam profile and percentage depth dose, we used a singular value decomposition method to spectrally separate the phosphor's ionoluminescence signal from the background Cerenkov radiation signal. Results: The spectra of the TbF3 fiber probe showed characteristic ionoluminescence emission peaks at 489, 542, 586, and 620 nm. Using singular value decomposition, we extracted the contribution of the ionoluminescence signal to measure the percentage depth dose in phantoms and compared it with measurements performed with an ion chamber. We observed a quenching effect at the spread-out Bragg peak region, manifested as under-response of the signal, due to the high LET of the beam. However, the beam profiles were not dramatically affected by the quenching effect. Conclusion: We have evaluated the performance of a fiber-optic probe with submillimeter resolution for proton beam dosimetry. We demonstrated the feasibility of spectrally separating the Cerenkov radiation from the collected signal. Such fiber probes can be used for measurements of proton beam profiles and range. The experimental apparatus and spectroscopy method developed in this work provide a robust platform for characterization of proton-irradiated nanophosphor particles for ultralow-fluence photodynamic therapy or molecular imaging applications.
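The spectral separation step can be illustrated as linear least-squares unmixing against known basis spectra, with the pseudoinverse computed through the SVD. The line shapes and mixture weights below are synthetic, not the measured TbF3 spectra:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 700, 301)

def gauss(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2)

# Basis spectra (illustrative shapes): Tb3+ ionoluminescence lines near
# 489/542/586/620 nm, and a broad ~1/lambda^2-like Cerenkov continuum.
lumin = sum(w * gauss(wavelengths, mu, 6.0)
            for mu, w in [(489, 0.6), (542, 1.0), (586, 0.35), (620, 0.25)])
cerenkov = (400.0 / wavelengths) ** 2

# Synthetic measurement: a known mixture of the two components plus noise
true_coeffs = np.array([2.0, 5.0])
measured = true_coeffs[0] * lumin + true_coeffs[1] * cerenkov \
           + rng.normal(0.0, 0.01, wavelengths.size)

# Least-squares unmixing via the pseudoinverse (computed through the SVD)
B = np.column_stack([lumin, cerenkov])
coeffs = np.linalg.pinv(B) @ measured
print(f"recovered ionoluminescence weight {coeffs[0]:.2f}, Cerenkov weight {coeffs[1]:.2f}")
```

The recovered ionoluminescence weight, tracked spot by spot, is what would be mapped into a depth-dose or profile measurement.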

  16. Development of CAD-Based Geometry Processing Module for a Monte Carlo Particle Transport Analysis Code

    International Nuclear Information System (INIS)

    Choi, Sung Hoon; Kwark, Min Su; Shim, Hyung Jin

    2012-01-01

Monte Carlo (MC) particle transport analysis for a complex system such as a research reactor, an accelerator, or a fusion facility may require accurate modeling of complicated geometry. Manual modeling via the text interface of an MC code to define the geometrical objects is tedious, lengthy, and error-prone. This problem can be overcome by taking advantage of the modeling capability of a computer-aided design (CAD) system. There have been two kinds of approaches to developing MC code systems that utilize CAD data: external format conversion and CAD-kernel-embedded MC simulation. The first approach includes several interfacing programs, such as McCAD, MCAM, and GEOMIT, which were developed to automatically convert CAD data into MCNP geometry input data. This approach makes the most of existing MC codes without any modifications, but implies latent data inconsistency due to differences between the geometry modeling systems. In the second approach, an MC code utilizes the CAD data for direct particle tracking or for conversion to an internal data structure of constructive solid geometry (CSG) and/or boundary representation (B-rep) modeling with the help of a CAD kernel. MCNP-BRL and OiNC have demonstrated their capabilities for CAD-based MC simulation. Recently, we have developed a CAD-based geometry processing module for MC particle simulation using the OpenCASCADE (OCC) library. In the developed module, CAD data can be used for particle tracking through primitive CAD surfaces (hereafter, CAD-based tracking) or for internal conversion to the CSG data structure. In this paper, the performances of the text-based model, CAD-based tracking, and internal CSG conversion are compared using an in-house MC code, McSIM, equipped with the developed CAD-based geometry processing module.

  17. Limits on the efficiency of event-based algorithms for Monte Carlo neutron transport

    Directory of Open Access Journals (Sweden)

    Paul K. Romano

    2017-09-01

Full Text Available The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, the vector speedup is also limited by differences in the execution times of events being carried out in a single event-iteration.
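The constant-event-time case described above can be illustrated with a toy model (not the paper's model): the particle bank is repacked into SIMD vectors after every event-iteration, so lanes are wasted only in each iteration's last, partially filled vector and in the dwindling tail of the bank. A minimal sketch, assuming a fixed expected survival fraction per event-iteration:

```python
def vector_efficiency(bank_size, vector_width, survival=0.9):
    """SIMD lane utilization for event-based tracking under the
    constant-event-time assumption: surviving particles are repacked
    into full vectors after every event-iteration, so waste occurs
    only in each iteration's last, partially filled vector."""
    active = bank_size
    useful_lanes = 0
    total_lanes = 0
    while active > 0:
        vectors = -(-active // vector_width)   # ceil(active / vector_width)
        useful_lanes += active
        total_lanes += vectors * vector_width
        active = int(active * survival)        # expected survivors per iteration
    return useful_lanes / total_lanes

# Larger banks amortize the partially filled tail vectors.
small = vector_efficiency(bank_size=64, vector_width=16)
large = vector_efficiency(bank_size=4096, vector_width=16)
```

With these hypothetical numbers the small bank reaches roughly 72% lane utilization while the large bank (bank size 256 times the vector width) exceeds 90%, in line with the bank-size requirement reported above.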

  18. Study on quantification method based on Monte Carlo sampling for multiunit probabilistic safety assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Kye Min [KHNP Central Research Institute, Daejeon (Korea, Republic of); Han, Sang Hoon; Park, Jin Hee; Lim, Ho Gon; Yang, Joon Yang [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of)

    2017-06-15

    In Korea, many nuclear power plants operate at a single site based on geographical characteristics, but the population density near the sites is higher than that in other countries. Thus, multiunit accidents are a more important consideration than in other countries and should be addressed appropriately. Currently, there are many issues related to a multiunit probabilistic safety assessment (PSA). One of them is the quantification of a multiunit PSA model. A traditional PSA uses a Boolean manipulation of the fault tree in terms of the minimal cut set. However, such methods have some limitations when rare event approximations cannot be used effectively or a very small truncation limit should be applied to identify accident sequence combinations for a multiunit site. In particular, it is well known that seismic risk in terms of core damage frequency can be overestimated because there are many events that have a high failure probability. In this study, we propose a quantification method based on a Monte Carlo approach for a multiunit PSA model. This method can consider all possible accident sequence combinations in a multiunit site and calculate a more exact value for events that have a high failure probability. An example model for six identical units at a site was also developed and quantified to confirm the applicability of the proposed method.
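The limitation of the rare-event approximation noted above is easy to see with a minimal sketch (hypothetical probabilities, not the paper's PSA model): for basic events with high failure probability, summing cut-set probabilities overestimates the top-event probability, while direct Monte Carlo sampling recovers the exact value:

```python
import random

def mc_top_event(p_events, n_samples=200_000, seed=7):
    """Direct Monte Carlo estimate of P(at least one basic event occurs),
    sampling every basic event independently in each trial."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples)
               if any(rng.random() < p for p in p_events))
    return hits / n_samples

# Two high-failure-probability basic events combined by an OR gate,
# as can occur in seismic PSA.
p = [0.4, 0.5]
rare_event_approx = sum(p)                  # 0.9: overestimates
exact = 1 - (1 - p[0]) * (1 - p[1])         # 0.7
mc_estimate = mc_top_event(p)               # close to the exact value
```

For small probabilities the two agree, which is why the rare-event approximation is standard in single-unit PSA; the gap opens precisely for the high-probability events highlighted in the abstract.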

  19. Iterative reconstruction using a Monte Carlo based system transfer matrix for dedicated breast positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

Saha, Krishnendu [Ohio Medical Physics Consulting, Dublin, Ohio 43017 (United States); Straus, Kenneth J.; Glick, Stephen J. [Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts 01655 (United States); Chen, Yu [Department of Radiation Oncology, Columbia University, New York, New York 10032 (United States)

    2014-08-28

To maximize sensitivity, it is desirable that ring Positron Emission Tomography (PET) systems dedicated to imaging the breast have a small bore. Unfortunately, due to parallax error this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation toward the periphery of the breast. The GATE Monte Carlo simulation software was used to accurately model the system matrix for a breast PET system. To increase the count statistics in the system matrix computation and to reduce the storage required for system elements, only a subset of matrix elements was calculated and the remaining elements were estimated by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at a 45% lower noise level and a 1.5- to 3-fold improvement in resolution when compared with MLEM reconstruction using a simple line-integral model. The GATE-based system matrix reconstruction technique promises to improve resolution and noise performance and to reduce image distortion at the FOV periphery compared with line-integral-based system matrix reconstruction.

  20. A technique for generating phase-space-based Monte Carlo beamlets in radiotherapy applications

    International Nuclear Information System (INIS)

    Bush, K; Popescu, I A; Zavgorodni, S

    2008-01-01

    As radiotherapy treatment planning moves toward Monte Carlo (MC) based dose calculation methods, the MC beamlet is becoming an increasingly common optimization entity. At present, methods used to produce MC beamlets have utilized a particle source model (PSM) approach. In this work we outline the implementation of a phase-space-based approach to MC beamlet generation that is expected to provide greater accuracy in beamlet dose distributions. In this approach a standard BEAMnrc phase space is sorted and divided into beamlets with particles labeled using the inheritable particle history variable. This is achieved with the use of an efficient sorting algorithm, capable of sorting a phase space of any size into the required number of beamlets in only two passes. Sorting a phase space of five million particles can be achieved in less than 8 s on a single-core 2.2 GHz CPU. The beamlets can then be transported separately into a patient CT dataset, producing separate dose distributions (doselets). Methods for doselet normalization and conversion of dose to absolute units of Gy for use in intensity modulated radiation therapy (IMRT) plan optimization are also described. (note)
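The two-pass sort described above is essentially a counting sort: the first pass counts each beamlet's population to fix its offset in the output, and the second pass scatters every particle directly into its final slot, so the cost is linear in the number of phase-space particles. A minimal sketch with hypothetical particle records and a hypothetical `beamlet_of` classifier:

```python
def sort_into_beamlets(particles, beamlet_of, n_beamlets):
    """Two-pass counting sort of a phase space into contiguous beamlets.

    Pass 1 histograms beamlet populations to compute each beamlet's
    start offset; pass 2 writes every particle directly to its final
    position, so only two passes over the data are needed."""
    # Pass 1: count particles per beamlet, then prefix-sum to offsets.
    counts = [0] * n_beamlets
    for p in particles:
        counts[beamlet_of(p)] += 1
    offsets, total = [], 0
    for c in counts:
        offsets.append(total)
        total += c
    # Pass 2: scatter each particle into its beamlet's region.
    out = [None] * len(particles)
    cursor = list(offsets)
    for p in particles:
        b = beamlet_of(p)
        out[cursor[b]] = p
        cursor[b] += 1
    return out, offsets

# Example: particles as (x, y) pairs, split into 2 beamlets by x position.
parts = [(0.1, 5), (0.9, 2), (0.2, 7), (0.8, 1)]
sorted_parts, offs = sort_into_beamlets(parts, lambda p: 0 if p[0] < 0.5 else 1, 2)
```

A disk-backed variant of the same idea explains how a multi-gigabyte BEAMnrc phase space can be partitioned without ever holding more than one particle record in memory per pass.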

  1. Monte Carlo dose calculation using a cell processor based PlayStation 3 system

    Energy Technology Data Exchange (ETDEWEB)

    Chow, James C L; Lam, Phil; Jaffray, David A, E-mail: james.chow@rmp.uhn.on.ca [Department of Radiation Oncology, University of Toronto and Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, Ontario M5G 2M9 (Canada)

    2012-02-09

This study investigates the performance of the EGSnrc computer code coupled with Cell-based hardware in Monte Carlo simulation of radiation dose in radiotherapy. Performance evaluations of two processor-intensive functions, namely HOWNEAR and RANMAR_GET in the EGSnrc code, were carried out based on the 20-80 rule (Pareto principle). The execution speeds of the two functions were measured with the gprof profiler, which reports the number of executions and the total time spent in each function. A testing architecture designed for the Cell processor was implemented in the evaluation using a PlayStation 3 (PS3) system. The evaluation results show that the algorithms examined are readily parallelizable on the Cell platform, provided that an architectural change to the EGSnrc is made. However, as the EGSnrc performance was limited by the PowerPC Processing Element in the PS3, a PC coupled with graphics processing units (GPGPU) may provide a more viable avenue for acceleration.

  2. The development of GPU-based parallel PRNG for Monte Carlo applications in CUDA Fortran

    Directory of Open Access Journals (Sweden)

    Hamed Kargaran

    2016-04-01

Full Text Available The implementation of Monte Carlo simulation in CUDA Fortran requires fast random number generation with good statistical properties on the GPU. In this study, a GPU-based parallel pseudo random number generator (GPPRNG) has been proposed for use in high performance computing systems. According to the type of GPU memory usage, the GPU scheme is divided into two work modes, GLOBAL_MODE and SHARED_MODE. To generate parallel random numbers based on the independent sequence method, a combination of the middle-square method and a chaotic map along with the Xorshift PRNG has been employed. Implementation of our developed PPRNG on a single GPU showed a speedup of 150x and 470x (with respect to the speed of the PRNG on a single CPU core) for GLOBAL_MODE and SHARED_MODE, respectively. To evaluate the accuracy of the developed GPPRNG, its performance was compared to that of some other commonly available PRNGs, such as those of MATLAB and FORTRAN and the Park-Miller algorithm, through standard statistical tests. The results of this comparison show that the developed GPPRNG can be used as a fast and accurate tool for computational science applications.

  3. The development of GPU-based parallel PRNG for Monte Carlo applications in CUDA Fortran

    Energy Technology Data Exchange (ETDEWEB)

Kargaran, Hamed, E-mail: h-kargaran@sbu.ac.ir; Minuchehr, Abdolhamid; Zolfaghari, Ahmad [Department of Nuclear Engineering, Shahid Beheshti University, Tehran, 1983969411 (Iran, Islamic Republic of)

    2016-04-15

The implementation of Monte Carlo simulation in CUDA Fortran requires fast random number generation with good statistical properties on the GPU. In this study, a GPU-based parallel pseudo random number generator (GPPRNG) has been proposed for use in high performance computing systems. According to the type of GPU memory usage, the GPU scheme is divided into two work modes, GLOBAL-MODE and SHARED-MODE. To generate parallel random numbers based on the independent sequence method, a combination of the middle-square method and a chaotic map along with the Xorshift PRNG has been employed. Implementation of our developed PPRNG on a single GPU showed a speedup of 150x and 470x (with respect to the speed of the PRNG on a single CPU core) for GLOBAL-MODE and SHARED-MODE, respectively. To evaluate the accuracy of the developed GPPRNG, its performance was compared to that of some other commonly available PRNGs, such as those of MATLAB and FORTRAN and the Park-Miller algorithm, through standard statistical tests. The results of this comparison show that the developed GPPRNG can be used as a fast and accurate tool for computational science applications.
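Of the generator components named above, only the Xorshift step is sketched here; this is Marsaglia's generic 32-bit xorshift, shown in Python for readability rather than in the paper's CUDA Fortran:

```python
def xorshift32(state):
    """One step of Marsaglia's 32-bit xorshift; state must be nonzero."""
    state ^= (state << 13) & 0xFFFFFFFF
    state ^= state >> 17
    state ^= (state << 5) & 0xFFFFFFFF
    return state

def uniform_stream(seed, n):
    """Yield n floats in [0, 1); on a GPU, each thread would advance
    its own independently seeded copy of the state."""
    x = seed & 0xFFFFFFFF
    assert x != 0, "xorshift must not be seeded with zero"
    for _ in range(n):
        x = xorshift32(x)
        yield x / 2 ** 32
```

The three shift-xor operations map cleanly onto integer hardware, which is why xorshift-family generators are popular building blocks for GPU Monte Carlo; the zero state is the one fixed point and must be excluded at seeding time.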

  4. New Monte Carlo-based method to evaluate fission fraction uncertainties for the reactor antineutrino experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ma, X.B., E-mail: maxb@ncepu.edu.cn; Qiu, R.M.; Chen, Y.X.

    2017-02-15

Uncertainties regarding fission fractions are essential in understanding antineutrino flux predictions in reactor antineutrino experiments. A new Monte Carlo-based method to evaluate the covariance coefficients between isotopes is proposed. The covariance coefficients are found to vary with reactor burnup and may change from positive to negative because of balance effects in fissioning; for example, the covariance coefficient between {sup 235}U and {sup 239}Pu changes from 0.15 to −0.13. Using the equation relating fission fraction and atomic density, consistent uncertainties in the fission fractions and covariance matrix were obtained. The antineutrino flux uncertainty is 0.55%, which does not vary with reactor burnup and is about 8.3% smaller than the previous estimate. - Highlights: • The covariance coefficients between isotopes versus reactor burnup may change sign because of two opposite effects. • The relation between fission fraction uncertainty and atomic density is studied for the first time. • A new MC-based method of evaluating the covariance coefficients between isotopes is proposed.
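The balance effect invoked above (fission fractions are constrained to sum to one, so an upward fluctuation in one isotope forces the others down) can be illustrated with a toy sampling sketch; the nominal fractions and perturbation widths below are hypothetical, not the paper's reactor data:

```python
import random

def corr(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

rng = random.Random(42)
f235, f239 = [], []
for _ in range(20_000):
    # Independent lognormal perturbations of hypothetical nominal yields;
    # normalizing forces the fractions to sum to one (the balance effect).
    u = 0.55 * rng.lognormvariate(0.0, 0.05)
    p = 0.30 * rng.lognormvariate(0.0, 0.05)
    o = 0.15 * rng.lognormvariate(0.0, 0.05)
    s = u + p + o
    f235.append(u / s)
    f239.append(p / s)

rho = corr(f235, f239)  # negative: when one fraction rises, the others must fall
```

Although the underlying perturbations are independent, the sum-to-one constraint alone induces a clearly negative correlation between the normalized fractions; competing physical effects during burnup can push the true coefficient back toward positive values, which is the sign change reported above.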

  5. Using gEUD based plan analysis method to evaluate proton vs. photon plans for lung cancer radiation therapy.

    Science.gov (United States)

    Xiao, Zhiyan; Zou, Wei J; Chen, Ting; Yue, Ning J; Jabbour, Salma K; Parikh, Rahul; Zhang, Miao

    2018-03-01

The goal of this study was to examine, for proton therapy, the efficacy of current DVH-based clinical guidelines drawn from photon experience in lung cancer radiation therapy. Comparison proton plans and IMRT plans were generated for 10 lung patients treated in our proton facility. A gEUD-based plan evaluation method was developed, using normal lung gEUD(a) curves in which the model parameter "a" was sampled over the range of values reported in the literature. For all patients, the proton plans delivered a lower normal lung V 5 Gy with similar V 20 Gy and similar target coverage. Based on current clinical guidelines, proton plans were ranked superior to IMRT plans for all 10 patients. However, the proton and IMRT normal lung gEUD(a) curves crossed for 8 patients within the tested range of "a", meaning the proton plan could be worse than the IMRT plan for lung sparing. A concept of deficiency index (DI) was introduced to quantify the probability of a proton plan doing worse than an IMRT plan. By applying a threshold on the DI, four patients' proton plans were ranked inferior to their IMRT plans; applying a threshold on the location of the curve crossing instead, six patients' proton plans were ranked inferior. The contradictory rankings between the current clinical guidelines and the gEUD(a) curve analysis demonstrate potential pitfalls in applying photon experience directly to the proton world. A comprehensive plan evaluation based on radiobiological models should be carried out to decide whether a lung patient would really benefit from proton therapy. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
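The gEUD(a) curve analysis rests on the standard generalized equivalent uniform dose, gEUD(a) = (Σ_i v_i d_i^a)^(1/a) over the DVH bins with relative volumes v_i. A minimal sketch with hypothetical DVHs, showing how two plans' gEUD(a) curves can cross within the tested range of "a":

```python
def gEUD(doses, volumes, a):
    """Generalized equivalent uniform dose for a DVH given as parallel
    lists of dose-bin values (Gy) and relative volumes summing to one."""
    assert abs(sum(volumes) - 1.0) < 1e-9
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

# Hypothetical normal-lung DVHs: the 'proton' plan spares the low-dose
# region (smaller V5-like bin), the 'imrt' plan has a smaller hot spot.
proton = ([0.0, 5.0, 20.0, 40.0], [0.55, 0.15, 0.20, 0.10])
imrt = ([0.0, 5.0, 20.0, 40.0], [0.35, 0.40, 0.20, 0.05])

# Sweeping `a` traces the gEUD(a) curves; in this toy example they cross
# between a = 0.5 and a = 1, so the plan ranking depends on the
# (uncertain) tissue parameter, which is the abstract's point.
curves = [(a, gEUD(*proton, a), gEUD(*imrt, a)) for a in (0.5, 1.0, 2.0, 4.0)]
```

Lower normal lung gEUD is better, so here the proton plan wins for small "a" (parallel, volume-effect behavior) and loses for larger "a", mirroring the curve crossings reported for 8 of the 10 patients.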

  6. Memory and learning behaviors mimicked in nanogranular SiO2-based proton conductor gated oxide-based synaptic transistors.

    Science.gov (United States)

    Wan, Chang Jin; Zhu, Li Qiang; Zhou, Ju Mei; Shi, Yi; Wan, Qing

    2013-11-07

    In neuroscience, signal processing, memory and learning function are established in the brain by modifying ionic fluxes in neurons and synapses. Emulation of memory and learning behaviors of biological systems by nanoscale ionic/electronic devices is highly desirable for building neuromorphic systems or even artificial neural networks. Here, novel artificial synapses based on junctionless oxide-based protonic/electronic hybrid transistors gated by nanogranular phosphorus-doped SiO2-based proton-conducting films are fabricated on glass substrates by a room-temperature process. Short-term memory (STM) and long-term memory (LTM) are mimicked by tuning the pulse gate voltage amplitude. The LTM process in such an artificial synapse is due to the proton-related interfacial electrochemical reaction. Our results are highly desirable for building future neuromorphic systems or even artificial networks via electronic elements.

7. Music World: "European Month of Music" in Basel. Leif Ove Andsnes in London. Competition prizes. Monte Pederson has died / Priit Kuusk

    Index Scriptorium Estoniae

    Kuusk, Priit, 1938-

    2001-01-01

In November the Swiss city of Basel lives under the banner of the "European Month of Music". The young Norwegian pianist Leif Ove Andsnes was invited to perform in London. Competition prizes from various competitions. The American singer Monte Pederson has died

  8. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin

    2015-01-01

Highlights: • The newly developed CAD-based Monte Carlo program SuperMC for integrated simulation of nuclear systems makes use of the hybrid MC-deterministic method and advanced computer technologies. SuperMC is designed to perform transport calculations for various types of particles; depletion and activation calculations including isotope burn-up, material activation and shutdown dose; and multi-physics coupling calculations including thermo-hydraulics, fuel performance and structural mechanics. Bi-directional automatic conversion between general CAD models and physical settings and calculation models is well supported. Results and the simulation process can be visualized with dynamic 3D datasets and geometry models. Continuous-energy cross section, burnup, activation, irradiation damage and material data etc. are used to support the multi-process simulation. An advanced cloud computing framework makes computation- and storage-intensive simulation available as a network service to support design optimization and assessment. The modular design and generic interfaces promote flexible manipulation and coupling of external solvers. • The newly developed and incorporated advanced methods in SuperMC are introduced, including the hybrid MC-deterministic transport method, particle physical interaction treatment method, multi-physics coupling calculation method, automatic geometry modeling and processing method, intelligent data analysis and visualization method, elastic cloud computing technology and parallel calculation method. • The functions of SuperMC 2.1, integrating automatic modeling, neutron and photon transport calculation, and results and process visualization, are introduced. It has been validated by using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model. - Abstract: The Monte Carlo (MC) method has distinct advantages in simulating complicated nuclear systems and is envisioned as a routine

  9. TH-E-BRE-08: GPU-Monte Carlo Based Fast IMRT Plan Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Li, Y; Tian, Z; Shi, F; Jiang, S; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-15

Purpose: Intensity-modulated radiation treatment (IMRT) plan optimization needs pre-calculated beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computation speed. However, inaccurate beamlet dose distributions, particularly in cases with high levels of inhomogeneity, may mislead optimization, hindering the resulting plan quality. It is desirable to use Monte Carlo (MC) methods for beamlet dose calculations, yet the long computational time of repeated dose calculations for a large number of beamlets prevents this application. Our objective is to integrate a GPU-based MC dose engine in lung IMRT optimization using a novel two-step workflow. Methods: A GPU-based MC code, gDPM, is used. Each particle is tagged with the index of the beamlet its source particle comes from, and deposited doses are stored separately per beamlet based on this index. Due to the limited GPU memory size, a pyramid space is allocated for each beamlet, and dose outside this space is neglected. A two-step optimization workflow is proposed for fast MC-based optimization. In the first step, rough beamlet dose calculations are conducted with only a small number of particles per beamlet, and plan optimization follows to obtain an approximate fluence map. In the second step, more accurate beamlet doses are calculated, with the number of particles sampled for each beamlet proportional to the intensity determined previously; a second round of optimization then yields the final plan. Results: For a lung case with 5317 beamlets, 10{sup 5} particles per beamlet in the first round and 10{sup 8} particles per beam in the second round are enough to obtain a good plan quality. The total simulation time is 96.4 sec. Conclusion: A fast GPU-based MC dose calculation method along with a novel two-step optimization workflow has been developed. The high efficiency allows the use of MC for IMRT optimization.

  10. Efficiency of respiratory-gated delivery of synchrotron-based pulsed proton irradiation

    International Nuclear Information System (INIS)

    Tsunashima, Yoshikazu; Vedam, Sastry; Dong, Lei; Bues, Martin; Balter, Peter; Smith, Alfred; Mohan, Radhe; Umezawa, Masumi; Sakae, Takeji

    2008-01-01

    Significant differences exist in respiratory-gated proton beam delivery with a synchrotron-based accelerator system when compared to photon therapy with a conventional linear accelerator. Delivery of protons with a synchrotron accelerator is governed by a magnet excitation cycle pattern. Optimal synchronization of the magnet excitation cycle pattern with the respiratory motion pattern is critical to the efficiency of respiratory-gated proton delivery. There has been little systematic analysis to optimize the accelerator's operational parameters to improve gated treatment efficiency. The goal of this study was to estimate the overall efficiency of respiratory-gated synchrotron-based proton irradiation through realistic simulation. Using 62 respiratory motion traces from 38 patients, we simulated respiratory gating for duty cycles of 30%, 20% and 10% around peak exhalation for various fixed and variable magnet excitation patterns. In each case, the time required to deliver 100 monitor units in both non-gated and gated irradiation scenarios was determined. Based on results from this study, the minimum time required to deliver 100 MU was 1.1 min for non-gated irradiation. For respiratory-gated delivery at a 30% duty cycle around peak exhalation, corresponding average delivery times were typically three times longer with a fixed magnet excitation cycle pattern. However, when a variable excitation cycle was allowed in synchrony with the patient's respiratory cycle, the treatment time only doubled. Thus, respiratory-gated delivery of synchrotron-based pulsed proton irradiation is feasible and more efficient when a variable magnet excitation cycle pattern is used

  11. Application of the measurement-based Monte Carlo method in nasopharyngeal cancer patients for intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Yeh, C.Y.; Lee, C.C.; Chao, T.C.; Lin, M.H.; Lai, P.A.; Liu, F.H.; Tung, C.J.

    2014-01-01

    This study aims to utilize a measurement-based Monte Carlo (MBMC) method to evaluate the accuracy of dose distributions calculated using the Eclipse radiotherapy treatment planning system (TPS) based on the anisotropic analytical algorithm. Dose distributions were calculated for the nasopharyngeal carcinoma (NPC) patients treated with the intensity modulated radiotherapy (IMRT). Ten NPC IMRT plans were evaluated by comparing their dose distributions with those obtained from the in-house MBMC programs for the same CT images and beam geometry. To reconstruct the fluence distribution of the IMRT field, an efficiency map was obtained by dividing the energy fluence of the intensity modulated field by that of the open field, both acquired from an aS1000 electronic portal imaging device. The integrated image of the non-gated mode was used to acquire the full dose distribution delivered during the IMRT treatment. This efficiency map redistributed the particle weightings of the open field phase-space file for IMRT applications. Dose differences were observed in the tumor and air cavity boundary. The mean difference between MBMC and TPS in terms of the planning target volume coverage was 0.6% (range: 0.0–2.3%). The mean difference for the conformity index was 0.01 (range: 0.0–0.01). In conclusion, the MBMC method serves as an independent IMRT dose verification tool in a clinical setting. - Highlights: ► The patient-based Monte Carlo method serves as a reference standard to verify IMRT doses. ► 3D Dose distributions for NPC patients have been verified by the Monte Carlo method. ► Doses predicted by the Monte Carlo method matched closely with those by the TPS. ► The Monte Carlo method predicted a higher mean dose to the middle ears than the TPS. ► Critical organ doses should be confirmed to avoid overdose to normal organs

  12. Reporting and analyzing statistical uncertainties in Monte Carlo-based treatment planning

    International Nuclear Information System (INIS)

    Chetty, Indrin J.; Rosu, Mihaela; Kessler, Marc L.; Fraass, Benedick A.; Haken, Randall K. ten; Kong, Feng-Ming; McShan, Daniel L.

    2006-01-01

    Purpose: To investigate methods of reporting and analyzing statistical uncertainties in doses to targets and normal tissues in Monte Carlo (MC)-based treatment planning. Methods and Materials: Methods for quantifying statistical uncertainties in dose, such as uncertainty specification to specific dose points, or to volume-based regions, were analyzed in MC-based treatment planning for 5 lung cancer patients. The effect of statistical uncertainties on target and normal tissue dose indices was evaluated. The concept of uncertainty volume histograms for targets and organs at risk was examined, along with its utility, in conjunction with dose volume histograms, in assessing the acceptability of the statistical precision in dose distributions. The uncertainty evaluation tools were extended to four-dimensional planning for application on multiple instances of the patient geometry. All calculations were performed using the Dose Planning Method MC code. Results: For targets, generalized equivalent uniform doses and mean target doses converged at 150 million simulated histories, corresponding to relative uncertainties of less than 2% in the mean target doses. For the normal lung tissue (a volume-effect organ), mean lung dose and normal tissue complication probability converged at 150 million histories despite the large range in the relative organ uncertainty volume histograms. For 'serial' normal tissues such as the spinal cord, large fluctuations exist in point dose relative uncertainties. Conclusions: The tools presented here provide useful means for evaluating statistical precision in MC-based dose distributions. Tradeoffs between uncertainties in doses to targets, volume-effect organs, and 'serial' normal tissues must be considered carefully in determining acceptable levels of statistical precision in MC-computed dose distributions
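A common way to obtain such statistical uncertainties (not necessarily the implementation used with the DPM code above) is the batch method: histories are grouped into batches, and the spread of the batch means estimates the standard error of the mean dose in a voxel or region. A sketch with a hypothetical per-history scoring function:

```python
import random

def batch_relative_uncertainty(score, n_batches=20, histories_per_batch=5000, seed=3):
    """Relative standard error of a mean dose estimated from batch means:
    the sample variance of the n_batches batch averages estimates the
    Monte Carlo uncertainty without storing per-history scores."""
    rng = random.Random(seed)
    means = [sum(score(rng) for _ in range(histories_per_batch)) / histories_per_batch
             for _ in range(n_batches)]
    grand = sum(means) / n_batches
    var = sum((m - grand) ** 2 for m in means) / (n_batches - 1)
    return (var / n_batches) ** 0.5 / grand

# Hypothetical per-history energy deposition in one voxel: most histories
# miss the voxel entirely, which is what drives the variance for small,
# 'serial'-organ-like scoring regions.
def deposit(rng):
    return rng.expovariate(1.0) if rng.random() < 0.05 else 0.0

rel_unc = batch_relative_uncertainty(deposit)  # on the order of a few percent
```

Applying the same estimator voxel by voxel yields exactly the kind of relative-uncertainty maps from which the uncertainty volume histograms described above are built.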

  13. A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom

    International Nuclear Information System (INIS)

    Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H.

    2014-08-01

As the most accurate method of estimating absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered: one composed of a homogeneous water medium, the second of bone, the third of lung, and the fourth of a heterogeneous bone and vacuum geometry. Simulations were done considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for transport simulation. The first forces the photon to stop at every voxel boundary; the second is the Woodcock method, in which a stop at a boundary is considered only according to the change of material along the photon's travel line. Dose calculations using these methods are compared for validation with the Penelope and MCNP5 codes. Speed-up factors are compared using an NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)

  14. A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom

    Energy Technology Data Exchange (ETDEWEB)

    Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H., E-mail: mbellezzo@gmail.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

As the most accurate method of estimating absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered: one composed of a homogeneous water medium, the second of bone, the third of lung, and the fourth of a heterogeneous bone and vacuum geometry. Simulations were done considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for transport simulation. The first forces the photon to stop at every voxel boundary; the second is the Woodcock method, in which a stop at a boundary is considered only according to the change of material along the photon's travel line. Dose calculations using these methods are compared for validation with the Penelope and MCNP5 codes. Speed-up factors are compared using an NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)
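The Woodcock method mentioned above can be sketched in one dimension: the photon flies with the majorant cross section of the whole geometry, and each tentative collision is accepted with probability σ(x)/σ_max, so voxel boundaries never force a stop. The two-material slab below is a hypothetical example, not the CUBMC implementation:

```python
import math
import random

def woodcock_collision_depth(sigma_of, sigma_max, rng):
    """Sample the depth of the next *real* collision in 1-D by Woodcock
    (delta) tracking: draw free paths against the majorant cross section
    sigma_max and accept each tentative collision with probability
    sigma(x)/sigma_max; rejected ones are 'virtual' collisions, and no
    stop is ever forced at a material boundary."""
    x = 0.0
    while True:
        x += -math.log(1.0 - rng.random()) / sigma_max
        if rng.random() < sigma_of(x) / sigma_max:
            return x

# Hypothetical two-material slab: low-density 'lung' for x < 5, then 'bone'.
sigma = lambda x: 0.05 if x < 5.0 else 0.5
rng = random.Random(11)
depths = [woodcock_collision_depth(sigma, 0.5, rng) for _ in range(50_000)]
mean_depth = sum(depths) / len(depths)  # analytic mean is about 5.98
```

The trade-off described in the abstract follows directly: boundary stopping does bookkeeping at every voxel face, while Woodcock tracking replaces those stops with cheap virtual collisions, which pays off when the majorant is not much larger than the local cross sections.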

  15. Study on the propagation properties of laser in aerosol based on Monte Carlo simulation

    Science.gov (United States)

    Leng, Kun; Wu, Wenyuan; Zhang, Xi; Gong, Yanchun; Yang, Yuntao

    2018-02-01

When a laser propagates in the atmosphere, aerosol scattering and absorption continuously attenuate its energy, reducing the effectiveness of the laser. Based on the Monte Carlo method, the dependence of the spatial photon energy distribution of a 10.6 μm laser in marine, sand-type, water-soluble and soot aerosols on propagation distance, visibility and divergence angle was studied. The results show that, for the 10.6 μm laser, the attenuation of photons arriving at the receiving plane is greatest for the sand-type aerosol and smallest for the water-soluble aerosol; as the propagation distance increases, the number of photons arriving at the receiving plane decreases; as visibility increases, the number of photons arriving at the receiving plane increases rapidly and then stabilizes; in the above cases, the photon energy distribution does not deviate from a Gaussian distribution; as the divergence angle increases, the number of photons arriving at the receiving plane is almost unchanged, but the photon energy distribution gradually deviates from the Gaussian distribution.

  16. IVF cycle cost estimation using Activity Based Costing and Monte Carlo simulation.

    Science.gov (United States)

    Cassettari, Lucia; Mosca, Marco; Mosca, Roberto; Rolando, Fabio; Costa, Mauro; Pisaturo, Valerio

    2016-03-01

    The authors present a new methodological approach, in a stochastic regime, to determine the actual costs of a healthcare process. The paper specifically shows the application of the methodology to determining the cost of an assisted reproductive technology (ART) treatment in Italy. The motivation for this research is that a deterministic approach is inadequate for an accurate estimate of the cost of this particular treatment: the durations of the different activities involved are not fixed and are described by frequency distributions. Hence the need to determine, in addition to the mean value of the cost, the interval within which it can be expected to vary with a known confidence level. The cost obtained for each type of cycle investigated (in vitro fertilization and embryo transfer, with or without intracytoplasmic sperm injection) shows tolerance intervals around the mean value sufficiently narrow to make the data statistically robust and therefore usable as a reference for benchmarking against other countries. Methodologically, the approach is rigorous: Activity Based Costing was used to determine the cost of the individual activities of the process, and Monte Carlo simulation, with control of the experimental error, was used to construct the tolerance intervals on the final result.
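
    The combination of Activity Based Costing with Monte Carlo sampling can be sketched as follows; the activities, hourly rates and duration distributions below are invented for illustration and are not the paper's data:

```python
import random
import statistics

rng = random.Random(0)

# Hypothetical activities: hourly cost (EUR), mean duration (h), sd (h).
activities = {
    "oocyte retrieval":  (300.0, 0.75, 0.15),
    "lab fertilisation": (120.0, 2.00, 0.40),
    "embryo transfer":   (250.0, 0.50, 0.10),
}

def one_cycle_cost():
    """Cost of one simulated cycle: rate x stochastic duration per activity."""
    total = 0.0
    for rate, mean_h, sd_h in activities.values():
        total += rate * max(0.0, rng.gauss(mean_h, sd_h))
    return total

n = 20_000
costs = sorted(one_cycle_cost() for _ in range(n))
mean_cost = statistics.fmean(costs)
lo, hi = costs[int(0.025 * n)], costs[int(0.975 * n)]  # 95% interval
```

    Each replication prices the whole cycle from sampled activity durations; the percentile interval (lo, hi) is the stochastic analogue of the paper's tolerance interval around the mean cost.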

  17. Monte Carlo simulation of ordering transformations in Ni-Mo-based alloys

    International Nuclear Information System (INIS)

    Kulkarni, U.D.

    2004-01-01

    The quenched-in state of short range order (SRO) in binary Ni-Mo alloys is characterized by intensity maxima at {1 (1/2) 0} and equivalent positions in reciprocal space. Ternary addition of a small amount of Al to the binary alloy, on the other hand, leads to a state of SRO that gives rise to intensity maxima at {1 0 0} and equivalent positions, in addition to {1 (1/2) 0} and equivalent positions, in the selected area electron diffraction patterns. Different geometric patterns of streaks of diffuse intensity, joining the SRO maxima with the superlattice positions of the emerging long range ordered (LRO) structures, or in some cases joining the superlattice positions of different LRO structures, are observed during the SRO-to-LRO transitions in the Ni-Mo-based and other {1 (1/2) 0} alloys. Monte Carlo simulations have been carried out here in order to shed some light on the atomic structures of the SRO and the SRO-to-LRO transition states in these alloys
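
    A toy version of such an ordering simulation — Kawasaki-style Metropolis exchanges on a 2-D binary lattice with an invented pair interaction favouring unlike neighbours — shows how an ordering tendency develops from a quenched random configuration. It is a sketch only, not the actual Ni-Mo Hamiltonian or lattice:

```python
import math
import random

rng = random.Random(7)
L = 16          # lattice edge; 50/50 A/B alloy, periodic boundaries
V = 1.0         # energy cost per like nearest-neighbour pair (favours ordering)
T = 0.5         # temperature in units of V / k_B, below the ordering transition

sites = [1] * (L * L // 2) + [-1] * (L * L // 2)
rng.shuffle(sites)                      # quenched random starting configuration
lat = [sites[i * L:(i + 1) * L] for i in range(L)]

def neighbours(i, j):
    return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

def site_energy(i, j):
    return sum(V for a, b in neighbours(i, j) if lat[a][b] == lat[i][j])

def unlike_fraction():
    """Fraction of nearest-neighbour pairs that are unlike (1.0 = perfect order)."""
    unlike = total = 0
    for i in range(L):
        for j in range(L):
            for a, b in ((i + 1) % L, j), (i, (j + 1) % L):
                total += 1
                unlike += lat[a][b] != lat[i][j]
    return unlike / total

before = unlike_fraction()
for _ in range(150_000):
    # Kawasaki-style move: swap two random unlike sites (conserves composition).
    i1, j1 = rng.randrange(L), rng.randrange(L)
    i2, j2 = rng.randrange(L), rng.randrange(L)
    if lat[i1][j1] == lat[i2][j2]:
        continue
    e_old = site_energy(i1, j1) + site_energy(i2, j2)
    lat[i1][j1], lat[i2][j2] = lat[i2][j2], lat[i1][j1]
    e_new = site_energy(i1, j1) + site_energy(i2, j2)
    if e_new > e_old and rng.random() >= math.exp(-(e_new - e_old) / T):
        lat[i1][j1], lat[i2][j2] = lat[i2][j2], lat[i1][j1]  # reject the move
after = unlike_fraction()
```

    The swap moves conserve composition, as atom exchanges in a real alloy do, and the unlike-pair fraction grows toward the ordered superstructure value as the quenched state relaxes.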

  18. Voxel-based Monte Carlo simulation of X-ray imaging and spectroscopy experiments

    International Nuclear Information System (INIS)

    Bottigli, U.; Brunetti, A.; Golosio, B.; Oliva, P.; Stumbo, S.; Vincze, L.; Randaccio, P.; Bleuet, P.; Simionovici, A.; Somogyi, A.

    2004-01-01

    A Monte Carlo code for the simulation of X-ray imaging and spectroscopy experiments in heterogeneous samples is presented. The energy spectrum, polarization and profile of the incident beam can be defined so that X-ray tube systems as well as synchrotron sources can be simulated. The sample is modeled as a 3D regular grid. The chemical composition and density are given at each point of the grid. Photoelectric absorption, fluorescent emission, elastic and inelastic scattering are included in the simulation. The core of the simulation is a fast routine for the calculation of the path lengths of the photon trajectory intersections with the grid voxels. The voxel representation is particularly useful for samples that cannot be well described by a small set of polyhedra. This is the case of most naturally occurring samples. In such cases, voxel-based simulations are much less expensive in terms of computational cost than simulations on a polygonal representation. The efficient scheme used for calculating the path lengths in the voxels and the use of variance reduction techniques make the code suitable for the detailed simulation of complex experiments on generic samples in a relatively short time. Examples of applications to X-ray imaging and spectroscopy experiments are discussed
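
    The core geometric routine can be sketched as follows: for a ray through a regular grid, collect the crossing parameters with all grid lines, sort them, and attribute each sub-segment to the voxel containing its midpoint. This is a simplified 2-D stand-in for the fast traversal used by the code, not its actual implementation:

```python
import math

def voxel_path_lengths(p0, p1, nx, ny):
    """Path length of segment p0->p1 in each cell of an nx x ny unit-voxel grid.

    Crossing parameters with all grid lines are merged and sorted; each
    sub-segment is attributed to the voxel containing its midpoint.
    """
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    ts = {0.0, 1.0}
    for i in range(nx + 1):                 # vertical grid lines x = i
        if dx:
            t = (i - p0[0]) / dx
            if 0.0 < t < 1.0:
                ts.add(t)
    for j in range(ny + 1):                 # horizontal grid lines y = j
        if dy:
            t = (j - p0[1]) / dy
            if 0.0 < t < 1.0:
                ts.add(t)
    ts = sorted(ts)
    lengths = {}
    for ta, tb in zip(ts, ts[1:]):
        tm = 0.5 * (ta + tb)                # midpoint decides the voxel
        vox = (int(p0[0] + tm * dx), int(p0[1] + tm * dy))
        if 0 <= vox[0] < nx and 0 <= vox[1] < ny:
            lengths[vox] = lengths.get(vox, 0.0) + (tb - ta) * length
    return lengths

# Diagonal of a 2 x 2 grid: the ray visits voxels (0,0) and (1,1), sqrt(2) each.
diag = voxel_path_lengths((0.0, 0.0), (2.0, 2.0), 2, 2)
```

    Per-voxel path lengths multiplied by per-voxel attenuation coefficients then give the optical depth of the trajectory, which is the quantity the transport loop needs.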

  19. Learning Algorithm of Boltzmann Machine Based on Spatial Monte Carlo Integration Method

    Directory of Open Access Journals (Sweden)

    Muneki Yasuda

    2018-04-01

    The machine learning techniques for Markov random fields are fundamental in various fields involving pattern recognition, image processing, sparse modeling, and earth science, and the Boltzmann machine is one of the most important models among Markov random fields. However, the inference and learning problems in the Boltzmann machine are NP-hard. The investigation of effective learning algorithms for the Boltzmann machine is one of the most important challenges in the field of statistical machine learning. In this paper, we study Boltzmann machine learning based on the (first-order) spatial Monte Carlo integration method, referred to as the 1-SMCI learning method, which was proposed in the author's previous paper. In the first part of this paper, we compare the method with maximum pseudo-likelihood estimation (MPLE) using theoretical and numerical approaches, and show that the 1-SMCI learning method is more effective than MPLE. In the latter part, we compare the 1-SMCI learning method with other effective methods, ratio matching and minimum probability flow, in a numerical experiment, and show that the 1-SMCI learning method outperforms them.

  20. Monte Carlo efficiency calibration of a neutron generator-based total-body irradiator

    International Nuclear Information System (INIS)

    Shypailo, R.J.; Ellis, K.J.

    2009-01-01

    Many body composition measurement systems are calibrated against a single-sized reference phantom. Prompt-gamma neutron activation (PGNA) provides the only direct measure of total body nitrogen (TBN), an index of the body's lean tissue mass. In PGNA systems, body size influences neutron flux attenuation, induced gamma signal distribution, and counting efficiency. Thus, calibration based on a single-sized phantom could result in inaccurate TBN values. We used Monte Carlo simulations (MCNP-5; Los Alamos National Laboratory) in order to map a system's response to the range of body weights (65-160 kg) and body fat distributions (25-60%) in obese humans. Calibration curves were constructed to derive body-size correction factors relative to a standard reference phantom, providing customized adjustments to account for differences in body habitus of obese adults. The use of MCNP-generated calibration curves should allow for a better estimate of the true changes in lean tissue mass that may occur during intervention programs focused only on weight loss. (author)

  1. Optimization of Control Points Number at Coordinate Measurements based on the Monte-Carlo Method

    Science.gov (United States)

    Korolev, A. A.; Kochetkov, A. V.; Zakharov, O. V.

    2018-01-01

    Improving the quality of products increases the requirements for the accuracy of the dimensions and shape of workpiece surfaces. This, in turn, raises the requirements for the accuracy and productivity of workpiece measurement. Coordinate measuring machines are currently the most effective measuring tools for such problems. The article proposes a method for optimizing the number of control points using Monte Carlo simulation. Based on the measurement of a small sample from a batch of workpieces, statistical modeling is performed, which allows one to obtain interval estimates of the measurement error. This approach is demonstrated with examples of applications for flatness, cylindricity and sphericity. Four variants of uniform and non-uniform arrangement of control points are considered and compared. It is shown that as the number of control points decreases, the arithmetic mean of the measured deviation decreases, while the standard deviation of the measurement error and the probability of a type I (α) measurement error increase. Overall, it is established that the number of control points can be reduced several-fold while maintaining the required measurement accuracy.
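
    The reported trend can be reproduced with a minimal simulation, here for flatness only, assuming Gaussian surface deviations (all numbers illustrative, not the paper's data): the peak-to-valley estimate shrinks and its spread grows as control points are removed.

```python
import random
import statistics

rng = random.Random(3)

def flatness_estimate(n_points, sigma_um=1.0):
    """Peak-to-valley flatness from n_points Gaussian surface deviations (um)."""
    z = [rng.gauss(0.0, sigma_um) for _ in range(n_points)]
    return max(z) - min(z)

def error_stats(n_points, trials=5_000):
    """Monte Carlo mean and spread of the flatness estimate."""
    est = [flatness_estimate(n_points) for _ in range(trials)]
    return statistics.fmean(est), statistics.stdev(est)

mean_few, sd_few = error_stats(5)      # sparse sampling of the surface
mean_many, sd_many = error_stats(50)   # dense sampling of the surface
```

    With few points the range statistic systematically underestimates the true deviation (lower mean) while its trial-to-trial variability rises, which is exactly the trade-off the optimization must balance.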

  2. Ab initio based kinetic Monte-Carlo simulations of phase transformations in FeCrAl

    International Nuclear Information System (INIS)

    Olsson, Paer

    2015-01-01

    Document available in abstract form only, full text follows: Corrosion and erosion in lead cooled reactors can be a serious issue due to the high operating temperature and the necessary flow rates. FeCrAl alloys are under consideration as cladding or as a coating for stainless steel cladding tubes for lead cooled reactor concepts. The alumina scale that is formed, as Al segregates to the surface and Fe- and Cr-rich oxides break off, offers a highly protective layer against lead corrosion over a large range of temperatures. However, there are concerns about the phase stability of the alloy under irradiation conditions and about possible induced alpha-prime precipitation. Here a theoretical model of the ternary FeCrAl alloy is presented, based on density functional theory predictions and linked to a kinetic Monte-Carlo simulation framework. The effect of Al on the FeCr miscibility properties is discussed and the coupling of irradiation-induced defects with the solutes is treated. Simulations of the micro-structure evolution are tentatively compared to available experiments. (authors)
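
    The engine behind such atomistic kinetic Monte-Carlo studies is typically a residence-time (BKL/Gillespie) loop: pick an event with probability proportional to its rate, then advance the clock by an exponentially distributed waiting time. The sketch below uses invented migration barriers, not the paper's DFT values:

```python
import math
import random

rng = random.Random(5)

# Hypothetical thermally activated jump rates (1/s) for the atoms neighbouring
# a vacancy, rate_i = nu0 * exp(-E_i / kT); barriers and kT are illustrative.
nu0, kT = 1.0e13, 0.05           # attempt frequency (1/s), k_B*T (eV)
barriers = [0.60, 0.62, 0.65, 0.70]
rates = [nu0 * math.exp(-e / kT) for e in barriers]

def kmc_step():
    """One residence-time step: pick event ~ rate, advance the clock."""
    total = sum(rates)
    r = rng.random() * total
    event, acc = 0, 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            event = i
            break
    dt = -math.log(1.0 - rng.random()) / total   # exponential waiting time
    return event, dt

n = 50_000
picks = [0] * len(rates)
t = 0.0
for _ in range(n):
    e, dt = kmc_step()
    picks[e] += 1
    t += dt
mean_dt = t / n                  # should approach 1 / sum(rates)
```

    Low-barrier events dominate the selection, and the average time step recovers the inverse total rate — the two properties a KMC driver must satisfy.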

  3. Poster - 20: Detector selection for commissioning of a Monte Carlo based electron dose calculation algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Anusionwu, Princess [Medical Physics, CancerCare Manitoba, Winnipeg Canada (Canada); Department of Physics & Astronomy, University of Manitoba, Winnipeg Canada (Canada); Alpuche Aviles, Jorge E. [Medical Physics, CancerCare Manitoba, Winnipeg Canada (Canada); Pistorius, Stephen [Medical Physics, CancerCare Manitoba, Winnipeg Canada (Canada); Department of Physics & Astronomy, University of Manitoba, Winnipeg Canada (Canada); Department of Radiology, University of Manitoba, Winnipeg (Canada)

    2016-08-15

    Objective: Commissioning of a Monte Carlo based electron dose calculation algorithm requires percentage depth doses (PDDs) and beam profiles, which can be measured with multiple detectors. Electron dosimetry is commonly performed with cylindrical chambers, but parallel plate chambers and diodes can also be used. The purpose of this study was to determine the most appropriate detector for the commissioning measurements. Methods: PDDs and beam profiles were measured for beams with energies ranging from 6 MeV to 15 MeV and field sizes ranging from 6 cm × 6 cm to 40 cm × 40 cm. Detectors used included diodes, cylindrical and parallel plate ionization chambers. Beam profiles were measured in water (100 cm source to surface distance) and in air (95 cm source to detector distance). Results: PDDs for the cylindrical chambers were shallower (by 1.3 mm, averaged over all energies and field sizes) than those measured with the parallel plate chambers and diodes. Surface doses measured with the diode and cylindrical chamber were on average larger by 1.6% and 3%, respectively, than those of the parallel plate chamber. Profiles measured with a diode resulted in penumbra values smaller than those measured with the cylindrical chamber by 2 mm. Conclusion: The diode was selected as the most appropriate detector since its PDDs agreed with those measured with parallel plate chambers (typically recommended for low energies) and it results in sharper profiles. Unlike ion chambers, the diode needs no corrections to measure PDDs, making it more convenient to use.

  4. dsmcFoam+: An OpenFOAM based direct simulation Monte Carlo solver

    Science.gov (United States)

    White, C.; Borg, M. K.; Scanlon, T. J.; Longshaw, S. M.; John, B.; Emerson, D. R.; Reese, J. M.

    2018-03-01

    dsmcFoam+ is a direct simulation Monte Carlo (DSMC) solver for rarefied gas dynamics, implemented within the OpenFOAM software framework, and parallelised with MPI. It is open-source and released under the GNU General Public License in a publicly available software repository that includes detailed documentation and tutorial DSMC gas flow cases. This release of the code includes many features not found in standard dsmcFoam, such as molecular vibrational and electronic energy modes, chemical reactions, and subsonic pressure boundary conditions. Since dsmcFoam+ is designed entirely within OpenFOAM's C++ object-oriented framework, it benefits from a number of key features: the code emphasises extensibility and flexibility so it is aimed first and foremost as a research tool for DSMC, allowing new models and test cases to be developed and tested rapidly. All DSMC cases are as straightforward as setting up any standard OpenFOAM case, as dsmcFoam+ relies upon the standard OpenFOAM dictionary based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of a DSMC simulation is not typical of most OpenFOAM applications. We show that dsmcFoam+ compares well to other well-known DSMC codes and to analytical solutions in terms of benchmark results.

  5. Cost-effectiveness of targeted screening for abdominal aortic aneurysm. Monte Carlo-based estimates.

    Science.gov (United States)

    Pentikäinen, T J; Sipilä, T; Rissanen, P; Soisalon-Soininen, S; Salo, J

    2000-01-01

    This article reports a cost-effectiveness analysis of targeted screening for abdominal aortic aneurysm (AAA). A major emphasis was on the estimation of distributions of costs and effectiveness. We performed a Monte Carlo simulation using the C programming language in a PC environment. Data on survival and costs, and the majority of screening probabilities, were from our own empirical studies. Natural history data were based on the literature. Each screened male gained 0.07 life-years at an incremental cost of FIM 3,300. The expected values differed significantly from zero. For females, the expected gain was 0.02 life-years at an incremental cost of FIM 1,100, which was not statistically significant. Cost-effectiveness ratios and their 95% confidence intervals were FIM 48,000 (27,000-121,000) and FIM 54,000 (22,000-infinity) for males and females, respectively. Sensitivity analysis revealed that the results for males were stable. Individual variation in life-year gains was high. Males seemed to benefit from targeted AAA screening, and the results were stable. As long as the cost-effectiveness ratio is considered acceptable, screening males seemed justified. However, our assumptions about the growth and rupture behavior of AAAs might be improved with further clinical and epidemiological studies. As a point estimate, females benefited in a similar manner, but the results were not statistically significant. The evidence of this study did not justify screening of females.
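
    A generic Monte Carlo construction of a confidence interval for a cost-effectiveness ratio might look as follows; the distributions are invented, loosely anchored to the reported male point estimates (0.07 life-years at FIM 3,300), and are not the paper's actual model:

```python
import random
import statistics

rng = random.Random(11)

def simulate_person():
    """Hypothetical per-screenee outcome: (incremental cost FIM, life-years).

    Distributions are illustrative only; the wide gain distribution mirrors
    the reported high individual variation in life-year gains.
    """
    cost = max(0.0, rng.gauss(3300.0, 800.0))
    gain = rng.gauss(0.07, 0.25)
    return cost, gain

def icer_interval(n_people=1000, n_reps=500):
    """Percentile interval for the cost-effectiveness ratio via Monte Carlo."""
    ratios = []
    for _ in range(n_reps):
        sample = [simulate_person() for _ in range(n_people)]
        mean_cost = statistics.fmean(c for c, _ in sample)
        mean_gain = statistics.fmean(g for _, g in sample)
        ratios.append(mean_cost / mean_gain)
    ratios.sort()
    return ratios[int(0.025 * n_reps)], ratios[int(0.975 * n_reps)]

lo, hi = icer_interval()
```

    Repeating the cohort simulation many times yields the sampling distribution of the ratio directly, so the interval needs no normality assumption — the reason the article emphasizes distributions rather than point estimates.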

  6. Voxel-based Monte Carlo simulation of X-ray imaging and spectroscopy experiments

    Energy Technology Data Exchange (ETDEWEB)

    Bottigli, U. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Sezione INFN di Cagliari (Italy); Brunetti, A. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Golosio, B. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy) and Sezione INFN di Cagliari (Italy)]. E-mail: golosio@uniss.it; Oliva, P. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Stumbo, S. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Vincze, L. [Department of Chemistry, University of Antwerp (Belgium); Randaccio, P. [Dipartimento di Fisica dell' Universita di Cagliari and Sezione INFN di Cagliari (Italy); Bleuet, P. [European Synchrotron Radiation Facility, Grenoble (France); Simionovici, A. [European Synchrotron Radiation Facility, Grenoble (France); Somogyi, A. [European Synchrotron Radiation Facility, Grenoble (France)

    2004-10-08

    A Monte Carlo code for the simulation of X-ray imaging and spectroscopy experiments in heterogeneous samples is presented. The energy spectrum, polarization and profile of the incident beam can be defined so that X-ray tube systems as well as synchrotron sources can be simulated. The sample is modeled as a 3D regular grid. The chemical composition and density are given at each point of the grid. Photoelectric absorption, fluorescent emission, elastic and inelastic scattering are included in the simulation. The core of the simulation is a fast routine for the calculation of the path lengths of the photon trajectory intersections with the grid voxels. The voxel representation is particularly useful for samples that cannot be well described by a small set of polyhedra. This is the case of most naturally occurring samples. In such cases, voxel-based simulations are much less expensive in terms of computational cost than simulations on a polygonal representation. The efficient scheme used for calculating the path lengths in the voxels and the use of variance reduction techniques make the code suitable for the detailed simulation of complex experiments on generic samples in a relatively short time. Examples of applications to X-ray imaging and spectroscopy experiments are discussed.

  7. Comparison Between In-Beam and Offline Positron Emission Tomography Imaging of Proton and Carbon Ion Therapeutic Irradiation at Synchrotron- and Cyclotron-Based Facilities

    International Nuclear Information System (INIS)

    Parodi, Katia; Bortfeld, Thomas; Haberer, Thomas

    2008-01-01

    Purpose: The benefit of using dedicated in-beam positron emission tomography (PET) detectors in the treatment room instead of commercial tomographs nearby is an open question. This work quantitatively compares the measurable signal for in-beam and offline PET imaging, taking into account realistic acquisition strategies at different ion beam facilities. Both scenarios of pulsed and continuous irradiation from synchrotron and cyclotron accelerators are considered, because of their widespread use in most carbon ion and proton therapy centers. Methods and Materials: A mathematical framework is introduced to compare the time-dependent amount and spatial distribution of decays from irradiation-induced isotope production. The latter is calculated with Monte Carlo techniques for real proton treatments of head-and-neck and paraspinal tumors. Extrapolation to carbon ion irradiation is based on results of previous phantom experiments. Biologic clearance is modeled taking into account available data from previous animal and clinical studies. Results: Ratios between the amount of physical decays available for in-beam and offline detection range from 40% to 60% for cyclotron-based facilities, to 65% to 110% (carbon ions) and 94% to 166% (protons) at synchrotron-based facilities, and increase when including biologic clearance. Spatial distributions of decays during irradiation exhibit better correlation with the dose delivery and reduced influence of biologic processes. Conclusions: In-beam imaging can be advantageous for synchrotron-based facilities, provided that efficient PET systems enabling detection of isotope decays during beam extraction are implemented. For very short (<2 min) irradiation times at cyclotron-based facilities, a few minutes of acquisition time after the end of irradiation are needed for counting statistics, thus affecting patient throughput
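
    The time-dependence argument can be reproduced with a simple buildup-and-decay model for a single isotope produced at a constant rate during irradiation — a sketch of the bookkeeping only, not the paper's Monte Carlo framework, with an arbitrary production rate and no biologic washout:

```python
import math

def decays(prod_rate, lam, t_irr, t_delay, t_acq):
    """Decays in-beam and in an offline window for constant production.

    Buildup during irradiation: N(t) = P/lam * (1 - exp(-lam t)).
    Returns (decays during irradiation, decays in the offline window
    [t_delay, t_delay + t_acq] measured from the end of irradiation).
    """
    n_end = prod_rate / lam * (1.0 - math.exp(-lam * t_irr))
    in_beam = prod_rate * t_irr - n_end          # produced minus surviving
    offline = n_end * math.exp(-lam * t_delay) * (1.0 - math.exp(-lam * t_acq))
    return in_beam, offline

lam_c11 = math.log(2) / (20.4 * 60)   # carbon-11, half-life 20.4 min
lam_o15 = math.log(2) / 122.0         # oxygen-15, half-life 122 s

# 2-minute irradiation; offline scan starting 5 min later, lasting 10 min.
ib_c11, off_c11 = decays(1.0, lam_c11, 120.0, 300.0, 600.0)
ib_o15, off_o15 = decays(1.0, lam_o15, 120.0, 300.0, 600.0)
```

    For the short-lived emitter, a much larger share of the decays occurs during the beam, so the in-beam/offline ratio is far more favourable than for the long-lived one — the driver behind the facility-dependent ratios quoted in the abstract.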

  8. Monte Carlo simulation of hybrid systems: An example

    International Nuclear Information System (INIS)

    Bacha, F.; D'Alencon, H.; Grivelet, J.; Jullien, E.; Jejcic, A.; Maillard, J.; Silva, J.; Zukanovich, R.; Vergnes, J.

    1997-01-01

    Simulation of hybrid systems needs tracking of particles from the GeV (incident proton beam) range down to a fraction of eV (thermic neutrons). We show how a GEANT based Monte-Carlo program can achieve this, with a realistic computer time and accompanying tools. An example of a dedicated original actinide burner is simulated with this chain. 8 refs., 5 figs

  9. Independent Monte-Carlo dose calculation for MLC based CyberKnife radiotherapy

    Science.gov (United States)

    Mackeprang, P.-H.; Vuong, D.; Volken, W.; Henzen, D.; Schmidhalter, D.; Malthaner, M.; Mueller, S.; Frei, D.; Stampanoni, M. F. M.; Dal Pra, A.; Aebersold, D. M.; Fix, M. K.; Manser, P.

    2018-01-01

    This work aims to develop, implement and validate a Monte Carlo (MC)-based independent dose calculation (IDC) framework to perform patient-specific quality assurance (QA) for multi-leaf collimator (MLC)-based CyberKnife® (Accuray Inc., Sunnyvale, CA) treatment plans. The IDC framework uses an XML-format treatment plan as exported from the treatment planning system (TPS) and DICOM-format patient CT data, an MC beam model using phase spaces, CyberKnife MLC beam modifier transport using the EGS++ class library, a beam sampling and coordinate transformation engine, and dose scoring using DOSXYZnrc. The framework is validated against dose profiles and depth dose curves of single beams with varying field sizes in a water tank in units of cGy/Monitor Unit, and against a 2D dose distribution of a full prostate treatment plan measured with Gafchromic EBT3 (Ashland Advanced Materials, Bridgewater, NJ) film in a homogeneous water-equivalent slab phantom. The film measurement is compared to IDC results by gamma analysis using 2% (global)/2 mm criteria. Further, the dose distribution of the clinical treatment plan in the patient CT is compared to the TPS calculation by gamma analysis using the same criteria. Dose profiles from the IDC calculation in a homogeneous water phantom agree with measurements within 2.3% of the global maximum dose or 1 mm distance to agreement for all except the smallest field size. Comparing the film measurement to the calculated dose, 99.9% of all voxels pass gamma analysis; comparing the dose calculated by the IDC framework to the TPS-calculated dose for the clinical prostate plan shows a 99.0% passing rate. The IDC-calculated dose is found to be up to 5.6% lower than the TPS-calculated dose in this case near metal fiducial markers. An MC-based modular IDC framework was successfully developed, implemented and validated against measurements and is now available to perform patient-specific QA by IDC.
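
    A minimal 1-D version of the gamma comparison used for the film analysis (global dose normalization, both profiles on the same grid; the profiles themselves are invented for illustration):

```python
import math

def gamma_1d(ref, meas, spacing_mm, dose_tol=0.02, dist_mm=2.0):
    """Global 1-D gamma index (2%/2 mm by default) for same-grid profiles.

    dose_tol is a fraction of the global maximum of the reference profile.
    """
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = math.inf
        for j, dm in enumerate(meas):
            dd = (dm - dr) / (dose_tol * d_max)   # dose-difference term
            dx = (j - i) * spacing_mm / dist_mm   # distance-to-agreement term
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

ref  = [0.1, 0.4, 0.9, 1.0, 0.9, 0.4, 0.1]      # reference profile (a.u.)
meas = [0.1, 0.41, 0.91, 1.0, 0.89, 0.4, 0.1]   # small perturbations: all pass
bad  = [0.1, 0.41, 0.91, 1.08, 0.89, 0.4, 0.1]  # 8% peak error: fails there

pass_rate = sum(g <= 1.0 for g in gamma_1d(ref, meas, 1.0)) / len(ref)
pass_bad  = sum(g <= 1.0 for g in gamma_1d(ref, bad, 1.0)) / len(ref)
```

    A point passes when some nearby measured point is simultaneously close in dose and in space (gamma ≤ 1); the fraction of passing points is the pass rate reported in the abstract.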

  10. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size in order to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
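
    The bank-size effect can be illustrated with a toy lane-occupancy simulation (a sketch, not the paper's analytic model): particles need a geometric number of events, each iteration processes the surviving bank in whole vectors, and partially filled trailing vectors waste lanes.

```python
import math
import random

rng = random.Random(9)

def vector_efficiency(bank_size, vector_width, p_absorb=0.1):
    """Useful fraction of SIMD lane-slots while draining a particle bank.

    Each particle needs a geometric number of events (mean 1/p_absorb). Every
    iteration the surviving particles are processed in whole vectors, so a
    partially filled trailing vector wastes lanes; no bank refill is modelled.
    """
    lives = [1 + int(math.log(1.0 - rng.random()) / math.log(1.0 - p_absorb))
             for _ in range(bank_size)]
    useful = sum(lives)
    lane_slots, t, alive = 0, 0, bank_size
    while alive:
        lane_slots += math.ceil(alive / vector_width) * vector_width
        t += 1
        alive = sum(1 for x in lives if x > t)
    return useful / lane_slots

def mean_eff(bank_size, vector_width, reps=20):
    return sum(vector_efficiency(bank_size, vector_width)
               for _ in range(reps)) / reps

eff_small = mean_eff(64, 32)    # bank only 2x the vector width
eff_big = mean_eff(640, 32)     # bank 20x the vector width
```

    As the bank drains, the last partially filled vector of each iteration carries idle lanes; a bank many times wider than the vector amortizes this waste, which is the qualitative content of the 20x rule of thumb in the abstract.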

  11. Measurements and Monte Carlo calculations of neutron production cross-sections at 180° for the 140 MeV proton incident reactions on carbon, iron, and gold

    International Nuclear Information System (INIS)

    Iwamoto, Yosuke; Satoh, Daiki; Hagiwara, Masayuki; Yashima, Hiroshi; Nakane, Yoshihiro; Tamii, Atsushi; Iwase, Hiroshi; Endo, Akira; Nakashima, Hiroshi; Sakamoto, Yukio; Hatanaka, Kichiji; Niita, Koji

    2010-01-01

    The neutron production cross-sections of carbon, iron, and gold targets bombarded with 140 MeV protons at 180° were measured at the RCNP cyclotron facility. The time-of-flight technique was used to obtain the neutron energy spectra in the energy range above 1 MeV. The carbon and iron target results were compared with the experimental data from 113 MeV (p,xn) reactions at 150° reported by Meier et al. Our data agreed well with theirs in spite of the different incident energies and angles. Calculations were then performed using different intra-nuclear cascade models (Bertini, ISOBAR, and JQMD) implemented in the PHITS code. The results calculated using the ISOBAR and JQMD models roughly agreed with the experimental iron and gold target data, but the Bertini model could not reproduce the high-energy neutrons above 10 MeV.

  12. Monte Carlo study of particle production in diffractive proton-proton collisions at √(s) = 13 TeV with the very forward detector combined with central information

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Qi-Dong [Nagoya University, Institute for Space-Earth Environmental Research, Nagoya (Japan); Itow, Yoshitaka; Sako, Takashi [Nagoya University, Institute for Space-Earth Environmental Research, Nagoya (Japan); Nagoya University, Kobayashi-Maskawa Institute, Nagoya (Japan); Menjo, Hiroaki [Nagoya University, Graduate School of Science, Nagoya (Japan)

    2017-04-15

    Very forward (VF) detectors in hadron colliders, having unique sensitivity to diffractive processes, can be a powerful tool for studying diffractive dissociation by combining them with central detectors. Several Monte Carlo simulation samples in p-p collisions at √(s) = 13 TeV were analyzed, and different nondiffractive and diffractive contributions were clarified through differential cross sections of forward neutral particles. Diffraction selection criteria in the VF-triggered-event samples were determined by using the central track information. The corresponding selection applicable in real experiments has ∼ 100% purity and 30-70% efficiency. Consequently, the central information enables classification of the forward productions into diffraction and nondiffraction categories; in particular, most of the surviving events from the selection belong to low-mass diffraction events at log₁₀(ξₓ) < -5.5. Therefore, the combined method can uniquely access the low-mass diffraction regime experimentally. (orig.)

  13. Nanostructure-based proton exchange membrane for fuel cell applications at high temperature.

    Science.gov (United States)

    Li, Junsheng; Wang, Zhengbang; Li, Junrui; Pan, Mu; Tang, Haolin

    2014-02-01

    As a clean and highly efficient energy source, the proton exchange membrane fuel cell (PEMFC) has been considered an ideal alternative to traditional fossil energy sources. Great efforts have been devoted to realizing the commercialization of the PEMFC in the past decade. To eliminate technical problems associated with low-temperature operation (such as catalyst poisoning and poor water management), PEMFCs are usually operated at elevated temperatures (e.g., >100 °C). However, traditional proton exchange membranes (PEMs) show poor performance at elevated temperature. To achieve high-performance PEMs for high-temperature fuel cell applications, novel PEMs based on nanostructures have been developed recently. In this review, we discuss and summarize methods for fabricating nanostructure-based PEMs for PEMFCs operated at elevated temperatures, and the high-temperature performance of these PEMs. We also give an outlook on the rational design and development of nanostructure-based PEMs.

  14. CT-Based Brachytherapy Treatment Planning using Monte Carlo Simulation Aided by an Interface Software

    Directory of Open Access Journals (Sweden)

    Vahid Moslemi

    2011-03-01

    Introduction: In brachytherapy, radioactive sources are placed close to the tumor; therefore, small changes in their positions can cause large changes in the dose distribution. This emphasizes the need for computerized treatment planning. The usual method for treatment planning of cervix brachytherapy uses conventional radiographs in the Manchester system. Nowadays, because of their advantages in locating the source positions and the surrounding tissues, CT and MRI images are replacing conventional radiographs. In this study, we used CT images in Monte Carlo-based dose calculation for brachytherapy treatment planning, using an interface software to create the geometry file required by the MCNP code. The aim of using the interface software is to facilitate and speed up the geometry set-up for simulations based on the patient's anatomy. This paper examines the feasibility of this method in cervix brachytherapy and assesses its accuracy and speed. Material and Methods: For dosimetric measurements regarding the treatment plan, a pelvic phantom was made from polyethylene in which the treatment applicators could be placed. For simulations using CT images, the phantom was scanned at 120 kVp. Using an interface software written in MATLAB, the CT images were converted into an MCNP input file and the simulation was then performed. Results: Using the interface software, the preparation time for the simulations of the applicator and surrounding structures was approximately 3 minutes; the corresponding time needed with conventional MCNP geometry entry was approximately 1 hour. The discrepancy between the simulated and measured doses to point A was 1.7% of the prescribed dose. The corresponding dose differences between the two methods in the rectum and bladder were 3.0% and 3.7% of the prescribed dose, respectively. Comparing the results of simulation using the interface software with those of simulation using the standard MCNP geometry entry showed a less than 1
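
    The kind of conversion such an interface script performs can be sketched as a CT-number-to-material binning feeding an MCNP lattice fill array; the HU thresholds and material numbers below are invented for illustration, not the study's calibration:

```python
# Hypothetical HU-to-material binning for building an MCNP lattice geometry.
HU_BINS = [
    (-1000, -400, 1),   # air / lung        -> material 1
    (-400,   100, 2),   # soft tissue/water -> material 2
    (100,   3000, 3),   # bone / applicator -> material 3
]

def hu_to_material(hu):
    """Map one CT number to an MCNP material index via threshold bins."""
    for lo, hi, mat in HU_BINS:
        if lo <= hu < hi:
            return mat
    raise ValueError(f"HU value {hu} outside calibrated range")

def lattice_fill_line(hu_row):
    """One row of an MCNP 'fill' array from a row of CT numbers."""
    return " ".join(str(hu_to_material(h)) for h in hu_row)

row = lattice_fill_line([-950, -20, 40, 300])
```

    Iterating this over every CT slice produces the voxelized fill array automatically, which is what replaces the hour of manual MCNP geometry entry mentioned in the results.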

  15. A voxel-based mouse for internal dose calculations using Monte Carlo simulations (MCNP).

    Science.gov (United States)

    Bitar, A; Lisbona, A; Thedrez, P; Sai Maurel, C; Le Forestier, D; Barbet, J; Bardies, M

    2007-02-21

    Murine models are useful for targeted radiotherapy pre-clinical experiments. These models can help to assess the potential interest of new radiopharmaceuticals. In this study, we developed a voxel-based mouse for dosimetric estimates. A female nude mouse (30 g) was frozen and cut into slices. High-resolution digital photographs were taken directly on the frozen block after each section. Images were segmented manually. Monoenergetic photon or electron sources were simulated using the MCNP4c2 Monte Carlo code for each source organ, in order to give tables of S-factors (in Gy·Bq⁻¹·s⁻¹) for all target organs. Results obtained from monoenergetic particles were then used to generate S-factors for several radionuclides of potential interest in targeted radiotherapy. Thirteen source and 25 target regions were considered in this study. For each source region, 16 photon and 16 electron energies were simulated. Absorbed fractions, specific absorbed fractions and S-factors were calculated for 16 radionuclides of interest for targeted radiotherapy. The results obtained generally agree well with data published previously. For electron energies ranging from 0.1 to 2.5 MeV, the self-absorbed fraction varies from 0.98 to 0.376 for the liver, and from 0.89 to 0.04 for the thyroid. Electrons cannot be considered as 'non-penetrating' radiation for energies above 0.5 MeV for mouse organs. This observation can be generalized to radionuclides: for example, the beta self-absorbed fraction for the thyroid was 0.616 for I-131; absorbed fractions for Y-90 for left kidney-to-left kidney and for left kidney-to-spleen were 0.486 and 0.058, respectively. Our voxel-based mouse allowed us to generate a dosimetric database for use in preclinical targeted radiotherapy experiments.
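
    The S-factor bookkeeping behind such tables is a weighted sum of absorbed fractions, S = Σᵢ yᵢ Eᵢ φᵢ / m. A sketch with an invented two-line electron spectrum (illustrative numbers, not I-131 data or the paper's results):

```python
MEV_TO_J = 1.602176634e-13

def s_factor(emissions, absorbed_fractions, target_mass_kg):
    """S (Gy Bq^-1 s^-1) for a target organ.

    emissions: list of (yield per decay, energy in MeV)
    absorbed_fractions: phi(target <- source) for each emission line
    """
    joules_per_decay = sum(y * e * MEV_TO_J * phi
                           for (y, e), phi in zip(emissions, absorbed_fractions))
    return joules_per_decay / target_mass_kg

emissions = [(0.9, 0.2), (0.1, 0.5)]   # hypothetical two-line electron spectrum
phis = [0.95, 0.80]                    # hypothetical self-absorbed fractions
s = s_factor(emissions, phis, 1.5e-3)  # 1.5 g organ, roughly mouse-liver scale
```

    The Monte Carlo work supplies the absorbed fractions φ per energy; combining them with a radionuclide's emission spectrum and the organ mass yields its S-factor, which is how the monoenergetic tables generate radionuclide-specific values.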

  16. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    International Nuclear Information System (INIS)

    Weathers, J.B.; Luck, R.; Weathers, J.W.

    2009-01-01

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
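    The Monte Carlo covariance-estimation step described above can be sketched with a toy two-quantity example; the key idea is the split between a shared systematic perturbation (drawn once per trial, applied to both quantities) and independent random perturbations. All values below are illustrative, not from the paper:

```python
import random

def estimate_error_covariance(model, experiment, sys_sd, rand_sd,
                              n_trials=5000, seed=1):
    """Monte Carlo estimate of the covariance matrix of the comparison
    error E = experiment - model for two quantities of interest."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_trials):
        bias = rng.gauss(0.0, sys_sd)  # systematic: common to both quantities
        e = [experiment[i] + bias + rng.gauss(0.0, rand_sd) - model[i]
             for i in range(2)]
        samples.append(e)
    mean = [sum(s[i] for s in samples) / n_trials for i in range(2)]
    return [[sum((s[i] - mean[i]) * (s[j] - mean[j]) for s in samples)
             / (n_trials - 1) for j in range(2)] for i in range(2)]

cov = estimate_error_covariance(model=[10.0, 20.0], experiment=[10.5, 19.2],
                                sys_sd=0.3, rand_sd=0.4)
```

    The systematic term shows up as the off-diagonal covariance (about sys_sd**2 here); the 95% contour would then be an ellipse drawn from this matrix.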

  17. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    Energy Technology Data Exchange (ETDEWEB)

    Weathers, J.B. [Shock, Noise, and Vibration Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: James.Weathers@ngc.com; Luck, R. [Department of Mechanical Engineering, Mississippi State University, 210 Carpenter Engineering Building, P.O. Box ME, Mississippi State, MS 39762-5925 (United States)], E-mail: Luck@me.msstate.edu; Weathers, J.W. [Structural Analysis Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: Jeffrey.Weathers@ngc.com

    2009-11-15

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.

  18. Preparation and proton conductivity of composite membranes based on sulfonated poly(phenylene oxide) and benzimidazole

    International Nuclear Information System (INIS)

    Liu Yifeng; Yu Qinchun; Wu Yihua

    2007-01-01

    A Brønsted acid-base composite membrane was prepared by entrapping benzimidazole in sulfonated poly(phenylene oxide) (SPPO) at varying doping ratios. Thermal stability, dynamic mechanical properties and proton conductivity were investigated under the conditions of intermediate-temperature proton exchange membrane (PEM) fuel cell operation. In addition, the activation energies of the SPPO-xBnIm membranes were investigated at different relative humidities. TG-DTA curves reveal that these SPPO-xBnIm composite materials have high thermal stability. The proton conductivity of the SPPO-xBnIm composite materials increased with temperature; the highest value, 8.93 x 10^-4 S/cm, was found at 200 °C under 35% relative humidity (RH) for a doping ratio of x = 2. The SPPO-2BnIm composite membrane shows higher storage and loss moduli than SPPO. Tests in a hydrogen-air laboratory cell demonstrate the applicability of SPPO-2BnIm in PEMFCs at intermediate temperature under non-humidified conditions.

  19. A scintillator-based online detector for the angularly resolved measurement of laser-accelerated proton spectra

    International Nuclear Information System (INIS)

    Metzkes, J.; Kraft, S. D.; Sobiella, M.; Stiller, N.; Zeil, K.; Schramm, U.; Karsch, L.; Schürer, M.; Pawelke, J.; Richter, C.

    2012-01-01

    In recent years, a new generation of high repetition rate (∼10 Hz), high power (∼100 TW) laser systems has stimulated intense research on laser-driven sources for fast protons. Considering experimental instrumentation, this development requires online diagnostics for protons to be added to the established offline detection tools such as solid state track detectors or radiochromic films. In this article, we present the design and characterization of a scintillator-based online detector that gives access to the angularly resolved proton distribution along one spatial dimension and resolves 10 different proton energy ranges. Conceived as an online detector for key parameters in laser-proton acceleration, such as the maximum proton energy and the angular distribution, the detector features a spatial resolution of ∼1.3 mm and a spectral resolution better than 1.5 MeV for a maximum proton energy above 12 MeV in the current design. Regarding its areas of application, we consider the detector a useful complement to radiochromic films and Thomson parabola spectrometers, capable of giving immediate feedback on the experimental performance. The detector was characterized at an electrostatic Van de Graaff tandetron accelerator and tested in a laser-proton acceleration experiment, proving its suitability as a diagnostic device for laser-accelerated protons.

  20. A scintillator-based online detector for the angularly resolved measurement of laser-accelerated proton spectra.

    Science.gov (United States)

    Metzkes, J; Karsch, L; Kraft, S D; Pawelke, J; Richter, C; Schürer, M; Sobiella, M; Stiller, N; Zeil, K; Schramm, U

    2012-12-01

    In recent years, a new generation of high repetition rate (~10 Hz), high power (~100 TW) laser systems has stimulated intense research on laser-driven sources for fast protons. Considering experimental instrumentation, this development requires online diagnostics for protons to be added to the established offline detection tools such as solid state track detectors or radiochromic films. In this article, we present the design and characterization of a scintillator-based online detector that gives access to the angularly resolved proton distribution along one spatial dimension and resolves 10 different proton energy ranges. Conceived as an online detector for key parameters in laser-proton acceleration, such as the maximum proton energy and the angular distribution, the detector features a spatial resolution of ~1.3 mm and a spectral resolution better than 1.5 MeV for a maximum proton energy above 12 MeV in the current design. Regarding its areas of application, we consider the detector a useful complement to radiochromic films and Thomson parabola spectrometers, capable of giving immediate feedback on the experimental performance. The detector was characterized at an electrostatic Van de Graaff tandetron accelerator and tested in a laser-proton acceleration experiment, proving its suitability as a diagnostic device for laser-accelerated protons.

  1. Development of a practical Monte Carlo based fuel management system for the Penn State University Breazeale Research Reactor (PSBR)

    International Nuclear Information System (INIS)

    Tippayakul, Chanatip; Ivanov, Kostadin; Frederick Sears, C.

    2008-01-01

    In this research, a practical fuel management system for the Pennsylvania State University Breazeale Research Reactor (PSBR) based on the advanced Monte Carlo methodology was developed from the existing fuel management tool. Several modeling improvements were implemented to the old system. The improved fuel management system can now utilize the burnup dependent cross section libraries generated specifically for PSBR fuel, and it is also able to update the cross sections of these libraries automatically by Monte Carlo calculation. Considerations were given to balancing the computation time and the accuracy of the cross section update; thus, only a limited number of isotopes considered 'important' are calculated and updated by the scheme. Moreover, the depletion algorithm of the existing fuel management tool was replaced, from a predictor-only to a predictor-corrector depletion scheme, to account more accurately for burnup spectrum changes during the burnup step. An intermediate verification of the fuel management system was performed to assess the correctness of the newly implemented schemes against HELIOS. It was found that the agreement of both codes is good when the same energy released per fission (Q values) is used. Furthermore, to be able to model the reactor at various temperatures, the fuel management tool automatically utilizes continuous cross sections generated at different temperatures. Other useful capabilities were also added to the fuel management tool to make it practical and easy to use. As part of the development, a hybrid nodal diffusion/Monte Carlo calculation was devised to speed up the Monte Carlo calculation by providing a more converged initial source distribution from the nodal diffusion calculation. Finally, the fuel management system was validated against the measured data using several actual PSBR core loadings. The agreement of the predicted core

  2. A new Monte Carlo-based treatment plan optimization approach for intensity modulated radiation therapy.

    Science.gov (United States)

    Li, Yongbao; Tian, Zhen; Shi, Feng; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun

    2015-04-07

    Intensity-modulated radiation treatment (IMRT) plan optimization needs beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computational speed. However, inaccurate beamlet dose distributions may mislead the optimization process and hinder the resulting plan quality. To solve this problem, the Monte Carlo (MC) simulation method has been used to compute all beamlet doses prior to the optimization step. The conventional approach samples the same number of particles from each beamlet, yet this is not the optimal use of MC in this problem. In fact, some beamlets have very small intensities after solving the plan optimization problem; for those beamlets, fewer particles may suffice in dose calculations, increasing efficiency. Based on this idea, we have developed a new MC-based IMRT plan optimization framework that iteratively performs MC dose calculation and plan optimization. At each dose calculation step, the particle numbers for beamlets were adjusted based on the beamlet intensities obtained by solving the plan optimization problem in the last iteration step. We modified a GPU-based MC dose engine to allow simultaneous computation of a large number of beamlet doses. To test the accuracy of our modified dose engine, we compared the dose from a broad beam and the summed beamlet doses in this beam in an inhomogeneous phantom. Agreement within 1% for the maximum difference and 0.55% for the average difference was observed. We then validated the proposed MC-based optimization scheme in one lung IMRT case. It was found that the conventional scheme required 10^6 particles from each beamlet to achieve an optimization result within 3% difference in fluence map and 1% difference in dose from the ground truth. In contrast, the proposed scheme achieved the same level of accuracy with on average 1.2 × 10^5 particles per beamlet. Correspondingly, the computation
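    The adaptive-sampling idea, spending MC histories where the optimizer says the intensity will be, can be sketched as a simple allocation rule. The proportional rule and the per-beamlet floor of 100 histories are assumptions for illustration, not the authors' exact scheme:

```python
def allocate_histories(intensities, total_histories, floor=100):
    """Distribute MC histories across beamlets in proportion to their
    current optimized intensities, with a minimum per beamlet so that
    low-weight beamlets still get a coarse dose estimate."""
    total_intensity = sum(intensities) or 1.0  # guard against an all-zero map
    return [max(floor, int(round(total_histories * w / total_intensity)))
            for w in intensities]

# Illustrative intensities from a previous optimization iteration.
counts = allocate_histories([0.0, 0.5, 2.0, 7.5], total_histories=100000)
```

    Each optimization iteration would recompute beamlet doses with these counts and re-solve the fluence-map problem.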

  3. Volume Measurement Algorithm for Food Product with Irregular Shape using Computer Vision based on Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Joko Siswantoro

    2014-11-01

    Volume is one of the important issues in the production and processing of food products. Traditionally, volume is measured by the water displacement method based on Archimedes' principle, which is inaccurate and destructive. Computer vision offers an accurate and nondestructive alternative. This paper proposes an algorithm for volume measurement of irregularly shaped food products using computer vision based on the Monte Carlo method. Five images of the object were acquired from five different views and processed to obtain silhouettes of the object. From these silhouettes, the Monte Carlo method was used to approximate the volume of the object. Simulation results show that the algorithm achieves high accuracy and precision in volume measurement.
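    The silhouette-based estimate amounts to rejection sampling in a bounding box: a sampled point contributes only if its projection lies inside every silhouette. In this sketch a sphere membership test stands in for the five real silhouette tests (an assumption for illustration):

```python
import random

def mc_volume(inside_all_silhouettes, bbox, n_samples=20000, seed=7):
    """Monte Carlo volume estimate: sample points uniformly in a bounding
    box and count those accepted by every silhouette test."""
    rng = random.Random(seed)
    (x0, x1), (y0, y1), (z0, z1) = bbox
    box_volume = (x1 - x0) * (y1 - y0) * (z1 - z0)
    hits = 0
    for _ in range(n_samples):
        p = (rng.uniform(x0, x1), rng.uniform(y0, y1), rng.uniform(z0, z1))
        if inside_all_silhouettes(p):
            hits += 1
    return box_volume * hits / n_samples

# Sanity check with a unit sphere: exact volume is 4*pi/3 ~ 4.19.
def sphere(p):
    return p[0]**2 + p[1]**2 + p[2]**2 <= 1.0

v = mc_volume(sphere, bbox=((-1, 1), (-1, 1), (-1, 1)))
```

    The relative error shrinks as 1/sqrt(n_samples), so accuracy is traded directly against sampling time.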

  4. Monte-Carlo Simulation for PDC-Based Optical CDMA System

    Directory of Open Access Journals (Sweden)

    FAHIM AZIZ UMRANI

    2010-10-01

    This paper presents the Monte-Carlo simulation of Optical CDMA (Code Division Multiple Access) systems and analyses their performance in terms of the BER (Bit Error Rate). The spreading sequence chosen for CDMA is Perfect Difference Codes. Furthermore, this paper derives the expressions for the noise variances from first principles to calibrate the noise for both bipolar (electrical domain) and unipolar (optical domain) signalling, as required for Monte-Carlo simulation. The simulated results conform to the theory and show that receiver gain mismatch and splitter loss at the transceiver degrade the system performance.

  5. A semi-grand canonical Monte Carlo simulation model for ion binding to ionizable surfaces: proton binding of carboxylated latex particles as a case study.

    Science.gov (United States)

    Madurga, Sergio; Rey-Castro, Carlos; Pastor, Isabel; Vilaseca, Eudald; David, Calin; Garcés, Josep Lluís; Puy, Jaume; Mas, Francesc

    2011-11-14

    In this paper, we present a computer simulation study of the ion binding process at an ionizable surface using a semi-grand canonical Monte Carlo method that models the surface as a discrete distribution of charged and neutral functional groups in equilibrium with explicit ions modelled in the context of the primitive model. The parameters of the simulation model were tuned and checked by comparison with experimental titrations of carboxylated latex particles in the presence of different ionic strengths of monovalent ions. The titration of these particles was analysed by calculating the degree of dissociation of the latex functional groups vs. pH curves at different background salt concentrations. As the charge of the titrated surface changes during the simulation, a procedure to keep the electroneutrality of the system is required. Here, two approaches are used with the choice depending on the ion selected to maintain electroneutrality: counterion or coion procedures. We compare and discuss the difference between the procedures. The simulations also provided a microscopic description of the electrostatic double layer (EDL) structure as a function of pH and ionic strength. The results allow us to quantify the effect of the size of the background salt ions and of the surface functional groups on the degree of dissociation. The non-homogeneous structure of the EDL was revealed by plotting the counterion density profiles around charged and neutral surface functional groups. © 2011 American Institute of Physics
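    The titration move at the heart of such a semi-grand canonical scheme is a Metropolis (de)protonation attempt whose bias is set by pH − pKa. A minimal non-interacting sketch (the electrostatic term is stubbed out to zero, so this reproduces only the ideal Henderson-Hasselbalch limit, not the salt-dependent curves of the paper):

```python
import math
import random

def protonation_sweep(states, pH, pKa, rng, beta_dE=lambda i, s: 0.0):
    """One semi-grand canonical sweep over ionizable sites.
    states[i] is 1 for protonated (neutral COOH), 0 for deprotonated
    (charged COO-). The chemical-potential bias is ln(10)*(pH - pKa);
    beta_dE would supply the electrostatic energy change of the trial
    flip in kT units (zero here: ideal, non-interacting sites)."""
    for i in range(len(states)):
        sign = 1.0 if states[i] == 1 else -1.0  # +1: attempt deprotonation
        d_beta_g = -sign * math.log(10.0) * (pH - pKa) + beta_dE(i, states)
        if rng.random() < min(1.0, math.exp(-d_beta_g)):
            states[i] = 1 - states[i]
    return states

rng = random.Random(3)
states = [1] * 2000
for _ in range(50):
    protonation_sweep(states, pH=5.8, pKa=4.8, rng=rng)
# One pH unit above pKa: ideal degree of dissociation is 10/11 ~ 0.91.
alpha = 1 - sum(states) / len(states)
```

    In the full simulation, beta_dE would come from the primitive-model electrostatics, and each accepted flip would be paired with a counterion or coion move to preserve electroneutrality, as the paper discusses.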

  6. Steam Electrolysis by Proton-Conducting Solid Oxide Electrolysis Cells (SOECs) with Chemically Stable BaZrO3-Based Electrolytes

    KAUST Repository

    Bi, Lei; Traversa, Enrico

    2015-01-01

    A BaZrO3-based material was applied as the electrolyte for proton-conducting solid oxide electrolysis cells (SOECs). Compared with the instability of BaCeO3-based proton conductors, BaZrO3-based material could be a more promising candidate for proton

  7. Optimization of FIBMOS Through 2D Silvaco ATLAS and 2D Monte Carlo Particle-based Device Simulations

    OpenAIRE

    Kang, J.; He, X.; Vasileska, D.; Schroder, D. K.

    2001-01-01

    Focused Ion Beam MOSFETs (FIBMOS) demonstrate large enhancements in core device performance areas such as output resistance, hot electron reliability and voltage stability upon channel length or drain voltage variation. In this work, we describe an optimization technique for FIBMOS threshold voltage characterization using the 2D Silvaco ATLAS simulator. Both ATLAS and 2D Monte Carlo particle-based simulations were used to show that FIBMOS devices exhibit enhanced current drive ...

  8. Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life

    Science.gov (United States)

    Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.

    2012-01-01

    Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.
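    The resampling experiment described above, drawing finite test populations from a parent Weibull distribution and reading off the L10 life, can be sketched as follows. The shape parameter and characteristic life are illustrative assumptions, not the AL6061 fit from the paper:

```python
import math
import random

def weibull_life(rng, shape, char_life):
    """Draw one fatigue life from a two-parameter Weibull distribution
    by inverting the CDF F(t) = 1 - exp(-(t/char_life)**shape)."""
    return char_life * (-math.log(1.0 - rng.random())) ** (1.0 / shape)

def l10_life(lives):
    """L10 life: the life that 90 percent of the population exceeds."""
    return sorted(lives)[int(0.10 * len(lives))]

rng = random.Random(42)
# Illustrative parameters: shape 2.0, characteristic life 1e6 cycles.
population = [weibull_life(rng, shape=2.0, char_life=1.0e6)
              for _ in range(20000)]
l10 = l10_life(population)
# Analytic 10th percentile: 1e6 * (-ln 0.9)**(1/2) ~ 3.25e5 cycles.
```

    Repeating the draw with small subpopulations (e.g. 10 lives each) and comparing their L10 estimates to the parent value reproduces the kind of scatter the study quantifies.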

  9. Monte Carlo N-particle simulation of neutron-based sterilisation of anthrax contamination.

    Science.gov (United States)

    Liu, B; Xu, J; Liu, T; Ouyang, X

    2012-10-01

    To simulate the neutron-based sterilisation of anthrax contamination with the Monte Carlo N-particle (MCNP) 4C code. Neutrons are elementary particles that have no charge. They are 20 times more effective than electrons or γ-rays in killing anthrax spores on surfaces and inside closed containers. Neutrons emitted from a 252Cf neutron source are in the 100 keV to 2 MeV energy range. A 2.5 MeV D-D neutron generator can create neutrons at up to 10^13 n s^-1 with current technology. All these enable an effective and low-cost method of killing anthrax spores. There is no effect on neutron energy deposition on the anthrax sample when using a reflector that is thicker than its saturation thickness. Among the three reflecting materials tested in the MCNP simulation, paraffin is the best because it has the thinnest saturation thickness and is easy to machine. The MCNP radiation dose and fluence simulations also showed that the MCNP-simulated neutron fluence needed to kill the anthrax spores agrees very well with previous analytical estimations. The MCNP simulation indicates that a 10 min neutron irradiation from a 0.5 g 252Cf neutron source or a 1 min neutron irradiation from a 2.5 MeV D-D neutron generator may kill all anthrax spores in a sample. This is a promising result because a 2.5 MeV D-D neutron generator output >10^13 n s^-1 should be attainable in the near future. This indicates that we could use a D-D neutron generator to sterilise anthrax contamination within several seconds.

  10. The use of tetrahedral mesh geometries in Monte Carlo simulation of applicator based brachytherapy dose distributions

    International Nuclear Information System (INIS)

    Fonseca, Gabriel Paiva; Yoriyaz, Hélio; Landry, Guillaume; White, Shane; Reniers, Brigitte; Verhaegen, Frank; D’Amours, Michel; Beaulieu, Luc

    2014-01-01

    Accounting for brachytherapy applicator attenuation is part of the recommendations from the recent report of AAPM Task Group 186. To do so, model based dose calculation algorithms require accurate modelling of the applicator geometry. This can be non-trivial for irregularly shaped applicators such as the Fletcher Williamson gynaecological applicator, or the possibly irregularly shaped balloon applicators employed in accelerated partial breast irradiation (APBI) performed using electronic brachytherapy sources (EBS). While many of these applicators can be modelled using constructive solid geometry (CSG), this may be difficult and time-consuming. Alternatively, these complex geometries can be modelled using tessellated geometries such as tetrahedral meshes (mesh geometries, MG). Recent versions of the Monte Carlo (MC) codes Geant4 and MCNP6 allow the use of MG. The goal of this work was to model a series of applicators relevant to brachytherapy using MG. Applicators designed for 192Ir sources and 50 kV EBS were studied: a shielded vaginal applicator, a shielded Fletcher Williamson applicator and an APBI balloon applicator. All applicators were modelled in Geant4 and MCNP6 using MG and CSG for dose calculations. CSG-derived dose distributions were considered as reference and used to validate the MG models by comparing dose distribution ratios. In general, agreement within 1% was observed for all applicators between MG and CSG and between codes when considering volumes inside the 25% isodose surface. Compared to CSG, MG required computation times longer by a factor of at least 2 for MC simulations using the same code; MCNP6 calculation times were in some cases more than ten times shorter than Geant4. In conclusion, we presented methods allowing for high fidelity modelling with results equivalent to CSG. To the best of our knowledge, MG offers the most accurate representation of an irregular APBI balloon applicator. (paper)

  11. Photoresponse of the protonated Schiff-base retinal chromophore in the gas phase

    DEFF Research Database (Denmark)

    Toker, Jonathan; Rahbek, Dennis Bo; Kiefer, H V

    2013-01-01

    The fragmentation, initiated by photoexcitation as well as collisionally-induced excitation, of several retinal chromophores was studied in the gas phase. The chromophore in the protonated Schiff-base form (RPSB), essential for mammalian vision, shows a remarkably selective photoresponse ... modifications of the chromophore. We propose that isomerizations play an important role in the photoresponse of gas-phase retinal chromophores and guide internal conversion through conical intersections. The role of protein interactions is then to control the specificity of the photoisomerization in the primary...

  12. A comparison study for dose calculation in radiation therapy: pencil beam Kernel based vs. Monte Carlo simulation vs. measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cheong, Kwang-Ho; Suh, Tae-Suk; Lee, Hyoung-Koo; Choe, Bo-Young [The Catholic Univ. of Korea, Seoul (Korea, Republic of); Kim, Hoi-Nam; Yoon, Sei-Chul [Kangnam St. Mary' s Hospital, Seoul (Korea, Republic of)

    2002-07-01

    Accurate dose calculation in radiation treatment planning is most important for successful treatment. Since the human body is composed of various materials and has no ideal shape, it is not easy to calculate the accurate effective dose in patients. Many methods have been proposed to solve inhomogeneity and surface contour problems. Monte Carlo simulations are regarded as the most accurate method, but they are not appropriate for routine planning because they take so much time. Pencil beam kernel based convolution/superposition methods have also been proposed to correct for those effects, and many commercial treatment planning systems have adopted this algorithm as a dose calculation engine. The purpose of this study is to verify the accuracy of the dose calculated by a pencil beam kernel based treatment planning system against Monte Carlo simulations and measurements, especially in inhomogeneous regions. A home-made inhomogeneous phantom, Helax-TMS ver. 6.0 and the Monte Carlo codes BEAMnrc and DOSXYZnrc were used in this study. In homogeneous media the accuracy was acceptable, but in inhomogeneous media the errors were more significant. However, in general clinical situations the pencil beam kernel based convolution algorithm is thought to be a valuable tool for dose calculation.

  13. Development of three-dimensional program based on Monte Carlo and discrete ordinates bidirectional coupling method

    International Nuclear Information System (INIS)

    Han Jingru; Chen Yixue; Yuan Longjun

    2013-01-01

    The Monte Carlo (MC) and discrete ordinates (SN) methods are commonly used in the design of radiation shielding. The Monte Carlo method treats the geometry exactly but is time-consuming for deep penetration problems. The discrete ordinates method has great computational efficiency, but it is costly in computer memory and suffers from ray effects. Neither the discrete ordinates method nor the Monte Carlo method alone is adequate for shielding calculations of large, complex nuclear facilities. To solve this problem, a Monte Carlo and discrete ordinates bidirectional coupling method was developed. The bidirectional coupling is implemented in an interface program that transfers the particle probability distribution of MC and the angular flux of discrete ordinates, combining the advantages of MC and SN. Test problems in Cartesian and cylindrical coordinates were calculated with the coupling method. The results were compared with MCNP and TORT, and satisfactory agreement was obtained, proving the correctness of the program. (authors)

  14. Monte Carlo-based QA for IMRT of head and neck cancers

    Science.gov (United States)

    Tang, F.; Sham, J.; Ma, C.-M.; Li, J.-S.

    2007-06-01

    It is well-known that the presence of a large air cavity in a dense medium (or patient) introduces significant electronic disequilibrium when irradiated with a megavoltage X-ray field. This condition may worsen with the possible use of tiny beamlets in intensity-modulated radiation therapy (IMRT). Commercial treatment planning systems (TPSs), in particular those based on the pencil-beam method, do not provide accurate dose computation for the lungs and other cavity-laden body sites such as the head and neck. In this paper we present the use of the Monte Carlo (MC) technique for dose re-calculation of IMRT of head and neck cancers. In our clinic, a turn-key software system is set up for MC calculation and comparison with TPS-calculated treatment plans as part of the quality assurance (QA) programme for IMRT delivery. A set of 10 off-the-shelf PCs is employed as the MC calculation engine, with treatment plan parameters imported from the TPS via a graphical user interface (GUI) which also provides a platform for launching remote MC simulation and subsequent dose comparison with the TPS. The TPS-segmented intensity maps are used as input for the simulation, hence skipping the time-consuming simulation of the multi-leaf collimator (MLC). The primary objective of this approach is to assess the accuracy of the TPS calculations in the presence of air cavities in the head and neck, whereas the accuracy of leaf segmentation is verified by fluence measurement using a fluoroscopic camera-based imaging device. This measurement can also validate the correct transfer of intensity maps to the record-and-verify system. Comparisons between TPS and MC calculations of 6 MV IMRT for typical head and neck treatments reveal regional consistency in dose distribution except at and around the sinuses, where our pencil-beam-based TPS sometimes over-predicts the dose by up to 10%, depending on the size of the cavities. In addition, dose re-buildup of up to 4% is observed at the posterior nasopharyngeal

  15. Proton-proton bremsstrahlung

    International Nuclear Information System (INIS)

    Fearing, H.W.

    1990-01-01

    We summarize some of the information about the nucleon-nucleon force which has been obtained by comparing recent calculations of proton-proton bremsstrahlung with cross section and analyzing power data from the new TRIUMF bremsstrahlung experiment. Some comments are made as to how these results can be extended to neutron-proton bremsstrahlung. (Author) 17 refs., 6 figs

  16. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

    This paper presents work that has been performed to develop an integrated Monte Carlo-deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and shielding calculations. The methodology is based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle Transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray tracing tool for the MOC, verification of the ray tracing indexing scheme developed to represent the MCNP geometry in the MOC, and verification of the prototype 2-D MOC flux solver. (author)

  17. Three-Dimensional Simulation of DRIE Process Based on the Narrow Band Level Set and Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Jia-Cheng Yu

    2018-02-01

    Full Text Available A three-dimensional topography simulation of deep reactive ion etching (DRIE) is developed based on the narrow band level set method for surface evolution and the Monte Carlo method for flux distribution. The advanced level set method is implemented to simulate the time-dependent movement of the etched surface. Meanwhile, accelerated by a ray tracing algorithm, the Monte Carlo method incorporates all dominant physical and chemical mechanisms such as ion-enhanced etching, ballistic transport, ion scattering, and sidewall passivation. Modified models of charged and neutral particles are employed to determine their contributions to the etching rate. Effects such as the scalloping effect and the lag effect are investigated in simulations and experiments. In addition, quantitative analyses are conducted to measure the simulation error. Finally, this simulator can serve as an accurate prediction tool for some MEMS fabrication processes.
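
The surface-evolution half of such a simulator advances a level set function under a normal speed that, in the paper, would come from the Monte Carlo flux. A toy 2-D sketch with a uniform speed and a full-grid first-order Godunov upwind update (a true narrow band implementation, and the flux-dependent speed, are omitted here):

```python
import numpy as np

def evolve_level_set(phi, speed, dt, steps):
    """Advance the interface phi = 0 with positive normal speed using a
    first-order Godunov upwind scheme (full-grid update; a narrow-band
    code would restrict this to cells near the zero level set)."""
    for _ in range(steps):
        dxm = np.diff(phi, axis=0, prepend=phi[:1, :])   # backward differences
        dxp = np.diff(phi, axis=0, append=phi[-1:, :])   # forward differences
        dym = np.diff(phi, axis=1, prepend=phi[:, :1])
        dyp = np.diff(phi, axis=1, append=phi[:, -1:])
        grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                       np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
        phi = phi - dt * speed * grad
    return phi

# A circular front (phi < 0 inside) advances outward along its normal:
n = 64
y, x = np.mgrid[0:n, 0:n]
phi0 = np.sqrt((x - 32.0)**2 + (y - 32.0)**2) - 10.0
phi1 = evolve_level_set(phi0, speed=1.0, dt=0.5, steps=8)
print((phi1 < 0).sum() > (phi0 < 0).sum())   # True: the interior region grew
```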

  18. Qualitative Simulation of Photon Transport in Free Space Based on Monte Carlo Method and Its Parallel Implementation

    Directory of Open Access Journals (Sweden)

    Xueli Chen

    2010-01-01

    Full Text Available During the past decade, the Monte Carlo method has found wide application in optical imaging for simulating the photon transport process inside tissues. However, this method has not yet been effectively extended to the simulation of free-space photon transport. In this paper, a uniform framework for noncontact optical imaging is proposed based on the Monte Carlo method, which consists of the simulation of photon transport both in tissues and in free space. Specifically, the simplification theory of lens systems is utilized to model the camera lens equipped in the optical imaging system, and the Monte Carlo method is employed to describe the energy transformation from the tissue surface to the CCD camera. Also, the focusing effect of the camera lens is considered to establish the relationship of corresponding points between the tissue surface and the CCD camera. Furthermore, a parallel version of the framework is realized, making the simulation much more convenient and effective. The feasibility of the uniform framework and the effectiveness of the parallel version are demonstrated with a cylindrical phantom based on real experimental results.
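
The in-tissue half of such a framework is a standard photon random walk: exponential free paths, scattering, and absorption. A toy 1-D slab version (isotropic scattering and invented optical coefficients; the paper's lens model and free-space propagation are not reproduced here):

```python
import math, random

def simulate_photons(n_photons, mu_a, mu_s, slab_thickness, seed=1):
    """Toy Monte Carlo photon transport in a 1-D slab: exponential free
    paths, isotropic rescattering, absorption by weight reduction.
    Returns the fraction of launched weight transmitted through the slab."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    transmitted = 0.0
    for _ in range(n_photons):
        z, cos_t, weight = 0.0, 1.0, 1.0
        while weight > 1e-4:
            step = -math.log(rng.random()) / mu_t   # sample a free path
            z += step * cos_t
            if z >= slab_thickness:                 # escaped the far side
                transmitted += weight
                break
            if z < 0:                               # back-scattered out
                break
            weight *= mu_s / mu_t                   # implicit absorption
            cos_t = 2.0 * rng.random() - 1.0        # isotropic rescatter
    return transmitted / n_photons

frac = simulate_photons(2000, mu_a=0.1, mu_s=10.0, slab_thickness=1.0)
print(round(frac, 3))   # transmitted weight fraction for this slab
```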

  19. Proton-Based Stereotactic Ablative Radiotherapy in Early-Stage Non-Small-Cell Lung Cancer

    Directory of Open Access Journals (Sweden)

    Jonathan D. Grant

    2014-01-01

    Full Text Available Stereotactic ablative radiotherapy (SABR, a recent implementation in the practice of radiation oncology, has been shown to confer high rates of local control in the treatment of early stage non-small-cell lung cancer (NSCLC. This technique, which involves limited invasive procedures and reduced treatment intervals, offers definitive treatment for patients unable or unwilling to undergo an operation. The use of protons in SABR delivery confers the added physical advantage of normal tissue sparing due to the absence of collateral radiation dose delivered to regions distal to the target. This may translate into clinical benefit and a decreased risk of clinical toxicity in patients with nearby critical structures or limited pulmonary reserve. In this review, we present the rationale for proton-based SABR, principles relating to the delivery and planning of this modality, and a summary of published clinical studies.

  20. Fast neutron spectrometry based on proton detection in CR-39 detector

    Energy Technology Data Exchange (ETDEWEB)

    Dajko, G.; Somogyi, G.

    1986-01-01

    The authors have developed a home-made proton-sensitive CR-39 track detector called MA-ND/p. Using this and the n-p scattering process the performance of a fast neutron spectrometer has been studied by applying two different methods. These are based on track density determinations by using varying radiator thicknesses at constant etching time and by using varying etching times at fixed radiator thickness, respectively. For both methods studied a computer programme is made to calculate the theoretically expected neutron sensitivity as a function of neutron energy. For both methods the neutron sensitivities, expressed in terms of observable etched proton tracks per neutron, are determined experimentally for 3.3 and 14.7 MeV neutron energies. The theoretical and experimental data obtained are compared.

  2. Pseudo-diode based on protonic/electronic hybrid oxide transistor

    Science.gov (United States)

    Fu, Yang Ming; Liu, Yang Hui; Zhu, Li Qiang; Xiao, Hui; Song, An Ran

    2018-01-01

    Current rectification behavior has proved essential in modern electronics. Here, a pseudo-diode is proposed based on a protonic/electronic hybrid indium-gallium-zinc oxide electric-double-layer (EDL) transistor. The oxide EDL transistors are fabricated using a phosphorous silicate glass (PSG) based proton-conducting electrolyte as the gate dielectric. A diode operation mode is established on the transistor, originating from field-configurable proton fluxes within the PSG electrolyte. Current rectification ratios have been modulated to values ranging between ~4 and ~50 000 with the gate electrode biased at voltages ranging between -0.7 V and 0.1 V. Interestingly, the proposed pseudo-diode also exhibits field-reconfigurable threshold voltages. When the gate is biased at -0.5 V and 0.3 V, the threshold voltages are set to ~-1.3 V and -0.55 V, respectively. The proposed pseudo-diode may find potential applications in brain-inspired platforms and low-power portable systems.

  3. An easily sintered, chemically stable, barium zirconate-based proton conductor for high-performance proton-conducting solid oxide fuel cells

    KAUST Repository

    Sun, Wenping

    2014-07-25

    Yttrium and indium co-doped barium zirconate is investigated to develop a chemically stable and sintering-active proton conductor for solid oxide fuel cells (SOFCs). BaZr0.8Y0.2-xInxO3-δ possesses a pure cubic perovskite structure. The sintering activity of BaZr0.8Y0.2-xInxO3-δ increases significantly with In concentration. BaZr0.8Y0.15In0.05O3-δ (BZYI5) exhibits the highest total electrical conductivity among the sintered oxides. BZYI5 also retains high chemical stability against CO2, water vapor, and reduction by H2. The good sintering activity, high conductivity, and chemical stability of BZYI5 facilitate the fabrication of durable SOFCs based on a highly conductive BZYI5 electrolyte film by cost-effective ceramic processes. A fully dense BZYI5 electrolyte film is successfully prepared on the anode substrate by a facile drop-coating technique followed by co-firing at 1400 °C for 5 h in air. The BZYI5 film exhibits one of the highest conductivities among BaZrO3-based electrolyte films prepared with various sintering aids. BZYI5-based single cells deliver a very encouraging and by far the highest peak power density for BaZrO3-based proton-conducting SOFCs, reaching as high as 379 mW cm-2 at 700 °C. The results demonstrate that Y and In co-doping is an effective strategy for exploring sintering-active and chemically stable BaZrO3-based proton conductors for high-performance proton-conducting SOFCs. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. An NPT Monte Carlo Molecular Simulation-Based Approach to Investigate Solid-Vapor Equilibrium: Application to Elemental Sulfur-H2S System

    KAUST Repository

    Kadoura, Ahmad Salim; Salama, Amgad; Sun, Shuyu; Sherik, Abdelmounam

    2013-01-01

    In this work, a method to estimate solid elemental sulfur solubility in pure and gas mixtures using Monte Carlo (MC) molecular simulation is proposed. This method is based on Isobaric-Isothermal (NPT) ensemble and the Widom insertion technique
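
The Widom insertion technique mentioned above estimates the excess chemical potential from the Boltzmann factor of ghost-particle insertions. A minimal NVT-style sketch (the NPT variant used in the paper adds a volume weighting; the configuration and pair potential here are placeholders):

```python
import math, random

def widom_mu_excess(coords, box, beta, pair_energy, n_insert=2000, seed=7):
    """Widom test-particle estimate of the excess chemical potential:
    mu_ex = -kT * ln < exp(-beta * dU) >, averaged over random ghost
    insertions into a fixed particle configuration."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_insert):
        ghost = tuple(rng.random() * box for _ in range(3))
        du = sum(pair_energy(ghost, p, box) for p in coords)
        acc += math.exp(-beta * du)
    return -math.log(acc / n_insert) / beta

# Sanity check: with no interactions (ideal gas) the excess part vanishes.
ideal = lambda a, b, box: 0.0
mu = widom_mu_excess([(0.5, 0.5, 0.5)], box=1.0, beta=1.0, pair_energy=ideal)
print(mu == 0.0)   # True
```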

  5. [Study of Determination of Oil Mixture Components Content Based on Quasi-Monte Carlo Method].

    Science.gov (United States)

    Wang, Yu-tian; Xu, Jing; Liu, Xiao-fei; Chen, Meng-han; Wang, Shi-tao

    2015-05-01

    Gasoline, kerosene, and diesel are processed from crude oil over different distillation ranges. The boiling range of gasoline is 35~205 °C, that of kerosene is 140~250 °C, and that of diesel is 180~370 °C. The carbon chain lengths of the different mineral oils also differ: C7 to C11 for gasoline, C12 to C15 for kerosene, and C15 to C18 for diesel. Recognition and quantitative measurement of the three kinds of mineral oil are based on the different fluorescence spectra arising from their different carbon number distributions. Mineral oil pollution occurs frequently, so monitoring mineral oil content in the ocean is very important. A new method is proposed for determining the component contents of a mineral oil mixture with overlapping spectra: the characteristic peak power of the three-dimensional fluorescence spectrum is integrated using the quasi-Monte Carlo method, combined with an optimization algorithm that selects the optimal number of characteristic peaks and the integration regions, and the resulting nonlinear equations are solved with the BFGS method (a rank-two quasi-Newton update named after Broyden, Fletcher, Goldfarb, and Shanno). The accumulated peak power over the selected region is sensitive to small changes in the fluorescence spectral line, so the measurement is sensitive to small changes in component content. At the same time, compared with single-point measurement, sensitivity is improved because averaging over the selected points reduces the influence of random error. Three-dimensional fluorescence spectra and fluorescence contour spectra of the single mineral oils and of the mixture are measured, taking kerosene, diesel, and gasoline as research objects and treating each mineral oil as a whole rather than resolving its individual components. Six characteristic peaks are
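
The core numerical step described, integrating characteristic-peak power with a quasi-Monte Carlo rule, can be sketched with a 2-D Halton sequence (the integrand and region are placeholders for the fluorescence peak data):

```python
import math

def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in a base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def qmc_peak_power(f, x0, x1, y0, y1, n=4096):
    """Quasi-Monte Carlo estimate of the integral of f over a rectangular
    region around a spectral peak, using a 2-D Halton point set."""
    total = 0.0
    for i in range(1, n + 1):
        x = x0 + (x1 - x0) * halton(i, 2)
        y = y0 + (y1 - y0) * halton(i, 3)
        total += f(x, y)
    return (x1 - x0) * (y1 - y0) * total / n

# Integral of f(x, y) = x * y over the unit square is exactly 0.25:
est = qmc_peak_power(lambda x, y: x * y, 0, 1, 0, 1)
print(round(est, 3))   # close to 0.25
```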

  6. Development of a hybrid multi-scale phantom for Monte-Carlo based internal dosimetry

    International Nuclear Information System (INIS)

    Marcatili, S.; Villoing, D.; Bardies, M.

    2015-01-01

    Full text of publication follows. Aim: in recent years several phantoms were developed for radiopharmaceutical dosimetry in clinical and preclinical settings. Voxel-based models (Zubal, Max/Fax, ICRP110) were developed to reach a level of realism that could not be achieved by mathematical models. In turn, 'hybrid' models (XCAT, MOBY/ROBY, Mash/Fash) allow a further degree of versatility by offering the possibility to finely tune each model according to various parameters. However, even 'hybrid' models require the generation of a voxel version for Monte-Carlo modeling of radiation transport. Since absorbed dose simulation time is strictly related to the geometry's spatial sampling, a compromise must be made between phantom realism and simulation speed. This trade-off leads on one side to an overestimation of the size of small radiosensitive structures such as the skin or hollow organs' walls, and on the other to an unnecessarily detailed voxellization of large, homogeneous structures. The aim of this work is to develop a hybrid multi-resolution phantom model for Geant4 and Gate, to better characterize energy deposition in small structures while preserving reasonable computation times. Materials and Methods: we have developed a pipeline for the conversion of preexisting phantoms into a multi-scale Geant4 model. Meshes of each organ are created from raw binary images of a phantom and then voxellized to the smallest spatial sampling required by the user. The user can then decide to re-sample the internal part of each organ, while leaving a layer of the smallest voxels at the edge of the organ. In this way, the realistic shape of the organ is maintained while reducing the voxel count in its inner part. For hollow organs, the wall is always modeled using the smallest voxel sampling. This approach allows choosing a different voxel resolution for each organ according to the specific application. Results: preliminary results show that it is possible to

  7. Monte Carlo based simulation of LIAC intraoperative radiotherapy accelerator along with beam shaper applicator

    Directory of Open Access Journals (Sweden)

    N Heidarloo

    2017-08-01

    Full Text Available Intraoperative electron radiotherapy is one of the radiotherapy methods that delivers a high single fraction of radiation dose to the patient in one session during surgery. The beam shaper applicator is one of the applicators recently employed with this radiotherapy method. This applicator has considerable application in the treatment of large tumors. In this study, the dosimetric characteristics of the electron beam produced by the LIAC intraoperative radiotherapy accelerator in conjunction with this applicator have been evaluated through Monte Carlo simulation with the MCNP code. The results showed that the electron beam produced by the beam shaper applicator has the desirable dosimetric characteristics, so that the mentioned applicator can be considered for clinical purposes. Furthermore, the good agreement between the results of simulation and practical dosimetry confirms the applicability of the Monte Carlo method in determining the dosimetric parameters of electron beam intraoperative radiotherapy.

  8. Algorithm simulating the atom displacement processes induced by the gamma rays on the base of Monte Carlo method

    International Nuclear Information System (INIS)

    Cruz, C. M.; Pinera, I; Abreu, Y.; Leyva, A.

    2007-01-01

    The present work concerns the implementation of a Monte Carlo based calculation algorithm describing in particular the occurrence of atom displacements induced by gamma radiation interactions in a given target material. The atom displacement processes were considered only on the basis of single elastic scattering interactions of fast secondary electrons with matrix atoms, which are ejected from their crystalline sites at recoil energies higher than a given threshold energy. The secondary electron transport was described assuming typical approaches on this matter, where consecutive small-angle scattering and very low energy transfer events behave as continuous quasi-classical electron state changes along a given path length delimited by two discrete large-angle, high energy loss events happening in a random way. A limiting scattering angle was introduced and calculated according to Moliere-Bethe-Goudsmit-Saunderson electron multiple scattering theory, which allows splitting secondary electron single scattering processes from multiple ones, whereby a modified McKinley-Feshbach electron elastic scattering cross section arises. This distribution was statistically sampled and simulated in the framework of the Monte Carlo method to perform discrete single electron scattering processes, particularly those leading to atom displacement events. The possibility of adding this algorithm to existing open Monte Carlo code systems is analyzed, in order to improve their capabilities. (Author)

  9. Uncertainty Analysis Based on Sparse Grid Collocation and Quasi-Monte Carlo Sampling with Application in Groundwater Modeling

    Science.gov (United States)

    Zhang, G.; Lu, D.; Ye, M.; Gunzburger, M.

    2011-12-01

    Markov Chain Monte Carlo (MCMC) methods have been widely used in many fields of uncertainty analysis to estimate the posterior distributions of parameters and credible intervals of predictions in the Bayesian framework. However, in practice, MCMC may be computationally unaffordable due to slow convergence and the excessive number of forward model executions required, especially when the forward model is expensive to compute. Both disadvantages arise from the curse of dimensionality, i.e., the posterior distribution is usually a multivariate function of parameters. Recently, the sparse grid method has been demonstrated to be an effective technique for coping with high-dimensional interpolation or integration problems. Thus, in order to accelerate the forward model and avoid the slow convergence of MCMC, we propose a new method for uncertainty analysis based on sparse grid interpolation and quasi-Monte Carlo sampling. First, we construct a polynomial approximation of the forward model in the parameter space by using sparse grid interpolation. This approximation then defines an accurate surrogate posterior distribution that can be evaluated repeatedly at minimal computational cost. Second, instead of using MCMC, a quasi-Monte Carlo method is applied to draw samples in the parameter space. Then, the desired probability density function of each prediction is approximated by accumulating the posterior density values of all the samples according to the prediction values. Our method has the following advantages: (1) the polynomial approximation of the forward model on the sparse grid provides a very efficient evaluation of the surrogate posterior distribution; (2) the quasi-Monte Carlo method retains the same accuracy in approximating the PDF of predictions but avoids all disadvantages of MCMC. The proposed method is applied to a controlled numerical experiment of groundwater flow modeling. The results show that our method attains the same accuracy much more efficiently
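
The two-stage idea, a polynomial surrogate of the expensive forward model evaluated at quasi-random samples, can be sketched in 1-D (Chebyshev interpolation stands in for the sparse grid, and the forward model is a made-up function, not a groundwater solver):

```python
import math

def expensive_model(k):
    """Stand-in forward model (e.g. the output of a groundwater solver)."""
    return math.exp(-k) * math.sin(3.0 * k)

def build_surrogate(f, a, b, degree=8):
    """Lagrange interpolant on Chebyshev nodes: a 1-D analogue of the
    sparse-grid polynomial approximation of the forward model."""
    nodes = [0.5*(a+b) + 0.5*(b-a)*math.cos(math.pi*(2*i+1)/(2*(degree+1)))
             for i in range(degree + 1)]
    vals = [f(x) for x in nodes]
    def surrogate(x):
        total = 0.0
        for i, xi in enumerate(nodes):
            w = 1.0
            for j, xj in enumerate(nodes):
                if i != j:
                    w *= (x - xj) / (xi - xj)
            total += w * vals[i]
        return total
    return surrogate

def vdc(i, base=2):
    """Van der Corput sequence: a simple quasi-Monte Carlo point set."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

surr = build_surrogate(expensive_model, 0.0, 2.0)
# Quasi-random parameter samples hit the cheap surrogate, not the model;
# `samples` is the predictive ensemble whose histogram approximates the PDF.
samples = [surr(2.0 * vdc(i)) for i in range(1, 513)]
err = max(abs(surr(x / 100) - expensive_model(x / 100)) for x in range(201))
print(err < 1e-2)   # True: the degree-8 interpolant is accurate on [0, 2]
```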

  10. G4-STORK: A Geant4-based Monte Carlo reactor kinetics simulation code

    International Nuclear Information System (INIS)

    Russell, Liam; Buijs, Adriaan; Jonkmans, Guy

    2014-01-01

    Highlights: • G4-STORK is a new, time-dependent, Monte Carlo code for reactor physics applications. • G4-STORK was built by adapting and expanding on the Geant4 Monte Carlo toolkit. • G4-STORK was designed to simulate short-term fluctuations in reactor cores. • G4-STORK is well suited for simulating sub- and supercritical assemblies. • G4-STORK was verified through comparisons with DRAGON and MCNP. - Abstract: In this paper we introduce G4-STORK (Geant4 STOchastic Reactor Kinetics), a new, time-dependent, Monte Carlo particle tracking code for reactor physics applications. G4-STORK was built by adapting and expanding on the Geant4 Monte Carlo toolkit. The toolkit provides the fundamental physics models and particle tracking algorithms that track each particle in space and time. It is a framework for further development (e.g. for projects such as G4-STORK). G4-STORK derives reactor physics parameters (e.g. k_eff) from the continuous evolution of a population of neutrons in space and time in the given simulation geometry. In this paper we detail the major additions to the Geant4 toolkit that were necessary to create G4-STORK. These include a renormalization process that maintains a manageable number of neutrons in the simulation even in very sub- or supercritical systems, scoring processes (e.g. recording fission locations, total neutrons produced and lost, etc.) that allow G4-STORK to calculate the reactor physics parameters, and dynamic simulation geometries that can change over the course of the simulation to elicit reactor kinetics responses (e.g. fuel temperature reactivity feedback). The additions are verified through simple simulations and code-to-code comparisons with established reactor physics codes such as DRAGON and MCNP. Additionally, G4-STORK was developed to run a single simulation in parallel over many processors using MPI (Message Passing Interface) pipes
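
A renormalization process of the kind described, keeping the tracked population near a target size in sub- or supercritical systems, can be sketched as stochastic duplication/killing with the correct expectation (a real code such as G4-STORK would also record the renormalization factor so physical rates remain unbiased; this sketch ignores weights):

```python
import random

def renormalize(neutrons, target, rng):
    """Pull the simulated population toward `target` without biasing the
    mean: each neutron is kept int(ratio) times plus one extra copy with
    probability frac(ratio), so the expected count is exactly `target`."""
    ratio = target / len(neutrons)
    out = []
    for n in neutrons:
        copies = int(ratio)
        if rng.random() < ratio - copies:
            copies += 1
        out.extend([n] * copies)
    return out

rng = random.Random(42)
pop = list(range(10_000))          # supercritical burst: too many neutrons
pop = renormalize(pop, 1_000, rng)
print(len(pop))                    # close to the 1,000 target
```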

  11. Evaluation of IMRT plans of prostate carcinoma from four treatment planning systems based on Monte Carlo

    International Nuclear Information System (INIS)

    Chi Zifeng; Han Chun; Liu Dan; Cao Yankun; Li Runxiao

    2011-01-01

    Objective: To recalculate with the Monte Carlo method the IMRT dose distributions from four TPSs, providing a platform for independent comparison and evaluation of plan quality. These results will help make a clinical decision as to which TPS will be used for prostate IMRT planning. Methods: Eleven prostate cancer cases were planned with the Corvus, Xio, Pinnacle and Eclipse TPSs. The plans were recalculated by Monte Carlo using the leaf sequences and MUs of the individual plans. Dose-volume histograms and isodose distributions were compared. Other quantities such as Dmin (the minimum dose received by 99% of the CTV/PTV), Dmax (the maximum dose received by 1% of the CTV/PTV), V110%, V105%, V95% (the volume of the CTV/PTV receiving 110%, 105%, 95% of the prescription dose), the volume of rectum and bladder receiving >65 Gy and >40 Gy, and the volume of the femoral heads receiving >50 Gy were evaluated. Total segments and MUs were also compared. Results: The Monte Carlo results agreed with the dose distributions from the TPSs to within 3%/3 mm. The Xio, Pinnacle and Eclipse plans show less target dose heterogeneity and lower V65 and V40 for the rectum and bladder compared to the Corvus plans. The PTV Dmin is about 2 Gy lower for the Xio plans than for the others, while the Corvus plans have a slightly lower femoral head V50 (by 0.03% and 0.58%) than the others. The Corvus plans require by far the most segments (187.8) and MUs (1264.7) to deliver, and the Pinnacle plans require the fewest segments (82.4) and MUs (703.6). Conclusions: We have tested an independent Monte Carlo dose calculation system for dose reconstruction and plan evaluation. This system provides a platform for the fair comparison and evaluation of treatment plans to facilitate clinical decision making in selecting a TPS and beam delivery system for particular treatment sites. (authors)
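
The dose-volume quantities compared in such studies can be extracted from a dose grid with a few array operations. A minimal sketch (the definitions follow the abstract's conventions; the dose array and prescription are synthetic):

```python
import numpy as np

def dvh_metrics(dose, prescription):
    """Dose-volume metrics of the kind compared in the study: Dmin/Dmax as
    the doses bounding the coldest/hottest 1% of the volume, and Vx as the
    fractional volume receiving at least x% of the prescription dose."""
    d = np.ravel(np.asarray(dose, dtype=float))
    return {
        "Dmin": float(np.percentile(d, 1)),
        "Dmax": float(np.percentile(d, 99)),
        "V95%": float(np.mean(d >= 0.95 * prescription)),
        "V105%": float(np.mean(d >= 1.05 * prescription)),
    }

dose = np.full(1000, 78.0)     # uniform 78 Gy plan
dose[:10] = 70.0               # with a 1% cold spot
m = dvh_metrics(dose, prescription=78.0)
print(m["V95%"])               # 0.99: 99% of voxels receive >= 95% of 78 Gy
```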

  12. Simulation based sequential Monte Carlo methods for discretely observed Markov processes

    OpenAIRE

    Neal, Peter

    2014-01-01

    Parameter estimation for discretely observed Markov processes is a challenging problem. However, simulation of Markov processes is straightforward using the Gillespie algorithm. We exploit this ease of simulation to develop an effective sequential Monte Carlo (SMC) algorithm for obtaining samples from the posterior distribution of the parameters. In particular, we introduce two key innovations, coupled simulations, which allow us to study multiple parameter values on the basis of a single sim...

  13. Uncertainty Analysis of Power Grid Investment Capacity Based on Monte Carlo

    Science.gov (United States)

    Qin, Junsong; Liu, Bingyi; Niu, Dongxiao

    By analyzing the factors that influence the investment capacity of a power grid, an investment capacity analysis model is built taking depreciation cost, sales price and sales quantity, net profit, financing, and the GDP of the secondary industry as model variables. After carrying out Kolmogorov-Smirnov tests, the probability distribution of each influence factor is obtained. Finally, the uncertainty analysis results for grid investment capacity are obtained by Monte Carlo simulation.
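
The Monte Carlo step amounts to sampling each influence factor from its fitted distribution and propagating the samples through the capacity model. A minimal sketch (the normal distributions, units, and the additive model are illustrative stand-ins for the fitted ones):

```python
import random, statistics

def simulate_investment_capacity(n=20_000, seed=3):
    """Monte Carlo propagation of factor uncertainty into investment
    capacity; the distributions and linear model are placeholders for
    those identified by the Kolmogorov-Smirnov step."""
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        net_profit   = rng.gauss(100.0, 15.0)   # illustrative units
        depreciation = rng.gauss(40.0, 5.0)
        financing    = rng.gauss(60.0, 20.0)
        results.append(net_profit + depreciation + financing)
    return results

caps = simulate_investment_capacity()
mean = statistics.fmean(caps)
cuts = statistics.quantiles(caps, n=20)
p5, p95 = cuts[0], cuts[-1]
print(round(mean), (round(p5), round(p95)))   # mean with a 5th-95th percentile band
```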

  14. Design and evaluation of a Monte Carlo based model of an orthovoltage treatment system

    International Nuclear Information System (INIS)

    Penchev, Petar; Maeder, Ulf; Fiebich, Martin; Zink, Klemens; University Hospital Marburg

    2015-01-01

    The aim of this study was to develop a flexible framework of an orthovoltage treatment system capable of calculating and visualizing dose distributions in different phantoms and CT datasets. The framework provides a complete set of various filters, applicators and X-ray energies and therefore can be adapted to varying studies or be used for educational purposes. A dedicated user friendly graphical interface was developed allowing for easy setup of the simulation parameters and visualization of the results. For the Monte Carlo simulations the EGSnrc Monte Carlo code package was used. Building the geometry was accomplished with the help of the EGSnrc C++ class library. The deposited dose was calculated according to the KERMA approximation using the track-length estimator. The validation against measurements showed a good agreement within 4-5% deviation, down to depths of 20% of the depth dose maximum. Furthermore, to show its capabilities, the validated model was used to calculate the dose distribution on two CT datasets. Typical Monte Carlo calculation time for these simulations was about 10 minutes achieving an average statistical uncertainty of 2% on a standard PC. However, this calculation time depends strongly on the used CT dataset, tube potential, filter material/thickness and applicator size.
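
The KERMA approximation with a track-length estimator, as used in this framework, scores dose from chord lengths rather than discrete energy-deposition events. A toy 1-D version (the geometry and KERMA coefficient are illustrative):

```python
def track_length_dose(tracks, voxel_edges, kerma_per_fluence, voxel_volume):
    """KERMA-approximation scoring: each photon track contributes its chord
    length in a voxel to the fluence estimate (track length / volume), which
    a KERMA coefficient converts to deposited dose."""
    n = len(voxel_edges) - 1
    dose = [0.0] * n
    for z0, z1, weight in tracks:          # 1-D tracks along z, with weights
        for i in range(n):
            lo, hi = voxel_edges[i], voxel_edges[i + 1]
            chord = max(0.0, min(z1, hi) - max(z0, lo))
            dose[i] += weight * chord * kerma_per_fluence / voxel_volume
    return dose

# One unit-weight track fully crossing two 1-cm voxels:
d = track_length_dose([(0.0, 2.0, 1.0)], [0.0, 1.0, 2.0],
                      kerma_per_fluence=1.0, voxel_volume=1.0)
print(d)   # [1.0, 1.0]
```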

  15. Proton tunneling in the A∙T Watson-Crick DNA base pair: myth or reality?

    Science.gov (United States)

    Brovarets', Ol'ha O; Hovorun, Dmytro M

    2015-01-01

    The results and conclusions reached by Godbeer et al. in their recent work - that proton tunneling in the A∙T(WC) Watson-Crick (WC) DNA base pair occurs according to Löwdin's (L) model, but with a small (~10^-9) probability - were critically analyzed. Here, it was shown that this finding overestimates the possibility of proton tunneling at the A∙T(WC)↔A*∙T*(L) tautomerization, because this process cannot be implemented as a chemical reaction. Furthermore, those biologically important nucleobase mispairs (A∙A*↔A*∙A, G∙G*↔G*∙G, T∙T*↔T*∙T, C∙C*↔C*∙C, H∙H*↔H*∙H (H - hypoxanthine)) - the players in the field of spontaneous point mutagenesis - were outlined, where proton tunneling is expected and for which the application of the model proposed by Godbeer et al. can be productive.

  16. Novel proton exchange membrane based on crosslinked poly(vinyl alcohol) for direct methanol fuel cells

    Science.gov (United States)

    Liu, Chien-Pan; Dai, Chi-An; Chao, Chi-Yang; Chang, Shoou-Jinn

    2014-03-01

    In this study, we report the synthesis and characterization of poly(vinyl alcohol) based proton-conducting membranes. In particular, we describe novel physically and chemically crosslinked PVA/HFA (poly(vinyl alcohol)/hexafluoroglutaric acid) blend membranes with BASANa (benzenesulfonic acid sodium salt) and GA (glutaraldehyde) as binary reaction agents. The key PEM parameters such as ion exchange capacity (IEC), water uptake, proton conductivity, and methanol permeability were controlled by adjusting the chemical composition of the membranes. The IEC value of the membrane is found to be an important parameter affecting the water uptake, conductivity, and permeability of the resulting membrane. Plots of water uptake, conductivity, and methanol permeability vs. IEC show a distinct change in slope at roughly the same IEC value, which suggests a transition in the network structure. The proton conductivities and methanol permeabilities of all the membranes are in the ranges of 10^-3-10^-2 S cm^-1 and 10^-8-10^-7 cm^2 s^-1, respectively, depending on the binary crosslinking density, and the membranes show high selectivity compared with Nafion®-117. The membranes display good mechanical properties, which suggests a good usable lifetime for membranes applied in DMFCs.

  17. SPL-based Proton Driver for a nu-Factory at CERN

    CERN Document Server

    Benedetto, E; Garoby, R; Meddahi, M

    2010-01-01

    The conceptual design and feasibility studies for a nu-Factory Proton Driver based on the CERN Superconducting Proton Linac (SPL) have been completed. In the proposed scenario, the 4 MW proton beam (H- beam) is accelerated with the upgraded High Power (HP)-SPL to 5 GeV, stored in an accumulator ring and finally transported to a compressor ring, where bunch rotation takes place, in order to achieve the specific time structure. We here summarize the choices in terms of lattice, magnet technology and RF manipulations in the two rings. The possible critical issues, such as heating of the foil for the charge-exchange injection, space-charge problems in the compressor and beam stability in the accumulator ring, have been addressed and are shown not to be show-stoppers. The analysis focuses on the baseline scenario, considering 6 bunches in the accumulator, and preliminary studies are discussed for the option of 3 or a single bunch per burst.

  18. Cloud-based Monte Carlo modelling of BSSRDF for the rendering of human skin appearance (Conference Presentation)

    Science.gov (United States)

    Doronin, Alexander; Rushmeier, Holly E.; Meglinski, Igor; Bykov, Alexander V.

    2016-03-01

    We present a new Monte Carlo based approach for modelling the Bidirectional Scattering-Surface Reflectance Distribution Function (BSSRDF) for accurate rendering of human skin appearance. Variations in both skin tissue structure and the major chromophores are taken into account, corresponding to different ethnic and age groups. The computational solution utilizes HTML5, accelerated by graphics processing units (GPUs), and is therefore convenient for practical use on most modern computer-based devices and operating systems. The simulated human skin reflectance spectra, corresponding skin colours, and examples of 3D face rendering are presented and compared with the results of phantom studies.

  19. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    International Nuclear Information System (INIS)

    Zhu, T.

    2015-01-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation and safety related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently in 2011, the capability of calculating continuous-energy κ_eff sensitivity to nuclear data was demonstrated in certain M-C codes by using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and implemented to work with MCNPX’s ACE format of nuclear data, which also gives NUSS compatibility with the MCNP and SERPENT M-C codes. In contrast, multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same
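
The stochastic-sampling idea behind such a tool can be illustrated on a one-group model: perturb nominal nuclear data with sampled multiplicative factors, re-evaluate the model, and read the output spread as the propagated uncertainty (the 2% standard deviations and one-group constants below are invented for illustration, not evaluated covariance data):

```python
import random, statistics

def k_inf(nu_sigma_f, sigma_a):
    """Infinite-medium multiplication factor of a one-group model."""
    return nu_sigma_f / sigma_a

def sample_k_uncertainty(n=5000, seed=11):
    """Stochastic-sampling UQ sketch: perturb nominal one-group data with
    normally distributed factors and attribute the spread of k_inf to the
    nuclear data uncertainty (2% relative sigmas are illustrative)."""
    rng = random.Random(seed)
    ks = []
    for _ in range(n):
        f_fis = rng.gauss(1.0, 0.02)      # sampled factor on nu*Sigma_f
        f_abs = rng.gauss(1.0, 0.02)      # sampled factor on Sigma_a
        ks.append(k_inf(0.0104 * f_fis, 0.0100 * f_abs))
    return statistics.fmean(ks), statistics.stdev(ks)

mean_k, sd_k = sample_k_uncertainty()
print(round(mean_k, 2), round(sd_k / mean_k, 3))   # about 1.04 with ~3% spread
```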

  20. Sampling-based nuclear data uncertainty quantification for continuous energy Monte-Carlo codes

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, T.

    2015-07-01

    Research on the uncertainty of nuclear data is motivated by practical necessity. Nuclear data uncertainties can propagate through nuclear system simulations into operation and safety related parameters. The tolerance for uncertainties in nuclear reactor design and operation can affect the economic efficiency of nuclear power, and essentially its sustainability. The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. For example, only recently, in 2011, was the capability of calculating continuous-energy κ_eff sensitivity to nuclear data demonstrated in certain M-C codes, using the method of iterated fission probability. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance with presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool is based on this sampling approach and implemented to work with MCNPX’s ACE format of nuclear data, which also gives NUSS compatibility with the MCNP and SERPENT M-C codes. Multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner, due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same
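
    The sampling-based NDUQ approach summarized above can be illustrated with a toy calculation: perturb multigroup data with random factors, rerun a cheap model many times, and read the nuclear-data uncertainty off the output spread. The two-group constants, their relative standard deviations, and the k_inf model below are all invented stand-ins for real ACE data and an MCNPX run; this is not the NUSS tool itself.

```python
import random
import statistics

random.seed(42)

# Nominal two-group constants (invented): nu*Sigma_f and Sigma_a per group,
# with assumed relative standard deviations standing in for a multigroup
# covariance library.
nominal = {"nu_sf": [0.008, 0.135], "sig_a": [0.010, 0.110]}
rel_std = {"nu_sf": [0.02, 0.01], "sig_a": [0.03, 0.02]}

def k_inf(nu_sf, sig_a):
    """Toy infinite-multiplication model: production over absorption."""
    return sum(nu_sf) / sum(sig_a)

samples = []
for _ in range(2000):
    # sample each group constant around its nominal value
    nu_sf = [x * random.gauss(1.0, s) for x, s in zip(nominal["nu_sf"], rel_std["nu_sf"])]
    sig_a = [x * random.gauss(1.0, s) for x, s in zip(nominal["sig_a"], rel_std["sig_a"])]
    samples.append(k_inf(nu_sf, sig_a))

mean_k = statistics.fmean(samples)
rel_unc = statistics.stdev(samples) / mean_k
print(f"k_inf = {mean_k:.4f} +/- {100 * rel_unc:.2f}% (nuclear-data component)")
```

    With genuine covariance data the random factors would be correlated across groups and reactions; independent Gaussians are used here only to keep the sketch short.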

  1. SU-C-207A-05: Feature Based Water Equivalent Path Length (WEPL) Determination for Proton Radiography by the Technique of Time Resolved Dose Measurement

    International Nuclear Information System (INIS)

    Zhang, R; Jee, K; Sharp, G; Flanz, J; Lu, H

    2016-01-01

    Purpose: Studies show that WEPL can be determined from modulated dose rate functions (DRF). However, the previous calibration method, based on statistics of the DRF, is sensitive to energy mixing of protons due to scattering through different materials (termed range mixing here), causing inaccuracies in the determination of WEPL. This study explores time-domain features of the DRF to reduce the effect of range mixing in proton radiography (pRG) by this technique. Methods: An amorphous silicon flat panel (PaxScan™ 4030CB, Varian Medical Systems, Inc., Palo Alto, CA) was placed behind phantoms to measure DRFs from a proton beam modulated by a specially designed modulator wheel. The performance of two methods, the previously used method based on the root mean square (RMS) and the new approach based on time-domain features of the DRF, was compared for retrieving WEPL and relative stopping power (RSP) from pRG of a Gammex phantom. Results: Calibration by T_80 (the time point for 80% of the major peak) was more robust to range mixing and produced WEPL with improved accuracy. The error of RSP was reduced from 8.2% to 1.7% for lung equivalent material, with the mean error for all other materials reduced from 1.2% to 0.7%. The mean error of the full width at half maximum (FWHM) of retrieved inserts decreased from 25.85% to 5.89% for the RMS and T_80 methods, respectively. Monte Carlo simulations in simplified cases also demonstrated that the T_80 method is less sensitive to range mixing than the RMS method. Conclusion: WEPL images have been retrieved based on single flat panel measured DRFs, with inaccuracies reduced by exploiting time-domain features as the calibration parameter. The T_80 method is validated to be less sensitive to range mixing and can thus retrieve the WEPL values in proximity of interfaces with improved numerical and spatial accuracy for proton radiography.
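
    The two calibration features compared in this record are easy to state concretely: the statistics-based RMS of the dose-rate function versus the time point at 80% of the major peak (T_80). The Gaussian pulse below is an invented stand-in for a panel-measured DRF, used only to show how each feature is computed.

```python
import math

dt = 0.1  # ms per sample (assumed sampling interval)
# synthetic DRF: a single Gaussian pulse peaking at 5.0 ms, sigma 1.2 ms
drf = [math.exp(-0.5 * ((t * dt - 5.0) / 1.2) ** 2) for t in range(120)]

def rms_time(drf, dt):
    """Dose-rate-weighted RMS spread in time (the statistics-based feature)."""
    total = sum(drf)
    mean_t = sum(i * dt * d for i, d in enumerate(drf)) / total
    var = sum(d * (i * dt - mean_t) ** 2 for i, d in enumerate(drf)) / total
    return math.sqrt(var)

def t80(drf, dt):
    """Time at which the rising edge first reaches 80% of the major peak."""
    level = 0.8 * max(drf)
    for i in range(1, len(drf)):
        if drf[i - 1] < level <= drf[i]:
            # linear interpolation between the two bracketing samples
            frac = (level - drf[i - 1]) / (drf[i] - drf[i - 1])
            return (i - 1 + frac) * dt
    raise ValueError("signal never reaches the 80% level")

print(f"RMS feature: {rms_time(drf, dt):.3f} ms, T_80: {t80(drf, dt):.3f} ms")
```

    Range mixing distorts the tails of the DRF, which shifts a whole-waveform statistic like the RMS more than an edge feature like T_80; that is the intuition behind the reported robustness.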

  2. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy.

    Science.gov (United States)

    Martinez-Rovira, I; Sempau, J; Prezado, Y

    2012-05-01

    Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimetres in volume) from the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Good agreement between MC simulations and experimental results was achieved, even at the interfaces between two
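
    In its simplest form, the voxel-grid decoupling mentioned in this record reduces to indexing: dose is scored on a micrometre-pitch bin grid while material data stay on the coarse CT grid, and each dose bin looks up the CT voxel that contains it. The pitches and materials below are invented for illustration; the actual penelope/penEasy implementation is more involved.

```python
ct_pitch_mm = 1.0      # coarse CT voxel size along the transversal direction (assumed)
bin_pitch_mm = 0.005   # 5 um dose-bin pitch across the microbeams (assumed)
materials = ["tissue", "bone", "tissue"]  # three CT voxels along x (invented)

def material_of_bin(i_bin):
    """Map a fine dose bin to the CT voxel (hence material) containing it."""
    x_mm = (i_bin + 0.5) * bin_pitch_mm           # bin-centre position
    return materials[min(int(x_mm / ct_pitch_mm), len(materials) - 1)]

print(material_of_bin(0), material_of_bin(250), material_of_bin(599))
```

    This one-way mapping lets the micrometre dose grid resolve the microbeam peaks and valleys without refining the millimetre-scale CT geometry.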

  3. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)

    2012-05-15

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimetres in volume) from the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at

  4. Automatic commissioning of a GPU-based Monte Carlo radiation dose calculation code for photon radiotherapy

    International Nuclear Information System (INIS)

    Tian, Zhen; Jia, Xun; Jiang, Steve B; Graves, Yan Jiang

    2014-01-01

    Monte Carlo (MC) simulation is commonly considered the most accurate method for radiation dose calculations. Commissioning of a beam model in the MC code against a clinical linear accelerator beam is of crucial importance for its clinical implementation. In this paper, we propose an automatic commissioning method for our GPU-based MC dose engine, gDPM. gDPM utilizes a beam model based on the concept of a phase-space-let (PSL). A PSL contains a group of particles that are of the same type and close in space and energy. A set of generic PSLs was generated by splitting a reference phase-space file. Each PSL was associated with a weighting factor, and in dose calculations each particle carried a weight corresponding to the PSL it came from. The dose for each PSL in water was pre-computed, and hence the dose in water for a whole beam under a given set of PSL weighting factors was the weighted sum of the PSL doses. At the commissioning stage, an optimization problem was solved to adjust the PSL weights in order to minimize the difference between the calculated dose and the measured one. Symmetry and smoothness regularizations were utilized to uniquely determine the solution. An augmented Lagrangian method was employed to solve the optimization problem. To validate our method, a phase-space file of a Varian TrueBeam 6 MV beam was used to generate the PSLs for 6 MV beams. In a simulation study, we commissioned a Siemens 6 MV beam for which a set of field-dependent phase-space files was available. The dose data of this desired beam for different open fields and a small off-axis open field were obtained by calculating doses using these phase-space files. The 3D γ-index test passing rate within the regions with dose above 10% of the maximum dose (d_max) for those open fields improved on average from 70.56% to 99.36% for the 2%/2 mm criteria and from 32.22% to 89.65% for the 1%/1 mm criteria. We also tested our commissioning method on a six-field head-and-neck cancer IMRT plan. The
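
    The commissioning step described above can be caricatured as a regularized linear least-squares problem: the pre-computed per-PSL doses form a matrix, and the weights are fitted so the weighted sum matches measurement. The matrix, "true" weights, and the plain lstsq-with-smoothness solver below are invented simplifications of gDPM's augmented-Lagrangian formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_psl, n_meas = 8, 40
D = rng.random((n_meas, n_psl))        # pre-computed dose of each PSL at each point
w_true = np.linspace(1.2, 0.8, n_psl)  # "machine" weights we try to recover (invented)
d_meas = D @ w_true                    # stand-in for the measured beam dose

# Smoothness regularizer: penalize differences between neighbouring PSL weights.
L = np.diff(np.eye(n_psl), axis=0)     # first-difference operator
lam = 1e-2
A = np.vstack([D, np.sqrt(lam) * L])
b = np.concatenate([d_meas, np.zeros(n_psl - 1)])
w_fit, *_ = np.linalg.lstsq(A, b, rcond=None)

print("max weight error:", np.abs(w_fit - w_true).max())
```

    The paper additionally imposes symmetry constraints and solves with an augmented Lagrangian; stacking the penalty rows into one least-squares system is just the shortest way to show the same structure.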

  5. Basic features of proton-proton interactions at ultra-relativistic energies and RFT-based quark-gluon string model

    Directory of Open Access Journals (Sweden)

    Zabrodin E.

    2017-01-01

    Proton-proton collisions at energies from √s = 200 GeV up to √s = 14 TeV are studied within the microscopic quark-gluon string model (QGSM). The model is based on Gribov’s Reggeon Field Theory complemented by string phenomenology. Comparison with experimental data shows that QGSM describes well particle yields, rapidity and transverse momentum spectra, the rise of mean 〈pT〉, and forward-backward multiplicity correlations. The latter arise in QGSM because of the addition of various processes with different mean multiplicities. The model also indicates fulfillment of extended longitudinal scaling and violation of Koba-Nielsen-Olesen scaling at the LHC. The origin of both features is traced to short-range particle correlations in the strings. Predictions are made for √s = 14 TeV.

  6. A Monte Carlo based development of a cavity theory for solid state detectors irradiated in electron beams

    International Nuclear Information System (INIS)

    Mobit, P.

    2002-01-01

    Recent Monte Carlo simulations have shown that the assumption in the small cavity theory (and the extension of the small cavity theory by Spencer-Attix) that the cavity does not perturb the electron fluence is seriously flawed. For depths beyond d_max, not only is there a significant difference between the energy spectra in the medium and in the solid cavity materials, but there is also a significant difference in the number of low-energy electrons which cannot travel across the solid cavity and hence deposit their dose in it (i.e. stopper electrons whose residual range is less than the cavity thickness). The number of these low-energy electrons that are not able to travel across the solid state cavity increases with depth and with the effective thickness of the detector. This also invalidates the assumption in the small cavity theory that most of the dose deposited in a small cavity is delivered by crossers. Based on Monte Carlo simulations, a new cavity theory for solid state detectors irradiated in electron beams has been proposed: D_med(p) = D_det(p) × s_med,det^S-A × γ_e(p) × S_T, where D_med(p) is the dose to the medium at point p, D_det(p) is the average detector dose at the same point, s_med,det^S-A is the Spencer-Attix mass collision stopping power ratio of the medium to the detector material, γ_e(p) is the electron fluence perturbation correction factor, and S_T is a stopper-to-crosser correction factor accounting for the dependence of the stopper-to-crosser ratio on depth and the effective cavity size. All the terms in this equation have been computed by Monte Carlo simulation. The new cavity theory has been tested against the Spencer-Attix cavity equation as the small cavity limiting case and also against Monte Carlo simulations. The agreement between this new cavity theory and Monte Carlo simulations is within 0.3%. (author)
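
    As a worked instance of the cavity relation proposed in this record, the medium dose is the detector dose multiplied by three correction factors. Every numerical value below is invented for illustration; the paper obtains the actual factors from Monte Carlo simulation.

```python
# D_med(p) = D_det(p) * s_med,det^S-A * gamma_e(p) * S_T
d_det = 1.000         # Gy, average dose scored in the solid-state detector (assumed)
s_sa_med_det = 1.032  # Spencer-Attix stopping-power ratio, medium/detector (assumed)
gamma_e = 0.994       # electron-fluence perturbation correction (assumed)
s_t = 1.008           # stopper-to-crosser correction (assumed)

d_med = d_det * s_sa_med_det * gamma_e * s_t
print(f"D_med = {d_med:.4f} Gy")
```

    Setting S_T = 1 recovers the Spencer-Attix limit (with the fluence perturbation), which is the small-cavity limiting case the theory is tested against.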

  7. Dosimetric performance evaluation regarding proton beam incident angles of a lithium-based AB-BNCT design

    International Nuclear Information System (INIS)

    Lee, Pei-Yi; Jiang, Shiang-Huei; Liu, Yuan-Hao

    2014-01-01

    The 7Li(p,xn)7Be nuclear reaction, driven by low-energy protons, can produce soft neutrons for accelerator-based boron neutron capture therapy (AB-BNCT). Because the induced neutron field is relatively divergent, the relationship between the incident angle of the proton beam and the neutron beam quality was evaluated in this study. To provide an intense epithermal neutron beam, a beam-shaping assembly (BSA) was designed, and a modified Snyder head phantom was used in the calculations to evaluate the dosimetric performance. From the calculated results, the intensity of epithermal neutrons increased with the proton incident angle. Hence, either the irradiation time or the required proton current can be reduced. When the incident angle of the 2.5-MeV proton beam is 120 deg., the required proton current is ∼13.3 mA for an irradiation time of half an hour. The results of this study show that the BSA designs can generate neutron beams with good intensity and penetrability. Using a 20-mA, 2.5-MeV proton beam as the source, the required irradiation time, to induce 60 RBE-Gy of maximum tumour dose, is less than half an hour in any proton beam alignment. On the premise that the dosimetric performances are similar, the intensity of epithermal neutrons can be increased by using non-collinear (e.g. 90 deg., 120 deg.) incident protons. Thus, either the irradiation time or the required proton current can be reduced. The use of the 120 deg. BSA model shows the possibility of reducing the required proton current to ∼13.3 mA when the goal of irradiation time is 30 min. The decrease of the required proton beam current will certainly make the use of a lithium target much easier. In June 2013, a 5-MeV, 30-mA radio frequency quadrupole (RFQ) accelerator for BNCT was built at INFN-LNL (Legnaro National Laboratories, Italy), which shows the possibility of building a suitable RFQ accelerator for the authors' design. In addition, a 2.5-MeV, 30-mA Tandem accelerator was
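
    The trade-off between proton current and irradiation time quoted above is a simple inverse proportion: the prescribed tumour dose fixes the integrated proton charge on target. The charge below is back-derived from the record's 13.3 mA / 30 min working point for the 120 deg. geometry; the other rows are plain rescalings, not simulated values.

```python
# integrated charge (mA*h) delivering 60 RBE-Gy maximum tumour dose,
# derived from the 13.3 mA / 0.5 h working point quoted in the abstract
charge_mah = 13.3 * 0.5

for minutes in (15, 30, 60):
    current_ma = charge_mah / (minutes / 60.0)  # current needed for that duration
    print(f"{minutes:2d} min -> {current_ma:5.2f} mA")
```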

  8. TU-F-CAMPUS-T-05: A Cloud-Based Monte Carlo Dose Calculation for Electron Cutout Factors

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, T; Bush, K [Stanford School of Medicine, Stanford, CA (United States)

    2015-06-15

    Purpose: For electron cutouts of smaller sizes, it is necessary to verify electron cutout factors due to perturbations in electron scattering. Often, this requires a physical measurement using a small ion chamber, diode, or film. The purpose of this study is to develop a fast Monte Carlo based dose calculation framework that requires only a smart phone photograph of the cutout and specification of the SSD and energy to determine the electron cutout factor, with the ultimate goal of making this cloud-based calculation widely available to the medical physics community. Methods: The algorithm uses a pattern recognition technique to identify the corners of the cutout in the photograph, as shown in Figure 1. It then corrects for variations in perspective, scaling, and translation of the photograph introduced by the user’s positioning of the camera. Blob detection is used to identify the portions of the cutout which comprise the aperture and the portions which are cutout material. This information is then used to define the physical densities of the voxels used in the Monte Carlo dose calculation algorithm, as shown in Figure 2, and to select a particle source from a pre-computed library of phase-spaces scored above the cutout. The electron cutout factor is obtained by taking a ratio of the maximum dose delivered with the cutout in place to the dose delivered under calibration/reference conditions. Results: The algorithm has been shown to successfully identify all necessary features of the electron cutout to perform the calculation. Subsequent testing will be performed to compare the Monte Carlo results with a physical measurement. Conclusion: A simple, cloud-based method of calculating electron cutout factors could eliminate the need for physical measurements and substantially reduce the time required to properly assure accurate dose delivery.
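
    The final ratio step described above is simple once the Monte Carlo doses exist. The binary mask below is an invented placeholder for the blob-detection output, and the dose numbers are invented placeholders for the MC results; only the arithmetic is shown.

```python
# 1 = open aperture pixel, 0 = cutout material (stand-in for blob detection)
aperture_mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
open_fraction = sum(map(sum, aperture_mask)) / (len(aperture_mask) * len(aperture_mask[0]))

d_max_cutout = 0.962     # Gy/MU with the cutout in place (invented MC result)
d_max_reference = 1.000  # Gy/MU under calibration/reference conditions (invented)

cutout_factor = d_max_cutout / d_max_reference
print(f"open fraction {open_fraction:.2f}, cutout factor {cutout_factor:.3f}")
```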

  9. Demonstration of a zero-variance based scheme for variance reduction to a mini-core Monte Carlo calculation

    Energy Technology Data Exchange (ETDEWEB)

    Christoforou, Stavros, E-mail: stavros.christoforou@gmail.com [Kirinthou 17, 34100, Chalkida (Greece); Hoogenboom, J. Eduard, E-mail: j.e.hoogenboom@tudelft.nl [Department of Applied Sciences, Delft University of Technology (Netherlands)

    2011-07-01

    A zero-variance based scheme is implemented and tested in the MCNP5 Monte Carlo code. The scheme is applied to a mini-core reactor using the adjoint function obtained from a deterministic calculation for biasing the transport kernels. It is demonstrated that the variance of the k_eff estimate is halved compared to a standard criticality calculation. In addition, the biasing does not affect source distribution convergence of the system. However, since the code lacked optimisations for speed, we were not able to demonstrate an appropriate increase in the efficiency of the calculation, because of the higher CPU time cost. (author)
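
    The efficiency caveat in this record is conventionally quantified with the Monte Carlo figure of merit, FOM = 1/(σ²T): halving the variance only pays off if the CPU time less than doubles. The numbers below are invented to mirror the reported situation (variance roughly halved, CPU cost more than doubled); they are not taken from the paper.

```python
def fom(rel_sigma, cpu_seconds):
    """Monte Carlo figure of merit: 1 / (relative variance * CPU time)."""
    return 1.0 / (rel_sigma**2 * cpu_seconds)

fom_analog = fom(rel_sigma=4.00e-4, cpu_seconds=1000.0)  # standard run (invented)
fom_biased = fom(rel_sigma=2.83e-4, cpu_seconds=2300.0)  # variance ~halved, >2x CPU

print(f"analog FOM {fom_analog:.1f}, biased FOM {fom_biased:.1f}")
```

    With these stand-in numbers the biased run has the lower FOM, i.e. the variance reduction does not compensate for the extra CPU time, which is the conclusion the authors draw.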

  10. Demonstration of a zero-variance based scheme for variance reduction to a mini-core Monte Carlo calculation

    International Nuclear Information System (INIS)

    Christoforou, Stavros; Hoogenboom, J. Eduard

    2011-01-01

    A zero-variance based scheme is implemented and tested in the MCNP5 Monte Carlo code. The scheme is applied to a mini-core reactor using the adjoint function obtained from a deterministic calculation for biasing the transport kernels. It is demonstrated that the variance of the k_eff estimate is halved compared to a standard criticality calculation. In addition, the biasing does not affect source distribution convergence of the system. However, since the code lacked optimisations for speed, we were not able to demonstrate an appropriate increase in the efficiency of the calculation, because of the higher CPU time cost. (author)

  11. Inorganic proton conducting electrolyte coupled oxide-based dendritic transistors for synaptic electronics.

    Science.gov (United States)

    Wan, Chang Jin; Zhu, Li Qiang; Zhou, Ju Mei; Shi, Yi; Wan, Qing

    2014-05-07

    Ionic/electronic hybrid devices with synaptic functions are considered to be the essential building blocks for neuromorphic systems and brain-inspired computing. Here, artificial synapses based on indium-zinc-oxide (IZO) transistors gated by nanogranular SiO2 proton-conducting electrolyte films are fabricated on glass substrates. Spike-timing dependent plasticity and paired-pulse facilitation are successfully mimicked in an individual bottom-gate transistor. Most importantly, dynamic logic and dendritic integration established by spatiotemporally correlated spikes are also mimicked in dendritic transistors with two in-plane gates as the presynaptic input terminals.

  12. Cherenkov radiation-based three-dimensional position-sensitive PET detector: A Monte Carlo study.

    Science.gov (United States)

    Ota, Ryosuke; Yamada, Ryoko; Moriya, Takahiro; Hasegawa, Tomoyuki

    2018-05-01

    Cherenkov radiation has recently received attention due to its prompt emission, which has the potential to improve the timing performance of radiation detectors dedicated to positron emission tomography (PET). In this study, a Cherenkov-based three-dimensional (3D) position-sensitive radiation detector was proposed, composed of a monolithic lead fluoride (PbF2) crystal and a photodetector array whose signals can be read out independently. Monte Carlo simulations were performed to estimate the performance of the proposed detector. The position and time resolutions were evaluated under various practical conditions. The radiator size and various properties of the photodetector, e.g., readout pitch and single photon timing resolution (SPTR), were parameterized. The single photon time response of the photodetector was assumed to be a single Gaussian for simplicity. The photon detection efficiency of the photodetector was assumed to be an ideal 100% for all wavelengths. Compton scattering was included in the simulations but only partly analyzed. To estimate the position at which a γ-ray interacted in the Cherenkov radiator, the center-of-gravity (COG) method was employed. In addition, to estimate the depth of interaction (DOI), principal component analysis (PCA), a multivariate analysis method used to identify patterns in data, was employed. The time-space distribution of Cherenkov photons was quantified to perform PCA. To evaluate the coincidence time resolution (CTR), the time difference of two independent γ-ray events was calculated. The detection time was defined as the first photon time after the SPTR of the photodetector was taken into account. The position resolution on the photodetector plane could be estimated with high accuracy using a small number of Cherenkov photons. Moreover, PCA showed an ability to estimate the DOI. The position resolution depends heavily on the pitch of the photodetector array and the radiator
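
    The centre-of-gravity (COG) estimate used above for the transverse interaction position is a count-weighted average over the photodetector pads. The pad positions and Cherenkov photon counts below are invented; the DOI estimate via PCA is omitted for brevity.

```python
# (x, y) pad centre in mm -> detected Cherenkov photons (invented counts)
counts = {
    (-3.0, 0.0): 2, (0.0, 0.0): 9, (3.0, 0.0): 4,
    (0.0, 3.0): 3, (0.0, -3.0): 2,
}

n = sum(counts.values())
x_cog = sum(x * c for (x, _), c in counts.items()) / n  # count-weighted mean x
y_cog = sum(y * c for (_, y), c in counts.items()) / n  # count-weighted mean y
print(f"COG estimate: ({x_cog:.2f}, {y_cog:.2f}) mm")
```

    Because only a handful of Cherenkov photons are detected per event, the COG works from a few counts rather than a full light distribution, which is why the readout pitch dominates the achievable position resolution.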

  13. Pattern recognition and data mining software based on artificial neural networks applied to proton transfer in aqueous environments

    International Nuclear Information System (INIS)

    Tahat Amani; Marti Jordi; Khwaldeh Ali; Tahat Kaher

    2014-01-01

    In computational physics, proton transfer phenomena can be viewed as pattern classification problems based on a set of input features, allowing classification of the proton motion into two categories: transfer ‘occurred’ and transfer ‘not occurred’. The goal of this paper is to evaluate the use of artificial neural networks in the classification of proton transfer events, based on a feed-forward back-propagation neural network used as a classifier to distinguish between the two transfer cases. In this paper, we use a newly developed data mining and pattern recognition tool for automating, controlling, and drawing charts of the output data of an existing Empirical Valence Bond code. The study analyzes the need for pattern recognition in aqueous proton transfer processes and how the error back-propagation learning approach (multilayer perceptron algorithms) can be satisfactorily employed in the present case. We present a tool for pattern recognition and validate the code with a real physical case study. The results of applying the artificial neural networks methodology to crowd patterns based upon selected physical properties (e.g., temperature, density) show the ability of the network to learn proton transfer patterns corresponding to properties of the aqueous environment, which in turn proves to be fully compatible with previous proton transfer studies. (condensed matter: structural, mechanical, and thermal properties)
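
    A minimal sketch of the kind of classifier this record describes: a feed-forward network with one hidden layer, trained by error back-propagation, labelling events as transfer 'occurred' (1) or 'not occurred' (0) from two input features. The synthetic features and labels below are invented stand-ins for the physical descriptors extracted from the Empirical Valence Bond output.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(200, 2))                 # two invented features
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)  # 1 = "transfer occurred"

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# one hidden layer of 8 units, plain batch error back-propagation
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5

for _ in range(1000):
    h = sigmoid(X @ W1 + b1)            # hidden activations
    out = sigmoid(h @ W2 + b2)          # predicted P(transfer occurred)
    d_out = (out - y) / len(X)          # gradient of mean cross-entropy at output
    d_h = (d_out @ W2.T) * h * (1.0 - h)  # error propagated back to hidden layer
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5
accuracy = float((pred == (y > 0.5)).mean())
print(f"training accuracy: {accuracy:.2f}")
```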

  14. Chemical characterization of materials relevant to nuclear technology using neutron and proton based nuclear analytical methods

    International Nuclear Information System (INIS)

    Acharya, R.

    2014-01-01

    Nuclear analytical techniques (NATs), utilizing neutron and proton based nuclear reactions and subsequent measurement of gamma rays, are capable of chemical characterization of various materials at major to trace concentration levels. The present article deals with the recent developments and applications of conventional and k0-based internal monostandard (i) neutron activation analysis (NAA) and (ii) prompt gamma ray NAA (PGNAA) methods as well as (iii) in situ current normalized particle induced gamma ray emission (PIGE). The materials that have been analyzed by NAA and PGNAA include (i) nuclear reactor structural materials like zircaloys, stainless steels, Ni alloys, high purity aluminium and graphite and (ii) uranium oxide, U-Th mixed oxides, uranium ores and minerals. Internal monostandard NAA (IM-NAA) method with in situ detection efficiency was used to analyze large and non-standard geometry samples and standard-less compositional characterization was carried out for zircaloys and stainless steels. PIGE methods using proton beams were standardized for quantification of low Z elements (Li to Ti) and applied for compositional analysis of borosilicate glass and lithium titanate (Li 2 TiO 3 ) samples and quantification of total B and its isotopic composition of B ( 10 B/ 11 B) in boron based neutron absorbers like B 4 C. (author)

  15. A lattice-based Monte Carlo evaluation of Canada Deuterium Uranium-6 safety parameters

    International Nuclear Information System (INIS)

    Kim, Yong Hee; Hartanto, Donny; Kim, Woo Song

    2016-01-01

    Important safety parameters such as the fuel temperature coefficient (FTC) and the power coefficient of reactivity (PCR) of the CANada Deuterium Uranium (CANDU-6) reactor have been evaluated using the Monte Carlo method. For accurate analysis of these parameters, the Doppler broadening rejection correction scheme was implemented in the MCNPX code to account for the thermal motion of the heavy uranium-238 nucleus in neutron-U scattering reactions. In this work, a standard fuel lattice has been modeled and the fuel is depleted using MCNPX. The FTC value is evaluated for several burnup points, including the mid-burnup representing a near-equilibrium core. The Doppler effect has been evaluated using several cross-section libraries, such as ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1.1, and JENDL-4.0. The PCR value is also evaluated at mid-burnup conditions to characterize the safety features of an equilibrium CANDU-6 reactor. To improve the reliability of the Monte Carlo calculations, a very large number of neutron histories was considered in this work, and the standard deviation of the k-infinity values is only 0.5-1 pcm
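
    Coefficients like the FTC discussed above are finite differences of reactivity between two Monte Carlo runs at different fuel temperatures, which is why pcm-level standard deviations matter: the statistical error propagates directly into the coefficient. All k-eff and temperature values below are invented for illustration.

```python
import math

k_cold, sig_cold = 1.03250, 1.0e-5   # k-eff at nominal fuel temperature (invented)
k_hot,  sig_hot  = 1.03125, 1.0e-5   # k-eff at elevated fuel temperature (invented)
t_cold, t_hot = 687.0, 987.0         # fuel temperatures in K (invented)

# reactivity change in pcm: rho_hot - rho_cold = (k_hot - k_cold) / (k_hot * k_cold)
d_rho = (k_hot - k_cold) / (k_hot * k_cold) * 1e5
ftc = d_rho / (t_hot - t_cold)
# statistical uncertainty of the coefficient from the two independent runs
sig_ftc = math.hypot(sig_cold / k_cold**2, sig_hot / k_hot**2) * 1e5 / (t_hot - t_cold)
print(f"FTC = {ftc:.3f} +/- {sig_ftc:.3f} pcm/K")
```

    With 1 pcm standard deviations on each k-eff, the coefficient's uncertainty stays below 0.01 pcm/K here, illustrating why the very tight statistics quoted in the record are worth the computational cost.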

  16. Monte Carlo based electron treatment planning and cutout output factor calculations

    Science.gov (United States)

    Mitrou, Ellis

    Electron radiotherapy (RT) offers a number of advantages over photons. The high surface dose, combined with a rapid dose fall-off beyond the target volume, presents a net increase in tumor control probability and decreases normal tissue complications for superficial tumors. Electron treatments are normally delivered clinically without previously calculated dose distributions due to the complexity of the electron transport involved and the greater error in planning accuracy. This research uses Monte Carlo (MC) methods to model clinical electron beams in order to accurately calculate electron beam dose distributions in patients as well as calculate cutout output factors, reducing the need for a clinical measurement. The present work is incorporated into a research MC calculation system: the McGill Monte Carlo Treatment Planning (MMCTP) system. Measurements of PDDs, profiles and output factors, in addition to 2D GAFCHROMIC™ EBT2 film measurements in heterogeneous phantoms, were obtained to commission the electron beam model. The use of MC for electron TP will provide more accurate treatments and yield greater knowledge of the electron dose distribution within the patient. The calculation of output factors could yield a clinical time saving of up to 1 hour per patient.

  17. Evaluation of the interindividual human variation in bioactivation of methyleugenol using physiologically based kinetic modeling and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Al-Subeihi, Ala' A.A., E-mail: subeihi@yahoo.com [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); BEN-HAYYAN-Aqaba International Laboratories, Aqaba Special Economic Zone Authority (ASEZA), P. O. Box 2565, Aqaba 77110 (Jordan); Alhusainy, Wasma; Kiwamoto, Reiko; Spenkelink, Bert [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Bladeren, Peter J. van [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Nestec S.A., Avenue Nestlé 55, 1800 Vevey (Switzerland); Rietjens, Ivonne M.C.M.; Punt, Ans [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands)

    2015-03-01

    The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions or by combining kinetic constants obtained for specific isoenzymes with literature reported human variation in the activity of these enzymes. The results of the study indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions on interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment. - Highlights: • Interindividual human differences in methyleugenol bioactivation were simulated. • This was done using in vitro incubations, PBK modeling
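
    The CSAF derivation described above reduces to percentile ratios of a simulated population distribution. In the sketch below, a lognormal spread stands in for the PBK-predicted 1′-sulfooxymethyleugenol formation; its width is invented, so the printed values will not reproduce the paper's 3.2 and 6.4.

```python
import random

random.seed(7)
# invented lognormal spread standing in for PBK-predicted metabolite formation
formation = sorted(random.lognormvariate(0.0, 0.7) for _ in range(100_000))

def percentile(sorted_xs, p):
    """Simple nearest-rank percentile on a pre-sorted list."""
    return sorted_xs[int(p / 100.0 * (len(sorted_xs) - 1))]

p50 = percentile(formation, 50)
csaf_90 = percentile(formation, 90) / p50  # CSAF covering 90% of the population
csaf_99 = percentile(formation, 99) / p50  # CSAF covering 99% of the population
print(f"CSAF(90th) = {csaf_90:.2f}, CSAF(99th) = {csaf_99:.2f}")
```

    Dividing an upper percentile by the median is exactly how the paper turns Monte Carlo population spread into a chemical-specific adjustment factor; only the underlying distribution differs here.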

  18. A Newton-based Jacobian-free approach for neutronic-Monte Carlo/thermal-hydraulic static coupled analysis

    International Nuclear Information System (INIS)

    Mylonakis, Antonios G.; Varvayanni, M.; Catsaros, N.

    2017-01-01

    Highlights: •A Newton-based Jacobian-free Monte Carlo/thermal-hydraulic coupling approach is introduced. •OpenMC is coupled with COBRA-EN with a Newton-based approach. •The introduced coupling approach is tested in numerical experiments. •The performance of the new approach is compared with the traditional “serial” coupling approach. -- Abstract: In the field of nuclear reactor analysis, multi-physics calculations that account for the bonded nature of the neutronic and thermal-hydraulic phenomena are of major importance for both reactor safety and design. So far in the context of Monte-Carlo neutronic analysis a kind of “serial” algorithm has been mainly used for coupling with thermal-hydraulics. The main motivation of this work is the interest for an algorithm that could maintain the distinct treatment of the involved fields within a tight coupling context that could be translated into higher convergence rates and more stable behaviour. This work investigates the possibility of replacing the usually used “serial” iteration with an approximate Newton algorithm. The selected algorithm, called Approximate Block Newton, is actually a version of the Jacobian-free Newton Krylov method suitably modified for coupling mono-disciplinary solvers. Within this Newton scheme the linearised system is solved with a Krylov solver in order to avoid the creation of the Jacobian matrix. A coupling algorithm between Monte-Carlo neutronics and thermal-hydraulics based on the above-mentioned methodology is developed and its performance is analysed. More specifically, OpenMC, a Monte-Carlo neutronics code and COBRA-EN, a thermal-hydraulics code for sub-channel and core analysis, are merged in a coupling scheme using the Approximate Block Newton method aiming to examine the performance of this scheme and compare with that of the “traditional” serial iterative scheme. First results show a clear improvement of the convergence especially in problems where significant
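The Newton iteration described above can be sketched on a toy two-field problem. Everything below is illustrative: a scalar power/temperature feedback model stands in for OpenMC and COBRA-EN, the coefficients are invented, and because the system is only 2×2 the Jacobian is assembled from the same finite-difference Jacobian-vector products that a Krylov solver would apply matrix-free in true Jacobian-free Newton-Krylov.

```python
def g(x):
    # One "serial" cross-solver pass (hypothetical feedback model):
    # neutronics with a Doppler-like power feedback, then a heat-up law.
    P, T = x
    P_new = 100.0 / (1.0 + 0.002 * (T - 300.0))
    T_new = 300.0 + 0.5 * P
    return [P_new, T_new]

def f(x):
    # Newton residual F(x) = x - G(x); a root is a converged coupled solution
    return [xi - gi for xi, gi in zip(x, g(x))]

def jv(x, v, eps=1e-7):
    # Jacobian-vector product by finite differences -- no Jacobian is formed
    fx = f(x)
    xp = [xi + eps * vi for xi, vi in zip(x, v)]
    return [(fpi - fi) / eps for fpi, fi in zip(f(xp), fx)]

def newton_coupled(x0, tol=1e-10, max_iter=50):
    x = list(x0)
    for _ in range(max_iter):
        fx = f(x)
        if max(abs(c) for c in fx) < tol:
            break
        # For this 2x2 toy we assemble J from two Jv products and solve by
        # Cramer's rule; a Krylov solver (as in JFNK) would call jv() directly
        # without ever assembling J.
        j0 = jv(x, [1.0, 0.0])
        j1 = jv(x, [0.0, 1.0])
        det = j0[0] * j1[1] - j1[0] * j0[1]
        dx0 = (-fx[0] * j1[1] + fx[1] * j1[0]) / det
        dx1 = (-fx[1] * j0[0] + fx[0] * j0[1]) / det
        x = [x[0] + dx0, x[1] + dx1]
    return x

P, T = newton_coupled([100.0, 350.0])
print(round(P, 3), round(T, 3))
```

In the real coupling each evaluation of g is one Monte Carlo neutronics run plus one thermal-hydraulic run, which is why avoiding an explicit Jacobian matters.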

  19. Evaluation of the interindividual human variation in bioactivation of methyleugenol using physiologically based kinetic modeling and Monte Carlo simulations

    International Nuclear Information System (INIS)

    Al-Subeihi, Ala' A.A.; Alhusainy, Wasma; Kiwamoto, Reiko; Spenkelink, Bert; Bladeren, Peter J. van; Rietjens, Ivonne M.C.M.; Punt, Ans

    2015-01-01

    The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions or by combining kinetic constants obtained for specific isoenzymes with literature reported human variation in the activity of these enzymes. The results of the study indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions on interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment. - Highlights: • Interindividual human differences in methyleugenol bioactivation were simulated. • This was done using in vitro incubations, PBK modeling
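The CSAF derivation from a simulated population distribution can be illustrated with a minimal Monte Carlo sketch. The lognormal variability model and the geometric spreads below are invented for illustration only; the paper's PBK model uses measured kinetic constants for the individual enzymatic routes instead.

```python
import random

random.seed(7)

# Hypothetical variability model: per-individual metabolite formation is the
# product of lognormally varying enzyme activities (bioactivation and
# sulfation up, a competing detoxification route down).
def simulate_individual():
    cyp = random.lognormvariate(0.0, 0.5)    # bioactivation
    sult = random.lognormvariate(0.0, 0.4)   # sulfation
    detox = random.lognormvariate(0.0, 0.3)  # competing oxidation
    return cyp * sult / detox

samples = sorted(simulate_individual() for _ in range(100_000))

def pct(p):
    # p-th percentile of the simulated population distribution
    return samples[int(p / 100 * len(samples))]

median = pct(50)
csaf_90 = pct(90) / median   # chemical-specific adjustment factor, 90th pct
csaf_99 = pct(99) / median
print(f"CSAF90 = {csaf_90:.2f}, CSAF99 = {csaf_99:.2f}")
```

Dividing an upper percentile by the median is exactly the CSAF construction described in the abstract; only the underlying distribution differs here.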

  20. TH-C-BRD-05: Reducing Proton Beam Range Uncertainty with Patient-Specific CT HU to RSP Calibrations Based On Single-Detector Proton Radiography

    Energy Technology Data Exchange (ETDEWEB)

    Doolan, P [University College London, London (United Kingdom); Massachusetts General Hospital, Boston, MA (United States)]; Sharp, G; Testa, M; Lu, H-M [Massachusetts General Hospital, Boston, MA (United States)]; Bentefour, E [Ion Beam Applications (IBA), Louvain la Neuve (Belgium)]; Royle, G [University College London, London (United Kingdom)]

    2014-06-15

    Purpose: Beam range uncertainty in proton treatment comes primarily from converting the patient's X-ray CT (xCT) dataset to relative stopping power (RSP). Current practices use a single curve for this conversion, produced by a stoichiometric calibration based on tissue composition data for average, healthy, adult humans, but not for the individual in question. Proton radiographs produce water-equivalent path length (WEPL) maps, dependent on the RSP of tissues within the specific patient. This work investigates the use of such WEPL maps to optimize patient-specific calibration curves for reducing beam range uncertainty. Methods: The optimization procedure works on the principle of minimizing the difference between the known WEPL map, obtained from a proton radiograph, and a digitally-reconstructed WEPL map (DRWM) through an RSP dataset, by altering the calibration curve that is used to convert the xCT into an RSP dataset. DRWMs were produced with Plastimatch, software developed in-house, and an optimization procedure was implemented in Matlab. Tests were made on a range of systems including simulated datasets with computed WEPL maps and phantoms (anthropomorphic and real biological tissue) with WEPL maps measured by single detector proton radiography. Results: For the simulated datasets, the optimizer showed excellent results. It was able to either completely eradicate or significantly reduce the root-mean-square-error (RMSE) in the WEPL for the homogeneous phantoms (to zero for individual materials or from 1.5% to 0.2% for the simultaneous optimization of multiple materials). For the heterogeneous phantom the RMSE was reduced from 1.9% to 0.3%. Conclusion: An optimization procedure has been designed to produce patient-specific calibration curves. Test results on a range of systems with different complexities and sizes have been promising for accurate beam range control in patients. This project was funded equally by the Engineering and Physical Sciences
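The principle of the optimization, minimizing the mismatch between a measured WEPL map and a digitally-reconstructed one, can be sketched with a toy calibration curve. The two-parameter HU-to-RSP model, the ray data, and the grid search below are all illustrative assumptions; the actual work uses Plastimatch-generated DRWMs and a Matlab optimizer over a full calibration curve.

```python
import random

random.seed(1)

# Hypothetical "true" patient-specific HU -> RSP conversion and a generic
# starting form: piecewise-linear over two tissue classes (soft / bone),
# each scaled by one free parameter.
def rsp(hu, soft, bone):
    return soft * (1.0 + hu / 1000.0) if hu < 100 else bone * (1.0 + hu / 1000.0)

# 40 simulated proton-radiograph rays, 50 voxels each (unit step length)
rays = [[random.randint(-100, 1500) for _ in range(50)] for _ in range(40)]
measured = [sum(rsp(hu, 1.00, 0.98) for hu in ray) for ray in rays]  # "radiograph" WEPL

def rmse(soft, bone):
    # digitally-reconstructed WEPL map (DRWM) vs the measured WEPL map
    errs = [sum(rsp(hu, soft, bone) for hu in ray) - m
            for ray, m in zip(rays, measured)]
    return (sum(e * e for e in errs) / len(errs)) ** 0.5

# crude grid search standing in for the paper's optimizer
best = min(((rmse(s / 100, b / 100), s / 100, b / 100)
            for s in range(90, 111) for b in range(90, 111)), key=lambda t: t[0])
print(best)
```

The optimum recovers the "patient-specific" scale factors (soft = 1.00, bone = 0.98) and drives the WEPL RMSE to zero, mirroring the behaviour reported for the homogeneous simulated datasets.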

  1. A Treatment Planning Comparison of Combined Photon-Proton Beams Versus Proton Beams-Only for the Treatment of Skull Base Tumors

    International Nuclear Information System (INIS)

    Feuvret, Loic; Noel, Georges; Weber, Damien C.; Pommier, Pascal; Ferrand, Regis; De Marzi, Ludovic; Dhermain, Frederic; Alapetite, Claire; Mammar, Hamid; Boisserie, Gilbert; Habrand, Jean-Louis; Mazeron, Jean-Jacques

    2007-01-01

    Purpose: To compare treatment planning between combined photon-proton planning (CP) and proton planning (PP) for skull base tumors, so as to assess the potential limitations of CP for these tumors. Methods and Materials: Plans for 10 patients were computed for both CP and PP. Prescribed dose was 67 cobalt Gray equivalent (CGE) for PP; 45 Gy (photons) and 22 CGE (protons) for CP. Dose-volume histograms (DVHs) were calculated for gross target volume (GTV), clinical target volume (CTV), normal tissues (NT), and organs at risk (OARs) for each plan. Results were analyzed using DVH parameters, inhomogeneity coefficient (IC), and conformity index (CI). Results: Mean doses delivered to the GTVs and CTVs with CP (65.0 and 61.7 CGE) and PP (65.3 and 62.2 CGE) were not significantly different (p > 0.1 and p = 0.72). However, the dose inhomogeneity was drastically increased with CP, with mean significant incremental IC values of 10.5% and 6.8% for the GTV (p = 0.01) and CTV (p = 0.04), respectively. The CI 80% values for the GTV and CTV were significantly higher with PP compared with CP. Compared with CP, the use of protons only led to a significant reduction of NT and OAR irradiation, in the intermediate-to-low dose (≤80% isodose line) range. Conclusions: These results suggest that the use of CP results in levels of target dose conformation similar to those with PP. Use of PP significantly reduced the tumor dose inhomogeneity and the delivered intermediate-to-low dose to NT and OARs, leading us to conclude that this treatment is mainly appropriate for tumors in children

  2. From nanochannel-induced proton conduction enhancement to a nanochannel-based fuel cell.

    Science.gov (United States)

    Liu, Shaorong; Pu, Qiaosheng; Gao, Lin; Korzeniewski, Carol; Matzke, Carolyn

    2005-07-01

    The apparent proton conductivity inside a nanochannel can be enhanced by orders of magnitude due to the electric double layer overlap. A nanochannel filled with an acidic solution is thus a micro super proton conductor, and an array of such nanochannels forms an excellent proton conductive membrane. Taking advantage of this effect, a new class of proton exchange membrane is developed for micro fuel cell applications.

  3. Commissioning of a compact laser-based proton beam line for high intensity bunches around 10 MeV

    Directory of Open Access Journals (Sweden)

    S. Busold

    2014-03-01

    We report on the first results of experiments with a new laser-based proton beam line at the GSI accelerator facility in Darmstadt. It delivers high current bunches at proton energies around 9.6 MeV, containing more than 10^{9} particles in less than 10 ns and with tunable energy spread down to 2.7% (ΔE/E_{0} at FWHM). A target normal sheath acceleration stage serves as a proton source and a pulsed solenoid provides for beam collimation and energy selection. Finally a synchronous radio frequency (rf) field is applied via a rf cavity for energy compression at a synchronous phase of -90 deg. The proton bunch is characterized at the end of the very compact beam line, only 3 m behind the laser matter interaction point, which defines the particle source.

  4. Commissioning of a compact laser-based proton beam line for high intensity bunches around 10 MeV

    Science.gov (United States)

    Busold, S.; Schumacher, D.; Deppert, O.; Brabetz, C.; Kroll, F.; Blažević, A.; Bagnoud, V.; Roth, M.

    2014-03-01

    We report on the first results of experiments with a new laser-based proton beam line at the GSI accelerator facility in Darmstadt. It delivers high current bunches at proton energies around 9.6 MeV, containing more than 10⁹ particles in less than 10 ns and with tunable energy spread down to 2.7% (ΔE/E₀ at FWHM). A target normal sheath acceleration stage serves as a proton source and a pulsed solenoid provides for beam collimation and energy selection. Finally a synchronous radio frequency (rf) field is applied via a rf cavity for energy compression at a synchronous phase of -90 deg. The proton bunch is characterized at the end of the very compact beam line, only 3 m behind the laser matter interaction point, which defines the particle source.

  5. Optimization of mass of plastic scintillator film for flow-cell based tritium monitoring: a Monte Carlo study

    International Nuclear Information System (INIS)

    Roy, Arup Singha; Palani Selvam, T.; Raman, Anand; Raja, V.; Chaudhury, Probal

    2014-01-01

    Over the years, various types of tritium-in-air monitors have been designed and developed based on different principles. Ionization chamber, proportional counter and scintillation detector systems are a few among them. A plastic scintillator based, flow-cell type system was developed for online monitoring of tritium in air. The value of the scintillator mass inside the cell volume which maximizes the response of the detector system should be obtained to achieve maximum efficiency. The present study aims to optimize the mass of the plastic scintillator film for the flow-cell based tritium monitoring instrument so that maximum efficiency is achieved. The Monte Carlo based EGSnrc code system has been used for this purpose

  6. Hydrogen effects on deep level defects in proton implanted Cu(In,Ga)Se₂ based thin films

    Energy Technology Data Exchange (ETDEWEB)

    Lee, D.W.; Seol, M.S.; Kwak, D.W.; Oh, J.S. [Department of Physics, Dongguk University, Seoul 100-715 (Korea, Republic of); Jeong, J.H. [Photo-electronic Hybrids Research Center, Korea Institute of Science and Technology, Seoul 136-791 (Korea, Republic of); Cho, H.Y., E-mail: hycho@dongguk.edu [Department of Physics, Dongguk University, Seoul 100-715 (Korea, Republic of)

    2012-08-01

    Hydrogen effects on deep level defects and a defect generation in proton implanted Cu(In,Ga)Se₂ (CIGS) based thin films for solar cell were investigated. CIGS films with a thickness of 3 μm were grown on a soda-lime glass substrate by a co-evaporation method, and then were implanted with protons. To study deep level defects in the proton implanted CIGS films, deep level transient spectroscopy measurements on the CIGS-based solar cells were carried out; these measurements found 6 traps (including 3 hole traps and 3 electron traps). In the proton implanted CIGS films, the deep level defects, which are attributed to the recombination centers of the CIGS solar cell, were significantly reduced in intensity, while a deep level defect was generated around 0.28 eV above the valence band maximum. Therefore, we suggest that most deep level defects in CIGS films can be controlled by hydrogen effects. - Highlights: • Proton implanted Cu(In,Ga)Se₂ thin film and solar cell are prepared. • Deep level defects of Cu(In,Ga)Se₂ thin film and solar cell are investigated. • Hydrogenation using proton implantation and H₂ annealing reduces deep level defects. • Hydrogenation could enhance electrical properties and efficiency of solar cells.

  7. Analysis of accelerator based neutron spectra for BNCT using proton recoil spectroscopy

    International Nuclear Information System (INIS)

    Wielopolski, L.; Ludewig, H.; Powell, J.R.; Raparia, D.; Alessi, J.G.; Lowenstein, D.I.

    1998-01-01

    Boron Neutron Capture Therapy (BNCT) is a promising binary treatment modality for high-grade primary brain tumors (glioblastoma multiforme, GM) and other cancers. BNCT employs a boron-10 containing compound that preferentially accumulates in the cancer cells in the brain. Upon neutron capture by ¹⁰B, energetic alpha particles and tritons released at the absorption site kill the cancer cell. To gain penetration depth in the brain, Fairchild proposed the use of energetic epithermal neutrons at about 10 keV. Phase I/II clinical trials of BNCT for GM are underway at the Brookhaven Medical Research Reactor (BMRR) and at the MIT Reactor, using these nuclear reactors as the source for epithermal neutrons. In light of the limitations of new reactor installations, e.g. cost, safety and licensing, and the limited capability for modulating a reactor-based neutron beam energy spectrum, alternative neutron sources are being contemplated for wider implementation of this modality in a hospital environment. For example, accelerator based neutron sources offer the possibility of tailoring the neutron beams, in terms of improved depth-dose distributions, to the individual and offer, with relative ease, the capability of modifying the neutron beam energy and port size. In previous work new concepts for compact accelerator/target configuration were published. In this work, using the Van de Graaff accelerator the authors have explored different materials for filtering and reflecting neutron beams produced by irradiating a thick Li target with 1.8 to 2.5 MeV proton beams. However, since the yield and the maximum neutron energy emerging from the Li-7(p,n)Be-7 reaction increase with the proton beam energy, there is a need to optimize the proton energy against filter and shielding requirements to obtain the desired epithermal neutron beam.
The MCNP-4A computer code was used for the initial design studies that were verified with benchmark experiments

  8. Heavy Ion SEU Cross Section Calculation Based on Proton Experimental Data, and Vice Versa

    CERN Document Server

    Wrobel, F; Pouget, V; Dilillo, L; Ecoffet, R; Lorfèvre, E; Bezerra, F; Brugger, M; Saigné, F

    2014-01-01

    The aim of this work is to provide a method to calculate single event upset (SEU) cross sections by using experimental data. Valuable tools such as PROFIT and SIMPA already focus on the calculation of the proton cross section by using heavy-ion cross-section experiments. However, there is no available tool that calculates heavy ion cross sections based on measured proton cross sections with no knowledge of the technology. We based our approach on the diffusion-collection model with the aim of analyzing the characteristics of transient currents that trigger SEUs. We show that experimental cross sections could be used to characterize the pulses that trigger an SEU. Experimental results further allow an empirical rule to be defined that identifies the transient currents responsible for an SEU. Then, the SEU cross section can be calculated for any kind of particle and any energy with no need to know the Spice model of the cell. We applied our method to some technologies (250 nm, 90 nm and 65 nm bulk SRAMs) and we sho...

  9. Proton-beam window design for a transmutation facility operating with a liquid lead target

    Energy Technology Data Exchange (ETDEWEB)

    Jansen, C.; Lypsch, F.; Lizana, P. [Institute for Safety Research and Reactor Technology, Juelich (Germany)] [and others]

    1995-10-01

    The proton beam target of an accelerator-driven transmutation facility can be designed as a vertical liquid lead column. To prevent lead vapor from entering the accelerator vacuum, a proton-beam window has to separate the area above the lead surface from the accelerator tube. Two radiation-cooled design alternatives have been investigated which should withstand a proton beam of 1.6 GeV and 25 mA. Temperature calculations based on energy deposition calculations with the Monte Carlo code HETC, stability analysis and spallation-induced damage calculations have been performed showing the applicability of both designs.

  10. Monte Carlo application based on GEANT4 toolkit to simulate a laser–plasma electron beam line for radiobiological studies

    Energy Technology Data Exchange (ETDEWEB)

    Lamia, D., E-mail: debora.lamia@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Russo, G., E-mail: giorgio.russo@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Casarino, C.; Gagliano, L.; Candiano, G.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Labate, L. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Baffigi, F.; Fulgentini, L.; Giulietti, A.; Koester, P.; Palla, D. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); Gizzi, L.A. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Gilardi, M.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR, Segrate (Italy); University of Milano-Bicocca, Milano (Italy)

    2015-06-21

    We report on the development of a Monte Carlo application, based on the GEANT4 toolkit, for the characterization and optimization of electron beams for clinical applications produced by a laser-driven plasma source. The GEANT4 application is conceived so as to represent in the most general way the physical and geometrical features of a typical laser-driven accelerator. It is designed to provide standard dosimetric figures such as percentage dose depth curves, two-dimensional dose distributions and 3D dose profiles at different positions both inside and outside the interaction chamber. The application was validated by comparing its predictions to experimental measurements carried out on a real laser-driven accelerator. The work is aimed at optimizing the source, by using this novel application, for radiobiological studies and, in perspective, for medical applications. - Highlights: • Development of a Monte Carlo application based on GEANT4 toolkit. • Experimental measurements carried out with a laser-driven acceleration system. • Validation of Geant4 application comparing experimental data with the simulated ones. • Dosimetric characterization of the acceleration system.
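Standard dosimetric figures such as the percentage depth dose curve mentioned above reduce to simple post-processing of a scored dose grid. The depth-dose samples below are invented for illustration (they are not GEANT4 output), and R90 is used as an example range metric.

```python
# Hypothetical depth-dose samples (dose per 5 mm depth bin), shaped roughly
# like an electron-beam curve; values are illustrative only.
dose = [0.82, 0.90, 0.97, 1.00, 0.95, 0.80, 0.55, 0.30, 0.12, 0.04]
depths_mm = [5 * i for i in range(len(dose))]

d_max = max(dose)
pdd = [100.0 * d / d_max for d in dose]           # percentage depth dose curve

# therapeutic range R90: last depth where PDD >= 90%, with linear
# interpolation into the next (falling) bin
i = max(k for k, p in enumerate(pdd) if p >= 90.0)
r90 = depths_mm[i] + 5 * (pdd[i] - 90.0) / (pdd[i] - pdd[i + 1])
print(f"dmax at {depths_mm[dose.index(d_max)]} mm, R90 = {r90:.1f} mm")
```

The same normalization applies unchanged to a scored 3D grid, done slice by slice along the beam axis.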

  11. Proton conducting polymeric materials for hydrogen based electrochemical energy conversion technologies

    DEFF Research Database (Denmark)

    Aili, David

    on the development and characterization of polymer based proton conducting membranes for operation at temperatures above 100 °C. The most frequently recurring experimental methods and techniques are described in Chapter 2. For PEM steam and liquid water electrolysis at temperatures up to 130 °C (Chapter 3 and 4...... and water electrolyzers. This thesis gives an overview of the principles and the current state-of-the-art technology of the hydrogen based electrochemical energy conversion technologies, with special emphasis on the PEM based water electrolyzers and fuel cells (Chapter 1). The fundamental thermodynamics...... of the recast Nafion® membranes at elevated temperature could be slightly improved by annealing the membrane in order to increase its degree of crystallinity. Short side chain (SSC) PFSA membranes such as Aquivion™ (Solvey Solexis), on the other hand, are generally characterized by a considerably higher degree...

  12. Using Monte Carlo/Gaussian Based Small Area Estimates to Predict Where Medicaid Patients Reside.

    Science.gov (United States)

    Behrens, Jess J; Wen, Xuejin; Goel, Satyender; Zhou, Jing; Fu, Lina; Kho, Abel N

    2016-01-01

    Electronic Health Records (EHR) are rapidly becoming accepted as tools for planning and population health.1,2 With the national dialogue around Medicaid expansion,12 the role of EHR data has become even more important. For their potential to be fully realized and contribute to these discussions, techniques for creating accurate small area estimates are vital. As such, we examined the efficacy of developing small area estimates for Medicaid patients in two locations, Albuquerque and Chicago, by using a Monte Carlo/Gaussian technique that has worked in accurately locating registered voters in North Carolina.11 The Albuquerque data, which includes patient address, will first be used to assess the accuracy of the methodology. Subsequently, it will be combined with the EHR data from Chicago to develop a regression that predicts Medicaid patients by US Block Group. We seek to create a tool that is effective in translating EHR data's potential for population health studies.
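The Monte Carlo/Gaussian idea, repeatedly assigning patients to small areas with distance-weighted Gaussian probabilities and averaging over realizations, can be sketched as follows. The one-dimensional geometry, kernel width, and counts are illustrative assumptions, not values from the study.

```python
import math
import random

random.seed(3)

# Hypothetical setup: a clinic at x = 0 serves patients whose addresses are
# unknown; block-group centroids lie along a line. Each simulated patient is
# assigned to a block group with probability proportional to a Gaussian
# kernel of centroid distance; repeated realizations give a small-area
# estimate with an uncertainty spread.
centroids = [-4.0, -1.0, 0.5, 2.0, 6.0]          # centroid distances (km)
sigma = 2.5                                       # kernel width (km)

weights = [math.exp(-c * c / (2 * sigma * sigma)) for c in centroids]
total = sum(weights)
probs = [w / total for w in weights]

def one_realization(n_patients=1000):
    picks = random.choices(range(len(centroids)), weights=probs, k=n_patients)
    counts = [0] * len(centroids)
    for p in picks:
        counts[p] += 1
    return counts

runs = [one_realization() for _ in range(200)]
means = [sum(r[i] for r in runs) / len(runs) for i in range(len(centroids))]
print([round(m) for m in means])
```

The block group nearest the clinic receives the largest expected count, and the run-to-run spread of each count is the Monte Carlo estimate of its uncertainty.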

  13. Monte Carlo based geometrical model for efficiency calculation of an n-type HPGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Padilla Cabal, Fatima, E-mail: fpadilla@instec.c [Instituto Superior de Tecnologias y Ciencias Aplicadas, 'Quinta de los Molinos' Ave. Salvador Allende, esq. Luaces, Plaza de la Revolucion, Ciudad de la Habana, CP 10400 (Cuba)]; Lopez-Pino, Neivy; Luis Bernal-Castillo, Jose; Martinez-Palenzuela, Yisel; Aguilar-Mena, Jimmy; D'Alessandro, Katia; Arbelo, Yuniesky; Corrales, Yasser; Diaz, Oscar [Instituto Superior de Tecnologias y Ciencias Aplicadas, 'Quinta de los Molinos' Ave. Salvador Allende, esq. Luaces, Plaza de la Revolucion, Ciudad de la Habana, CP 10400 (Cuba)]

    2010-12-15

    A procedure to optimize the geometrical model of an n-type detector is described. Sixteen lines from seven point sources (²⁴¹Am, ¹³³Ba, ²²Na, ⁶⁰Co, ⁵⁷Co, ¹³⁷Cs and ¹⁵²Eu) placed at three different source-to-detector distances (10, 20 and 30 cm) were used to calibrate a low-background gamma spectrometer between 26 and 1408 keV. Direct Monte Carlo techniques using the MCNPX 2.6 and GEANT 4 9.2 codes, and a semi-empirical procedure were performed to obtain theoretical efficiency curves. Since discrepancies were found between experimental and calculated data using the manufacturer parameters of the detector, a detailed study of the crystal dimensions and the geometrical configuration was carried out. After the parameters were optimized, the mean relative deviation from the experimental data decreased from 18% to 4%.

  14. The motion of discs and spherical fuel particles in combustion burners based on Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Granada, E.; Patino, D.; Porteiro, J.; Collazo, J.; Miguez, J.L.; Moran, J. [University of Vigo, E.T.S. Ingenieros Industriales, Lagoas-Marcosende s/n, 36200-Vigo (Spain)

    2010-04-15

    The position of pellet fuel particles in a burner largely determines their combustion behaviour. This paper addresses the simulated motion of circles and spheres, equivalent to pellets, and their final position in a packed bed subject to a gravitational field confined inside rigid cylindrical walls. A simplified Monte Carlo statistical technique has been described and applied with the standard Metropolis method for the simulation of movement. This simplification provides an easier understanding of the method when applied to solid fuels in granular form, provided that they are only under gravitational forces. Unlike other authors, we contrasted not one parameter but three: radial, bulk, and local porosity, the last computed via Voronoi tessellation. Our simulations reveal a structural order near the walls, which declines towards the centre of the container, and no pattern was found in local porosity via Voronoi. Results with this simplified method are in agreement with more complex previously published studies. (author)
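The simplified Metropolis scheme for gravity-driven settling can be sketched for 2D discs in a rigid container. All dimensions, the step size, and the inverse temperature below are illustrative assumptions; the paper additionally analyses the resulting radial, bulk, and local porosities.

```python
import math
import random

random.seed(0)

R, WALL = 0.5, 5.0   # disc radius and container half-width

def overlaps(p, others):
    # hard-disc constraint: centres may not come closer than one diameter
    return any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 < (2 * R) ** 2
               for q in others)

def settle(n_discs=30, steps=30000, beta=50.0):
    # start the discs on a loose grid high above the floor
    discs = [[-4.0 + 1.2 * (i % 8), 6.0 + 1.2 * (i // 8)] for i in range(n_discs)]
    for _ in range(steps):
        i = random.randrange(n_discs)
        old = discs[i]
        new = [old[0] + random.uniform(-0.3, 0.3),
               old[1] + random.uniform(-0.3, 0.3)]
        if abs(new[0]) > WALL - R or new[1] < R:
            continue                               # reject: outside container
        if overlaps(new, discs[:i] + discs[i + 1:]):
            continue                               # reject: disc overlap
        d_e = new[1] - old[1]                      # gravitational energy ~ height
        if d_e < 0 or random.random() < math.exp(-beta * d_e):
            discs[i] = new                         # Metropolis acceptance
    return discs

bed = settle()
mean_height = sum(p[1] for p in bed) / len(bed)
print(round(mean_height, 2))
```

With a large beta the walk is essentially greedy settling; near-wall ordering of the packed bed can then be probed by histogramming final positions by distance to the wall.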

  15. PC-based process distribution to solve iterative Monte Carlo simulations in physical dosimetry

    International Nuclear Information System (INIS)

    Leal, A.; Sanchez-Doblado, F.; Perucha, M.; Rincon, M.; Carrasco, E.; Bernal, C.

    2001-01-01

    A distribution model to simulate physical dosimetry measurements with Monte Carlo (MC) techniques has been developed. This approach is suited to simulations involving continuous changes of measurement conditions (and hence of the input parameters), such as a TPR curve or the estimation of the resolution limit of an optimal densitometer in the case of small field profiles. As a comparison, a high resolution scan for narrow beams with no iterative process is presented. The model has been installed on a network of PCs without any resident software. The only requirements for these PCs have been a small, temporary Linux partition on the hard disks and a network connection to our server PC. (orig.)
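The distribution model, one Monte Carlo run per measurement condition farmed out to idle machines, can be sketched with a thread pool standing in for the networked PCs. The toy attenuation "simulation" below is an invented placeholder for the real MC dose engine.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def mc_dose_at_depth(args):
    # one independent MC job per measurement condition (here: one depth of a
    # depth-dose curve); each job carries its own seed, as each PC would
    depth_cm, n_histories, seed = args
    rng = random.Random(seed)
    mu = 0.05                                  # toy attenuation coefficient (1/cm)
    hits = sum(1 for _ in range(n_histories)
               if rng.random() < math.exp(-mu * depth_cm))
    return depth_cm, hits / n_histories        # estimated relative dose

tasks = [(d, 20000, 1000 + d) for d in range(0, 21, 5)]   # one task per depth
with ThreadPoolExecutor(max_workers=4) as ex:             # workers ~ client PCs
    curve = dict(ex.map(mc_dose_at_depth, tasks))
for d in sorted(curve):
    print(f"{d:2d} cm: {curve[d]:.3f}")
```

Because the jobs share nothing but their parameter tuples, the same pattern maps directly onto diskless network clients driven from a single server.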

  16. Experimental validation of a rapid Monte Carlo based micro-CT simulator

    International Nuclear Information System (INIS)

    Colijn, A P; Zbijewski, W; Sasov, A; Beekman, F J

    2004-01-01

    We describe a newly developed, accelerated Monte Carlo simulator of a small animal micro-CT scanner. Transmission measurements using aluminium slabs are employed to estimate the spectrum of the x-ray source. The simulator incorporating this spectrum is validated with micro-CT scans of physical water phantoms of various diameters, some containing stainless steel and Teflon rods. Good agreement is found between simulated and real data: normalized error of simulated projections, as compared to the real ones, is typically smaller than 0.05. Also the reconstructions obtained from simulated and real data are found to be similar. Thereafter, effects of scatter are studied using a voxelized software phantom representing a rat body. It is shown that the scatter fraction can reach tens of per cents in specific areas of the body and therefore scatter can significantly affect quantitative accuracy in small animal CT imaging
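The abstract does not give its exact formula for the normalized projection error, so the sketch below assumes a common L2-normalized definition; the projection profiles are invented toy data.

```python
# Assumed figure of merit: L2 norm of the simulated-minus-measured projection
# difference, normalized by the L2 norm of the measured projection.
def normalized_error(simulated, measured):
    num = sum((s - m) ** 2 for s, m in zip(simulated, measured)) ** 0.5
    den = sum(m * m for m in measured) ** 0.5
    return num / den

measured  = [1.00, 0.80, 0.55, 0.30, 0.55, 0.80, 1.00]   # toy projection profile
simulated = [0.99, 0.82, 0.54, 0.31, 0.53, 0.81, 0.98]

err = normalized_error(simulated, measured)
print(round(err, 4))
```

For these toy profiles the error lands around 0.02, i.e. within the "typically smaller than 0.05" level the validation reports.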

  17. The motion of discs and spherical fuel particles in combustion burners based on Monte Carlo simulation

    International Nuclear Information System (INIS)

    Granada, E.; Patino, D.; Porteiro, J.; Collazo, J.; Miguez, J.L.; Moran, J.

    2010-01-01

    The position of pellet fuel particles in a burner largely determines their combustion behaviour. This paper addresses the simulated motion of circles and spheres, equivalent to pellets, and their final position in a packed bed subject to a gravitational field confined inside rigid cylindrical walls. A simplified Monte Carlo statistical technique has been described and applied with the standard Metropolis method for the simulation of movement. This simplification provides an easier understanding of the method when applied to solid fuels in granular form, provided that they are only under gravitational forces. Unlike other authors, we contrasted not one parameter but three: radial, bulk, and local porosity, the last computed via Voronoi tessellation. Our simulations reveal a structural order near the walls, which declines towards the centre of the container, and no pattern was found in local porosity via Voronoi. Results with this simplified method are in agreement with more complex previously published studies.

  18. Hospital-based proton linear accelerator for particle therapy and radioisotope production

    Science.gov (United States)

    Lennox, Arlene J.

    1991-05-01

    Taking advantage of recent advances in linear accelerator technology, it is possible for a hospital to use a 70 MeV proton linac for fast neutron therapy, boron neutron capture therapy, proton therapy for ocular melanomas, and production of radiopharmaceuticals. The linac can also inject protons into a synchrotron for proton therapy of deep-seated tumors. With 180 μA average current, a single linac can support all these applications. This paper presents a conceptual design for a medical proton linac, switchyard, treatment rooms, and isotope production rooms. Special requirements for each application are outlined and a layout for sharing beam among the applications is suggested.

  19. In Silico Generation of Peptides by Replica Exchange Monte Carlo: Docking-Based Optimization of Maltose-Binding-Protein Ligands.

    Directory of Open Access Journals (Sweden)

    Anna Russo

    Short peptides can be designed in silico and synthesized through automated techniques, making them advantageous and versatile protein binders. A number of docking-based algorithms allow for a computational screening of peptides as binders. Here we developed ex-novo peptides targeting the maltose site of the Maltose Binding Protein, the prototypical system for the study of protein ligand recognition. We used a Monte Carlo-based protocol to computationally evolve a set of octapeptides starting from a polyalanine sequence. We screened in silico the candidate peptides and characterized their binding abilities by surface plasmon resonance, fluorescence and electrospray ionization mass spectrometry assays. These experiments showed the designed binders to recognize their target with micromolar affinity. We finally discuss the obtained results in the light of further improvement in the ex-novo optimization of peptide based binders.
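The replica exchange step at the heart of such a protocol can be sketched on a toy continuous landscape standing in for peptide sequence space. The energy function, temperature ladder, and move sizes below are illustrative assumptions, not the paper's scoring function.

```python
import math
import random

random.seed(42)

def energy(x):
    # rugged 1D "score" standing in for a candidate's docking energy
    return (x * x - 1.0) ** 2 + 0.3 * math.sin(12.0 * x)

temps = [0.05, 0.2, 0.8, 2.0]          # temperature ladder, cold to hot
xs = [3.0] * len(temps)                # all replicas start far from the minima
best_x, best_e = xs[0], energy(xs[0])

for sweep in range(4000):
    for i, t in enumerate(temps):      # local Metropolis move in each replica
        prop = xs[i] + random.gauss(0.0, 0.2)
        d_e = energy(prop) - energy(xs[i])
        if d_e < 0 or random.random() < math.exp(-d_e / t):
            xs[i] = prop
    # replica-exchange step: swap a random neighbouring pair with probability
    # min(1, exp((1/T_j - 1/T_{j+1}) * (E_j - E_{j+1})))
    j = random.randrange(len(temps) - 1)
    delta = (1.0 / temps[j] - 1.0 / temps[j + 1]) * (energy(xs[j]) - energy(xs[j + 1]))
    if delta > 0 or random.random() < math.exp(delta):
        xs[j], xs[j + 1] = xs[j + 1], xs[j]
    for x in xs:                       # record the best configuration seen
        e = energy(x)
        if e < best_e:
            best_x, best_e = x, e

print(round(best_x, 3), round(best_e, 3))
```

The hot replicas cross barriers that would trap a single cold walker, and the exchange moves hand their discoveries down the ladder; in the sequence-design setting the Gaussian displacement is replaced by residue mutations.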

  20. Proton beam therapy in the management of skull base chordomas: systematic review of indications, outcomes, and implications for neurosurgeons.

    Science.gov (United States)

    Matloob, Samir A; Nasir, Haleema A; Choi, David

    2016-08-01

    Chordomas are rare tumours affecting the skull base. There is currently no clear consensus on the post-surgical radiation treatments that should be used after maximal tumour resection. However, high-dose proton beam therapy is an accepted option for post-operative radiotherapy to maximise local control, and in the UK, National Health Service approval for funding abroad is granted for specific patient criteria. The aim was to review the indications and efficacy of proton beam therapy in the management of skull base chordomas. The primary outcome measure for review was the efficacy of proton beam therapy in the prevention of local recurrence. A systematic review of English and non-English articles using MEDLINE (1946-present) and EMBASE (1974-present) databases was performed. Additional studies were reviewed when referenced in other studies and not available on these databases. Search terms included chordoma or chordomas. The PRISMA guidelines were followed for reporting our findings as a systematic review. A total of 76 articles met the inclusion and exclusion criteria for this review. Limitations included the lack of documentation of the extent of primary surgery, tumour size, and standardised outcome measures. Level IIb/III evidence suggests that proton beam therapy given postoperatively for skull base chordomas results in better survival with less damage to surrounding tissue. Proton beam therapy is a grade B/C recommended treatment modality for post-operative radiation therapy of skull base chordomas. In comparison to other treatment modalities, long-term local control and survival are probably improved with proton beam therapy. Further studies are required to directly compare proton beam therapy with other treatment modalities in selected patients.

  1. Optic neuropathy following combined proton and photon radiotherapy for base of skull tumors

    International Nuclear Information System (INIS)

    Kim, June; Munzenrider, John; Maas, Alicea; Finkelstein, Dianne; Liebsch, Norbert; Hug, Eugen; Suit, Herman; Smith, Al; Goitein, Michael

    1997-01-01

    Purpose/Objective: To evaluate the risk of radiation injury to the optic pathway following high dose radiation therapy (RT) for base of skull tumors with regard to the following variables: diabetes, hypertension, number of surgical procedures, use of patch, patch distance, radiation dose, and volume of optic structures receiving 50, 55, or 60 Cobalt Gray Equivalent (CGE). Materials and Methods: A total of 359 patients with base of skull chordoma or low grade chondrosarcoma received high dose radiation therapy. Patients were treated with external beam radiotherapy utilizing protons alone or combined protons and photons. Protons of 160 MeV were delivered at the Harvard Cyclotron Laboratory using a modulated Bragg peak. The tumor dose ranged from 61 to 76 CGE. CGE was used because modulated protons have an RBE of 1.1 compared to ⁶⁰Co. Among the 359 patients, 85 were excluded from evaluation based on age, tumor location, and pre-RT treatment criteria. All 274 evaluable patients had a minimum follow-up of 12 months. Medical records were reviewed to determine the actual cause of vision changes. A total of 12 patients with grade II, III, and IV radiation-induced optic neuropathy were identified. Twenty-four patients without complications who closely matched the aforementioned 12 cases with optic neuropathy were selected from the 274 patients as a control group. Dose volume histograms of the 12 cases and 24 controls were reviewed to determine minimum, median, and maximum dose to the optic apparatus as well as dose volume at 50, 55, and 60 CGE. Other information regarding the remaining potential risk factors, such as diabetes, hypertension, number of surgical procedures, use of patch, and patch distance, was also obtained. Results: A total of 12 patients (4.4%) developed radiation-induced optic neuropathy: 1 grade II, 9 grade III, and 2 grade IV. Specific sites of involvement were the left optic nerve in 9, right optic nerve in 5, and chiasm in 4 cases.
The duration to the onset
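
    The CGE unit used throughout this record is just the physical proton dose scaled by the quoted RBE of 1.1 relative to ⁶⁰Co photons; a minimal sketch of the conversion:

    ```python
    def cobalt_gray_equivalent(proton_dose_gy, rbe=1.1):
        """Convert a physical proton dose (Gy) to Cobalt Gray Equivalent (CGE)
        using the RBE of 1.1 quoted in the abstract."""
        return proton_dose_gy * rbe

    # A 60 Gy physical proton dose corresponds to 66 CGE,
    # so the 61-76 CGE tumor doses above are ~55-69 Gy physical dose.
    print(cobalt_gray_equivalent(60.0))
    ```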

  2. SCINFUL-QMD: Monte Carlo based computer code to calculate response function and detection efficiency of a liquid organic scintillator for neutron energies up to 3 GeV

    International Nuclear Information System (INIS)

    Satoh, Daiki; Sato, Tatsuhiko; Shigyo, Nobuhiro; Ishibashi, Kenji

    2006-11-01

    The Monte Carlo based computer code SCINFUL-QMD has been developed to evaluate the response function and detection efficiency of a liquid organic scintillator for neutrons from 0.1 MeV to 3 GeV. This code is a modified version of SCINFUL, which was developed at Oak Ridge National Laboratory in 1988 to provide a calculated full response anticipated for neutron interactions in a scintillator. The upper limit of the applicable energy was extended from 80 MeV to 3 GeV by introducing quantum molecular dynamics incorporated with the statistical decay model (QMD+SDM) in the high-energy nuclear reaction part. The particles generated in QMD+SDM are neutron, proton, deuteron, triton, ³He nucleus, alpha particle, and charged pion. Secondary reactions by neutrons, protons, and pions inside the scintillator are also taken into account. With the extension of the applicable energy, the database of total cross sections for hydrogen and carbon nuclei was upgraded. This report describes the physical model, computational flow, and usage of the code. (author)
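
    The history-by-history logic of such a response-function code can be caricatured in a few lines. Everything here is an illustrative assumption (the interaction probability, the flat light-output spectrum, the threshold), not SCINFUL-QMD physics; only the overall structure (sample each incident neutron, tally the fraction detected) matches the description above:

    ```python
    import random

    def toy_detection_efficiency(n_histories=100_000, interaction_prob=0.3,
                                 threshold=0.2, rng=random.Random(1)):
        """Toy Monte Carlo estimate of detection efficiency: for each incident
        neutron history, decide whether it interacts in the scintillator and,
        if so, sample a light output; efficiency is the fraction of histories
        whose light output exceeds the detection threshold."""
        detected = 0
        for _ in range(n_histories):
            if rng.random() < interaction_prob:   # interaction in the cell?
                light_output = rng.random()       # uniform stand-in spectrum
                if light_output > threshold:      # above pulse-height threshold?
                    detected += 1
        return detected / n_histories

    # With these toy numbers the expected efficiency is 0.3 * 0.8 = 0.24.
    print(toy_detection_efficiency())
    ```

    A real code replaces the two `rng.random()` draws with sampled reaction channels, charged-particle transport, and a measured light-output function per particle species.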

  3. Poly(dA-dT).poly(dA-dT) two-pathway proton exchange mechanism. Effect of general and specific base catalysis on deuteration rates

    International Nuclear Information System (INIS)

    Hartmann, B.; Leng, M.; Ramstein, J.

    1986-01-01

    The deuteration rates of the poly(dA-dT).poly(dA-dT) amino and imino protons have been measured with stopped-flow spectrophotometry as a function of general and specific base catalyst concentration. Two proton exchange classes are found with time constants differing by a factor of 10 (4 and 0.4 s⁻¹). The slower class represents the exchange of the adenine amino protons, whereas the proton of the faster class has been assigned to the thymine imino proton. The exchange rates of these two classes of protons are independent of general and specific base catalyst concentration. This very characteristic behavior demonstrates that in our experimental conditions the exchange rates of the imino and amino protons in poly(dA-dT).poly(dA-dT) are limited by two different conformational fluctuations. We present a three-state exchange mechanism accounting for our experimental results.
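
    The two exchange classes above behave as independent first-order processes, so the fraction of undeuterated protons decays as a sum of two exponentials with the reported rate constants. A minimal sketch (the 50/50 amplitude split between classes is an assumption for illustration, not a value from the abstract):

    ```python
    import math

    def remaining_protons(t, k_imino=4.0, k_amino=0.4, f_imino=0.5):
        """Fraction of exchangeable protons not yet deuterated at time t (s),
        modelled as two independent first-order exponential classes with the
        rate constants reported in the abstract (4 and 0.4 s^-1)."""
        return f_imino * math.exp(-k_imino * t) + (1 - f_imino) * math.exp(-k_amino * t)

    # Half-life of each class, t_half = ln(2) / k:
    print(math.log(2) / 4.0)   # fast (imino) class, ~0.17 s
    print(math.log(2) / 0.4)   # slow (amino) class, ~1.7 s
    ```

    The factor-of-10 separation between the two rates is what lets stopped-flow data resolve the classes as distinct kinetic phases.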

  4. Accelerating parameter identification of