Proton Upset Monte Carlo Simulation
O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.
2009-01-01
The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment, based on the results of heavy-ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (~200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled, and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.
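The upset-rate idea in the abstract can be caricatured in a few lines: score each simulated nuclear fragment against a heavy-ion upset threshold and scale by fluence. This is only an illustrative sketch; the function name, the step-function upset model, and all parameter names are assumptions, not PROPSET's actual model.

```python
import numpy as np

def upset_rate(fragment_lets, let_threshold, fluence_per_day, sensitive_area_cm2):
    """Toy on-orbit upset-rate estimate: the fraction of simulated
    nuclear-fragment LET values exceeding the heavy-ion upset threshold,
    scaled by proton fluence and device sensitive area.  The step-function
    upset model here is an illustrative assumption."""
    frac = np.mean(np.asarray(fragment_lets, float) > let_threshold)
    return frac * fluence_per_day * sensitive_area_cm2
```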
GPU-based fast Monte Carlo dose calculation for proton therapy
Jia, Xun; Schümann, Jan; Paganetti, Harald; Jiang, Steve B.
2012-12-01
Accurate radiation dose calculation is essential for successful proton radiotherapy. Monte Carlo (MC) simulation is considered to be the most accurate method. However, the long computation time limits it from routine clinical applications. Recently, graphics processing units (GPUs) have been widely used to accelerate computationally intensive tasks in radiotherapy. We have developed a fast MC dose calculation package, gPMC, for proton dose calculation on a GPU. In gPMC, proton transport is modeled by the class II condensed history simulation scheme with a continuous slowing down approximation. Ionization, elastic and inelastic proton-nucleus interactions are considered. Energy straggling and multiple scattering are modeled. Secondary electrons are not transported and their energies are locally deposited. After an inelastic nuclear interaction event, a variety of products are generated using an empirical model. Among them, charged nuclear fragments are terminated with energy locally deposited. Secondary protons are stored in a stack and transported after finishing transport of the primary protons, while secondary neutral particles are neglected. gPMC is implemented on the GPU under the CUDA platform. We have validated gPMC using the TOPAS/Geant4 MC code as the gold standard. For various cases including homogeneous and inhomogeneous phantoms as well as a patient case, good agreements between gPMC and TOPAS/Geant4 are observed. The gamma passing rate for the 2%/2 mm criterion is over 98.7% in the region with dose greater than 10% maximum dose in all cases, excluding low-density air regions. With gPMC it takes only 6-22 s to simulate 10 million source protons to achieve ~1% relative statistical uncertainty, depending on the phantoms and energy. This is an extremely high efficiency compared to the computational time of tens of CPU hours for TOPAS/Geant4. Our fast GPU-based code can thus facilitate the routine use of MC dose calculation in proton therapy.
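The 2%/2 mm gamma passing rate used above for validation can be sketched in one dimension. This is a minimal global-gamma implementation, assuming a uniform grid and a 10%-of-maximum low-dose cutoff as in the abstract; clinical tools use faster 3D search schemes.

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm, dd_frac=0.02, dta_mm=2.0, low_cut=0.1):
    """1-D global gamma index (2%/2 mm by default).

    ref, evl : reference and evaluated dose on the same uniform grid.
    Points below `low_cut` * max(ref) are excluded from scoring.
    Returns the fraction of scored points with gamma <= 1.
    """
    ref = np.asarray(ref, float)
    evl = np.asarray(evl, float)
    x = np.arange(ref.size) * spacing_mm
    dmax = ref.max()
    gammas = []
    for i in np.flatnonzero(ref >= low_cut * dmax):
        dist2 = ((x - x[i]) / dta_mm) ** 2            # distance-to-agreement term
        dose2 = ((evl - ref[i]) / (dd_frac * dmax)) ** 2  # dose-difference term
        gammas.append(np.sqrt((dist2 + dose2).min()))     # best match over the grid
    return (np.array(gammas) <= 1.0).mean()
```

A distribution shifted by less than the 2 mm distance-to-agreement still passes, which is exactly the tolerance the criterion is meant to encode.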
Dosimetric investigation of proton therapy on CT-based patient data using Monte Carlo simulation
Chongsan, T.; Liamsuwan, T.; Tangboonduangjit, P.
2016-03-01
The aim of radiotherapy is to deliver high radiation dose to the tumor with low radiation dose to healthy tissues. Protons have Bragg peaks that give high radiation dose to the tumor but low exit dose or dose tail. Therefore, proton therapy is promising for treating deep-seated tumors and tumors located close to organs at risk. Moreover, the physical characteristics of protons are suitable for treating cancer in pediatric patients. This work developed a computational platform for calculating proton dose distribution using the Monte Carlo (MC) technique and the patient's anatomical data. The studied case is a pediatric patient with a primary brain tumor. PHITS will be used for MC simulation. Therefore, patient-specific CT-DICOM files were converted to the PHITS input. A MATLAB optimization program was developed to create a beam delivery control file for this study. The optimization program requires the proton beam data. All these data were calculated in this work using analytical formulas, and the calculation accuracy was tested before the beam delivery control file was used for MC simulation. This study will be useful for researchers who aim to investigate proton dose distributions in patients but do not have access to proton therapy machines.
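Converting CT-DICOM data into MC input, as described above, hinges on mapping CT numbers to material properties. A common approach is a piecewise-linear Hounsfield-unit-to-density calibration; the breakpoints below are illustrative placeholders, not clinical calibration values.

```python
import numpy as np

# Hypothetical two-segment HU-to-mass-density calibration curve
# (Schneider-style); real clinical curves have more segments and
# are fitted to scanner-specific phantom measurements.
HU_PTS  = np.array([-1000.0, 0.0, 3000.0])   # CT numbers
RHO_PTS = np.array([0.00121, 1.0, 2.8])      # mass density, g/cm^3

def hu_to_density(hu):
    """Piecewise-linear interpolation of mass density from CT numbers."""
    return np.interp(hu, HU_PTS, RHO_PTS)
```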
A GPU-accelerated and Monte Carlo-based intensity modulated proton therapy optimization system
Energy Technology Data Exchange (ETDEWEB)
Ma, Jiasen, E-mail: ma.jiasen@mayo.edu; Beltran, Chris; Seum Wan Chan Tseung, Hok; Herman, Michael G. [Department of Radiation Oncology, Division of Medical Physics, Mayo Clinic, 200 First Street Southwest, Rochester, Minnesota 55905 (United States)
2014-12-15
Purpose: Conventional spot scanning intensity modulated proton therapy (IMPT) treatment planning systems (TPSs) optimize proton spot weights based on analytical dose calculations. These analytical dose calculations have been shown to have severe limitations in heterogeneous materials. Monte Carlo (MC) methods do not have these limitations; however, MC-based systems have been of limited clinical use due to the large number of beam spots in IMPT and the extremely long calculation time of traditional MC techniques. In this work, the authors present a clinically applicable IMPT TPS that utilizes a very fast MC calculation. Methods: An in-house graphics processing unit (GPU)-based MC dose calculation engine was employed to generate the dose influence map for each proton spot. With the MC generated influence map, a modified least-squares optimization method was used to achieve the desired dose volume histograms (DVHs). The intrinsic CT image resolution was adopted for voxelization in simulation and optimization to preserve spatial resolution. The optimizations were computed on a multi-GPU framework to mitigate the memory limitation issues for the large dose influence maps that resulted from maintaining the intrinsic CT resolution. The effects of tail cutoff and starting condition were studied and minimized in this work. Results: For relatively large and complex three-field head and neck cases, i.e., >100 000 spots with a target volume of ∼1000 cm3 and multiple surrounding critical structures, the optimization together with the initial MC dose influence map calculation was done in a clinically viable time frame (less than 30 min) on a GPU cluster consisting of 24 Nvidia GeForce GTX Titan cards. The in-house MC TPS plans were comparable to commercial TPS plans based on DVH comparisons. Conclusions: An MC-based treatment planning system was developed. The treatment planning can be performed in a clinically viable time frame on a hardware system costing around 45 000.
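The core of the optimization step above is a least-squares fit of spot weights against a dose influence matrix, subject to non-negative weights. Here is a toy projected-gradient stand-in for the "modified least-squares" method; the function and variable names are assumptions, and a real TPS adds DVH and structure constraints.

```python
import numpy as np

def optimize_spot_weights(D, d_presc, iters=500, lr=None):
    """Least-squares spot-weight optimization with non-negativity,
    via projected gradient descent on ||D w - d_presc||^2.

    D        : (voxels x spots) dose influence matrix
    d_presc  : prescribed dose per voxel
    """
    n_vox, n_spots = D.shape
    w = np.ones(n_spots)
    if lr is None:
        lr = 1.0 / np.linalg.norm(D, 2) ** 2   # step <= 1/L ensures convergence
    for _ in range(iters):
        grad = D.T @ (D @ w - d_presc)
        w = np.maximum(w - lr * grad, 0.0)     # project onto w >= 0
    return w
```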
Tseung, H Wan Chan; Beltran, C
2014-01-01
Purpose: Very fast Monte Carlo (MC) simulations of proton transport have been implemented recently on GPUs. However, these usually use simplified models for non-elastic (NE) proton-nucleus interactions. Our primary goal is to build a GPU-based proton transport MC with detailed modeling of elastic and NE collisions. Methods: Using CUDA, we implemented GPU kernels for these tasks: (1) Simulation of spots from our scanning nozzle configurations, (2) Proton propagation through CT geometry, considering nuclear elastic scattering, multiple scattering, and energy loss straggling, (3) Modeling of the intranuclear cascade stage of NE interactions, (4) Nuclear evaporation simulation, and (5) Statistical error estimates on the dose. To validate our MC, we performed: (1) Secondary particle yield calculations in NE collisions, (2) Dose calculations in homogeneous phantoms, (3) Re-calculations of head and neck plans from a commercial treatment planning system (TPS), and compared with Geant4.9.6p2/TOPAS. Results: Yields, en...
Energy Technology Data Exchange (ETDEWEB)
Wuerl, Matthias
2016-08-01
Matthias Wuerl presents two essential steps toward implementing offline PET monitoring of proton dose delivery at a clinical facility, namely setting up an accurate Monte Carlo model of the clinical beamline and experimentally validating positron-emitter production cross sections. In the first part, the field-size dependence of the dose output is described for scanned proton beams. Both the Monte Carlo and an analytical computational beam model were able to accurately predict the target dose, while the latter tended to overestimate the dose in normal tissue. In the second part, the author presents PET measurements of different phantom materials that were activated by the proton beam. The results indicate that for an irradiation with a high number of protons, for the sake of good statistics, dead-time losses of the PET scanner may become important and lead to an underestimation of positron-emitter production yields.
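The dead-time losses mentioned above are typically corrected with a counting-loss model. A minimal sketch, assuming the standard non-paralyzable model (the abstract does not say which model applies to the scanner in question):

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Recover the true event rate n from the measured rate m of a
    non-paralyzable detector with dead time tau, using
        m = n / (1 + n*tau)   =>   n = m / (1 - m*tau).
    Illustrative model only."""
    m_tau = measured_rate * dead_time
    if m_tau >= 1.0:
        raise ValueError("measured rate saturates the detector model")
    return measured_rate / (1.0 - m_tau)
```

At high true rates the correction grows quickly, which is why high-proton-count irradiations can bias yield estimates if the correction is omitted.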
Grevillot, L.; Bertrand, D.; Dessy, F.; Freud, N.; Sarrut, D.
2012-07-01
Active scanning delivery systems take full advantage of ion beams to best conform to the tumor and to spare surrounding healthy tissues; however, it is also a challenging technique for quality assurance. In this perspective, we upgraded the GATE/GEANT4 Monte Carlo platform in order to recalculate the treatment planning system (TPS) dose distributions for active scanning systems. A method that allows evaluating the TPS dose distributions with the GATE Monte Carlo platform has been developed and applied to the XiO TPS (Elekta), for the IBA proton pencil beam scanning (PBS) system. First, we evaluated the specificities of each dose engine. A dose-conversion scheme that allows one to convert dose to medium into dose to water was implemented within GATE. Specific test cases in homogeneous and heterogeneous configurations allowed for the estimation of the differences between the beam models implemented in XiO and GATE. Finally, dose distributions of a prostate treatment plan were compared. In homogeneous media, a satisfactory agreement was generally obtained between XiO and GATE. The maximum stopping power difference of 3% occurred in a human tissue of 0.9 g cm-3 density and led to a significant range shift. Comparisons in heterogeneous configurations pointed out the limits of the TPS dose calculation accuracy and the superiority of Monte Carlo simulations. The necessity of computing dose to water in our Monte Carlo code for comparisons with TPSs is also presented. Finally, the new capabilities of the platform are applied to a prostate treatment plan and dose differences between both dose engines are analyzed in detail. This work presents a generic method to compare TPS dose distributions with the GATE Monte Carlo platform. It is noteworthy that GATE is also a convenient tool for imaging applications, therefore opening new research possibilities for the PBS modality.
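The dose-to-medium to dose-to-water conversion implemented in GATE above is, in its simplest form, a multiplication by the water-to-medium mass stopping-power ratio. A sketch with illustrative ratios (the numeric values below are rough placeholders, not ICRU table entries):

```python
# Hypothetical water/medium mass-stopping-power ratios for ~150 MeV
# protons; real values come from ICRU tables and depend on energy.
SPR_WATER_OVER_MEDIUM = {"water": 1.000, "soft_tissue": 1.004, "bone": 0.91}

def dose_to_water(dose_to_medium, medium):
    """Bragg-Gray style conversion D_w = D_m * (S/rho)_w / (S/rho)_m."""
    return dose_to_medium * SPR_WATER_OVER_MEDIUM[medium]
```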
Energy Technology Data Exchange (ETDEWEB)
Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)
2014-10-01
Although three general-purpose Monte Carlo (MC) simulation tools, Geant4, FLUKA and PHITS, have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with a simple system, such as a water phantom alone. Since particle beams undergo transport, interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is therefore of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influences of the customizing parameters on the percentage depth dose (PDD) profile and the proton range were investigated by comparison with the results of FLUKA, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed different characteristics from the results obtained with the simple system. This leads to the conclusion that the physical model, particle transport mechanics and different geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.
Tseung, H Wan Chan; Kreofsky, C R; Ma, D; Beltran, C
2016-01-01
Purpose: To demonstrate the feasibility of fast Monte Carlo (MC) based inverse biological planning for the treatment of head and neck tumors in spot-scanning proton therapy. Methods: Recently, a fast and accurate Graphics Processor Unit (GPU)-based MC simulation of proton transport was developed and used as the dose calculation engine in a GPU-accelerated IMPT optimizer. Besides dose, the dose-averaged linear energy transfer (LETd) can be simultaneously scored, which makes biological dose (BD) optimization possible. To convert from LETd to BD, a linear relation was assumed. Using this novel optimizer, inverse biological planning was applied to 4 patients: 2 small and 1 large thyroid tumor targets, and 1 glioma case. To create these plans, constraints were placed to maintain the physical dose (PD) within 1.25 times the prescription while maximizing target BD. For comparison, conventional IMRT and IMPT plans were created for each case in Eclipse (Varian, Inc). The same critical structure PD constraints were use...
International Nuclear Information System (INIS)
Proton therapy facilities are shielded to limit the amount of secondary radiation to which patients, occupational workers and members of the general public are exposed. The most commonly applied shielding design methods for proton therapy facilities comprise semi-empirical and analytical methods to estimate the neutron dose equivalent. This study compares the results of these methods with a detailed simulation of a proton therapy facility using the Monte Carlo technique. A comparison of neutron dose equivalent values predicted by the various methods reveals the superior accuracy of the Monte Carlo predictions in locations where the calculations converge. However, the reliability of the overall shielding design increases if simulation results for which solutions have not converged, e.g. owing to too few particle histories, can be excluded, with deterministic models used at those locations. Criteria to accept or reject Monte Carlo calculations in such complex structures are not well understood. An optimum rejection criterion would allow all converging solutions of the Monte Carlo simulation to be taken into account and reject all solutions with uncertainties larger than the design safety margins. In this study, an optimum rejection criterion of 10% was found. The mean ratio was 26; 62% of all receptor locations showed a ratio between 0.9 and 10, and 92% were between 1 and 100.
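The acceptance/rejection step described above reduces to masking out tally locations whose relative statistical uncertainty exceeds the criterion. A minimal sketch, assuming per-voxel dose estimates and relative uncertainties are already available from the MC run:

```python
import numpy as np

def accept_mc_voxels(dose, rel_unc, max_rel_unc=0.10):
    """Boolean mask of tally locations whose MC estimate is kept:
    nonzero dose with relative statistical uncertainty at or below the
    rejection criterion (10% in the study above).  Rejected locations
    would fall back to a deterministic/semi-empirical shielding model."""
    dose = np.asarray(dose, float)
    rel_unc = np.asarray(rel_unc, float)
    return (dose > 0) & (rel_unc <= max_rel_unc)
```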
A generic algorithm for Monte Carlo simulation of proton transport
Salvat, Francesc
2013-12-01
A mixed (class II) algorithm for Monte Carlo simulation of the transport of protons, and other heavy charged particles, in matter is presented. The emphasis is on the electromagnetic interactions (elastic and inelastic collisions) which are simulated using strategies similar to those employed in the electron-photon code PENELOPE. Elastic collisions are described in terms of numerical differential cross sections (DCSs) in the center-of-mass frame, calculated from the eikonal approximation with the Dirac-Hartree-Fock-Slater atomic potential. The polar scattering angle is sampled by employing an adaptive numerical algorithm which allows control of interpolation errors. The energy transferred to the recoiling target atoms (nuclear stopping) is consistently described by transformation to the laboratory frame. Inelastic collisions are simulated from DCSs based on the plane-wave Born approximation (PWBA), making use of the Sternheimer-Liljequist model of the generalized oscillator strength, with parameters adjusted to reproduce (1) the electronic stopping power read from the input file, and (2) the total cross sections for impact ionization of inner subshells. The latter were calculated from the PWBA including screening and Coulomb corrections. This approach provides quite a realistic description of the energy-loss distribution in single collisions, and of the emission of X-rays induced by proton impact. The simulation algorithm can be readily modified to include nuclear reactions, when the corresponding cross sections and emission probabilities are available, and bremsstrahlung emission.
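Sampling the polar scattering angle from a tabulated numerical DCS, as described above, is typically done by inverting the cumulative distribution. This is a simplified stand-in for PENELOPE-style adaptive interpolation, using a plain trapezoid-rule CDF:

```python
import numpy as np

def sample_polar_angle(theta_grid, dcs, rng, n=1):
    """Sample scattering angles from a tabulated differential cross
    section by inverse-CDF sampling (trapezoid-rule CDF plus linear
    interpolation).  A simplified sketch, not the adaptive algorithm
    of the paper."""
    pdf = dcs * np.sin(theta_grid)                 # solid-angle weight
    cdf = np.concatenate(([0.0], np.cumsum(
        0.5 * (pdf[1:] + pdf[:-1]) * np.diff(theta_grid))))
    cdf /= cdf[-1]
    u = rng.random(n)
    return np.interp(u, cdf, theta_grid)
```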
Proton therapy Monte Carlo SRNA-VOX code
Directory of Open Access Journals (Sweden)
Ilić Radovan D.
2012-01-01
The most powerful feature of the Monte Carlo method is the possibility of simulating all individual particle interactions in three dimensions and performing numerical experiments with a preset error. These facts were the motivation behind the development of a general-purpose Monte Carlo SRNA program for proton transport simulation in technical systems described by standard geometrical forms (plane, sphere, cone, cylinder, cube). Some of the possible applications of the SRNA program are: (a) a general code for proton transport modeling, (b) design of accelerator-driven systems, (c) simulation of proton scattering in degraders of various shapes and compositions, (d) research on proton detectors; and (e) radiation protection at accelerator installations. This wide range of possible applications of the program demands the development of various versions of SRNA-VOX codes for proton transport modeling in voxelized geometries and has, finally, resulted in the ISTAR package for the calculation of deposited-energy distributions in patients on the basis of CT data in radiotherapy. All of the said codes are capable of using 3-D proton sources with an arbitrary energy spectrum in an interval of 100 keV to 250 MeV.
Monte Carlo comparison of x-ray and proton CT for range calculations of proton therapy beams
International Nuclear Information System (INIS)
Proton computed tomography (CT) has been described as a solution for imaging the proton stopping power of patient tissues, therefore reducing the uncertainty of the conversion of x-ray CT images to relative stopping power (RSP) maps and its associated margins. This study aimed to investigate this assertion under the assumption of ideal detection systems. We have developed a Monte Carlo framework to assess proton CT performance for the main steps of proton therapy treatment planning, i.e. proton or x-ray CT imaging, conversion to RSP maps based on the calibration of a tissue phantom, and proton dose simulations. Irradiations of a computational phantom with pencil beams were simulated on various anatomical sites and the proton range was assessed on the reference, the proton CT-based and the x-ray CT-based material maps. Errors in the tissues' RSP reconstructed from proton CT were found to be significantly smaller and less dependent on the tissue distribution. The imaging dose was also found to be much more uniform and conformal to the primary beam. The mean absolute deviation for range calculations based on x-ray CT varies from 0.18 to 2.01 mm depending on the localization, while it is smaller than 0.1 mm for proton CT. Under the assumption of a perfect detection system, proton range predictions based on proton CT are therefore both more accurate and more uniform than those based on x-ray CT.
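The link between an RSP map and proton range prediction is a line integral: the water-equivalent path length (WEPL) accumulated along the beam. A minimal sketch, assuming RSP has already been sampled at equal steps along the ray:

```python
import numpy as np

def wepl(rsp_along_ray, step_mm):
    """Water-equivalent path length of a ray through an RSP map:
    the line integral of relative stopping power, here a simple
    Riemann sum over equally spaced samples."""
    return np.sum(rsp_along_ray) * step_mm

def stopping_depth_mm(rsp_along_ray, step_mm, range_in_water_mm):
    """Depth in the actual medium at which a proton with the given
    water-equivalent range stops: accumulate RSP*step until the
    water-equivalent range is spent.  Returns None if the beam exits."""
    accumulated = 0.0
    for i, rsp in enumerate(rsp_along_ray):
        accumulated += rsp * step_mm
        if accumulated >= range_in_water_mm:
            return (i + 1) * step_mm
    return None
```

A medium of RSP 2 halves the geometric stopping depth relative to water, which is why RSP errors translate directly into range errors.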
Quantum Monte Carlo study of the protonated water dimer
Dagrada, Mario; Saitta, Antonino M; Sorella, Sandro; Mauri, Francesco
2013-01-01
We report an extensive theoretical study of the protonated water dimer (Zundel ion) by means of the highly correlated variational Monte Carlo and lattice regularized Monte Carlo approaches. This system represents the simplest model for proton transfer (PT) and a correct description of its properties is essential in order to understand the PT mechanism in more complex aqueous systems. Our Jastrow correlated AGP wave function ensures an accurate treatment of electron correlations. Exploiting the advantages of contracting the primitive basis set over atomic hybrid orbitals, we are able to limit dramatically the number of variational parameters with a systematic control on the numerical precision, crucial in order to simulate larger systems. We investigate energetics and geometrical properties of the Zundel ion as a function of the oxygen-oxygen distance, taken as reaction coordinate. In both cases, our QMC results are found in excellent agreement with coupled cluster CCSD(T) technique, the quantum chemistry "go...
Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.
2016-01-01
Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the inherent choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA, previously examined for a uniform scanning proton beam, needs to be evaluated; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm3, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm3 voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters differ from those for uniform scanning, suggesting that a single gold-standard set of computational parameters cannot be defined for every proton therapy application, since the impact of the parameter settings depends on the proton irradiation technique.
Energy Technology Data Exchange (ETDEWEB)
Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)
2014-06-01
Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) in the GATE and PHITS codes have not been reported; they are studied here for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physical and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to the calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physical model, particle transport mechanics and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health
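The R90 range quoted above is the distal depth at which the PDD falls to 90% of its maximum. Extracting it from a sampled depth-dose curve is a small interpolation exercise; this sketch assumes the PDD is sampled on a monotonically increasing depth grid and does drop below 90% beyond the peak.

```python
import numpy as np

def range_r90(depth_mm, pdd):
    """Distal depth at which the PDD falls to 90% of its maximum,
    found by linear interpolation on the falloff beyond the Bragg peak."""
    depth_mm = np.asarray(depth_mm, float)
    pdd = np.asarray(pdd, float)
    level = 0.9 * pdd.max()
    i_peak = int(np.argmax(pdd))
    # first index past the peak where dose drops below 90% of max
    j = i_peak + int(np.argmax(pdd[i_peak:] < level))
    d0, d1 = depth_mm[j - 1], depth_mm[j]
    p0, p1 = pdd[j - 1], pdd[j]
    return d0 + (p0 - level) / (p0 - p1) * (d1 - d0)
```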
Clinical implementation of full Monte Carlo dose calculation in proton beam therapy
Energy Technology Data Exchange (ETDEWEB)
Paganetti, Harald; Jiang, Hongyu; Parodi, Katia; Slopsema, Roelf; Engelsman, Martijn [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA 02114 (United States)
2008-09-07
The goal of this work was to facilitate the clinical use of Monte Carlo proton dose calculation to support routine treatment planning and delivery. The Monte Carlo code Geant4 was used to simulate the treatment head setup, including a time-dependent simulation of modulator wheels (for broad beam modulation) and magnetic field settings (for beam scanning). Any patient-field-specific setup can be modeled according to the treatment control system of the facility. The code was benchmarked against phantom measurements. Using a simulation of the ionization chamber reading in the treatment head allows the Monte Carlo dose to be specified in absolute units (Gy per ionization chamber reading). Next, the capability of reading CT data was implemented into the Monte Carlo code to model patient anatomy. To allow time-efficient dose calculation, the standard Geant4 tracking algorithm was modified. Finally, a software link of the Monte Carlo dose engine to the patient database and the commercial planning system was established to allow data exchange, thus completing the implementation of the proton Monte Carlo dose calculation engine ('DoC++'). Monte Carlo re-calculated plans are a valuable tool to revisit decisions in the planning process. Identification of clinically significant differences between Monte Carlo and pencil-beam-based dose calculations may also drive improvements of current pencil-beam methods. As an example, four patients (29 fields in total) with tumors in the head and neck regions were analyzed. Differences between the pencil-beam algorithm and Monte Carlo were identified in particular near the end of range, due both to dose degradation and to overall differences in range prediction caused by bony anatomy in the beam path. Further, the Monte Carlo reports dose to tissue, as compared to the dose to water reported by the planning system. Our implementation is tailored to a specific Monte Carlo code and the treatment planning system XiO (Computerized Medical
Energy Technology Data Exchange (ETDEWEB)
Krueger, Rachel A. [Department of Chemistry, California Institute of Technology, Pasadena, California 91125 (United States); Haibach, Frederick G. [Confluent Science, Wilbraham, Massachusetts 01095 (United States); Fry, Dana L.; Gomez, Maria A., E-mail: magomez@mtholyoke.edu [Department of Chemistry, Mount Holyoke College, South Hadley, Massachusetts 01075 (United States)
2015-04-21
A centrality measure based on the time of first returns rather than the number of steps is developed and applied to finding proton traps and access points to proton highways in the doped perovskite oxides AZr0.875D0.125O3, where A is Ba or Sr and the dopant D is Y or Al. The high centrality region near the dopant is wider in the SrZrO3 systems than the BaZrO3 systems. In the aluminum-doped systems, a region of intermediate centrality (secondary region) is found in a plane away from the dopant. Kinetic Monte Carlo (kMC) trajectories show that this secondary region is an entry to fast conduction planes in the aluminum-doped systems, in contrast to the highest centrality area near the dopant trap. The yttrium-doped systems do not show this secondary region because the fast conduction routes are in the same plane as the dopant and hence already in the high centrality trapped area. This centrality measure complements kMC by highlighting key areas in trajectories. The limiting activation barriers found via kMC are in very good agreement with experiments and related to the barriers to escape dopant traps.
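The kMC trajectories referred to above are built from a simple primitive: pick the next hop with probability proportional to its rate, and advance the clock by an exponential waiting time. A minimal residence-time (Gillespie-style) step, as a sketch rather than the authors' production code:

```python
import numpy as np

def kmc_step(rates, rng):
    """One kinetic Monte Carlo step: choose an event with probability
    proportional to its rate and advance the clock by an exponentially
    distributed waiting time.  Returns (event_index, dt)."""
    rates = np.asarray(rates, float)
    total = rates.sum()
    u = rng.random() * total
    event = int(np.searchsorted(np.cumsum(rates), u, side="right"))
    dt = -np.log(rng.random()) / total        # exponential waiting time
    return event, dt
```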
Monte Carlo simulations of a novel Micromegas 2D array for proton dosimetry
Dolney, D.; Ainsley, C.; Hollebeek, R.; Maughan, R.
2016-02-01
Modern proton therapy affords control of the delivery of radiotherapeutic dose on fine length and temporal scales. The authors have developed a novel detector technology based on Micromesh Gaseous Structure (Micromegas) that is uniquely tailored for applications using therapeutic proton beams. An implementation of a prototype Micromegas detector for Monte Carlo using Geant4 is presented here. Comparison of simulation results with measurements demonstrates agreement in relative dose along the longitudinal proton dose profile at the 1% level. The effect of a radioactive calibration source embedded in the chamber gas is demonstrated by measurements and reproduced by simulations, also at the 1% level. Our Monte Carlo simulations are shown to reproduce the time structure of ionization pulses produced by a double-scattering delivery system.
A Monte Carlo track structure code for low energy protons
Endo, S; Nikjoo, H; Uehara, S; Hoshi, M; Ishikawa, M; Shizuma, K
2002-01-01
A code is described for the simulation of proton (100 eV to 10 MeV) track structure in water vapor. The code simulates, interaction by interaction, the transport of primary ions and secondary electrons in the form of ionizations and excitations. When a low-velocity ion collides with the atoms or molecules of a target, the ion may also capture or lose electrons. The probabilities of these processes are described by cross sections. Although proton track simulation at energies above the Bragg peak (>0.3 MeV) has been achieved to a high degree of precision, simulations at energies near or below the Bragg peak have only been attempted recently because of the lack of relevant cross-section data. As a hydrogen atom has a different ionization cross section from that of a proton, charge exchange processes need to be considered in order to calculate the stopping power for low energy protons. In this paper, we have used state-of-the-art Monte Carlo track simulation techniques, in conjunction with the pub...
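The cross-section bookkeeping described above can be sketched as follows (the cross-section values and gas density are illustrative placeholders, not the tabulated data a real track-structure code would use): the total cross section fixes the mean free path, and the relative cross sections fix which channel (ionization, excitation, or charge exchange) each collision takes.

```python
import math
import random

# Illustrative cross sections (cm^2) for one proton energy; a real
# track-structure code would read these from tabulated data.
sigma = {
    "ionization": 4.0e-17,
    "excitation": 1.0e-17,
    "electron_capture": 0.5e-17,   # charge-exchange channel
}
n_molecules = 3.3e19               # molecules/cm^3, assumed water-vapor density

sigma_total = sum(sigma.values())
mean_free_path = 1.0 / (n_molecules * sigma_total)   # cm between collisions

def sample_step():
    """Sample one free flight: exponential path length, then pick the
    interaction channel with probability proportional to its cross section."""
    path = -mean_free_path * math.log(random.random())
    r = random.random() * sigma_total
    acc = 0.0
    for channel, sig in sigma.items():
        acc += sig
        if r < acc:
            return path, channel
    return path, channel           # guard against float round-off
```

With these placeholder numbers, about 73% of sampled collisions (4.0/5.5) come out as ionizations, simply because that channel carries the largest cross section.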
Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries
Directory of Open Access Journals (Sweden)
Ilić Radovan D.
2002-01-01
This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted to patient computed tomography (CT) data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of proton beam characterization by means of the multi-layer Faraday cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice.
International Nuclear Information System (INIS)
Purpose: To construct a dose monitoring system based on an endorectal balloon coupled to thin scintillating fibers to study the dose delivered to the rectum during prostate cancer proton therapy. Methods: The Geant4 Monte Carlo toolkit version 9.6p02 was used to simulate prostate cancer proton therapy treatments of an endorectal balloon (for immobilization of a 2.9 cm diameter prostate gland) and a set of 34 scintillating fibers symmetrically placed around the balloon and perpendicular to the proton beam direction (for dosimetry measurements). Results: A linear response of the fibers to the dose delivered was observed within <2%, a property that makes them good candidates for real-time dosimetry. Results obtained show that the closest fiber recorded about 1/3 of the dose to the target, with a 1/r^2 decrease in the dose distribution as one goes toward the frontal and distal top fibers. A very low dose was recorded by the bottom fibers (about 45 times lower), which is a clear indication that the overall volume of the rectal wall exposed to a higher dose is relatively minimized. Further analysis indicated a simple scaling relationship between the dose to the prostate and the dose to the top fibers (a linear fit gave a slope of −0.07±0.07 MeV per treatment Gy). Conclusion: Thin (1 mm × 1 mm × 100 cm) long scintillating fibers were found to be ideal for real-time in vivo dose measurement to the rectum during prostate cancer proton therapy. The linear response of the fibers to the dose delivered makes them good candidates as dosimeters. With thorough calibration and the ability to define a good correlation between the dose to the target and the dose to the fibers, such dosimeters can be used for real time dose verification to the target
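The quoted 1/r^2 falloff can be checked with a small power-law fit (the fiber distances and readings below are hypothetical numbers, not the simulated Geant4 data): fitting log(reading) against log(r) should recover an exponent near -2.

```python
import math

def fitted_exponent(rs, readings):
    """Least-squares slope of log(reading) vs log(r): the power-law exponent."""
    xs = [math.log(r) for r in rs]
    ys = [math.log(d) for d in readings]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# Hypothetical fiber distances (cm) and readings following exact inverse-square
rs = [2.0, 3.0, 4.0, 6.0, 8.0]
readings = [25.0 / r ** 2 for r in rs]
exponent = fitted_exponent(rs, readings)   # -2 for a pure 1/r^2 falloff
```

Applied to noisy simulated readings, an exponent consistent with -2 supports the inverse-square interpretation of the fiber data.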
Monte Carlo Calculations Supporting Patient Plan Verification in Proton Therapy.
Lima, Thiago V M; Dosanjh, Manjit; Ferrari, Alfredo; Molineli, Silvia; Ciocca, Mario; Mairani, Andrea
2016-01-01
Patient treatment plan verification consumes a substantial amount of the quality assurance (QA) resources; this is especially true for Intensity-Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed, and several methods have been proposed. In this paper, we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalyzed previously published data (Molinelli et al. (1)), where 9 patient plans were investigated in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modeling (Treatment Planning System (TPS) vs. MC), limitations of the dose delivery system, or detector mispositioning was originally explored, but other factors, such as the geometric description of the detectors, were not ruled out. For the purpose of this work, we compared ionization chamber measurements with different MC simulation results. We also studied physical effects introduced by this new approach, for example, inter-detector interference and the delta-ray thresholds. The simulations accounting for a detailed geometry are typically superior (statistical difference - p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurement and MC calculations in some cases. But in other cases, position uncertainty represents the dominant uncertainty. The inter-chamber disturbance was not detected for therapeutic proton energies, and the results from the current delta threshold
Comparison of linear energy transfer scoring techniques in Monte Carlo simulations of proton beams
International Nuclear Information System (INIS)
Monte Carlo (MC) simulations are commonly used to study linear energy transfer (LET) distributions in therapeutic proton beams. Various techniques have been used to score LET in MC simulations. The goal of this work was to compare LET distributions obtained using different LET scoring techniques and examine the sensitivity of these distributions to changes in commonly adjusted simulation parameters. We used three different techniques to score average proton LET in TOPAS, which is a MC platform based on the Geant4 simulation toolkit. We determined the sensitivity of each scoring technique to variations in the range production thresholds for secondary electrons and protons. We also compared the depth-LET distributions that we acquired using each technique in a simple monoenergetic proton beam and in a more clinically relevant modulated proton therapy beam. Distributions of both fluence-averaged LET (LETΦ) and dose-averaged LET (LETD) were studied. We found that LETD values varied more between different scoring techniques than the LETΦ values did, and different LET scoring techniques showed different sensitivities to changes in simulation parameters. (note)
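The two averages compared above differ only in their weights, which a short sketch makes concrete (the LET values and energy deposits below are invented for illustration): LET_Phi weights every track equally, while LET_D weights each track by the energy it deposits, so a few high-LET end-of-range protons can dominate LET_D while barely moving LET_Phi.

```python
def fluence_averaged_let(lets):
    """LET_Phi: every track crossing the voxel counts equally."""
    return sum(lets) / len(lets)

def dose_averaged_let(lets, edeps):
    """LET_D: each track's LET weighted by the energy it deposits,
    so rare high-LET tracks can dominate the average."""
    return sum(l * e for l, e in zip(lets, edeps)) / sum(edeps)

# Hypothetical voxel: 90 low-LET protons plus 10 high-LET end-of-range ones
lets = [0.5] * 90 + [8.0] * 10          # keV/um
edeps = [l * 1.0 for l in lets]         # deposit proportional to LET (unit path)

let_phi = fluence_averaged_let(lets)    # 1.25 keV/um
let_d = dose_averaged_let(lets, edeps)  # 5.3 keV/um
```

The gap between the two averages in this toy voxel mirrors the abstract's observation that LET_D is the more sensitive of the two to scoring and simulation details.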
Monte Carlo simulation in proton computed tomography: a study of image reconstruction technique
Energy Technology Data Exchange (ETDEWEB)
Inocente, Guilherme Franco; Stenico, Gabriela V.; Hormaza, Joel Mesa [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Inst. de Biociencias. Dept. de Fisica e Biofisica
2012-07-01
Full text: Radiotherapy is one of the most widely used methods of cancer treatment. In this context, proton beam therapy has emerged as an alternative to conventional radiotherapy. It is known that proton therapy offers advantages for the patient when compared with more conventional methods: the dose deposited along the beam path, especially in healthy tissue neighboring the tumor, is smaller, and the treatment accuracy is much better. To plan the treatment, the patient undergoes imaging for visualization and localization of the target volume. The main method for obtaining these images is x-ray computed tomography (XCT). For treatment with proton beams, this imaging technique can generate some uncertainties. The purpose of this project is to study the feasibility of reconstructing images generated from irradiation with proton beams, thereby reducing some inaccuracies, since the imaging would use the same type of radiation as the treatment, and also drastically reducing localization errors, since planning can be done in the same place, and just before, the patient is treated. This study aims to obtain a relationship between the intrinsic properties of the interaction of photons and protons with matter. For this, we use computational simulation based on the Monte Carlo method with the codes SRIM 2008 and MCNPX v.2.5.0 to reconstruct images using the technique of conventional computed tomography. (author)
The Monte Carlo SRNA-VOX code for 3D proton dose distribution in voxelized geometry using CT data
Energy Technology Data Exchange (ETDEWEB)
Ilic, Radovan D [Laboratory of Physics (010), Vinca Institute of Nuclear Sciences, PO Box 522, 11001 Belgrade (Serbia and Montenegro); Spasic-Jokic, Vesna [Laboratory of Physics (010), Vinca Institute of Nuclear Sciences, PO Box 522, 11001 Belgrade (Serbia and Montenegro); Belicev, Petar [Laboratory of Physics (010), Vinca Institute of Nuclear Sciences, PO Box 522, 11001 Belgrade (Serbia and Montenegro); Dragovic, Milos [Center for Nuclear Medicine MEDICA NUCLEARE, Bulevar Despota Stefana 69, 11000 Belgrade (Serbia and Montenegro)
2005-03-07
This paper describes the application of the SRNA Monte Carlo package to proton transport simulations in complex geometry and different material compositions. The SRNA package was developed for 3D dose distribution calculation in proton therapy and dosimetry and is based on the theory of multiple scattering. The decay of proton-induced compound nuclei was simulated by the Russian MSDM model and our own model using ICRU 63 data. The developed package consists of two codes: SRNA-2KG, which simulates proton transport in combinatorial geometry, and SRNA-VOX, which uses voxelized geometry based on CT data, converting Hounsfield numbers to tissue elemental composition. Transition probabilities for both codes are prepared by the SRNADAT code. The simulation of proton beam characterization by a multi-layer Faraday cup, the spatial distribution of positron emitters obtained by the SRNA-2KG code, and an intercomparison of computational codes in radiation dosimetry indicate the immediate applicability of Monte Carlo techniques in clinical practice. In this paper, we briefly present the physical model implemented in the SRNA package, the ISTAR proton dose planning software, as well as the results of numerical experiments with proton beams to obtain 3D dose distributions in eye and breast tumours.
Directory of Open Access Journals (Sweden)
Ilić Radovan D.
2004-01-01
This paper describes the application of the SRNA Monte Carlo package to proton transport simulations in complex geometry and different material compositions. The SRNA package was developed for 3D dose distribution calculation in proton therapy and dosimetry and is based on the theory of multiple scattering. The compound nucleus decay was simulated by our own and the Russian MSDM models using ICRU 63 data. The developed package consists of two codes: SRNA-2KG, which simulates proton transport in combinatorial geometry, and SRNA-VOX, which uses voxelized geometry based on CT data, converting Hounsfield data to tissue elemental composition. Transition probabilities for both codes are prepared by the SRNADAT code. The simulation of proton beam characterization by a multi-layer Faraday cup, the spatial distribution of positron emitters obtained by the SRNA-2KG code, and an intercomparison of computational codes in radiation dosimetry indicate the immediate applicability of Monte Carlo techniques in clinical practice. In this paper, we briefly present the physical model implemented in the SRNA package, the ISTAR proton dose planning software, as well as the results of numerical experiments with proton beams to obtain 3D dose distributions in eye and breast tumors.
Lai, Bo-Lun; Sheu, Rong-Jiun; Lin, Uei-Tyng
2015-05-01
Monte Carlo simulations are generally considered the most accurate method for complex accelerator shielding analysis. Simplified models based on the point-source line-of-sight approximation are often preferable in practice because they are intuitive and easy to use. A set of shielding data, including source terms and attenuation lengths for several common targets (iron, graphite, tissue, and copper) and shielding materials (concrete, iron, and lead), was generated by performing Monte Carlo simulations for 100-300 MeV protons. Possible applications and the proper use of the data set were demonstrated through a practical case study, in which a shielding analysis of a typical proton treatment room was conducted. A thorough and consistent comparison between the predictions of our point-source line-of-sight model and those obtained by Monte Carlo simulations for a 360° dose distribution around the room perimeter showed that the data set can yield fairly accurate or conservative estimates of the transmitted doses, except for those near the maze exit. In addition, this study demonstrated that appropriate coupling between the generated source term and empirical formulae for radiation streaming can be used to predict a reasonable dose distribution along the maze. This case study proved the effectiveness and advantage of applying the data set to quick shielding design and dose evaluation for proton therapy accelerators. PMID:25811254
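A point-source line-of-sight estimate of the kind described combines an angle-dependent source term with exponential attenuation along the slant shield thickness and inverse-square spreading. A minimal sketch, with placeholder numbers rather than the published source terms and attenuation lengths:

```python
import math

def transmitted_dose(h0, r, d, att_length):
    """Point-source line-of-sight estimate:
        H = H0 * exp(-d / lambda) / r^2
    h0: source term for this emission angle (dose * m^2 per proton),
    r: source-to-dose-point distance (m), d: slant shield thickness (m),
    att_length: attenuation length lambda of the shield material (m)."""
    return h0 * math.exp(-d / att_length) / r ** 2

# Placeholder numbers (not published shielding data): adding one more
# attenuation length of shielding cuts the transmitted dose by a factor e.
h_thin = transmitted_dose(1.0e-15, r=5.0, d=0.5, att_length=0.5)
h_thick = transmitted_dose(1.0e-15, r=5.0, d=1.0, att_length=0.5)
```

In practice the source term and attenuation length are both functions of emission angle, target, and shield material, which is exactly what the generated data set tabulates.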
Computed tomography with a low-intensity proton flux: results of a Monte Carlo simulation study
Schulte, Reinhard W.; Klock, Margio C. L.; Bashkirov, Vladimir; Evseev, Ivan G.; de Assis, Joaquim T.; Yevseyeva, Olga; Lopes, Ricardo T.; Li, Tianfang; Williams, David C.; Wroe, Andrew J.; Schelin, Hugo R.
2004-10-01
Conformal proton radiation therapy requires accurate prediction of the Bragg peak position. This problem may be solved by using protons rather than conventional x-rays to determine the relative electron density distribution via proton computed tomography (proton CT). However, proton CT has its own limitations, which need to be carefully studied before this technique can be introduced into routine clinical practice. In this work, we have used analytical relationships as well as the Monte Carlo simulation tool GEANT4 to study the principal resolution limits of proton CT. The GEANT4 simulations were validated by comparing them to predictions of the Bethe Bloch theory and Tschalar's theory of energy loss straggling, and were found to be in good agreement. The relationship between phantom thickness, initial energy, and the relative electron density uncertainty was systematically investigated to estimate the number of protons and dose needed to obtain a given density resolution. The predictions of this study were verified by simulating the performance of a hypothetical proton CT scanner when imaging a cylindrical water phantom with embedded density inhomogeneities. We show that a reasonable density resolution can be achieved with a relatively small number of protons, thus providing a possible dose advantage over x-ray CT.
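The trade-off between proton number and density resolution rests on simple counting statistics: the uncertainty of the mean energy loss averages down as 1/sqrt(N). A toy sketch (the Gaussian straggling width and mean loss are invented placeholders, standing in for the Tschalar straggling model) illustrates the scaling:

```python
import math
import random

def mean_loss_sigma(n_protons, mean=50.0, straggle=2.0, trials=400):
    """Spread (std dev) of the *mean* energy loss over n_protons protons,
    under a toy Gaussian straggling model: expected to shrink as 1/sqrt(N)."""
    means = []
    for _ in range(trials):
        losses = [random.gauss(mean, straggle) for _ in range(n_protons)]
        means.append(sum(losses) / n_protons)
    mu = sum(means) / trials
    return math.sqrt(sum((m - mu) ** 2 for m in means) / trials)

random.seed(7)
sigma_10 = mean_loss_sigma(10)       # ~ 2 / sqrt(10)
sigma_1000 = mean_loss_sigma(1000)   # ~ 2 / sqrt(1000), about 10x smaller
```

This is the statistical backbone of the abstract's claim that a reasonable density resolution is reachable with a relatively small number of protons: the required N grows only quadratically as the resolution target tightens.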
Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study
Energy Technology Data Exchange (ETDEWEB)
Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk, E-mail: suhsanta@catholic.ac.kr [Department of Biomedical Engineering and Research Institute of Biomedical Engineering, College of Medicine, Catholic University of Korea, Seoul 505 (Korea, Republic of)
2014-12-01
Three alpha particles are emitted from the point of reaction between a proton and a boron nucleus. The alpha particles are effective in inducing the death of a tumor cell. After boron has accumulated in the tumor region, protons delivered from outside the body can react with the boron there. The boron increases the proton's maximum dose level, and the tumor cells alone are damaged more severely. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. Here, we show that the effectiveness of proton boron fusion therapy was verified using Monte Carlo simulations. We found that a dramatic increase, by more than half, of the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was positively detected. This therapy method offers advantages such as the application of the Bragg peak to therapy, accurate targeting of the tumor, improved therapeutic effect, and monitoring of the treated region during treatment.
The proton therapy nozzles at Samsung Medical Center: A Monte Carlo simulation study using TOPAS
Chung, Kwangzoo; Kim, Jinsung; Kim, Dae-Hyun; Ahn, Sunghwan; Han, Youngyih
2015-07-01
To expedite the commissioning process of the proton therapy system at Samsung Medical Center (SMC), we have developed a Monte Carlo simulation model of the proton therapy nozzles by using TOol for PArticle Simulation (TOPAS). At the SMC proton therapy center, we have two gantry rooms with different types of nozzles: a multi-purpose nozzle and a dedicated scanning nozzle. Each nozzle has been modeled in detail following the geometry information provided by the manufacturer, Sumitomo Heavy Industries, Ltd. For this purpose, novel features of TOPAS, such as the time feature and the ridge filter class, have been used, and the appropriate physics models for proton nozzle simulation have been defined. Dosimetric properties, such as the percent depth dose curve, the spread-out Bragg peak (SOBP), and the beam spot size, have been simulated and verified against measured beam data. Beyond the Monte Carlo nozzle modeling, we have developed an interface between TOPAS and the treatment planning system (TPS), RayStation. A radiotherapy (RT) plan exported from the TPS is interpreted by the interface and translated into TOPAS input text. The developed Monte Carlo nozzle model can be used to estimate non-beam performance, such as the neutron background, of the nozzles. Furthermore, the nozzle model can be used to study mechanical optimization of the nozzle design.
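An SOBP like the one verified above is a weighted superposition of range-shifted pristine Bragg peaks. The sketch below (the depth-dose shape is a toy stand-in, not TOPAS output or a Bortfeld model, and the ranges are assumed values) picks the weights by "peeling" from the deepest peak so the summed dose hits the prescription at every peak position:

```python
import numpy as np

def toy_pristine_peak(z, r):
    """Toy pristine depth-dose: rises toward a sharp peak at range r and is
    zero beyond it (small tolerance absorbs grid rounding). A stand-in
    shape only, not a physical Bortfeld curve."""
    rising = np.clip(r - z + 0.3, 1e-9, None) ** -0.6
    return np.where(z <= r + 1e-9, rising, 0.0)

z = np.linspace(0.0, 12.0, 1201)               # depth grid, cm (0.01 cm steps)
ranges = np.linspace(6.0, 10.0, 9)             # assumed modulation steps, cm
peak_idx = [int(round(r * 100)) for r in ranges]

# "Peel" from the deepest peak: fix each weight so the summed dose reaches
# the prescription (1.0) exactly at that peak's position. Shallower peaks
# deposit nothing beyond their range, so earlier conditions stay satisfied.
weights = np.zeros(len(ranges))
total = np.zeros_like(z)
for k in reversed(range(len(ranges))):
    self_peak = float(toy_pristine_peak(z[peak_idx[k]], ranges[k]))
    weights[k] = (1.0 - total[peak_idx[k]]) / self_peak
    total += weights[k] * toy_pristine_peak(z, ranges[k])
```

In a real nozzle the same weighting is realized physically, e.g. by modulator wheel step widths or ridge filter geometry, which is why the TOPAS time feature and ridge filter class matter for reproducing the measured SOBP.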
Comparison of some popular Monte Carlo solution for proton transportation within pCT problem
Energy Technology Data Exchange (ETDEWEB)
Evseev, Ivan; Assis, Joaquim T. de; Yevseyeva, Olga [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico], E-mail: evseev@iprj.uerj.br, E-mail: joaquim@iprj.uerj.br, E-mail: yevseyeva@iprj.uerj.br; Lopes, Ricardo T.; Cardoso, Jose J.B.; Silva, Ademir X. da [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE). Lab. de Instrumentacao Nuclear], E-mail: ricardo@lin.ufrj.br, E-mail: jjbrum@oi.com.br, E-mail: ademir@con.ufrj.br; Vinagre Filho, Ubirajara M. [Instituto de Engenharia Nuclear IEN/CNEN-RJ, Rio de Janeiro, RJ (Brazil)], E-mail: bira@ien.gov.br; Hormaza, Joel M. [UNESP, Botucatu, SP (Brazil). Inst. de Biociencias], E-mail: jmesa@ibb.unesp.br; Schelin, Hugo R.; Paschuk, Sergei A.; Setti, Joao A.P.; Milhoretto, Edney [Universidade Tecnologica Federal do Parana, Curitiba, PR (Brazil)], E-mail: schelin@cpgei.cefetpr.br, E-mail: sergei@utfpr.edu.br, E-mail: jsetti@gmail.com, E-mail: edneymilhoretto@yahoo.com
2007-07-01
The proton transport in matter is described by the Boltzmann kinetic equation for the proton flux density. This equation, however, does not have a general analytical solution. Some approximate analytical solutions have been developed under a number of significant simplifications. Alternatively, Monte Carlo simulations are widely used. The current work is devoted to a discussion of the proton energy spectra obtained by simulation with the SRIM2006, GEANT4 and MCNPX packages. The simulations have been performed considering some further applications of the obtained results in computed tomography with proton beams (pCT). Thus the initial and outgoing proton energies (3-300 MeV) as well as the thickness of the irradiated target (water and aluminum phantoms within 90% of the full range for a given proton beam energy) were considered in the interval of values typical for pCT applications. One of the most interesting results of this comparison is that while the MCNPX spectra are in good agreement with the analytical description within the Fokker-Planck approximation, and the GEANT4 simulated spectra are only slightly shifted from them, the SRIM2006 simulations predict a notably higher mean energy loss for protons. (author)
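As a rough cross-check of the mean energy loss these packages disagree on, the Bethe electronic stopping power can be integrated in the continuous-slowing-down approximation. The sketch below is illustrative only (it is none of the compared codes); the constants, including a mean excitation energy of 75 eV for water, are standard reference values:

```python
import math

K = 0.307075          # MeV cm^2/mol, coefficient 4*pi*N_A*r_e^2*m_e*c^2
ME_C2 = 0.510999      # electron rest energy, MeV
MP_C2 = 938.272       # proton rest energy, MeV
I_WATER = 75e-6       # mean excitation energy of water, MeV (ICRU value)
Z_OVER_A = 0.5551     # <Z/A> for water
RHO = 1.0             # density of water, g/cm^3

def bethe_dedx(e_kin):
    """Electronic stopping power -dE/dx of a proton in water, MeV/cm."""
    gamma = 1.0 + e_kin / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    arg = 2.0 * ME_C2 * beta2 * gamma**2 / I_WATER
    return K * Z_OVER_A * RHO / beta2 * (math.log(arg) - beta2)

def mean_energy_after(e0, thickness_cm, steps=10000):
    """CSDA slowing-down through a slab: no straggling, no nuclear losses."""
    de_step = thickness_cm / steps
    e = e0
    for _ in range(steps):
        e -= bethe_dedx(e) * de_step
        if e <= 0.0:
            return 0.0
    return e
```

For a 200 MeV proton this gives a stopping power near 4.5 MeV/cm in water, in line with tabulated values; straggling and nuclear interactions, which the compared MC codes treat differently, are exactly what this first-order model leaves out.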
International Nuclear Information System (INIS)
In this paper the use of the Filtered Back Projection (FBP) Algorithm, in order to reconstruct tomographic images using the high energy (200-250 MeV) proton beams, is investigated. The algorithm has been studied in detail with a Monte Carlo approach and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus, developed by our group, has been fully simulated to exploit the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using the FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, spatial and density resolution.
Energy Technology Data Exchange (ETDEWEB)
Cirrone, G.A.P., E-mail: cirrone@lns.infn.it [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Bucciolini, M. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Bruzzi, M. [Energetic Department, University of Florence, Via S. Marta 3, I-50139 Florence (Italy); Candiano, G. [Laboratorio di Tecnologie Oncologiche HSR, Giglio Contrada, Pietrapollastra-Pisciotto, 90015 Cefalu, Palermo (Italy); Civinini, C. [National Institute for Nuclear Physics INFN, Section of Florence, Via G. Sansone 1, Sesto Fiorentino, I-50019 Florence (Italy); Cuttone, G. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Guarino, P. [Nuclear Engineering Department, University of Palermo, Via... Palermo (Italy); Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Lo Presti, D. [Physics Department, University of Catania, Via S. Sofia 64, I-95123 Catania (Italy); Mazzaglia, S.E. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Pallotta, S. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Randazzo, N. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Sipala, V. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Physics Department, University of Catania, Via S. Sofia 64, I-95123 Catania (Italy); Stancampiano, C. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); and others
2011-12-01
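The filtered back projection used above can be illustrated with a minimal parallel-beam sketch: a discrete Ram-Lak kernel applied to each projection, followed by pixel-driven backprojection. This is a generic textbook implementation, not the code of the simulated pCT apparatus:

```python
import math

def ramlak_kernel(n):
    """Discrete Ram-Lak (ramp) filter kernel; detector sampling step = 1."""
    if n == 0:
        return 0.25
    if n % 2 == 0:
        return 0.0
    return -1.0 / (math.pi * n) ** 2

def filter_projection(proj):
    """Convolve one parallel-beam projection with the ramp kernel."""
    n = len(proj)
    return [sum(proj[k] * ramlak_kernel(i - k) for k in range(n))
            for i in range(n)]

def backproject(sinogram, angles, size):
    """Smear each filtered projection back over a size x size grid."""
    half = size // 2
    img = [[0.0] * size for _ in range(size)]
    for proj, theta in zip(sinogram, angles):
        f = filter_projection(proj)
        c, s = math.cos(theta), math.sin(theta)
        for y in range(size):
            for x in range(size):
                # detector coordinate seen by pixel (x, y) at this angle
                t = (x - half) * c + (y - half) * s + half
                i = int(round(t))
                if 0 <= i < len(f):
                    img[y][x] += f[i] * (math.pi / len(angles))
    return img
```

Real pCT chains add refinements (e.g. estimation of the curved proton path) on top of this skeleton; the sketch only shows why filtering before backprojection restores a sharp image.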
Pia, Maria Grazia (INFN Sezione di Genova); Begalli, Marcia (State University of Rio de Janeiro); Lechner, Anton (Vienna University of Technology); Quintieri, Lina (INFN Laboratori Nazionali di Frascati); Saracco, Paolo (INFN Sezione di Genova)
2014-01-01
The issue of how epistemic uncertainties affect the outcome of Monte Carlo simulation is discussed by means of a concrete use case: the simulation of the longitudinal energy deposition profile of low energy protons. A variety of electromagnetic and hadronic physics models is investigated, and their effects are analyzed. Possible systematic effects are highlighted. The results identify requirements for experimental measurements capable of reducing epistemic uncertainties in the physics models.
Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study
Kim, Jin Sung; Shin, Jung Suk; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih
2015-01-01
Two full rotating gantries with different nozzles (a multipurpose nozzle with MLC and a scanning dedicated nozzle) with a conventional cyclotron system are installed and under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate the neutron dose equivalent per therapeutic dose, H/D, to x-ray imaging equipment under various treatment conditions with Monte Carlo simulation. First, we investigated H/D with the various modifications...
Wu, D.; He, X. T.; Yu, W.; Fritzsche, S.
2016-01-01
A Monte Carlo approach to proton stopping in warm dense matter is implemented into an existing particle-in-cell code. The model is based on multiple binary collisions among electron-electron, electron-ion and ion-ion pairs, takes into account contributions from both free and bound electrons, and allows particle stopping to be calculated in a much more natural manner. In the low-temperature limit, when "all" electrons are bound to the nuclei, the stopping power converges to the predictions of Bethe-Bloch...
The Proton Therapy Nozzles at Samsung Medical Center: A Monte Carlo Simulation Study using TOPAS
Chung, Kwangzoo; Kim, Dae-Hyun; Ahn, Sunghwan; Han, Youngyih
2015-01-01
To expedite the commissioning process of the proton therapy system at Samsung Medical Center (SMC), we have developed a Monte Carlo simulation model of the proton therapy nozzles using TOPAS. At SMC proton therapy center, we have two gantry rooms with different types of nozzles: a multi-purpose nozzle and a dedicated scanning nozzle. Each nozzle has been modeled in detail following the geometry information provided by the manufacturer, Sumitomo Heavy Industries, Ltd. For this purpose, novel features of TOPAS, such as the time feature or the ridge filter class, have been used, and the appropriate physics models for proton nozzle simulation were defined. Dosimetric properties, like percent depth dose curve, spread-out Bragg peak (SOBP), and beam spot size, have been simulated and verified against measured beam data. Beyond the Monte Carlo nozzle modeling, we have developed an interface between TOPAS and the treatment planning system (TPS), RayStation. An exported RT plan from the TPS has been interpreted by th...
Shielding properties of iron at high energy proton accelerators studied by a Monte Carlo code
International Nuclear Information System (INIS)
Shielding properties of a lateral iron shield and of iron and concrete shields at angles between 5° and 30° are studied by means of the Monte Carlo program FLUNEV (DESY-D3 version of the FLUKA code, extended for emission and transport of low-energy neutrons). The following quantities were calculated for a high-energy proton beam hitting an extended iron target: total and partial dose equivalents, attenuation coefficients, neutron spectra, star densities (compared also with the CASIM code) and quality factors. The dependence of the dose equivalent on the energy of the primary protons, the effect of a concrete layer behind a lateral iron shielding, and the total number of neutrons produced in the target were also estimated. (orig.)
Energy Technology Data Exchange (ETDEWEB)
von Wittenau, A; Aufderheide, M B; Henderson, G L
2010-05-07
Given the cost and lead-times involved in high-energy proton radiography, it is prudent to model proposed radiographic experiments to see if the images predicted would return useful information. We recently modified our raytracing transmission radiography modeling code HADES to perform simplified Monte Carlo simulations of the transport of protons in a proton radiography beamline. Beamline objects include the initial diffuser, vacuum magnetic fields, windows, angle-selecting collimators, and objects described as distorted 2D (planar or cylindrical) meshes or as distorted 3D hexahedral meshes. We present an overview of the algorithms used for the modeling and code timings for simulations through typical 2D and 3D meshes. We next calculate expected changes in image blur as scattering materials are placed upstream and downstream of a resolution test object (a 3 mm thick sheet of tantalum, into which 0.4 mm wide slits have been cut), and as the current supplied to the focusing magnets is varied. We compare and contrast the resulting simulations with the results of measurements obtained at the 800 MeV Los Alamos LANSCE Line-C proton radiography facility.
Energy Technology Data Exchange (ETDEWEB)
Schach von Wittenau, Alexis E., E-mail: schachvonwittenau1@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States); Aufderheide, Maurice; Henderson, Gary [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States)
2011-10-01
Given the cost and lead-times involved in high-energy proton radiography, it is prudent to model proposed radiographic experiments to see if the images predicted would return useful information. We recently modified our raytracing transmission radiography modeling code HADES to perform simplified Monte Carlo simulations of the transport of protons in a proton radiography beamline. Beamline objects include the initial diffuser, vacuum magnetic fields, windows, angle-selecting collimators, and objects described as distorted 2D (planar or cylindrical) meshes or as distorted 3D hexahedral meshes. We describe the algorithms used for simulations through typical 2D and 3D meshes. We calculate expected changes in image blur as scattering materials are placed upstream and downstream of a resolution test object (a 3 mm thick sheet of tantalum, into which 0.4 mm wide slits have been cut), and as the current supplied to the focusing magnets is varied. We compare and contrast the resulting simulations with the results of measurements obtained at the 800 MeV Los Alamos LANSCE Line-C proton radiography facility.
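The blur-versus-scatterer trend measured at LANSCE can be anticipated analytically with the Highland parametrization of multiple Coulomb scattering, where the RMS projected angle grows with scatterer thickness (in radiation lengths) and shrinks with beam momentum. The sketch below is a generic estimate, not part of HADES; the drift-distance blur model is a first-order assumption:

```python
import math

MP_C2 = 938.272  # proton rest energy, MeV

def beta_pc(e_kin):
    """Return (beta, p*c in MeV) for a proton of kinetic energy e_kin."""
    gamma = 1.0 + e_kin / MP_C2
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return beta, gamma * beta * MP_C2

def highland_theta0(e_kin, x_over_x0):
    """RMS projected scattering angle (rad), Highland parametrization."""
    beta, pc = beta_pc(e_kin)
    return (13.6 / (beta * pc)) * math.sqrt(x_over_x0) * \
           (1.0 + 0.038 * math.log(x_over_x0))

def image_blur_mm(e_kin, x_over_x0, drift_mm):
    """First-order image blur: scattering angle times drift to the image."""
    return highland_theta0(e_kin, x_over_x0) * drift_mm
```

At 800 MeV (the LANSCE Line-C energy) a scatterer of 0.1 radiation lengths followed by a 1 m drift gives a blur of a few millimetres, which is the scale the slit test object in the abstract is designed to resolve.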
TH-A-19A-10: Fast Four Dimensional Monte Carlo Dose Computations for Proton Therapy of Lung Cancer
Energy Technology Data Exchange (ETDEWEB)
Mirkovic, D; Titt, U; Mohan, R [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Yepes, P [Rice University, Houston, TX (United States)
2014-06-15
Purpose: To develop and validate a fast and accurate four-dimensional (4D) Monte Carlo (MC) dose computation system for proton therapy of lung cancer and other thoracic and abdominal malignancies in which the delivered dose distributions can be affected by respiratory motion of the patient. Methods: A 4D computed tomography (CT) scan for a lung cancer patient treated with protons in our clinic was used to create a time-dependent patient model using our in-house, MCNPX-based Monte Carlo system ("MC²"). The beam line configurations for two passively scattered proton beams used in the actual treatment were extracted from the clinical treatment plan and a set of input files was created automatically using MC². A full MC simulation of the beam line was computed using MCNPX and a set of phase space files for each beam was collected at the distal surface of the range compensator. The particles from these phase space files were transported through the 10 voxelized patient models corresponding to the 10 phases of the breathing cycle in the 4DCT, using MCNPX and an accelerated (fast) MC code called "FDC", developed by us and based on the track-repeating algorithm. The accuracy of the fast algorithm was assessed by comparing the two time-dependent dose distributions. Results: An error of less than 1% in 100% of the voxels in all phases of the breathing cycle was achieved using this method, with a speedup of more than 1000 times. Conclusion: The proposed method, which uses full MC to simulate the beam line and the accelerated MC code FDC for the time-consuming particle transport inside the complex, time-dependent geometry of the patient, shows excellent accuracy together with an extraordinary speed.
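The per-phase bookkeeping of a 4D dose computation can be reduced to a weighted accumulation of phase doses onto a reference phase. The sketch below is schematic: the index maps stand in for the deformable registration a real system would use, and all names are hypothetical:

```python
def accumulate_4d_dose(phase_doses, phase_maps, weights=None):
    """
    Accumulate per-phase dose arrays onto a reference breathing phase.

    phase_doses: list of flat dose arrays, one per breathing phase
    phase_maps:  phase_maps[p][v] is the reference-phase voxel that voxel v
                 of phase p maps to (a stand-in for deformable registration)
    weights:     fraction of the breathing cycle spent in each phase
                 (defaults to equal weights, e.g. 1/10 for a 10-phase 4DCT)
    """
    n_phases = len(phase_doses)
    if weights is None:
        weights = [1.0 / n_phases] * n_phases
    n_vox = len(phase_doses[0])
    total = [0.0] * n_vox
    for dose, vmap, w in zip(phase_doses, phase_maps, weights):
        for v, d in enumerate(dose):
            total[vmap[v]] += w * d   # deposit this phase's weighted dose
    return total
```

The expensive part of the pipeline in the abstract is producing each `phase_doses[p]` by particle transport; the accumulation step itself is this cheap.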
Monte Carlo approach for hadron azimuthal correlations in high energy proton and nuclear collisions
Ayala, Alejandro; Jalilian-Marian, Jamal; Magnin, J.; Tejeda-Yeomans, Maria Elena
2012-01-01
We use a Monte Carlo approach to study hadron azimuthal angular correlations in high energy proton-proton and central nucleus-nucleus collisions at the BNL Relativistic Heavy Ion Collider (RHIC) energies at mid-rapidity. We build a hadron event generator that incorporates the production of $2\to 2$ and $2\to 3$ parton processes and their evolution into hadron states. For nucleus-nucleus collisions we include the effect of parton energy loss in the Quark-Gluon Plasma using a modified fragmentation function approach. In the presence of the medium, for the case when three partons are produced in the hard scattering, we analyze the Monte Carlo sample in parton and hadron momentum bins to reconstruct the angular correlations. We characterize this sample by the number of partons that are able to hadronize by fragmentation within the selected bins. In the nuclear environment the model allows hadronization by fragmentation only for partons with momentum above a threshold $p_T^{\mathrm{thresh}}=2.4$ GeV. We argue that...
Monte Carlo approach for hadron azimuthal correlations in high energy proton and nuclear collisions
Ayala, Alejandro; Dominguez, Isabel; Jalilian-Marian, Jamal; Magnin, J.; Tejeda-Yeomans, Maria Elena
2012-09-01
We use a Monte Carlo approach to study hadron azimuthal angular correlations in high-energy proton-proton and central nucleus-nucleus collisions at the BNL Relativistic Heavy Ion Collider energies at midrapidity. We build a hadron event generator that incorporates the production of 2→2 and 2→3 parton processes and their evolution into hadron states. For nucleus-nucleus collisions we include the effect of parton energy loss in the quark-gluon plasma using a modified fragmentation function approach. In the presence of the medium, for the case when three partons are produced in the hard scattering, we analyze the Monte Carlo sample in parton and hadron momentum bins to reconstruct the angular correlations. We characterize this sample by the number of partons that are able to hadronize by fragmentation within the selected bins. In the nuclear environment the model allows hadronization by fragmentation only for partons with momentum above a threshold pTthresh=2.4 GeV. We argue that one should treat properly the effect of those partons with momentum below the threshold, because their interaction with the medium may lead to showers of low-momentum hadrons along the direction of motion of the original partons as the medium becomes diluted.
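The angular-correlation observable itself can be illustrated with a toy generator: back-to-back pairs smeared in azimuth produce the characteristic away-side peak at Δφ ≈ π. This sketch is not the event generator of the paper; the smearing width and binning are arbitrary illustration choices:

```python
import math
import random

def delta_phi(phi1, phi2):
    """Wrap the azimuthal difference into [0, pi]."""
    d = abs(phi1 - phi2) % (2.0 * math.pi)
    return 2.0 * math.pi - d if d > math.pi else d

def correlation_histogram(n_events, n_bins=16, sigma=0.3, seed=1):
    """
    Toy di-hadron correlation: each event has a trigger hadron at a random
    azimuth and an away-side partner near trigger + pi, Gaussian-smeared
    by sigma (a stand-in for fragmentation and intrinsic kT).
    Returns the Delta-phi histogram over [0, pi].
    """
    rng = random.Random(seed)
    hist = [0] * n_bins
    for _ in range(n_events):
        trig = rng.uniform(0.0, 2.0 * math.pi)
        away = trig + math.pi + rng.gauss(0.0, sigma)
        dphi = delta_phi(trig, away)
        hist[min(int(dphi / math.pi * n_bins), n_bins - 1)] += 1
    return hist
```

In the paper's medium-modified case, energy loss depletes and broadens this away-side peak; the toy only reproduces the vacuum-like baseline shape.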
Koch, Nicholas C; Newhauser, Wayne D
2010-02-01
Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.
International Nuclear Information System (INIS)
Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.
Effect of elemental compositions on Monte Carlo dose calculations in proton therapy of eye tumors
Rasouli, Fatemeh S.; Farhad Masoudi, S.; Keshazare, Shiva; Jette, David
2015-12-01
Recent studies in eye plaque brachytherapy have found considerable differences between the dosimetric results obtained using a water phantom and those using a complete human eye model. Since the eye continues to be simulated as water-equivalent tissue in the proton therapy literature, a similar study investigating such a difference in treating eye tumors by protons is indispensable. The present study inquires into this effect in proton therapy utilizing Monte Carlo simulations. A three-dimensional eye model with elemental compositions is simulated and used to examine the dose deposition to the phantom. The beam is planned to pass through a designed beam line to moderate the protons to the desired energies for ocular treatments. The results are compared with similar irradiation to a water phantom, as well as to a material with uniform density throughout the whole volume. Spread-out Bragg peaks (SOBPs) are created by adding pristine peaks to cover a typical tumor volume. Moreover, the corresponding beam parameters recommended by the ICRU are calculated, and the isodose curves are computed. The results show that the maximum dose deposited in ocular media is approximately 5-7% more than in the water phantom, and about 1-1.5% less than in the homogenized material of density 1.05 g cm⁻³. Furthermore, there is about a 0.2 mm shift in the Bragg peak due to the tissue composition difference between the models. It is found that using the weighted dose profiles optimized in a water phantom for the realistic eye model leads to a small disturbance of the SOBP plateau dose. In contrast to the plaque brachytherapy results for treatment of eye tumors, it is found that the differences between the simplified models presented in this work, especially the phantom containing the homogenized material, are not clinically significant in proton therapy. Taking into account the intrinsic uncertainty of the patient dose calculation for protons, and practical problems corresponding to applying patient
A Monte Carlo pencil beam scanning model for proton treatment plan simulation using GATE/GEANT4
Energy Technology Data Exchange (ETDEWEB)
Grevillot, L; Freud, N; Sarrut, D [Universite de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Universite Lyon 1, Centre Leon Berard, Lyon (France); Bertrand, D; Dessy, F, E-mail: loic.grevillot@creatis.insa-lyon.fr [IBA, B-1348, Louvain-la Neuve (Belgium)
2011-08-21
This work proposes a generic method for modeling scanned ion beam delivery systems, without simulation of the treatment nozzle and based exclusively on beam data library (BDL) measurements required for treatment planning systems (TPS). To this aim, new tools dedicated to treatment plan simulation were implemented in the Gate Monte Carlo platform. The method was applied to a dedicated nozzle from IBA for proton pencil beam scanning delivery. Optical and energy parameters of the system were modeled using a set of proton depth-dose profiles and spot sizes measured at 27 therapeutic energies. For further validation of the beam model, specific 2D and 3D plans were produced and then measured with appropriate dosimetric tools. Dose contributions from secondary particles produced by nuclear interactions were also investigated using field size factor experiments. Pristine Bragg peaks were reproduced with 0.7 mm range and 0.2 mm spot size accuracy. A 32 cm range spread-out Bragg peak with 10 cm modulation was reproduced with 0.8 mm range accuracy and a maximum point-to-point dose difference of less than 2%. A 2D test pattern consisting of a combination of homogeneous and high-gradient dose regions passed a 2%/2 mm gamma index comparison for 97% of the points. In conclusion, the generic modeling method proposed for scanned ion beam delivery systems was applicable to an IBA proton therapy system. The key advantage of the method is that it only requires BDL measurements of the system. The validation tests performed so far demonstrated that the beam model achieves clinical performance, paving the way for further studies toward TPS benchmarking. The method involves new sources that are available in the new Gate release V6.1 and could be further applied to other particle therapy systems delivering protons or other types of ions like carbon.
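The 2%/2 mm gamma-index criterion used for validation above can be sketched in one dimension. This is a brute-force, globally normalized implementation for illustration, not the dosimetric software used in the study:

```python
import math

def gamma_index_1d(ref, evl, spacing_mm, dose_tol=0.02, dist_mm=2.0):
    """
    1D global gamma index. ref/evl are dose samples on the same grid with
    the given spacing (mm). The dose tolerance is a fraction of the
    reference maximum (global normalization); dist_mm is the
    distance-to-agreement criterion.
    """
    dd = dose_tol * max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evl):   # real codes restrict this search window
            dist = (i - j) * spacing_mm
            g2 = (dist / dist_mm) ** 2 + ((de - dr) / dd) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    return gammas

def passing_rate(gammas, threshold=1.0):
    """Fraction of points with gamma <= threshold (1.0 by convention)."""
    return sum(g <= threshold for g in gammas) / len(gammas)
```

A profile shifted by less than the distance criterion still passes, which is exactly why gamma analysis is preferred over point-by-point dose difference in steep-gradient regions.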
Monte Carlo calculations of relativistic solar proton propagation in interplanetary space
Lumme, M.; Torsti, J. J.; Vainikka, E.; Peltonen, J.; Nieminen, M.; Valtonen, E.; Arvelta, H.
1985-01-01
Particle fluxes and pitch angle distributions of relativistic solar protons at 1 AU were determined by Monte Carlo calculations. The analysis covers two hours after the release of the particles from the Sun, and a total of eight sets of 100,000 particle trajectories were simulated. The pitch angle scattering was assumed to be isotropic and the scattering mean free path was varied from 0.1 to 4 AU. As an application, the solar injection time and interplanetary scattering mean free path of the particles that gave rise to the GLE of May 1978 were determined. Assuming an exponential form, the injection decay time was found to be about 11 minutes. The mean free path of pitch angle scattering during the event was about 1 AU.
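The transport model can be caricatured in one dimension: exponential free paths along the field line, isotropic resets of the pitch cosine at each scattering, and a tally when a particle first crosses 1 AU. Everything below (speed, seeds, reflecting inner boundary) is an assumption for illustration, not the authors' code:

```python
import random

C_AU_PER_HR = 7.2  # speed of light, approximately 7.2 AU per hour

def simulate_arrivals(n_particles, beta, mfp_au, t_max_hr, seed=2):
    """
    1D toy model of solar-proton transport along a field line.
    Particles start at s = 0 moving with pitch cosine mu at speed beta*c,
    scatter isotropically after exponential free paths of mean mfp_au,
    and are tallied when they first cross s = 1 AU. Returns arrival
    times (hours); particles not arriving within t_max_hr are dropped.
    """
    rng = random.Random(seed)
    v = beta * C_AU_PER_HR              # AU per hour
    arrivals = []
    for _ in range(n_particles):
        s, t = 0.0, 0.0
        mu = rng.uniform(0.0, 1.0)      # injected moving outward
        while t < t_max_hr:
            step = rng.expovariate(1.0 / mfp_au)   # free path, AU
            if mu > 0.0 and s + mu * step >= 1.0:
                t += (1.0 - s) / (mu * v)          # flight time to 1 AU
                arrivals.append(t)
                break
            s = max(0.0, s + mu * step)            # reflect at the Sun
            t += step / v
            mu = rng.uniform(-1.0, 1.0)            # isotropic scattering
    return arrivals
```

With a 1 AU mean free path the arrival profile is nearly ballistic, while 0.1 AU gives a delayed, diffusive profile: the qualitative effect the authors exploit to fit the injection decay time and mean free path.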
International Nuclear Information System (INIS)
Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to a few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4-based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single-spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2-100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm², 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and worker nodes were created as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.
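The two quoted cluster costs are mutually consistent with a one-master-plus-spot-workers pricing model. The per-node rates below ($0.123/h on-demand, $0.013/h spot) are back-solved from the two quoted totals, not stated in the abstract:

```python
def cluster_hourly_cost(n_nodes, on_demand_rate, spot_rate):
    """Hourly cost: one on-demand master plus (n_nodes - 1) spot workers."""
    return on_demand_rate + (n_nodes - 1) * spot_rate

# Rates inferred from the quoted totals (40 nodes -> $0.63/h,
# 100 nodes -> $1.41/h): spot = (1.41 - 0.63) / 60, master = rest.
SPOT_RATE = 0.013       # USD per worker-hour (inferred, hypothetical)
ON_DEMAND_RATE = 0.123  # USD per master-hour (inferred, hypothetical)
```

Reproducing both quoted figures from one linear model suggests the cost scales essentially linearly with worker count, which is the economic argument for spot instances in this workflow.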
Energy Technology Data Exchange (ETDEWEB)
Krause, Claudius
2012-04-15
High energy proton-proton collisions produce a large number of secondary particles to be measured in a detector. A final state containing top quarks is of particular interest, but top quarks are produced in only a small fraction of the collisions. Hence, criteria must be defined to separate events containing top quarks from the background. From detectors, we record signals, for example hits in the tracker system or deposits in the calorimeters. In order to obtain the momentum of the particles, we apply algorithms to reconstruct tracks in space. More sophisticated algorithms are needed to identify the flavour of quarks, such as b-tagging. Several steps are needed to test these algorithms. Collision products of proton-proton events are generated using Monte Carlo techniques and their passage through the detector is simulated. After that, the algorithms are applied and the signal efficiency and the mistagging rate can be obtained. There are, however, many different approaches and algorithms realized in programs, so the question arises whether the choice of the Monte Carlo generator influences the measured quantities. In this thesis, two commonly used Monte Carlo generators, SHERPA and MadGraph/MadEvent, are compared and the differences in the selection efficiency of semimuonic tt events are estimated. In addition, the distributions of kinematic variables are shown. A special chapter on the matching of matrix elements with parton showers is included. The main algorithms, CKKW for SHERPA and MLM for MadGraph/MadEvent, are introduced.
Improved efficiency in Monte Carlo simulation for passive-scattering proton therapy
International Nuclear Information System (INIS)
The aim of this work was to improve the computational efficiency of Monte Carlo simulations when tracking protons through a proton therapy treatment head. Two proton therapy facilities were considered: the Francis H Burr Proton Therapy Center (FHBPTC) at the Massachusetts General Hospital and the Crocker Lab eye treatment facility used by the University of California at San Francisco (UCSFETF). The computational efficiency was evaluated for phase space files scored at the exit of the treatment head to determine optimal parameters that improve efficiency while maintaining accuracy in the dose calculation. For FHBPTC, particles were split by a factor of 8 upstream of the second scatterer and upstream of the aperture. The radius of the region for Russian roulette was set to 2.5 or 1.5 times the radius of the aperture, and a secondary particle production cut (PC) of 50 mm was applied. For UCSFETF, particles were split by a factor of 16 upstream of a water absorber column and upstream of the aperture. Here, the radius of the region for Russian roulette was set to 4 times the radius of the aperture and a PC of 0.05 mm was applied. In both setups, the cylindrical symmetry of the proton beam was exploited to position the split particles randomly spaced around the beam axis. When simulating a phase space for subsequent water phantom simulations, efficiency gains between a factor of 19.9 ± 0.1 and 52.21 ± 0.04 for the FHBPTC setups and 57.3 ± 0.5 for the UCSFETF setups were obtained. For a phase space used as input for simulations in a patient geometry, the gain was a factor of 78.6 ± 7.5. Lateral dose curves in water were within the accepted clinical tolerance of 2%, with statistical uncertainties of 0.5% for the two facilities. For the patient geometry and by considering the 2% and 2 mm criteria, 98.4% of the voxels showed a gamma index lower than unity. An analysis of the dose distribution resulted in systematic deviations below 0.88% for 20
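The two variance-reduction devices named above, particle splitting and Russian roulette, share one invariant: the expected statistical weight is conserved. A minimal sketch with hypothetical classes (not the study's code):

```python
import random

class Particle:
    def __init__(self, r, weight=1.0):
        self.r = r            # radial distance from the beam axis
        self.weight = weight  # statistical weight carried by the particle

def split(particle, n):
    """Split into n copies; the total statistical weight is conserved."""
    w = particle.weight / n
    return [Particle(particle.r, w) for _ in range(n)]

def russian_roulette(particles, r_cut, survive_p, rng):
    """
    Kill particles outside radius r_cut with probability 1 - survive_p.
    Survivors get their weight divided by survive_p, so the expected
    total weight (and hence any tally mean) is unchanged.
    """
    out = []
    for p in particles:
        if p.r <= r_cut:
            out.append(p)                 # inside the region: always kept
        elif rng.random() < survive_p:
            p.weight /= survive_p         # weight boost keeps the tally unbiased
            out.append(p)
    return out
```

Splitting lowers variance where particles matter (inside the aperture radius), while roulette saves time where they do not; the weight arithmetic above is what keeps both tricks unbiased.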
Wu, D; Yu, W; Fritzsche, S
2016-01-01
A Monte Carlo approach to proton stopping in warm dense matter is implemented into an existing particle-in-cell code. The model is based on multiple binary collisions among electron-electron, electron-ion and ion-ion pairs, takes into account contributions from both free and bound electrons, and allows particle stopping to be calculated in a much more natural manner. In the low-temperature limit, when "all" electrons are bound to the nuclei, the stopping power converges to the predictions of Bethe-Bloch theory, which shows good consistency with data provided by NIST. With rising temperature, more and more bound electrons are ionized, giving rise to an increased stopping power relative to cold matter, which is consistent with a recent experimental measurement [Phys. Rev. Lett. 114, 215002 (2015)]. When the temperature is further increased, with ionization reaching its maximum, a lowered stopping power is observed, due to the suppression of the collision frequency between the projected proton beam and h...
Accelerated GPU based SPECT Monte Carlo simulations
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-01
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphics processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Moreover, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency.
International Nuclear Information System (INIS)
Purpose: Monte-Carlo modeling is an important tool for understanding the behavior of therapeutic proton beams in a heterogeneous medium such as the patient. To gain confidence that a Monte-Carlo model is accurate in complex geometries and media, it must first be compared with measurement in simple situations. This study documents the validation of our Monte-Carlo model. Methods: A model of the MEVION S250 proton therapy system was created in the TOPAS Monte-Carlo environment using machine geometry and field shaping system information provided by the vendor. For each of 24 options, validation of the TOPAS model was performed by comparing the dose scored by TOPAS to the dose measurements obtained during the commissioning of the treatment planning system. The measurements compared consisted of: pristine peak depth-dose profiles, in-air profiles for a standard-sized square field (20cm×20cm or 10cm×10cm depending on the maximum field size for each option) at isocenter and at 20cm upstream and downstream of isocenter, and in-air profiles with a half-beam blocked aperture at isocenter and at 20cm upstream and downstream of isocenter. For all Monte-Carlo simulations, 10⁸ particle histories were run. Results: Range measurements of the Monte-Carlo simulations matched the measured data within 1mm. Distal fall-off of the simulated fields matched within <1mm. Lateral penumbra and field size measurements of the standard-sized square and half-beam blocked fields matched within 1mm at all three planes compared. A small difference was seen in the in-air profiles at doses <0%. The suspected cause of the difference was the aperture shape: the measured data utilized a divergent aperture, whereas the Monte-Carlo calculation used a non-divergent aperture. Conclusion: The validation measurements indicate that we were able to accurately model the MEVION S250 proton therapy system using Monte-Carlo calculations. This may reduce the commissioning time for future users.
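The millimeter- and percent-level comparisons in validation studies like this one are commonly condensed into a gamma index, as in the 2%/2 mm criteria quoted elsewhere in this collection. A minimal 1D sketch with toy profiles (not the validation data; a global absolute dose tolerance is assumed):

```python
def gamma_index(ref, ev, dx, dose_tol, dist_tol):
    """1D global gamma index: ref/ev are dose samples on a common grid
    with spacing dx; dose_tol is an absolute dose tolerance and dist_tol
    the distance-to-agreement tolerance (same units as dx)."""
    out = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_ev in enumerate(ev):
            dd = (d_ev - d_ref) / dose_tol          # dose difference term
            dr = (j - i) * dx / dist_tol            # distance term
            best = min(best, (dd * dd + dr * dr) ** 0.5)
        out.append(best)
    return out

ref = [0.0, 0.2, 0.5, 1.0, 0.5, 0.2, 0.0]  # reference profile
ev = [0.0, 0.0, 0.2, 0.5, 1.0, 0.5, 0.2]   # same profile shifted by 1 mm
g = gamma_index(ref, ev, dx=1.0, dose_tol=0.02, dist_tol=2.0)

# As in several abstracts here, evaluate only points above 10% of max dose
evaluated = [v for v, d in zip(g, ref) if d >= 0.1 * max(ref)]
passing = sum(1 for v in evaluated if v <= 1.0) / len(evaluated)
```

A 1 mm shift passes everywhere under a 2 mm distance-to-agreement tolerance, which is why gamma analysis tolerates small, clinically acceptable range offsets.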
Energy Technology Data Exchange (ETDEWEB)
Shin, J; Park, S; Jeong, J; Jeong, C [National Cancer Center, Goyang, Gyeonggi-do (Korea, Republic of); Lim, Y; Lee, S [National Cancer Center in Korea, Goyang, Gyeonggi-do (Korea, Republic of); SHIN, D [National Cancer Center, Goyangsi, Gyeonggi-do (Korea, Republic of); Incerti, S [Universite Bordeaux 1, CNRS.IN2P3, Centres d’Etudes Nucleaires de Bordeau, Gradignan, Gradignan (France)
2014-06-01
Purpose: In particle therapy and radiobiology, the investigation of the mechanisms leading to the death of target cancer cells induced by ionising radiation is an active field of research. Recently, several studies based on Monte Carlo simulation codes have been initiated in order to simulate the physical interactions of ionising particles at the cellular scale and in DNA. Geant4-DNA is one of them; it is an extension of the general purpose Geant4 Monte Carlo simulation toolkit for the simulation of physical interactions at the sub-micrometre scale. In this study, we present Geant4-DNA Monte Carlo simulations for the prediction of DNA strand breakage using a geometrical modelling of the DNA structure. Methods: For the simulation of DNA strand breakage, we developed a specific DNA geometrical structure. This structure consists of DNA components such as the deoxynucleotide pairs, the DNA double helix, the nucleosomes and the chromatin fibre. Each component is made of water because the cross section models currently available in Geant4-DNA for protons apply to liquid water only. At the macroscopic scale, protons were generated with the various energies available for proton therapy at the National Cancer Center, using validated proton beam simulations developed in previous studies. These multi-scale simulations were combined for the validation of Geant4-DNA in radiobiology. Results: In the double helix structure, the energy deposited in a strand allowed us to determine direct DNA damage from physical interactions. In other words, the amount of dose and the frequency of damage in microscopic geometries were related to the direct radiobiological effect. Conclusion: In this report, we calculated the frequency of DNA strand breakage using Geant4-DNA physics processes for liquid water. This study is ongoing, with the aim of developing geometries that use realistic DNA material instead of liquid water. This will be tested as soon as cross sections for DNA material become available in Geant4.
Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study
Kim, Jin Sung; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih
2015-01-01
Two fully rotating gantries with different nozzles (a multipurpose nozzle with MLC and a scanning-dedicated nozzle) and a conventional cyclotron system are installed and under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate the neutron dose equivalent per therapeutic dose, H/D, delivered to x-ray imaging equipment under various treatment conditions with Monte Carlo simulation. First, we investigated H/D with various modifications of the beam line devices (scattering, scanning, multi-leaf collimator, aperture, compensator) at isocenter and at 20, 40 and 60 cm distance from isocenter, and compared the results with those of other research groups. Next, we investigated the neutron dose at the x-ray equipment used for real-time imaging under various treatment conditions. Our investigation showed 0.07 to 0.19 mSv/Gy at the x-ray imaging equipment according to the various treatment options and, interestingly, a 50% neutron dose reduction effect at the flat panel detector was observed due to multi-lea...
International Nuclear Information System (INIS)
Monte Carlo simulations play a crucial role for in-vivo treatment monitoring based on PET and prompt gamma imaging in proton and carbon-ion therapies. The accuracy of the nuclear fragmentation models implemented in these codes might affect the quality of the treatment verification. In this paper, we investigate the nuclear models implemented in GATE/Geant4 and FLUKA by comparing the angular and energy distributions of secondary particles exiting a homogeneous target of PMMA. Comparison results were restricted to the fragmentation of 16O and 12C. Despite the very simple target and set-up, substantial discrepancies were observed between the two codes. For instance, the number of high energy (>1 MeV) prompt gammas exiting the target was about twice as large with GATE/Geant4 as with FLUKA, for both proton and carbon ion beams. Such differences were not observed for the predicted annihilation photon production yields, for which ratios of 1.09 and 1.20 were obtained between GATE and FLUKA for the proton beam and the carbon ion beam, respectively. For neutrons and protons, discrepancies from 14% (exiting protons, carbon ion beam) to 57% (exiting neutrons, proton beam) were identified in the production yields, as well as in the energy spectra for neutrons. (paper)
Energy Technology Data Exchange (ETDEWEB)
Cho, S; Shin, E H; Kim, J; Ahn, S H; Chung, K; Kim, D-H; Han, Y; Choi, D H [Samsung Medical Center, Seoul (Korea, Republic of)
2015-06-15
Purpose: To evaluate the shielding wall design that protects patients, staff and members of the general public from secondary neutrons, using a simple analytic solution and the transport codes MCNPX, ANISN and FLUKA. Methods: Analytical and multi-code transport calculations were performed for the proton facility (Sumitomo Heavy Industries, Ltd.) at Samsung Medical Center in Korea. The NCRP-144 analytical evaluation methods, which produce conservative estimates of the dose equivalent values for the shielding, were used for the analytical evaluations. The radiation transport was then simulated with the Monte Carlo codes. The neutron dose at each evaluation point was obtained as the product of the simulated value and the neutron dose coefficients introduced in ICRP-74. Results: The evaluation points at the accelerator control room and the control room entrance are mainly influenced by the location of the proton beam loss. The neutron dose equivalent at the accelerator control room evaluation point is 0.651, 1.530, 0.912 and 0.943 mSv/yr, and at the entrance of the cyclotron room 0.465, 0.790, 0.522 and 0.453 mSv/yr, as calculated by the NCRP-144 formalism, ANISN, FLUKA and MCNPX, respectively. Most results from MCNPX and FLUKA, which used the complicated geometry, were smaller than those of ANISN. Conclusion: The neutron shielding for a proton therapy facility has been evaluated by the analytic model and Monte Carlo methods. We confirmed that the shielding adequately protects areas accessible to people when the proton facility is operated.
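The dose folding described in the Methods, taking the product of a simulated neutron quantity and ICRP-74 dose coefficients, can be sketched as follows. All numbers below are order-of-magnitude placeholders invented for the example, not values from the study or from the ICRP-74 tables:

```python
# Fold a simulated neutron fluence spectrum with fluence-to-ambient-dose-
# equivalent coefficients, binned coarsely by energy group.
fluence_per_proton = {   # neutrons/cm^2 per source proton (placeholders)
    "thermal": 2.0e-10,
    "1 MeV":   5.0e-11,
    "10 MeV":  1.0e-11,
}
h_coeff_psv_cm2 = {      # pSv*cm^2 per neutron (placeholders)
    "thermal": 12.0,
    "1 MeV":   416.0,
    "10 MeV":  520.0,
}
protons_per_year = 1e18  # hypothetical annual workload

# H*(10) per year [Sv] = sum over energy groups of fluence * coefficient
h_star_sv = sum(
    fluence_per_proton[g] * h_coeff_psv_cm2[g] * 1e-12  # pSv -> Sv
    for g in fluence_per_proton
) * protons_per_year
```

The per-code differences reported above then enter through the simulated fluence term, while the coefficient set is fixed by the ICRP-74 convention.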
CERN Summer Student Report 2016 Monte Carlo Data Base Improvement
Caciulescu, Alexandru Razvan
2016-01-01
During my Summer Student project I worked on improving the Monte Carlo Data Base and MonALISA services for the ALICE Collaboration. The project included learning the infrastructure for tracking and monitoring of the Monte Carlo productions as well as developing a new RESTful API for seamless integration with the JIRA issue tracking framework.
MONTE: An automated Monte Carlo based approach to nuclear magnetic resonance assignment of proteins
Energy Technology Data Exchange (ETDEWEB)
Hitchens, T. Kevin; Lukin, Jonathan A.; Zhan Yiping; McCallum, Scott A.; Rule, Gordon S. [Carnegie Mellon University, Department of Biological Sciences (United States)], E-mail: rule@andrew.cmu.edu
2003-01-15
A general-purpose Monte Carlo assignment program has been developed to aid in the assignment of NMR resonances from proteins. By virtue of its flexible data requirements the program is capable of obtaining assignments of both heavily deuterated and fully protonated proteins. A wide variety of source data, such as inter-residue scalar connectivity, inter-residue dipolar (NOE) connectivity, and residue specific information, can be utilized in the assignment process. The program can also use known assignments from one form of a protein to facilitate the assignment of another form of the protein. This attribute is useful for assigning protein-ligand complexes when the assignments of the unliganded protein are known. The program can also be used as an interactive research tool to assist in the choice of additional experimental data to facilitate completion of assignments. The assignment of a deuterated 45 kDa homodimeric Glutathione-S-transferase illustrates the principal features of the program.
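A toy version of such a Monte Carlo assignment search, here a simulated-annealing swap search over a pairwise cost matrix, illustrates the general idea. It is not the MONTE program's algorithm, and the cost model and all names are invented for the sketch:

```python
import math
import random

def mc_assign(cost, n_steps=20000, t0=2.0, seed=0):
    """Find a permutation (spin system -> residue) minimizing a cost
    matrix via Metropolis-accepted pair swaps with a cooling schedule."""
    rng = random.Random(seed)
    n = len(cost)
    perm = list(range(n))
    e = sum(cost[i][perm[i]] for i in range(n))
    for step in range(n_steps):
        t = t0 * (1.0 - step / n_steps) + 1e-3  # linear cooling
        i, j = rng.randrange(n), rng.randrange(n)
        de = (cost[i][perm[j]] + cost[j][perm[i]]
              - cost[i][perm[i]] - cost[j][perm[j]])
        if de <= 0 or rng.random() < math.exp(-de / t):
            perm[i], perm[j] = perm[j], perm[i]
            e += de
    return perm, e

# Toy cost matrix for which the identity assignment is optimal (cost 0)
cost = [[abs(i - j) for j in range(6)] for i in range(6)]
perm, e = mc_assign(cost)
```

In a real assignment problem the cost would encode agreement with scalar and NOE connectivities rather than a simple distance.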
Hydrogen-bonded proton transfer in the protonated guanine-cytosine (GC+H)+ base pair.
Lin, Yuexia; Wang, Hongyan; Gao, Simin; Schaefer, Henry F
2011-10-13
The single proton transfer at the different sites of the Watson-Crick (WC) guanine-cytosine (GC) DNA base pair is studied here using density functional methods. The conventional protonated structures, transition state (TS) and proton-transferred product (PT) structures of every relevant species are optimized. Each transition state and proton-transferred product structure has been compared with the corresponding conventional protonated structure to demonstrate the process of proton transfer and the change of geometrical structures. The relative energies of the protonated tautomers and the proton-transfer energy profiles in gas and solvent are analyzed. The proton-transferred product structure G(+H(+))-H(+)C(N3)(-H(+))(PT) has the lowest relative energy, for which only two hydrogen bonds exist. Almost all 14 isomers of the protonated GC base pair involve hydrogen-bonded proton transfer following the three pathways, with the exception of structure G-H(+)C(O2). When the positive charge is primarily "located" on the guanine moiety (H(+)G-C, G-H(+)C(C4), and G-H(+)C(C6)), the H(1) proton transfers from the N(1) site of guanine to the N(3) site of cytosine. The structures G-H(+)C(C5) and G-H(+)C(C4) involve H(4a) proton transfer from the N(4) of cytosine to the O(6) site of guanine. H(2a) proton transfer from the N(2) site of guanine to the O(2) site of cytosine is found only for the structure G-H(+)C(C4). The structures in which a proton is added at the six-centered sites adjoining the hydrogen bonds are more prone to proton transfer in the gas phase, whereas a proton added on the minor groove and the sites adjoining the hydrogen bonds favors proton transfer energetically in the aqueous phase.
Monte carlo computation of the energy deposited by protons in water, bone and adipose
Küçer, Rahmi; Küçer, Nermin; Türemen, Görkem
2013-02-01
Protons are most suitable for treating deeply-seated tumors due to their unique depth dose distribution. The maximum dose of protons is a pronounced peak, called the Bragg peak, with zero dose behind the peak. The objective of radiation therapy with protons is to deliver the dose to the target volume by using this type of distribution. This is achieved with a finite number of Bragg peaks at the depth of the target volume. The location of the peak in terms of depth depends on the energy of the protons. Simulations are used to determine the depth dose distribution of proton beams passing through tissue, so it is important that experimental data agree with the simulation data. In this study, we used the FLUKA computer code to determine the correct position of the Bragg peak for proton beams passing through water, bone and adipose, and the results were compared with experimental data.
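The energy dependence of the peak depth mentioned above is often approximated by the Bragg-Kleeman rule R ≈ αE^p. A sketch with commonly quoted parameters for protons in water (α ≈ 0.0022 cm, p ≈ 1.77; an approximation to CSDA ranges, not the FLUKA calculation of this study):

```python
def bragg_kleeman_range(e_mev, alpha=0.0022, p=1.77):
    """Approximate proton range in water [cm] via the Bragg-Kleeman rule
    R = alpha * E^p, with alpha and p fitted for water."""
    return alpha * e_mev ** p

for e in (70.0, 150.0, 250.0):
    print(f"{e:5.0f} MeV -> ~{bragg_kleeman_range(e):5.1f} cm in water")
```

For 250 MeV this yields roughly 38.6 cm, consistent with the water R50 of 38.52 cm quoted in another record of this collection.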
Energy Technology Data Exchange (ETDEWEB)
Palmans, H. [Ghent Univ. (Belgium). Dept. of Biomedical Physics; Verhaegen, F.
1995-12-01
In the last decade, several clinical proton beam therapy facilities have been developed. To satisfy the demand for uniformity in clinical (routine) proton beam dosimetry, two dosimetry protocols (ECHED and AAPM) have been published. Both protocols neglect the influence of ion chamber dependent parameters on dose determination in proton beams because of the scatter properties of these beams, although the problem has not yet been studied thoroughly. A comparison between water calorimetry and ionisation chamber dosimetry showed a discrepancy of 2.6% between the former method and ionometry following the ECHED protocol. Possibly, a small part of this difference can be attributed to chamber dependent correction factors. Indications for this possibility are found in ionometry measurements. To allow the simulation of complex geometries with different media, necessary for the study of those corrections, an existing proton Monte Carlo code (PTRAN, Berger) has been modified. The original code, which applies Molière's multiple scattering theory and Vavilov's energy straggling theory, calculates depth dose profiles, energy distributions and radial distributions for pencil beams in water. Comparisons with measurements and calculations reported in the literature are done to test the program's accuracy. Preliminary results of the influence of chamber design and chamber materials on dose-to-water determination are presented.
Gomà, Carles; Andreo, Pedro; Sempau, Josep
2016-03-01
This work calculates beam quality correction factors (kQ) in monoenergetic proton beams using detailed Monte Carlo simulation of ionization chambers. It uses the Monte Carlo code PENH and the electronic stopping powers resulting from the adoption of two different sets of mean excitation energy values for water and graphite: (i) the currently recommended ICRU 37 and ICRU 49 values, I_w = 75 eV and I_g = 78 eV, and (ii) the recently proposed I_w = 78 eV and I_g = 81.1 eV. Twelve different ionization chambers were studied. The kQ factors calculated using the two different sets of I-values were found to agree with each other within 1.6% or better. kQ factors calculated using the current ICRU I-values were found to agree within 2.3% or better with the kQ factors tabulated in IAEA TRS-398, and within 1% or better with experimental values published in the literature. kQ factors calculated using the new I-values were also found to agree within 1.1% or better with the experimental values. This work concludes that perturbation correction factors in proton beams, currently assumed to be equal to unity, are in fact significantly different from unity for some of the ionization chambers studied.
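The percent-level sensitivity of stopping powers to the adopted I-values can be illustrated with the logarithmic term of the Bethe formula. This is a rough back-of-the-envelope estimate, not the PENH calculation:

```python
import math

ME_C2 = 0.511    # electron rest energy [MeV]
MP_C2 = 938.272  # proton rest energy [MeV]

def bethe_log_term(t_mev, i_ev):
    """Energy-dependent bracket of the Bethe formula, ln(2*me*c^2*
    (beta*gamma)^2 / I) - beta^2, which carries the I-value dependence."""
    gamma = 1.0 + t_mev / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    bg2 = gamma**2 - 1.0
    return math.log(2.0 * ME_C2 * 1e6 * bg2 / i_ev) - beta2

# Relative change in the water stopping power at 100 MeV when the mean
# excitation energy is moved from 75 eV (ICRU 37/49) to the proposed 78 eV
old, new = bethe_log_term(100.0, 75.0), bethe_log_term(100.0, 78.0)
delta = (new - old) / old
print(f"~{100 * delta:.2f}% change in S_w at 100 MeV")
```

A roughly half-percent shift in the water stopping power is consistent in magnitude with the sub-2% kQ differences reported between the two I-value sets.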
MCHIT - Monte Carlo model for proton and heavy-ion therapy
Pshenichnov, Igor; Greiner, Walter
2007-01-01
We study the propagation of nucleons and nuclei in tissue-like media within a Monte Carlo Model for Heavy-ion Therapy (MCHIT) based on the GEANT4 toolkit (version 8.2). The model takes into account fragmentation of projectile nuclei and secondary interactions of produced nuclear fragments. Model predictions are validated with available experimental data obtained for water and PMMA phantoms irradiated by monoenergetic carbon-ion beams. The MCHIT model describes well (1) the depth-dose distributions in water and PMMA, (2) the doses measured for fragments of certain charge, (3) the distributions of positron emitting nuclear fragments produced by carbon-ion beams, and (4) the energy spectra of secondary neutrons measured at different angles to the beam direction. Radial dose profiles for primary nuclei and for different projectile fragments are calculated and discussed as possible input for evaluation of biological dose distributions. It is shown that at the periphery of the transverse dose profile close to the B...
Schmid, S.; Landry, G.; Thieke, C.; Verhaegen, F.; Ganswindt, U.; Belka, C.; Parodi, K.; Dedes, G.
2015-12-01
Proton range verification based on prompt gamma imaging is increasingly considered in proton therapy. Tissue heterogeneity normal to the beam direction or near the end of range may considerably degrade the ability of prompt gamma imaging to detect proton range shifts. The goal of this study was to systematically investigate the accuracy and precision of range detection from prompt gamma emission profiles for various fractions of intensity-modulated proton therapy of prostate cancer, using a comprehensive clinical dataset of 15 different CT scans for 5 patients. Monte Carlo simulations using Geant4 were performed to generate spot-by-spot dose distributions and prompt gamma emission profiles for prostate treatment plans. The prompt gammas were scored at their point of emission. Three CT scans of the same patient were used to evaluate the impact of inter-fractional changes on proton range. The range shifts deduced from the comparison of prompt gamma emission profiles in the planning CT and subsequent CTs were then correlated to the corresponding range shifts deduced from the dose distributions for individual pencil beams. The distributions of range shift differences between prompt gamma and dose were evaluated in terms of precision (defined as half the 95% inter-percentile range, IPR) and accuracy (median). In total, about 1700 individual proton pencil beams were investigated. The IPR of the relative range shift differences between the dose profiles and the prompt gamma profiles varied between ±1.4 mm and ±2.9 mm when using the more robust profile shifting analysis. The median was found to be smaller than 1 mm. Methods to identify and reject spots unreliable for range verification due to range mixing were derived and resulted in an average 10% spot rejection, clearly improving the prompt gamma-dose correlation. This work supports that prompt gamma imaging can offer a reliable indicator of range changes due to anatomical variations and tissue heterogeneity.
Impact of the material composition on proton range variation - A Monte Carlo study
Wu, S. W.; Tung, C. J.; Lee, C. C.; Fan, K. H.; Huang, H. C.; Chao, T. C.
2015-11-01
In this study, we used the Geant4 toolkit to demonstrate the impact of the material composition of tissues on proton range variation. Bragg curves of different materials subjected to a 250 MeV mono-energetic proton beam were simulated and compared. The simulated materials included adipose, heart, brain, cartilage, cortical bone and water. The results showed significant proton range deviations between the Bragg curves, especially for cortical bone. The R50 values for a 250 MeV proton beam were approximately 39.55 cm, 35.52 cm, 37.00 cm, 36.51 cm, 36.72 cm, 22.53 cm and 38.52 cm in phantoms composed entirely of adipose, cartilage, tissue, heart, brain, cortical bone and water, respectively. Mass density and electron density were used to scale the proton range for each material; electron density provided the better range scaling. In addition, a similar comparison was performed by artificially setting all material densities to 1.0 g/cm3 to evaluate the range deviation due to chemical composition alone. Tissue heterogeneity effects due to density variation were more significant; those due to chemical composition were less significant unless Z/A differed greatly.
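The electron-density range scaling found to work best above amounts to dividing the water range by the relative electron density of the medium. In the sketch below, the relative electron densities are illustrative values chosen to be consistent with the study's quoted R50 numbers, not tabulated tissue data:

```python
# Scale a water range to other media by relative electron density:
# R_medium ~ R_water / rho_e,rel
r50_water_cm = 38.52  # 250 MeV water R50 from the study above
rel_electron_density = {
    "adipose":       0.974,  # illustrative, consistent with R50 ~ 39.55 cm
    "cortical bone": 1.71,   # illustrative, consistent with R50 ~ 22.53 cm
}

scaled = {name: r50_water_cm / rho_e
          for name, rho_e in rel_electron_density.items()}
for name, r in scaled.items():
    print(f"{name:14s} ~{r:6.2f} cm")
```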
Energy Technology Data Exchange (ETDEWEB)
Farah, J; Bonfrate, A; Donadille, L; Dubourg, N; Lacoste, V; Martinetti, F; Sayah, R; Trompier, F; Clairand, I [IRSN - Institute for Radiological Protection and Nuclear Safety, Fontenay-aux-roses (France); Caresana, M [Politecnico di Milano, Milano (Italy); Delacroix, S; Nauraye, C [Institut Curie - Centre de Protontherapie d Orsay, Orsay (France); Herault, J [Centre Antoine Lacassagne, Nice (France); Piau, S; Vabre, I [Institut de Physique Nucleaire d Orsay, Orsay (France)
2014-06-01
Purpose: To measure stray radiation inside a passive scattering proton therapy facility, compare the values to Monte Carlo (MC) simulations, and identify the actual needs and challenges. Methods: Measurements and MC simulations were considered to acknowledge the neutron exposure associated with 75 MeV ocular or 180 MeV intracranial passively scattered proton treatments. First, using a specifically-designed high sensitivity Bonner Sphere system, neutron spectra were measured at different positions inside the treatment rooms. Next, measurement-based mapping of neutron ambient dose equivalent was fulfilled using several TEPCs and rem-meters. Finally, photon and neutron organ doses were measured using TLDs, RPLs and PADCs set inside anthropomorphic phantoms (Rando, 1- and 5-year-old CIRS). All measurements were also simulated with MCNPX to investigate the efficiency of MC models in predicting stray neutrons with different nuclear cross sections and models. Results: Knowledge of the neutron fluence and energy distribution inside a proton therapy room is critical for stray radiation dosimetry. However, as spectrometry unfolding is initiated using an MC guess spectrum and suffers from algorithmic limits, a 20% spectrometry uncertainty is expected. H*(10) mapping with TEPCs and rem-meters showed good agreement between the detectors. Differences within measurement uncertainty (10–15%) were observed and are inherent to the energy, fluence and directional response of each detector. For a typical ocular and intracranial treatment, respectively, neutron doses of 0.4 and 11 mGy outside the clinical target volume were measured inside the Rando phantom. Photon doses were 2–10 times lower depending on organ position. High uncertainties (40%) are inherent to TLD and PADC measurements due to the need for neutron spectra at the detector position. Finally, stray neutron prediction with MC simulations proved to be extremely dependent on proton beam energy and the used nuclear models and
DEFF Research Database (Denmark)
Palmans, Hugo; Al-Sulaiti, L; Andreo, P;
2013-01-01
The conversion of absorbed dose-to-graphite in a graphite phantom to absorbed dose-to-water in a water phantom is performed by water-to-graphite stopping power ratios. If, however, the charged particle fluence is not equal at equivalent depths in graphite and water, a fluence correction factor, kfl, is required as well. This is particularly relevant to the derivation of absorbed dose-to-water, the quantity of interest in radiotherapy, from a measurement of absorbed dose-to-graphite obtained with a graphite calorimeter. In this work, fluence correction factors for the conversion from dose-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general the fluence correction factors are found to be close to unity.
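The conversion chain the abstract describes, a stopping-power ratio multiplied by a fluence correction, amounts to a simple product. The numerical values below are illustrative placeholders, not the paper's results:

```python
# Convert a graphite-calorimeter dose reading to dose-to-water:
# D_w = D_g * s_(w,g) * k_fl
d_graphite_gy = 1.000     # measured absorbed dose to graphite [Gy]
s_water_graphite = 1.012  # water/graphite stopping-power ratio (placeholder)
k_fl = 1.003              # fluence correction factor, close to unity

d_water_gy = d_graphite_gy * s_water_graphite * k_fl
print(f"D_w = {d_water_gy:.4f} Gy")
```

Because kfl is close to unity, neglecting it biases D_w only at the few-tenths-of-a-percent level in this example, which is why its careful evaluation matters mainly for primary-standard dosimetry.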
Monte Carlo simulations of soft proton flares: testing the physics with XMM-Newton
Fioretti, Valentina; Malaguti, Giuseppe; Spiga, Daniele; Tiengo, Andrea
2016-01-01
Low energy protons (<100-300 keV) in the Van Allen belt and the outer regions can enter the field of view of X-ray focusing telescopes, interact with the Wolter-I optics, and reach the focal plane. The use of special filters protects the XMM-Newton focal plane below an altitude of 70000 km, but above this limit the effect of soft protons is still present in the form of sudden flares in the count rate of the EPIC instruments, causing the loss of large amounts of observing time. We try to characterize the input proton population and the interaction physics by simulating, using the BoGEMMS framework, the proton interaction with a simplified model of the X-ray mirror module and the focal plane, and comparing the result with a real observation. The analysis of ten orbits of observations with the EPIC/pn instrument shows that the detection of flares in regions far outside the radiation belt is largely influenced by the different orientation of the Earth's magnetosphere with respect to XMM-Newton's orbit, confirming th...
Afanasiev, Alexandr; Vainio, Rami
2016-01-01
Context. Solar energetic particles observed in association with coronal mass ejections (CMEs) are produced by CME-driven shock waves. The acceleration of particles is considered to be due to diffusive shock acceleration (DSA). Aims. We aim at a better understanding of DSA in the case of quasi-parallel shocks, in which self-generated turbulence in the shock vicinity plays a key role. Methods. We have developed and applied a new Monte Carlo simulation code for the acceleration of protons in parallel coronal shocks. The code performs a self-consistent calculation of resonant interactions of particles with Alfvén waves based on quasi-linear theory. In contrast to existing Monte Carlo codes for DSA, the new code features the full quasi-linear resonance condition of particle pitch-angle scattering. This allows us to take the anisotropy of particle pitch-angle scattering into account, while the older codes implement an approximate resonance condition leading to isotropic scattering. We performed simulations with...
Wet-based glaciation in Phlegra Montes, Mars.
Gallagher, Colman; Balme, Matt
2016-04-01
Eskers are sinuous landforms composed of sediments deposited from meltwaters in ice-contact glacial conduits. This presentation describes the first definitive identification of eskers on Mars still physically linked with their parent system (1), a Late Amazonian-age glacier (~150 Ma) in Phlegra Montes. Previously described Amazonian-age glaciers on Mars are generally considered to have been dry based, having moved by creep in the absence of subglacial water required for sliding, but our observations indicate significant sub-glacial meltwater routing. The confinement of the Phlegra Montes glacial system to a regionally extensive graben is evidence that the esker formed due to sub-glacial melting in response to an elevated, but spatially restricted, geothermal heat flux rather than climate-induced warming. Now, however, new observations reveal the presence of many assemblages of glacial abrasion forms and associated channels that could be evidence of more widespread wet-based glaciation in Phlegra Montes, including the collapse of several distinct ice domes. This landform assemblage has not been described in other glaciated, mid-latitude regions of the martian northern hemisphere. Moreover, Phlegra Montes are flanked by lowlands displaying evidence of extensive volcanism, including contact between plains lava and piedmont glacial ice. These observations provide a rationale for investigating non-climatic forcing of glacial melting and associated landscape development on Mars, and can build on insights from Earth into the importance of geothermally-induced destabilisation of glaciers as a key amplifier of climate change. (1) Gallagher, C. and Balme, M. (2015). Eskers in a complete, wet-based glacial system in the Phlegra Montes region, Mars, Earth and Planetary Science Letters, 431, 96-109.
Monte Carlo Predictions of Proton SEE Cross-Sections from Heavy Ion Test Data
Xi, Kai; Zhang, Zhan-Gang; Hou, Ming-Dong; Sun, You-Mei; Luo, Jie; Liu, Tian-Qi; Wang, Bin; Ye, Bing; Yin, Ya-Nan; Liu, Jie
2015-01-01
The limitations of previous methods prompted us to design a new approach (named PRESTAGE) to predict proton single event effect (SEE) cross-sections using heavy-ion test data. To simulate the SEE mechanisms more realistically, we adopt Geant4 and a location-dependent strategy to describe the physics processes and the sensitivity of the device. Cross-sections predicted by PRESTAGE for over twenty devices are compared with the measured data. The evidence shows that PRESTAGE can calculate not only single event upsets induced by proton indirect ionization, but also direct ionization effects and single event latch-ups. Most of the PRESTAGE-calculated results agree with the experimental data within a factor of 2-3.
Elmekawy, Ahmed Farouk
The distal edge of therapeutic proton radiation beams was investigated by different methods. Proton beams produced at the Hampton University Proton Therapy Institute (HUPTI) were used to irradiate a polymethylmethacrylate (PMMA) phantom for three different ranges (13.5, 17.0 and 21.0 cm) to investigate the distal slope dependence of the Bragg peak. The activation of 11C was studied by scanning the phantom less than 10 minutes post-irradiation with a Philips Big Bore Gemini(c) PET/CT. The DICOM images were imported into the Varian Eclipse(c) Treatment Planning System (TPS) and then analyzed with ImageJ(c). The distal slope ranged from -0.1671 +/- 0.0036 to -0.1986 +/- 0.0052 (pixel intensity/slice number) for ranges of 13.5 to 21.0 cm, respectively. A realistic description of the setup was modeled using the GATE 7.0 Monte Carlo simulation tool and compared to the experimental data. The results show the distal slope ranged from -0.1158 +/- 0.0133 to -0.0787 +/- 0.002 (Gy/mm). Additionally, low-activity 11C sources were simulated to study the dependence of the reconstructed 11C half-life on the initial activity for six ranges chosen around the previous activation study. The results of the expected/nominal half-life vs. activity ranged from -5 x 10^-4 +/- 2.8104 x 10^-4 to 1.6 x 10^-3 +/- 9.44 x 10^-4 (%diff./Bq). The comparison between two experiments with proton beams on a PMMA phantom and a multi-layer ion chamber, and two GATE simulations of a proton beam incident on a water phantom and an 11C PET study, shows that: (i) the variation in steepness of the distal fall-off slopes is similar in all cases, validating the sensitivity of the PET technique to range degradation, and (ii) the average difference of the super-ratios between all studies is primarily due to the difference in the dose deposited in the media.
Energy Technology Data Exchange (ETDEWEB)
Guan, Fada; Peeler, Christopher; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Mohan, Radhe; Titt, Uwe, E-mail: UTitt@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States); Bronk, Lawrence [Department of Experimental Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States); Geng, Changran [Department of Nuclear Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China and Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States); Grosshans, David [Department of Experimental Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States)
2015-11-15
Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the GEANT 4 Monte Carlo code. Furthermore, the purpose was also to provide a recommendation to select an appropriate LET quantity from GEANT 4 simulations to correlate with biological effectiveness of therapeutic protons. Methods: The authors developed a particle tracking step based strategy to calculate the average LET quantities (track-averaged LET, LET{sub t} and dose-averaged LET, LET{sub d}) using GEANT 4 for different tracking step size limits. A step size limit refers to the maximally allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LET{sub t} and LET{sub d} of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information including fluence spectra and dose spectra of the energy-deposition-per-step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LET{sub t} but significant for LET{sub d}. This resulted from differences in the energy-deposition-per-step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in GEANT 4 can result in incorrect LET{sub d} calculation results in the dose plateau region for small step limits. The erroneous LET{sub d} results can be attributed to the algorithm to
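The step-limit sensitivity follows from how the two averages weight each tracking step: the track average weights all steps equally (by fluence), while the dose average weights them by deposited energy, so a few large deposits dominate. A schematic illustration over hypothetical (energy-deposit, step-length) pairs, not the GEANT 4 scorer itself:

```python
def track_averaged_let(steps):
    """steps: iterable of (energy_deposit, step_length) pairs.
    LET_t: unweighted (fluence-weighted) mean of eps_i / l_i."""
    ratios = [eps / l for eps, l in steps]
    return sum(ratios) / len(ratios)

def dose_averaged_let(steps):
    """LET_d: mean of eps_i / l_i weighted by the deposit eps_i,
    i.e. sum(eps_i^2 / l_i) / sum(eps_i)."""
    num = sum(eps * (eps / l) for eps, l in steps)
    den = sum(eps for eps, _ in steps)
    return num / den

# A single large deposit (illustrative units: keV, um) dominates
# LET_d but barely moves LET_t.
steps = [(1.0, 1.0)] * 9 + [(10.0, 1.0)]
print(track_averaged_let(steps))  # 1.9
print(dose_averaged_let(steps))   # 109/19 ~ 5.74
```

Because the energy-deposition-per-step distribution changes with the step-size limit, LET_d computed this way inherits that dependence while LET_t largely does not, which is the effect the abstract describes.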
Bauer, J.; Unholtz, D.; Kurz, C.; Parodi, K.
2013-08-01
We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation-induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. To this end, five different materials were irradiated by monoenergetic proton pencil beams at various energies and the induced β+ activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow us to model the absolute level of measured β+ activity induced in the investigated
Parallel proton transfer pathways in aqueous acid-base reactions
Cox, M. J.; Bakker, H.J.
2008-01-01
We study the mechanism of proton transfer (PT) between the photoacid 8-hydroxy-1,3, 6-pyrenetrisulfonic acid (HPTS) and the base chloroacetate in aqueous solution. We investigate both proton and deuteron transfer reactions in solutions with base concentrations ranging from 0.25M to 4M. Using femtosecond midinfrared spectroscopy, we probe the vibrational responses of HPTS, its conjugate photobase, the hydrated proton/deuteron, and chloroacetate. The measurement of these four resonances allows ...
Proton irradiation on textured bismuth based cuprate superconductors
International Nuclear Information System (INIS)
Textured bulk polycrystalline samples of bismuth-based cuprate superconductors have been subjected to irradiation with 15 MeV protons. In the case of Bi-2212, there was a substantial increase in Tc, which may be due to proton-induced knock-out of loosely bound oxygen. In the case of (Bi,Pb)-2223, there was a reduction in Tc. The difference in the behaviour of these two systems under proton irradiation is explained. (author). 7 refs., 3 figs., 1 tab
Energy Technology Data Exchange (ETDEWEB)
Young, L; Yang, F [Univ Washington, Seattle, WA (United States)
2015-06-15
Purpose: The application of optically stimulated luminescence dosimeters (OSLDs) may be extended to clinical investigations verifying irradiated doses in small animal models. In proton beams, accurate positioning of the Bragg peak is essential for tumor targeting. The purpose of this study was to estimate the displacement of a pristine Bragg peak when an Al2O3:C nanodot (Landauer, Inc.) is placed on the surface of a water phantom and to evaluate corresponding changes in dose. Methods: Clinical proton pencil beam simulations were carried out using TOPAS, a Monte Carlo platform layered on top of GEANT4. Point-shaped beams with no energy spread were modeled for energies of 100, 150, 200, and 250 MeV. Dose scoring for 100,000 particle histories was conducted within a water phantom (20 cm × 20 cm irradiated area, 40 cm depth) with its surface placed 214.5 cm away from the source. The modeled nanodot had a 4 mm radius and 0.2 mm thickness. Results: A comparative analysis of Monte Carlo depth dose profiles modeled for these proton pencil beams did not demonstrate an energy dependence in the Bragg peak shift. The shifts in Bragg peak depth for water phantoms modeled with a nanodot on the phantom surface ranged from 2.7 to 3.2 mm. In all cases, the Bragg peaks were shifted closer to the irradiation source. The peak dose in phantoms with an OSLD remained unchanged, with percent dose differences less than 0.55% when compared to phantom doses without the nanodot. Conclusion: Monte Carlo calculations show that the presence of OSLD nanodots in proton beam therapy will not change the position of a pristine Bragg peak by more than 3 mm. Although the 3.0 mm shift will not have a detrimental effect in patients receiving proton therapy, this effect may not be negligible in dose verification measurements for mouse models at lower proton beam energies.
Sahoo, G. S.; Tripathy, S. P.; Molokanov, A. G.; Aleynikov, V. E.; Sharma, S. D.; Bandyopadhyay, T.
2016-05-01
In this work, we used CR-39 detectors to estimate the LET (linear energy transfer) spectrum of secondary particles due to a 171 MeV proton beam at different depths in water, including the Bragg peak region. The measured LET spectra were compared with those obtained from a FLUKA Monte Carlo simulation. The absorbed dose (DLET) and dose equivalent (HLET) were estimated using the LET spectra. The values of DLET and HLET per incident proton fluence were found to increase with depth in water and were maximum at the Bragg peak.
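A dose equivalent of this kind is conventionally obtained by folding the measured LET fluence spectrum with a quality factor. The sketch below assumes the ICRP 60 Q(L) relationship and a unit-density medium, which may differ from the conversion actually used by the authors:

```python
def quality_factor(let):
    """ICRP 60 quality factor Q(L), with let in keV/um."""
    if let < 10.0:
        return 1.0
    if let <= 100.0:
        return 0.32 * let - 2.2
    return 300.0 / let ** 0.5

def dose_and_dose_equivalent(spectrum):
    """spectrum: iterable of (let [keV/um], fluence [1/cm^2]) bins.
    Returns (absorbed dose [Gy], dose equivalent [Sv]) for a
    unit-density (1 g/cm^3) medium, using per-bin
    D = 1.602e-9 * L * Phi."""
    d = h = 0.0
    for let, phi in spectrum:
        d_bin = 1.602e-9 * let * phi
        d += d_bin
        h += quality_factor(let) * d_bin
    return d, h
```

Because Q(L) grows steeply above 10 keV/um, the high-LET tail near the Bragg peak raises HLET faster than DLET, consistent with both being maximal there.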
Fuel-Cell Electrolytes Based on Organosilica Hybrid Proton Conductors
Narayan, Sri R.; Yen, Shiao-Pin S.
2008-01-01
A new membrane composite material has been developed that combines an organosilica proton conductor with perfluorinated Nafion material to achieve good proton conductivity and high-temperature performance for fuel-cell membranes in stationary, transportation, and portable applications. To achieve high proton conductivities of the order of 10^-1 S/cm over a wide range of temperatures, a composite membrane based on a new class of mesoporous, proton-conducting, hydrogen-bonded organosilica, used with Nafion, will allow for water retention and high proton conductivity over a wider range of temperatures than currently offered by Nafion alone. At the time of this reporting, this innovation is at the concept level. Some of the materials and processes investigated have shown good proton conductivity, but membranes have not yet been prepared and demonstrated.
Proton Spin Based On Chiral Dynamics
Weber, H. J.
1999-01-01
Chiral spin fraction models agree with the proton spin data only when the chiral quark-Goldstone boson couplings are pure spin-flip. For axial-vector coupling from soft-pion physics this is true for massless quarks but not for constituent quarks. Axial-vector quark-Goldstone boson couplings with constituent quarks are found to be inconsistent with the proton spin data.
Proton Conductivity and Operational Features Of PBI-Based Membranes
DEFF Research Database (Denmark)
Qingfeng, Li; Jensen, Jens Oluf; Precht Noyé, Pernille;
2005-01-01
As an approach to high temperature operation of PEMFCs, acid-doped PBI membranes are under active development. The membrane exhibits high proton conductivity under low water contents at temperatures up to 200°C. Mechanisms of proton conduction for the membranes have been proposed. Based...
Shielding calculations for a 250 MeV hospital-based proton accelerator
Energy Technology Data Exchange (ETDEWEB)
Agosteo, S. [Politecnico di Milano (Italy). Ist. di Ingegneria Nucleare; Arduini, G. [Fondazione Italiana Ricerca sul Cancro, via Corridoni 7, 20122 Milano (Italy); Bodei, G. [Politecnico di Milano (Italy). Ist. di Ingegneria Nucleare; Monti, S. [ENEA, ERG-FISS-FIRE, via Martiri di Monte Sole 4, 40129 Bologna (Italy); Padoani, F. [ENEA, ERG-FISS-FIRE, via Martiri di Monte Sole 4, 40129 Bologna (Italy); Silari, M. [Consiglio Nazionale delle Ricerche, Istituto Tecnologie Biomediche Avanzate, via Ampere 56, I-20131, Milano (Italy); Tinti, R. [ENEA, ERG-FISS-FIRE, via Martiri di Monte Sole 4, 40129 Bologna (Italy); Tromba, G. [Sincrotrone (``ELETTRA``) Trieste, Padriciano 99, 34012 Trieste (Italy)
1996-05-21
The shielding of the accelerators (250 MeV protons, 400 MeV/u {sup 16}O{sup 8+} ions) and treatment rooms of the Hadrontherapy Centre, a hospital-based facility under design in Italy, was determined by means of Monte Carlo calculations. The LCS and FLUKA codes were employed, together with analytical estimates carried out using empirical formulas from the literature, and the results were compared. In the case of 250 MeV protons, a 250 cm thick concrete wall ensures an annual dose equivalent lower than 2 mSv in the environments adjacent to the accelerator room. The best ceiling thickness was found to be 200 cm for a unitary occupancy factor. The photon dose equivalent beyond the concrete shield was also estimated using the LCS code. In the case of ions, the shield thickness was calculated using empirical formulas from the literature; the concrete thicknesses calculated for protons should ensure the required dose equivalent when some local shields are added. Monte Carlo calculations of the treatment room shielding were also carried out using the FLUKA code. (orig.).
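Empirical shielding formulas of the kind mentioned typically take a Moyer-model point-kernel form, with the dose equivalent attenuated exponentially through the shield and falling off with the square of the distance. A sketch with purely illustrative parameters (not the facility's actual source terms or attenuation lengths):

```python
import math

def dose_equivalent_behind_shield(h0, r, d, lam):
    """Moyer-type point-kernel estimate:
        H = H0 * exp(-d / lam) / r^2
    h0  : source term [Sv * m^2 per unit beam loss] (illustrative)
    r   : source-to-scoring-point distance [m]
    d   : shield thickness along the line of sight [m]
    lam : attenuation length of the shield material [m]
    """
    return h0 * math.exp(-d / lam) / r ** 2

# Illustrative numbers only: a 2.5 m concrete wall with an assumed
# 0.5 m attenuation length suppresses the unshielded dose equivalent
# by a factor exp(-5), i.e. roughly 150x.
print(dose_equivalent_behind_shield(1.0, 5.0, 2.5, 0.5))
```

Formulas of this family are what the Monte Carlo runs cross-check; the exponential dependence on d is why modest changes in wall thickness (here 200 vs 250 cm) change the annual dose estimate substantially.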
Shin, Wook-Geun; Shin, Jae-Ik; Jeong, Jong Hwi; Lee, Se Byeong
2015-01-01
For in vivo range verification in proton therapy, attempts have been made to measure the spatial distribution of the prompt gammas generated by proton-induced interactions, which is closely related to the proton dose distribution. However, the high energy of the prompt gammas and the background gammas remain problematic for measuring this distribution. In this study, we suggest a new method of determining the in vivo range by utilizing the time structure of the prompt gammas formed by the rotation of a range modulation wheel (RMW) in passive scattering proton therapy. To validate the Monte Carlo code simulating the proton beam nozzle, axial percent depth doses (PDDs) were compared with the measured PDDs for beam ranges varying from 4.73 to 24.01 cm. The relationship between the proton dose rate and the time structure of the prompt gammas was then assessed and compared in the water phantom. The PDD results showed accurate agreement, within relative errors of 1.1% in the distal range and 2.9% in...
Clinical dosimetry in photon radiotherapy. A Monte Carlo based investigation
International Nuclear Information System (INIS)
Practical clinical dosimetry is a fundamental step within the radiation therapy process and aims at quantifying the absorbed radiation dose to within 1-2% uncertainty. To achieve this level of accuracy, corrections are needed for the calibrated, air-filled ionization chambers used for dose measurement. The correction procedures are based on the Spencer-Attix cavity theory and are defined in current dosimetry protocols. Energy-dependent corrections for deviations from calibration beams account for the changed ionization chamber response in the treatment beam. The corrections applied are usually based on semi-analytical models or measurements and are generally hard to determine, as their magnitude is only a few percent or even less. Furthermore, the corrections are defined for fixed geometrical reference conditions and do not apply to non-reference conditions in modern radiotherapy applications. The stochastic Monte Carlo method for the simulation of radiation transport is becoming a valuable tool in the field of Medical Physics. As a tool suitable for calculating these corrections with high accuracy, such simulations enable the investigation of ionization chambers under various conditions. The aim of this work is the consistent investigation of ionization chamber dosimetry in photon radiation therapy with the use of Monte Carlo methods. Monte Carlo systems now exist that in principle enable the accurate calculation of ionization chamber response. Still, their direct use for studies of this type is limited by the long calculation times needed for a meaningful result with a small statistical uncertainty, inherent to every result of a Monte Carlo simulation. Besides heavy use of computer hardware, variance reduction techniques can be applied to reduce the needed calculation time. Methods for increasing the efficiency of the simulations were developed and incorporated into a modern and established Monte Carlo simulation environment
Bznuni, S A; Zhamkochyan, V M; Polanski, A; Sosnin, A N; Khudaverdyan, A H
2001-01-01
Parameters are studied of a subcritical cascade reactor driven by a proton accelerator and based on a primary lead-bismuth target, a main reactor constructed analogously to the molten salt breeder reactor (MSBR) core, and a booster reactor analogous to the core of the BN-350 liquid-metal-cooled fast breeder reactor (LMFBR). It is shown by means of Monte Carlo modeling that the reactor under study provides safe operation modes (k_eff = 0.94-0.98), is capable of transmuting radioactive nuclear waste effectively, and reduces the requirements on the accelerator beam current by an order of magnitude. Calculations show that the maximal neutron flux in the thermal zone is 10^14 cm^-2 s^-1 and in the fast booster zone is 5.12 x 10^15 cm^-2 s^-1 at k_eff = 0.98 and a proton beam current of I = 2.1 mA.
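The order-of-magnitude reduction in required beam current follows directly from the source multiplication of a subcritical core, M = 1/(1 - k_eff): the closer k_eff is to unity, the fewer external spallation neutrons are needed per unit power. A one-line arithmetic sketch:

```python
def source_multiplication(k_eff: float) -> float:
    """Total-to-source neutron multiplication M = 1 / (1 - k_eff)
    of a subcritical core driven by an external neutron source."""
    if not 0.0 < k_eff < 1.0:
        raise ValueError("k_eff must lie in (0, 1) for a subcritical core")
    return 1.0 / (1.0 - k_eff)

# Across the paper's quoted operating band k_eff = 0.94-0.98, the
# multiplication triples, which is what relaxes the demand on the
# accelerator beam current for a given reactor power.
print(source_multiplication(0.94))  # ~16.7
print(source_multiplication(0.98))  # 50.0
```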
Quantitative Monte Carlo-based holmium-166 SPECT reconstruction
Energy Technology Data Exchange (ETDEWEB)
Elschot, Mattijs; Smits, Maarten L. J.; Nijsen, Johannes F. W.; Lam, Marnix G. E. H.; Zonnenberg, Bernard A.; Bosch, Maurice A. A. J. van den; Jong, Hugo W. A. M. de [Department of Radiology and Nuclear Medicine, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands); Viergever, Max A. [Image Sciences Institute, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands)
2013-11-15
Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 ({sup 166}Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative {sup 166}Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum. Methods: A fast Monte Carlo (MC) simulator was developed for simulation of {sup 166}Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full {sup 166}Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (A{sup est}) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six {sup 166}Ho RE patients. Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80
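Statistical reconstructions of this type are commonly built on the ML-EM update, with the Monte Carlo simulator supplying the forward projection. A toy dense-matrix version (plain NumPy, not the SPECT-fMC implementation) shows the shape of the iteration:

```python
import numpy as np

def mlem(system_matrix, projections, n_iter=50):
    """ML-EM iteration: x <- x / (A^T 1) * A^T (y / (A x)).
    system_matrix A maps voxel activity x to expected detector
    counts y; in a Monte Carlo-based method, A x is evaluated by
    simulating photon transport rather than by a stored matrix."""
    A = np.asarray(system_matrix, dtype=float)
    y = np.asarray(projections, dtype=float)
    sens = A.sum(axis=0)              # A^T 1: per-voxel sensitivity
    x = np.ones(A.shape[1])           # flat initial estimate
    for _ in range(n_iter):
        expected = A @ x              # forward projection
        ratio = np.divide(y, expected, out=np.zeros_like(y),
                          where=expected > 0)
        x *= (A.T @ ratio) / sens     # multiplicative update
    return x
```

Modeling scatter, attenuation, and the collimator-detector response inside the forward projection (as SPECT-fMC does via MC simulation) is what makes the recovered activities quantitative.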
GPU-Monte Carlo based fast IMRT plan optimization
Directory of Open Access Journals (Sweden)
Yongbao Li
2014-03-01
Purpose: Intensity-modulated radiation treatment (IMRT) plan optimization needs pre-calculated beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computation speed. However, inaccurate beamlet dose distributions, particularly in cases with high levels of inhomogeneity, may mislead the optimization and hinder the resulting plan quality. It is desirable to use Monte Carlo (MC) methods for beamlet dose calculations, yet the long computational time from repeated dose calculations for a number of beamlets prevents this application. Our objective is to integrate a GPU-based MC dose engine in lung IMRT optimization using a novel two-step workflow. Methods: A GPU-based MC code, gDPM, is used. Each particle is tagged with the index of the beamlet from which the source particle originates. Deposited doses are stored separately for each beamlet based on the index. Due to the limited GPU memory size, a pyramid space is allocated for each beamlet, and dose outside the space is neglected. A two-step optimization workflow is proposed for fast MC-based optimization. In the first step, a rough dose calculation is conducted with only a small number of particles per beamlet. Plan optimization then follows to get an approximate fluence map. In the second step, more accurate beamlet doses are calculated, where the number of particles sampled for a beamlet is proportional to the intensity determined previously. A second round of optimization is conducted, yielding the final result. Results: For a lung case with 5317 beamlets, 10^5 particles per beamlet in the first round and 10^8 particles per beam in the second round are enough to get good plan quality. The total simulation time is 96.4 sec. Conclusion: A fast GPU-based MC dose calculation method along with a novel two-step optimization workflow has been developed. The high efficiency allows the use of MC for IMRT optimizations.
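The two-step workflow can be caricatured with a dense beamlet-dose matrix: optimize nonnegative beamlet weights against a target dose, then spend MC histories in proportion to the resulting intensities before reoptimizing. A sketch under those assumptions (projected gradient on a quadratic objective, not the paper's actual optimizer):

```python
import numpy as np

def optimize_fluence(dose_matrix, target, n_iter=200, lr=None):
    """Projected gradient descent on ||D w - t||^2 with w >= 0.
    dose_matrix D: per-beamlet dose to each voxel (columns = beamlets);
    a stand-in for the MC-computed beamlet doses."""
    D = np.asarray(dose_matrix, dtype=float)
    t = np.asarray(target, dtype=float)
    if lr is None:
        lr = 1.0 / np.linalg.norm(D, 2) ** 2   # 1 / largest singular value^2
    w = np.zeros(D.shape[1])
    for _ in range(n_iter):
        w -= lr * (D.T @ (D @ w - t))
        np.clip(w, 0.0, None, out=w)           # fluence cannot be negative
    return w

def allocate_particles(weights, total):
    """Step two: distribute MC histories in proportion to the
    beamlet intensities found in step one."""
    w = np.asarray(weights, dtype=float)
    return np.rint(total * w / w.sum()).astype(int)
```

The payoff of the second step is variance reduction where it matters: high-weight beamlets, which dominate the delivered dose, get proportionally more histories and hence smaller statistical noise.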
The general base in the thymidylate synthase catalyzed proton abstraction.
Ghosh, Ananda K; Islam, Zahidul; Krueger, Jonathan; Abeysinghe, Thelma; Kohen, Amnon
2015-12-14
The enzyme thymidylate synthase (TSase), an important chemotherapeutic drug target, catalyzes the formation of 2'-deoxythymidine-5'-monophosphate (dTMP), a precursor of one of the DNA building blocks. TSase catalyzes a multi-step mechanism that includes the abstraction of a proton from the C5 of the substrate 2'-deoxyuridine-5'-monophosphate (dUMP). Previous studies on ecTSase proposed that an active-site residue, Y94 serves the role of the general base abstracting this proton. However, since Y94 is neither very basic, nor connected to basic residues, nor located close enough to the pyrimidine proton to be abstracted, the actual identity of this base remains enigmatic. Based on crystal structures, an alternative hypothesis is that the nearest potential proton-acceptor of C5 of dUMP is a water molecule that is part of a hydrogen bond (H-bond) network comprised of several water molecules and several protein residues including H147, E58, N177, and Y94. Here, we examine the role of the residue Y94 in the proton abstraction step by removing its hydroxyl group (Y94F mutant). We investigated the effect of the mutation on the temperature dependence of intrinsic kinetic isotope effects (KIEs) and found that these KIEs are more temperature dependent than those of the wild-type enzyme (WT). These results suggest that the phenolic -OH of Y94 is a component of the transition state for the proton abstraction step. The findings further support the hypothesis that no single functional group is the general base, but a network of bases and hydroxyls (from water molecules and tyrosine) sharing H-bonds across the active site can serve the role of the general base to remove the pyrimidine proton.
Matsumoto, Shinnosuke; Koba, Yusuke; Kohno, Ryosuke; Lee, Choonsik; Bolch, Wesley E; Kai, Michiaki
2016-04-01
Proton therapy has the physical advantage of a Bragg peak that can provide a better dose distribution than conventional x-ray therapy. However, radiation exposure of normal tissues cannot be ignored because it is likely to increase the risk of secondary cancer. Evaluating secondary neutrons generated by the interaction of the proton beam with the treatment beam-line structure is necessary; thus, performing the optimization of radiation protection in proton therapy is required. In this research, the organ doses and energy spectra from secondary neutrons were calculated using Monte Carlo simulations. The Monte Carlo code known as the Particle and Heavy Ion Transport code System (PHITS) was used to simulate the transported protons and their interactions with the treatment beam-line structure, which modeled the double scattering body of the treatment nozzle at the National Cancer Center Hospital East. The doses to the organs in a hybrid computational phantom simulating a 5-y-old boy were calculated. In general, secondary neutron doses were found to decrease with increasing distance to the treatment field. Secondary neutron energy spectra were characterized by incident neutrons with three energy peaks: 1×10, 1, and 100 MeV. A block collimator and a patient collimator contributed significantly to organ doses. In particular, the secondary neutrons from the patient collimator were 30 times higher than those from the first scatterer. These results suggested that proactive protection will be required in the design of the treatment beam-line structures and that organ doses from secondary neutrons may be able to be reduced. PMID:26910030
Energy Technology Data Exchange (ETDEWEB)
Pignol, J.-P. [Toronto-Sunnybrook Regional Cancer Centre, Radiotherapy Dept., Toronto, Ontario (Canada); Slabbert, J. [National Accelerator Centre, Faure (South Africa)
2001-02-01
Fast neutrons (FN) have a higher radio-biological effectiveness (RBE) compared with photons; however, the mechanism of this increase remains controversial. RBE variations are seen among various FN facilities, and at the same facility when different tissue depths or thicknesses of hardening filters are used. These variations lead to uncertainties in dose reporting as well as in comparisons of clinical results. Besides radiobiology and microdosimetry, another powerful method for the characterization of FN beams is the calculation of total proton and heavy ion kerma spectra. The FLUKA and MCNP Monte Carlo codes were used to simulate these kerma spectra following a set of microdosimetry measurements performed at the National Accelerator Centre. The calculated spectra confirmed major classical statements: the RBE increase is linked to both low-energy protons and alpha particles yielded by (n,{alpha}) reactions on carbon and oxygen nuclei. The low-energy protons are produced by neutrons with energies between 10 keV and 10 MeV, while the alpha particles are produced by neutrons with energies between 10 keV and 15 MeV. Looking at the heavy ion kerma from neutrons <15 MeV and the proton kerma from neutrons <10 MeV, it is possible to anticipate y* and RBE trends. (author)
International Nuclear Information System (INIS)
Purpose: Proton therapy exhibits several advantages over photon therapy due to the depth-dose distributions from proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system implementing the PG technique for range verification applications in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 x 5 x 50 cm^3 were used for simulation with the Geant4 toolkit, using the standard electromagnetic packages as well as the packages based on the binary-cascade nuclear model. The emitted secondary particles were analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several energy peaks in the range between 0 and 10 MeV. The energy peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emission was found for lower-energy neutrons (<10 MeV) and for photons of all energies, while higher-energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with increasing incident proton energy. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study
GPU based Monte Carlo for PET image reconstruction: parameter optimization
International Nuclear Information System (INIS)
This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method, all the physical effects in a PET system are taken into account, thus superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphical Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required by the iterative reconstruction algorithm, so to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data; this allows the best image quality to be achieved with the least possible computational effort. Based on this research, recommendations can be given for effective partitioning of computational effort into the iterations in limited-time reconstructions. (author)
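The 1/sqrt(N) scaling behind sizing the number of simulated positron decays can be illustrated with a minimal sketch. The paper's actual optimization targets image quality; this only shows the standard statistical sizing rule, and the pilot-run numbers are hypothetical.

```python
import math

def samples_needed(pilot_mean, pilot_std, target_rel_err):
    """Smallest N such that sigma / (mean * sqrt(N)) <= target_rel_err,
    using the 1/sqrt(N) scaling of Monte Carlo statistical error."""
    return math.ceil((pilot_std / (pilot_mean * target_rel_err)) ** 2)

# Hypothetical pilot run: mean voxel estimate 1.0 with standard deviation 1.0.
print(samples_needed(1.0, 1.0, 0.01))  # 10000 decays for ~1% relative error
```

Halving the target error quadruples the required number of simulated decays, which is why sample-count optimization matters for limited-time reconstructions.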
Proton radiography and proton computed tomography based on time-resolved dose measurements.
Testa, Mauro; Verburg, Joost M; Rose, Mark; Min, Chul Hee; Tang, Shikui; Bentefour, El Hassane; Paganetti, Harald; Lu, Hsiao-Ming
2013-11-21
We present a proof-of-principle study of proton radiography and proton computed tomography (pCT) based on time-resolved dose measurements. We used a prototype two-dimensional diode-array detector capable of fast dose rate measurements to acquire proton radiographic images expressed directly in water equivalent path length (WEPL). The technique is based on the time dependence of the dose distribution delivered by a proton beam traversing a range modulator wheel in passive scattering proton therapy systems. The dose rate produced in the medium by such a system is periodic and has a unique pattern in time at each point along the beam path, and thus encodes the WEPL. By measuring the time-dose pattern at the point of interest, the WEPL to this point can be decoded. If one measures the time-dose patterns at points on a plane behind the patient for a beam with sufficient energy to penetrate the patient, the obtained 2D distribution of the WEPL forms an image. The technique requires only a 2D dosimeter array, and it uses only the clinical beam for a fraction of a second with negligible dose to the patient. We first evaluated the accuracy of the technique in determining the WEPL for static phantoms, aiming at beam range verification of the brain fields of medulloblastoma patients. Accurate beam ranges for these fields can significantly reduce the dose to the cranial skin of the patient and thus the risk of permanent alopecia. Second, we investigated the potential of the technique for real-time imaging of a moving phantom. Real-time tumor tracking by proton radiography could provide more accurate validation of tumor motion models due to the proton beam's more sensitive dependence on tissue density compared to x-rays. Our radiographic technique is rapid (~100 ms) and simultaneous over the whole field, so it can image mobile tumors without the interplay effect that is inherently challenging for methods based on pencil beams. Third, we present the reconstructed p
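The WEPL decoding step, matching a measured time-dose pattern against reference patterns, can be sketched as a nearest-match search. The traces and WEPL values below are hypothetical, and the normalized dot product stands in for the authors' actual matching metric, which is an assumption.

```python
import math

def _unit(v):
    """Scale a trace to unit L2 norm (zero traces are left unchanged)."""
    s = math.sqrt(sum(x * x for x in v))
    return [x / s for x in v] if s else list(v)

def decode_wepl(measured, library):
    """Return the WEPL whose reference time-dose trace best matches the
    measured trace, scored by normalized dot product."""
    m = _unit(measured)
    best_wepl, best_score = None, -2.0
    for wepl, ref in library.items():
        score = sum(a * b for a, b in zip(m, _unit(ref)))
        if score > best_score:
            best_wepl, best_score = wepl, score
    return best_wepl

# Hypothetical periodic reference traces, one per WEPL value (cm):
library = {10.0: [1, 0, 0, 1, 1, 0], 15.0: [0, 1, 1, 0, 0, 1]}
print(decode_wepl([0.9, 0.1, 0.0, 1.1, 1.0, 0.0], library))  # 10.0
```

Repeating this lookup at every pixel of the 2D dosimeter array yields the WEPL image described in the abstract.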
Image based Monte Carlo Modeling for Computational Phantom
Cheng, Mengyun; Wang, Wen; Zhao, Kai; Fan, Yanchang; Long, Pengcheng; Wu, Yican
2014-06-01
The evaluation of the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, and is helpful for avoiding unnecessary radiation and decreasing harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of models for Monte Carlo (MC) simulation are tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed by the FDS Team (Advanced Nuclear Energy Research Team, http://www.fds.org.cn) as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can automatically convert CT/segmented sectioned images into computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested on several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created with MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in a Treatment Planning System (TPS), as well as radiation exposure of the human body in radiation protection.
Cell death following BNCT: a theoretical approach based on Monte Carlo simulations.
Ballarini, F; Bakeine, J; Bortolussi, S; Bruschi, P; Cansolino, L; Clerici, A M; Ferrari, C; Protti, N; Stella, S; Zonta, A; Zonta, C; Altieri, S
2011-12-01
In parallel to boron measurements and animal studies, investigations of radiation-induced cell death are also in progress in Pavia, with the aim of better characterising the effects of a BNCT treatment down to the cellular level. Such studies are being carried out not only experimentally but also theoretically, based on a mechanistic model and a Monte Carlo code. The model assumes that: (1) only clustered DNA strand breaks can lead to chromosome aberrations; (2) only chromosome fragments within a certain threshold distance can undergo misrejoining; (3) the so-called "lethal aberrations" (dicentrics, rings and large deletions) lead to cell death. After applying the model to normal cells exposed to monochromatic fields of different radiation types, the irradiation section of the code was purposely extended to mimic cell exposure to the mixed radiation field produced by the (10)B(n,α)(7)Li reaction, which gives rise to alpha particles and Li ions of short range and high biological effectiveness, and by the (14)N(n,p)(14)C reaction, which produces 0.58 MeV protons. Very good agreement between model predictions and literature data was found for human and animal cells exposed to X- or gamma-rays, protons and alpha particles, thus validating the model for cell death induced by monochromatic radiation fields. The model predictions also showed good agreement with experimental data obtained by our group exposing DHD cells to thermal neutrons in the TRIGA Mark II reactor of the University of Pavia; this also validated the model for a BNCT exposure scenario, providing a useful predictive tool to bridge the gap between irradiation and cell death. PMID:21481595
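Under the model's third assumption, survival is governed by lethal aberrations. Assuming they are Poisson-distributed per cell (a common convention in such models, not stated explicitly in the abstract), the surviving fraction for a mixed field is the probability of zero lethal aberrations summed over components. A minimal sketch with hypothetical per-component means:

```python
import math

def survival_fraction(mean_lethal_by_component):
    """Surviving fraction under the lethal-aberration picture: with
    Poisson-distributed lethal aberrations, S = P(0) = exp(-total mean)."""
    return math.exp(-sum(mean_lethal_by_component.values()))

# Hypothetical mean lethal aberrations per cell for a BNCT mixed field:
s = survival_fraction({"alpha": 0.5, "Li": 0.3, "p(14N)": 0.2})
print(round(s, 4))  # exp(-1.0) ≈ 0.3679
```

Summing the component means assumes the field components act independently, which is itself a modeling choice.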
Jie, Binbin; Sah, Chihtang
Pure water has been characterized empirically for nearly a century as dissociating into hydronium (H3O)+ and hydroxide (HO)- ions. Last March, we reported that the ~40-year experimental industrial standard of the chemical equilibrium reaction constant, the ion product, can be accounted for by a statistical-physics-based concentration product of two electrical charge carriers, the positively charged protons, p+, and the negatively charged proton holes or prohols, p-, with a thermal activation energy or proton trapping well depth of E(p+/p-) = 576 meV, in 0-100 °C pure liquid water. We now report that the empirically fitted industrial standard experimental data (1985, 1987, 2005) of the two dc ion mobilities in liquid water can also be accounted for by trapping-limited drift of protons and prohols through proton channels of lower proton electrical potential valleys, Ep+/0 water rule.
Proton-beam writing channel based on an electrostatic accelerator
Lapin, A. S.; Rebrov, V. A.; Kolin'ko, S. V.; Salivon, V. F.; Ponomarev, A. G.
2016-09-01
We have described the structure of the proton-beam writing channel as a continuation of a nuclear scanning microprobe channel. The problem of accurately positioning the probe has been solved by constructing a new high-frequency electrostatic scanning system. Special attention has been paid to designing the probe-forming system, and its various configurations have been considered. The probe-forming system that best corresponds to the conditions of the lithographic process has been found by solving the problem of optimizing proton beam formation. A system for controlling beam scanning using a multifunctional module of integrated programmable logic systems has been developed.
A global reaction route mapping-based kinetic Monte Carlo algorithm.
Mitchell, Izaac; Irle, Stephan; Page, Alister J
2016-07-14
We propose a new on-the-fly kinetic Monte Carlo (KMC) method that is based on exhaustive potential energy surface searching carried out with the global reaction route mapping (GRRM) algorithm. Starting from any given equilibrium state, this GRRM-KMC algorithm performs a one-step GRRM search to identify all surrounding transition states. Intrinsic reaction coordinate pathways are then calculated to identify potential subsequent equilibrium states. Harmonic transition state theory is used to calculate rate constants for all potential pathways, before a standard KMC accept/reject selection is performed. The selected pathway is then used to propagate the system forward in time, which is calculated on the basis of 1st order kinetics. The GRRM-KMC algorithm is validated here in two challenging contexts: intramolecular proton transfer in malonaldehyde and surface carbon diffusion on an iron nanoparticle. We demonstrate that in both cases the GRRM-KMC method is capable of reproducing the 1st order kinetics observed during independent quantum chemical molecular dynamics simulations using the density-functional tight-binding potential. PMID:27421395
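The KMC accept/reject selection and first-order time propagation described above follow the standard rejection-free KMC step, which can be sketched as follows (the rate constants are hypothetical placeholders for the harmonic transition state theory values):

```python
import math
import random

def kmc_step(rates, rng):
    """One kinetic Monte Carlo step: choose pathway i with probability
    k_i / k_tot, then advance time by an exponential waiting time 1/k_tot."""
    k_tot = sum(rates)
    r = rng.random() * k_tot
    chosen = len(rates) - 1  # fallback guards against float rounding
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            chosen = i
            break
    dt = -math.log(1.0 - rng.random()) / k_tot  # first-order kinetics
    return chosen, dt

rng = random.Random(7)
picks = [kmc_step([3.0, 1.0], rng)[0] for _ in range(20000)]
frac = picks.count(0) / len(picks)
print(frac)  # ≈ 0.75 = 3.0 / (3.0 + 1.0)
```

In the GRRM-KMC setting, each entry of `rates` would be the transition-state-theory rate constant of one pathway found by the one-step GRRM search, and the chosen pathway's intrinsic reaction coordinate endpoint becomes the next equilibrium state.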
Proton irradiation of liquid crystal based adaptive optical devices
International Nuclear Information System (INIS)
To assess its radiation hardness, a liquid crystal based adaptive optical element has been irradiated using a 60 MeV proton beam. The device, with the functionality of an optical beam steerer, was characterised before, during and after the irradiation. A systematic set of measurements of the transmission and beam deflection angles was carried out. The measurements showed that the transmission decreased only marginally and that the optical performance degraded only after a very high proton fluence (10^10 p/cm2). The device's functionality as a beam steerer annealed completely, which leads to the conclusion that liquid crystal technology for optical devices is not vulnerable to the proton irradiation levels expected in space.
International Nuclear Information System (INIS)
Proton interaction with an exposed object's material needs to be modeled accounting for three basic processes: electromagnetic stopping of protons in matter, multiple Coulomb scattering and nuclear interactions. The last of these processes is the topic of this paper. Monte Carlo codes are often used to simulate high-energy particle interaction with matter. However, the nuclear interaction models implemented in these codes are rather extensive, and their use in treatment planning systems requires huge computational resources. We selected the IThMC code for its ability to reproduce experiments which measure the distribution of the projected ranges of nuclear secondary particles generated by proton beams in a multi-layer Faraday cup. Multi-layer Faraday cup detectors measure charge rather than dose and allow distinguishing between electromagnetic and nuclear interactions. The event generator used in the IThMC code is faster, but less accurate, than any other tested. Our model of nuclear reactions shows quite good agreement with experiment in the context of its effect on the Bragg peak in therapeutic applications
Clinical results of proton beam therapy for skull base chordoma
International Nuclear Information System (INIS)
Purpose: To evaluate clinical results of proton beam therapy for patients with skull base chordoma. Methods and materials: Thirteen patients with skull base chordoma who were treated with proton beams with or without X-rays at the University of Tsukuba between 1989 and 2000 were retrospectively reviewed. A median total tumor dose of 72.0 Gy (range, 63.0-95.0 Gy) was delivered. The patients were followed for a median period of 69.3 months (range, 14.6-123.4 months). Results: The 5-year local control rate was 46.0%. Cause-specific, overall, and disease-free survival rates at 5 years were 72.2%, 66.7%, and 42.2%, respectively. The local control rate was higher, though without statistical significance, for patients with preoperative tumors <30 mL. Partial or subtotal tumor removal did not yield better local control rates than biopsy alone as the most recent surgery. Conclusion: Proton beam therapy is effective for patients with skull base chordoma, especially for those with small tumors. For a patient with a tumor of <30 mL and no prior treatment, biopsy without tumor removal seems appropriate before proton beam therapy
Smol'janinova, T I; Zhidkov, V A; Sokolov, G V
1982-01-01
Titration curves of the nitrogen bases and the fractions of disordered nucleotide pairs were obtained during DNA protonation. It is shown that purine bases are the first sites of protonation of the DNA double helix. Cytosine protonation is due to a proton-induced conformational transition within GC pairs, with subsequent proton transfer from N-7 of guanine to N-3 of cytosine. Within DNA with unwound regions, the bases are protonated in the following order: cytosine, adenine, guanine. It is shown that GC pairs are the primary centres in which the unwinding of protonated DNAs occurs. PMID:7079177
Monte Carlo-based simulation of dynamic jaws tomotherapy
International Nuclear Information System (INIS)
Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE and previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC-based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For the RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is greater than 30% of the prescription dose (gamma analysis
MCHITS: Monte Carlo based Method for Hyperlink Induced Topic Search on Networks
Directory of Open Access Journals (Sweden)
Zhaoyan Jin
2013-10-01
Hyperlink Induced Topic Search (HITS) is the most authoritative and most widely used personalized ranking algorithm on networks. The HITS algorithm ranks nodes on networks by power iteration, and has a high computational complexity. This paper models the HITS algorithm with the Monte Carlo method, and proposes Monte Carlo based algorithms for the HITS computation. Theoretical analysis and experiments show that Monte Carlo based approximate computing of the HITS ranking substantially reduces computing resources while maintaining high accuracy, and significantly outperforms related work
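For reference, the power-iteration HITS that the Monte Carlo method approximates can be sketched in a few lines. The three-node graph is a toy example, and normalization by the L2 norm is one common convention.

```python
import math

def hits(adj, iters=50):
    """Plain power-iteration HITS. adj maps each node to its list of
    out-neighbors; returns (hub, authority) score dictionaries."""
    nodes = list(adj)
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iters):
        # Authority of n: sum of hub scores of nodes linking to n.
        auth = {n: sum(hub[m] for m in nodes if n in adj[m]) for n in nodes}
        norm = math.sqrt(sum(v * v for v in auth.values())) or 1.0
        auth = {n: v / norm for n, v in auth.items()}
        # Hub of n: sum of authority scores of nodes n links to.
        hub = {n: sum(auth[m] for m in adj[n]) for n in nodes}
        norm = math.sqrt(sum(v * v for v in hub.values())) or 1.0
        hub = {n: v / norm for n, v in hub.items()}
    return hub, auth

# Toy graph: 'a' and 'b' both point to 'c'.
hub, auth = hits({"a": ["c"], "b": ["c"], "c": []})
print(auth["c"], hub["a"])
```

Each full iteration costs work proportional to the number of edges, which is the cost the Monte Carlo approximation in the paper aims to reduce.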
Energy Technology Data Exchange (ETDEWEB)
Incerti, S., E-mail: sebastien.incerti@tdt.edu.vn [Division of Nuclear Physics, Ton Duc Thang University, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Ho Chi Minh City (Viet Nam); Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Barberet, Ph.; Dévès, G.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Francis, Z. [Université Saint Joseph, Science Faculty, Department of Physics, Beirut (Lebanon); Ivantchenko, V. [Ecoanalytica, Moscow (Russian Federation); Geant4 Associates International Ltd, Hebden Bridge (United Kingdom); Mantero, A. [SWHARD srl, via Greto di Cornigliano 6r, 16152 Genova (Italy); El Bitar, Z. [Institut Pluridisciplinaire Hubert Curien, CNRS/IN2P3, 67037 Strasbourg Cedex (France); Bernal, M.A. [Instituto de Física Gleb Wataghin, Universidade Estadual de Campinas, SP (Brazil); Tran, H.N. [Division of Nuclear Physics, Ton Duc Thang University, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Ho Chi Minh City (Viet Nam); Karamitros, M.; Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France)
2015-09-01
The general purpose Geant4 Monte Carlo simulation toolkit is able to simulate radiative and non-radiative atomic de-excitation processes such as fluorescence and Auger electron emission, occurring after interaction of incident ionising radiation with target atomic electrons. In this paper, we evaluate the Geant4 modelling capability for the simulation of fluorescence spectra induced by 1.5 MeV proton irradiation of thin high-Z foils (Fe, GdF{sub 3}, Pt, Au) with potential interest for nanotechnologies and life sciences. Simulation results are compared to measurements performed at the Centre d’Etudes Nucléaires de Bordeaux-Gradignan AIFIRA nanobeam line irradiation facility in France. Simulation and experimental conditions are described and the influence of Geant4 electromagnetic physics models is discussed.
An empirical formula based on Monte Carlo simulation for diffuse reflectance from turbid media
Gnanatheepam, Einstein; Aruna, Prakasa Rao; Ganesan, Singaravelu
2016-03-01
Diffuse reflectance spectroscopy has been widely used in diagnostic oncology and characterization of laser-irradiated tissue. However, an accurate and simple analytical equation for estimating diffuse reflectance from turbid media still does not exist. In this work, a diffuse reflectance lookup table for a range of tissue optical properties was generated using Monte Carlo simulation. Based on the generated Monte Carlo lookup table, an empirical formula for diffuse reflectance was developed using a surface-fitting method. The variance between the Monte Carlo lookup table surface and the surface obtained from the proposed empirical formula is less than 1%. The proposed empirical formula may be used for modeling of diffuse reflectance from tissue.
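The lookup-table stage can be sketched as follows. Since the paper's fitted empirical formula is not given in the abstract, bilinear interpolation over the table stands in for it, and the albedo-like forward model is a toy stand-in for the Monte Carlo simulation; both substitutions are assumptions.

```python
import bisect

def make_table(mu_a_grid, mu_s_grid, forward_model):
    """Tabulate reflectance over a grid of absorption/scattering values."""
    return [[forward_model(a, s) for s in mu_s_grid] for a in mu_a_grid]

def bilinear(mu_a_grid, mu_s_grid, table, a, s):
    """Bilinear lookup-table interpolation, standing in for the paper's
    fitted empirical surface."""
    i = min(max(bisect.bisect_right(mu_a_grid, a) - 1, 0), len(mu_a_grid) - 2)
    j = min(max(bisect.bisect_right(mu_s_grid, s) - 1, 0), len(mu_s_grid) - 2)
    ta = (a - mu_a_grid[i]) / (mu_a_grid[i + 1] - mu_a_grid[i])
    ts = (s - mu_s_grid[j]) / (mu_s_grid[j + 1] - mu_s_grid[j])
    return ((1 - ta) * (1 - ts) * table[i][j] + ta * (1 - ts) * table[i + 1][j]
            + (1 - ta) * ts * table[i][j + 1] + ta * ts * table[i + 1][j + 1])

# Toy albedo-like stand-in for the MC forward model (assumption):
model = lambda mu_a, mu_s: mu_s / (mu_a + mu_s)
A, S = [0.01, 0.05, 0.1, 0.5], [5.0, 10.0, 20.0, 40.0]
tab = make_table(A, S, model)
print(bilinear(A, S, tab, 0.075, 15.0))  # off-grid query
```

An empirical fitted formula trades this table storage and index search for a single closed-form evaluation, which is the practical appeal the abstract describes.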
TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations
Energy Technology Data Exchange (ETDEWEB)
Schuemann, J; Grassberger, C; Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Dowdell, S [Illawarra Shoalhaven Local Health District, Wollongong (Australia)
2014-06-15
Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions, and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), average range difference (ARD) and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80-R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1-2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations, considering total range uncertainties and uncertainties from dose calculation alone, based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain). However, we recommend
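The range metrics used above (R90, R50, and the R80-R20 distal falloff) can be computed from a depth-dose curve by scanning for the distal crossing of each dose level. A minimal sketch with a synthetic trapezoidal curve (not patient data):

```python
def distal_depth(depths, dose, frac):
    """Depth where dose falls to frac * max(dose) on the distal side,
    found by scanning from deep to shallow with linear interpolation."""
    target = frac * max(dose)
    for i in range(len(dose) - 2, -1, -1):
        if dose[i] >= target > dose[i + 1]:
            t = (dose[i] - target) / (dose[i] - dose[i + 1])
            return depths[i] + t * (depths[i + 1] - depths[i])
    return None  # level never crossed on the distal side

# Synthetic depth-dose curve: plateau at 1.0, linear distal falloff.
depths = [0, 1, 2, 3, 4, 5, 6, 7, 8]
dose = [0.2, 0.6, 1.0, 1.0, 1.0, 1.0, 0.5, 0.0, 0.0]
r90 = distal_depth(depths, dose, 0.90)
r80_r20 = distal_depth(depths, dose, 0.20) - distal_depth(depths, dose, 0.80)
print(r90, r80_r20)
```

Comparing these scalars between an analytical algorithm and a Monte Carlo curve gives the range differences (for R90/R50) and distal dose degradation (for R80-R20) analyzed in the abstract.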
Energy Technology Data Exchange (ETDEWEB)
Lindsay, C; Jirasek, A [University of Victoria (Canada)]; Blackmore, E; Hoehr, C; Schaffer, P; Trinczek, M [TRIUMF (Canada)]; Sossi, V [University of British Columbia (Canada)]
2014-08-15
Uveal melanoma is a rare and deadly tumour of the eye, with primary metastases in the liver resulting in an 8% 2-year survival rate upon detection. Large growths, or those in close proximity to the optic nerve, pose a particular challenge to the commonly employed eye-sparing technique of eye-plaque brachytherapy. In these cases, external beam charged particle therapy offers improved odds of avoiding catastrophic side effects such as neuropathy or blindness. Since 1995, the British Columbia Cancer Agency, in partnership with the TRIUMF national laboratory, has offered proton therapy in the treatment of difficult ocular tumors. Having seen 175 patients, yielding 80% globe preservation and 82% metastasis-free survival as of 2010, this modality has proven to be highly effective. Despite this success, there have been few studies into the use of the world's largest cyclotron in patient care. Here we describe the first efforts of modeling the TRIUMF dose delivery system using the FLUKA Monte Carlo package. Details on geometry, estimation of beam parameters, measurement of primary dose and simulation of PET isotope production are discussed. Proton depth dose in both modulated and pristine beams is successfully simulated to sub-millimeter precision in range (within limits of measurement) and 2% agreement with measurement within a treatment volume. With the goal of using PET signals for in vivo dosimetry (alignment), a first look at PET isotope depth distribution is presented, comparing favourably to a naive method of approximating simulated PET slice activity in a Lucite phantom.
Burris-Mog, Trevor J.
The interaction of intense laser light (I > 10^18 W/cm2) with a thin target foil leads to the Target Normal Sheath Acceleration (TNSA) mechanism. TNSA is responsible for the generation of high-current, ultra-low emittance proton beams, which may allow for the development of a compact and cost-effective proton therapy system for the treatment of cancer. Before this application can be realized, control is needed over the large divergence and the 100% kinetic energy spread that are characteristic of TNSA proton beams. The work presented here demonstrates control over the divergence and energy spread using strong magnetic fields generated by a pulse power solenoid. The solenoidal field results in a parallel proton beam with a kinetic energy spread ΔE/E = 10%. Assuming that next generation lasers will be able to operate at 10 Hz, the 10% spread in the kinetic energy along with the 23% capture efficiency of the solenoid yield enough protons per laser pulse to, for the first time, consider applications in Radiation Oncology. Current lasers can generate proton beams with kinetic energies up to 67.5 MeV, but for therapy applications the proton kinetic energy must reach 250 MeV. Since the maximum kinetic energy Emax of the protons scales with laser light intensity as Emax ∝ I^0.5, next generation lasers may very well accelerate 250 MeV protons. As the kinetic energy of the protons is increased, the magnetic field strength of the solenoid will need to increase. The scaling of the magnetic field B with the kinetic energy E of the protons follows B ∝ E^1/2. Therefore, the field strength of the solenoid presented in this work will need to be increased by a factor of 2.4 in order to accommodate 250 MeV protons. This scaling factor seems reasonable, even with present technology. This work not only demonstrates control over beam divergence and energy spread, it also allows us to perform feasibility studies to further research what a laser-based proton therapy system
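The quoted scalings can be checked with a few lines of arithmetic. The intensity ratio follows directly from Emax ∝ I^0.5; the solenoid design energy implied by the factor of 2.4 under B ∝ E^1/2 is our own back-of-envelope inference, and a fully relativistic B ∝ p scaling from 67.5 MeV is included for comparison.

```python
import math

M_P = 938.272  # proton rest mass in MeV/c^2

def intensity_scale(e_new, e_old):
    """Required laser intensity ratio, from Emax ∝ I^0.5."""
    return (e_new / e_old) ** 2

def momentum(kinetic_mev):
    """Relativistic proton momentum in MeV/c."""
    total = kinetic_mev + M_P
    return math.sqrt(total ** 2 - M_P ** 2)

i_ratio = intensity_scale(250.0, 67.5)    # ≈ 13.7x the intensity
b_sqrt = math.sqrt(250.0 / 43.4)          # ≈ 2.4 if designed near 43 MeV
b_rel = momentum(250.0) / momentum(67.5)  # ≈ 2.0 under B ∝ p from 67.5 MeV
print(round(i_ratio, 1), round(b_sqrt, 2), round(b_rel, 2))
```

The ~43 MeV figure is only what the abstract's own B ∝ E^1/2 rule and quoted factor of 2.4 jointly imply, not a number stated in the source.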
Ulmer, W.; Schaffner, B.
2010-01-01
We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei, as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher-energy depth dose curves, is modeled by inclusion of two different origins: 1. Secondary reaction protons with a contribution of ca. 65% of the buildup (for monoenergetic protons). 2. Landau tails as well as Gaussian-type fluctuations for range straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked by them. Furthermore, there are a few parameters which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline - the most important b...
Eisenhart, Thomas T; Howland, William C; Dempsey, Jillian L
2016-08-18
The proton-coupled electron transfer (PCET) oxidation of p-aminophenol in acetonitrile was initiated via stopped-flow rapid mixing and spectroscopically monitored. For oxidation by ferrocenium in the presence of 7-(dimethylamino)quinoline proton acceptors, both the electron transfer and proton transfer components could be optically monitored in the visible region; the decay of the ferrocenium absorbance is readily monitored (λmax = 620 nm), and the absorbance of the 2,4-substituted 7-(dimethylamino)quinoline derivatives (λmax = 370-392 nm) red-shifts substantially (ca. 70 nm) upon protonation. Spectral analysis revealed the reaction proceeds via a stepwise electron transfer-proton transfer process, and modeling of the kinetics traces monitoring the ferrocenium and quinolinium signals provided rate constants for elementary proton and electron transfer steps. As the pKa values of the conjugate acids of the 2,4-R-7-(dimethylamino)quinoline derivatives employed were readily tuned by varying the substituents at the 2- and 4-positions of the quinoline backbone, the driving force for proton transfer was systematically varied. Proton transfer rate constants (kPT,2 = (1.5-7.5) × 10^8 M^-1 s^-1, kPT,4 = (0.55-3.0) × 10^7 M^-1 s^-1) were found to correlate with the pKa of the conjugate acid of the proton acceptor, in agreement with anticipated free energy relationships for proton transfer processes in PCET reactions. PMID:27500804
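The free-energy relationship noted above (log kPT correlating with the acceptor's conjugate-acid pKa) amounts to a Brønsted-type linear fit. The data points below are hypothetical illustrations, not the paper's values.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical points: pKa of each acceptor's conjugate acid vs
# log10 of the proton transfer rate constant.
pka = [10.0, 12.0, 14.0]
logk = [8.5, 8.0, 7.5]
slope, intercept = linear_fit(pka, logk)
print(slope, intercept)  # -0.25 11.0
```

The fitted slope plays the role of a Brønsted coefficient, quantifying how strongly the proton transfer rate responds to the driving force set by the acceptor's basicity.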
Ulmer, W
2010-01-01
We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher energy depth dose curves, is modeled by inclusion of two different origins: 1. Secondary reaction protons with a contribution of ca. 65 % of the buildup (for monoenergetic protons). 2. Landau tails as well as Gaussian type of fluctuations for range straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked by them. Furthermore, there are a few parameters, which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline - the most important b...
Proton radiotherapy in management of pediatric base of skull tumors
International Nuclear Information System (INIS)
Purpose: Primary skull base tumors of the developing child are rare and present a formidable challenge to both surgeons and radiation oncologists. Gross total resection with negative margins is rarely achieved, and the risks of functional, structural, and cosmetic deficits limit the radiation dose using conventional radiation techniques. Twenty-nine children and adolescents treated with conformal proton radiotherapy (proton RT) were analyzed to assess treatment efficacy and safety. Methods and Materials: Between July 1992 and April 1999, 29 patients with mesenchymal tumors underwent fractionated proton (13 patients) or fractionated combined proton and photon (16 patients) irradiation. The age at treatment ranged from 1 to 19 years (median 12); 14 patients were male and 15 female. Tumors were grouped as malignant or benign. Twenty patients had malignant histologic findings, including chordoma (n=10), chondrosarcoma (n=3), rhabdomyosarcoma (n=4), and other sarcomas (n=3). Target doses ranged between 50.4 and 78.6 Gy/cobalt Gray equivalent (CGE), delivered at doses of 1.8-2.0 Gy/CGE per fraction. The benign histologic findings included giant cell tumors (n=6), angiofibromas (n=2), and chondroblastoma (n=1). RT doses for this group ranged from 45.0 to 71.8 Gy/CGE. Despite maximal surgical resection, 28 (97%) of 29 patients had gross disease at the time of proton RT. Follow-up after proton RT ranged from 13 to 92 months (mean 40). Results: Of the 20 patients with malignant tumors, 5 (25%) had local failure; 1 patient had failure in the surgical access route and 3 patients developed distant metastases. Seven patients had died of progressive disease at the time of analysis. Local tumor control was maintained in 6 (60%) of 10 patients with chordoma, 3 (100%) of 3 with chondrosarcoma, 4 (100%) of 4 with rhabdomyosarcoma, and 2 (66%) of 3 with other sarcomas. The actuarial 5-year local control and overall survival rates were 72% and 56%, respectively, and the overall survival
Obot, I. B.; Kaya, Savaş; Kaya, Cemal; Tüzün, Burak
2016-06-01
DFT and Monte Carlo simulations were performed on three Schiff bases, namely 4-(4-bromophenyl)-N‧-(4-methoxybenzylidene)thiazole-2-carbohydrazide (BMTC), 4-(4-bromophenyl)-N‧-(2,4-dimethoxybenzylidene)thiazole-2-carbohydrazide (BDTC), and 4-(4-bromophenyl)-N‧-(4-hydroxybenzylidene)thiazole-2-carbohydrazide (BHTC), recently studied as corrosion inhibitors for steel in acid medium. Electronic parameters relevant to their inhibition activity, such as EHOMO, ELUMO, energy gap (ΔE), hardness (η), softness (σ), absolute electronegativity (χ), proton affinity (PA), and nucleophilicity (ω), were computed and discussed. Monte Carlo simulations were applied to search for the most stable configuration and adsorption energies for the interaction of the inhibitors with the Fe (110) surface. The theoretical data obtained are in most cases in agreement with experimental results.
Excited States of Proton-bound DNA/RNA Base Homo-dimers: Pyrimidines
Féraud, Géraldine; Dedonder, Claude; Jouvet, Christophe; Pino, Gustavo A
2015-01-01
We present the electronic photofragment spectra of the protonated pyrimidine DNA base homo-dimers. Only the thymine dimer exhibits a well-structured vibrational progression, while the protonated monomer shows broad vibrational bands. This shows that proton bonding can block some non-radiative processes present in the monomer.
Spot-Scanning-Based Proton Therapy for Extracranial Chordoma
Energy Technology Data Exchange (ETDEWEB)
Staab, Adrian, E-mail: adrian.staab@psi.ch [Center for Proton Therapy, Paul Scherrer Institute, Villigen (Switzerland); Rutz, Hans Peter; Ares, Carmen; Timmermann, Beate; Schneider, Ralf; Bolsi, Alessandra; Albertini, Francesca; Lomax, Antony; Goitein, Gudrun; Hug, Eugen [Center for Proton Therapy, Paul Scherrer Institute, Villigen (Switzerland)
2011-11-15
Purpose: To evaluate effectiveness and safety of spot-scanning-based proton-radiotherapy (PT) for extracranial chordomas (ECC). Methods and Materials: Between 1999-2006, 40 patients with chordoma of C-, T-, and L-spine and sacrum were treated at Paul Scherrer Institute (PSI) with PT using spot-scanning. Median patient age was 58 years (range, 10-81 years); 63% were male, and 36% were female. Nineteen patients (47%) had gross residual disease (mean 69 cc; range, 13-495 cc) before PT, and 21 patients (53%) had undergone prior titanium-based surgical stabilization (SS) and reconstruction of the axial skeleton. Proton doses were expressed as Gy(RBE). A conversion factor of 1.1 was used to account for the higher relative biological effectiveness (RBE) of protons compared with photons. Mean total dose was 72.5 Gy(RBE) [range, 59.4-75.2 Gy(RBE)] delivered at 1.8-2.0 Gy(RBE) dose per fraction. Median follow-up time was 43 months. Results: In 19 patients without surgical stabilization, the actuarial local control (LC) rate at 5 years was 100%. LC for patients with gross residual disease but without surgical stabilization was also 100% at 5 years. In contrast, 12 failures occurred in 21 patients with SS, yielding a significantly decreased 5-year LC rate of 30% (p = 0.0003). For the entire cohort, 5-year LC rates were 62%, disease-free survival rates were 57%, and overall survival rates were 80%. Rates were 100% for patients without SS. No other factor, including the dosimetric parameters (V95, V80), was predictive for tumor control on univariate analysis. Conclusion: Spot-scanning-based PT at PSI delivered subsequently to function-preserving surgery for tumor debulking, decompression of spinal cord, or biopsy only is safe and highly effective in patients with ECC without major surgical instrumentation, even in view of large, unresectable disease.
Implementation of a Monte Carlo based inverse planning model for clinical IMRT with MCNP code
He, Tongming Tony
In IMRT inverse planning, inaccurate dose calculations and limitations in optimization algorithms introduce both systematic and convergence errors to treatment plans. The goal of this work is to practically implement a Monte Carlo based inverse planning model for clinical IMRT. The intention is to minimize both types of error in inverse planning and obtain treatment plans with better clinical accuracy than non-Monte Carlo based systems. The strategy is to calculate the dose matrices of small beamlets by using a Monte Carlo based method. Optimization of beamlet intensities then follows, based on the calculated dose data, using an optimization algorithm that is capable of escaping from local minima and prevents possible premature convergence. The MCNP 4B Monte Carlo code is improved to perform fast particle transport and dose tallying in lattice cells by adopting a selective transport and tallying algorithm. Efficient dose matrix calculation for small beamlets is made possible by adopting a scheme that allows concurrent calculation of multiple beamlets of a single port. A finite-sized point source (FSPS) beam model is introduced for easy and accurate beam modeling. A DVH-based objective function and a parallel platform based algorithm are developed for the optimization of intensities. The calculation accuracy of the improved MCNP code and FSPS beam model is validated by dose measurements in phantoms. Agreements better than 1.5% or 0.2 cm have been achieved. Applications of the implemented model to clinical cases of brain, head/neck, lung, spine, pancreas and prostate have demonstrated the feasibility and capability of Monte Carlo based inverse planning for clinical IMRT. Dose distributions of selected treatment plans from a commercial non-Monte Carlo based system are evaluated in comparison with Monte Carlo based calculations. Systematic errors of up to 12% in tumor doses and up to 17% in critical structure doses have been observed. The clinical importance of Monte Carlo based
Parodi, K; Kraemer, M; Sommerer, F; Naumann, J; Mairani, A; Brons, S
2010-01-01
Scanned ion beam delivery promises superior flexibility and accuracy for highly conformal tumour therapy in comparison to the usage of passive beam shaping systems. The attainable precision demands correct overlapping of the pencil-like beams which build up the entire dose distribution in the treatment field. In particular, improper dose application due to deviations of the lateral beam profiles from the nominal planning conditions must be prevented via appropriate beam monitoring in the beamline, prior to the entrance in the patient. To assess the necessary tolerance thresholds of the beam monitoring system at the Heidelberg Ion Beam Therapy Center, Germany, this study has investigated several worst-case scenarios for a sensitive treatment plan, namely scanned proton and carbon ion delivery to a small target volume at a shallow depth. Deviations from the nominal lateral beam profiles were simulated, which may occur because of misaligned elements or changes of the beam optic in the beamline. Data have been an...
Application of Photon Transport Monte Carlo Module with GPU-based Parallel System
Energy Technology Data Exchange (ETDEWEB)
Park, Chang Je [Sejong University, Seoul (Korea, Republic of); Shon, Heejeong [Golden Eng. Co. LTD, Seoul (Korea, Republic of); Lee, Donghak [CoCo Link Inc., Seoul (Korea, Republic of)
2015-05-15
In general, it takes a lot of computing time to obtain reliable results from Monte Carlo simulations, especially in deep-penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. Simultaneously, advanced computing hardware such as GPU (Graphics Processing Units)-based parallel machines is used to improve the performance of the Monte Carlo simulation. A GPU is much easier to access and manage than a CPU cluster system, and it has also become less expensive thanks to advances in computer technology. Many engineering areas have therefore adopted GPU-based massively parallel computation techniques, including the photon transport Monte Carlo method presented here. It provides an almost 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of the Monte Carlo module, which requires quick and accurate simulations.
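The variance-reduction techniques the abstract lists share a common structure: they trade particle histories for statistical weight while keeping the estimator unbiased. As an illustration (the weight-window thresholds below are made up for the example, not taken from the cited module), a minimal sketch of Russian roulette is:

```python
import random

def russian_roulette(weight, w_low=0.25, w_survive=1.0):
    """Russian roulette: a particle below the lower weight-window bound
    survives with probability weight/w_survive (its weight raised to
    w_survive) and is otherwise killed. The expected weight is unchanged,
    which is what keeps the Monte Carlo estimate unbiased."""
    if weight >= w_low:
        return weight          # inside the window: no action
    if random.random() < weight / w_survive:
        return w_survive       # survives with boosted weight
    return 0.0                 # killed

# Unbiasedness check: the average surviving weight equals the input weight.
random.seed(1)
w_in = 0.1
mean = sum(russian_roulette(w_in) for _ in range(200_000)) / 200_000
assert abs(mean - w_in) < 0.01
```

Geometry splitting is the mirror image: a particle entering an important region is split into several copies, each carrying a proportionally reduced weight.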
Application of Photon Transport Monte Carlo Module with GPU-based Parallel System
International Nuclear Information System (INIS)
In general, it takes a lot of computing time to obtain reliable results from Monte Carlo simulations, especially in deep-penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. Simultaneously, advanced computing hardware such as GPU (Graphics Processing Units)-based parallel machines is used to improve the performance of the Monte Carlo simulation. A GPU is much easier to access and manage than a CPU cluster system, and it has also become less expensive thanks to advances in computer technology. Many engineering areas have therefore adopted GPU-based massively parallel computation techniques, including the photon transport Monte Carlo method presented here. It provides an almost 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of the Monte Carlo module, which requires quick and accurate simulations.
Pietrzak, Robert; Konefał, Adam; Sokół, Maria; Orlef, Andrzej
2016-08-01
The success of proton therapy depends strongly on the precision of treatment planning. Dose distribution in biological tissue may be obtained from Monte Carlo simulations using various scientific codes, making it possible to perform very accurate calculations. However, there are many factors affecting the accuracy of modeling. One of them is the structure of the objects, called bins, registering a dose. In this work the influence of bin structure on the dose distributions was examined. The MCNPX code calculations of the Bragg curve for the 60 MeV proton beam were done in two ways: using simple logical detectors, i.e. volumes determined in water, and using a precise model of the ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in the water phantom with a Markus ionization chamber. The average local dose difference between the measured relative doses in the water phantom and those calculated by means of the logical detectors was 1.4% over the first 25 mm, whereas over the full depth range this difference was 1.6%, for a maximum uncertainty in the calculations of less than 2.4% and a maximum measuring error of 1%. In the case of the relative doses calculated with the ionization chamber model, this average difference was somewhat greater, being 2.3% at depths up to 25 mm and 2.4% over the full range of depths, for a maximum uncertainty in the calculations of 3%. In the dose calculations the ionization chamber model does not offer any additional advantages over the logical detectors. The results provided by both models are similar and in good agreement with the measurements; however, the logical detector approach is the more time-effective method.
A Monte-Carlo-Based Network Method for Source Positioning in Bioluminescence Tomography
Zhun Xu; Xiaolei Song; Xiaomeng Zhang; Jing Bai
2007-01-01
We present an approach based on the improved Levenberg-Marquardt (LM) algorithm for backpropagation (BP) neural networks to estimate the light source position in bioluminescent imaging. For solving the forward problem, the table-based random sampling algorithm (TBRS), a fast Monte Carlo simulation method ...
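The Levenberg-Marquardt update the abstract refers to damps the Gauss-Newton step so it interpolates between Gauss-Newton (small λ) and gradient descent (large λ). A minimal sketch on a toy curve-fitting problem (not the network or data of the paper) is:

```python
import numpy as np

def lm_step(theta, residual_fn, jac_fn, lam):
    """One Levenberg-Marquardt update:
    theta <- theta - (J^T J + lam*I)^-1 J^T r."""
    r = residual_fn(theta)
    J = jac_fn(theta)
    A = J.T @ J + lam * np.eye(len(theta))
    return theta - np.linalg.solve(A, J.T @ r)

# Toy problem: fit y = a*x + b to noiseless data. The residuals are
# linear in theta, so the iteration converges in a few steps.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 0.5
residual = lambda th: th[0] * x + th[1] - y
jacobian = lambda th: np.stack([x, np.ones_like(x)], axis=1)

theta = np.zeros(2)
for _ in range(20):
    theta = lm_step(theta, residual, jacobian, lam=1e-3)
assert np.allclose(theta, [2.0, 0.5], atol=1e-6)
```

In the "improved" LM variants used for network training, λ is typically adapted per iteration: decreased after a step that lowers the error and increased otherwise.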
Magro, G; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Ferrari, A; Valvo, F; Fossati, P; Ciocca, M
2015-01-01
This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS), in optimising proton pencil beam dose distributions for small targets of different sizes (5–30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS calculated and MC simulated values were mainly due to the different lateral spread modeling and resulted in being related to the field-to-spot size r...
Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.
2015-09-01
This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases but very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.
Monte-Carlo based prediction of radiochromic film response for hadrontherapy dosimetry
International Nuclear Information System (INIS)
A model has been developed to calculate MD-55-V2 radiochromic film response to ion irradiation. This model is based on photon film response and film saturation by high local energy deposition computed by Monte-Carlo simulation. We have studied the response of the film to photon irradiation and we proposed a calculation method for hadron beams.
Shape based Monte Carlo code for light transport in complex heterogeneous tissues
Margallo-Balbás, E.; French, P.J.
2007-01-01
A Monte Carlo code for the calculation of light transport in heterogeneous scattering media is presented together with its validation. Triangle meshes are used to define the interfaces between different materials, in contrast with techniques based on individual volume elements. This approach allows
Xue, Pengchong; Chen, Peng; Jia, Junhui; Xu, Qiuxia; Sun, Jiabao; Yao, Boqi; Zhang, Zhenqi; Lu, Ran
2014-03-11
A triphenylamine-based benzoxazole derivative exhibits a low contrast piezofluorochromic behavior under external pressure, and a high-contrast fluorescence change induced by protonation can be observed.
Choudhury, D K
2013-01-01
We construct a model for double parton distribution functions (dPDFs) based on the notion of self-similarity, pursued earlier for small-x physics at HERA. The most general form of the dPDFs contains a total of thirteen parameters to be fitted from proton-proton collision data at the LHC. It is shown that the constructed dPDF does not factorize into two single PDFs, in conformity with QCD expectations, and that it satisfies the condition that the dPDF vanishes at the kinematic boundary $x_{1}+x_{2}=1$ (where $x_{1}$ and $x_{2}$ are the longitudinal fractional momenta of the two partons).
New memory devices based on the proton transfer process.
Wierzbowska, Małgorzata
2016-01-01
Memory devices operating due to the fast proton transfer (PT) process are proposed by means of first-principles calculations. Writing information is performed using the electrostatic potential of scanning tunneling microscopy (STM). Reading information is based on the effect of the local magnetization induced at the zigzag graphene nanoribbon (Z-GNR) edge, saturated with oxygen or the hydroxy group, and can be realized with the use of giant magnetoresistance (GMR), a magnetic tunnel junction or spin-transfer torque devices. The energetic barriers for the hop forward and backward processes can be tuned by the distance and potential of the STM tip; this thus enables us to tailor the non-volatile logic states. The proposed system enables very dense packing of the logic cells and could be used in random access and flash memory devices. PMID:26596910
New memory devices based on the proton transfer process
Wierzbowska, Małgorzata
2016-01-01
Memory devices operating due to the fast proton transfer (PT) process are proposed by the means of first-principles calculations. Writing information is performed using the electrostatic potential of scanning tunneling microscopy (STM). Reading information is based on the effect of the local magnetization induced at the zigzag graphene nanoribbon (Z-GNR) edge—saturated with oxygen or the hydroxy group—and can be realized with the use of giant magnetoresistance (GMR), a magnetic tunnel junction or spin-transfer torque devices. The energetic barriers for the hop forward and backward processes can be tuned by the distance and potential of the STM tip; this thus enables us to tailor the non-volatile logic states. The proposed system enables very dense packing of the logic cells and could be used in random access and flash memory devices.
Eskers and other evidence of wet-based glaciation in Phlegra Montes, Mars.
Gallagher, Colman; Balme, Matt
2016-04-01
Although glacial landsystems produced under warm/wet based conditions are very common on Earth, glaciological and landform evidence indicates that glaciation on Mars during the Amazonian period (3 Ga to present) has been characterised by cold/dry based glaciers, consistent with the prevailing cold, hyperarid conditions. However, this presentation describes a system of sinuous ridges, interpreted as eskers (1), emerging from the degraded piedmont terminus of a Late Amazonian (˜150 Ma) glacier in the southern Phlegra Montes region of Mars. This is probably the first identification of martian eskers that can be directly linked to their parent glacier. Together with their contextual landform assemblage, the eskers are indicative of glacial melting and subglacial meltwater routing but the confinement of the system to a well-defined, regionally significant graben, and the absence of eskers elsewhere in the region, suggests that melting was a response to locally enhanced geothermal heat flux, rather than regional, climate-induced warming. Now, however, new observations reveal the presence of many assemblages of glacial abrasion forms and associated channels that could be evidence of more widespread wet-based glaciation in Phlegra Montes, including the collapse of several distinct ice domes. This landform assemblage has not been described in other glaciated, mid-latitude regions of the martian northern hemisphere. Moreover, Phlegra Montes are flanked by lowlands displaying evidence of extensive volcanism, including contact between plains lava and piedmont glacial ice. These observations suggest that the glaciation of Phlegra Montes might have been strongly conditioned by both volcanism and more restricted forms of ground-heating. These are important new insights both to the forcing of glacial dynamic and melting behaviour on Mars by factors other than climate and to the production of liquid water on Mars during the Late Amazonian. (1) Gallagher, C. and Balme, M. (2015
A Muon Source Proton Driver at JPARC-based Parameters
Energy Technology Data Exchange (ETDEWEB)
Neuffer, David [Fermilab
2016-06-01
An "ultimate" high intensity proton source for neutrino factories and/or muon colliders was projected to be a ~4 MW multi-GeV proton source providing short, intense proton pulses at ~15 Hz. The JPARC ~1 MW accelerators provide beam at parameters that in many respects overlap these goals. Proton pulses from the JPARC Main Ring can readily meet the pulsed intensity goals. We explore these parameters, describing the overlap and consider extensions that may take a JPARC-like facility toward this "ultimate" source. JPARC itself could serve as a stage 1 source for such a facility.
Jeraj, Robert; Keall, Paul
2000-12-01
The effect of the statistical uncertainty, or noise, in inverse treatment planning for intensity modulated radiotherapy (IMRT) based on Monte Carlo dose calculation was studied. Sets of Monte Carlo beamlets were calculated to give uncertainties at Dmax ranging from 0.2% to 4% for a lung tumour plan. The weights of these beamlets were optimized using a previously described procedure based on a simulated annealing optimization algorithm. Several different objective functions were used. It was determined that the use of Monte Carlo dose calculation in inverse treatment planning introduces two errors in the calculated plan. In addition to the statistical error due to the statistical uncertainty of the Monte Carlo calculation, a noise convergence error also appears. For the statistical error it was determined that apparently successfully optimized plans with a noisy dose calculation (3% 1σ at Dmax), which satisfied the required uniformity of the dose within the tumour, showed as much as 7% underdose when recalculated with a noise-free dose calculation. The statistical error is larger towards the tumour and is only weakly dependent on the choice of objective function. The noise convergence error appears because the optimum weights are determined using a noisy calculation, which is different from the optimum weights determined for a noise-free calculation. Unlike the statistical error, the noise convergence error is generally larger outside the tumour, is case dependent and strongly depends on the required objectives.
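The noise convergence error described above can be reproduced with a deliberately minimal toy (two beamlets, a closed-form optimum instead of the paper's simulated annealing; all numbers are illustrative): optimising weights against a noisy realisation of the beamlet doses yields a plan that misses the target when re-evaluated with the noise-free doses.

```python
import random

random.seed(0)

true_beamlet = [1.0, 0.5]   # noise-free dose per unit weight, per beamlet
target = 1.0                # prescribed dose

def optimal_weight(b):
    """Minimiser of (w*b0 + w*b1 - target)^2 for a common weight w."""
    return target / (b[0] + b[1])

w_exact = optimal_weight(true_beamlet)

# Optimise against beamlet doses carrying 3% (1 sigma) statistical noise,
# then evaluate the resulting plan with the noise-free doses.
noisy = [b * random.gauss(1.0, 0.03) for b in true_beamlet]
w_noisy = optimal_weight(noisy)
delivered = w_noisy * sum(true_beamlet)

# The noisy optimum differs from the true optimum, so the delivered
# dose misses the prescription by an amount set by the noise level.
assert w_noisy != w_exact
assert 0 < abs(delivered - target) < 0.2
```

The statistical error, by contrast, would show up even with the correct weights, simply because the recorded dose values themselves fluctuate.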
Directory of Open Access Journals (Sweden)
Biniam Tesfamicael
2016-03-01
Purpose: The main purpose of this study was to monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate a proton therapy of prostate cancer. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm3 Delrin® blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and the same vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were implemented to extract the energy deposited in each fiber and inside the scintillating block. Results: The transverse dose distributions from the detected secondary particles in both cases are symmetric and agree to within <3.6%. The energy deposited gradually increases as one moves from the peripheral row of fibers towards the center of the block (aligned with the center of the prostate) by a factor of approximately 5. The energy deposited was also observed to decrease as one goes from the frontal to the distal region of the block. The ratio of the energy deposited in the prostate to the energy deposited in the middle two rows of fibers showed a linear relationship with a slope of (-3.55 ± 2.26) × 10-5 MeV per treatment Gy delivered. The distal detectors recorded a negligible amount of energy deposited due to higher attenuation of the secondary particles by the water in that direction. Conclusion: With a good calibration and with the ability to define a good correlation between the radiation flux recorded by the external fibers and the dose delivered to the prostate, such fibers can be used for real-time dose verification to the target. The system was also observed to respond to the series of Bragg Peaks used to generate the
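The linear relationship the abstract reports (a slope with its uncertainty) is what an ordinary least-squares fit of deposited-energy ratio against delivered dose would produce. A self-contained sketch with made-up readings (the numbers are not the paper's data):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = slope*x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical readings: energy-deposit ratio vs delivered dose (Gy),
# chosen to lie exactly on the line ratio = 10.2 - 0.2 * dose.
dose = [1.0, 2.0, 3.0, 4.0]
ratio = [10.0, 9.8, 9.6, 9.4]
slope, intercept = linear_fit(dose, ratio)
assert abs(slope - (-0.2)) < 1e-9
assert abs(intercept - 10.2) < 1e-9
```

With real, noisy detector data one would also propagate the residual scatter into an uncertainty on the slope, which is how a value like (-3.55 ± 2.26) × 10-5 arises.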
Energy Technology Data Exchange (ETDEWEB)
Dowdell, Stephen; Paganetti, Harald [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States); Grassberger, Clemens [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 and Centre for Proton Therapy, Paul Scherrer Institut, 5232 Villigen-PSI (Switzerland)
2013-12-15
Purpose: To compare motion effects in intensity modulated proton therapy (IMPT) lung treatments with different levels of intensity modulation. Methods: Spot-scanning IMPT treatment plans were generated for ten lung cancer patients for 2.5 Gy(RBE) and 12 Gy(RBE) fractions and two distinct energy-dependent spot sizes (σ ∼ 8-17 mm and ∼2-4 mm). IMPT plans were generated with the target homogeneity of each individual field restricted to <20% (IMPT20%). These plans were compared to full IMPT (IMPTfull), which had no restriction on the single-field homogeneity. 4D Monte Carlo simulations were performed upon the patient 4DCT geometry, including deformable image registration and incorporating the detailed timing structure of the proton delivery system. Motion effects were quantified via comparison of the results of the 4D simulations (4D-IMPT20%, 4D-IMPTfull) with those of a 3D Monte Carlo simulation (3D-IMPT20%, 3D-IMPTfull) upon the planning CT using the equivalent uniform dose (EUD), V95 and D1-D99. The effects in normal lung were quantified using mean lung dose (MLD) and V90%. Results: For 2.5 Gy(RBE), the mean EUD for the large spot size is 99.9% ± 2.8% for 4D-IMPT20% compared to 100.1% ± 2.9% for 4D-IMPTfull. The corresponding values are 88.6% ± 8.7% (4D-IMPT20%) and 91.0% ± 9.3% (4D-IMPTfull) for the smaller spot size. The EUD value is higher in 69.7% of the considered deliveries for 4D-IMPTfull. The V95 is also higher in 74.7% of the plans for 4D-IMPTfull, implying that IMPTfull plans experience less underdose compared to IMPT20%. However, the target dose homogeneity is improved in the majority (67.8%) of plans for 4D-IMPT20%. The higher EUD and V95 suggest that the degraded homogeneity in IMPTfull is actually due to the introduction of hot spots in the target volume, perhaps resulting from the sharper in-target dose gradients. The
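The EUD metric used above is commonly computed as the generalised equivalent uniform dose (Niemierko form); whether the paper uses exactly this form is an assumption here, and the voxel doses and exponent below are illustrative only:

```python
def gEUD(doses, a):
    """Generalised equivalent uniform dose: (mean(d_i^a))^(1/a).
    Large negative a approaches the minimum dose (appropriate for
    tumours); large positive a approaches the maximum (serial organs)."""
    n = len(doses)
    return (sum(d ** a for d in doses) / n) ** (1.0 / a)

# A uniform dose maps to itself, regardless of the exponent.
uniform = [2.0] * 100
assert abs(gEUD(uniform, -10) - 2.0) < 1e-12

# A single cold voxel pulls the tumour gEUD well below the mean dose,
# which is why EUD is sensitive to the underdosing discussed above.
cold_spot = [2.0] * 99 + [1.0]
assert gEUD(cold_spot, -10) < sum(cold_spot) / len(cold_spot)
```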
Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT
Weinmann Martin; Söhn Matthias; Muzik Jan; Sikora Marcin; Alber Markus
2009-01-01
Background: The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods: A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase static CT, ...
Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT
Sikora, Marcin; Muzik, Jan; Söhn, Matthias; Weinmann, Martin; Alber, Markus
2009-01-01
Background: The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods: A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase static CT, density o...
Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT
Sikora, Marcin Pawel; Muzik, Jan; Söhn, Matthias; Weinmann, Martin; Alber, Markus
2009-01-01
Background: The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods: A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase ...
Effects of CT based Voxel Phantoms on Dose Distribution Calculated with Monte Carlo Method
Chen, Chaobin; Huang, Qunying; Wu, Yican
2005-04-01
A few CT-based voxel phantoms were produced to investigate the sensitivity of Monte Carlo simulations of x-ray and electron beams to the proportions of elements and the mass densities of the materials used to represent the patient's anatomical structure. The human body can be well outlined by air, lung, adipose, muscle, soft bone and hard bone to calculate the dose distribution with the Monte Carlo method. The effects of the calibration curves established by using various CT scanners are not clinically significant based on our investigation. The deviation from the values of the cumulative dose volume histogram derived from CT-based voxel phantoms is less than 1% for the given target.
Caldwell, Allen
2015-01-01
Based on current CERN infrastructure, an electron-proton collider is proposed at a centre-of-mass energy of about 9 TeV. A 7 TeV LHC bunch is used as the proton driver to create a plasma wakefield which then accelerates electrons to 3 TeV; these then collide with the other 7 TeV LHC proton beam. The basic parameters of the collider are presented; although of very high energy, it has an integrated luminosity of the order of 1 pb$^{-1}$/year. For such a collider, with a centre-of-mass energy 30 times greater than HERA, parton momentum fractions, $x$, down to about $10^{-8}$ are accessible for $Q^2$ of 1 GeV$^2$, which could lead to effects of saturation or some other breakdown of DGLAP being observed. The total photon-proton cross section can be measured up to very high energies, and also at different energies, as the possibility of varying the electron beam energy is assumed; this could have synergy with cosmic-ray physics. Other physics which can be pursued at such a collider are contact interaction searches, ...
Simulation model based on Monte Carlo method for traffic assignment in local area road network
Institute of Scientific and Technical Information of China (English)
Yuchuan DU; Yuanjing GENG; Lijun SUN
2009-01-01
For a local area road network, the available traffic data of traveling are the flow volumes in the key intersections, not the complete OD matrix. Considering the circumstance characteristic and the data availability of a local area road network, a new model for traffic assignment based on Monte Carlo simulation of intersection turning movement is provided in this paper. For good stability in temporal sequence, turning ratio is adopted as the important parameter of this model. The formulation for local area road network assignment problems is proposed on the assumption of random turning behavior. The traffic assignment model based on the Monte Carlo method has been used in traffic analysis for an actual urban road network. The results comparing surveying traffic flow data and determining flow data by the previous model verify the applicability and validity of the proposed methodology.
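The random-turning assumption described in the record above can be sketched in a few lines: each vehicle entering an intersection independently chooses an exit leg with probability equal to the observed turning ratio. The leg names and ratios below are illustrative assumptions, not data from the paper.

```python
import random

def assign_turning(inflow, turning_ratios, rng):
    """Monte Carlo traffic assignment at one intersection: each entering
    vehicle independently picks an exit leg with probability equal to
    its turning ratio (the random turning behavior assumption)."""
    legs = list(turning_ratios)
    weights = [turning_ratios[leg] for leg in legs]
    counts = dict.fromkeys(legs, 0)
    for _ in range(inflow):
        counts[rng.choices(legs, weights=weights)[0]] += 1
    return counts

rng = random.Random(42)
out = assign_turning(10000, {"left": 0.2, "through": 0.7, "right": 0.1}, rng)
```

Chaining such intersections along a route propagates key-intersection flow counts through the network without requiring a complete OD matrix.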
Supanitsky, A. D.
2016-02-01
Gamma rays and neutrinos are produced as a result of proton-proton interactions that occur in different astrophysical contexts. The detection of these two types of messengers is of great importance for the study of different physical phenomena, related to nonthermal processes, taking place in different astrophysical scenarios. Therefore, knowledge of the energy spectrum of these two types of particles, as a function of the incident proton energy, is essential for the interpretation of the observational data. In this paper, parametrizations of the energy spectra of gamma rays and neutrinos originating in proton-proton collisions are presented. The energy range of the incident protons considered extends from 10^2 to 10^8 GeV. The parametrizations are based on Monte Carlo simulations of proton-proton interactions performed with the hadronic interaction models QGSJET-II-04 and EPOS-LHC, which have recently been updated with the data taken by the Large Hadron Collider.
Laser-based detection and tracking moving objects using data-driven Markov chain Monte Carlo
Vu, Trung-Dung; Aycard, Olivier
2009-01-01
We present a method of simultaneous detection and tracking of moving objects from a moving vehicle equipped with a single-layer laser scanner. A model-based approach is introduced to interpret the laser measurement sequence by hypotheses of moving object trajectories over a sliding window of time. Knowledge of various aspects, including the object model, measurement model and motion model, is integrated in one theoretically sound Bayesian framework. The data-driven Markov chain Monte Carlo (DDMCMC) tech...
Validation of a Monte Carlo Based Depletion Methodology Using HFIR Post-Irradiation Measurements
Energy Technology Data Exchange (ETDEWEB)
Chandler, David [ORNL; Maldonado, G Ivan [ORNL; Primm, Trent [ORNL
2009-11-01
Post-irradiation uranium isotopic atomic densities within the core of the High Flux Isotope Reactor (HFIR) were calculated and compared to uranium mass spectrographic data measured in the late 1960s and early 1970s [1]. This study was performed in order to validate a Monte Carlo based depletion methodology for calculating the burn-up dependent nuclide inventory, specifically the post-irradiation uranium ...
Hu, Xingzhi; Chen, Xiaoqian; Parks, Geoffrey T.; Yao, Wen
2016-10-01
Ever-increasing demands of uncertainty-based design, analysis, and optimization in aerospace vehicles motivate the development of Monte Carlo methods with wide adaptability and high accuracy. This paper presents a comprehensive review of typical improved Monte Carlo methods and summarizes their characteristics to aid uncertainty-based multidisciplinary design optimization (UMDO). Among them, Bayesian inference aims to tackle problems where prior information such as measurement data is available. Importance sampling (IS) addresses inconvenient sampling and difficult propagation through the incorporation of an intermediate importance distribution or sequential distributions. Optimized Latin hypercube sampling (OLHS) is a stratified sampling approach to achieving better space-filling and non-collapsing characteristics. Meta-modeling approximation based on Monte Carlo saves computational cost by using cheap meta-models for the output response. All the reviewed methods are illustrated by corresponding aerospace applications, which are compared to show their techniques and usefulness in UMDO, thus providing a beneficial reference for future theoretical and applied research.
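Importance sampling, one of the techniques reviewed above, can be illustrated on a textbook problem: estimating a small Gaussian tail probability by drawing from a proposal shifted into the tail. The choice of threshold, proposal, and sample size here are illustrative assumptions, not from the review.

```python
import math
import random

def tail_prob_is(threshold, n, rng):
    """Importance-sampling estimate of P(X > threshold) for X ~ N(0, 1),
    using the shifted proposal N(threshold, 1) so that draws land in the
    tail. Each sample is weighted by target density / proposal density."""
    total = 0.0
    for _ in range(n):
        y = rng.gauss(threshold, 1.0)
        if y > threshold:
            # weight = phi(y) / phi(y - threshold), written in log-safe form
            total += math.exp(-0.5 * y * y + 0.5 * (y - threshold) ** 2)
    return total / n

rng = random.Random(0)
estimate = tail_prob_is(4.0, 100_000, rng)  # true value is about 3.17e-5
```

A naive Monte Carlo estimate of the same probability would need on the order of millions of samples to see even a handful of tail events, which is exactly the inconvenience IS removes.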
Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes
Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R
2001-01-01
This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...
Proton pump inhibitors in cirrhosis: Tradition or evidence based practice?
Institute of Scientific and Technical Information of China (English)
Francesca Lodato; Francesco Azzaroli; Maria Di Girolamo; Valentina Feletti; Paolo Cecinato; Andrea Lisotti; Davide Festi; Enrico Roda; Giuseppe Mazzella
2008-01-01
Proton Pump Inhibitors (PPI) are very effective in inhibiting acid secretion and are extensively used in many acid related diseases. They are also often used in patients with cirrhosis, sometimes in the absence of a specific acid related disease, with the aim of preventing peptic complications in patients with variceal or hypertensive gastropathic bleeding receiving multidrug treatment. Reports supporting their use in cirrhosis are contradictory, and evidence of their efficacy in this condition is poor. Moreover, there are convincing papers suggesting that acid secretion is reduced in patients with liver cirrhosis. With regard to H pylori infection, its prevalence in patients with cirrhosis varies widely among different studies, and it seems that H pylori eradication does not prevent gastro-duodenal ulcer formation and bleeding. With regard to the prevention and treatment of oesophageal complications after banding or sclerotherapy of oesophageal varices, there is little evidence for a protective role of PPI. Moreover, due to the liver metabolism of PPI, the dose of most available PPIs should be reduced in cirrhotics. In conclusion, the use of this class of drugs seems more habit related than evidence based, eventually leading to an increase in health costs.
New approach to polarized proton scattering based on Dirac dynamics
International Nuclear Information System (INIS)
The Dirac impulse approximation has to date provided dramatic improvement in our ability to predict, with no free parameters, spin observables in proton-nucleus elastic scattering at intermediate energies. The key ingredients of this approach are Dirac propagation and the nucleon-nucleon invariant amplitudes. So far, local approximations to the NN amplitudes have been used. The standard NN representation in terms of Dirac scalar, vector, etc., parts, which is free of kinematical singularities, seems to naturally predict the correct coupling to negative-energy states for energies above 300 MeV. At low energy, this coupling is subject to an ambiguity between pseudoscalar and pseudovector πN coupling mechanisms, and it is evident that the pseudoscalar coupling treated in a local approximation causes too much scalar-vector difference and thus too large pair contributions. Once this problem is remedied, the Dirac optical potential is expected to be calculable from a nucleon-nucleon quasi-potential over the range 0 to 1000 MeV. For the energy region above about 300 MeV, the large scalar and vector potentials of Dirac phenomenology are seen to be accurately predicted by the impulse approximation. Work by Shakin and collaborators provides complementary results at low energy based on a nuclear matter g-matrix. A basic conclusion is that relativistic spin effects cannot be neglected in nuclear physics. 36 references.
ERSN-OpenMC, a Java-based GUI for OpenMC Monte Carlo code
Directory of Open Access Journals (Sweden)
Jaafar EL Bakkali
2016-07-01
OpenMC is a new Monte Carlo particle transport simulation code focused on solving two types of neutronics problems: k-eigenvalue criticality fission source problems and external fixed fission source problems. OpenMC does not have a graphical user interface; our Java-based application, ERSN-OpenMC, provides one. The main feature of this application is to give users an easy-to-use and flexible graphical interface to build better and faster simulations, with less effort and greater reliability. Additionally, this graphical tool automates the building process of the OpenMC code and related libraries, and users are given the freedom to customize their installation of this Monte Carlo code. A full description of the ERSN-OpenMC application is presented in this paper.
Pair correlations in iron-based superconductors: Quantum Monte Carlo study
Energy Technology Data Exchange (ETDEWEB)
Kashurnikov, V.A.; Krasavin, A.V., E-mail: avkrasavin@gmail.com
2014-08-01
The new generalized quantum continuous-time world-line Monte Carlo algorithm was developed to calculate pair correlation functions for two-dimensional FeAs clusters modeling iron-based superconductors within a two-orbital model. The data obtained for clusters with sizes up to 10×10 FeAs cells favor the possibility of an effective charge-carrier attraction corresponding to A_1g symmetry at some interaction parameters. The analysis of pair correlations depending on the cluster size, temperature, interaction, and the type of symmetry of the order parameter is carried out. - Highlights: • New generalized quantum continuous-time world-line Monte Carlo algorithm is developed. • Pair correlation functions for two-dimensional FeAs clusters are calculated. • Parameters of the two-orbital model corresponding to attraction of carriers are defined.
Electron density of states of Fe-based superconductors: Quantum trajectory Monte Carlo method
Kashurnikov, V. A.; Krasavin, A. V.; Zhumagulov, Ya. V.
2016-03-01
The spectral and total electron densities of states in two-dimensional FeAs clusters, which simulate iron-based superconductors, have been calculated using the generalized quantum Monte Carlo algorithm within the full two-orbital model. Spectra have been reconstructed by solving the integral equation relating the Matsubara Green's function and spectral density by the method combining the gradient descent and Monte Carlo algorithms. The calculations have been performed for clusters with dimensions up to 10 × 10 FeAs cells. The profiles of the Fermi surface for the entire Brillouin zone have been presented in the quasiparticle approximation. Data for the total density of states near the Fermi level have been obtained. The effect of the interaction parameter, size of the cluster, and temperature on the spectrum of excitations has been studied.
Polarization imaging of multiply-scattered radiation based on integral-vector Monte Carlo method
International Nuclear Information System (INIS)
A new integral-vector Monte Carlo method (IVMCM) is developed to analyze the transfer of polarized radiation in 3D multiple scattering particle-laden media. The method is based on a 'successive order of scattering series' expression of the integral formulation of the vector radiative transfer equation (VRTE) for application of efficient statistical tools to improve convergence of Monte Carlo calculations of integrals. After validation against reference results in plane-parallel layer backscattering configurations, the model is applied to a cubic container filled with uniformly distributed monodispersed particles and irradiated by a monochromatic narrow collimated beam. 2D lateral images of effective Mueller matrix elements are calculated in the case of spherical and fractal aggregate particles. Detailed analysis of multiple scattering regimes, which are very similar for unpolarized radiation transfer, allows identifying the sensitivity of polarization imaging to size and morphology.
GPU-accelerated Monte Carlo simulation of particle coagulation based on the inverse method
Wei, J.; Kruis, F. E.
2013-09-01
Simulating particle coagulation using Monte Carlo methods is in general a challenging computational task due to its numerical complexity and computing cost. Currently, the lowest computing costs are obtained when applying a graphics processing unit (GPU), originally developed for speeding up graphics processing in the consumer market. In this article we present an implementation accelerating a Monte Carlo method based on the inverse scheme for simulating particle coagulation on the GPU. The abundant data parallelism embedded within the Monte Carlo method is explained, as it allows an efficient parallelization of the MC code on the GPU. Furthermore, the computation accuracy of the MC on the GPU was validated against a benchmark, a CPU-based discrete-sectional method. To evaluate the performance gains from using the GPU, the computing time on the GPU was compared against its sequential counterpart on the CPU. The measured speedups show that the GPU can accelerate the execution of the MC code by a factor of 10-100, depending on the chosen number of simulation particles. The algorithm shows a linear dependence of computing time on the number of simulation particles, which is a remarkable result in view of the n^2 dependence of the coagulation.
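The "inverse scheme" named above selects the next coagulating particle by inverting a cumulative rate table; a minimal CPU sketch of that selection step is shown below (the GPU version parallelizes this; the rate values here are illustrative assumptions).

```python
import bisect
import random

def pick_particle(rates, rng):
    """Inverse-method sampling: build the cumulative rate table, draw u
    uniform on (0, total rate), and invert by binary search to find
    which particle undergoes the next coagulation event."""
    cum = []
    total = 0.0
    for r in rates:
        total += r
        cum.append(total)
    u = rng.random() * total
    return bisect.bisect_right(cum, u)

rng = random.Random(7)
# Particle 1 has zero rate, so it is never selected; particle 2 is
# selected three times as often as particle 0.
draws = [pick_particle([1.0, 0.0, 3.0], rng) for _ in range(20_000)]
```

On a GPU, many such draws (and the prefix-sum building the cumulative table) run in parallel, which is where the reported 10-100x speedup comes from.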
Espel, Federico Puente
The main objective of this PhD research is to develop a high accuracy modeling tool using a Monte Carlo based coupled system. The presented research comprises the development of models to include the thermal-hydraulic feedback to the Monte Carlo method and speed-up mechanisms to accelerate the Monte Carlo criticality calculation. Presently, deterministic codes based on the diffusion approximation of the Boltzmann transport equation, coupled with channel-based (or sub-channel based) thermal-hydraulic codes, carry out the three-dimensional (3-D) reactor core calculations of the Light Water Reactors (LWRs). These deterministic codes utilize nuclear homogenized data (normally over large spatial zones, consisting of fuel assembly or parts of fuel assembly, and in the best case, over small spatial zones, consisting of pin cell), which is functionalized in terms of thermal-hydraulic feedback parameters (in the form of off-line pre-generated cross-section libraries). High accuracy modeling is required for advanced nuclear reactor core designs that present increased geometry complexity and material heterogeneity. Such high-fidelity methods take advantage of the recent progress in computation technology and coupled neutron transport solutions with thermal-hydraulic feedback models on pin or even on sub-pin level (in terms of spatial scale). The continuous energy Monte Carlo method is well suited for solving such core environments with the detailed representation of the complicated 3-D problem. The major advantages of the Monte Carlo method over the deterministic methods are the continuous energy treatment and the exact 3-D geometry modeling. However, the Monte Carlo method involves vast computational time. The interest in Monte Carlo methods has increased thanks to the improvements of the capabilities of high performance computers. Coupled Monte-Carlo calculations can serve as reference solutions for verifying high-fidelity coupled deterministic neutron transport methods
Comparing analytical and Monte Carlo optical diffusion models in phosphor-based X-ray detectors
Kalyvas, N.; Liaparinos, P.
2014-03-01
Luminescent materials are employed as radiation-to-light converters in detectors of medical imaging systems, often referred to as phosphor screens. Several processes affect the light transfer properties of phosphors; amongst the most important is the attenuation of light. Light attenuation (absorption and scattering) can be described either through "diffusion" theory in theoretical models or "quantum" theory in Monte Carlo methods. Although analytical methods, based on photon diffusion equations, have been preferentially employed to investigate optical diffusion in the past, Monte Carlo simulation models can overcome several of the analytical modelling assumptions. The present study aimed to compare both methodologies and investigate the dependence of the analytical model optical parameters as a function of particle size. It was found that the optical photon attenuation coefficients calculated by analytical modeling decrease with particle size (in the region 1-12 μm). In addition, for particle sizes smaller than 6 μm there is no simultaneous agreement between the theoretical modulation transfer function and light escape values with respect to the Monte Carlo data.
Energy Technology Data Exchange (ETDEWEB)
Murata, Isao [Osaka Univ., Suita (Japan); Mori, Takamasa; Nakagawa, Masayuki; Itakura, Hirofumi
1996-03-01
A method to calculate neutronics parameters of a core composed of randomly distributed spherical fuels has been developed based on a statistical geometry model with a continuous energy Monte Carlo method. This method was implemented in the general purpose Monte Carlo code MCNP, and a new code, MCNP-CFP, has been developed. This paper describes the model and method, how to use it, and the validation results. In the Monte Carlo calculation, the location of a spherical fuel is sampled probabilistically along the particle flight path from the spatial probability distribution of spherical fuels, called the nearest neighbor distribution (NND). This sampling method was validated through the following two comparisons: (1) calculations of the inventory of coated fuel particles (CFPs) in a fuel compact by both the track length estimator and the direct evaluation method, and (2) criticality calculations for ordered packed geometries. The method was also confirmed by applying it to an analysis of the critical assembly experiment at VHTRC. The method established in the present study is unique in providing a probabilistic model of a geometry with a great number of randomly distributed spherical fuels. With speed-up by vector or parallel computation in the future, it is expected to be widely used in calculations of nuclear reactor cores, especially HTGR cores. (author).
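The on-the-fly sampling idea can be sketched with a strongly simplified stand-in for the NND: assuming a dilute Poisson arrangement of sphere centres, the distance along a flight path to the next sphere is exponential with rate equal to number density times geometric cross-section. This is an illustrative simplification, not the actual NND used in MCNP-CFP, and all numbers are assumptions.

```python
import math
import random

def distance_to_next_sphere(number_density, radius, rng):
    """Sample the flight-path distance to the next sphere centre, assuming
    a dilute Poisson arrangement: along a ray, the nearest-neighbor
    distance is exponential with rate n * pi * r^2 (number density times
    geometric cross-section). A toy stand-in for the true NND."""
    rate = number_density * math.pi * radius ** 2
    return -math.log(1.0 - rng.random()) / rate

rng = random.Random(3)
samples = [distance_to_next_sphere(1e-3, 0.05, rng) for _ in range(20_000)]
```

The real NND accounts for packing correlations between spheres, which matters at the packing fractions typical of HTGR fuel compacts; the inverse-transform sampling pattern is the same.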
Oxide-based protonic conductors: Point defects and transport properties
DEFF Research Database (Denmark)
Bonanos, N.
2001-01-01
A variety of oxides doped with elements of lower valence acquire hydroxyl-type defects when exposed, at high temperature, to atmospheres containing water vapour. Since the hydrogen of the hydroxyl groups is mobile, the oxides display protonic conductivity and may be used as electrolytes in sensors, hydrogen pumps, fuel cells, etc. The extent to which protonic defects form depends mainly on the partial pressure of water vapour, temperature and basicity of the constituent oxides, while their mobility depends, among other factors, on the metal-oxygen bond length and bond energy. The defect equilibria that determine the protonic concentrations are considered, with emphasis on the regime of low oxygen partial pressure. The measurement of the thermoelectric power (TEP) and of the H+/D+ isotope effect in conductivity are discussed as a means of characterising the conduction process.
Proton exchange in acid–base complexes induced by reaction coordinates with heavy atom motions
International Nuclear Information System (INIS)
Highlights: ► Proton exchange in acid–base complexes is studied. ► The structures, binding energies, and normal mode vibrations are calculated. ► Transition state structures of the proton exchange mechanism are determined. ► In the complexes studied, the reaction coordinate involves heavy atom rocking. ► The reaction coordinate is not simply localized in the proton movements. - Abstract: We extend previous work on nitric acid–ammonia and nitric acid–alkylamine complexes to illustrate that proton exchange reaction coordinates involve the rocking motion of the base moiety in many double hydrogen-bonded gas phase strong acid–strong base complexes. The complexes studied involve the biologically and atmospherically relevant glycine, formic, acetic, propionic, and sulfuric acids with ammonia/alkylamine bases. In these complexes, the magnitudes of the imaginary frequencies associated with the proton exchange transition states are low. This contrasts with widely studied proton exchange reactions between symmetric carboxylic acid dimers or asymmetric DNA base pairs and their analogs, where the reaction coordinate is localized in proton motions and the magnitudes of the imaginary frequencies for the transition states are >1100 cm⁻¹. Calculations on complexes of these acids with water are performed for comparison. Variations of normal vibration modes along the reaction coordinate in the complexes are described.
Energy Technology Data Exchange (ETDEWEB)
Jones, Kevin C.; Solberg, Timothy D.; Avery, Stephen, E-mail: Stephen.Avery@uphs.upenn.edu [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States); Vander Stappen, François; Janssens, Guillaume; Prieels, Damien [Ion Beam Applications SA, Louvain-la-Neuve 1348 (Belgium); Bawiec, Christopher R.; Lewin, Peter A. [School of Biomedical Engineering, Drexel University, Philadelphia, Pennsylvania 19104 (United States); Sehgal, Chandra M. [Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)
2015-12-15
Purpose: To measure the acoustic signal generated by a pulsed proton spill from a hospital-based clinical cyclotron. Methods: An electronic function generator modulated the IBA C230 isochronous cyclotron to create a pulsed proton beam. The acoustic emissions generated by the proton beam were measured in water using a hydrophone. The acoustic measurements were repeated with increasing proton current and increasing distance between detector and beam. Results: The cyclotron generated proton spills with rise times of 18 μs and a maximum measured instantaneous proton current of 790 nA. Acoustic emissions generated by the proton energy deposition were measured to be on the order of mPa. The origin of the acoustic wave was identified as the proton beam based on the correlation between acoustic emission arrival time and distance between the hydrophone and proton beam. The acoustic frequency spectrum peaked at 10 kHz, and the acoustic pressure amplitude increased monotonically with increasing proton current. Conclusions: The authors report the first observation of acoustic emissions generated by a proton beam from a hospital-based clinical cyclotron. When modulated by an electronic function generator, the cyclotron is capable of creating proton spills with fast rise times (18 μs) and high instantaneous currents (790 nA). Measurements of the proton-generated acoustic emissions in a clinical setting may provide a method for in vivo proton range verification and patient monitoring.
State-of-the-art in Comprehensive Cascade Control Approach through Monte-Carlo Based Representation
Directory of Open Access Journals (Sweden)
A.H. Mazinan
2015-10-01
The research develops a comprehensive cascade control approach for spacecraft, with a Monte-Carlo based representation taken into consideration with respect to the state of the art. Conventional methods are not sufficient to deal with such a process under control when a number of system parameter variations are used to represent real situations. The new insights in this research area are valuable for improving the performance of a class of spacecraft, and the results apply in both real and academic environments. In short, a double closed loop combines a quaternion-based control approach with an Euler-based control approach to handle the three-axis rotational angles and their rates synchronously, in association with pulse modulation analysis and control allocation, where the dynamics and kinematics of the system under control are analyzed. A series of experiments is carried out to assess the performance of the approach, with the aforementioned Monte-Carlo based representation used to verify the investigated outcomes.
International Nuclear Information System (INIS)
Proton therapy has become a subject of considerable interest in the radiation oncology community and it is expected that there will be a substantial growth in proton treatment facilities during the next decade. I was asked to write a historical review of proton therapy based on my personal experiences, which have all occurred in the United States, so therefore I have a somewhat parochial point of view. Space requirements did not permit me to mention all of the existing proton therapy facilities or the names of all of those who have contributed to proton therapy. (review)
Ulmer, W.; Schaffner, B.
2010-01-01
We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher energy depth dose curves, is modeled by inclusion of two different origins: 1. S...
Meshkian, Mohsen
2016-02-01
Neutron radiography is rapidly extending as one of the methods for non-destructive screening of materials. There are various parameters to be studied for optimising imaging screens and image quality for different fast-neutron radiography systems. Herein, a Geant4 Monte Carlo simulation is employed to evaluate the response of a fast-neutron radiography system using a 252Cf neutron source. The neutron radiography system is comprised of a moderator as the neutron-to-proton converter with suspended silver-activated zinc sulphide (ZnS(Ag)) as the phosphor material. The neutron-induced protons deposit energy in the phosphor which consequently emits scintillation light. Further, radiographs are obtained by simulating the overall radiography system including source and sample. Two different standard samples are used to evaluate the quality of the radiographs.
Pair correlation functions of FeAs-based superconductors: Quantum Monte Carlo study
Kashurnikov, V. A.; Krasavin, A. V.
2015-01-01
The new generalized quantum continuous-time world-line Monte Carlo algorithm was developed to calculate pair correlation functions for two-dimensional FeAs clusters modeling iron-based superconductors within the framework of the two-orbital model. The analysis of pair correlations depending on the cluster size, temperature, interaction, and the type of symmetry of the order parameter is carried out. The data obtained for clusters with sizes up to 10×10 FeAs cells favor the possibility of an effective charge-carrier attraction corresponding to A1g symmetry at some interaction parameters.
Monte Carlo tests of the Rasch model based on scalability coefficients
DEFF Research Database (Denmark)
Christensen, Karl Bang; Kreiner, Svend
2010-01-01
For item responses fitting the Rasch model, the assumptions underlying the Mokken model of double monotonicity are met. This makes non-parametric item response theory a natural starting-point for Rasch item analysis. This paper studies scalability coefficients based on Loevinger's H coefficient that summarizes the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model using p-values computed using Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence ...
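The building block of these scalability coefficients, Loevinger's H for an item pair, can be sketched directly from its definition: one minus the ratio of observed Guttman errors to the errors expected under independence. The tiny response matrices below are illustrative assumptions, not data from the paper.

```python
def loevinger_h(responses):
    """Loevinger's H for a pair of binary items (list of (x0, x1) rows):
    H = 1 - observed Guttman errors / errors expected under independence.
    A Guttman error is a positive answer on the harder (less popular)
    item combined with a negative answer on the easier one."""
    n = len(responses)
    p = [sum(r[i] for r in responses) / n for i in (0, 1)]
    easy, hard = (0, 1) if p[0] >= p[1] else (1, 0)
    observed = sum(1 for r in responses if r[hard] == 1 and r[easy] == 0)
    expected = n * p[hard] * (1 - p[easy])
    return 1.0 - observed / expected

# A perfect Guttman pattern (no errors) gives H = 1.
perfect = loevinger_h([(1, 1), (1, 0), (1, 0), (0, 0)])
# Responses that look statistically independent give H near 0.
independent = loevinger_h([(1, 1), (1, 0), (0, 1), (0, 0)])
```

The paper's Monte Carlo tests then compare the observed coefficients against their distribution over data matrices sampled (via MCMC) under the Rasch model, yielding p-values.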
International Nuclear Information System (INIS)
The channel capacity of ocean water is limited by propagation distance and optical properties. Previous studies of this problem are based on water-tank experiments with different amounts of Maalox antacid. However, the propagation distance is limited by the experimental set-up, and the optical properties differ from those of ocean water, so the experimental results are not accurate enough for the physical design of underwater wireless communication links. This letter develops a Monte Carlo model to study the channel capacity of underwater optical communications. Moreover, the model can flexibly configure various parameters of the transmitter, receiver and channel, and is suitable for the physical design of underwater optical communication links. (paper)
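The channel model described in this abstract lends itself to a compact sketch. Below is a minimal, illustrative Monte Carlo estimate of photon survival over an underwater path, assuming a purely absorbing channel with an exponential free-path law; the attenuation coefficient and distance are made-up values, not the letter's parameters, and a real link model would also sample scattering angles and receiver geometry.

```python
import math
import random

def simulate_channel(n_photons, distance_m, attenuation_m=0.15, seed=1):
    """Toy Monte Carlo survival estimate for an underwater optical path.

    Assumes pure exponential extinction with an illustrative attenuation
    coefficient (1/m); scattering and receiver aperture are ignored."""
    rng = random.Random(seed)
    received = 0
    for _ in range(n_photons):
        # Sample the photon's free path before absorption (exponential law).
        free_path = -math.log(1.0 - rng.random()) / attenuation_m
        if free_path >= distance_m:
            received += 1
    return received / n_photons

# Survival fraction should track exp(-attenuation * distance) = exp(-1.5).
frac = simulate_channel(100_000, distance_m=10.0)
```

In this absorbing-only limit the Monte Carlo estimate just reproduces the Beer-Lambert law; the value of the approach in the paper's setting is that scattering, geometry and receiver effects can be layered on without changing the sampling loop's structure.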
A new Monte-Carlo based simulation for the CryoEDM experiment
Raso-Barnett, Matthew
2015-01-01
This thesis presents a new Monte-Carlo based simulation of the physics of ultra-cold neutrons (UCN) in complex geometries and its application to the CryoEDM experiment. It includes a detailed description of the design and performance of this simulation along with its use in a project to study the magnetic depolarisation time of UCN within the apparatus due to magnetic impurities in the measurement cell, which is a crucial parameter in the sensitivity of a neutron electric dipole moment (nEDM) ...
Ulmer, W
2010-01-01
We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei, as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher-energy depth dose curves, is modeled by inclusion of two different origins: 1. secondary reaction protons, contributing ca. 65% of the buildup (for monoenergetic protons); 2. Landau tails as well as Gaussian-type fluctuations for range-straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked by them. Furthermore, there are a few parameters which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline - the most important b...
PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation
Energy Technology Data Exchange (ETDEWEB)
Espana, S; Herraiz, J L; Vicente, E; Udias, J M [Grupo de Fisica Nuclear, Departmento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, Madrid (Spain); Vaquero, J J; Desco, M [Unidad de Medicina y CirugIa Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es
2009-03-21
Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.
GPU-based Monte Carlo radiotherapy dose calculation using phase-space sources
Townson, Reid; Tian, Zhen; Graves, Yan Jiang; Zavgorodni, Sergei; Jiang, Steve B
2013-01-01
A novel phase-space source implementation has been designed for GPU-based Monte Carlo dose calculation engines. Due to the parallelized nature of GPU hardware, it is essential to simultaneously transport particles of the same type and similar energies but separated spatially to yield a high efficiency. We present three methods for phase-space implementation that have been integrated into the most recent version of the GPU-based Monte Carlo radiotherapy dose calculation package gDPM v3.0. The first method is to sequentially read particles from a patient-dependent phase-space and sort them on-the-fly based on particle type and energy. The second method supplements this with a simple secondary collimator model and fluence map implementation so that patient-independent phase-space sources can be used. Finally, as the third method (called the phase-space-let, or PSL, method) we introduce a novel strategy to pre-process patient-independent phase-spaces and bin particles by type, energy and position. Position bins l...
Research on Reliability Modelling Method of Machining Center Based on Monte Carlo Simulation
Directory of Open Access Journals (Sweden)
Chuanhai Chen
2013-03-01
The aim of this study is to obtain the reliability of a series system and to analyze the reliability of a machining center, so a modified reliability modelling method based on Monte Carlo simulation for series systems is proposed. The reliability function built by the classical statistics method, based on the assumption that machine tools are repaired as good as new, may be biased in the real case. The reliability functions of the subsystems are established respectively, and the reliability model is then built according to the reliability block diagram. The fitted reliability function of the machine tools is established using sample failure data generated by Monte Carlo simulation, whose inverse reliability function is solved by a linearization technique based on radial basis functions. Finally, an example of the machining center is presented using the proposed method to show its potential application. The analysis results show that the proposed method can provide an accurate reliability model compared with the conventional method.
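The series-system idea in this abstract can be sketched in a few lines: the system fails as soon as any subsystem fails, so its reliability at time t is estimated by sampling subsystem failure times. The exponential failure rates below are illustrative placeholders, not the paper's fitted machine-tool distributions.

```python
import random

def mc_series_reliability(t, failure_rates, n_samples=200_000, seed=7):
    """Monte Carlo reliability of a series system at time t.

    Each subsystem's lifetime is assumed exponential with the given
    (illustrative) failure rate; the system survives past t only if
    every subsystem does."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_samples):
        # Series logic: one subsystem failure before t fails the system.
        if all(rng.expovariate(rate) > t for rate in failure_rates):
            survived += 1
    return survived / n_samples

# For exponential subsystems the exact answer is exp(-sum(rates) * t).
r_sys = mc_series_reliability(t=100.0, failure_rates=[0.001, 0.002, 0.0005])
```

The exponential case has a closed form, so the simulation is only a sanity check here; the Monte Carlo route becomes worthwhile with the non-exponential, fitted subsystem distributions the paper works with.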
Proton conductive membranes based on doped sulfonated polytriazole
Energy Technology Data Exchange (ETDEWEB)
Boaventura, M.; Brandao, L.; Mendes, A. [Laboratorio de Engenharia de Processos, Ambiente e Energia (LEPAE), Faculdade de Engenharia da Universidade do Porto, Rua Roberto Frias, 4200-465 Porto (Portugal); Ponce, M.L.; Nunes, S.P. [GKSS Research Centre Geesthacht GmbH, Max Planck Str. 1, D-21502, Geesthacht (Germany)
2010-11-15
This work reports the preparation and characterization of proton-conducting sulfonated polytriazole membranes doped with three different agents: 1H-benzimidazole-2-sulfonic acid, benzimidazole and phosphoric acid. The modified membranes were characterized by scanning electron microscopy (SEM), infrared spectroscopy, thermogravimetric analysis (TGA), dynamical mechanical thermal analysis (DMTA) and electrochemical impedance spectroscopy (EIS). The addition of doping agents resulted in a decrease of the glass transition temperature. For membranes doped with 85 wt.% phosphoric acid solution, proton conductivity increased up to 2×10^-3 S cm^-1 at 120 °C and 5% relative humidity. The performance of the phosphoric acid doped membranes was evaluated in a fuel cell set-up at 120 °C and 2.5% relative humidity. (author)
Energy Technology Data Exchange (ETDEWEB)
Iwamoto, Yosuke, E-mail: iwamoto.yosuke@jaea.go.jp [Japan Atomic Energy Agency, 2-4 Shirakata-Shirane, tokai, naka Ibaraki 319-1195 (Japan); Hagiwara, Masayuki [High Energy Accelerator Research Organization (Japan); Matsumoto, Tetsuro; Masuda, Akihiko [National Institute of Advanced Industrial Science and Technology (Japan); Iwase, Hiroshi [High Energy Accelerator Research Organization (Japan); Yashima, Hiroshi [Research Reactor Institute, Kyoto University (Japan); Shima, Tatsushi; Tamii, Atsushi [Research Center for Nuclear Physics, Osaka University (Japan); Nakamura, Takashi [Cyclotron and Radioisotope Center, Tohoku University, Institute of Technology, Shimizu Corporation (Japan)
2012-10-21
Secondary neutron-production double-differential cross-sections (DDXs) have been measured from interactions of 137 MeV and 200 MeV protons in a natural carbon target. The data were measured between 0° and 25° in the laboratory. DDXs were obtained with high energy resolution in the energy region from 3 MeV up to the maximum energy. The experimental data for 137 MeV protons at 10° and 25° were in good agreement with those for 113 MeV protons at 7.5° and 30° at LANSCE/WNR in the energy region below 80 MeV. Benchmark calculations were carried out with the PHITS code using the evaluated nuclear data files JENDL/HE-2007 and ENDF/B-VII, and the theoretical models Bertini-GEM and ISOBAR-GEM. For 137 MeV proton incidence, calculations using JENDL/HE-2007 generally reproduced the shape and intensity of the experimental spectra well, including the ground state of 12N produced by the 12C(p,n)12N reaction. For 200 MeV proton incidence, all calculated results underestimated the experimental data by a factor of two, except for the result using the ISOBAR model. ISOBAR predicts nucleon emission to forward angles qualitatively better than the Bertini model. These experimental data will be useful for evaluating carbon data and as benchmark data for investigating the validity of Monte Carlo simulation for the shielding design of accelerator facilities.
Energy Technology Data Exchange (ETDEWEB)
Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.
2013-07-01
The use of Monte Carlo (MC) simulation has been shown to improve the accuracy of dose calculation compared with the analytical algorithms installed in commercial treatment planning systems, especially in the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, CARMEN, is based on full simulation of both the beam transport in the accelerator head and in the patient, and is designed for efficient operation in terms of the accuracy of the estimate and the required computation times. (Author)
Comment on "A study on tetrahedron-based inhomogeneous Monte-Carlo optical simulation".
Fang, Qianqian
2011-01-01
The Monte Carlo (MC) method is a popular approach to modeling photon propagation inside general turbid media, such as human tissue. Progress has been made in the past year with the independent proposals of two mesh-based Monte Carlo methods employing ray-tracing techniques. Both methods have shown improvements in accuracy and efficiency in modeling complex domains. A recent paper by Shen and Wang [Biomed. Opt. Express 2, 44 (2011)] reported preliminary results towards the cross-validation of the two mesh-based MC algorithms and software implementations, showing a 3-6 fold speed difference between the two software packages. In this comment, we share our views on unbiased software comparisons and discuss additional issues such as the use of pre-computed data, interpolation strategies, impact of compiler settings, use of Russian roulette, memory cost and potential pitfalls in measuring algorithm performance. Despite key differences between the two algorithms in the handling of non-tetrahedral meshes, we found that they share similar structure and performance for tetrahedral meshes. A significant fraction of the observed speed differences in the mentioned article was the result of inconsistent use of compilers and libraries. PMID:21559136
Single event upset cross section calculation for secondary particles induced by proton using Geant4
International Nuclear Information System (INIS)
Based on the Monte Carlo toolkit Geant4, a model for calculating the proton single event upset (SEU) cross section of an SRAM cell is presented. The secondary particles induced by protons were considered, and effective sensitive regions were determined according to the ranges of the secondary particles. The single event upset and multiple-bit upset (MBU) cross sections for protons of different energies were calculated. The results are in agreement with theoretical and experimental data. (authors)
Pedestrian counting with grid-based binary sensors based on Monte Carlo method
Fujii, Shuto; Taniguchi, Yoshiaki; Hasegawa, Go; Matsuoka, Morito
2014-01-01
In this paper, we propose a method for estimating the number of pedestrians walking in opposite directions, as in a shopping street or a sidewalk in a downtown area. The proposed method utilizes a compound-eye sensor constructed by placing two binary sensors along the pedestrians' movement direction and multiple binary sensors perpendicular to it. A number of Monte Carlo simulations of the movement of pedestrians are con...
Inverse treatment planning for radiation therapy based on fast Monte Carlo dose calculation
International Nuclear Information System (INIS)
An inverse treatment planning system based on fast Monte Carlo (MC) dose calculation is presented. It allows optimisation of intensity-modulated dose distributions in 15 to 60 minutes on present-day personal computers. If a multi-processor machine is available, parallel simulation of particle histories is also possible, leading to further reductions in calculation time. The optimisation process is divided into two stages. The first stage produces influence profiles based on pencil beam (PB) dose calculation. The second stage starts with MC verification and post-optimisation of the PB dose and fluence distributions. Because of their potential to accurately model beam modifiers, MC-based inverse planning systems are able to optimise compensator thicknesses and leaf trajectories instead of intensity profiles only. The corresponding techniques, whose implementation is the subject of future work, are also presented here. (orig.)
GPU-based high performance Monte Carlo simulation in neutron transport
Energy Technology Data Exchange (ETDEWEB)
Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Inteligencia Artificial Aplicada], e-mail: cmnap@ien.gov.br
2009-07-01
Graphics processing units (GPUs) are high-performance co-processors originally intended to improve the use and quality of computer graphics applications. Once researchers and practitioners realized the potential of GPUs for general-purpose computation, their application was extended to fields outside the scope of computer graphics. The main objective of this work is to evaluate the impact of using GPUs in neutron transport simulation by the Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple but time-consuming problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)
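As a rough illustration of the kind of "simple but time-consuming" problem such a study benchmarks, here is a minimal single-threaded, one-speed Monte Carlo estimate of neutron transmission through a slab; the slab thickness and absorption probability are invented for illustration and do not correspond to the paper's benchmark problem.

```python
import math
import random

def slab_transmission(n_neutrons, thickness_mfp, absorb_prob=0.3, seed=3):
    """One-speed Monte Carlo neutron transmission through a slab.

    Thickness is in mean free paths; at each collision the neutron is
    absorbed with probability absorb_prob or scattered isotropically.
    All numbers are illustrative, not real cross-section data."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_neutrons):
        x, mu = 0.0, 1.0          # depth (in mfp) and direction cosine
        while True:
            x += mu * -math.log(1.0 - rng.random())  # flight to next collision
            if x >= thickness_mfp:
                transmitted += 1  # leaked out the far side
                break
            if x < 0.0:
                break             # backscattered out the entrance face
            if rng.random() < absorb_prob:
                break             # absorbed in the slab
            mu = 2.0 * rng.random() - 1.0  # isotropic scatter in 1D slab geometry
    return transmitted / n_neutrons

t_thin = slab_transmission(20_000, thickness_mfp=1.0)
t_thick = slab_transmission(20_000, thickness_mfp=3.0)
```

Each neutron history is independent, which is exactly why this workload maps well onto GPU threads: the per-history loop above becomes a per-thread kernel with only a reduction at the end.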
IMPROVED ALGORITHM FOR ROAD REGION SEGMENTATION BASED ON SEQUENTIAL MONTE-CARLO ESTIMATION
Directory of Open Access Journals (Sweden)
Zdenek Prochazka
2014-12-01
In recent years, many researchers and car makers have put intensive effort into the development of autonomous driving systems. Since visual information is the main modality used by a human driver, a camera mounted on a moving platform is a very important kind of sensor, and various computer vision algorithms for handling the vehicle's surroundings are under intensive research. Our final goal is to develop a vision-based lane detection system able to handle various types of road shapes, working on both structured and unstructured roads, ideally in the presence of shadows. This paper presents a modified road region segmentation algorithm based on sequential Monte-Carlo estimation. A detailed description of the algorithm is given, and evaluation results show that the proposed algorithm outperforms both the segmentation algorithm developed in our previous work and a conventional algorithm based on colour histograms.
Monte Carlo calculation based on hydrogen composition of the tissue for MV photon radiotherapy.
Demol, Benjamin; Viard, Romain; Reynaert, Nick
2015-01-01
The purpose of this study was to demonstrate that Monte Carlo treatment planning systems require tissue characterization (density and composition) as a function of CT number. A discrete set of tissue classes with a specific composition is introduced. In the current work we demonstrate that, for megavoltage photon radiotherapy, only the hydrogen content of the different tissues is of interest. This conclusion might have an impact on MRI-based dose calculations and on MVCT calibration using tissue substitutes. A stoichiometric calibration was performed, grouping tissues with similar atomic composition into 15 dosimetrically equivalent subsets. To demonstrate the importance of hydrogen, a new scheme was derived, with correct hydrogen content, complemented by oxygen (all elements differing from hydrogen are replaced by oxygen). Mass attenuation coefficients and mass stopping powers for this scheme were calculated and compared to the original scheme. Twenty-five CyberKnife treatment plans were recalculated by an in-house developed Monte Carlo system using tissue density and hydrogen content derived from the CT images. The results were compared to Monte Carlo simulations using the original stoichiometric calibration. Between 300 keV and 3 MeV, the relative difference of mass attenuation coefficients is under 1% within all subsets. Between 10 keV and 20 MeV, the relative difference of mass stopping powers goes up to 5% in hard bone and remains below 2% for all other tissue subsets. Dose-volume histograms (DVHs) of the treatment plans present no visual difference between the two schemes. Relative differences of dose indexes D98, D95, D50, D05, D02, and Dmean were analyzed and a distribution centered around zero and of standard deviation below 2% (3 σ) was established. On the other hand, once the hydrogen content is slightly modified, important dose differences are obtained. Monte Carlo dose planning in the field of megavoltage photon radiotherapy is fully achievable using
The precision of respiratory-gated delivery of synchrotron-based pulsed beam proton therapy
Energy Technology Data Exchange (ETDEWEB)
Tsunashima, Yoshikazu; Vedam, Sastry; Dong Lei; Balter, Peter; Mohan, Radhe [Department of Radiation Physics, Unit 94, University of Texas M D Anderson Cancer Center, 1515 Holcombe Blvd, Houston, TX 77030 (United States); Umezawa, Masumi, E-mail: ytsunash@mdanderson.or [Accelerator System Group Medical System Project, Hitachi, Ltd, Energy and Environmental Systems Laboratory, 2-1, Omika-cho 7-chome, Hitachi-shi, Ibaraki-ken 319-1221 (Japan)
2010-12-21
A synchrotron-based proton therapy system operates in a low-repetition-rate pulsed beam delivery mode. Unlike cyclotron-based beam delivery, there is no guarantee that a synchrotron beam can be delivered effectively or precisely under respiratory gating. To evaluate the performance of gated synchrotron treatment, we simulated proton beam delivery in the synchrotron-based respiratory-gated mode using realistic patient breathing signals. Parameters used in the simulation were respiratory motion traces (70 traces from 24 patients), respiratory gate levels (10%, 20% and 30% duty cycles at the exhalation phase) and synchrotron magnet excitation cycles (T_cyc) (fixed T_cyc mode: 2.7 s, 3.0-6.0 s and each patient's breathing cycle; and variable T_cyc mode). The simulations were computed according to the breathing trace during which the proton beams were delivered. For the shorter fixed T_cyc (<4 s), most of the proton beams were delivered uniformly to the target during the entire expiration phase of the respiratory cycle. For the longer fixed T_cyc (>4 s) and the variable T_cyc mode, the proton beams were not consistently delivered during the end-expiration phase of the respiratory cycle. However, we found that the longer and variable T_cyc operation modes delivered proton beams more precisely during irregular breathing.
Zhou, Yuhua; Yang, Jing; Su, Haibin; Zeng, Jie; Jiang, San Ping; Goddard, William A
2014-04-01
We have developed a novel proton exchange membrane (PEM) for fuel cells using inorganic phosphotungstic acid (HPW) as proton carrier and mesoporous silica as matrix (HPW-meso-silica). The proton conductivity measured by electrochemical impedance spectroscopy is 0.11 S cm^-1 at 90 °C and 100% relative humidity (RH), with a low activation energy of ~14 kJ mol^-1. To determine the energetics associated with proton migration within the HPW-meso-silica PEM and the mechanism of proton hopping, we report density functional theory (DFT) calculations using the generalized gradient approximation (GGA). These DFT calculations revealed that the proton transfer process involves both intramolecular and intermolecular pathways. When adjacent HPWs are close (less than 17.0 Å apart), the calculated activation energy for intramolecular proton transfer within an HPW molecule (18.8-29.1 kJ/mol) is higher than the barrier for intermolecular proton transfer along the hydrogen bond. We find that the overall barrier for proton movement within the HPW-meso-silica membranes is determined by the intramolecular proton transfer pathway, which explains why the proton conductivity remains unchanged when the weight percentage of HPW on meso-silica is above 67 wt %. In contrast, the activation energy of proton transfer on a clean SiO2 (111) surface is computed to be as high as ~40 kJ mol^-1, confirming the very low proton conductivity observed experimentally on clean silica surfaces. PMID:24628538
Avalanche proton-boron fusion based on elastic nuclear collisions
Eliezer, Shalom; Hora, Heinrich; Korn, Georg; Nissim, Noaz; Martinez Val, Josè Maria
2016-05-01
Recent experiments at Prague with the 600 J/0.2 ns PALS laser interacting with a layer of boron dopants in a hydrogen-enriched target have produced around 10^9 alphas. We suggest that these anomalously high yields of proton-11B fusion reactions indicate an avalanche multiplication mechanism. This can be explained by elastic nuclear collisions in the broad 600 keV energy band, which coincides with the high nuclear p-11B fusion cross section, through the generation of three secondary alpha particles from a single primarily produced alpha particle.
Fission yield calculation using toy model based on Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Jubaidah, E-mail: jubaidah@student.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia); Physics Department, Faculty of Mathematics and Natural Science – State University of Medan. Jl. Willem Iskandar Pasar V Medan Estate – North Sumatera, Indonesia 20221 (Indonesia); Kurniadi, Rizal, E-mail: rijalk@fi.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia)
2015-09-30
The toy model is a new approximation for predicting the fission yield distribution. It treats the nucleus as an elastic toy consisting of marbles, where the number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate the properties of a real nucleus. In this research, the toy nucleons are influenced only by a central force, and energy entanglement is neglected. A heavy toy nucleus induced by a toy nucleon splits into two fragments; these two fission fragments are called the fission yield. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other, described by five parameters: the scission point of the two curves (R_c), the means of the left and right curves (μ_L, μ_R), and the deviations of the left and right curves (σ_L, σ_R). The fission yield distribution is analyzed based on Monte Carlo simulation. The results show that variation in σ or μ can significantly shift the average frequency of asymmetric fission yields and also varies the range of the fission yield probability distribution. In addition, variation in the iteration coefficient only changes the frequency of fission yields. Monte Carlo simulation of the fission yield calculation using the toy model successfully indicates the same tendency as experimental results, where the average light fission yield is in the range of 90
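The double-Gaussian picture above can be sketched with a toy sampler: each fission event draws a light-fragment mass from one Gaussian hump and assigns the remaining nucleons to the complementary heavy fragment. The parameters below (a_total, mu_light, sigma_light) are illustrative stand-ins, not the paper's fitted R_c, μ and σ values.

```python
import random

def toy_fission_yields(n_events, a_total=236, mu_light=96.0,
                       sigma_light=5.0, seed=11):
    """Toy double-Gaussian fission-yield sampler.

    Each event draws a light-fragment mass from a Gaussian; the heavy
    fragment carries the remaining nucleons, giving the familiar
    two-humped asymmetric mass distribution. All parameters are
    illustrative, not fitted values."""
    rng = random.Random(seed)
    masses = []
    for _ in range(n_events):
        light = rng.gauss(mu_light, sigma_light)
        masses.append(light)
        masses.append(a_total - light)  # complementary heavy fragment
    return masses

yields = toy_fission_yields(50_000)
light = [m for m in yields if m < 236 / 2]   # light-fragment hump
light_mean = sum(light) / len(light)
```

Histogramming `yields` reproduces the two intersecting Gaussian humps the abstract describes; shifting the mean or widening the deviation moves and spreads the asymmetric yield distribution in the same qualitative way the paper reports.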
International Nuclear Information System (INIS)
After an accidental release of radionuclides to the inhabited environment, the external gamma irradiation from deposited radioactivity contributes significantly to the radiation exposure of the population for extended periods. Evaluating this exposure pathway requires three main model components: (i) calculation of the air kerma value per photon emitted per unit source area, based on Monte Carlo (MC) simulations; (ii) a description of the distribution and dynamics of radionuclides on the diverse urban surfaces; and (iii) a relevant urban model combining all these elements to calculate the resulting doses according to the actual scenario. This paper provides an overview of the different approaches to calculating photon transport in urban areas and of several published dose calculation codes. Two types of Monte Carlo simulation are presented, using the global and local approaches to photon transport. Moreover, two different philosophies of dose calculation, the 'location factor method' and the combination of relative surface contamination with air kerma values, are described. The main features of six codes (ECOSYS, EDEM2M, EXPURT, PARATI, TEMAS, URGENT) are highlighted, together with a short model-to-model feature intercomparison.
A Strategy for Finding the Optimal Scale of Plant Core Collection Based on Monte Carlo Simulation
Directory of Open Access Journals (Sweden)
Jiancheng Wang
2014-01-01
A core collection is an ideal resource for genome-wide association studies (GWAS), and a subcore collection is a subset of a core collection. A strategy is proposed for finding the optimal sampling percentage for a plant subcore collection based on Monte Carlo simulation. A cotton germplasm group of 168 accessions with 20 quantitative traits was used to construct subcore collections. A mixed linear model approach was used to eliminate the environment effect and the GE (genotype × environment) effect. The least distance stepwise sampling (LDSS) method, combining 6 commonly used genetic distances with the unweighted pair-group average (UPGMA) cluster method, was adopted to construct subcore collections. A homogeneous population assessing method was adopted to assess the validity of 7 evaluating parameters of the subcore collection. Monte Carlo simulation was conducted on the sampling percentage, the number of traits, and the evaluating parameters. A new method for "distilling free-form natural laws from experimental data" was adopted to find the best formula for determining the optimal sampling percentages. The results showed that the coincidence rate of range (CR) was the most valid evaluating parameter and was suitable to serve as a threshold for finding the optimal sampling percentage. Principal component analysis showed that subcore collections constructed with the optimal sampling percentages calculated by the present strategy were well representative.
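The coincidence rate of range (CR) parameter highlighted above is commonly computed as the mean ratio, over traits, of the sub-collection's trait range to the full collection's trait range. The sketch below follows that common definition (it may differ in detail from the authors' exact formula) on toy trait data.

```python
def coincidence_rate(full, sub):
    """Coincidence rate of range (CR), in percent: mean over traits of
    (subcore trait range) / (full-collection trait range).

    `full` and `sub` map trait names to lists of trait values; the data
    used below are toy values, not the cotton germplasm measurements."""
    ratios = []
    for trait, values in full.items():
        full_range = max(values) - min(values)
        sub_values = sub[trait]
        ratios.append((max(sub_values) - min(sub_values)) / full_range)
    return 100.0 * sum(ratios) / len(ratios)

full = {"height": [10, 20, 30, 40], "yield": [1.0, 2.0, 4.0]}
sub  = {"height": [20, 40],         "yield": [1.0, 4.0]}
cr = coincidence_rate(full, sub)   # mean of 20/30 and 3/3, times 100
```

A CR near 100% means the subcore preserves nearly the full spread of every trait, which is why it works as a threshold when scanning candidate sampling percentages.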
Lechtman, E; Mashouf, S; Chattopadhyay, N; Keller, B M; Lai, P; Cai, Z; Reilly, R M; Pignol, J-P
2013-05-21
Radiosensitization using gold nanoparticles (AuNPs) has been shown to vary widely with cell line, irradiation energy, AuNP size, concentration and intracellular localization. We developed a Monte Carlo-based AuNP radiosensitization predictive model (ARP), which takes into account the detailed energy deposition at the nano-scale. This model was compared to experimental cell survival and macroscopic dose enhancement predictions. PC-3 prostate cancer cell survival was characterized after irradiation using a 300 kVp photon source with and without AuNPs present in the cell culture media. Detailed Monte Carlo simulations were conducted, producing individual tracks of photoelectric products escaping AuNPs and energy deposition was scored in nano-scale voxels in a model cell nucleus. Cell survival in our predictive model was calculated by integrating the radiation induced lethal event density over the nucleus volume. Experimental AuNP radiosensitization was observed with a sensitizer enhancement ratio (SER) of 1.21 ± 0.13. SERs estimated using the ARP model and the macroscopic enhancement model were 1.20 ± 0.12 and 1.07 ± 0.10 respectively. In the hypothetical case of AuNPs localized within the nucleus, the ARP model predicted a SER of 1.29 ± 0.13, demonstrating the influence of AuNP intracellular localization on radiosensitization.
Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems
International Nuclear Information System (INIS)
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.
Simulation of Cone Beam CT System Based on Monte Carlo Method
Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing
2014-01-01
Adaptive Radiation Therapy (ART) was developed based on Image-Guided Radiation Therapy (IGRT) and is the trend in photon radiation therapy. To make better use of Cone Beam CT (CBCT) images for ART, a CBCT system model was established based on a Monte Carlo program and validated against measurement. The BEAMnrc program was adopted to model the kV X-ray tube. Both ISOURCE-13 and ISOURCE-24 were chosen to simulate the path of the beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles at 1 cm depth in water were compared with the dose calculated by the DOSXYZnrc program. The calculated PDD agreed with measurement to better than 1% within a depth of 10 cm, and more than 85% of the points of the calculated lateral dose profiles agreed within 2%. The validated CBCT system model helps to improve CBCT image quality for dose verification in ART and to assess the concomitant dose risk of CBCT imaging.
Sign learning kink-based (SiLK) quantum Monte Carlo for molecular systems
Ma, Xiaoyao; Loffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana
2015-01-01
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H$_{2}$O, N$_2$, and F$_2$ molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.
Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems.
Ma, Xiaoyao; Hall, Randall W; Löffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana
2016-01-01
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem. PMID:26747795
Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems
Ma, Xiaoyao; Hall, Randall W.; Löffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana
2016-01-01
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.
Sign learning kink-based (SiLK) quantum Monte Carlo for molecular systems
Energy Technology Data Exchange (ETDEWEB)
Ma, Xiaoyao; Hall, Randall W.; Loffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana
2016-01-07
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H2O, N2, and F2 molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.
Sign Learning Kink-based (SiLK) Quantum Monte Carlo for molecular systems
Energy Technology Data Exchange (ETDEWEB)
Ma, Xiaoyao [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Hall, Randall W. [Department of Natural Sciences and Mathematics, Dominican University of California, San Rafael, California 94901 (United States); Department of Chemistry, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Löffler, Frank [Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Kowalski, Karol [William R. Wiley Environmental Molecular Sciences Laboratory, Battelle, Pacific Northwest National Laboratory, Richland, Washington 99352 (United States); Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana [Department of Physics and Astronomy, Louisiana State University, Baton Rouge, Louisiana 70803 (United States); Center for Computation and Technology, Louisiana State University, Baton Rouge, Louisiana 70803 (United States)
2016-01-07
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H{sub 2}O, N{sub 2}, and F{sub 2} molecules. The method is based on Feynman’s path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.
Study of CANDU thorium-based fuel cycles by deterministic and Monte Carlo methods
International Nuclear Information System (INIS)
In the framework of the Generation IV forum, there is a renewal of interest in self-sustainable thorium fuel cycles applied to various concepts such as Molten Salt Reactors [1, 2] or High Temperature Reactors [3, 4]. Precise evaluations of the U-233 production potential relying on existing reactors such as PWRs [5] or CANDUs [6] are hence necessary. As a consequence of its design (online refueling and D2O moderator in a thermal spectrum), the CANDU reactor has moreover an excellent neutron economy and consequently a high fissile conversion ratio [7]. For these reasons, we try here, with a shorter term view, to re-evaluate the economic competitiveness of once-through thorium-based fuel cycles in CANDU [8]. Two simulation tools are used: the deterministic Canadian cell code DRAGON [9] and MURE [10], a C++ tool for reactor evolution calculations based on the Monte Carlo code MCNP [11]. (authors)
Quasi-Monte Carlo Simulation-Based SFEM for Slope Reliability Analysis
Institute of Scientific and Technical Information of China (English)
Yu Yuzhen; Xie Liquan; Zhang Bingyin
2005-01-01
Considering the stochastic spatial variation of geotechnical parameters over the slope, a Stochastic Finite Element Method (SFEM) is established based on the combination of the Shear Strength Reduction (SSR) concept and quasi-Monte Carlo simulation. The shear strength reduction FEM is superior in many ways to the slice method based on limit equilibrium theory, so it is more powerful for assessing the reliability of global slope stability when combined with probability theory. To illustrate the performance of the proposed method, it is applied to a simple slope example. The simulation results show that the proposed method is effective for the reliability analysis of global slope stability without presupposing a potential slip surface.
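The quasi-Monte Carlo ingredient above can be illustrated with a deliberately simplified limit state in place of the paper's SSR finite-element model: a linear performance function g = c + σ·tanφ − τ with normally distributed strength parameters, sampled with a 2-D Halton sequence mapped through the inverse normal CDF. All numbers are illustrative assumptions.

```python
from statistics import NormalDist

def halton(i, base):
    """i-th element (i >= 1) of the van der Corput sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def slope_failure_probability(n=4096):
    """Quasi-Monte Carlo estimate of P(g < 0) for the simplified limit state
    g = c + sigma_n * tan(phi) - tau, with random cohesion and friction."""
    cohesion = NormalDist(mu=10.0, sigma=2.0)   # kPa (illustrative)
    tan_phi = NormalDist(mu=0.5, sigma=0.1)     # (illustrative)
    sigma_n, tau = 20.0, 15.0                   # kPa, fixed stresses (illustrative)
    failures = 0
    for i in range(1, n + 1):
        c = cohesion.inv_cdf(halton(i, 2))      # low-discrepancy normal samples
        t = tan_phi.inv_cdf(halton(i, 3))
        if c + sigma_n * t - tau < 0.0:
            failures += 1
    return failures / n

pf = slope_failure_probability()
```

Because g is here a linear combination of normals, the exact failure probability is available in closed form, which makes the low-discrepancy estimate easy to check; the real SFEM replaces g by a finite-element SSR computation per sample.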
Niccolini, G.; Alcolea, J.
Solving the radiative transfer problem is a challenge common to many fields of astrophysics. With the increasing angular resolution of space- or ground-based telescopes (VLTI, HST) and with the next decade's instruments (NGST, ALMA, ...), astrophysical objects reveal, and will certainly continue to reveal, complex spatial structures. Consequently, it is necessary to develop numerical tools able to solve the radiative transfer equation in three dimensions in order to model and interpret these observations. I present a 3D radiative transfer program based on the Monte Carlo method, using a new technique for the construction of an adaptive spatial grid. With the help of this tool, one can solve the continuum radiative transfer problem (e.g. in a dusty medium), compute the temperature structure of the considered medium, and obtain the flux of the object (SED and images).
SQUID-based beam position monitoring for proton EDM experiment
Haciomeroglu, Selcuk
2014-09-01
One of the major systematic errors in the proton EDM experiment is the radial B-field, since it couples to the magnetic dipole moment and causes a vertical spin precession. For a proton with an EDM at the level of 10^-29 e·cm, 0.22 pG of B-field and 10.5 MV/m of E-field cause the same vertical spin precession. On the other hand, the radial B-field splits the counter-rotating beams depending on the vertical focusing strength in the ring. The magnetic field due to this split, modulated at a few kHz, can be measured by a SQUID magnetometer. This measurement requires the B-field to be kept below 1 nT everywhere around the ring using shields of mu-metal and aluminum layers. The SQUID measurements then involve noise from three sources: outside the shields, the shields themselves, and the beam. We study these three sources of noise using an electric circuit (mimicking the beam) inside a magnetic shielding room which consists of two mu-metal layers and an aluminum layer.
GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications
International Nuclear Information System (INIS)
In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism, which considers the patient as a simple water object. An accurate modeling of the physical processes accounting for patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4-based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within patient-specific voxelized images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on a track-length estimator, including uncertainty calculations, was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10^6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform makes Monte Carlo simulation-based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy. (paper)
Energy Technology Data Exchange (ETDEWEB)
Dedes, G; Asano, Y; Parodi, K [Ludwig Maximilians University Munich, Garching, DE (Germany); Arbor, N [Universite de Strasbourg, Strasbourg (France); Dauvergne, D; Testa, E [Universite Lyon 1, Institut de Physique Nucleaire de Lyon, Lyon (France); Letang, J; Rit, S [Universite Lyon 1, INSA Lyon, CREATIS, Lyon (France)
2015-06-15
Purpose: To quantify the intrinsic performance of proton computed tomography (pCT) as a modality for treatment planning in proton therapy. The performance of an ideal pCT scanner is studied as a function of various parameters. Methods: Using GATE/Geant4, we simulated an ideal pCT scanner and scans of several cylindrical phantoms with various tissue-equivalent inserts of different sizes. Insert materials were selected so as to be of clinical relevance. Tomographic images were reconstructed using a filtered backprojection algorithm taking into account the scattering of protons in the phantom. To quantify the performance of the ideal pCT scanner, we studied the precision and the accuracy with respect to the theoretical relative stopping power ratio (RSP) values for different beam energies, imaging doses, insert sizes and detector positions. The planning range uncertainty resulting from the reconstructed RSP is also assessed by comparison with the range of the protons in the analytically simulated phantoms. Results: The results indicate that pCT can intrinsically achieve RSP resolution below 1% for most examined tissues at beam energies below 300 MeV and imaging doses around 1 mGy. RSP map accuracy of better than 0.5% is observed for most tissue types within the studied dose range (0.2-1.5 mGy). Finally, the uncertainty in the proton range due to the accuracy of the reconstructed RSP map is well below 1%. Conclusion: This work explores the intrinsic performance of pCT as an imaging modality for proton treatment planning. The obtained results show that under ideal conditions, 3D RSP maps can be reconstructed with an accuracy better than 1%. Hence, pCT is a promising candidate for reducing the range uncertainties introduced by the use of X-ray CT along with a semiempirical calibration to RSP. Supported by the DFG Cluster of Excellence Munich-Centre for Advanced Photonics (MAP)
International Nuclear Information System (INIS)
Because proton beams activate positron emitters in patients, positron emission tomography (PET) has the potential to play a unique role in the in vivo verification of proton radiotherapy. Unfortunately, the PET image is not directly proportional to the delivered radiation dose distribution. Current treatment verification strategies using PET therefore compare the actual PET image with full-blown Monte Carlo simulations of the PET signal. In this paper, we describe a simpler and more direct way to reconstruct the expected PET signal from the local radiation dose distribution near the distal fall-off region, which is calculated by the treatment planning programme. Under reasonable assumptions, the PET image can be described as a convolution of the dose distribution with a filter function. We develop a formalism to derive the filter function analytically. The main concept is the introduction of 'Q-tilde' functions defined as the convolution of a Gaussian with a power-law function. Special Q-tilde functions are the Gaussian itself and the error function. The convolution of two Q-tilde functions is another Q-tilde function. By fitting elementary dose distributions and their corresponding PET signals with Q-tilde functions, we derive the Q-tilde function approximation of the filter. The new filtering method has been validated through comparisons with Monte Carlo calculations and, in one case, with measured data. While the basic concept is developed under idealized conditions assuming that the absorbing medium is homogeneous near the distal fall-off region, a generalization to inhomogeneous situations is also described. As a result, the method can determine the distal fall-off region of the PET signal, and consequently the range of the proton beam, with millimetre accuracy. Quantification of the produced activity is possible. In conclusion, the PET activity resulting from a proton beam treatment can be determined by locally filtering the dose distribution as obtained from the treatment planning programme.
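The convolution picture above is easy to sketch numerically. The toy below uses a synthetic 1-D depth-dose curve and a plain Gaussian filter (the Gaussian being the simplest member of the Q-tilde family); the depth-dose shape and the 5 mm filter width are assumptions for illustration, whereas the paper derives the filter analytically from the activation physics.

```python
import numpy as np

# Synthetic 1-D depth-dose curve with a sharp distal fall-off (illustrative, not
# a clinical beam model): slowly rising plateau to 150 mm, Gaussian fall-off beyond.
z = np.linspace(0.0, 200.0, 401)                      # depth [mm], 0.5 mm spacing
dose = 0.3 + 0.7 * (z / 150.0) ** 4
dose[z > 150.0] = np.exp(-0.5 * ((z[z > 150.0] - 150.0) / 3.0) ** 2)

# Filter: an assumed Gaussian of width 5 mm, normalized so it only redistributes
# (smooths) the signal without changing its scale.
dz = z[1] - z[0]
sigma = 5.0
kz = np.arange(-4.0 * sigma, 4.0 * sigma + dz, dz)
kernel = np.exp(-0.5 * (kz / sigma) ** 2)
kernel /= kernel.sum()

# Expected PET signal = dose convolved with the filter; the distal fall-off of
# `pet` is a smoothed, shifted version of the dose fall-off, which is what the
# range-verification method exploits.
pet = np.convolve(dose, kernel, mode="same")
```

Because the kernel is non-negative and normalized, the filtered signal can only smooth the distal edge, never sharpen it, which is visible in the reduced maximum slope of `pet`.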
Possible magnetism based on orbital motion of protons in ice
Yen, Fei; Liu, Yongsheng; Berlie, Adam
2016-01-01
A peak anomaly is observed in the magnetic susceptibility as a function of temperature in solid H2O near Tp=60 K. At external magnetic fields below 2 kOe, Tp becomes positive in the temperature range between 45 and 66 K. The magnetic field dependence of the susceptibility in the same temperature range exhibits an inverted ferromagnetic hysteretic loop superimposed on top of the diamagnetic signature of ice at fields below 600 Oe. We suggest that a fraction of protons that are capable of undergoing correlated tunneling in a hexagonal path without disrupting the stoichiometry of the lattice create an induced magnetic field opposite to the induced magnetic field created by the electrons upon application of an external field which counters the overall diamagnetism of the material.
Localization Algorithm of Ranging Sensor Networks Based on Monte Carlo
Institute of Scientific and Technical Information of China (English)
2013-01-01
To address application defects of existing Monte Carlo localization, a Monte Carlo localization algorithm based on ZigBee sensor-network ranging is proposed. The implementation steps of the improved algorithm are introduced: the method acquires multiple pieces of external information during localization and applies the historical information of localization samples to the position estimation. Building on the characteristics of ZigBee indoor ranging, an improved Gaussian filter algorithm is added and compared with mean filtering. Test results show that the algorithm achieves a clear improvement over the original algorithm.
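The core of any range-based Monte Carlo localization is the weight-and-resample step that the abstract builds on. A minimal sketch, with assumed anchor positions, an assumed Gaussian ranging-noise model, and a single measurement update (the paper's contributions, such as sample-history reuse and the improved Gaussian filter, are not reproduced here):

```python
import numpy as np

def mcl_range_update(particles, anchors, ranges, sigma=0.3, rng=None):
    """One measurement update of range-based Monte Carlo localization: weight each
    particle by a Gaussian likelihood of the measured anchor distances, normalize,
    and resample so particles concentrate on high-likelihood positions."""
    rng = np.random.default_rng(rng)
    dists = np.linalg.norm(particles[:, None, :] - anchors[None, :, :], axis=2)
    log_w = -0.5 * np.sum(((dists - ranges) / sigma) ** 2, axis=1)
    weights = np.exp(log_w - log_w.max())       # subtract max for stability
    weights /= weights.sum()
    resampled = particles[rng.choice(len(particles), size=len(particles), p=weights)]
    return resampled, weights

rng = np.random.default_rng(0)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])   # assumed anchors [m]
true_pos = np.array([6.0, 4.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1) + rng.normal(0.0, 0.1, size=3)
particles = rng.uniform(0.0, 10.0, size=(20000, 2))          # uniform prior over room
particles, weights = mcl_range_update(particles, anchors, ranges, rng=rng)
estimate = particles.mean(axis=0)
```

After one update from three noisy ranges, the particle cloud collapses from the uniform prior to a small region around the true position, and the particle mean is a usable position estimate.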
MaGe - a Geant4-based Monte Carlo framework for low-background experiments
Chan, Yuen-Dat; Henning, Reyco; Gehman, Victor M; Johnson, Rob A; Jordan, David V; Kazkaz, Kareem; Knapp, Markus; Kroninger, Kevin; Lenz, Daniel; Liu, Jing; Liu, Xiang; Marino, Michael G; Mokhtarani, Akbar; Pandola, Luciano; Schubert, Alexis G; Tomei, Claudia
2008-01-01
A Monte Carlo framework, MaGe, has been developed based on the Geant4 simulation toolkit. Its purpose is to simulate physics processes in low-energy and low-background radiation detectors, specifically for the Majorana and Gerda $^{76}$Ge neutrinoless double-beta decay experiments. This jointly-developed tool is also used to verify the simulation of physics processes relevant to other low-background experiments in Geant4. The MaGe framework contains simulations of prototype experiments and test stands, and is easily extended to incorporate new geometries and configurations while still using the same verified physics processes, tunings, and code framework. This reduces duplication of efforts and improves the robustness of and confidence in the simulation output.
Auxiliary-field based trial wave functions in quantum Monte Carlo simulations
Chang, Chia-Chen; Rubenstein, Brenda; Morales, Miguel
We propose a simple scheme for generating correlated multi-determinant trial wave functions for quantum Monte Carlo algorithms. The method is based on the Hubbard-Stratonovich transformation, which decouples a two-body Jastrow-type correlator into one-body projectors coupled to auxiliary fields. We apply the technique to generate stochastic representations of the Gutzwiller wave function, and present benchmark results for the ground state energy of the Hubbard model in one dimension. Extensions of the proposed scheme to chemical systems will also be discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, 15-ERD-013.
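The decoupling referred to above rests on the standard Gaussian-integral (Hubbard-Stratonovich) identity, stated here in its simplest form for an operator $\hat{A}$ appearing quadratically in the Jastrow exponent:

```latex
e^{\hat{A}^{2}/2}
  \;=\; \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty}
        e^{-\phi^{2}/2 \,+\, \phi \hat{A}} \, d\phi .
```

Applied term by term, it trades the two-body correlator for an integral over auxiliary fields $\phi$ of exponentials that are linear in one-body operators; these map determinants to determinants and can therefore be sampled stochastically.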
Calculation and analysis of heat source of PWR assemblies based on Monte Carlo method
International Nuclear Information System (INIS)
When fission occurs in the nuclear fuel of a reactor core, copious neutron and γ radiation is released, which deposits energy in the fuel components and gives rise to effects such as thermal stress and radiation damage that influence the safe operation of a reactor. Using the three-dimensional Monte Carlo transport calculation program MCNP and a continuous cross-section database based on the ENDF/B series, the heat rates of the heat sources in reference assemblies of a PWR loaded in an 18-month short refueling cycle mode were calculated, yielding precise values for the control rod, the thimble plug and the new Gd-bearing burnable poison rod, so as to provide a basis for reactor design and safety verification. (authors)
Monte Carlo-based Noise Compensation in Coil Intensity Corrected Endorectal MRI
Lui, Dorothy; Haider, Masoom; Wong, Alexander
2015-01-01
Background: Prostate cancer is one of the most common forms of cancer found in males, making early diagnosis important. Magnetic resonance imaging (MRI) has been useful in visualizing and localizing tumor candidates, and with the use of endorectal coils (ERC), the signal-to-noise ratio (SNR) can be improved. The coils introduce intensity inhomogeneities, and the surface coil intensity correction built into MRI scanners is used to reduce these inhomogeneities. However, the correction typically performed at the MRI scanner level leads to noise amplification and noise level variations. Methods: In this study, we introduce a new Monte Carlo-based noise compensation approach for coil intensity corrected endorectal MRI which allows for effective noise compensation and preservation of details within the prostate. The approach accounts for the ERC SNR profile via a spatially-adaptive noise model for correcting non-stationary noise variations. Such a method is particularly useful for improving the image quality of coil intensity corrected endorectal MRI.
GPU-based fast Monte Carlo simulation for radiotherapy dose calculation
Jia, Xun; Graves, Yan Jiang; Folkerts, Michael; Jiang, Steve B
2011-01-01
Monte Carlo (MC) simulation is commonly considered to be the most accurate dose calculation method in radiotherapy. However, its efficiency still requires improvement for many routine clinical applications. In this paper, we present our recent progress towards the development of a GPU-based MC dose calculation package, gDPM v2.0. It utilizes the parallel computation ability of a GPU to achieve high efficiency, while maintaining the same particle transport physics as in the original DPM code and hence the same level of simulation accuracy. In GPU computing, divergence of execution paths between threads can considerably reduce the efficiency. Since photons and electrons undergo different physics and hence attain different execution paths, we use a simulation scheme where photon transport and electron transport are separated to partially relieve the thread divergence issue. A high-performance random number generator and hardware linear interpolation are also utilized. We have also developed various components to hand...
Proton radiography to improve proton therapy treatment
Takatsu, J.; van der Graaf, E. R.; Van Goethem, M. -J.; van Beuzekom, M.; Klaver, T.; Visser, J.; Brandenburg, S.; Biegun, A. K.
2016-01-01
The quality of cancer treatment with protons critically depends on an accurate prediction of the proton stopping powers for the tissues traversed by the protons. Today, treatment planning in proton radiotherapy is based on stopping power calculations from densities of X-ray Computed Tomography (CT)
Theoretical study of the influence of ribose on the proton transfer phenomenon of nucleic acid bases
International Nuclear Information System (INIS)
The first comprehensive theoretical study of the effects of ribose on the proton transfer behavior of nucleic acid bases is presented. The specific hydrogen bonding of the ribose hydroxyls plays a very important role in the stabilization of the ribonucleoside structure. Nine stable uridine conformations have been reported. The intermolecular proton transfer of the isolated and monohydrated uridine complexes in three different regions was extensively explored on the basis of density functional theory at the B3LYP/6-31+G* level. With the introduction of the ribose, not only do the structural parameters of the nucleic acid bases change, but the energy barriers of the proton transfer process change as well. Furthermore, changes in the electron distributions of the molecular orbitals of the nucleic acid bases were also analyzed by NBO analysis. Consideration of the influence of ribose represents a much more realistic description of the situation in RNA.
International Nuclear Information System (INIS)
Internal radiation dose calculation based on Chinese models is important in nuclear medicine. Most existing models are based on the physical and anatomical data of Caucasians, whose anatomical structure and physiological parameters differ considerably from those of the Chinese population, which may significantly affect internal dose estimates. Therefore, it is necessary to establish a model based on Chinese ethnic characteristics and apply it to radiation dosimetry calculations. In this study, a voxel model was established based on the high-resolution Visible Chinese Human (VCH). The transport of photons and electrons was simulated using the MCNPX Monte Carlo code. Absorbed fractions (AF) and specific absorbed fractions (SAF) were calculated, and S-factors and mean absorbed doses for organs with 99mTc located in the liver were also obtained. In comparison with the VIP-Man and MIRD models, discrepancies were found to be correlated with racial and anatomical differences in organ mass and inter-organ distance. Internal dosimetry data based on other models, previously applied to the Chinese adult population, can thus be replaced with Chinese-specific data. The obtained results provide a reference for nuclear medicine, such as dose verification after surgery and potential radiation evaluation for radionuclides in preclinical research, etc. (authors)
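The S-factor mentioned above follows the MIRD scheme: once the Monte Carlo transport has produced absorbed fractions, the S value for a source-target organ pair is an energy-weighted sum divided by the target mass. A minimal sketch, with illustrative (not evaluated) emission data and a hypothetical absorbed fraction:

```python
MEV_TO_J = 1.602176634e-13  # joules per MeV

def s_value(emissions, absorbed_fractions, target_mass_kg):
    """MIRD-style S factor (Gy per decay) for one source-target organ pair:
    S = sum_i n_i * E_i * AF_i / m_target.
    `emissions` is a list of (yield per decay n_i, energy E_i in MeV) pairs;
    `absorbed_fractions` holds the matching AF_i from the Monte Carlo transport."""
    energy_mev = sum(n * e * af for (n, e), af in zip(emissions, absorbed_fractions))
    return energy_mev * MEV_TO_J / target_mass_kg

# Illustrative 99mTc-like photon emission and a hypothetical liver self-AF:
emissions = [(0.89, 0.1405)]               # ~140.5 keV gamma, approximate yield
af_self = [0.16]                           # hypothetical AF(liver <- liver)
s_liver = s_value(emissions, af_self, 1.8) # assumed 1.8 kg liver mass
```

Real calculations sum over the full decay scheme (including electrons, whose energy the abstract's model transports explicitly) and use organ masses from the voxel phantom rather than assumed values.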
A CAD based automatic modeling method for primitive solid based Monte Carlo calculation geometry
International Nuclear Information System (INIS)
The Multi-Physics Coupling Analysis Modeling Program (MCAM), developed by the FDS Team, China, is an advanced modeling tool aiming to solve the modeling challenges of multi-physics coupling simulation. An automatic modeling method for SuperMC, the Super Monte Carlo Calculation Program for Nuclear and Radiation Processes, was recently developed and integrated in MCAM 5.2. This method can convert bidirectionally between a CAD model and a SuperMC input file. When converting from a CAD model to a SuperMC model, the CAD model is decomposed into a set of convex solids, and the corresponding SuperMC convex basic solids are then generated and output. When converting from a SuperMC model back to a CAD model, the basic primitive solids are created and the related operations are applied according to the SuperMC model. The method was benchmarked with the ITER benchmark model. The results showed that the method is correct and effective. (author)
2012-01-01
Proton Therapy Physics goes beyond current books on proton therapy to provide an in-depth overview of the physics aspects of this radiation therapy modality, eliminating the need to dig through information scattered in the medical physics literature. After tracing the history of proton therapy, the book summarizes the atomic and nuclear physics background necessary for understanding proton interactions with tissue. It describes the physics of proton accelerators, the parameters of clinical proton beams, and the mechanisms to generate a conformal dose distribution in a patient. The text then covers detector systems and measuring techniques for reference dosimetry, outlines basic quality assurance and commissioning guidelines, and gives examples of Monte Carlo simulations in proton therapy. The book moves on to discussions of treatment planning for single- and multiple-field uniform doses, dose calculation concepts and algorithms, and precision and uncertainties for nonmoving and moving targets. It also exami...
The models of proton assisted and the unassisted formation of CGC base triplets.
Medhi, Chitrani
2002-01-01
The triple helix is formed by combining double-stranded and single-stranded DNA at low pH and dissociates at high pH. Under such conditions, protonation of cytosine in the single strand is necessary for triplex formation, where the cytosine-guanine-cytosine (CGC+) base triplet stabilizes the triple helix. The mechanism of CGC+ triplet formation from guanine-cytosine (GC) and a protonated cytosine (C+) shows the importance of the N3 proton. Similarly, in the case of the unprotonated CGC triplet, the donor-acceptor H-bond at the N3 hydrogen of the cytosine analog (C) initiates the interaction with GC. The correspondence between the two triplet models, CGC+ and CGC, unambiguously indicates that protonation at the N3 of cytosine at low pH is the first step in triplet formation, although a donor-acceptor triplet (CGC) can be designed without involving a proton in the Hoogsteen H-bond. Further, the cytosine analogue bases also show the capability of forming Watson-Crick (WC) H-bonds with guanine.
Tahat, Amani; Martí Rabassa, Jordi; Khwaldeh, Ali; Tahat, Kaher
2014-01-01
In computational physics, proton transfer phenomena can be viewed as pattern classification problems based on a set of input features allowing the proton motion to be classified into two categories: transfer 'occurred' and transfer 'not occurred'. The goal of this paper is to evaluate the use of artificial neural networks in the classification of proton transfer events, based on the feed-forward back-propagation neural network, used as a classifier to distinguish between the two transfer cases. In t...
International Nuclear Information System (INIS)
We present a new Monte Carlo method based upon the theoretical proposal of Claverie and Soto. By contrast with other Quantum Monte Carlo methods used so far, the present approach uses a pure diffusion process without any branching. The many-fermion problem (with the specific constraint due to the Pauli principle) receives a natural solution in the framework of this method: in particular, there is neither the fixed-node approximation nor the nodal release problem which occur in other approaches (see, e.g., Ref. 8 for a recent account). We give some numerical results concerning simple systems in order to illustrate the numerical feasibility of the proposed algorithm.
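The "pure diffusion without branching" idea can be sketched on a toy system. Below, walkers follow an importance-sampled drifted diffusion for the 1-D harmonic oscillator (hbar = m = omega = 1) and carry weights that accumulate the local energy, instead of branching the population. The trial function is chosen exact, so the local energy is constant and the weighted estimator must return the exact ground-state energy 1/2 with zero variance; with an approximate trial function the local energy, and hence the weights, would fluctuate. This is a pedagogical sanity check, not the algorithm of the paper.

```python
import math
import random

def pure_diffusion_energy(n_walkers=200, n_steps=400, dt=0.01, seed=1):
    """Pure-diffusion (no-branching) estimate of the 1-D harmonic oscillator
    ground-state energy, importance-sampled with psi_T(x) = exp(-x^2/2).
    Drift = d(ln psi_T)/dx = -x; the local energy E_L is exactly 1/2 here."""
    rng = random.Random(seed)
    energy_sum, weight_sum = 0.0, 0.0
    for _ in range(n_walkers):
        x, log_w = rng.gauss(0.0, 1.0), 0.0
        for _ in range(n_steps):
            e_local = 0.5                        # exact for this trial function
            log_w += -dt * (e_local - 0.5)       # weight vs. reference E_ref = 0.5
            x += -x * dt + math.sqrt(dt) * rng.gauss(0.0, 1.0)  # drift + diffusion
            w = math.exp(log_w)
            energy_sum += w * e_local
            weight_sum += w
    return energy_sum / weight_sum

estimate = pure_diffusion_energy()
```

The key design point mirrored from the abstract is that the walker population is fixed: fluctuations that branching algorithms absorb by replicating and killing walkers are here carried entirely by the multiplicative weights.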
Development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport
Jia, Xun; Sempau, Josep; Choi, Dongju; Majumdar, Amitava; Jiang, Steve B
2009-01-01
Monte Carlo simulation is the most accurate method for absorbed dose calculations in radiotherapy. Its efficiency still requires improvement for routine clinical applications, especially for online adaptive radiotherapy. In this paper, we report our recent development of a GPU-based Monte Carlo dose calculation code for coupled electron-photon transport. We have implemented the Dose Planning Method (DPM) Monte Carlo dose calculation package (Sempau et al, Phys. Med. Biol., 45 (2000) 2263-2291) on GPU architecture under the CUDA platform. The implementation has been tested against the original sequential DPM code on CPU in two test cases. Our results demonstrate adequate accuracy of the GPU implementation for both electron and photon beams in the radiotherapy energy range. Speed-up factors of 4.5 and 5.5 were observed for the electron and photon test cases, respectively, using an NVIDIA Tesla C1060 GPU card against a 2.27 GHz Intel Xeon CPU.
International Nuclear Information System (INIS)
The response of scintillation counters in an ionization calorimeter to incident protons of momenta 3.5 to 200 GeV/c was simulated using the CALOR computer code system. Results of the simulation are compared with data taken at Brookhaven National Laboratory for 50-, 100-, and 278-GeV hadrons. Mechanisms which produce large pulse heights for low-energy incident particles are discussed. 14 references, 7 figures
International Nuclear Information System (INIS)
Electronic systems in space and terrestrial environments are subjected to a flow of particles of natural origin, which can induce malfunctions. These particles can cause Single Event Upsets (SEU) in SRAM memories. Although non-destructive, SEUs can have consequences for equipment operating in applications requiring great reliability (aircraft, satellites, launchers, medical devices, etc.). Thus, an evaluation of the sensitivity of the component technology is necessary to predict the reliability of a system. In the atmospheric environment, SEU sensitivity is mainly caused by the secondary ions resulting from nuclear reactions between neutrons and the atoms of the component. In the space environment, high-energy protons induce the same effects as atmospheric neutrons. In our work, a new SEU rate prediction code (MC-DASIE) has been developed in order to quantify the sensitivity for a given environment and to explore the failure mechanisms according to technology. This code makes it possible to study various SRAM memory technologies (bulk and SOI) in neutron and proton environments between 1 MeV and 1 GeV. Thus, MC-DASIE was used with experimental data to study the effect of integration on the sensitivity of memories in the terrestrial environment, to compare neutron and proton irradiations, and to examine the influence of the modeling of the target component on the calculation of the SEU rate. (author)
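The rate-prediction idea behind codes like MC-DASIE can be reduced to folding a per-bit upset cross-section with the particle flux of the environment. The sketch below is a generic illustration, not MC-DASIE itself: the Weibull cross-section parameters and the exponential flux spectrum are invented for the example.

```python
import numpy as np

# Hypothetical Weibull fit of a measured SEU cross-section (cm^2/bit)
# versus proton energy (MeV); all parameters are illustrative only.
def sigma_seu(E, sigma_sat=1e-14, E0=20.0, W=30.0, s=1.5):
    E = np.asarray(E, dtype=float)
    out = np.zeros_like(E)
    m = E > E0  # no upsets below the threshold energy E0
    out[m] = sigma_sat * (1.0 - np.exp(-((E[m] - E0) / W) ** s))
    return out

# Illustrative differential proton flux (protons/cm^2/s/MeV)
def flux(E):
    return 1e3 * np.exp(-np.asarray(E, dtype=float) / 50.0)

# SEU rate per bit = integral of sigma(E) * phi(E) dE (rectangle rule)
E = np.linspace(1.0, 1000.0, 5000)
rate_per_bit = float(np.sum(sigma_seu(E) * flux(E)) * (E[1] - E[0]))
print(rate_per_bit)  # upsets per bit per second
```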
Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT
Directory of Open Access Journals (Sweden)
Weinmann Martin
2009-12-01
Background: The purpose of the present study is to compare finite-size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). Materials and methods: A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. They were then applied to static lung IMSRT planning based on three different geometrical patient models (one-phase static CT, density-overwrite one-phase static CT, and average CT of the same patient). Both 6 and 15 MV beam energies were used. The resulting treatment plans were compared by how well they fulfilled the prescribed optimization constraints, both for the dose distributions calculated on the static patient models and for the accumulated dose recalculated with MC on each of 8 CTs of a 4DCT set. Results: In the phantom measurements, the MC dose engine showed discrepancies. Conclusions: It is feasible to employ the MC dose engine for optimization of lung IMSRT, and the plans are superior to fsPB. Use of static patient models introduces a bias in the MC dose distribution compared to the 4D MC recalculated dose, but this bias is predictable, and therefore MC-based optimization on static patient models is considered safe.
A new method for RGB to CIELAB color space transformation based on Markov chain Monte Carlo
Chen, Yajun; Liu, Ding; Liang, Junli
2013-10-01
During printing quality inspection, the inspection of color error is an important task. However, the RGB color space is device-dependent; RGB colors captured by a CCD camera usually must be transformed into the CIELAB color space, which is perceptually uniform and device-independent. To cope with this problem, a Markov chain Monte Carlo (MCMC) based algorithm for the RGB to CIELAB color space transformation is proposed in this paper. First, modeling color targets and testing color targets are established, used in the modeling and performance testing processes, respectively. Second, we derive a Bayesian model for estimating the coefficients of a polynomial, which can be used to describe the relation between the RGB and CIELAB color spaces. Third, a Markov chain is set up based on the Gibbs sampling algorithm (one of the MCMC algorithms) to estimate the coefficients of the polynomial. Finally, the color difference of the testing color targets is computed to evaluate the performance of the proposed method. The experimental results showed that the nonlinear polynomial regression based on the MCMC algorithm is effective; its performance is similar to the least-squares approach, and it can accurately model the RGB to CIELAB color space conversion and guarantee color error evaluation for a printing quality inspection system.
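The Gibbs-sampling step can be illustrated with a minimal Bayesian polynomial regression on synthetic 1-D data. This is a sketch only: the paper's actual RGB-to-CIELAB targets and polynomial are not reproduced, and the priors, dimensions, and noise level here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the colour mapping: y = c0 + c1*x + c2*x^2 + noise
true_c = np.array([1.0, -2.0, 0.5])
x = rng.uniform(-3, 3, 200)
X = np.vander(x, 3, increasing=True)          # design matrix [1, x, x^2]
y = X @ true_c + rng.normal(0, 0.3, x.size)

# Gibbs sampling with conjugate priors: c ~ N(0, tau2*I), sigma2 ~ IG(a, b)
tau2, a, b = 100.0, 2.0, 1.0
sigma2 = 1.0
samples = []
XtX, Xty = X.T @ X, X.T @ y
for it in range(2000):
    # Draw c | sigma2, y  ~  N(mu, V)
    V = np.linalg.inv(XtX / sigma2 + np.eye(3) / tau2)
    mu = V @ (Xty / sigma2)
    c = rng.multivariate_normal(mu, V)
    # Draw sigma2 | c, y  ~  Inverse-Gamma (sampled via the precision)
    resid = y - X @ c
    sigma2 = 1.0 / rng.gamma(a + x.size / 2, 1.0 / (b + 0.5 * resid @ resid))
    if it >= 500:                             # discard burn-in
        samples.append(c)

c_hat = np.mean(samples, axis=0)
print(c_hat)  # posterior mean of the polynomial coefficients
```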
Energy Technology Data Exchange (ETDEWEB)
Park, Peter C. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Schreibmann, Eduard; Roper, Justin; Elder, Eric; Crocker, Ian [Department of Radiation Oncology, Winship Cancer Institute of Emory University, Atlanta, Georgia (United States); Fox, Tim [Varian Medical Systems, Palo Alto, California (United States); Zhu, X. Ronald [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Dong, Lei [Scripps Proton Therapy Center, San Diego, California (United States); Dhabaan, Anees, E-mail: anees.dhabaan@emory.edu [Department of Radiation Oncology, Winship Cancer Institute of Emory University, Atlanta, Georgia (United States)
2015-03-15
Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU number) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensity of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On respective simulated brain and head and neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.
Ultrafast cone-beam CT scatter correction with GPU-based Monte Carlo simulation
Directory of Open Access Journals (Sweden)
Yuan Xu
2014-03-01
Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to automatically finish the whole process, including both scatter correction and reconstruction, within 30 seconds. Methods: The method consists of six steps: (1) FDK reconstruction using raw projection data; (2) rigid registration of the planning CT to the FDK results; (3) MC scatter calculation at sparse view angles using the planning CT; (4) interpolation of the calculated scatter signals to other angles; (5) removal of scatter from the raw projections; (6) FDK reconstruction using the scatter-corrected projections. In addition to using the GPU to accelerate MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce computation time. A novel denoising algorithm is used to eliminate MC noise from the simulated scatter images caused by low photon numbers. The method is validated on a simulated head-and-neck case with 364 projection angles. Results: We have examined the variation of the scatter signal among projection angles using Fourier analysis. It is found that scatter images at 31 angles are sufficient to restore those at all angles with < 0.1% error. For the simulated patient case with a resolution of 512 × 512 × 100, we simulated 5 × 10^6 photons per angle. The total computation time is 20.52 seconds on an Nvidia GTX Titan GPU, and the time at each step is 2.53, 0.64, 14.78, 0.13, 0.19, and 2.25 seconds, respectively. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. It accomplishes the whole procedure of scatter correction and reconstruction within 30 seconds.
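Step 4 above, interpolating scatter computed by MC at sparse view angles to all projection angles, can be sketched as follows. The smooth per-angle scatter magnitude is invented for the example; the real framework interpolates whole 2-D scatter images, not scalars.

```python
import numpy as np

n_angles = 364
angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)

# Illustrative smooth scatter magnitude per projection angle
scatter = 100 + 20 * np.cos(angles) + 5 * np.sin(2 * angles)

# MC simulation only at 31 sparse angles; interpolate to the rest
sparse_idx = np.round(np.linspace(0, n_angles, 31, endpoint=False)).astype(int)
sp_ang = angles[sparse_idx]
sp_val = scatter[sparse_idx]

# Periodic linear interpolation: pad one wrapped-around sample on each side
pad_ang = np.concatenate([[sp_ang[-1] - 2 * np.pi], sp_ang, [sp_ang[0] + 2 * np.pi]])
pad_val = np.concatenate([[sp_val[-1]], sp_val, [sp_val[0]]])
estimate = np.interp(angles, pad_ang, pad_val)

rel_err = np.max(np.abs(estimate - scatter) / scatter)
print(rel_err)  # small for a smoothly varying angular signal
```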
Radiosurgery with photons or protons for benign and malignant tumours of the skull base: a review
Directory of Open Access Journals (Sweden)
Amichetti Maurizio
2012-12-01
Stereotactic radiosurgery (SRS) is an important treatment option for intracranial lesions. Many studies have shown the effectiveness of photon-SRS for the treatment of skull base (SB) tumours; however, limited data are available for proton-SRS. Several photon-SRS techniques, including Gamma Knife, modified linear accelerators (Linac), and CyberKnife, have been developed, and several studies have compared treatment plan characteristics between protons and photons. The principles of classical radiobiology are similar for protons and photons even though they differ in terms of physical properties and interaction with matter, resulting in different dose distributions. Protons have special characteristics that allow normal tissues to be spared better than with the use of photons, although their potential clinical superiority remains to be demonstrated. A critical analysis of the fundamental radiobiological principles, dosimetric characteristics, clinical results, and toxicity of proton- and photon-SRS for SB tumours is provided and discussed in an attempt to define the advantages and limits of each radiosurgical technique.
Ulmer, W.; Schaffner, B.
2011-03-01
We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure of the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei, as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher-energy depth dose curves, is modeled by including two different origins: (1) secondary reaction protons, contributing ca. 65% of the buildup (for monoenergetic protons); (2) Landau tails as well as Gaussian-type fluctuations for range straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked by them. Furthermore, there are a few parameters which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline, the most important being the initial energy spread. We find that the free parameters of the depth dose model can be predicted for any intermediate energy from a couple of measured curves.
A research plan based on high intensity proton accelerator Neutron Science Research Center
Energy Technology Data Exchange (ETDEWEB)
Mizumoto, Motoharu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment
1997-03-01
A plan called the Neutron Science Research Center (NSRC) has been proposed in JAERI. The center is a complex composed of research facilities based on a proton linac with an energy of 1.5 GeV and an average current of 10 mA. The research facilities will consist of a Thermal/Cold Neutron Facility, Neutron Irradiation Facility, Neutron Physics Facility, OMEGA/Nuclear Energy Facility, Spallation RI Beam Facility, Meson/Muon Facility and Medium Energy Experiment Facility, where the high-intensity proton beam and secondary particle beams such as neutron, pion, muon and unstable radioisotope (RI) beams generated from the proton beam will be utilized for innovative research in the fields of nuclear engineering and basic science. (author)
DEFF Research Database (Denmark)
Cleemann, Lars Nilausen; Buazar, F.; Li, Qingfeng;
2013-01-01
Degradation of carbon-supported platinum catalysts is a major failure mode for the long-term durability of high-temperature proton exchange membrane fuel cells based on phosphoric acid-doped polybenzimidazole membranes. With Vulcan carbon black as a reference, thermally treated carbon black...
Development of an unstructured mesh based geometry model in the Serpent 2 Monte Carlo code
International Nuclear Information System (INIS)
This paper presents a new unstructured mesh based geometry type, developed in the Serpent 2 Monte Carlo code as a by-product of another study related to multi-physics applications and coupling to CFD codes. The new geometry type is intended for the modeling of complicated and irregular objects, which are not easily constructed using the conventional CSG-based approach. The capability is put to the test by modeling the 'Stanford Critical Bunny' – a variation of a well-known 3D test case for methods used in the world of computer graphics. The results show that the geometry routine in Serpent 2 can handle the unstructured mesh, and that the use of delta-tracking results in a considerable reduction in the overall calculation time as the geometry is refined. The methodology is still very much under development, with the final goal of implementing a geometry routine capable of reading standardized geometry formats used by 3D design and imaging tools in industry and medical physics. (author)
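A basic building block of tracking on an unstructured tetrahedral mesh is the point-in-cell query. A generic barycentric-coordinate version (an illustration, not Serpent 2's actual routine) can be sketched as:

```python
import numpy as np

def point_in_tet(p, v0, v1, v2, v3, eps=1e-12):
    """Return True if point p lies inside the tetrahedron (v0..v3),
    using barycentric coordinates from a 3x3 linear solve."""
    T = np.column_stack([v1 - v0, v2 - v0, v3 - v0])
    try:
        b = np.linalg.solve(T, p - v0)
    except np.linalg.LinAlgError:   # degenerate (flat) tetrahedron
        return False
    # Inside iff all barycentric weights are non-negative and sum <= 1
    return bool(np.all(b >= -eps) and b.sum() <= 1.0 + eps)

# Unit tetrahedron
v = [np.array([0., 0., 0.]), np.array([1., 0., 0.]),
     np.array([0., 1., 0.]), np.array([0., 0., 1.])]
print(point_in_tet(np.array([0.1, 0.2, 0.3]), *v))  # True
print(point_in_tet(np.array([0.5, 0.5, 0.5]), *v))  # False
```

A mesh-based geometry routine runs this test (or an accelerated equivalent) to find which cell, and hence which material, a sampled collision point belongs to.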
Energy Technology Data Exchange (ETDEWEB)
Baba, Justin S [ORNL; John, Dwayne O [ORNL; Koju, Vijay [ORNL
2015-01-01
The propagation of light in turbid media is an active area of research with relevance to numerous investigational fields, e.g., biomedical diagnostics and therapeutics. The statistical random-walk nature of photon propagation through turbid media is ideal for computation-based modeling and simulation. Ready access to supercomputing resources provides a means for attaining brute-force solutions to stochastic light-matter interactions entailing scattering, by facilitating timely propagation of a sufficient number (>10 million) of photons while tracking characteristic parameters based on the incorporated physics of the problem. One such model that works well for isotropic scatter but fails for anisotropic scatter, which is the case for many biomedical sample scattering problems, is the diffusion approximation. In this report, we address this by utilizing Berry phase (BP) evolution as a means of capturing the anisotropic scattering characteristics of samples in the preceding depth where the diffusion approximation fails. We extend the polarization-sensitive Monte Carlo method of Ramella-Roman et al.1 to include the computationally intensive tracking of photon trajectory in addition to polarization state at every scattering event. To speed up the computations, which entail the appropriate rotations of reference frames, the code was parallelized using OpenMP. The results presented reveal that BP is strongly correlated with the photon penetration depth, thus potentiating the possibility of polarimetric depth-resolved characterization of highly scattering samples, e.g., biological tissues.
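Anisotropic scattering in turbid-media Monte Carlo is commonly sampled from the Henyey-Greenstein phase function (the abstract does not name the phase function used, so treating it as Henyey-Greenstein is an assumption). A minimal sampler of the scattering-angle cosine:

```python
import numpy as np

def sample_hg_costheta(g, n, rng):
    """Sample n scattering-angle cosines from the Henyey-Greenstein
    phase function with anisotropy factor g (g = 0 -> isotropic)."""
    u = rng.random(n)
    if abs(g) < 1e-6:
        return 2.0 * u - 1.0                  # isotropic limit
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

rng = np.random.default_rng(1)
g = 0.9                       # strongly forward-peaked, typical for tissue
mu = sample_hg_costheta(g, 200_000, rng)
print(mu.mean())              # the mean cosine equals g by construction
```

For g near 0.9, most photons scatter nearly forward, which is exactly the regime where the diffusion approximation breaks down at shallow depths.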
Saha, Krishnendu; Straus, Kenneth J; Chen, Yu; Glick, Stephen J
2014-08-28
To maximize sensitivity, it is desirable that ring positron emission tomography (PET) systems dedicated to imaging the breast have a small bore. Unfortunately, due to parallax error, this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation toward the periphery of the breast. The GATE Monte Carlo simulation software was utilized to accurately model the system matrix for a breast PET system. To increase the count statistics in the system matrix computation and to reduce the system-element storage space, only a subset of matrix elements was calculated, and the remaining elements were estimated by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at a 45% reduced noise level and a 1.5- to 3-fold improvement in resolution performance when compared to MLEM reconstruction using a simple line-integral model. The GATE-based system matrix reconstruction technique promises to improve resolution and noise performance and to reduce image distortion at the FOV periphery compared to line-integral-based system matrix reconstruction.
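The storage saving from the block-circulant structure can be illustrated in the scalar case: only the first row of a circulant matrix needs to be stored, and every matrix-vector product can be formed from cyclic shifts. A toy sketch (random values, not a real PET system matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 16                               # angular bins around the ring

# Only the first row of the circulant system matrix is stored;
# every other row is a cyclic shift of it.
first_row = rng.random(n)

def circulant_matvec(first_row, x):
    """y = C @ x where C[i, j] = first_row[(j - i) % n]."""
    n = first_row.size
    return np.array([first_row @ np.roll(x, -i) for i in range(n)])

# Check against the explicitly built dense matrix
C = np.array([np.roll(first_row, i) for i in range(n)])
x = rng.random(n)
print(np.allclose(circulant_matvec(first_row, x), C @ x))  # True
```

In the real block-circulant case each entry of `first_row` is itself a sub-matrix, so an n-fold rotational symmetry cuts storage by a factor of n; FFT-based products offer a further speedup.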
Eskers in a complete, wet-based glacial system in the Phlegra Montes region, Mars
Gallagher, Colman; Balme, Matthew
2015-12-01
Although glacial landsystems produced under warm/wet-based conditions are very common on Earth, even here observations of subglacial landforms such as eskers emerging from extant glaciers are rare. This paper describes a system of sinuous ridges emerging from the in situ but now degraded piedmont terminus of a Late Amazonian-aged (∼150 Ma) glacier-like form in the southern Phlegra Montes region of Mars. We believe this to be the first identification of martian eskers that can be directly linked to their parent glacier. Together with their contextual landform assemblage, the eskers are indicative of significant glacial meltwater production and subglacial routing. However, although the eskers are evidence of a wet-based regime, the confinement of the glacial system to a well-defined, regionally significant graben, and the absence of eskers elsewhere in the region, is interpreted as evidence of sub-glacial melting as a response to locally enhanced geothermal heat flux rather than climate-induced warming. These observations offer important new insights into the forcing of glacial dynamics and melting behaviour on Mars by factors other than climate.
The development of GPU-based parallel PRNG for Monte Carlo applications in CUDA Fortran
Directory of Open Access Journals (Sweden)
Hamed Kargaran
2016-04-01
The implementation of Monte Carlo simulation in CUDA Fortran requires fast random number generation with good statistical properties on the GPU. In this study, a GPU-based parallel pseudo random number generator (GPPRNG) has been proposed for use in high performance computing systems. According to the type of GPU memory usage, the GPU scheme is divided into two work modes, GLOBAL_MODE and SHARED_MODE. To generate parallel random numbers based on the independent sequence method, the combination of the middle-square method and a chaotic map, along with the Xorshift PRNG, have been employed. Implementation of our developed PPRNG on a single GPU showed a speedup of 150x and 470x (with respect to the speed of a PRNG on a single CPU core) for GLOBAL_MODE and SHARED_MODE, respectively. To evaluate the accuracy of our developed GPPRNG, its performance was compared to that of some other commercially available PPRNGs such as MATLAB, FORTRAN and the Miller-Park algorithm through employing the specific standard tests. The results of this comparison showed that the GPPRNG developed in this study can be used as a fast and accurate tool for computational science applications.
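The Xorshift component can be sketched as a 32-bit generator. This is a Python illustration of Marsaglia's standard 13/17/5 shift triple, not the authors' CUDA Fortran code; the middle-square and chaotic-map stages of their combined generator are omitted.

```python
MASK32 = 0xFFFFFFFF

def xorshift32(state):
    """One step of Marsaglia's 32-bit xorshift; returns the new state."""
    state ^= (state << 13) & MASK32
    state ^= state >> 17
    state ^= (state << 5) & MASK32
    return state & MASK32

def uniforms(seed, n):
    """Generate n floats in [0, 1) from successive xorshift32 states."""
    out, s = [], seed & MASK32 or 1   # the state must be nonzero
    for _ in range(n):
        s = xorshift32(s)
        out.append(s / 2**32)
    return out

u = uniforms(123456789, 100_000)
print(sum(u) / len(u))   # close to 0.5 for a roughly uniform stream
```

The generator has period 2^32 - 1 over the nonzero states, so successive outputs never repeat within any realistic draw; for parallel streams, each GPU thread would be given an independent, well-separated seed.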
The development of GPU-based parallel PRNG for Monte Carlo applications in CUDA Fortran
Kargaran, Hamed; Minuchehr, Abdolhamid; Zolfaghari, Ahmad
2016-04-01
TH-E-BRE-08: GPU-Monte Carlo Based Fast IMRT Plan Optimization
Energy Technology Data Exchange (ETDEWEB)
Li, Y; Tian, Z; Shi, F; Jiang, S; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)
2014-06-15
Purpose: Intensity-modulated radiation treatment (IMRT) plan optimization needs pre-calculated beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computation speed. However, inaccurate beamlet dose distributions, particularly in cases with high levels of inhomogeneity, may mislead the optimization and hinder the resulting plan quality. It is desirable to use Monte Carlo (MC) methods for beamlet dose calculations, yet the long computational time from repeated dose calculations for a large number of beamlets prevents this application. Our objective is to integrate a GPU-based MC dose engine in lung IMRT optimization using a novel two-step workflow. Methods: A GPU-based MC code, gDPM, is used. Each particle is tagged with the index of the beamlet from which the source particle originates. Deposited doses are stored separately per beamlet based on this index. Due to limited GPU memory size, a pyramid space is allocated for each beamlet, and dose outside this space is neglected. A two-step optimization workflow is proposed for fast MC-based optimization. In the first step, rough beamlet dose calculations are conducted with only a small number of particles per beamlet. Plan optimization follows to obtain an approximate fluence map. In the second step, more accurate beamlet doses are calculated, where the number of particles sampled for a beamlet is proportional to the intensity determined previously. A second round of optimization is conducted, yielding the final result. Results: For a lung case with 5317 beamlets, 10^5 particles per beamlet in the first round and 10^8 particles per beam in the second round are enough to get a good plan quality. The total simulation time is 96.4 sec. Conclusion: A fast GPU-based MC dose calculation method along with a novel two-step optimization workflow has been developed. The high efficiency allows the use of MC for IMRT optimization.
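The second-round sampling rule, particle numbers proportional to the beamlet intensities found in the first round, can be sketched as a simple allocation of a fixed history budget. The function and numbers below are illustrative, not taken from gDPM.

```python
import numpy as np

def allocate_particles(fluence, total):
    """Distribute a total particle budget across beamlets in
    proportion to their first-round fluence intensities."""
    fluence = np.asarray(fluence, dtype=float)
    share = fluence / fluence.sum()
    counts = np.floor(share * total).astype(int)
    # Hand any leftover histories to the highest-intensity beamlets
    for i in np.argsort(share)[::-1][: total - counts.sum()]:
        counts[i] += 1
    return counts

fluence = np.array([0.0, 1.0, 3.0, 6.0])   # first-round beamlet intensities
counts = allocate_particles(fluence, total=1000)
print(counts, counts.sum())                # proportional split summing to 1000
```

Zero-intensity beamlets receive no histories, so the second-round MC effort concentrates on the beamlets that actually shape the final dose.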
Kuusk, Priit, 1938-
2001-01-01
In November, the Swiss city of Basel lives under the sign of the "European Month of Music". The young Norwegian pianist Leif Ove Andsnes was invited to perform in London. Competition prizes from various competitions. The American singer Monte Pederson has died.
Energy Technology Data Exchange (ETDEWEB)
Ramos-Mendez, J; Faddegon, B [University of California San Francisco, San Francisco, CA (United States); Paganetti, H [Massachusetts General Hospital, Boston, MA (United States)
2015-06-15
Purpose: We used TOPAS (TOPAS wraps and extends Geant4 for medical physicists) to compare Geant4 physics models with published data for neutron shielding calculations. Subsequently, we calculated the source terms and attenuation lengths (shielding data) of the total ambient dose equivalent (TADE) in concrete for neutrons produced by protons in brass. Methods: Stage 1: The Bertini and Binary nuclear models available in Geant4 were compared with published attenuation at depth of the TADE in concrete and iron. Stage 2: Shielding data of the TADE in concrete were calculated for 50–200 MeV proton beams on brass. Stage 3: Shielding data from Stage 2 were extrapolated to 235 MeV proton beams. These data were used in a point-line-source analytical model to calculate the ambient dose per unit therapeutic dose at two locations inside one treatment room at the Francis H Burr Proton Therapy Center. Finally, we compared these results with experimental data and full TOPAS simulations. Results: At larger angles (∼130°) the TADE in concrete calculated with the Bertini model was about 9 times larger than that calculated with the Binary model. The attenuation length in concrete calculated with the Binary model agreed with published data within 7%±0.4% (statistical uncertainty) for the deepest regions and 5%±0.1% for shallower regions. For iron the agreement was within 3%±0.1%. The ambient dose per therapeutic dose calculated with the Binary model, relative to the experimental data, was a ratio of 0.93±0.16 and 1.23±0.24 at the two locations. The analytical model overestimated the dose by four orders of magnitude; these differences are attributed to the complexity of the geometry. Conclusion: The Binary and Bertini models gave comparable results, with the Binary model giving the best agreement with published data at large angles. The shielding data we calculated using the Binary model are useful for fast shielding calculations with other analytical models. This work was supported by
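An attenuation length of the kind reported above is typically extracted by fitting an exponential decay H(d) = H0·exp(−d/λ) to dose equivalent versus shield depth. A log-linear least-squares sketch on synthetic data (all values invented, not the paper's measurements):

```python
import numpy as np

# Synthetic depth profile of ambient dose equivalent in concrete;
# lam_true plays the role of the attenuation length (in cm)
lam_true, H0 = 45.0, 2.0e-3
depth = np.linspace(50, 300, 12)              # shield depth, cm
rng = np.random.default_rng(3)
H = H0 * np.exp(-depth / lam_true) * rng.normal(1.0, 0.02, depth.size)

# Log-linear least squares: ln H = ln H0 - d / lambda
slope, intercept = np.polyfit(depth, np.log(H), 1)
lam_fit = -1.0 / slope
print(lam_fit)  # recovered attenuation length, close to lam_true
```

The same fit applied separately to shallow and deep regions yields the two attenuation lengths quoted in shielding reports, since the neutron spectrum hardens with depth.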
International Nuclear Information System (INIS)
This study aims to utilize a measurement-based Monte Carlo (MBMC) method to evaluate the accuracy of dose distributions calculated using the Eclipse radiotherapy treatment planning system (TPS) based on the anisotropic analytical algorithm. Dose distributions were calculated for the nasopharyngeal carcinoma (NPC) patients treated with the intensity modulated radiotherapy (IMRT). Ten NPC IMRT plans were evaluated by comparing their dose distributions with those obtained from the in-house MBMC programs for the same CT images and beam geometry. To reconstruct the fluence distribution of the IMRT field, an efficiency map was obtained by dividing the energy fluence of the intensity modulated field by that of the open field, both acquired from an aS1000 electronic portal imaging device. The integrated image of the non-gated mode was used to acquire the full dose distribution delivered during the IMRT treatment. This efficiency map redistributed the particle weightings of the open field phase-space file for IMRT applications. Dose differences were observed in the tumor and air cavity boundary. The mean difference between MBMC and TPS in terms of the planning target volume coverage was 0.6% (range: 0.0–2.3%). The mean difference for the conformity index was 0.01 (range: 0.0–0.01). In conclusion, the MBMC method serves as an independent IMRT dose verification tool in a clinical setting. - Highlights: ► The patient-based Monte Carlo method serves as a reference standard to verify IMRT doses. ► 3D Dose distributions for NPC patients have been verified by the Monte Carlo method. ► Doses predicted by the Monte Carlo method matched closely with those by the TPS. ► The Monte Carlo method predicted a higher mean dose to the middle ears than the TPS. ► Critical organ doses should be confirmed to avoid overdose to normal organs
A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom
International Nuclear Information System (INIS)
As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture platform. The simulation of physical events is based on the algorithm used in Penelope, and the cross-section table used is the one generated by the Material routine, also present in the Penelope code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered. The first is composed of a homogeneous water-based medium, the second of bone, the third of lung, and the fourth of a heterogeneous bone-and-vacuum geometry. Simulations were done considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for transport simulation. The first forces the photon to stop at every voxel boundary; the second is the Woodcock method, in which a stop at the boundary is considered only if the material changes along the photon's travel line. Dose calculations using these methods are compared for validation with the Penelope and MCNP5 codes. Speed-up factors are compared using an NVidia GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU processor. (Author)
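The Woodcock alternative can be sketched in one dimension: flight distances are sampled with a single majorant coefficient, and a tentative collision is accepted as real with probability μ(x)/μ_max, so voxel boundaries never need to be crossed explicitly. A minimal sketch for a purely absorbing slab (coefficients invented):

```python
import numpy as np

rng = np.random.default_rng(4)

# Voxelized 1-D slab: attenuation coefficient per voxel (1/cm), 1 cm voxels
mu = np.array([0.05, 0.20, 0.08, 0.30, 0.10])
mu_max = mu.max()                     # majorant cross-section

def transmitted(n_photons):
    """Fraction of photons crossing the slab via Woodcock tracking."""
    escaped = 0
    for _ in range(n_photons):
        x = 0.0
        while True:
            x += -np.log(rng.random()) / mu_max   # tentative free flight
            if x >= mu.size:                      # left the slab
                escaped += 1
                break
            if rng.random() < mu[int(x)] / mu_max:
                break                             # real collision: absorbed
            # otherwise a virtual collision: keep flying
    return escaped / n_photons

analytic = np.exp(-mu.sum())          # Beer-Lambert for 1 cm voxels
t = transmitted(100_000)
print(t, analytic)                    # MC estimate vs analytic transmission
```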
A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom
Energy Technology Data Exchange (ETDEWEB)
Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H., E-mail: mbellezzo@gmail.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)
2014-08-15
Kudrolli, Haris A.
2001-04-01
A three-dimensional (3D) reconstruction procedure for Positron Emission Tomography (PET) based on inverse Monte Carlo analysis is presented. PET is a medical imaging modality which employs a positron-emitting radiotracer to give functional images of an organ's metabolic activity. This makes PET an invaluable tool in the detection of cancer and for in vivo biochemical measurements. There are a number of analytical and iterative algorithms for image reconstruction of PET data. Analytical algorithms are computationally fast, but the assumptions intrinsic in the line-integral model limit their accuracy. Iterative algorithms can apply accurate models for reconstruction and give improvements in image quality, but at an increased computational cost. These algorithms require the explicit calculation of the system response matrix, which may not be easy to calculate. This matrix gives the probability that a photon emitted from a certain source element will be detected in a particular detector line of response. The "Three-Dimensional Stochastic Sampling" (SS3D) procedure implements iterative algorithms in a manner that does not require the explicit calculation of the system response matrix. It uses Monte Carlo techniques to simulate the process of photon emission from a source distribution and interaction with the detector. This technique has the advantage of being able to model complex detector systems and also to take into account the physics of gamma-ray interaction within the source and detector systems, which leads to an accurate image estimate. A series of simulation studies was conducted to validate the method using the Maximum Likelihood - Expectation Maximization (ML-EM) algorithm. The accuracy of the reconstructed images was improved by using an algorithm that required a priori knowledge of the source distribution. Means to reduce the computational time for reconstruction were explored by using parallel processors and algorithms with faster convergence rates.
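The ML-EM algorithm named in this abstract has a compact closed-form update. The sketch below is a generic textbook version (not the SS3D code): A[i][j] is the system-response probability that an emission in voxel j is detected in line of response i, which in an SS3D-style scheme would be estimated by Monte Carlo photon transport rather than computed explicitly.

```python
def mlem_update(lam, A, y):
    """One ML-EM iteration for emission tomography.

    lam: current activity estimate per voxel;
    A:   system response matrix, A[i][j] = P(detect in LOR i | emit in voxel j);
    y:   measured counts per line of response."""
    n_lor, n_vox = len(A), len(lam)
    # Sensitivity: total detection probability of each voxel.
    sens = [sum(A[i][j] for i in range(n_lor)) for j in range(n_vox)]
    # Forward projection of the current estimate.
    proj = [sum(A[i][k] * lam[k] for k in range(n_vox)) for i in range(n_lor)]
    new = []
    for j in range(n_vox):
        # Backproject the ratio of measured to predicted counts.
        back = sum(A[i][j] * y[i] / proj[i] for i in range(n_lor) if proj[i] > 0)
        new.append(lam[j] * back / sens[j] if sens[j] > 0 else 0.0)
    return new
```

A useful sanity check is the fixed-point property: if the measured data exactly equal the forward projection of the current estimate, the update leaves the estimate unchanged.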
Efficiency of respiratory-gated delivery of synchrotron-based pulsed proton irradiation
Energy Technology Data Exchange (ETDEWEB)
Tsunashima, Yoshikazu; Vedam, Sastry; Dong, Lei; Bues, Martin; Balter, Peter; Smith, Alfred; Mohan, Radhe [Department of Radiation Physics, University of Texas M D Anderson Cancer Center, 1515 Holcombe Blvd., Unit 94, Houston, TX 77030 (United States); Umezawa, Masumi [Hitachi America Ltd, PTC-H Construction Site, 7707 Fannin Street, Suite 203, Houston, TX 77054 (United States); Sakae, Takeji [Proton Medical Research Center, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki, 305-0801 (Japan)], E-mail: svedam@mdanderson.org
2008-04-07
Significant differences exist in respiratory-gated proton beam delivery with a synchrotron-based accelerator system when compared to photon therapy with a conventional linear accelerator. Delivery of protons with a synchrotron accelerator is governed by a magnet excitation cycle pattern. Optimal synchronization of the magnet excitation cycle pattern with the respiratory motion pattern is critical to the efficiency of respiratory-gated proton delivery. There has been little systematic analysis to optimize the accelerator's operational parameters to improve gated treatment efficiency. The goal of this study was to estimate the overall efficiency of respiratory-gated synchrotron-based proton irradiation through realistic simulation. Using 62 respiratory motion traces from 38 patients, we simulated respiratory gating for duty cycles of 30%, 20% and 10% around peak exhalation for various fixed and variable magnet excitation patterns. In each case, the time required to deliver 100 monitor units in both non-gated and gated irradiation scenarios was determined. Based on results from this study, the minimum time required to deliver 100 MU was 1.1 min for non-gated irradiation. For respiratory-gated delivery at a 30% duty cycle around peak exhalation, corresponding average delivery times were typically three times longer with a fixed magnet excitation cycle pattern. However, when a variable excitation cycle was allowed in synchrony with the patient's respiratory cycle, the treatment time only doubled. Thus, respiratory-gated delivery of synchrotron-based pulsed proton irradiation is feasible and more efficient when a variable magnet excitation cycle pattern is used.
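The efficiency question studied above — how long gated delivery takes when the beam is available only during both the respiratory gate and the synchrotron spill — can be illustrated with a simple time-stepping sketch. All numbers and function names here are illustrative assumptions, not parameters from the study.

```python
def delivery_time(gate_open, spill_on, dt, mu_rate, mu_total):
    """Step through time until mu_total monitor units are delivered.

    gate_open(t), spill_on(t): booleans saying whether the respiratory
    gate and the synchrotron spill are active at time t (seconds);
    mu_rate: MU/s delivered while both are active."""
    t, mu = 0.0, 0.0
    while mu < mu_total:
        if gate_open(t) and spill_on(t):
            mu += mu_rate * dt
        t += dt
    return t

# Illustrative example: a 4 s respiratory cycle with a 30% duty-cycle gate
# around exhalation, and a fixed 2 s magnet excitation cycle whose spill
# (flat-top) lasts 0.5 s.
gate = lambda t: (t % 4.0) < 1.2
spill = lambda t: (t % 2.0) < 0.5
```

With a fixed excitation cycle, the overlap between gate and spill drifts in and out of phase, which is exactly why the study finds that synchronizing a variable excitation cycle to the respiratory cycle roughly halves the gated delivery time.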
Monte Carlo simulation of novel breast imaging modalities based on coherent x-ray scattering
Ghammraoui, Bahaa; Badal, Andreu
2014-07-01
We present upgraded versions of MC-GPU and penEasy_Imaging, two open-source Monte Carlo codes for the simulation of radiographic projections and CT, which have been extended and validated to account for the effect of molecular interference in coherent x-ray scatter. The codes were first validated by comparing simulated and measured energy-dispersive x-ray diffraction (EDXRD) spectra. A second validation was performed by evaluating the rejection factor of a focused anti-scatter grid. To exemplify the capabilities of the new codes, the modified MC-GPU code was used to examine the possibility of characterizing breast tissue composition and microcalcifications in a volume of interest inside a whole breast phantom using EDXRD, and to simulate a coherent scatter computed tomography (CSCT) system based on a first-generation CT acquisition geometry. It was confirmed that EDXRD and CSCT have the potential to characterize tissue composition inside a whole breast. The GPU-accelerated code was able to simulate, in just a few hours, a complete CSCT acquisition composed of 9758 independent pencil-beam projections. In summary, it has been shown that the presented software can be used for fast and accurate simulation of novel breast imaging modalities relying on scattering measurements and can therefore assist in the characterization and optimization of promising modalities currently under development.
IVF cycle cost estimation using Activity Based Costing and Monte Carlo simulation.
Cassettari, Lucia; Mosca, Marco; Mosca, Roberto; Rolando, Fabio; Costa, Mauro; Pisaturo, Valerio
2016-03-01
The authors present a new methodological approach, in a stochastic regime, for determining the actual costs of a healthcare process. Specifically, the paper shows the application of the methodology to determining the cost of an assisted reproductive technology (ART) treatment in Italy. The motivation for this research is that a deterministic approach is inadequate for an accurate estimate of the cost of this particular treatment: the durations of the different activities involved are not fixed and are described by frequency distributions. Hence the need to determine, in addition to the mean value of the cost, the interval within which it varies with a known confidence level. The cost obtained for each type of cycle investigated (in vitro fertilization and embryo transfer, with or without intracytoplasmic sperm injection) shows tolerance intervals around the mean value that are sufficiently narrow to make the data statistically robust and therefore usable as a reference for benchmarks with other countries. From a methodological point of view the approach was rigorous: the Activity Based Costing technique was used to determine the cost of the individual activities of the process, and Monte Carlo simulation, with control of the experimental error, was used to construct the tolerance intervals on the final result.
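The combination of Activity Based Costing with Monte Carlo simulation described above amounts to: sample each activity's duration from its distribution, price it, sum, and report the mean with an empirical interval. A minimal sketch, with entirely hypothetical activity data:

```python
import random
import statistics

def simulate_cycle_cost(activities, n=20000, seed=7):
    """Monte Carlo cost of a process built from priced activities.

    activities: list of (mean_minutes, sd_minutes, cost_per_minute) tuples,
    durations assumed normally distributed (truncated at zero).
    Returns (mean cost, (2.5th, 97.5th percentile interval)) over n trials."""
    rng = random.Random(seed)
    costs = []
    for _ in range(n):
        c = sum(max(0.0, rng.gauss(m, sd)) * rate for m, sd, rate in activities)
        costs.append(c)
    costs.sort()
    lo, hi = costs[int(0.025 * n)], costs[int(0.975 * n)]
    return statistics.fmean(costs), (lo, hi)
```

The normal-duration assumption is only for illustration; the paper's point is precisely that each activity should use its empirically observed frequency distribution.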
Monte Carlo based time-domain Hspice noise simulation for CSA-CRRC circuit
International Nuclear Information System (INIS)
We present a time-domain Monte Carlo based Hspice noise simulation for a charge-sensitive preamplifier-CRRC (CSA-CRRC) circuit using random-amplitude piecewise noise waveforms. The amplitude distribution of thermal noise is modeled with a Gaussian random number generator. For 1/f noise, the amplitude distribution is modeled with several low-pass filters driven by thermal noise generators. These time-domain noise sources are connected in parallel with the drain and source nodes of the CMOS input transistor of the CSA. Hspice simulation of the CSA-CRRC circuit with these noise sources yielded ENC values at the shaper output node of 47 e- for thermal noise and 732 e- for 1/f noise. The corresponding ENC values calculated from the frequency-domain transfer function and its integration are 44 e- and 882 e-, respectively. The Hspice simulation values are thus similar to those of the frequency-domain calculation. A test chip was designed and fabricated for this study; the measured ENC value was 904 e-. This study shows that the time-domain noise modeling is valid and that transient Hspice noise simulation can be an effective tool for low-noise circuit design.
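The 1/f-noise construction described above — several low-pass filters, each driven by a white (thermal-like) noise generator, summed so the corner frequencies tile the band — can be sketched numerically. This is a generic approximation of the idea, not the paper's Hspice netlist; all parameter values are illustrative.

```python
import math
import random

def pink_noise(n, dt, corners, seed=0):
    """Approximate 1/f noise as a sum of first-order low-pass filtered
    white-noise sources with the given corner frequencies (Hz).

    Spacing the corners roughly one decade apart gives a spectrum close
    to 1/f over the covered band."""
    rng = random.Random(seed)
    states = [0.0] * len(corners)
    # Discrete one-pole low-pass coefficient for each corner frequency.
    alphas = [dt / (dt + 1.0 / (2 * math.pi * fc)) for fc in corners]
    out = []
    for _ in range(n):
        for k, a in enumerate(alphas):
            # Each filter integrates its own unit-variance white noise.
            states[k] += a * (rng.gauss(0.0, 1.0) - states[k])
        out.append(sum(states))
    return out
```

In a circuit-level simulation the resulting waveform would be injected as a piecewise voltage/current source at the input transistor's terminals, as the abstract describes.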
Monte Carlo efficiency calibration of a neutron generator-based total-body irradiator
International Nuclear Information System (INIS)
Many body composition measurement systems are calibrated against a single-sized reference phantom. Prompt-gamma neutron activation (PGNA) provides the only direct measure of total body nitrogen (TBN), an index of the body's lean tissue mass. In PGNA systems, body size influences neutron flux attenuation, induced gamma signal distribution, and counting efficiency. Thus, calibration based on a single-sized phantom could result in inaccurate TBN values. We used Monte Carlo simulations (MCNP-5; Los Alamos National Laboratory) to map a system's response over the range of body weights (65-160 kg) and body fat distributions (25-60%) found in obese humans. Calibration curves were constructed to derive body-size correction factors relative to a standard reference phantom, providing customized adjustments to account for differences in the body habitus of obese adults. The use of MCNP-generated calibration curves should allow a better estimate of the true changes in lean tissue mass that may occur during intervention programs focused only on weight loss. (author)
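Applying an MCNP-derived calibration curve at measurement time is ordinarily a simple interpolation of the correction factor for the subject's body habitus. The sketch below assumes a one-dimensional curve indexed by body weight; the actual study maps both weight and fat distribution, and all numbers here are hypothetical.

```python
def size_correction(weight, curve):
    """Linearly interpolate a body-size correction factor.

    curve: sorted (body weight in kg, correction factor relative to the
    reference phantom) pairs, e.g. derived from MCNP simulations."""
    ws = [w for w, _ in curve]
    fs = [f for _, f in curve]
    if weight <= ws[0]:
        return fs[0]
    if weight >= ws[-1]:
        return fs[-1]
    i = next(k for k in range(1, len(ws)) if ws[k] >= weight)
    t = (weight - ws[i - 1]) / (ws[i] - ws[i - 1])
    return fs[i - 1] + t * (fs[i] - fs[i - 1])
```

The measured TBN would then be multiplied by this factor to compensate for the attenuation and counting-efficiency differences between the subject and the reference phantom.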
Fiber optic microprobes with rare-earth-based phosphor tips for proton beam characterization
Darafsheh, Arash; Kassaee, Alireza; Taleei, Reza; Dolney, Derek; Finlay, Jarod C.
2016-03-01
We investigated the feasibility of using fiber optic probes with rare-earth-based phosphor tips for proton beam radiation dosimetry. We designed and fabricated a fiber probe with submillimeter resolution based on TbF3 phosphors and evaluated its performance for measurement of proton beams, including profiles and range. The fiber optic probe, embedded in tissue-mimicking plastics, was irradiated with a clinical proton beam, and luminescence spectroscopy was performed with a CCD-coupled spectrograph to analyze the emission spectra of the fiber tip. Using a linear fitting algorithm, we extracted the contribution of the ionoluminescence signal to obtain the percentage depth dose in phantoms and compared it with measurements performed with a standard ion chamber. We observed a quenching effect in the spread-out Bragg peak region, manifested as an under-response of the signal due to the high linear energy transfer of the beam. However, the beam profile measurements were not affected by the quenching effect, indicating that the fiber probes can be used for high-resolution measurements of proton beam profiles.
CT based treatment planning system of proton beam therapy for ocular melanoma
Energy Technology Data Exchange (ETDEWEB)
Nakano, Takashi E-mail: tnakano@med.gunma-u.ac.jp; Kanai, Tatsuaki; Furukawa, Shigeo; Shibayama, Kouichi; Sato, Sinichiro; Hiraoka, Takeshi; Morita, Shinroku; Tsujii, Hirohiko
2003-09-01
A computed tomography (CT) based treatment planning system for proton beam therapy was established specifically for ocular melanoma treatment. A technique of collimated proton beams with a maximum energy of 70 MeV is applied for the treatment of ocular melanoma. The vertical proton beam line has a range modulator for spreading out the beam, a multi-leaf collimator, an aperture, a light-beam localizer, a field light, and an X-ray verification system. The treatment planning program includes: an eye model; selection of the best direction of gaze; design of the aperture shape; determination of the proton range and range modulation necessary to encompass the target volume; and indication of the relative positions of the eyes, the beam center, and the beam aperture. Tumor contours are extracted from CT/MRI images of 1 mm thickness with the assistance of information from fundus photography and ultrasonography. The CT image-based treatment system for ocular melanoma is useful for Japanese patients, who have a thick choroid membrane, in terms of sparing dose to the skin and normal organs in the eye. The characteristics of the system and its merits and demerits are reported.
Monte Carlo vs. Pencil Beam based optimization of stereotactic lung IMRT
International Nuclear Information System (INIS)
The purpose of the present study is to compare finite size pencil beam (fsPB) and Monte Carlo (MC) based optimization of lung intensity-modulated stereotactic radiotherapy (lung IMSRT). A fsPB and a MC algorithm as implemented in a biological IMRT planning system were validated by film measurements in a static lung phantom. Then, they were applied for static lung IMSRT planning based on three different geometrical patient models (one phase static CT, density overwrite one phase static CT, average CT) of the same patient. Both 6 and 15 MV beam energies were used. The resulting treatment plans were compared by how well they fulfilled the prescribed optimization constraints both for the dose distributions calculated on the static patient models and for the accumulated dose, recalculated with MC on each of 8 CTs of a 4DCT set. In the phantom measurements, the MC dose engine showed discrepancies < 2%, while the fsPB dose engine showed discrepancies of up to 8% in the presence of lateral electron disequilibrium in the target. In the patient plan optimization, this translates into violations of organ at risk constraints and unpredictable target doses for the fsPB optimized plans. For the 4D MC recalculated dose distribution, MC optimized plans always underestimate the target doses, but the organ at risk doses were comparable. The results depend on the static patient model, and the smallest discrepancy was found for the MC optimized plan on the density overwrite one phase static CT model. It is feasible to employ the MC dose engine for optimization of lung IMSRT and the plans are superior to fsPB. Use of static patient models introduces a bias in the MC dose distribution compared to the 4D MC recalculated dose, but this bias is predictable and therefore MC based optimization on static patient models is considered safe
Chi, Yujie; Tian, Zhen; Jia, Xun
2016-08-01
Monte Carlo (MC) particle transport simulation on a graphics processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which limits their application scope. The purpose of this paper is to develop a module to model parametric geometry and integrate it into GPU-based MC simulations. In our module, each continuous region is defined by its bounding surfaces, which are parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested in two example problems: (1) low-energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry; the averaged dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data; the highest computational speed was achieved when the data were stored in the GPU's shared memory. Incorporation of parameterized geometry yielded a computation time ~3 times that of the corresponding voxelized geometry. We also developed a strategy that uses an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computational time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and in 0
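The particle-navigation primitive behind quadratic bounding surfaces is the ray-quadric intersection: substituting the ray p + t*u into the implicit quadratic surface equation gives a quadratic in t, and the smallest positive root is the distance to the boundary. A minimal CPU sketch (the paper's GPU code is not shown here; the 10-coefficient layout is one common convention):

```python
import math

def ray_quadric(p, u, q):
    """Smallest positive distance t with F(p + t*u) = 0 for the quadric
    F(x,y,z) = q0 x^2 + q1 y^2 + q2 z^2 + q3 xy + q4 yz + q5 xz
             + q6 x + q7 y + q8 z + q9.
    Returns None if the ray never reaches the surface."""
    px, py, pz = p
    ux, uy, uz = u
    # Coefficients of A t^2 + B t + C = 0 along the ray.
    A = (q[0]*ux*ux + q[1]*uy*uy + q[2]*uz*uz
         + q[3]*ux*uy + q[4]*uy*uz + q[5]*ux*uz)
    B = (2*(q[0]*px*ux + q[1]*py*uy + q[2]*pz*uz)
         + q[3]*(px*uy + py*ux) + q[4]*(py*uz + pz*uy) + q[5]*(px*uz + pz*ux)
         + q[6]*ux + q[7]*uy + q[8]*uz)
    C = (q[0]*px*px + q[1]*py*py + q[2]*pz*pz + q[3]*px*py + q[4]*py*pz
         + q[5]*px*pz + q[6]*px + q[7]*py + q[8]*pz + q[9])
    eps = 1e-12
    if abs(A) < eps:  # degenerate case: the surface is effectively a plane
        t = -C / B if abs(B) > eps else None
        return t if t is not None and t > eps else None
    disc = B*B - 4*A*C
    if disc < 0:
        return None
    roots = sorted([(-B - math.sqrt(disc)) / (2*A), (-B + math.sqrt(disc)) / (2*A)])
    return next((t for t in roots if t > eps), None)
```

During transport, the step length sampled from the cross sections is compared against the minimum boundary distance over the current region's bounding surfaces to decide whether the particle interacts or crosses into the neighboring region.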
International Nuclear Information System (INIS)
Laser-driven proton beams for ground-based study of space radiation effects on semiconductor devices are considered. Laser irradiation focused onto thin foil targets can generate proton spectra at intensity and fluence levels that are adequate to make such laser-driven sources feasible for this space application. Technical areas for further development are also briefly discussed. (author)
Directory of Open Access Journals (Sweden)
Vahid Moslemi
2011-03-01
Introduction: In brachytherapy, radioactive sources are placed close to the tumor; therefore, small changes in their positions can cause large changes in the dose distribution. This emphasizes the need for computerized treatment planning. The usual method for treatment planning of cervix brachytherapy uses conventional radiographs in the Manchester system. Nowadays, because of their advantages in locating the source positions and the surrounding tissues, CT and MRI images are replacing conventional radiographs. In this study, we used CT images in Monte Carlo based dose calculation for brachytherapy treatment planning, using interface software to create the geometry file required by the MCNP code. The aim of using the interface software is to facilitate and speed up the geometry set-up for simulations based on the patient's anatomy. This paper examines the feasibility of this method in cervix brachytherapy and assesses its accuracy and speed. Material and Methods: For dosimetric measurements regarding the treatment plan, a pelvic phantom was made from polyethylene in which the treatment applicators could be placed. For simulations using CT images, the phantom was scanned at 120 kVp. Using interface software written in MATLAB, the CT images were converted into an MCNP input file and the simulation was then performed. Results: Using the interface software, the preparation time for simulating the applicator and surrounding structures was approximately 3 minutes; the corresponding time needed with conventional MCNP geometry entry was approximately 1 hour. The discrepancy between the simulated and measured doses to point A was 1.7% of the prescribed dose. The corresponding dose differences between the two methods in the rectum and bladder were 3.0% and 3.7% of the prescribed dose, respectively. Comparing the results of simulation using the interface software with those of simulation using the standard MCNP geometry entry showed a less than 1
A Multiple Scattering Theory for Proton Penetration
Institute of Scientific and Technical Information of China (English)
YANG Dai-Lun; WU Zhang-Wen; JIANG Steve-Bin; LUO Zheng-Ming
2004-01-01
We extend the electron small-angle multiple scattering theory to proton penetration. After introducing the concept of narrow energy spectra, the proton energy-loss process is included in the proton deep-penetration theory, which precisely describes the whole process of proton penetration. Compared to the Monte Carlo method, this method maintains comparable precision and possesses much higher computational efficiency. Thus, it shows the real feasibility of applying this algorithm to proton clinical radiation therapy.
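For orientation on the magnitudes involved in proton multiple scattering, the standard Highland parameterization (Particle Data Group form) gives the characteristic scattering angle of a charged particle traversing a slab; the paper's deep-penetration theory is more elaborate, so this is only a rough point of comparison, and the example numbers below are illustrative.

```python
import math

def highland_theta0(p_mev_c, beta, x_over_X0, z=1):
    """Highland estimate of the RMS projected multiple-scattering angle
    (radians) for a particle of charge z, momentum p (MeV/c) and speed
    beta traversing a thickness of x/X0 radiation lengths."""
    return (13.6 / (beta * p_mev_c)) * z * math.sqrt(x_over_X0) * (
        1 + 0.038 * math.log(x_over_X0))

# Illustrative: a ~200 MeV therapy proton (p ~ 644 MeV/c, beta ~ 0.57)
# crossing 0.1 radiation lengths scatters by roughly 10 mrad.
```

The formula is stated to be accurate to about 11% for 1e-3 < x/X0 < 100, which is adequate for the order-of-magnitude checks a transport code validation needs.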
Li, Yongbao; Tian, Zhen; Shi, Feng; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun
2015-04-01
Intensity-modulated radiation treatment (IMRT) plan optimization needs beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computational speed. However, inaccurate beamlet dose distributions may mislead the optimization process and hinder the resulting plan quality. To solve this problem, the Monte Carlo (MC) simulation method has been used to compute all beamlet doses prior to the optimization step. The conventional approach samples the same number of particles from each beamlet, yet this is not the optimal use of MC for this problem. In fact, some beamlets have very small intensities after solving the plan optimization problem; for those beamlets, fewer particles can be used in the dose calculations to increase efficiency. Based on this idea, we have developed a new MC-based IMRT plan optimization framework that iteratively performs MC dose calculation and plan optimization. At each dose calculation step, the particle numbers for the beamlets are adjusted based on the beamlet intensities obtained by solving the plan optimization problem in the previous iteration. We modified a GPU-based MC dose engine to allow simultaneous computation of a large number of beamlet doses. To test the accuracy of our modified dose engine, we compared the dose from a broad beam with the summed beamlet doses in this beam in an inhomogeneous phantom. Agreement within 1% for the maximum difference and 0.55% for the average difference was observed. We then validated the proposed MC-based optimization scheme in one lung IMRT case. It was found that the conventional scheme required 10^6 particles from each beamlet to achieve an optimization result within 3% difference in fluence map and 1% difference in dose from the ground truth. In contrast, the proposed scheme achieved the same level of accuracy with on average 1.2 × 10^5 particles per beamlet. Correspondingly, the computation time
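The core idea above — redistribute the MC particle budget in proportion to the intensities from the previous optimization iteration — can be sketched in a few lines. The proportional rule and the minimum-per-beamlet floor are illustrative assumptions, not the paper's exact allocation scheme.

```python
def allocate_particles(intensities, total, floor=1000):
    """Distribute an MC particle budget across beamlets in proportion to
    their current optimized intensities.

    A floor keeps low-weight beamlets from getting so few particles that
    their dose estimate becomes unusable in the next optimization pass.
    (The floor means the returned counts may sum to slightly more than
    `total`.)"""
    s = sum(intensities)
    if s == 0:
        # No intensity information yet: fall back to uniform sampling.
        return [total // len(intensities)] * len(intensities)
    return [max(floor, int(round(total * w / s))) for w in intensities]
```

In the iterative framework, each pass would call this with the latest fluence map, rerun the beamlet dose calculations with the new counts, and re-solve the plan optimization.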
Monte-Carlo simulation of an ultra small-angle neutron scattering instrument based on Soller slits
Energy Technology Data Exchange (ETDEWEB)
Rieker, T. [Univ. of New Mexico, Albuquerque, NM (United States); Hubbard, P. [Sandia National Labs., Albuquerque, NM (United States)
1997-09-01
Monte Carlo simulations are used to investigate an ultra small-angle neutron scattering instrument for use at a pulsed source, based on a Soller slit collimator and analyzer. The simulations show that for a q_min of ~10^-4 Å^-1 (15 Å neutrons), a few tenths of a percent of the incident flux is transmitted through both collimators at q = 0.
A comparison of Monte Carlo generators
Golan, Tomasz
2014-01-01
A comparison of the GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: proton multiplicity, total visible energy, most energetic proton momentum, and the $\pi^+$ two-dimensional energy vs. cosine distribution.
Li, Jing; Cao, Xue-Li; Wang, Yuan-Yuan; Zhang, Shu-Ran; Du, Dong-Ying; Qin, Jun-Sheng; Li, Shun-Li; Su, Zhong-Min; Lan, Ya-Qian
2016-06-27
Two novel polyoxometalate (POM)-based coordination polymers, namely [Co(bpz)(Hbpz)][Co(SO4)0.5(H2O)2(bpz)]4[PMo(VI)8Mo(V)4V(IV)4O42]·13H2O (NENU-530) and [Ni2(bpz)(Hbpz)3(H2O)2][PMo(VI)8Mo(V)4V(IV)4O44]·8H2O (NENU-531) (H2bpz = 3,3',5,5'-tetramethyl-4,4'-bipyrazole), were isolated by hydrothermal methods; they represent 3D networks constructed from POM units, the protonated ligand, and a sulfate group. In contrast with most POM-based coordination polymers, these two compounds exhibit exceptional chemical and thermal stability. More importantly, NENU-530 shows a high proton conductivity of 1.5×10^-3 S cm^-1 at 75 °C and 98% RH, which is one order of magnitude higher than that of NENU-531. Furthermore, structural analysis and functional measurements demonstrate that the introduction of the sulfate group is favorable for proton conductivity. Herein, the syntheses, crystal structures, proton conductivity, and the relationship between structure and property are presented.
Main Parameters of LCxFCC Based Electron-Proton Colliders
Acar, Y C; Oner, B B; Sultansoy, S
2016-01-01
Multi-TeV center-of-mass energy ep colliders based on the Future Circular Collider (FCC) and linear colliders (LC) are proposed, and corresponding luminosity values are estimated. Parameters of upgraded versions of the FCC are determined to optimize the luminosity of electron-proton collisions, keeping beam-beam effects in mind. It is shown that L_ep ~ 10^32 cm^-2 s^-1 can be achieved with a moderate upgrade of the FCC parameters.
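The luminosity estimates quoted above rest, at leading order, on the standard head-on collider formula L = f N1 N2 / (4 pi sigma_x sigma_y); the paper's analysis additionally folds in beam-beam limits, which this sketch ignores. All numbers in the example are generic placeholders, not FCC or LC parameters.

```python
import math

def luminosity(n1, n2, f_coll, sigma_x, sigma_y):
    """Head-on collider luminosity in cm^-2 s^-1.

    n1, n2: particles per colliding bunch; f_coll: collision frequency (Hz);
    sigma_x, sigma_y: transverse RMS beam sizes at the interaction point (cm).
    Ignores crossing angle, hourglass, and beam-beam effects."""
    return f_coll * n1 * n2 / (4 * math.pi * sigma_x * sigma_y)
```

For instance, 1e11 particles per beam colliding at 10 kHz with 10-micron beams already lands near the 10^30-10^31 cm^-2 s^-1 ballpark, showing why reaching 10^32 requires tuning bunch populations and spot sizes together.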
Thompson, Emily J; Berben, Louise A
2015-09-28
Environmentally sustainable hydrogen-evolving electrocatalysts are key to a renewable fuel economy, and ligand-based proton and electron transfer could circumvent the need for precious metal ions in electrocatalytic H2 production. Herein, we show that electrocatalytic generation of H2 by a redox-active ligand complex of Al(3+) occurs at -1.16 V vs. SCE (500 mV overpotential).
Ding, George X.; Duggan, Dennis M.; Coffey, Charles W.; Shokrani, Parvaneh; Cygler, Joanna E.
2006-06-01
The purpose of this study is to present our experience of commissioning, testing, and use of the first commercial macro Monte Carlo based dose calculation algorithm for electron beam treatment planning, and to investigate new issues regarding dose reporting (dose-to-water versus dose-to-medium) as well as the statistical uncertainties that arise when Monte Carlo based systems are used for patient dose calculations. All phantoms studied were obtained by CT scan. The calculated dose distributions and monitor units were validated against measurements with film and ionization chambers in phantoms containing two-dimensional (2D) and three-dimensional (3D) low- and high-density inhomogeneities at different source-to-surface distances (SSDs). Beam energies ranged from 6 to 18 MeV. New experimental input data required for commissioning are presented. The validation shows excellent agreement between calculated and measured dose distributions. The calculated monitor units were within 2% of measured values except in the case of a 6 MeV beam with small cutout fields at extended SSDs (>110 cm). The investigation of dose reporting demonstrates differences of up to 4% for lung and 12% for bone when 'dose-to-medium' is calculated and reported instead of 'dose-to-water' as in a conventional system. The accuracy of the Monte Carlo calculation is shown to be clinically acceptable even for very complex 3D inhomogeneities. As Monte Carlo based treatment planning systems begin to enter clinical practice, new issues such as dose reporting and statistical variations may be clinically significant. It is therefore imperative that a consistent approach to dose reporting is used.
Monte Carlo-based QA for IMRT of head and neck cancers
Tang, F.; Sham, J.; Ma, C.-M.; Li, J.-S.
2007-06-01
It is well known that the presence of a large air cavity in a dense medium (or patient) introduces significant electronic disequilibrium when irradiated with a megavoltage X-ray field. This condition may be worsened by the possible use of tiny beamlets in intensity-modulated radiation therapy (IMRT). Commercial treatment planning systems (TPSs), in particular those based on the pencil-beam method, do not provide accurate dose computation for the lungs and other cavity-laden body sites such as the head and neck. In this paper we present the use of the Monte Carlo (MC) technique for dose re-calculation of IMRT of head and neck cancers. In our clinic, a turn-key software system is set up for MC calculation and comparison with TPS-calculated treatment plans as part of the quality assurance (QA) programme for IMRT delivery. A set of 10 off-the-shelf PCs is employed as the MC calculation engine, with treatment plan parameters imported from the TPS via a graphical user interface (GUI) which also provides a platform for launching remote MC simulations and subsequent dose comparison with the TPS. The TPS-segmented intensity maps are used as input for the simulation, hence skipping the time-consuming simulation of the multi-leaf collimator (MLC). The primary objective of this approach is to assess the accuracy of the TPS calculations in the presence of air cavities in the head and neck, whereas the accuracy of leaf segmentation is verified by fluence measurement using a fluoroscopic camera-based imaging device. This measurement can also validate the correct transfer of intensity maps to the record-and-verify system. Comparisons between TPS and MC calculations of 6 MV IMRT for typical head and neck treatments reveal regional consistency in dose distribution except at and around the sinuses, where our pencil-beam-based TPS sometimes over-predicts the dose by up to 10%, depending on the size of the cavities. In addition, dose re-buildup of up to 4% is observed at the posterior nasopharyngeal
Monte Carlo-based multiphysics coupling analysis of x-ray pulsar telescope
Li, Liansheng; Deng, Loulou; Mei, Zhiwu; Zuo, Fuchang; Zhou, Hao
2015-10-01
The X-ray pulsar telescope (XPT) is a complex optical payload involving optical, mechanical, electrical, and thermal disciplines. Multiphysics coupling analysis (MCA) plays an important role in improving its in-orbit performance. However, conventional MCA methods encounter two serious problems when dealing with the XPT. One is that the energy and reflectivity information of the X-rays cannot be taken into consideration, which misrepresents the essential behavior of the XPT. The other is that coupling data cannot be transferred automatically among the different disciplines, leading to computational inefficiency and high design cost. Therefore, a new MCA method for the XPT is proposed based on the Monte Carlo method and total-reflection theory. The main idea, procedures, and operational steps of the proposed method are addressed in detail. First, the method takes the energy and reflectivity information of the X-rays into consideration simultaneously, formulates the thermal-structural coupling equation, and builds the multiphysics coupling analysis model based on the finite element method; thermal-structural coupling analyses under different working conditions are then carried out. Second, the mirror deformations are obtained using a construction geometry function, a polynomial function is fitted to the deformed mirror, and the fitting error is evaluated. Third, the focusing performance of the XPT is evaluated by the RMS of the dispersion spot. Finally, a Wolter-I XPT is taken as an example to verify the proposed MCA method. The simulation results show that the thermal-structural coupling deformation is larger than the others, and the law governing the effect of deformation on focusing performance is obtained: the focusing performances under thermal-structural, thermal, and structural deformations degrade by 30.01%, 14.35%, and 7.85%, respectively, with RMS dispersion-spot sizes of 2.9143 mm, 2.2038 mm, and 2.1311 mm. As a result, the validity of the proposed method is verified through
The information-based complexity of approximation problem by adaptive Monte Carlo methods
Institute of Scientific and Technical Information of China (English)
2008-01-01
In this paper, we study the information-based complexity of the approximation problem on the multivariate Sobolev space with bounded mixed derivative MWpr,α(Td), 1 < p < ∞, in the norm of Lq(Td), 1 < q < ∞, by adaptive Monte Carlo methods. Applying the discretization technique and some properties of the pseudo-s-scale, we determine the exact asymptotic orders of this problem.
International Nuclear Information System (INIS)
Geometry navigation plays the most fundamental role in Monte Carlo particle transport simulation. It is mainly responsible for locating the geometry volume in which a particle resides and for computing the distance to the volume boundary along the particle trajectory during each particle history. Geometry navigation directly affects the run-time performance of a Monte Carlo particle transport simulation, especially for large-scale, complicated systems. Two geometry acceleration algorithms, an automatic neighbor-search algorithm and an oriented-bounding-box algorithm, are presented for improving geometry navigation performance. The algorithms have been implemented in the Super Monte Carlo Calculation Program for Nuclear and Radiation Process (SuperMC) version 2.0. The FDS-II and ITER benchmark models have been tested to highlight the efficiency gains that can be achieved by using the acceleration algorithms. The exact gains may be problem dependent, but testing results showed that the runtime of a Monte Carlo simulation can be reduced considerably, by 50%-60%, with the proposed acceleration algorithms. (author)
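A minimal sketch of the slab test that underlies bounding-box acceleration of this kind: before running the expensive exact-geometry check, a particle's ray is intersected with a cheap bounding box, and volumes whose box the trajectory misses are skipped. This is an illustrative axis-aligned version, not SuperMC's implementation:

```python
def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does a ray (origin + t*direction, t >= 0) intersect an
    axis-aligned bounding box? Works per axis, intersecting the parameter
    intervals in which the ray lies between each pair of parallel planes."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:              # ray parallel to this slab pair
            if o < lo or o > hi:        # ...and outside it: no hit possible
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far or t_far < 0:  # intervals disjoint, or box behind ray
            return False
    return True
```

An oriented bounding box applies the same test after rotating the ray into the box's local frame; the payoff is that a few multiplications replace a full surface-by-surface distance computation for every volume the trajectory cannot reach.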
van der Graaf, E. R.; Limburg, J.; Koomans, R. L.; Tijs, M.
2011-01-01
The calibration of scintillation detectors for gamma radiation in a well characterized setup can be transferred to other geometries using Monte Carlo simulations to account for the differences between the calibration and the other geometry. In this study a calibration facility was used that is const
A GPU implementation of a track-repeating algorithm for proton radiotherapy dose calculations
Yepes, Pablo P; Taddei, Phillip J
2010-01-01
An essential component in proton radiotherapy is the algorithm to calculate the radiation dose to be delivered to the patient. The most common dose algorithms are fast, but they are approximate analytical approaches whose accuracy is not always satisfactory, especially for heterogeneous anatomic areas such as the thorax. Monte Carlo techniques provide superior accuracy; however, they often require large computational resources, which render them impractical for routine clinical use. Track-repeating algorithms, for example the Fast Dose Calculator, have shown promise for achieving the accuracy of Monte Carlo simulations for proton radiotherapy dose calculations in a fraction of the computation time. We report on an implementation of the Fast Dose Calculator for proton radiotherapy on a card equipped with graphics processing units (GPUs) rather than on a central processing unit architecture. This implementation reproduces the full Monte Carlo and CPU-based track-repeating dose calculations within 2%, w...
Directory of Open Access Journals (Sweden)
Xueli Chen
2010-01-01
During the past decade, the Monte Carlo method has found wide application in optical imaging for simulating photon transport inside tissues. However, the method has not yet been effectively extended to the simulation of free-space photon transport. In this paper, a uniform framework for noncontact optical imaging is proposed based on the Monte Carlo method, consisting of the simulation of photon transport both in tissues and in free space. Specifically, lens-system simplification theory is used to model the camera lens of the optical imaging system, and the Monte Carlo method is employed to describe the energy transfer from the tissue surface to the CCD camera. The focusing effect of the camera lens is also considered, to establish the correspondence between points on the tissue surface and on the CCD camera. Furthermore, a parallel version of the framework is implemented, making the simulation much more convenient and efficient. The feasibility of the uniform framework and the effectiveness of the parallel version are demonstrated with a cylindrical phantom based on real experimental results.
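The tissue-side core of such photon transport simulations is free-path sampling from the Beer-Lambert law: step lengths are drawn from an exponential distribution with the total interaction coefficient as rate. A minimal sketch (the coefficient values are arbitrary, and interacted photons are simply terminated so the estimate can be checked against the analytic unscattered fraction exp(-mu_t·d)):

```python
import math
import random

def sample_free_path(mu_t, rng):
    """Sample a photon step length s with density mu_t * exp(-mu_t * s)."""
    u = rng.random()
    return -math.log(1.0 - u) / mu_t   # 1-u avoids log(0) since u is in [0, 1)

def unscattered_fraction(mu_a, mu_s, thickness, n_photons=50_000, seed=1):
    """Monte Carlo estimate of the fraction of photons crossing a slab with
    no interaction at all; converges to exp(-(mu_a + mu_s) * thickness)."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    survived = sum(
        sample_free_path(mu_t, rng) > thickness for _ in range(n_photons)
    )
    return survived / n_photons
```

A full tissue simulation would, at each interaction site, choose absorption vs. scattering with probability mu_a/mu_t vs. mu_s/mu_t and sample a new direction from a phase function; the free-space part of the framework then propagates surviving photons through the lens model to the CCD.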
Energy Technology Data Exchange (ETDEWEB)
Volpe, L., E-mail: luca.volpe@mib.infn.it [Universita degli Studi di Milano-Bicocca, Piazza della scienza 3, Milano 20126 (Italy); Batani, D.; Morace, A. [Universita degli Studi di Milano-Bicocca, Piazza della scienza 3, Milano 20126 (Italy); Nicolai, Ph.; Regan, C. [CELIA, Universite de Bordeaux, CNRS, CEA, F33405 (France); Ravasio, A. [LULI, UMR 7605, CNRS, CEA, Universite Paris VI, Ecole Polytechnique, 91128 Palaiseau Cedex (France)
2011-10-11
Generation of high intensity and well collimated multi-energetic proton beams from laser-matter interaction extends the possibility to use protons as a diagnostic tool to image imploding target in Inertial Confinement Fusion (ICF) experiments. Due to the very large mass densities reached during implosion, protons traveling through the target undergo a very large number of collisions. Therefore the analysis of experimentally obtained proton images requires care and accurate numerical simulations using both hydrodynamic and Monte Carlo codes. The impact of multiple scattering needs to be carefully considered by taking into account the exact stopping power for dense matter and for the underdense plasma corona. In our paper, density, temperature and ionization degree profiles of the imploding target are obtained by 2D hydrodynamic simulations performed using CHIC code. Proton radiography images are simulated using the Monte Carlo code (MCNPX; adapted to correctly describe multiple scattering and plasma stopping power) in order to reconstruct the complete hydrodynamic history of the imploding target. Finally we develop a simple analytical model to study the performance of proton radiography as a function of initial experimental parameters, and identify two different regimes for proton radiography in ICF.
Dual-energy CT-based material extraction for tissue segmentation in Monte Carlo dose calculations
Bazalova, Magdalena; Carrier, Jean-François; Beaulieu, Luc; Verhaegen, Frank
2008-05-01
Monte Carlo (MC) dose calculations are performed on patient geometries derived from computed tomography (CT) images. For most available MC codes, the Hounsfield units (HU) in each voxel of a CT image have to be converted into mass density (ρ) and material type. This is typically done with a (HU; ρ) calibration curve which may lead to mis-assignment of media. In this work, an improved material segmentation using dual-energy CT-based material extraction is presented. For this purpose, the differences in extracted effective atomic numbers Z and the relative electron densities ρe of each voxel are used. Dual-energy CT material extraction based on parametrization of the linear attenuation coefficient for 17 tissue-equivalent inserts inside a solid water phantom was done. Scans of the phantom were acquired at 100 kVp and 140 kVp from which Z and ρe values of each insert were derived. The mean errors on Z and ρe extraction were 2.8% and 1.8%, respectively. Phantom dose calculations were performed for 250 kVp and 18 MV photon beams and an 18 MeV electron beam in the EGSnrc/DOSXYZnrc code. Two material assignments were used: the conventional (HU; ρ) and the novel (HU; ρ, Z) dual-energy CT tissue segmentation. The dose calculation errors using the conventional tissue segmentation were as high as 17% in a mis-assigned soft bone tissue-equivalent material for the 250 kVp photon beam. Similarly, the errors for the 18 MeV electron beam and the 18 MV photon beam were up to 6% and 3% in some mis-assigned media. The assignment of all tissue-equivalent inserts was accurate using the novel dual-energy CT material assignment. As a result, the dose calculation errors were below 1% in all beam arrangements. Comparable improvement in dose calculation accuracy is expected for human tissues. The dual-energy tissue segmentation offers a significantly higher accuracy compared to the conventional single-energy segmentation.
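The inversion idea behind dual-energy extraction can be illustrated with a toy parametrization mu(E) = rho_e·(a(E) + b(E)·Z^n): the ratio of the attenuation coefficients measured at the two tube voltages eliminates rho_e and leaves a single equation in Z. The coefficients and exponent below are illustrative stand-ins, not the calibrated parametrization fitted to the 17 phantom inserts in the paper:

```python
N_EXP = 3.6  # toy photoelectric Z-dependence exponent (assumed, not fitted)

def mu(rho_e, Z, a, b):
    """Toy two-basis attenuation model: mu = rho_e * (a + b * Z**N_EXP)."""
    return rho_e * (a + b * Z ** N_EXP)

def extract(mu_low, mu_high, a_lo, b_lo, a_hi, b_hi):
    """Invert the two-energy system for (rho_e, Z).

    Dividing mu_low by mu_high cancels rho_e:
        r = (a_lo + b_lo*Z^n) / (a_hi + b_hi*Z^n)
    which solves to Z^n = (a_lo - r*a_hi) / (r*b_hi - b_lo).
    """
    r = mu_low / mu_high
    zn = (a_lo - r * a_hi) / (r * b_hi - b_lo)
    Z = zn ** (1.0 / N_EXP)
    rho_e = mu_low / (a_lo + b_lo * Z ** N_EXP)
    return rho_e, Z
```

With (Z, rho_e) per voxel instead of a single HU value, the material lookup becomes a 2D assignment, which is what removes the soft-bone/soft-tissue ambiguity described above.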
International Nuclear Information System (INIS)
The scanner is based on the nuclear scattering of high-energy protons by the nucleons (protons and neutrons) contained in atomic nuclei. Because of the wide scattering angle, the three spatial coordinates of the interaction point can be computed, giving three-dimensional radiographs directly. Volumetric resolution is of the order of a few cubic millimeters. Because the underlying interaction is the strong nuclear force, the atomic-number dependence of the information obtained differs from that of the X-ray scanner, for which the underlying interaction is the electromagnetic force. (orig./VJ)
Uncertainties in Monte Carlo-based absorbed dose calculations for an experimental benchmark.
Renner, F; Wulff, J; Kapsch, R-P; Zink, K
2015-10-01
There is a need to verify the accuracy of general purpose Monte Carlo codes like EGSnrc, which are commonly employed for investigations of dosimetric problems in radiation therapy. A number of experimental benchmarks have been published to compare calculated values of absorbed dose to experimentally determined values. However, there is a lack of absolute benchmarks, i.e. benchmarks without involved normalization which may cause some quantities to be cancelled. Therefore, at the Physikalisch-Technische Bundesanstalt a benchmark experiment was performed, which aimed at the absolute verification of radiation transport calculations for dosimetry in radiation therapy. A thimble-type ionization chamber in a solid phantom was irradiated by high-energy bremsstrahlung and the mean absorbed dose in the sensitive volume was measured per incident electron of the target. The characteristics of the accelerator and experimental setup were precisely determined and the results of a corresponding Monte Carlo simulation with EGSnrc are presented within this study. For a meaningful comparison, an analysis of the uncertainty of the Monte Carlo simulation is necessary. In this study uncertainties with regard to the simulation geometry, the radiation source, transport options of the Monte Carlo code and specific interaction cross sections are investigated, applying the general methodology of the Guide to the expression of uncertainty in measurement. Besides studying the general influence of changes in transport options of the EGSnrc code, uncertainties are analyzed by estimating the sensitivity coefficients of various input quantities in a first step. Secondly, standard uncertainties are assigned to each quantity which are known from the experiment, e.g. uncertainties for geometric dimensions. Data for more fundamental quantities such as photon cross sections and the I-value of electron stopping powers are taken from literature. The significant uncertainty contributions are identified as
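The sensitivity-coefficient step of the GUM methodology described above can be sketched generically: estimate c_i = ∂f/∂x_i numerically for each input quantity and combine the contributions as u_c² = Σ (c_i·u_i)². This is a generic first-order propagation helper, not the EGSnrc-specific analysis of the paper:

```python
def combined_standard_uncertainty(f, x, u, eps=1e-6):
    """First-order GUM propagation: u_c^2 = sum_i (df/dx_i)^2 * u_i^2.

    f : model function taking a list of input values
    x : nominal input values
    u : standard uncertainties of the inputs (assumed uncorrelated)
    Sensitivity coefficients are estimated by central differences.
    """
    uc2 = 0.0
    for i, (xi, ui) in enumerate(zip(x, u)):
        h = eps * max(abs(xi), 1.0)
        xp, xm = list(x), list(x)
        xp[i], xm[i] = xi + h, xi - h
        ci = (f(xp) - f(xm)) / (2.0 * h)   # sensitivity coefficient c_i
        uc2 += (ci * ui) ** 2
    return uc2 ** 0.5
```

In the benchmark context, f would be the simulated dose per incident electron as a function of geometric dimensions, source parameters, and cross-section data; identifying the dominant (c_i·u_i)² terms is exactly what singles out the significant uncertainty contributions.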
Sun, Wenping
2014-07-25
Yttrium and indium co-doped barium zirconate is investigated to develop a chemically stable and sintering-active proton conductor for solid oxide fuel cells (SOFCs). BaZr0.8Y0.2-xInxO3-δ possesses a pure cubic perovskite structure, and its sintering activity increases significantly with In concentration. BaZr0.8Y0.15In0.05O3-δ (BZYI5) exhibits the highest total electrical conductivity among the sintered oxides. BZYI5 also retains high chemical stability against CO2, water vapor, and reduction by H2. The good sintering activity, high conductivity, and chemical stability of BZYI5 facilitate the fabrication of durable SOFCs based on a highly conductive BZYI5 electrolyte film by cost-effective ceramic processes. A fully dense BZYI5 electrolyte film was successfully prepared on the anode substrate by a facile drop-coating technique followed by co-firing at 1400 °C for 5 h in air. The BZYI5 film exhibits one of the highest conductivities among BaZrO3-based electrolyte films prepared with various sintering aids. BZYI5-based single cells deliver a very encouraging and by far the highest peak power density for BaZrO3-based proton-conducting SOFCs, reaching 379 mW cm-2 at 700 °C. The results demonstrate that Y and In co-doping is an effective strategy for developing sintering-active and chemically stable BaZrO3-based proton conductors for high-performance proton-conducting SOFCs. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Kumada, H; Saito, K; Nakamura, T; Sakae, T; Sakurai, H; Matsumura, A; Ono, K
2011-12-01
Treatment planning for boron neutron capture therapy generally utilizes Monte Carlo methods for calculation of the dose distribution. The new treatment planning system JCDS-FX employs the multi-purpose Monte Carlo code PHITS to calculate the dose distribution. JCDS-FX makes it possible to build a precise voxel model consisting of pixel-based voxel cells at a scale of 0.4×0.4×2.0 mm(3) per voxel in order to perform high-accuracy dose estimation, e.g. for calculating the dose distribution in a human body. However, miniaturization of the voxel size increases the calculation time considerably. The aim of this study is to investigate sophisticated modeling methods that can perform Monte Carlo calculations for human geometry efficiently. We therefore devised a new voxel modeling method, the "Multistep Lattice-Voxel method," which can configure a voxel model combining different voxel sizes by applying the lattice function repeatedly. To verify the performance of calculations with this modeling method, several calculations for human geometry were carried out. The results demonstrated that the Multistep Lattice-Voxel method enabled the precise voxel model to reduce the calculation time substantially while keeping the high accuracy of dose estimation.
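The idea of mixing voxel sizes via nested lattices can be sketched with a two-level grid in which selected coarse cells hold a finer sub-grid, so that fine resolution is paid for only where it matters. All names and the structure here are a hypothetical illustration, not the PHITS/JCDS-FX lattice implementation:

```python
def voxel_index(p, origin, spacing, shape):
    """Map a point to its (clamped) integer cell index in a regular lattice."""
    return tuple(
        min(int((pi - oi) / si), ni - 1)
        for pi, oi, si, ni in zip(p, origin, spacing, shape)
    )

class MultistepGrid:
    """Coarse lattice whose cells hold either a material name or a nested,
    finer MultistepGrid (toy sketch of a multistep lattice-voxel model)."""

    def __init__(self, origin, spacing, shape, default="tissue"):
        self.origin, self.spacing, self.shape = origin, spacing, shape
        self.cells = {}            # cell index -> material name or sub-grid
        self.default = default

    def set_cell(self, idx, content):
        self.cells[idx] = content  # content: material name or MultistepGrid

    def material_at(self, p):
        idx = voxel_index(p, self.origin, self.spacing, self.shape)
        cell = self.cells.get(idx, self.default)
        return cell.material_at(p) if isinstance(cell, MultistepGrid) else cell
```

A transport code then resolves a particle position through as many lattice levels as exist at that location, keeping memory and tracking cost proportional to where fine voxels were actually placed.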
Energy Technology Data Exchange (ETDEWEB)
Pwa, Aung E-mail: a_pwa@postoffice.utas.edu.au; Siegele, R.; Cohen, D.D.; Stelcer, E.; Moort, J.C. van
2002-05-01
Proton induced X-ray emission (PIXE) and proton induced gamma ray emission (PIGME) analysis has been used in geochemical exploration to determine various elements in rocks and regolith in relation to gold and base metal mineralisation. Elements analysed by PIXE include K, Fe, Ca, Ti, Mn, Cl, Ga, Rb, Sr, Zr, Y, Nb, Cu, Zn, Pb, Ni, As, V and Mo, and those by PIGME are Al, Na, Mg, F and Li. One of our research areas is Cobar, northwest of New South Wales, Australia. The study areas include the McKinnons and Peak gold deposits, the Wagga Tank base metal deposit and Lower Tank prospect, northeast of the CSA mine. Au, Cu, Zn, Pb, As and Ni are elevated as ore indicators near and around the ore deposits while K, Al, Ca, Na, Ti, Rb, Sr, Ga and V are depleted due to feldspar and mica destruction during alteration.
Monte Carlo-Based Dosimetry of Beta-Emitters for Intravascular Brachytherapy
Energy Technology Data Exchange (ETDEWEB)
Choi, C.K.
2002-06-25
Monte Carlo simulations for radiation dosimetry, and experimental verification of the simulations, have been developed for the treatment geometry of intravascular brachytherapy, a form of radionuclide therapy for occluded coronary arteries (restenosis). The Monte Carlo code MCNP4C has been used to calculate the radiation dose from the encapsulated array of beta-emitting seeds (Sr/Y source train). Solid water phantoms were fabricated to measure the dose on radiochromic films exposed to the beta source train for both linear and curved coronary vessel geometries. While the dose difference for the 5-degree curved vessel at the prescription point of f+2.0 mm is within the 10% guideline set by the AAPM, the difference increases dramatically to 16.85% for the 10-degree case, which requires additional adjustment for acceptable dosimetry planning. The experimental dose measurements agree well with the simulation results.
Doronin, Alexander; Rushmeier, Holly E.; Meglinski, Igor; Bykov, Alexander V.
2016-03-01
We present a new Monte Carlo based approach to modelling the Bidirectional Scattering-Surface Reflectance Distribution Function (BSSRDF) for accurate rendering of human skin appearance. Variations in both skin tissue structure and the major chromophores are taken into account for different ethnic and age groups. The computational solution utilizes HTML5, accelerated by graphics processing units (GPUs), and is therefore convenient for practical use on most modern computer-based devices and operating systems. Results of the imitation of human skin reflectance spectra, the corresponding skin colours, and examples of 3D face rendering are presented and compared with the results of phantom studies.
Using a Monte-Carlo-based approach to evaluate the uncertainty on fringe projection technique
Molimard, Jérôme
2013-01-01
A complete uncertainty analysis of a given fringe projection set-up has been performed using a Monte Carlo approach; in particular, the calibration procedure is taken into account. Two applications are given: at a macroscopic scale, phase noise is predominant, whilst at a microscopic scale both phase noise and calibration errors are important. Finally, the uncertainty found at the macroscopic scale is close to some experimental tests (~100 μm).
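The underlying Monte Carlo recipe: perturb each input of the measurement chain according to its assumed error distribution, push every perturbed set through the full model, and read the output spread as the uncertainty. A generic sketch (the model and the sigma values in the test are placeholders, not the fringe-projection pipeline itself):

```python
import random
import statistics

def monte_carlo_uncertainty(model, nominal, sigmas, n=20_000, seed=0):
    """Propagate Gaussian input uncertainties through an arbitrary model.

    model   : function taking a list of input values
    nominal : nominal input values
    sigmas  : standard deviations of the (assumed independent) inputs
    Returns (mean, standard deviation) of the model output.
    """
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        sample = [rng.gauss(m, s) for m, s in zip(nominal, sigmas)]
        outputs.append(model(sample))
    return statistics.fmean(outputs), statistics.stdev(outputs)
```

Unlike first-order analytic propagation, this treats the model as a black box, which is what makes it practical when a calibration procedure with many correlated steps sits between the raw phase maps and the final shape.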
A Monte Carlo method based on antithetic variates for network reliability computations
El Khadiri, Mohamed; Rubino, Gerardo
1992-01-01
The exact evaluation of the usual reliability measures of communication networks is severely limited by the excessive computational time needed to obtain them. In the general case, the computation of almost all interesting reliability metrics is an NP-hard problem. An alternative approach is to estimate them by means of Monte Carlo simulation, which allows dealing with larger models than those that can be evaluated exactly. In this paper, we propose an algorithm much more per...
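The variance-reduction idea named in the title, in its simplest form: pair each uniform draw u with its antithetic partner 1-u. For a monotone integrand the two halves are negatively correlated, so their average has lower variance than two independent draws. A sketch on a scalar integral (network-reliability estimators apply the same pairing to the vector of uniforms driving the link states):

```python
import math
import random

def mc_plain(f, n, rng):
    """Crude Monte Carlo estimate of the integral of f over [0, 1]."""
    return sum(f(rng.random()) for _ in range(n)) / n

def mc_antithetic(f, n, rng):
    """Antithetic-variates estimate: average f over the pair (u, 1-u).

    Uses n//2 pairs, i.e. the same number of function evaluations as
    mc_plain with n samples."""
    pairs = n // 2
    total = 0.0
    for _ in range(pairs):
        u = rng.random()
        total += 0.5 * (f(u) + f(1.0 - u))
    return total / pairs
```

For f = exp, the antithetic estimator's variance is roughly 60 times smaller than the crude one's at equal cost, which is the kind of gain that makes Monte Carlo viable on reliability models too large for exact evaluation.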
Proton exchange membrane fuel cells modeling based on artificial neural networks
Institute of Scientific and Technical Information of China (English)
Yudong Tian; Xinjian Zhu; Guangyi Cao
2005-01-01
To address the complexity of mathematical models of the proton exchange membrane fuel cell (PEMFC) and their shortcomings for practical PEMFC control, the complex PEMFC mechanism and existing PEMFC models are analyzed, and artificial neural network based PEMFC modeling is advanced. The structure, algorithm, training, and simulation of PEMFC modeling based on improved BP networks are presented in detail. Computer simulation and conducted experiments verify that this model is fast and accurate, and can be used as a suitable operational model for PEMFC real-time control.
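A minimal illustration of the BP (backpropagation) modelling idea, fitting a one-input, one-output mapping with a single hidden layer of sigmoid units. The architecture, learning rate, and target function are toy choices for demonstration, not the improved BP network or the PEMFC data of the paper:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_bp(xs, ys, hidden=4, lr=0.1, epochs=3000, seed=0):
    """Train a 1-input, 1-output MLP (sigmoid hidden layer, linear output)
    with plain per-sample backpropagation; returns a predictor function."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            h = [sigmoid(w1[j] * x + b1[j]) for j in range(hidden)]
            out = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = out - y                      # dLoss/dout for L = err^2 / 2
            b2 -= lr * err
            for j in range(hidden):
                w2_old = w2[j]
                w2[j] -= lr * err * h[j]
                g = err * w2_old * h[j] * (1.0 - h[j])  # backprop through sigmoid
                w1[j] -= lr * g * x
                b1[j] -= lr * g
    def predict(x):
        return sum(w2[j] * sigmoid(w1[j] * x + b1[j]) for j in range(hidden)) + b2
    return predict
```

A PEMFC model of the kind described would use the same training loop with stack current, temperature, pressures, etc. as inputs and cell voltage as the output, trained on measured operating data.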
Tippayakul, Chanatip
The main objective of this research is to develop a practical fuel management system for the Pennsylvania State University Breazeale research reactor (PSBR) based on several advanced Monte Carlo coupled depletion methodologies. Primarily, this research involved two major activities: model and method development, and analyses and validation of the developed models and methods. The starting point of this research was the utilization of the earlier developed fuel management tool, TRIGSIM, to create the Monte Carlo model of core loading 51 (the end of the core loading). When comparing the normalized power results of the Monte Carlo model to those of the current fuel management system (using HELIOS/ADMARC-H), it was found that they agreed reasonably well (within 2%-3% difference on average). Moreover, the reactivity of some fuel elements was calculated by the Monte Carlo model and compared with measured data; the fuel element reactivity results of the Monte Carlo model were also in good agreement with the measurements. However, the subsequent task of analyzing the conversion from core loading 51 to core loading 52 using TRIGSIM showed quite significant differences in individual control rod worths between the Monte Carlo model and the current methodology model. The differences were mainly caused by inconsistent absorber atomic number densities between the two models. Hence, the model of the first operating core (core loading 2) was revised in light of new information about the absorber atomic densities, in order to validate the Monte Carlo model against the measured data. With the revised Monte Carlo model, the results agreed better with the measured data. Although TRIGSIM showed good modeling capabilities, its accuracy could be further improved by adopting more advanced algorithms; therefore, TRIGSIM was planned to be upgraded. The first task of upgrading TRIGSIM involved the improvement of the temperature modeling capability. The new TRIGSIM was
Singla, Nidhi; Bhadram, Venkata Srinu; Narayana, Chandrabhas; Chowdhury, Papia
2013-04-01
The motivation of the present work is to understand the optical, chemical, and electrical aspects of the proton transfer mechanism of indole (I) and some carbonyl based indole derivatives: indole-3-carboxaldehyde (I3C) and indole-7-carboxaldehyde (I7C) for both powder form and their liquid solution. Structural information for indole derivatives (isolated molecule and in solution) is obtained with density functional theory (DFT) and time dependent DFT (TD-DFT) methods. Calculated transition energies are used to generate UV-vis, FTIR, Raman, and NMR spectra which are later verified with the experimental spectra. The occurrence of different conformers [cis (N(c)), trans (N(t)), and zwitterion (Z*)] have been interpreted by Mulliken charge, natural bond orbital (NBO) analysis, and polarization versus electric field (P-E loop) studies. (1)H and (13)C NMR and molecular vibrational frequencies of the fundamental modes established the stability of Nc due to the presence of intramolecular hydrogen bonding (IHB) in the ground state (S0). Computed/experimental UV-vis absorption/emission studies reveal the creation of new species: zwitterion (Z*) and anion (A*) in the excited state (S1) due to excited state intramolecular and intermolecular proton transfer (ESI(ra)PT and ESI(er)PT). Increased electrical conductivity (σ(ac)) with temperature and increased ferroelectric polarization at higher field verifies proton conduction in I7C.
Monte Carlo Calculation of Energy Deposition by Delta Rays Around Ion Tracks
Institute of Scientific and Technical Information of China (English)
Zhang Chunxiang; Liu Xiaowei; et al.
1994-01-01
The radial distribution of dose around the path of a heavy ion has been studied by a Monte Carlo transport analysis of the delta rays produced along the track, based on classical binary-collision dynamics and a single-scattering model for the electron transport process. Results from this work are compared with the semi-empirical delta-ray theory of track structure, as well as with other Monte Carlo calculations, for 1 and 3 MeV protons and several heavy ions. The results of the Monte Carlo simulations for energetic heavy ions are in agreement with experimental data and with the results of different methods. The distinguishing feature of this Monte Carlo calculation is that it provides a direct simulation of the delta-ray theory of track structure.
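A toy version of the track-structure quantity being computed: in the classic point-target picture, the dose deposited by delta rays falls off roughly as 1/r² with radial distance from the ion path, out to the maximum delta-ray range. The normalization and cutoff below are arbitrary placeholders, not values fitted to the simulations of the paper:

```python
def radial_dose(r, k=1.0, r_max=10.0):
    """Toy 1/r^2 radial dose profile around an ion track.

    r     : radial distance from the ion path (arbitrary units)
    k     : arbitrary normalization constant (assumed, not fitted)
    r_max : maximum delta-ray range; dose is zero beyond it
    """
    return k / r ** 2 if 0.0 < r <= r_max else 0.0
```

Monte Carlo transport of the delta rays refines this picture near the track core and near r_max, where the simple inverse-square scaling breaks down; comparing the simulated profile to such analytic forms is exactly the comparison the abstract describes.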
Energy Technology Data Exchange (ETDEWEB)
Rivard, Mark J.; Melhus, Christopher S.; Granero, Domingo; Perez-Calatayud, Jose; Ballester, Facundo [Department of Radiation Oncology, Tufts University School of Medicine, Boston, Massachusetts 02111 (United States); Radiation Oncology Department, Physics Section, 'La Fe' University Hospital, Avenida Campanar 21, E-46009 Valencia (Spain); Department of Atomic, Molecular, and Nuclear Physics, University of Valencia, C/Dr. Moliner 50, E-46100 Burjassot, Spain and IFIC (University of Valencia-CSIC), C/Dr. Moliner 50, E-46100 Burjassot (Spain)
2009-06-15
Certain brachytherapy dose distributions, such as those for LDR prostate implants, are readily modeled by treatment planning systems (TPS) that use the superposition principle of individual seed dose distributions to calculate the total dose distribution. However, dose distributions for brachytherapy treatments using high-Z shields or having significant material heterogeneities are not currently well modeled using conventional TPS. The purpose of this study is to establish a new treatment planning technique (Tufts technique) that could be applied in some clinical situations where the conventional approach is not acceptable and dose distributions present cylindrical symmetry. Dose distributions from complex brachytherapy source configurations determined with Monte Carlo methods were used as input data. These source distributions included the 2 and 3 cm diameter Valencia skin applicators from Nucletron, 4-8 cm diameter AccuBoost peripheral breast brachytherapy applicators from Advanced Radiation Therapy, and a 16 mm COMS-based eye plaque using {sup 103}Pd, {sup 125}I, and {sup 131}Cs seeds. Radial dose functions and 2D anisotropy functions were obtained by positioning the coordinate system origin along the dose distribution cylindrical axis of symmetry. Origin:tissue distance and active length were chosen to minimize TPS interpolation errors. Dosimetry parameters were entered into the PINNACLE TPS, and dose distributions were subsequently calculated and compared to the original Monte Carlo-derived dose distributions. The new planning technique was able to reproduce brachytherapy dose distributions for all three applicator types, producing dosimetric agreement typically within 2% when compared with Monte Carlo-derived dose distributions. Agreement between Monte Carlo-derived and planned dose distributions improved as the spatial resolution of the fitted dosimetry parameters improved. For agreement within 5% throughout the clinical volume, spatial resolution of
Institute of Scientific and Technical Information of China (English)
QIAO Li-gen; SHI Wen-fang
2012-01-01
A series of novel amphibious organic/inorganic hybrid proton exchange membranes doped with H3PO4, usable under both wet and dry conditions, was prepared through a sol-gel process based on acrylated triethoxysilane (A-TES) and benzyltetrazole-modified triethoxysilane (BT-TES). A dual-curing approach combining UV curing and thermal curing was used to obtain the crosslinked membranes. Polyethylene glycol(400) diacrylate (PEGDA) was used as an oligomer to form the polymeric matrix. The molecular structures of the precursors were characterized by 1H, 13C and 29Si NMR spectra. Thermogravimetric analysis (TGA) results show that the membranes exhibit acceptable thermal stability for application above 200 °C. Differential scanning calorimetry (DSC) indicates that the crosslinked membranes with BT-TES/A-TES mass ratios below 1.6 and an H3PO4 loading equal in mass to the A-TES possess sub-zero Tgs, the lowest Tg (-28.9 °C) occurring for the membrane with a double mass of H3PO4. A high proton conductivity in the range of 9.4-17.3 mS/cm, with corresponding water uptake of 19.1%-32.8%, was measured at 90 °C under wet conditions. Meanwhile, the proton conductivity in a dry environment for the membrane with a BT-TES/A-TES mass ratio of 2.4 and double H3PO4 loading increases from 4.89 × 10^-2 mS/cm at 30 °C to 25.7 mS/cm at 140 °C. The excellent proton transport ability under both hydrous and anhydrous conditions demonstrates potential application in polymer electrolyte membrane fuel cells.
Forward hadron production in ultraperipheral proton-heavy-ion collisions at the LHC and RHIC
Mitsuka, Gaku
2015-01-01
We discuss hadron production in the forward rapidity region in ultraperipheral proton-lead collisions at the LHC and proton-gold collisions at RHIC. Our discussion is based on Monte Carlo simulations of the interactions of virtual photons, emitted by a fast-moving nucleus, with a proton beam. We simulate the virtual photon flux with the STARLIGHT event generator and then particle production with the SOPHIA, DPMJET, and PYTHIA event generators. We show the rapidity distributions of charged and neutral particles, and the momentum distributions of neutral pions and neutrons at forward rapidities. According to the Monte Carlo simulations, we find large ultraperipheral-collision cross sections for particle production, especially in the very forward region, leading to substantial background contributions to investigations of collective nuclear effects and spin physics. Finally, we can distinguish between proton-nucleus inelastic interactions and ultraperipheral collisions with additional requirements of either ...
An analytic linear accelerator source model for GPU-based Monte Carlo dose calculations
Tian, Zhen; Li, Yongbao; Folkerts, Michael; Shi, Feng; Jiang, Steve B.; Jia, Xun
2015-10-01
Recently, there has been considerable research interest in developing fast Monte Carlo (MC) dose calculation methods on graphics processing unit (GPU) platforms. A good linear accelerator (linac) source model is critical for both accuracy and efficiency. In principle, an analytical source model is preferable to a phase-space file-based model for GPU-based MC dose engines, since data loading and CPU-GPU data transfer can be avoided. In this paper, we presented an analytical field-independent source model specifically developed for GPU-based MC dose calculations, together with a GPU-friendly sampling scheme. A key concept called the phase-space ring (PSR) was proposed. Each PSR contained a group of particles that were of the same type, close in energy, and resided in a narrow ring on the phase-space plane located just above the upper jaws. The model parameterized the probability densities of particle location, direction and energy for each primary photon PSR, scattered photon PSR and electron PSR. Models of one 2D Gaussian distribution or multiple Gaussian components were employed to represent the particle direction distributions of these PSRs. A method was developed to analyze a reference phase-space file and derive the corresponding model parameters. To use our model efficiently in MC dose calculations on the GPU, we proposed a GPU-friendly sampling strategy, which ensured that the particles sampled and transported simultaneously are of the same type and close in energy, to alleviate GPU thread divergence. To test the accuracy of our model, dose distributions of a set of open fields in a water phantom were calculated using our source model and compared to those calculated using the reference phase-space files. For the high dose gradient regions, the average distance-to-agreement (DTA) was within 1 mm and the maximum DTA within 2 mm. For relatively low dose gradient regions, the root-mean-square (RMS) dose difference was within 1.1% and the maximum
Houska, Tobias; Multsch, Sebastian; Kraft, Philipp; Frede, Hans-Georg; Breuer, Lutz
2014-05-01
Computer simulations are widely used to support decision making and planning in the agricultural sector. On the one hand, many plant growth models use simplified hydrological processes and structures, e.g. a small number of soil layers or simple water flow approaches. On the other hand, plant growth processes are poorly represented in many hydrological models. Fully coupled models with a high degree of process representation would therefore allow a more detailed analysis of the dynamic behaviour of the soil-plant interface. We used the Python programming language to couple two such highly process-oriented independent models and to calibrate both models simultaneously. The Catchment Modelling Framework (CMF) simulated soil hydrology based on the Richards equation and the Van Genuchten-Mualem retention curve. CMF was coupled with the Plant growth Modelling Framework (PMF), which predicts plant growth on the basis of radiation use efficiency, degree days, water shortage and dynamic root biomass allocation. The Monte Carlo based Generalised Likelihood Uncertainty Estimation (GLUE) method was applied to parameterize the coupled model and to investigate the related uncertainty of the model predictions. Overall, 19 model parameters (4 for CMF and 15 for PMF) were analysed through 2 × 10^6 model runs randomly drawn from a uniformly distributed parameter space. Three objective functions were used to evaluate model performance: the coefficient of determination (R2), bias, and the Nash-Sutcliffe model efficiency (NSE). The model was applied to three sites with different management in Muencheberg (Germany) for the simulation of winter wheat (Triticum aestivum L.) in a cross-validation experiment. Field observations for model evaluation included soil water content and the dry matter of roots, storage organs, stems and leaves. The best parameter sets resulted in an NSE of 0.57 for the simulation of soil moisture across all three sites. The shape
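The GLUE procedure described in this abstract reduces to a simple loop: draw parameter sets from uniform priors, score each run with a likelihood measure such as NSE, and keep the "behavioral" sets above a threshold. A minimal sketch follows; the two-parameter toy model, prior ranges, and the 0.9 threshold are illustrative assumptions, not the CMF/PMF setup.

```python
import random

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / SS_tot."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def toy_model(a, b, xs):
    """Stand-in for the coupled model: a 2-parameter linear response."""
    return [a * x + b for x in xs]

rng = random.Random(42)
xs = [float(i) for i in range(10)]
obs = toy_model(2.0, 1.0, xs)            # synthetic "field observations"

behavioral = []                          # parameter sets retained by GLUE
for _ in range(5000):
    a = rng.uniform(0.0, 4.0)            # uniform priors over the space
    b = rng.uniform(-2.0, 4.0)
    score = nse(obs, toy_model(a, b, xs))
    if score > 0.9:                      # behavioral threshold
        behavioral.append((score, a, b))

best_score, best_a, best_b = max(behavioral)
```

The spread of the retained sets, rather than a single optimum, is what GLUE reports as parameter uncertainty.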
First tests for an online treatment monitoring system with in-beam PET for proton therapy
Kraan, Aafke C; Belcari, N; Camarlinghi, N; Cappucci, F; Ciocca, M; Ferrari, A; Ferretti, S; Mairani, A; Molinelli, S; Pullia, M; Retico, A; Sala, P; Sportelli, G; Del Guerra, A; Rosso, V
2014-01-01
PET imaging is a non-invasive technique for particle range verification in proton therapy. It is based on measuring the beta+ annihilations caused by nuclear interactions of the protons in the patient. In this work we present measurements for proton range verification in phantoms, performed at the CNAO particle therapy treatment center in Pavia, Italy, with our 10 x 10 cm^2 planar PET prototype DoPET. PMMA phantoms were irradiated with mono-energetic proton beams and clinical treatment plans, and PET data were acquired during and shortly after proton irradiation. We created 1-D profiles of the beta+ activity along the proton beam-axis, and evaluated the difference between the proximal rise and the distal fall-off position of the activity distribution. A good agreement with FLUKA Monte Carlo predictions was obtained. We also assessed the system response when the PMMA phantom contained an air cavity. The system was able to detect these cavities quickly after irradiation.
Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy
Energy Technology Data Exchange (ETDEWEB)
Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)
2012-05-15
Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75 μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, the CT image voxel grid (a few cubic millimeters per voxel) was decoupled from the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at
Generation of scintigraphic images in a virtual dosimetry trial based on Monte Carlo modelling
International Nuclear Information System (INIS)
Aim: the purpose of dosimetry calculations in therapeutic nuclear medicine is to maximize tumour absorbed dose while minimizing normal tissue toxicities. However, a wide heterogeneity of dosimetric approaches is observed: there is no standardized dosimetric protocol to date. The DosiTest project (www.dositest.com) intends to identify critical steps in the dosimetry chain by implementing clinical dosimetry in different nuclear medicine departments, on scintigraphic images generated by Monte Carlo simulation from the same virtual patient. This study presents the different steps contributing to image generation, following the imaging protocol of a given participating centre, Milan's European Institute of Oncology (IEO). Materials and methods: the chosen clinical application is that of 111In-pentetreotide (OctreoscanTM). Pharmacokinetic data from the literature are used to derive a compartmental model. The kinetic rates between 6 compartments (liver, spleen, kidneys, blood, urine, remainder of body) were obtained from WinSaam [3]: the activity in each compartment is known at any time point. The TestDose [1] software (the computing architecture of DosiTest) implements the NURBS-based phantom NCAT-WB [2] to generate anatomical data for the virtual patient. The IEO gamma camera was modelled with GATE [4] v6.2. Scintigraphic images were simulated for each compartment and the resulting projections were weighted by the respective pharmacokinetics of each compartment. The final step consisted of aggregating the compartments to generate the resulting image. Results: following IEO's imaging protocol, planar and tomographic image simulations were generated at various time points. Computation times (on a 480 virtual cores computing cluster) for 'step and shoot' whole body simulations (5 steps/time point) and acceptable statistics were: 10 days for extra-vascular fluid, 28 h for blood, 12 h for liver, 7 h for kidneys, and 1-2 h for
Proton radiography to improve proton therapy treatment
Takatsu, J.; van der Graaf, E. R.; Van Goethem, M.-J.; van Beuzekom, M.; Klaver, T.; Visser, J.; Brandenburg, S.; Biegun, A. K.
2016-01-01
The quality of cancer treatment with protons critically depends on an accurate prediction of the proton stopping powers for the tissues traversed by the protons. Today, treatment planning in proton radiotherapy is based on stopping power calculations from densities of X-ray Computed Tomography (CT) images. This causes systematic uncertainties in the calculated proton range in a patient of typically 3-4%, which can reach 10% in bone regions [1,2,3,4,5,6,7,8]. This may lead to parts of the tumor receiving no dose and healthy tissues receiving too high a dose [1]. A direct measurement of proton stopping powers with high-energy protons would reduce these uncertainties and improve the quality of the treatment. Several studies have shown that a sufficiently accurate radiograph can be obtained by tracking individual protons traversing a phantom (patient) [4,6,10]. Our studies benefit from gas-filled time projection chambers based on GridPix technology [2], developed at Nikhef, capable of tracking a single proton. A BaF2 crystal measuring the residual energy of the protons was used. Proton radiographs of a phantom consisting of different tissue-like materials were measured with a 30 × 30 mm^2, 150 MeV proton beam. The measurements were simulated with the Geant4 toolkit. The first experimental and simulated energy radiographs are in very good agreement [3]. In this paper we focus on simulation studies of the proton scattering angle, as it affects the position resolution of the proton energy-loss radiograph. By selecting protons with a small scattering angle, the image quality can be improved significantly.
Proton therapy delivers radiation to tumor tissue ... What is proton therapy and how is it used? Protons are ...
International Nuclear Information System (INIS)
This paper presents an unstructured mesh based multi-physics interface implemented in the Serpent 2 Monte Carlo code, for the purpose of coupling the neutronics solution to component-scale thermal hydraulics calculations, such as computational fluid dynamics (CFD). The work continues the development of a multi-physics coupling scheme, which relies on the separation of state-point information from the geometry input, and the capability to handle temperature and density distributions by a rejection sampling algorithm. The new interface type is demonstrated by a simplified molten-salt reactor test case, using a thermal hydraulics solution provided by the CFD solver in OpenFOAM. (author)
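The rejection-sampling treatment of continuous temperature and density distributions mentioned in this abstract is, in essence, delta-tracking against a majorant cross section: flight lengths are sampled from the majorant, and each tentative collision is accepted with the ratio of the local to the majorant cross section. A minimal single-particle sketch, where the sinusoidal cross-section field and all numerical values are invented stand-ins for a field received from a CFD solver:

```python
import math, random

def sigma_t(x):
    """Position-dependent total cross section (1/cm); a stand-in for a
    temperature/density-dependent field (invented for illustration)."""
    return 0.5 + 0.4 * math.sin(x)          # bounded above by 0.9

SIGMA_MAJ = 0.9                              # majorant cross section (1/cm)

def sample_collision_site(rng, x0=0.0):
    """Delta-tracking: sample tentative collision sites from the majorant
    and accept each with probability sigma_t(x)/SIGMA_MAJ (rejection step).
    No material lookups are needed between tentative sites."""
    x = x0
    while True:
        # exponential flight length drawn from the majorant
        x += -math.log(1.0 - rng.random()) / SIGMA_MAJ
        if rng.random() < sigma_t(x) / SIGMA_MAJ:
            return x

rng = random.Random(1)
sites = [sample_collision_site(rng) for _ in range(20000)]
mean_path = sum(sites) / len(sites)
```

Because only the majorant must be known in advance, the spatial distribution of cross sections can change between iterations without rebuilding any geometry, which is what makes the scheme attractive for coupled neutronics/CFD calculations.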
Sampling-Based Nuclear Data Uncertainty Quantification for Continuous Energy Monte Carlo Codes
Zhu, Ting
2015-01-01
The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte Carlo (MC) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the code of choice for routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. The methodology developed during this PhD research is fundamentally ...
Monte Carlo calculations for design of An accelerator based PGNAA facility
Energy Technology Data Exchange (ETDEWEB)
Nagadi, M.M.; Naqvi, A.A. [King Fahd University of Petroleum and Minerals, Center for Applied Physical Sciences, Dhahran (Saudi Arabia); Rehman, Khateeb-ur; Kidwai, S. [King Fahd University of Petroleum and Minerals, Department of Physics, Dhahran (Saudi Arabia)
2002-08-01
Monte Carlo calculations were carried out for the design of a setup for Prompt Gamma Ray Neutron Activation Analysis (PGNAA) with 14 MeV neutrons to analyze cement raw material samples. The calculations were carried out using the MCNP4B2 code. Various geometry parameters of the PGNAA experimental setup, such as sample thickness, moderator geometry and detector shielding, were optimized by maximizing the prompt gamma ray yield of different elements of the sample material. Finally, calibration curves of the PGNAA setup were generated for various concentrations of calcium in the sample material. Results of this simulation are presented. (author)
Microlens assembly error analysis for light field camera based on Monte Carlo method
Li, Sai; Yuan, Yuan; Zhang, Hao-Wei; Liu, Bin; Tan, He-Ping
2016-08-01
This paper presents a numerical analysis of microlens assembly errors in light field cameras using the Monte Carlo method. Assuming no manufacturing errors, a home-built program was used to simulate images affected by the coupling-distance, movement, and rotation errors that can arise during microlens installation. From these raw images, sub-aperture images, and refocused images, we found that different microlens assembly errors produce different degrees of blur and deformation, while the sub-aperture images exhibit aliasing, obscuration, and other distortions that result in unclear refocused images.
Random vibration analysis of switching apparatus based on Monte Carlo method
Institute of Scientific and Technical Information of China (English)
ZHAI Guo-fu; CHEN Ying-hua; REN Wan-bin
2007-01-01
The performance of switching apparatus containing mechanical contacts in a vibration environment is an important element in judging the apparatus's reliability. A piecewise-linear two-degree-of-freedom mathematical model accounting for contact loss was built in this work, and the vibration performance of the model under random external Gaussian white-noise excitation was investigated using Monte Carlo simulation in Matlab/Simulink. Simulation showed that the spectral content and statistical characteristics of the contact force agree well with reality. The random vibration behavior of the contact system was solved using time-domain (numerical) simulation in this paper. The conclusions reached here are of great importance for the reliability design of switching apparatus.
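A time-domain Monte Carlo run of such a piecewise-linear contact model can be sketched outside Simulink as well. All parameter values below (masses, stiffnesses, preload, noise level) are illustrative assumptions, not the paper's; the one feature carried over is the contact nonlinearity, i.e. the contact spring can push but not pull, so the transmitted force clips at zero during contact loss.

```python
import math, random

# Semi-implicit Euler integration of a piecewise-linear 2-DOF contact
# system under Gaussian white-noise excitation (illustrative values).
m1, m2 = 0.01, 0.002        # kg: armature-side and contact-side masses
k1, k2 = 2.0e3, 5.0e3       # N/m: suspension and contact-spring stiffness
c1, c2 = 0.5, 0.2           # N*s/m: damping coefficients
preload = 0.5               # N: static contact preload
noise_std = 20.0            # white-noise intensity, (m/s^2)*sqrt(s)
dt, n_steps = 1.0e-5, 100000

rng = random.Random(7)
x1 = v1 = x2 = v2 = 0.0
forces = []
for _ in range(n_steps):
    pen = x1 - x2
    # piecewise linearity: force clips at zero when contact is lost
    f_contact = max(preload + k2 * pen + c2 * (v1 - v2), 0.0)
    a_ext = rng.gauss(0.0, noise_std / math.sqrt(dt))  # discretized white noise
    a1 = (-k1 * x1 - c1 * v1 - f_contact) / m1 + a_ext
    a2 = (f_contact - k1 * x2 - c1 * v2) / m2
    v1 += a1 * dt; x1 += v1 * dt
    v2 += a2 * dt; x2 += v2 * dt
    forces.append(f_contact)

mean_force = sum(forces) / len(forces)
loss_fraction = sum(f == 0.0 for f in forces) / len(forces)
```

The recorded force history can then be post-processed for the spectral and statistical characteristics the abstract refers to; `loss_fraction` is a simple scalar measure of how often contact is lost.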
Cambraia Lopes, P; Clementel, E; Crespo, P; Henrotin, S; Huizenga, J.; G. Janssens; Parodi, K.; Prieels, D.; Roellinghoff, F; Smeets, J.; Stichelbaut, F.; Schaart, D. R.
2015-01-01
Proton range monitoring may facilitate online adaptive proton therapy and improve treatment outcomes. Imaging of proton-induced prompt gamma (PG) rays using a knife-edge slit collimator is currently under investigation as a potential tool for real-time proton range monitoring. A major challenge in collimated PG imaging is the suppression of neutron-induced background counts. In this work, we present an initial performance test of two knife-edge slit camera prototypes based on arrays of digita...
Particle Swarm Optimization based predictive control of Proton Exchange Membrane Fuel Cell (PEMFC)
Institute of Scientific and Technical Information of China (English)
[No author listed]
2006-01-01
Proton Exchange Membrane Fuel Cells (PEMFCs) are a main focus of current fuel cell development as power sources because they offer higher power density and faster start-up than other fuel cells. The humidification system and output performance of a PEMFC stack are briefly analyzed. Predictive control of the PEMFC based on a Support Vector Regression Machine (SVRM) is presented and the SVRM is constructed. The plant is modelled with the SVRM, and the predictive control law is obtained using Particle Swarm Optimization (PSO). Simulation results showed that the SVRM model and the PSO receding-horizon optimization applied to PEMFC predictive control yielded good performance.
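In a receding-horizon scheme like the one described, PSO is simply the numerical optimizer that minimizes the predictive cost at each step. A minimal global-best PSO sketch follows; the one-step voltage-tracking cost, its gain of 0.8, and all other numbers are hypothetical placeholders for the SVRM-based prediction model, which is not reproduced here.

```python
import random

def pso_minimize(cost, dim, n_particles=30, iters=100, seed=0,
                 lo=-5.0, hi=5.0, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer (global-best topology)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Hypothetical one-step predictive-control cost: track a stack-voltage
# setpoint while penalizing large control moves (numbers are invented).
setpoint, v_now = 48.0, 45.0
cost = lambda u: (v_now + 0.8 * u[0] - setpoint) ** 2 + 0.1 * u[0] ** 2
u_opt, _ = pso_minimize(cost, dim=1)
```

Because PSO is derivative-free, it can wrap any black-box predictor (here it would wrap the SVRM model) without requiring gradients of the cost.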
International Nuclear Information System (INIS)
In computational physics, proton transfer phenomena can be viewed as pattern classification problems based on a set of input features that allow classification of the proton motion into two categories: transfer 'occurred' and transfer 'not occurred'. The goal of this paper is to evaluate the use of artificial neural networks in the classification of proton transfer events, based on a feed-forward back-propagation neural network used as a classifier to distinguish between the two transfer cases. In this paper, we use a newly developed data mining and pattern recognition tool for automating, controlling, and charting the output data of an existing Empirical Valence Bond code. The study analyzes the need for pattern recognition in aqueous proton transfer processes and how the error back-propagation learning approach (multilayer perceptron algorithms) can be satisfactorily employed in the present case. We present a tool for pattern recognition and validate the code on a real physical case study. The results of applying the artificial neural network methodology to patterns based upon selected physical properties (e.g., temperature, density) show the ability of the network to learn proton transfer patterns corresponding to properties of the aqueous environments, which in turn proves to be fully compatible with previous proton transfer studies. (condensed matter: structural, mechanical, and thermal properties)
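The classifier described, a feed-forward network trained with error back-propagation on two output classes, can be sketched from scratch in a few dozen lines. The two input features and the labeling rule below are synthetic stand-ins (not the paper's EVB features), chosen only so the two classes are separable:

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic stand-in data: two features (imagine a donor-acceptor distance
# and a temperature-like variable), label 1 = "transfer occurred".
random.seed(3)
data = []
for _ in range(200):
    d, t = random.uniform(0.0, 1.0), random.uniform(0.0, 1.0)
    label = 1.0 if d + 0.5 * t < 0.8 else 0.0   # hypothetical rule
    data.append(((d, t), label))

n_in, n_hid, lr = 2, 4, 0.5
w1 = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
w2 = [random.uniform(-1, 1) for _ in range(n_hid + 1)]

for epoch in range(300):
    for (x, y) in data:
        xi = list(x) + [1.0]                    # input plus bias
        h = [sigmoid(sum(w * v for w, v in zip(row, xi))) for row in w1]
        hi = h + [1.0]                          # hidden plus bias
        out = sigmoid(sum(w * v for w, v in zip(w2, hi)))
        # back-propagation of the squared error
        delta_o = (out - y) * out * (1.0 - out)
        delta_h = [delta_o * w2[j] * h[j] * (1.0 - h[j]) for j in range(n_hid)]
        w2 = [w - lr * delta_o * v for w, v in zip(w2, hi)]
        for j in range(n_hid):
            w1[j] = [w - lr * delta_h[j] * v for w, v in zip(w1[j], xi)]

correct = 0
for (x, y) in data:
    xi = list(x) + [1.0]
    h = [sigmoid(sum(w * v for w, v in zip(row, xi))) for row in w1] + [1.0]
    out = sigmoid(sum(w * v for w, v in zip(w2, h)))
    correct += (out > 0.5) == (y > 0.5)
accuracy = correct / len(data)
```

In the paper's setting the features would come from the EVB trajectories and the labels from observed transfer events; the training loop itself is unchanged.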
Energy Technology Data Exchange (ETDEWEB)
Al-Subeihi, Ala' A.A., E-mail: subeihi@yahoo.com [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); BEN-HAYYAN-Aqaba International Laboratories, Aqaba Special Economic Zone Authority (ASEZA), P. O. Box 2565, Aqaba 77110 (Jordan); Alhusainy, Wasma; Kiwamoto, Reiko; Spenkelink, Bert [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Bladeren, Peter J. van [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Nestec S.A., Avenue Nestlé 55, 1800 Vevey (Switzerland); Rietjens, Ivonne M.C.M.; Punt, Ans [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands)
2015-03-01
The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions, or by combining kinetic constants obtained for specific isoenzymes with literature-reported human variation in the activity of these enzymes. The results indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations, a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions of interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment.
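The CSAF derivation described, dividing an upper percentile of the simulated population distribution by its median, is easy to illustrate. The toy model below is an assumption for illustration only: metabolite formation is taken as a product of lognormally varying enzyme activities, which is not the paper's PBK model and uses invented variances.

```python
import random

random.seed(11)

def simulate_individual(rng):
    """Per-individual metabolite formation as a product of lognormally
    varying pathway activities (all sigmas invented for illustration)."""
    bioactivation = rng.lognormvariate(0.0, 0.4)   # 1'-hydroxylation step
    detox = rng.lognormvariate(0.0, 0.3)           # competing detox route
    sulfation = rng.lognormvariate(0.0, 0.5)       # sulfonation step
    return bioactivation * sulfation / detox

population = sorted(simulate_individual(random) for _ in range(100000))

def percentile(sorted_vals, p):
    idx = min(len(sorted_vals) - 1, int(p / 100.0 * len(sorted_vals)))
    return sorted_vals[idx]

# CSAF: ratio of an upper percentile to the population median
csaf_90 = percentile(population, 90) / percentile(population, 50)
csaf_99 = percentile(population, 99) / percentile(population, 50)
```

Comparing `csaf_90` against the default kinetic uncertainty factor of 3.16 mirrors the check performed in the abstract; a heavier-tailed simulated population pushes `csaf_99` well above it.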
Fluence-based dosimetry of proton and heavier ion beams using single track detectors
Klimpki, G.; Mescher, H.; Akselrod, M. S.; Jäkel, O.; Greilich, S.
2016-02-01
Due to their superior spatial resolution, small and biocompatible fluorescent nuclear track detectors (FNTDs) open up the possibility of characterizing swift heavy charged particle fields on a single track level. Permanently stored spectroscopic information such as energy deposition and particle field composition is of particular importance in heavy ion radiotherapy, since radiation quality is one of the decisive predictors for clinical outcome. Findings presented within this paper aim towards single track reconstruction and fluence-based dosimetry of proton and heavier ion fields. Three-dimensional information on individual ion trajectories through the detector volume is obtained using fully automated image processing software. Angular distributions of multidirectional fields can be measured accurately within ±2° uncertainty. This translates into less than 5% overall fluence deviation from the chosen irradiation reference. The combination of single ion tracking with an improved energy loss calibration curve based on 90 FNTD irradiations with protons as well as helium, carbon and oxygen ions enables spectroscopic analysis of a detector irradiated in Bragg peak proximity of a 270 MeV u^-1 carbon ion field. Fluence-based dosimetry results agree with treatment planning software reference.
International Nuclear Information System (INIS)
The numerical simulation of the dynamics of fast ions coming from neutral beam injection (NBI) heating is an important task in fusion devices, since these particles are used to heat and fuel the plasma and their uncontrolled losses can damage the walls of the reactor. This paper presents a new application that simulates these dynamics on the grid: FastDEP. FastDEP plugs together two Monte Carlo codes used in fusion science, namely FAFNER2 and ISDEP, and adds new functionality. Physically, FAFNER2 provides the fast ion initial state in the device while ISDEP calculates the ions' evolution in time; as a result, the fast ion distribution function in the TJ-II stellarator has been estimated, but the code can be used on any other device. In this paper a comparison between the physics of the two NBI injectors in TJ-II is presented, together with the differences between fast ion confinement and the driven momentum in the two cases. The simulations have been obtained using Montera, a framework developed for achieving efficient grid executions of Monte Carlo applications. (paper)
A Monte Carlo and continuum study of mechanical properties of nanoparticle based films
Energy Technology Data Exchange (ETDEWEB)
Ogunsola, Oluwatosin; Ehrman, Sheryl [University of Maryland, Department of Chemical and Biomolecular Engineering, Chemical and Nuclear Engineering Building (United States)], E-mail: sehrman@eng.umd.edu
2008-01-15
A combined Monte Carlo and equivalent-continuum simulation approach was used to investigate the structure-mechanical property relationships of titania nanoparticle deposits. Films of titania composed of nanoparticle aggregates were simulated using a Monte Carlo approach with diffusion-limited aggregation. Each aggregate in the simulation is fractal-like and random in structure. In the film structure, bond strength is assumed to be a function of distance, with two limiting values: one representing the strong chemical bond between particles at closest proximity within an aggregate, and the other representing the weak van der Waals bond between particles from different aggregates. The Young's modulus of the film is estimated using an equivalent-continuum modeling approach, and the influences of particle diameter (5-100 nm) and aggregate size (3-400 particles per aggregate) on the predicted Young's modulus are investigated. The Young's modulus is observed to increase with decreasing primary particle size and is independent of the size of the aggregates deposited. Decreasing porosity resulted in an increase in Young's modulus, as expected from results reported previously in the literature.
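The fractal-like, random aggregates mentioned above are the typical output of diffusion-limited aggregation. A minimal on-lattice 2-D sketch (the paper's simulation details are not reproduced; launch and kill radii here are arbitrary choices):

```python
import math, random

def dla_aggregate(n_particles=150, seed=5):
    """Grow a 2-D on-lattice diffusion-limited aggregate: each particle
    random-walks inward from a launch circle and sticks to the cluster
    on first contact, producing a fractal-like, random structure."""
    rng = random.Random(seed)
    occupied = {(0, 0)}                      # seed particle at the origin
    r_max = 0.0
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(n_particles):
        r_launch = r_max + 5.0
        ang = rng.uniform(0.0, 2.0 * math.pi)
        x, y = int(r_launch * math.cos(ang)), int(r_launch * math.sin(ang))
        while True:
            dx, dy = rng.choice(steps)
            x, y = x + dx, y + dy
            if x * x + y * y > (r_launch + 10.0) ** 2:
                # walker escaped: relaunch it on the launch circle
                ang = rng.uniform(0.0, 2.0 * math.pi)
                x = int(r_launch * math.cos(ang))
                y = int(r_launch * math.sin(ang))
            elif any((x + ex, y + ey) in occupied for ex, ey in steps):
                occupied.add((x, y))         # stick on first contact
                r_max = max(r_max, math.hypot(x, y))
                break
    return occupied

cluster = dla_aggregate()
radius = max(math.hypot(px, py) for px, py in cluster)
```

The sticking rule can be generalized (e.g. distance-dependent sticking probabilities) to mimic the strong intra-aggregate versus weak inter-aggregate bonding distinction drawn in the abstract.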
A lattice-based Monte Carlo evaluation of Canada Deuterium Uranium-6 safety parameters
Energy Technology Data Exchange (ETDEWEB)
Kim, Yong Hee; Hartanto, Donny; Kim, Woo Song [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of)
2016-06-15
Important safety parameters such as the fuel temperature coefficient (FTC) and the power coefficient of reactivity (PCR) of the CANada Deuterium Uranium (CANDU-6) reactor have been evaluated using the Monte Carlo method. For accurate analysis of the parameters, the Doppler broadening rejection correction scheme was implemented in the MCNPX code to account for the thermal motion of the heavy uranium-238 nucleus in the neutron-U scattering reactions. In this work, a standard fuel lattice has been modeled and the fuel is depleted using MCNPX. The FTC value is evaluated for several burnup points including the mid-burnup representing a near-equilibrium core. The Doppler effect has been evaluated using several cross-section libraries such as ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1.1, and JENDL-4.0. The PCR value is also evaluated at mid-burnup conditions to characterize the safety features of an equilibrium CANDU-6 reactor. To improve the reliability of the Monte Carlo calculations, we considered a huge number of neutron histories in this work and the standard deviation of the k-infinity values is only 0.5-1 pcm.
Energy Technology Data Exchange (ETDEWEB)
Burke, TImothy P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Martin, William R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-11-19
Kernel density estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. They are an alternative to histogram tallies for obtaining global solutions in Monte Carlo tallies. With KDEs, a single event, either a collision or a particle track, can contribute to the score at multiple tally points, with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed-source shielding applications. However, little work has been done on obtaining reaction rates using KDEs. This paper introduces a new form of the mean-free-path (MFP) KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies in the solution. An ad hoc remedy for these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.
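The central KDE idea, a single collision or track event contributing to many tally points at once, can be sketched with a minimal 1D collision-density tally (our own illustration, using an Epanechnikov kernel; it is not the MFP KDE form the paper introduces):

```python
import numpy as np

def kde_collision_tally(collision_x, weights, tally_points, bandwidth):
    """Score a KDE tally: each collision spreads its weight over nearby
    tally points through an Epanechnikov kernel, so a single event can
    contribute to many points at once (unlike a histogram bin)."""
    x = np.asarray(collision_x, dtype=float)[:, None]   # events (rows)
    t = np.asarray(tally_points, dtype=float)[None, :]  # evaluation points
    u = (t - x) / bandwidth
    kernel = np.where(np.abs(u) < 1.0, 0.75 * (1.0 - u**2), 0.0)
    w = np.asarray(weights, dtype=float)[:, None]
    return (w * kernel / bandwidth).sum(axis=0)
```

Each event scores at every tally point within one bandwidth of it, which is why the pointwise uncertainty is decoupled from the resolution at which the solution is evaluated.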
International Nuclear Information System (INIS)
Over the years, various types of tritium-in-air monitors have been designed and developed based on different principles. Ionization chambers, proportional counters and scintillation detector systems are a few among them. A plastic scintillator based, flow-cell type system was developed for online monitoring of tritium in air. The scintillator mass inside the cell volume that maximizes the response of the detector system should be determined in order to achieve maximum efficiency. The present study aims to optimize the mass of the plastic scintillator film for the flow-cell based tritium monitoring instrument so that maximum efficiency is achieved. The Monte Carlo based EGSnrc code system has been used for this purpose.
Energy Technology Data Exchange (ETDEWEB)
Doolan, P [University College London, London (United Kingdom); Massachusetts General Hospital, Boston, MA (United States); Sharp, G; Testa, M; Lu, H-M [Massachusetts General Hospital, Boston, MA (United States); Bentefour, E [Ion Beam Applications (IBA), Louvain la Neuve (Belgium); Royle, G [University College London, London (United Kingdom)
2014-06-15
Purpose: Beam range uncertainty in proton treatment comes primarily from converting the patient's X-ray CT (xCT) dataset to relative stopping power (RSP). Current practices use a single curve for this conversion, produced by a stoichiometric calibration based on tissue composition data for average, healthy, adult humans, but not for the individual in question. Proton radiographs produce water-equivalent path length (WEPL) maps, dependent on the RSP of tissues within the specific patient. This work investigates the use of such WEPL maps to optimize patient-specific calibration curves for reducing beam range uncertainty. Methods: The optimization procedure works on the principle of minimizing the difference between the known WEPL map, obtained from a proton radiograph, and a digitally reconstructed WEPL map (DRWM) through an RSP dataset, by altering the calibration curve that is used to convert the xCT into an RSP dataset. DRWMs were produced with Plastimatch, software developed in-house, and the optimization procedure was implemented in Matlab. Tests were made on a range of systems, including simulated datasets with computed WEPL maps and phantoms (anthropomorphic and real biological tissue) with WEPL maps measured by single-detector proton radiography. Results: For the simulated datasets, the optimizer showed excellent results: it was able to either completely eliminate or significantly reduce the root-mean-square error (RMSE) in the WEPL for the homogeneous phantoms (to zero for individual materials, or from 1.5% to 0.2% for the simultaneous optimization of multiple materials). For the heterogeneous phantom the RMSE was reduced from 1.9% to 0.3%. Conclusion: An optimization procedure has been designed to produce patient-specific calibration curves. Test results on a range of systems with different complexities and sizes have been promising for accurate beam range control in patients. This project was funded equally by the Engineering and Physical Sciences
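The principle of this optimization, adjusting the conversion until digitally reconstructed WEPLs match measured ones, reduces in a simplified noise-free setting to a linear least-squares problem. The sketch below uses invented intersection lengths and RSP values and is not the Plastimatch/Matlab implementation:

```python
import numpy as np

# Toy calibration-curve optimisation: each proton ray i crosses known
# intersection lengths A[i, j] of tissue bin j (from the xCT), and its
# measured WEPL is b[i] (from the radiograph).  The per-bin RSP values
# minimising the WEPL RMSE then solve a least-squares problem.
# All numbers below are made up for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [0.5, 2.5, 1.0],
              [1.0, 0.0, 3.0],
              [0.0, 1.5, 2.0]])          # cm of each tissue bin per ray
true_rsp = np.array([1.00, 1.04, 1.10])  # hypothetical ground truth
b = A @ true_rsp                         # "measured" WEPLs (noise-free)

rsp_fit, *_ = np.linalg.lstsq(A, b, rcond=None)
rmse = np.sqrt(np.mean((A @ rsp_fit - b) ** 2))
```

With real, noisy radiographic WEPLs the fit would not be exact, and the optimization would instead reduce the RMSE as reported in the abstract.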
Commissioning of a compact laser-based proton beam line for high intensity bunches around 10 MeV
Busold, S.; Schumacher, D.; Deppert, O.; Brabetz, C.; Kroll, F.; Blažević, A.; Bagnoud, V.; Roth, M.
2014-03-01
We report on the first results of experiments with a new laser-based proton beam line at the GSI accelerator facility in Darmstadt. It delivers high-current bunches at proton energies around 9.6 MeV, containing more than 10^9 particles in less than 10 ns and with tunable energy spread down to 2.7% (ΔE/E0 at FWHM). A target normal sheath acceleration stage serves as the proton source and a pulsed solenoid provides beam collimation and energy selection. Finally, a synchronous radio frequency (rf) field is applied via an rf cavity for energy compression at a synchronous phase of -90 deg. The proton bunch is characterized at the end of the very compact beam line, only 3 m behind the laser-matter interaction point, which defines the particle source.
Analysis of accelerator based neutron spectra for BNCT using proton recoil spectroscopy
Energy Technology Data Exchange (ETDEWEB)
Wielopolski, L.; Ludewig, H.; Powell, J.R.; Raparia, D.; Alessi, J.G.; Lowenstein, D.I.
1999-03-01
Boron Neutron Capture Therapy (BNCT) is a promising binary treatment modality for high-grade primary brain tumors (glioblastoma multiforme, GM) and other cancers. BNCT employs a boron-10 containing compound that preferentially accumulates in the cancer cells in the brain. Upon neutron capture by {sup 10}B, energetic alpha particles and tritons released at the absorption site kill the cancer cell. To gain penetration depth in the brain, Fairchild proposed the use of energetic epithermal neutrons at about 10 keV. Phase I/II clinical trials of BNCT for GM are underway at the Brookhaven Medical Research Reactor (BMRR) and at the MIT Reactor, using these nuclear reactors as the source of epithermal neutrons. In light of the limitations of new reactor installations, e.g. cost, safety and licensing, and the limited capability for modulating reactor-based neutron beam energy spectra, alternative neutron sources are being contemplated for wider implementation of this modality in a hospital environment. For example, accelerator-based neutron sources offer the possibility of tailoring the neutron beams to the individual, in terms of improved depth-dose distributions, and offer, with relative ease, the capability of modifying the neutron beam energy and port size. In previous work new concepts for a compact accelerator/target configuration were published. In this work, using a Van de Graaff accelerator, the authors have explored different materials for filtering and reflecting neutron beams produced by irradiating a thick Li target with 1.8 to 2.5 MeV proton beams. However, since the yield and the maximum neutron energy emerging from the Li-7(p,n)Be-7 reaction increase with the proton beam energy, the proton energy must be optimized against the filter and shielding requirements to obtain the desired epithermal neutron beam. The MCNP-4A computer code was used for the initial design studies that were verified with benchmark
Effect of a proton conducting filler on the physico-chemical properties of SPEEK-based membranes
Energy Technology Data Exchange (ETDEWEB)
Mecheri, B.; Chen, F.; Traversa, E. [Department of Chemical Science and Technology, University of Rome 'Tor Vergata', Via della Ricerca Scientifica 1, 00133 Roma (Italy); D'Epifanio, A. [Department of Chemical Science and Technology, University of Rome 'Tor Vergata', Via della Ricerca Scientifica 1, 00133 Roma (Italy); Hunter College of the City University of New York, New York, NY 10065 (United States); Pisani, L. [CRS4 Parco Scientifico e Tecnologico, POLARIS, 09010 Pula (CA) (Italy); Weise, F.C.; Greenbaum, S. [Hunter College of the City University of New York, New York, NY 10065 (United States); Licoccia, S.
2009-08-15
Composite membranes based on sulphonated poly(ether ether ketone) (SPEEK) with a 60% degree of sulphonation (DS=0.6) and containing 23 and 50 wt.-% hydrated tin oxide (SnO{sub 2}.nH{sub 2}O) were prepared and characterised. The lower water uptake (WU) and the higher conductivity values recorded for the composite membranes with respect to the pure SPEEK reference suggested the involvement of SnO{sub 2}.nH{sub 2}O in the proton conduction mechanism. Pulsed-field-gradient spin-echo (PFGSE) NMR was employed to obtain a direct measurement of the water self-diffusion coefficient in the membranes. Differences were observed between the unfilled SPEEK and the composites, including departures from the normal correlation between water diffusivity and proton conductivity in the case of the composites. To better understand the SnO{sub 2}.nH{sub 2}O effect on the proton transport properties of the SPEEK-based membrane, we employed an analytical model that predicts the membrane conductivity as a function of its hydration level and porous structure. The comparison of the model results with the experimental proton conductivity values demonstrated that the tin oxide phase provides additional paths between the water clusters for proton transport, resulting in reduced tortuosity and enhanced proton conductivity. Moreover, the composite showed reduced methanol crossover with respect to the unfilled membrane. (Abstract Copyright [2009], Wiley Periodicals, Inc.)
Energy Technology Data Exchange (ETDEWEB)
Lamia, D., E-mail: debora.lamia@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Russo, G., E-mail: giorgio.russo@ibfm.cnr.it [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Casarino, C.; Gagliano, L.; Candiano, G.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR – LATO, Cefalù (Italy); Labate, L. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Baffigi, F.; Fulgentini, L.; Giulietti, A.; Koester, P.; Palla, D. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); Gizzi, L.A. [Intense Laser Irradiation Laboratory (ILIL) – National Institute of Optics INO CNR, Pisa (Italy); National Institute for Nuclear Physics INFN, Pisa Section and Frascati National Laboratories LNF (Italy); Gilardi, M.C. [Institute of Molecular Bioimaging and Physiology IBFM CNR, Segrate (Italy); University of Milano-Bicocca, Milano (Italy)
2015-06-21
We report on the development of a Monte Carlo application, based on the GEANT4 toolkit, for the characterization and optimization of electron beams for clinical applications produced by a laser-driven plasma source. The GEANT4 application is conceived so as to represent in the most general way the physical and geometrical features of a typical laser-driven accelerator. It is designed to provide standard dosimetric figures such as percentage dose depth curves, two-dimensional dose distributions and 3D dose profiles at different positions both inside and outside the interaction chamber. The application was validated by comparing its predictions to experimental measurements carried out on a real laser-driven accelerator. The work is aimed at optimizing the source, by using this novel application, for radiobiological studies and, in perspective, for medical applications. - Highlights: • Development of a Monte Carlo application based on GEANT4 toolkit. • Experimental measurements carried out with a laser-driven acceleration system. • Validation of Geant4 application comparing experimental data with the simulated ones. • Dosimetric characterization of the acceleration system.
Heavy Ion SEU Cross Section Calculation Based on Proton Experimental Data, and Vice Versa
Wrobel, F; Pouget, V; Dilillo, L; Ecoffet, R; Lorfèvre, E; Bezerra, F; Brugger, M; Saigné, F
2014-01-01
The aim of this work is to provide a method to calculate single event upset (SEU) cross sections from experimental data. Valuable tools such as PROFIT and SIMPA already focus on calculating the proton cross section from heavy-ion cross-section experiments. However, no available tool calculates heavy ion cross sections based on measured proton cross sections without knowledge of the technology. We based our approach on the diffusion-collection model, with the aim of analyzing the characteristics of the transient currents that trigger SEUs. We show that experimental cross sections can be used to characterize the pulses that trigger an SEU. Experimental results also allow an empirical rule to be defined that identifies the transient currents responsible for an SEU. The SEU cross section can then be calculated for any kind of particle and any energy with no need to know the Spice model of the cell. We applied our method to several technologies (250 nm, 90 nm and 65 nm bulk SRAMs) and we sho...
Developing and understanding a hospital-based proton facility: bringing physics into medicine.
Slater, James M
2007-08-01
From October 18 to 20, 2006, a symposium, Developing and Understanding a Hospital-based Proton Facility: Bringing Physics Into Medicine, was held at the Renaissance Esmeralda Resort and Spa, Indian Wells, California. The event was offered by the Department of Radiation Medicine at Loma Linda University (LLU), supported by the Telemedicine and Advanced Technology Research Center (TATRC) and the United States Army Medical Research and Materiel Command (USAMRMC). The meeting was intended to discuss factors involved in planning, developing, and operating a hospital-based proton treatment center. It brought together some of the most distinguished physicists, radiation biologists, and radiation oncologists in the world, and more than 100 individuals participated in the three-day educational offering. This overview reports on the event and introduces several papers written by many of the speakers from their presentations, for publication in this issue of Technology in Cancer Research and Treatment. Both the symposium and the papers are appropriate for this journal: exploitation of technology was one of the underlying themes of the symposium.
DEFF Research Database (Denmark)
Aili, David
... and electrode separator in both PEM fuel cells and water electrolyzers. The proton conductivity mechanism of Nafion® is strongly dependent on the presence of water within the membrane nanostructure, which limits the operating temperature to about 80 °C unless the system is pressurized in order to keep the membrane well hydrated. However, some of the main issues of the conventional PFSA based PEM fuel cells and water electrolyzers are directly or indirectly related to their relatively low operating temperature. An elevated operating temperature results in better electrode kinetics in general and improved ... Tests were thus conducted using membrane electrode assemblies (MEAs) based on pristine phosphoric acid doped Nafion® and PBI. The PBI based MEAs suffered from severe durability limitations due to membrane degradation, which was most likely connected to the acid catalyzed hydrolysis of the polymer ...
Energy Technology Data Exchange (ETDEWEB)
Wei, Mei-Lin, E-mail: weimeilinhd@163.com; Wang, Yu-Xia; Wang, Xin-Jun
2014-01-15
Two proton-conductive organic/inorganic complexes were constructed from Keggin-type heteropolyacids and 2-(3-pyridyl)benzimidazole molecules. Single-crystal X-ray diffraction analyses revealed that the two complexes crystallized in the monoclinic space group P2{sub 1}/c, exhibited different unit cell parameters, and presented different hydrogen-bonded networks constructed from 2-(3-pyridyl)benzimidazole molecules, [PMo{sub 12}O{sub 40}]{sup 3−} anions and solvent molecules. The results of thermogravimetric analyses suggest that the two supramolecular complexes have different thermal stability, based on their different hydrogen-bonded networks. At 100 °C under 35–98% relative humidity, both complexes showed a good proton conductivity of about 10{sup −3} S cm{sup −1}. The proton conductivities of both complexes under 98% relative humidity increase on a logarithmic scale with temperature over the range 25–100 °C. At 100 °C under acetonitrile or methanol vapor, however, both complexes showed poor proton conductivities of 10{sup −8}–10{sup −9} S cm{sup −1}. - Graphical abstract: Two molecular hybrids constructed from Keggin-type heteropolyacids and 2-(3-pyridyl)benzimidazole molecules showed good proton conductivities of 10{sup −3} S cm{sup −1} at 100 °C under 35–98% relative humidity. - Highlights: • 2-(3-Pyridyl)benzimidazole can form hydrogen bonds via its N–H groups. • Heteropolyacids have characteristics suitable for use as excellent proton conductors. • Two proton-conductive hybrids based on Keggin HPAs and 3-PyBim were constructed. • The structures were determined from single-crystal X-ray diffraction data. • They showed good proton conductivities of 10{sup −3} S cm{sup −1} at 100 °C under 35–98% RH.
Bragg peak prediction from quantitative proton computed tomography using different path estimates
Wang, Dongxu; Mackie, T. Rockwell; Tomé, Wolfgang A.
2011-01-01
This paper characterizes the performance of the straight-line path (SLP) and cubic spline path (CSP) as path estimates used in reconstruction of proton computed tomography (pCT). The GEANT4 Monte Carlo simulation toolkit is employed to simulate the imaging phantom and proton projections. SLP, CSP and the most-probable path (MPP) are constructed based on the entrance and exit information of each proton. The physical deviations of SLP, CSP and MPP from the real path are calculated. Using a cond...
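A cubic spline path of the kind compared in this paper can be built from the entry and exit information alone, for instance as a cubic Hermite interpolant between the measured positions and directions (a sketch; the paper's exact CSP parameterization, e.g. its tangent scaling, may differ):

```python
import numpy as np

def cubic_spline_path(p_in, d_in, p_out, d_out, n=5):
    """Cubic Hermite interpolant between a proton's entry and exit
    position/direction -- a CSP-style estimate.  The SLP would simply
    be the straight chord from p_in to p_out."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    h00 = 2*t**3 - 3*t**2 + 1          # Hermite basis functions
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return (h00 * np.asarray(p_in, dtype=float)
            + h10 * np.asarray(d_in, dtype=float)
            + h01 * np.asarray(p_out, dtype=float)
            + h11 * np.asarray(d_out, dtype=float))
```

The path passes exactly through the entry and exit points with the prescribed tangents, which is the property that lets CSP track the proton's multiple-Coulomb-scattered trajectory more closely than the straight-line path.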
An assessment of radiation damage in space-based germanium detectors due to solar proton events
International Nuclear Information System (INIS)
Radiation effects caused by solar proton events will be a common problem for many types of sensors on missions to the inner solar system, because of the long cruise phases coupled with the inverse-square scaling of solar particle events. As part of a study in support of the BepiColombo mission to Mercury we have undertaken a comprehensive series of tests to assess these effects on a wide range of sensors. In this paper, we report on measurements on a large-volume coaxial Ge detector which was exposed to simulated solar proton spectra of integrated fluences 8x10^8, 6x10^9 and 6x10^10 protons cm^-2. After each irradiation the detector's performance was assessed in terms of energy resolution, efficiency and activation. The detector was then annealed and the measurements repeated before the next irradiation. The minimum operational performance criteria were based on the resolution and efficiency requirements necessary to detect and separate specific radioisotope emission lines from a planetary regolith: specifically, that the energy resolution be restored to 5 keV FWHM at 1332 keV and that the detection efficiency be degraded by no more than 10% of its pre-irradiation value. The key conclusion of this study is that even after a modest solar proton event the detector requires extensive annealing. After exposure to an event of integral fluence ~8x10^8 protons cm^-2 this amounts to ~1 week at 100 °C, whereas for a fluence of ~6x10^10 protons cm^-2 the detector requires 3.5 months of annealing to satisfy the minimum operational performance requirements and 4.5 months to return the energy resolution to <3 keV FWHM at 1332 keV. As a consequence, such an instrument will require constant, planned and active management throughout its operational lifetime. The impact on spacecraft operations, including resource management, therefore needs careful consideration.
Schiff base protonation changes in Siberian hamster ultraviolet cone pigment photointermediates.
Mooney, Victoria L; Szundi, Istvan; Lewis, James W; Yan, Elsa C Y; Kliger, David S
2012-03-27
Molecular structure and function studies of vertebrate ultraviolet (UV) cone visual pigments are needed to understand the molecular evolution of these photoreceptors, which uniquely contain unprotonated Schiff base linkages between the 11-cis-retinal chromophore and the opsin proteins. In this study, the Siberian hamster ultraviolet cone pigment (SHUV) was expressed and purified in an n-dodecyl-β-D-maltoside suspension for optical characterization. Time-resolved absorbance measurements, over a spectral range from 300 to 700 nm, were taken for the purified pigment at time delays from 30 ns to 4.64 s after photoexcitation using 7 ns pulses of 355 nm light. The resulting data were fit globally to a sum of exponential functions after noise reduction using singular-value decomposition. Four exponentials best fit the data with lifetimes of 1.4 μs, 210 μs, 47 ms, and 1 s. The first photointermediate species characterized here is an equilibrated mixture similar to the one formed after rhodopsin's Batho intermediate decays into equilibrium with its successor, BSI. The extremely large red shift of the SHUV Batho component relative to the pigment suggests that SHUV Batho has a protonated Schiff base and that the SHUV cone pigment itself has an unprotonated Schiff base. In contrast to SHUV Batho, the portion of the equilibrated mixture's spectrum corresponding to SHUV BSI is well fit by a model spectrum with an unprotonated Schiff base. The spectra of the next two photointermediate species revealed that they both have unprotonated Schiff bases and suggest they are analogous to rhodopsin's Lumi I and Lumi II species. After decay of SHUV Lumi II, the correspondence with rhodopsin photointermediates breaks down and the next photointermediate, presumably including the G protein-activating species, is a mixture of protonated and unprotonated Schiff base photointermediate species.
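The data-reduction chain described here, SVD-based noise reduction followed by a global multi-exponential fit, can be sketched on synthetic data. The sketch below shows the SVD truncation step on invented two-component decay data (lifetimes, spectra, and noise level are ours); the retained temporal vectors are what a global fit would then model as sums of exponentials:

```python
import numpy as np

# Synthetic (delays x wavelengths) dataset: two decay components with
# invented lifetimes of 0.5 and 2.0 time units, plus measurement noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 200)          # 200 time delays
spec1 = rng.random(30)                  # invented component spectra
spec2 = rng.random(30)                  # (30 wavelength channels)
spectra = (np.outer(np.exp(-t / 0.5), spec1)
           + np.outer(np.exp(-t / 2.0), spec2))
noisy = spectra + 0.01 * rng.standard_normal(spectra.shape)

# Noise reduction: keep only the leading singular components.  The
# temporal vectors U[:, :rank] carry the kinetics that a global fit
# would describe with a sum of exponentials.
U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
rank = 2                                # two significant components
denoised = (U[:, :rank] * s[:rank]) @ Vt[:rank]
```

Because the signal itself is low-rank, the truncation discards most of the noise while preserving the decay kinetics, which is why the global exponential fit is performed after this step.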
Monte Carlo simulation of primary reactions on HPLUS based on pluto event generator
International Nuclear Information System (INIS)
Hadron Physics Lanzhou Spectrometer (HPLUS) is designed to study hadron production and decay from nucleon-nucleon interactions in the GeV region. The current configuration of HPLUS and the particle identification methods for three polar angle regions are discussed. The Pluto event generator is applied to simulate the primary reactions on HPLUS, addressing four issues: the agreement in pp elastic scattering angular distribution between Pluto samples and experimental data; the acceptance of charged K mesons in the strangeness production channels for the forward region of HPLUS; the dependence of the maximum photon energy and the minimum vertex angle of two photons on the polar angle; and the influence of different reconstruction methods on the mass spectrum of excited nucleon states with large resonant width. The results show that the Pluto event generator satisfies the requirements of Monte Carlo simulation for HPLUS.
Web-Based Parallel Monte Carlo Simulation Platform for Financial Computation
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
Using Java, Java-enabled Web, and object-oriented programming technologies, a framework is designed to quickly organize a multicomputer system on an intranet for parallel Monte Carlo simulation. The high-performance computing environment is embedded in a Web server so it can be accessed easily. Adaptive parallelism and an eager scheduling algorithm are used to realize load balancing, parallel processing, and system fault tolerance. Independent-sequence pseudo-random number generator schemes keep the parallel simulation valid. With three kinds of stock option pricing models as test cases, near-ideal speedup and accurate pricing results were obtained on the test bed. As a Web service, a high-performance financial derivative security-pricing platform has now been set up for training and study. The framework can also be used to develop other SPMD (single program, multiple data) applications. Robustness remains a major problem for further research.
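The two key ingredients of this design, parallel workers and independent pseudo-random sequences, can be sketched in a modern setting (a Python stand-in for the Java framework described; the Black-Scholes option parameters are invented for illustration):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def price_european_call(seed_seq, n_paths, s0=100.0, k=100.0,
                        r=0.05, sigma=0.2, t=1.0):
    """One worker's share of a Monte Carlo Black-Scholes price,
    driven by its own independent random stream."""
    rng = np.random.default_rng(seed_seq)
    z = rng.standard_normal(n_paths)
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    return np.exp(-r * t) * np.maximum(st - k, 0.0).mean()

# SeedSequence.spawn gives statistically independent streams per worker,
# mirroring the independent-sequence PRNG scheme the abstract mentions.
streams = np.random.SeedSequence(42).spawn(4)
with ThreadPoolExecutor(max_workers=4) as pool:
    estimates = list(pool.map(lambda s: price_european_call(s, 100_000),
                              streams))
price = sum(estimates) / len(estimates)
```

Averaging the workers' partial estimates is valid precisely because each stream is independent; reusing one seed across workers would correlate the paths and bias the variance estimate.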
GPU-based Monte Carlo dust radiative transfer scheme applied to AGN
Heymann, Frank
2012-01-01
A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computers or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons (PAH). Anisotropic scattering is treated by applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon counting procedure and at high spatial resolution by a vectorized ray-tracer. The latter allows computation of high signal-to-noise images of the objects at any frequency and arbitrary viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust...
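Anisotropic scattering with the Henyey-Greenstein phase function is typically sampled by inverting its cumulative distribution, a standard technique in MC radiative transfer (a minimal sketch; the asymmetry parameter value below is illustrative, not from the paper):

```python
import numpy as np

def sample_hg_cos_theta(g, rng, n):
    """Draw n scattering-angle cosines from the Henyey-Greenstein phase
    function by CDF inversion; g is the asymmetry parameter <cos(theta)>.
    The distribution reduces to isotropic scattering when g == 0."""
    u = rng.random(n)
    if abs(g) < 1e-8:
        return 1.0 - 2.0 * u
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - frac * frac) / (2.0 * g)
```

A quick check of the inversion is that the sample mean of the cosines converges to g, the defining property of the asymmetry parameter.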
PC-based process distribution to solve iterative Monte Carlo simulations in physical dosimetry
International Nuclear Information System (INIS)
A distribution model to simulate physical dosimetry measurements with Monte Carlo (MC) techniques has been developed. This approach is suited to simulations with continuous changes of measurement conditions (and hence of the input parameters), such as a TPR curve or the estimation of the resolution limit of an optimal densitometer in the case of small field profiles. As a comparison, a high-resolution scan for narrow beams with no iterative process is presented. The model has been installed on networked PCs without any resident software. The only requirements for these PCs are a small, temporary Linux partition on the hard disk and a network connection to our server PC.
Evaluation of Monte Carlo-based calibrations of HPGe detectors for in situ gamma-ray spectrometry.
Boson, Jonas; Plamboeck, Agneta H; Ramebäck, Henrik; Agren, Göran; Johansson, Lennart
2009-11-01
The aim of this work was to evaluate the use of Monte Carlo-based calibrations for in situ gamma-ray spectrometry. We have performed in situ measurements at five different sites in Sweden using HPGe detectors to determine ground deposition activity levels of (137)Cs from the 1986 Chernobyl accident. Monte Carlo-calculated efficiency calibration factors were compared with corresponding values calculated using a more traditional semi-empirical method. In addition, results for the activity ground deposition were also compared with activity densities found in soil samples. In order to facilitate meaningful comparisons between the different types of results, the combined standard uncertainty of in situ measurements was assessed for both calibration methods. Good agreement, both between the two calibration methods, and between in situ measurements and soil samples, was found at all five sites. Uncertainties in in situ measurements for the given measurement conditions, about 20 years after the fallout occurred, were found to be in the range 15-20% (with a coverage factor k=1, i.e. with a confidence interval of about 68%). PMID:19604609
Proton recoil telescope based on diamond detectors for measurement of fusion neutrons
Caiffi, B; Ripani, M; Pillon, M; Taiuti, M
2015-01-01
Diamonds are very promising candidates for neutron diagnostics in harsh environments such as fusion reactors. In the first place this is because of their radiation hardness, which exceeds that of silicon by an order of magnitude. Also, in comparison with standard on-line neutron diagnostics (fission chambers, silicon-based detectors, scintillators), diamonds are less sensitive to $\gamma$ rays, which represent a huge background in fusion devices. Finally, their low leakage current at high temperature suppresses the detector's intrinsic noise. In this talk a CVD diamond-based detector is proposed for the measurement of the 14 MeV neutrons from the D-T fusion reaction. The detector was arranged in a proton recoil telescope configuration, featuring a plastic converter in front of the sensitive volume in order to induce the (n,p) reaction. The segmentation of the sensitive volume, achieved by using two crystals, allowed coincidence measurements, which suppressed the neutron elastic scattering backg...
Expression and functioning of retinal-based proton pumps in a saltern crystallizer brine.
Oren, Aharon; Abu-Ghosh, Said; Argov, Tal; Kara-Ivanov, Eliahu; Shitrit, Dror; Volpert, Adi; Horwitz, Rael
2016-01-01
We examined the presence of bacteriorhodopsin and other retinal protein pigments in the microbial community of the saltern crystallizer ponds in Eilat, Israel, and assessed the effect of the retinal-based proton pumps on the metabolic activity. The biota of the hypersaline (~309 g salts l(-1)) brine consisted of ~2200 β-carotene-rich Dunaliella cells and ~3.5 × 10(7) prokaryotes ml(-1), most of which were flat, square or rectangular Haloquadratum-like archaea. No indications were obtained for a massive presence of Salinibacter. We estimated a concentration of bacteriorhodopsin and bacteriorhodopsin-like pigments of 3.6 nmol l(-1). When illuminated, the community respiration activity of the brine samples in which oxygenic photosynthesis was inhibited by 3-(3,4-dichlorophenyl)-1,1-dimethylurea decreased by 40-43 %. This effect was interpreted as the result of competition between two energy-yielding systems: the bacteriorhodopsin proton pump and the respiratory chain. The results presented have important implications for the interpretation of many published data on photosynthetic and respiratory activities in hypersaline environments. PMID:26507954
A Monte Carlo model of auroral hydrogen emission line profiles
Directory of Open Access Journals (Sweden)
J.-C. Gérard
2005-06-01
Full Text Available Hydrogen line profiles measured from space-borne or ground-based instruments provide useful information to study the physical processes occurring in the proton aurora and to estimate the proton flux characteristics. The line shape of the hydrogen lines is determined by the velocity distribution of H atoms along the line-of-sight of the instrument. Calculations of line profiles of auroral hydrogen emissions were obtained using a Monte Carlo kinetic model of proton precipitation into the auroral atmosphere. In this model both processes of energy degradation and scattering angle redistribution in momentum and charge transfer collisions of the high-energy proton/hydrogen flux with the ambient atmospheric gas are considered at the microphysical level. The model is based on measured cross sections and scattering angle distributions and on a stochastic interpretation of such collisions. Calculations show that collisional angular redistribution of the precipitating proton/hydrogen beam is the dominant process leading to the formation of extended wings and peak shifts in the hydrogen line profiles. All simulations produce a peak shift from the rest line wavelength decreasing with increasing proton energy. These model predictions are confirmed by analysis of ground-based H-β line observations from Poker Flat, showing an anti-correlation between the magnitude of the peak shift and the extent of the blue wing of the line. Our results also strongly suggest that the relative extension of the blue and red wings provides a much better indicator of the auroral proton characteristic energy than the position of the peak wavelength.
Shi, Ming; Saint-Martin, Jérôme; Bournel, Arnaud; Maher, Hassan; Renvoise, Michel; Dollfus, Philippe
2010-11-01
High-mobility III-V heterostructures are emerging and very promising materials likely to fulfil high-speed and low-power specifications for ambient intelligence applications. The main objective of this work is to theoretically explore the potential of MOSFETs based on III-V materials with low bandgap and high electron mobility. First, the charge control is studied in III-V MOS structures using a Schrödinger-Poisson solver. Electronic transport in III-V devices is then analyzed using a particle Monte Carlo device simulator. The external access resistances used in the calculations are carefully calibrated on experimental results. The performance of different structures of nanoscale MOS transistors based on III-V materials is evaluated, and the quasi-ballistic character of electron transport is compared with that in Si transistors of the same gate length. PMID:21137856
Proton and carbon ion radiotherapy for primary brain tumors and tumors of the skull base
Energy Technology Data Exchange (ETDEWEB)
Combs, Stephanie E.; Kessel, Kerstin; Habermehl, Daniel; Debus, Jurgen [Univ. Hospital of Heidelberg, Dept. of Radiation Oncology, Heidelberg (Germany)], e-mail: Stephanie.Combs@med.uni-heidelberg.de; Haberer, Thomas [Heidelberger Ionenstrahl Therapiezentrum (HIT), Heidelberg (Germany); Jaekel, Oliver [Univ. Hospital of Heidelberg, Dept. of Radiation Oncology, Heidelberg (Germany); Heidelberger Ionenstrahl Therapiezentrum (HIT), Heidelberg (Germany)
2013-10-15
To analyze clinical concepts, toxicity and treatment outcome in patients with brain and skull base tumors treated with photons and particle therapy. Material and methods: In total, 260 patients with brain tumors and tumors of the skull base were treated at the Heidelberg Ion Therapy Center (HIT). Patients enrolled in and randomized within prospective clinical trials, as well as bony or soft tissue tumors, are not included in this analysis. Treatment was delivered as protons, carbon ions, or combinations of photons and a carbon ion boost. All patients are included in a tight follow-up program. The median follow-up time is 12 months (range 2-39 months). Results: Main histologies included meningioma (n = 107) for skull base lesions, pituitary adenomas (n = 14), low-grade gliomas (n = 51) as well as high-grade gliomas (n = 55) for brain tumors. In all patients treatment could be completed without any unexpected severe toxicities. No side effects > CTC Grade III were observed. To date, no severe late toxicities have been observed; however, for endpoints such as secondary malignancies or neurocognitive side effects, the follow-up time remains too short. Local recurrences were mainly seen in the group of high-grade gliomas or atypical meningiomas; for benign skull base meningiomas, to date, no recurrences were observed during follow-up. Conclusion: The specific benefits of particle therapy may include a reduced risk of secondary malignancies as well as improved neurocognitive outcome and quality of life (QOL); thus, longer follow-up will be necessary to confirm these endpoints. Indication-specific trials on meningiomas and gliomas are underway to elucidate the role of protons and carbon ions in these indications.
Energy Technology Data Exchange (ETDEWEB)
Sanchez-Parcerisa, D; Carabe-Fernandez, A [Department of Radiation Oncology, Hospital of the University of Pennsylvania, Philadelphia, PA (United States)
2014-06-01
Purpose. Intensity-modulated proton therapy is usually implemented with multi-field optimization of pencil-beam scanning (PBS) proton fields. However, in view of the experience with photon-IMRT, proton facilities equipped with double-scattering (DS) delivery and multi-leaf collimation (MLC) could produce highly conformal dose distributions (and possibly eliminate the need for patient-specific compensators) with a clever use of their MLC field shaping, provided that an optimal inverse TPS is developed. Methods. A prototype TPS was developed in MATLAB. The dose calculation process was based on a fluence-dose algorithm on an adaptive divergent grid. A database of dose kernels was precalculated in order to allow for fast variations of the field range and modulation during optimization. The inverse planning process was based on the adaptive simulated annealing approach, with direct aperture optimization of the MLC leaves. A dosimetry study was performed on a phantom formed by three concentric semicylinders separated by 5 mm, of which the innermost and outermost were regarded as organs at risk (OARs), and the middle one as the PTV. We chose a concave target (which is not treatable with conventional DS fields) to show the potential of our technique. The optimizer was configured to minimize the mean dose to the OARs while maintaining good coverage of the target. Results. The plan produced by the prototype TPS achieved a conformity index of 1.34, with the mean doses to the OARs below 78% of the prescribed dose. This result is hardly achievable with the traditional conformal DS technique with compensators, and it compares to what can be obtained with PBS. Conclusion. It is certainly feasible to produce IMPT fields with MLC passive scattering fields. With a fully developed treatment planning system, the produced plans can be superior to traditional DS plans in terms of plan conformity and dose to organs at risk.
Mart, T
2013-01-01
We have calculated the proton charge radius by assuming that the real proton radius is not unique and that the radii are randomly distributed in a certain range. This is performed by averaging the elastic electron-proton differential cross section over the form factor cut-off. By using a dipole form factor and fitting the middle value of the cut-off to the low $Q^2$ Mainz data, we found the lowest $\chi^2/N$ for a cut-off $\Lambda=0.8203\pm 0.0003$ GeV, which corresponds to a proton charge radius $r_E=0.8333\pm 0.0004$ fm. The result is compatible with the recent precision measurement of the Lamb shift in muonic hydrogen as well as with recent calculations using more sophisticated techniques. Our result indicates that the relative variation of the form factor cut-off should be around 21.5%. Based on this result we have investigated effects of the nucleon radius variation on the symmetric nuclear matter (SNM) and the neutron star matter (NSM) by considering the excluded volume effect in our calculation. The mass-radius ...
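For a dipole form factor $G_E(Q^2) = (1 + Q^2/\Lambda^2)^{-2}$, the charge radius follows directly from the cut-off as $r_E = \sqrt{12}\,\hbar c/\Lambda$. A minimal sketch of this relation (the conversion constant is standard; the cut-off value is the one quoted in the abstract):

```python
import math

HBARC = 0.19733  # GeV fm, hbar*c conversion constant


def dipole_radius(cutoff_gev):
    """Charge radius (fm) implied by a dipole form factor
    G_E(Q^2) = (1 + Q^2/Lambda^2)^-2, via r_E = sqrt(12)/Lambda."""
    return math.sqrt(12.0) * HBARC / cutoff_gev


# Cut-off fitted in the abstract: Lambda = 0.8203 GeV
r_e = dipole_radius(0.8203)
print(f"r_E = {r_e:.4f} fm")  # ~0.8333 fm, matching the quoted radius
```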
International Nuclear Information System (INIS)
Proton therapy treatments are currently planned and delivered under the assumption that the proton relative biological effectiveness (RBE) relative to photons is 1.1. This assumption ignores strong experimental evidence that the RBE varies along the treatment field, i.e. with linear energy transfer (LET) and with tissue type. A recent review study collected over 70 experimental reports on proton RBE, providing a comprehensive dataset for predicting RBE for cell survival. Using this dataset we developed a model to predict proton RBE based on dose, dose-averaged LET (LETd) and the ratio of the linear-quadratic model parameters for the reference radiation, (α/β)x, as the tissue-specific parameter. The proposed RBE model is based on the linear-quadratic model and was derived from a nonlinear regression fit to 287 experimental data points. The proposed model predicts that the RBE increases with increasing LETd and decreases with increasing (α/β)x. This agrees with previous theoretical predictions on the relationship between RBE, LETd and (α/β)x. The model additionally predicts a decrease in RBE with increasing dose and shows a relationship of both α and β with LETd. Our proposed phenomenological RBE model is derived using the most comprehensive collection of proton RBE experimental data to date. Previously published phenomenological models, based on a more limited data set, may have to be revised. (paper)
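The structure of such a phenomenological model can be sketched in the linear-quadratic framework: RBE_max and RBE_min are taken as linear functions of LETd and (α/β)x, and the RBE at a given dose follows from equating cell survival. The coefficients p0-p3 below are invented for illustration, not the fitted values of the paper:

```python
import math


def rbe(dose, let_d, ab_x, p0=0.99, p1=0.35, p2=1.1, p3=-0.0039):
    """Phenomenological proton RBE in the linear-quadratic framework.

    RBE_max = alpha_p/alpha_x and RBE_min = sqrt(beta_p/beta_x) are modeled
    as linear in LET_d; coefficients are illustrative placeholders.
    dose in Gy, let_d in keV/um, ab_x = (alpha/beta)_x in Gy.
    """
    rbe_max = p0 + p1 * let_d / ab_x              # low-dose limit
    rbe_min = p2 + p3 * math.sqrt(ab_x) * let_d   # high-dose limit
    # Solve the LQ survival equality for RBE at this photon dose level.
    return (math.sqrt(ab_x**2 + 4.0 * dose * ab_x * rbe_max
                      + 4.0 * (rbe_min * dose)**2) - ab_x) / (2.0 * dose)
```

With these placeholder coefficients the model reproduces the qualitative trends stated in the abstract: RBE rises with LETd and falls with increasing (α/β)x and with increasing dose.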
Neutrons in proton pencil beam scanning: parameterization of energy, quality factors and RBE
Schneider, Uwe; Hälg, Roger A.; Baiocco, Giorgio; Lomax, Tony
2016-08-01
The biological effectiveness of neutrons produced during proton therapy in inducing cancer is unknown, but potentially large. In particular, since neutron biological effectiveness is energy dependent, it is necessary to estimate, besides the dose, also the energy spectra, in order to obtain quantities which could be a measure of the biological effectiveness and to test current models and new approaches against epidemiological studies on cancer induction after proton therapy. For patients treated with proton pencil beam scanning, this work aims to predict the spatially localized neutron energies, the effective quality factor, the weighting factor according to ICRP, and two RBE values, the first obtained from the saturation-corrected dose mean lineal energy and the second from DSB cluster induction. A proton pencil beam was simulated with the GEANT Monte Carlo code. Based on the simulated neutron spectra for three different proton beam energies, a parameterization of energy, quality factors and RBE was calculated. The pencil beam algorithm used for treatment planning at PSI has been extended using the developed parameterizations in order to calculate the spatially localized neutron energy, quality factors and RBE for each treated patient. The parameterization reproduces the neutron energy in two energy bins, as well as the quality factors and RBE, with satisfactory precision up to 85 cm away from the proton pencil beam when compared to the results of 3D Monte Carlo simulations. The root mean square error of the energy estimate between the Monte Carlo simulation based results and the parameterization is 3.9%. For the quality factor and RBE estimates it is smaller than 0.9%. The model was successfully integrated into the PSI treatment planning system. It was found that the parameterizations for neutron energy, quality factors and RBE were independent of proton energy in the investigated energy range of interest for proton therapy. The pencil beam algorithm has
Monte Carlo based water/medium stopping-power ratios for various ICRP and ICRU tissues
International Nuclear Information System (INIS)
Water/medium stopping-power ratios, sw,m, have been calculated for several ICRP and ICRU tissues, namely adipose tissue, brain, cortical bone, liver, lung (deflated and inflated) and spongiosa. The considered clinical beams were 6 and 18 MV x-rays and the field size was 10 x 10 cm2. Fluence distributions were scored at a depth of 10 cm using the Monte Carlo code PENELOPE. The collision stopping powers for the studied tissues were evaluated employing the formalism of ICRU Report 37 (1984 Stopping Powers for Electrons and Positrons (Bethesda, MD: ICRU)). The Bragg-Gray values of sw,m calculated with these ingredients range from about 0.98 (adipose tissue) to nearly 1.14 (cortical bone), displaying a rather small variation with beam quality. Excellent agreement, to within 0.1%, is found with stopping-power ratios reported by Siebers et al (2000a Phys. Med. Biol. 45 983-95) for cortical bone, inflated lung and spongiosa. In the case of cortical bone, sw,m changes approximately 2% when either ICRP or ICRU compositions are adopted, whereas the stopping-power ratios of lung, brain and adipose tissue are less sensitive to the selected composition. The mass density of lung also influences the calculated values of sw,m, reducing them by around 1% (6 MV) and 2% (18 MV) when going from deflated to inflated lung
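The Bragg-Gray stopping-power ratio described above is the fluence-weighted mean mass collision stopping power of water divided by that of the medium. A minimal sketch of that computation follows; the fluence and stopping-power arrays are invented toy data, not ICRU 37 values:

```python
import numpy as np


def _integrate(y, x):
    """Trapezoidal rule (avoids depending on np.trapz, removed in NumPy 2)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))


def bragg_gray_ratio(energy, fluence, sp_water, sp_medium):
    """Bragg-Gray water/medium stopping-power ratio s_w,m: ratio of the
    fluence-weighted mean mass collision stopping powers, all tabulated
    on the same energy grid."""
    return (_integrate(fluence * sp_water, energy)
            / _integrate(fluence * sp_medium, energy))


# Toy illustration: a flat electron fluence and a medium whose stopping
# power sits 2% below water at every energy, so the ratio must come out
# at exactly 1/0.98.
e = np.linspace(0.1, 10.0, 200)     # MeV
phi = np.ones_like(e)
sp_w = 2.0 / e + 1.8                # arbitrary shape, MeV cm^2 g^-1
sp_m = 0.98 * sp_w
print(round(bragg_gray_ratio(e, phi, sp_w, sp_m), 4))  # → 1.0204
```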
Institute of Scientific and Technical Information of China (English)
ZHANG Jun; GUO Fan
2015-01-01
Tooth modification techniques are widely used in the gear industry to improve the meshing performance of gear transmissions. However, few existing studies on tooth modification consider the influence of inevitable random errors on the modification effects. In order to investigate how uncertainties in tooth modification amounts affect the dynamic behavior of a helical planetary gear system, an analytical dynamic model including tooth modification parameters is proposed to carry out a deterministic analysis of the dynamics of a helical planetary gear. The dynamic meshing forces as well as the dynamic transmission errors of the sun-planet 1 gear pair with and without tooth modifications are computed and compared to show the effectiveness of tooth modifications in enhancing gear dynamics. By using the response surface method, a fitted regression model for the dynamic transmission error (DTE) fluctuations is established to quantify the relationship between modification amounts and DTE fluctuations. By shifting the inevitable random errors arising from the manufacturing and installation process to variations in tooth modification amounts, a statistical tooth modification model is developed, and a methodology combining Monte Carlo simulation and the response surface method is presented for uncertainty analysis of tooth modifications. The uncertainty analysis reveals that the system's dynamic behaviors do not obey the normal distribution rule even though the design variables are normally distributed. In addition, a deterministic modification amount will not necessarily achieve an optimal result for both static and dynamic transmission error fluctuation reduction simultaneously.
Markov chain Monte Carlo based analysis of post-translationally modified VDAC gating kinetics.
Tewari, Shivendra G; Zhou, Yifan; Otto, Bradley J; Dash, Ranjan K; Kwok, Wai-Meng; Beard, Daniel A
2014-01-01
The voltage-dependent anion channel (VDAC) is the main conduit for permeation of solutes (including nucleotides and metabolites) of up to 5 kDa across the mitochondrial outer membrane (MOM). Recent studies suggest that VDAC activity is regulated via post-translational modifications (PTMs). Yet the nature and effect of these modifications is not understood. Herein, single channel currents of wild-type, nitrosated, and phosphorylated VDAC are analyzed using a generalized continuous-time Markov chain Monte Carlo (MCMC) method. This developed method describes three distinct conducting states (open, half-open, and closed) of VDAC activity. Lipid bilayer experiments are also performed to record single VDAC activity under un-phosphorylated and phosphorylated conditions, and are analyzed using the developed stochastic search method. Experimental data show significant alteration in VDAC gating kinetics and conductance as a result of PTMs. The effect of PTMs on VDAC kinetics is captured in the parameters associated with the identified Markov model. Stationary distributions of the Markov model suggest that nitrosation of VDAC not only decreased its conductance but also significantly locked VDAC in a closed state. On the other hand, stationary distributions of the model associated with un-phosphorylated and phosphorylated VDAC suggest a reversal in channel conformation from relatively closed state to an open state. Model analyses of the nitrosated data suggest that faster reaction of nitric oxide with Cys-127 thiol group might be responsible for the biphasic effect of nitric oxide on basal VDAC conductance. PMID:25628567
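The stationary-distribution analysis of a three-state (open/half-open/closed) gating scheme reduces to solving pi Q = 0 with the probabilities summing to one. A minimal sketch with an invented generator matrix (the rates are illustrative, not the fitted VDAC parameters):

```python
import numpy as np


def stationary_distribution(Q):
    """Stationary distribution pi of a continuous-time Markov chain with
    generator matrix Q (rows sum to zero): solve pi Q = 0, sum(pi) = 1."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])          # append normalization row
    b = np.concatenate([np.zeros(n), [1.0]])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi


# Hypothetical open/half-open/closed gating scheme, rates per ms.
Q = np.array([[-2.0,  1.5,  0.5],
              [ 1.0, -1.8,  0.8],
              [ 0.2,  0.6, -0.8]])
pi = stationary_distribution(Q)
print(np.round(pi, 3))  # → [0.211 0.329 0.461]
```

A PTM that, say, increases the rates into the closed state would shift this stationary vector toward the closed state, which is the kind of conclusion the abstract draws from the fitted models.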
Monte Carlo based unit commitment procedures for the deregulated market environment
International Nuclear Information System (INIS)
The unit commitment problem, originally conceived in the framework of short-term operation of vertically integrated utilities, needs a thorough re-examination in the light of the ongoing transition towards the open electricity market environment. In this work the problem is re-formulated to adapt unit commitment to the viewpoint of a generation company (GENCO) which is no longer bound to satisfy its load, but instead aims to maximize its profits. Moreover, with reference to the present-day situation in many countries, the presence of a GENCO (the former monopolist) which is in a position to exert market power requires a careful analysis considering the different perspectives of a price-taker and of a price-maker GENCO. Unit commitment is thus shown to lead to two distinct, yet slightly different, problems. The unavoidable uncertainties in load profile and price behaviour over the time period of interest are also taken into account by means of a Monte Carlo simulation. Both the forecasted loads and prices are handled as random variables with a normal multivariate distribution. The correlation between the random input variables corresponding to successive hours of the day was considered by carrying out a statistical analysis of actual load and price data. The whole procedure was tested making use of reasonable approximations of the actual data of the thermal generation units available to some actual GENCOs operating in Italy. (author)
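The Monte Carlo step described above can be sketched by sampling hourly prices from a multivariate normal with hour-to-hour correlation and evaluating the profit of a commitment decision on each sample. All numbers below (price statistics, correlation, unit parameters) are invented for illustration, not the Italian market data used in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hourly price forecasts as a multivariate normal with an AR(1)-style
# correlation between successive hours (values are illustrative).
hours = 4
price_mean = np.array([30.0, 35.0, 55.0, 40.0])   # $/MWh
rho = 0.7
idx = np.arange(hours)
cov = 25.0 * rho ** np.abs(np.subtract.outer(idx, idx))

marginal_cost, capacity, startup = 38.0, 100.0, 500.0  # hypothetical unit

# Monte Carlo: profit of committing the unit for the whole horizon,
# dispatching at capacity whenever the price exceeds marginal cost.
prices = rng.multivariate_normal(price_mean, cov, size=20_000)
dispatch = np.where(prices > marginal_cost, capacity, 0.0)
profit = ((prices - marginal_cost) * dispatch).sum(axis=1) - startup

print(f"expected profit = {profit.mean():.0f}, "
      f"P(loss) = {(profit < 0).mean():.2f}")
```

Comparing the profit distributions with and without commitment (and under price-taker vs price-maker price models) is then a matter of repeating the sampling under each scenario.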
Energy Technology Data Exchange (ETDEWEB)
Abdel-Khalik, Hany S. [North Carolina State Univ., Raleigh, NC (United States); Zhang, Qiong [North Carolina State Univ., Raleigh, NC (United States)
2014-05-20
The development of hybrid Monte-Carlo-Deterministic (MC-DT) approaches, taking place over the past few decades, has primarily focused on shielding and detection applications where the analysis requires a small number of responses, i.e. at the detector location(s). This work further develops a recently introduced global variance reduction approach, denoted the SUBSPACE approach, which is designed to allow the use of MC simulation, currently limited to benchmarking calculations, for routine engineering calculations. By way of demonstration, the SUBSPACE approach is applied to assembly-level calculations used to generate the few-group homogenized cross-sections. These models are typically expensive and need to be executed on the order of 10^{3} - 10^{5} times to properly characterize the few-group cross-sections for downstream core-wide calculations. Applicability to k-eigenvalue core-wide models is also demonstrated in this work. Given the favorable results obtained in this work, we believe the applicability of the MC method for reactor analysis calculations could be realized in the near future.
Calculation of Credit Valuation Adjustment Based on Least Square Monte Carlo Methods
Directory of Open Access Journals (Sweden)
Qian Liu
2015-01-01
Full Text Available Counterparty credit risk has become one of the highest-profile risks facing participants in the financial markets. Despite this, relatively little is known about how counterparty credit risk is actually priced mathematically. We examine this issue using interest rate swaps. This widely traded financial product allows us to identify well the risk profiles of both institutions and their counterparties. Concretely, the Hull-White model for the rate and a mean-reverting model for the default intensity have proven to correspond with reality and to be well suited for financial institutions. Besides, we find that the least square Monte Carlo method is quite efficient in the calculation of the credit valuation adjustment (CVA for short), as it avoids the redundant step of generating inner scenarios. As a result, it accelerates the convergence speed of the CVA estimators. In the second part, we propose a new method to calculate bilateral CVA that avoids the double counting found in the existing literature, where several copula functions are adopted to describe the dependence of the two first-to-default times.
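The key trick of least square Monte Carlo in this setting is to regress the simulated future value on the current state variable, so that the expected exposure at each date comes from one set of outer paths instead of nested inner simulations. A stylized sketch with a one-factor mean-reverting short rate and a toy terminal payoff (all parameters invented, not a calibrated Hull-White model):

```python
import numpy as np

rng = np.random.default_rng(7)

# One-factor mean-reverting (Vasicek-type) short rate, Euler scheme.
n_paths, n_steps, dt = 50_000, 50, 0.1
kappa, theta, sigma, r0 = 0.5, 0.03, 0.01, 0.02
paths = np.empty((n_steps + 1, n_paths))
paths[0] = r0
for i in range(n_steps):
    paths[i + 1] = (paths[i] + kappa * (theta - paths[i]) * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))

# Stylized terminal payoff of a rate-sensitive position: notional * (K - r_T).
K, notional = 0.03, 100.0
payoff = notional * (K - paths[-1])

# Least squares Monte Carlo: regress the terminal payoff on a polynomial
# of the short rate at the exposure date to approximate the conditional
# expected value, avoiding a nested "inner scenario" simulation.
t_idx = 25                                  # exposure date, t = 2.5y
basis = np.vander(paths[t_idx], 3)          # columns [r^2, r, 1]
coef, *_ = np.linalg.lstsq(basis, payoff, rcond=None)
exposure = np.maximum(basis @ coef, 0.0)    # positive part = exposure

# CVA contribution of this single date (loss given default x EE x PD).
recovery, pd_t = 0.4, 0.02
cva_t = (1.0 - recovery) * exposure.mean() * pd_t
print(f"EE(2.5y) = {exposure.mean():.3f}, CVA contribution = {cva_t:.5f}")
```

A full CVA sums such contributions over all exposure dates with discounting; discounting and the default-time copula of the paper's bilateral extension are omitted here.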
Markov chain Monte Carlo based analysis of post-translationally modified VDAC1 gating kinetics
Directory of Open Access Journals (Sweden)
Shivendra eTewari
2015-01-01
Full Text Available The voltage-dependent anion channel (VDAC) is the main conduit for permeation of solutes (including nucleotides and metabolites) of up to 5 kDa across the mitochondrial outer membrane (MOM). Recent studies suggest that VDAC activity is regulated via post-translational modifications (PTMs). Yet the nature and effect of these modifications is not understood. Herein, single channel currents of wild-type, nitrosated and phosphorylated VDAC are analyzed using a generalized continuous-time Markov chain Monte Carlo (MCMC) method. This developed method describes three distinct conducting states (open, half-open, and closed) of VDAC1 activity. Lipid bilayer experiments are also performed to record single VDAC activity under un-phosphorylated and phosphorylated conditions, and are analyzed using the developed stochastic search method. Experimental data show significant alteration in VDAC gating kinetics and conductance as a result of PTMs. The effect of PTMs on VDAC kinetics is captured in the parameters associated with the identified Markov model. Stationary distributions of the Markov model suggest that nitrosation of VDAC not only decreased its conductance but also significantly locked VDAC in a closed state. On the other hand, stationary distributions of the model associated with un-phosphorylated and phosphorylated VDAC suggest a reversal in channel conformation from a relatively closed state to an open state. Model analyses of the nitrosated data suggest that faster reaction of nitric oxide with the Cys-127 thiol group might be responsible for the biphasic effect of nitric oxide on basal VDAC conductance.
Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas
2003-01-01
The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.
Manifestation of proton structure in ridge-like correlations in high-energy proton-proton collisions
Kubiczek, Patryk
2015-01-01
Recently, the CMS collaboration reported long-range (in rapidity), near-side ('ridge-like') angular correlations in high-energy proton-proton collisions, the so-called ridge effect. This surprising observation suggests the presence of a collective flow resembling the one believed to produce a similar correlation hydrodynamically in heavy-ion collisions. If the hydrodynamic description is valid, the effect is triggered by the initial spatial anisotropy of the colliding matter. Estimating this anisotropy within different models of the proton's internal structure, and comparing with measured angular correlations in high-energy proton-proton collision data, could in principle discriminate between different proton models. Inspired by recent theoretical developments, we propose several phenomenological models of the proton structure. Subsequently, we calculate the anisotropy coefficients of the dense matter formed in proton-proton collisions within the formalism of the Monte Carlo Glauber model. We find that some p...
Energy Technology Data Exchange (ETDEWEB)
Vasileiou, Chrysoula; Wang, Wenjing; Jia, Xiaofei; Lee, Kin Sing Stephen; Watson, Camille T.; Geiger, James H.; Borhan, Babak; (MSU)
2010-03-04
Cellular Retinoic Acid Binding Protein II (CRABPII) has been reengineered to specifically bind and react with all-trans-retinal to form a protonated Schiff base. Each step of this process has been dissected, and four residues (Lys132, Tyr134, Arg111, and Glu121) within the CRABPII binding site have been identified as crucial for imine formation and/or protonation. The precise role of each residue has been examined through site-directed mutagenesis and crystallographic studies. The crystal structure of the R132K:L121E-CRABPII (PDB-3I17) double mutant suggests a direct interaction between the engineered Glu121 and the native Arg111, which is critical for both Schiff base formation and protonation.
Proton exchange membrane fuel cell system diagnosis based on the signed directed graph method
Hua, Jianfeng; Lu, Languang; Ouyang, Minggao; Li, Jianqiu; Xu, Liangfei
The fuel-cell powered bus is becoming the favored choice for electric vehicles because of its extended driving range, zero emissions, and high energy conversion efficiency when compared with battery-operated electric vehicles. In China, a demonstration program for the fuel cell bus fleet operated at the Beijing Olympics in 2008 and the Shanghai Expo in 2010. It is necessary to develop comprehensive proton exchange membrane fuel cell (PEMFC) diagnostic tools to increase the reliability of these systems. It is especially critical for fuel-cell city buses serving large numbers of passengers using public transportation. This paper presents a diagnostic analysis and implementation study based on the signed directed graph (SDG) method for the fuel-cell system. This diagnostic system was successfully implemented in the fuel-cell bus fleet at the Shanghai Expo in 2010.
Institute of Scientific and Technical Information of China (English)
Anonymous
2001-01-01
The molecular first hyperpolarizabilities (β) and electronic properties of some azulenic retinal analogues and their derivatives have been investigated theoretically by employing semiempirical approaches. The results indicate that the protonated Schiff bases (PSB) of the 2-substituted azulenic retinal analogues possess extremely large negative β values and very good transparency. These can be attributed to the large difference between the ground-state dipole moment and the first excited-state dipole moment, according to the electronic property analysis. The characteristic blue-shifted absorption in polar solvents of the 2-substituted PSB chromophores can be well explained by negative solvatochromic effects. The largest calculated |μβ| value can reach the magnitude of 10^-44 esu, which is close to the highest reported values of synthesized chromophores.
DSC and conductivity studies on PVA based proton conducting gel electrolytes
Indian Academy of Sciences (India)
S L Agrawal; Arvind Awadhia
2004-12-01
An attempt has been made in the present work to prepare and characterize polyvinyl alcohol (PVA) based proton conducting gel electrolytes in ammonium thiocyanate (NH4SCN) solution. DSC studies affirm the formation of gels along with the presence of partial complexes. The Cole-Cole plots exhibit maximum ionic conductivity (2.58 × 10^-3 S cm^-1) for gel samples containing 6 wt% PVA. The conductivity of the gel electrolytes exhibits liquid-like behaviour at low polymer concentrations, while the behaviour is affected by the formation of PVA-NH4SCN complexes as the polymer content increases beyond 5 wt%. The temperature dependence of the ionic conductivity exhibits VTF behaviour.
Hennessy, Ricky; Lim, Sam L; Markey, Mia K; Tunnell, James W
2013-03-01
We present a Monte Carlo lookup table (MCLUT) based inverse model for extracting optical properties from tissue-simulating phantoms. This model is valid for close source-detector separations and highly absorbing tissues. The MCLUT is based entirely on Monte Carlo simulation, which was implemented on a graphics processing unit. We used tissue-simulating phantoms to determine the accuracy of the MCLUT inverse model. Our results show strong agreement between extracted and expected optical properties, with error rates of 1.74% for extracted reduced scattering values, 0.74% for extracted absorption values, and 2.42% for extracted hemoglobin concentration values. PMID:23455965
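The lookup-table inversion described above can be sketched in a few lines: tabulate a forward model over a grid of optical properties, then pick the grid point whose predicted signal best matches the measurement. The analytic forward model below is a crude stand-in for the GPU Monte Carlo simulation in the paper, and the grid ranges and source-detector separations are assumptions made only for illustration.

```python
import numpy as np

# Crude diffusion-like surrogate for the MC forward model: reflectance at
# two source-detector separations (cm) as a function of reduced scattering
# mus_p and absorption mua (cm^-1). Purely illustrative.
def forward(mus_p, mua, sds=(0.25, 0.5)):
    mueff = np.sqrt(3.0 * mua * (mua + mus_p))
    return np.array([mus_p * np.exp(-mueff * d) for d in sds])

# Build the lookup table over an assumed grid of optical properties.
mus_grid = np.linspace(5.0, 25.0, 81)   # reduced scattering, cm^-1
mua_grid = np.linspace(0.1, 5.0, 50)    # absorption, cm^-1
table = {(ms, ma): forward(ms, ma) for ms in mus_grid for ma in mua_grid}

def invert(measured):
    """Return the (mus_p, mua) grid point whose tabulated signal best
    matches the measured signal (least-squares lookup)."""
    return min(table, key=lambda k: float(np.sum((table[k] - measured) ** 2)))
```

A real implementation would interpolate between table entries rather than snap to the nearest grid point, but the round trip already recovers on-grid properties exactly.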
Albin, T.; Koschny, D.; Soja, R.; Srama, R.; Poppe, B.
2016-01-01
The Canary Islands Long-Baseline Observatory (CILBO) is a double-station meteor camera system (Koschny et al., 2013; Koschny et al., 2014) that consists of 5 cameras. The two cameras considered in this report, ICC7 and ICC9, are installed on Tenerife and La Palma. They point to the same atmospheric volume between the islands, allowing stereoscopic observation of meteors. Since its installation in 2011 and the start of operation in 2012, CILBO has detected over 15000 simultaneously observed meteors. Koschny and Diaz (2002) developed the Meteor Orbit and Trajectory Software (MOTS) to compute the trajectory of such meteors. The software uses the astrometric data from the detection software MetRec (Molau, 1998) and determines the trajectory in geodetic coordinates. This work presents a Monte Carlo-based extension of the MOTS code to compute the orbital elements of meteors simultaneously detected by CILBO.
Determination of low-energy structures of a small RNA hairpin using Monte Carlo–based techniques
Indian Academy of Sciences (India)
Sudhanshu Shanker; Pradipta Bandyopadhyay
2012-07-01
The energy landscape of RNA is known to be extremely rugged, and hence finding low-energy structures starting from a random structure is a challenging task for any optimization algorithm. In the current work, we have investigated the ability of one Monte Carlo–based optimization algorithm, Temperature Basin Paving, to explore the energy landscape of a small RNA T-loop hairpin. In this method, the history of the simulation is used to increase the probability of states less visited in the simulation. It has been found that using both energy and end-to-end distance as the biasing parameters in the simulation, the partially folded structure of the hairpin starting from random structures could be obtained.
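The history-biased idea behind Temperature Basin Paving can be illustrated on a toy 1-D rugged landscape: a visit histogram effectively raises the energy of states sampled often, pushing the walker out of basins it has already explored. The landscape, bias strength, and bin width below are all invented for the sketch and are much simpler than the RNA energy function in the paper.

```python
import math, random

def energy(x):
    # Toy rugged 1-D landscape standing in for the RNA energy surface:
    # a shallow parabola decorated with sinusoidal basins.
    return 0.05 * x * x + math.sin(3.0 * x)

def basin_paving(steps=20000, temp=1.0, bias=0.05, seed=1):
    """Metropolis walk whose acceptance uses an effective energy
    E + bias * visits(bin), so frequently visited basins are disfavoured."""
    rng = random.Random(seed)
    x = 4.0
    hist = {}                           # visit counts per coordinate bin
    best_x, best_e = x, energy(x)
    for _ in range(steps):
        x_new = x + rng.uniform(-0.5, 0.5)
        b_old, b_new = round(x, 1), round(x_new, 1)
        e_old = energy(x) + bias * hist.get(b_old, 0)
        e_new = energy(x_new) + bias * hist.get(b_new, 0)
        if e_new <= e_old or rng.random() < math.exp((e_old - e_new) / temp):
            x = x_new
        hist[round(x, 1)] = hist.get(round(x, 1), 0) + 1
        if energy(x) < best_e:          # track the lowest true energy seen
            best_x, best_e = x, energy(x)
    return best_x, best_e, hist
```

In the paper the same history bias is applied to energy and end-to-end distance jointly; here a single coordinate suffices to show the mechanism.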
Ye, Hong-zhou; Jiang, Hong
2014-01-01
Materials with spin-crossover (SCO) properties hold great potential for information storage and have therefore attracted much attention in recent decades. The hysteresis phenomena accompanying SCO are attributed to intermolecular cooperativity, whose underlying mechanism may have a vibronic origin. In this work, a new vibronic Ising-like model, in which the elastic coupling between SCO centers is included by considering harmonic stretching and bending (SAB) interactions, is proposed and solved by Monte Carlo simulations. The key parameters in the new model, k1 and k2, corresponding to the elastic constants of the stretching and bending modes, respectively, can be directly related to the macroscopic bulk and shear moduli of the material under study, which can be readily estimated either from experimental measurements or from first-principles calculations. The convergence of the MC simulations of the thermal hysteresis has been carefully checked, and it was found that the stable hysteresis loop can...
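The Metropolis machinery behind such Ising-like SCO simulations can be sketched on a periodic 1-D chain. Here s_i = +1 (high spin) or -1 (low spin), `delta` is the ligand-field gap, and a single effective coupling `j` stands in for the paper's separate stretching (k1) and bending (k2) constants; all numerical values are illustrative assumptions.

```python
import math, random

def sco_metropolis(n=100, steps=20000, delta=2.0, j=0.5, temp=0.5, seed=7):
    """Metropolis sweep of a 1-D Ising-like spin-crossover chain with
    Hamiltonian H = (delta/2) * sum_i s_i - j * sum_i s_i s_{i+1}.
    Returns the high-spin fraction at the end of the run."""
    rng = random.Random(seed)
    s = [-1] * n                       # start fully low-spin
    for _ in range(steps):
        i = rng.randrange(n)
        nb = s[(i - 1) % n] + s[(i + 1) % n]
        # energy change for flipping site i (periodic boundaries)
        dE = -2 * s[i] * (delta / 2 - j * nb)
        if dE <= 0 or rng.random() < math.exp(-dE / temp):
            s[i] = -s[i]
    return sum(1 for x in s if x == +1) / n
```

A hysteresis study would sweep the temperature up and down while tracking this high-spin fraction; at low temperature with a positive ligand-field gap the chain stays predominantly low-spin, as the quick check below confirms.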
Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.; Azbouche, A.
2007-07-01
The present paper describes the optimization of sample dimensions for a 241Am-Be neutron source-based prompt gamma neutron activation analysis (PGNAA) setup devoted to in situ analysis of environmental water rejects. The optimal dimensions were achieved following extensive Monte Carlo neutron flux calculations using the MCNP5 computer code. A validation process was performed for the proposed preliminary setup with measurements of the thermal neutron flux by the activation technique, using indium foils, bare and covered with a cadmium sheet. Sensitivity calculations were subsequently performed to simulate real conditions of in situ analysis by determining the thermal neutron flux perturbations in samples as the chlorine and organic matter concentrations change. The desired optimal sample dimensions were finally achieved once the established constraints regarding neutron damage to the semiconductor gamma detector, pulse pile-up, dead time and radiation hazards were fully met.
Monte Carlo based approach to the LS–NaI 4πβ–γ anticoincidence extrapolation and uncertainty.
Fitzgerald, R
2016-03-01
The 4πβ–γ anticoincidence method is used for the primary standardization of β−, β+, electron capture (EC), α, and mixed-mode radionuclides. Efficiency extrapolation using one or more γ ray coincidence gates is typically carried out by a low-order polynomial fit. The approach presented here is to use a Geant4-based Monte Carlo simulation of the detector system to analyze the efficiency extrapolation. New code was developed to account for detector resolution, direct γ ray interaction with the PMT, and implementation of experimental β-decay shape factors. The simulation was tuned to 57Co and 60Co data, then tested with 99mTc data, and used in measurements of 18F, 129I, and 124I. The analysis method described here offers a more realistic activity value and uncertainty than those indicated from a least-squares fit alone. PMID:27358944
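The conventional step that the Monte Carlo analysis above replaces is a low-order polynomial fit of the observed rate against the inefficiency parameter, extrapolated to 100% beta efficiency. A minimal sketch with synthetic, noise-free data follows; the activity and slope values are invented purely for illustration.

```python
import numpy as np

# Synthetic efficiency-extrapolation data: observed counting rate as a
# linear function of the inefficiency parameter derived from the gamma
# gates, with a known "true" activity to recover.
true_activity = 1000.0                  # Bq, assumed
slope = 150.0                           # assumed linearity coefficient
ineff = np.linspace(0.05, 0.30, 8)      # (1 - efficiency)/efficiency values
rate = true_activity + slope * ineff    # ideal, noise-free detector response

coeffs = np.polyfit(ineff, rate, 1)     # low-order (here first-order) fit
activity = np.polyval(coeffs, 0.0)      # extrapolate to zero inefficiency
```

With real data the fit residuals and the choice of polynomial order feed into the uncertainty budget, which is exactly where the Geant4-based simulation of the abstract offers a more realistic alternative.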
Excitations in photoactive molecules from quantum Monte Carlo
International Nuclear Information System (INIS)
Despite significant advances in electronic structure methods for the treatment of excited states, attaining an accurate description of the photoinduced processes in photoactive biomolecules is proving very difficult. For the prototypical photosensitive molecules formaldimine, formaldehyde, and a minimal protonated Schiff base model of the retinal chromophore, we investigate the performance of various approaches generally considered promising for the computation of excited potential energy surfaces. We show that quantum Monte Carlo can accurately estimate the excitation energies of the studied systems if one carefully constructs the trial wave function, including in most cases the reoptimization of its determinantal part within quantum Monte Carlo. While time-dependent density functional theory and quantum Monte Carlo are generally in reasonable agreement, they yield a qualitatively different description of the isomerization of the Schiff base model. Finally, we find that the restricted open-shell Kohn-Sham method is at variance with quantum Monte Carlo in estimating the lowest-singlet excited-state potential energy surface for low-symmetry molecular structures
Monte Carlo simulations of the radiation environment for the CMS Experiment
Mallows, Sophie
2015-01-01
Monte Carlo radiation transport codes are used by the CMS Beam Radiation Instrumentation and Luminosity (BRIL) project to estimate the radiation levels due to proton-proton collisions and machine induced background. Results are used by the CMS collaboration for various applications: comparison with detector hit rates, pile-up studies, predictions of radiation damage based on various models (Dose, NIEL, DPA), shielding design, and estimations of the residual dose environment. Simulation parameters and the maintenance of the input files are summarised, and key results are presented. Furthermore, an overview of additional programs developed by the BRIL project to meet the specific needs of the CMS community is given.
Monte Carlo simulations of the radiation environment for the CMS experiment
Mallows, S.; Azhgirey, I.; Bayshev, I.; Bergstrom, I.; Cooijmans, T.; Dabrowski, A.; Glöggler, L.; Guthoff, M.; Kurochkin, I.; Vincke, H.; Tajeda, S.
2016-07-01
Monte Carlo radiation transport codes are used by the CMS Beam Radiation Instrumentation and Luminosity (BRIL) project to estimate the radiation levels due to proton-proton collisions and machine induced background. Results are used by the CMS collaboration for various applications: comparison with detector hit rates, pile-up studies, predictions of radiation damage based on various models (Dose, NIEL, DPA), shielding design, and estimations of the residual dose environment. Simulation parameters and the maintenance of the input files are summarized, and key results are presented. Furthermore, an overview of additional programs developed by the BRIL project to meet the specific needs of the CMS community is given.
GPU-based Monte Carlo Dust Radiative Transfer Scheme Applied to Active Galactic Nuclei
Heymann, Frank; Siebenmorgen, Ralf
2012-05-01
A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing-time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computer power or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons. Anisotropic scattering is treated applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon counting procedure and at high spatial resolution by a vectorized ray tracer. The latter allows computation of high signal-to-noise images of the objects at arbitrary frequencies and viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust temperatures of one- and two-dimensional benchmarks are reproduced at high precision. The parallelization capability of various MC algorithms is analyzed and included in our treatment. We utilize the Lucy algorithm for the optically thin case where the Poisson noise is high, the iteration-free Bjorkman & Wood method to reduce the calculation time, and the Fleck & Canfield diffusion approximation for extremely optically thick cells. The code is applied to model the appearance of active galactic nuclei (AGNs) at optical and infrared wavelengths. The AGN torus is clumpy and includes fluffy composite grains of various sizes made up of silicates and carbon. The dependence of the SED on the number of clumps in the torus and the viewing angle is studied. The appearance of the 10 μm silicate features in absorption or emission is discussed. The SED of the radio-loud quasar 3C 249.1 is fit by the AGN model and a cirrus component to account for the far-infrared emission.
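Anisotropic scattering with the Henyey-Greenstein phase function is usually sampled with the standard analytic inverse-CDF formula, sketched below. This is the textbook sampling trick used throughout MC radiative transfer, not code from the paper.

```python
import math, random

def sample_hg_costheta(g, rng):
    """Draw cos(theta) from the Henyey-Greenstein phase function with
    asymmetry parameter g via its analytic inverse CDF; reduces to
    isotropic sampling as g -> 0."""
    u = rng.random()
    if abs(g) < 1e-6:
        return 2.0 * u - 1.0
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - s * s) / (2.0 * g)
```

A convenient self-check is that the sample mean of cos(theta) converges to g, since g is by definition the mean scattering cosine of the phase function.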
International Nuclear Information System (INIS)
Two proton-conductive molecular hybrid complexes, {[Zn(H2O)8][H(H2O)2](HINO)4(PMo12O40)}n (1) and {[Mn(H2O)8][H(H2O)2.5](HINO)4(PMo12O40)}n (2), were constructed by introducing protonated water clusters, transition-metal ionized water clusters and [PMo12O40]3- anions into the gallery of H-bonding networks based on isonicotinic acid N-oxide (HINO). Single-crystal X-ray diffraction analyses at 293 K revealed that both complexes present exactly the same three-dimensional (3D) hydrogen-bonded networks with large one-dimensional (1D) channels. Interestingly, the [PMo12O40]3- anions fill the 1D channels and self-assemble into poly-Keggin-anion chains. Thermogravimetric analyses show no weight loss in the temperature range 20-100 °C, indicating that the water molecules in the unit structure are not easily lost below 100 °C. Remarkably, the proton conductivities of 1 and 2 in the temperature range 85-100 °C under 98% relative humidity reach 10^-3 S cm^-1. A possible mechanism of the proton conduction is proposed based on the experimental results. Graphical abstract: Two molecular hybrids constructed from ionized water clusters and poly-Keggin-anion chains showed high proton conductivities of 10^-3 S cm^-1 in the temperature range 85-100 °C under 98% relative humidity. Highlights: Proton conductors are of interest for their applications in fuel cells. Heteropolyacids have suitable characteristics to serve as excellent proton conductors. Two new supramolecular complexes based on [PMo12O40]3- and isonicotinic acid N-oxide were constructed. The structures were determined from single-crystal X-ray diffraction data. Both complexes showed good proton conductivities of 10^-3 S cm^-1 in the temperature range 85-100 °C.
Influence of Geant4 parameters on proton dose distribution
Directory of Open Access Journals (Sweden)
Asad Merouani
2015-09-01
Purpose: Proton therapy offers great precision in radiation dose delivery, which is useful when the tumor is located in a sensitive area such as the brain or eyes. Monte Carlo (MC) simulations are commonly used in treatment planning systems (TPS) to estimate the radiation dose. In this paper we are interested in estimating the statistical uncertainty of the proton dose generated by the MC simulations. Methods: Geant4 was used to simulate the eye-treatment room for 62 MeV proton therapy installed at the Istituto Nazionale Fisica Nucleare Laboratori Nazionali del Sud (LNS-INFN) facility in Catania. Geant4 is Monte Carlo-based software dedicated to simulating the passage of particles through matter. In this work we optimize the Geant4 parameters governing the energy deposit distribution of protons, in order to achieve the spatial resolution of the dose distribution required for cancer therapy. We propose various simulations and compare the corresponding dose distributions inside water to evaluate the statistical uncertainties. Results: The simulated Bragg peak, based on the facility model, is in agreement with the experimental data. The calculations show that the mean statistical uncertainty is less than 1% for a simulation set with 5 × 10^4 events, a 10^-3 mm production threshold and a 10^-2 mm step limit. Conclusion: The Geant4 cut and step-limit values can be chosen in combination with the number of events to reach the precision recommended by the International Commission on Radiation Units and Measurements (ICRU) for Monte Carlo codes in proton therapy treatment.
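The "mean statistical uncertainty" reported for MC dose calculations is normally the relative standard error of the mean dose over independent batches of histories. A short sketch of that estimator follows; the batch dose values in the check are invented for illustration.

```python
import math

def relative_uncertainty(batch_doses):
    """Relative standard error of the mean over independent MC batches:
    sqrt(sample_variance / n) divided by the mean batch dose."""
    n = len(batch_doses)
    mean = sum(batch_doses) / n
    var = sum((d - mean) ** 2 for d in batch_doses) / (n - 1)  # sample variance
    return math.sqrt(var / n) / mean
```

In practice this is evaluated per voxel and averaged over voxels above a dose threshold, which is how a "mean statistical uncertainty below 1%" figure is obtained.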
Skrzyński, Witold
2014-11-01
The aim of this work was to create a model of a wide-bore Siemens Somatom Sensation Open CT scanner for use with GMCTdospp, an EGSnrc-based software tool dedicated to Monte Carlo calculations of dose in CT examinations. The method was based on matching the spectrum and filtration to the half value layer (HVL) and dose profile, and thus was similar to the method of Turner et al. (Med. Phys. 36, pp. 2154-2164). Input data on unfiltered beam spectra were taken from two sources: the TASMIP model and IPEM Report 78. Two sources of HVL data were also used, namely measurements and documentation. The dose profile along the fan-beam was measured with Gafchromic RTQA-1010 (QA+) film. A two-component model of filtration was assumed: a bow-tie filter made of aluminum, 0.5 mm thick on the central axis, and a flat filter made of one of four materials: aluminum, graphite, lead, or titanium. Good agreement between calculations and measurements was obtained for models based on the measured values of HVL. Doses calculated with GMCTdospp differed from the doses measured with a pencil ion chamber placed in a PMMA phantom by less than 5%, and the root mean square difference for four tube potentials and three positions in the phantom did not exceed 2.5%. The differences for models based on HVL values from documentation exceeded 10%. Models based on TASMIP spectra and IPEM78 spectra performed equally well. PMID:25028213
SU-E-T-528: Robustness Evaluation for Fiducial-Based Accelerated Partial Breast Proton Therapy
Energy Technology Data Exchange (ETDEWEB)
Zhao, L; Rana, S; Zheng, Y [Procure Proton Therapy Center, Oklahoma City, OK (United States)
2014-06-01
Purpose: To investigate the robustness of proton treatment plans in the presence of rotational setup error when the patient is aligned with implanted fiducials. Methods: Five Stage I invasive breast cancer patients treated under the APBP protocol (PCG BRE007-12) were studied. Rotational setup errors were simulated by rotating the original CT images around the body center clockwise and counterclockwise by 5 degrees (5CW and 5CCW). Manual translational registration was then performed to match the implanted fiducials on the rotated images to the original dataset. Patient contours were copied to the newly created CT set. The original treatment plan was applied to the new CT dataset with the beam isocenter placed at the geometrical center of the PTV. The dose distribution was recalculated for comparison of dosimetric parameters. Results: CTV and PTV coverages (D95 and V95) were not significantly different between the two simulated plans (5CW and 5CCW) and the original plan. The absolute differences in PTV D95 and CTV D95 among the three plans were relatively small, with maximum changes of 0.28 CGE and 0.15 CGE, respectively. The absolute differences in PTV V95 and CTV V95 were 0.79% and 0.48%. The dose to the thyroid, heart, contralateral breast and lung remained zero for all three plans. The Dmax and Dmean to the volume of the ipsilateral breast excluding the CTV were compared, with maximum difference values of 1.02 CGE for Dmax and 3.56 CGE for Dmean. The ipsilateral lung Dmean showed no significant changes across the three plans, with a largest difference of 0.32 CGE. The ipsilateral lung Dmax was the most sensitive parameter in this simulation study, with a maximum difference of 20.2 CGE. Conclusion: Our study suggests that fiducial-based Accelerated Partial Breast Proton Therapy is robust with respect to ±5 degree patient setup rotational errors, as long as internal fiducial markers are used for patient alignment.
Proton-air and proton-proton cross sections
Directory of Open Access Journals (Sweden)
Ulrich Ralf
2013-06-01
Different attempts to measure hadronic cross sections with cosmic ray data are reviewed. The major results are compared to each other and the differences in the corresponding analyses are discussed. Despite some important differences, it is crucial to see that all analyses are based on the same fundamental relation between longitudinal air shower development and the observed fluctuation of experimental observables. Furthermore, the relation of the measured proton-air cross section to the more fundamental proton-proton cross section is discussed. The current global picture combines hadronic proton-proton cross section data from accelerator and cosmic ray measurements and indicates good consistency with model predictions up to the highest energies.
A novel dose-based positioning method for CT image-guided proton therapy
Cheung, Joey P.; Park, Peter C.; Court, Laurence E.; Ronald Zhu, X.; Kudchadker, Rajat J.; Frank, Steven J.; Dong, Lei
2013-01-01
Purpose: Proton dose distributions can potentially be altered by anatomical changes in the beam path despite perfect target alignment using traditional image guidance methods. In this simulation study, the authors explored the use of dosimetric factors instead of only anatomy to set up patients for proton therapy using in-room volumetric computed tomographic (CT) images.
Chevelkov, Veniamin; Habenstein, Birgit; Loquet, Antoine; Giller, Karin; Becker, Stefan; Lange, Adam
2014-05-01
Proton-detected solid-state NMR was applied to a highly deuterated, insoluble, non-crystalline biological assembly, the Salmonella typhimurium type III secretion system (T3SS) needle. Spectra of very high resolution and sensitivity were obtained at a low protonation level of 10-20% at exchangeable amide positions. We developed efficient experimental protocols for resonance assignment tailored to this system and the employed experimental conditions. Using exclusively dipolar-based interspin magnetization transfers, we recorded two sets of 3D spectra allowing an almost complete backbone resonance assignment of the needle subunit PrgI. The additional information provided by the well-resolved proton dimension revealed the presence of two sets of resonances in the N-terminal helix of PrgI, whereas in previous studies employing 13C detection only a single set of resonances was observed.
Monte Carlo-based diode design for correction-less small field dosimetry
Charles, P. H.; Crowe, S. B.; Kairn, T.; Knight, R. T.; Hill, B.; Kenny, J.; Langton, C. M.; Trapp, J. V.
2013-07-01
Due to their small collecting volume, diodes are commonly used in small field dosimetry. However, the relative sensitivity of a diode increases with decreasing small field size. Conversely, small air gaps have been shown to cause a significant decrease in the sensitivity of a detector as the field size is decreased. Therefore, this study uses Monte Carlo simulations to look at introducing air upstream of diodes such that they measure with a constant sensitivity across all field sizes in small field dosimetry. Varying thicknesses of air were introduced onto the upstream end of two commercial diodes (PTW 60016 photon diode and PTW 60017 electron diode), as well as a theoretical unenclosed silicon chip, using field sizes as small as 5 mm × 5 mm. The metric used in this study, D_w,Q/D_Det,Q, represents the ratio of the dose to a point of water to the dose to the diode active volume, for a particular field size and location. The optimal thickness of air required to provide a constant sensitivity across all small field sizes was found by plotting D_w,Q/D_Det,Q as a function of introduced air gap size for various field sizes, and finding the intersection point of these plots; that is, the point at which D_w,Q/D_Det,Q was constant for all field sizes. The optimal thickness of air was calculated to be 3.3, 1.15 and 0.10 mm for the photon diode, electron diode and unenclosed silicon chip, respectively. The variation in these results was due to the different design of each detector. When calculated with the new diode design incorporating the upstream air gap, k_Qclin,Qmsr^fclin,fmsr was equal to unity to within statistical uncertainty (0.5%) for all three diodes. Cross-axis profile measurements were also improved with the new detector design. The upstream air gap could be implemented on the commercial diodes via a cap consisting of the air cavity surrounded by water-equivalent material.
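The intersection-point step described above can be sketched with a linearised toy model: assume the response ratio D_w,Q/D_Det,Q varies linearly with the upstream air-gap thickness t, one line per field size, and solve pairwise for where the lines cross. The slopes below are invented and chosen to cross at 1.15 mm, the electron-diode value quoted in the abstract; the real curves come from the Monte Carlo simulations.

```python
# Assumed per-field sensitivity of the ratio to air-gap thickness (1/mm),
# keyed by square field side in mm. Invented for illustration.
slopes = {5: 0.040, 10: 0.020, 20: 0.010}

def ratio(field_mm, t_mm):
    """Toy linear model of D_w,Q / D_Det,Q versus air-gap thickness t."""
    return 1.00 + slopes[field_mm] * (t_mm - 1.15)

def intersection(f1, f2):
    """Air-gap thickness where the curves of two field sizes coincide:
    solve a1 + b1*t = a2 + b2*t for t."""
    a1, a2 = ratio(f1, 0.0), ratio(f2, 0.0)
    return (a2 - a1) / (slopes[f1] - slopes[f2])
```

With curves from simulation the lines are only approximately straight, so in practice one would fit each curve and locate the common crossing numerically.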
Monte Carlo based beam model using a photon MLC for modulated electron radiotherapy
International Nuclear Information System (INIS)
Purpose: Modulated electron radiotherapy (MERT) promises sparing of organs at risk for certain tumor sites. Any implementation of MERT treatment planning requires an accurate beam model. The aim of this work is the development of a beam model which reconstructs electron fields shaped using the Millennium photon multileaf collimator (MLC) (Varian Medical Systems, Inc., Palo Alto, CA) for a Varian linear accelerator (linac). Methods: This beam model is divided into an analytical part (two photon and two electron sources) and a Monte Carlo (MC) transport through the MLC. For dose calculation purposes the beam model has been coupled with a macro MC dose calculation algorithm. The commissioning process requires a set of measurements and precalculated MC input. The beam model has been commissioned at a source to surface distance of 70 cm for a Clinac 23EX (Varian Medical Systems, Inc., Palo Alto, CA) and a TrueBeam linac (Varian Medical Systems, Inc., Palo Alto, CA). For validation purposes, measured and calculated depth dose curves and dose profiles are compared for four different MLC shaped electron fields and all available energies. Furthermore, a measured two-dimensional dose distribution for patched segments consisting of three 18 MeV segments, three 12 MeV segments, and a 9 MeV segment is compared with corresponding dose calculations. Finally, measured and calculated two-dimensional dose distributions are compared for a circular segment encompassed with a C-shaped segment. Results: For 15 × 34, 5 × 5, and 2 × 2 cm2 fields differences between water phantom measurements and calculations using the beam model coupled with the macro MC dose calculation algorithm are generally within 2% of the maximal dose value or 2 mm distance to agreement (DTA) for all electron beam energies. For a more complex MLC pattern, differences between measurements and calculations are generally within 3% of the maximal dose value or 3 mm DTA for all electron beam energies. For the two
Institute of Scientific and Technical Information of China (English)
Xu Ke; He Huagang; Zhu Yichuan
2012-01-01
Gas dispersion simulations at present mostly use computational fluid dynamics methods to analyze the dynamics of the dispersion process. However, finite-volume (FVM) and finite-element (FEM) methods must mesh the entire accident area, and their computational efficiency fails to meet the emergency-response requirements of long-distance pipeline accidents, which involve cross-regional scope, multiple meteorological conditions, and complicated terrain. By simulating the random-walk behavior of gas particles in the mean wind field predicted by the RAMS model, the Monte Carlo method avoids the degradation in simulation performance that stems from the conflict between computing efficiency and meshing accuracy. In addition, the HAVEGE method, which collects hardware entropy of the computing host as a randomness source, corrects the pseudorandom-number problem and improves the computational accuracy of the Monte Carlo method. The results show that Monte Carlo-based gas dispersion simulation satisfies the requirements of emergency decision-making for long-distance pipeline accidents.
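The particle random walk at the heart of such a dispersion model is simple to sketch: each particle is advected by the mean wind each step plus a Gaussian turbulent kick. A constant wind vector below stands in for the RAMS-predicted field, and all parameter values are illustrative assumptions.

```python
import random

def disperse(n_particles=2000, n_steps=100, dt=1.0,
             wind=(2.0, 0.5), sigma=0.8, seed=42):
    """Monte Carlo random-walk dispersion: advect each particle by the
    mean wind (m/s) and add a Gaussian turbulent displacement per step.
    Returns the final (x, y) positions of all particles (m)."""
    rng = random.Random(seed)
    positions = []
    for _ in range(n_particles):
        x = y = 0.0
        for _ in range(n_steps):
            x += wind[0] * dt + rng.gauss(0.0, sigma)
            y += wind[1] * dt + rng.gauss(0.0, sigma)
        positions.append((x, y))
    return positions
```

Binning the final positions yields a concentration field without any mesh over the accident area, which is exactly the efficiency argument the abstract makes.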
Optimization of PEM detector designs based on Monte Carlo simulations
Institute of Scientific and Technical Information of China (English)
Wang Mengdie; Liu Yilong; Hu Guangshu; Zhang Hui
2014-01-01
The influence of detector thickness on reconstructed image quality in dual-plate positron emission mammography (PEM) was evaluated in order to optimize the crystal thickness. The evaluations used a dual-plate PET system model built on the GEANT4 Application for Tomographic Emission (GATE) platform. Monte Carlo simulations were conducted for various detector thicknesses to analyze system performance. Comparisons of system sensitivity, image spatial resolution and tumor detectability showed that thinner crystals tend to improve the spatial resolution but reduce the system sensitivity and tumor detectability. For a 10 mm thick lutetium-yttrium oxyorthosilicate (LYSO) crystal, the spatial resolution along both the width and length directions of the panel was good and was not severely impacted by the depth-of-interaction (DOI) effect, and the system sensitivity was about 11%. With 10 min of data acquisition, a small low-concentration tumor source could be easily distinguished from the background in the reconstructed image. Consequently, 10 mm was chosen as the optimal crystal thickness for the PEM system, achieving both high system sensitivity and a small DOI effect.
An OpenCL-based Monte Carlo dose calculation engine (oclMC) for coupled photon-electron transport
Tian, Zhen; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun
2015-01-01
The Monte Carlo (MC) method is recognized as the most accurate dose calculation method for radiotherapy. However, its extremely long computation time impedes clinical application. Recently, considerable effort has been made to achieve fast MC dose calculation on GPUs. Nonetheless, most GPU-based MC dose engines were developed in the NVidia CUDA environment, which limits code portability to other platforms and hinders the introduction of GPU-based MC simulations into clinical practice. The objective of this paper is to develop a fast cross-platform MC dose engine, oclMC, using the OpenCL environment for external beam photon and electron radiotherapy in the MeV energy range. Coupled photon-electron MC simulation was implemented with analogue simulation for photon transport and a Class II condensed history scheme for electron transport. To test the accuracy and efficiency of our dose engine oclMC, we compared dose calculation results of oclMC and gDPM, our previously developed GPU-based MC code, for a 15 MeV electron ...
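The analogue photon transport that such an engine builds on starts from free-path sampling against the attenuation coefficient. The sketch below shows only that first ingredient (absorption-only slab transmission); it illustrates the sampling step, not oclMC itself, and `mu` and the slab thickness are arbitrary.

```python
import math
import random

def transmitted_fraction(mu, thickness, n_photons=100_000, seed=0):
    """Analogue photon transport in its simplest form: sample a free
    path s = -ln(xi) / mu and count photons that cross a slab without
    interacting (absorption only; scattering and the coupled electron
    step are what a full engine adds on top)."""
    rng = random.Random(seed)
    crossed = sum(1 for _ in range(n_photons)
                  if -math.log(1.0 - rng.random()) / mu > thickness)
    return crossed / n_photons

f = transmitted_fraction(mu=0.2, thickness=5.0)  # analytic answer: exp(-1)
```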
Directory of Open Access Journals (Sweden)
Tuija Kangasmaa
2012-01-01
Simultaneous Tl-201/Tc-99m dual isotope myocardial perfusion SPECT is seriously hampered by down-scatter from Tc-99m into the Tl-201 energy window. This paper presents and optimises an ordered-subsets-expectation-maximisation (OS-EM) based reconstruction algorithm, which corrects the down-scatter using an efficient Monte Carlo (MC) simulator. The algorithm starts by first reconstructing the Tc-99m image with attenuation, collimator response, and MC-based scatter correction. The reconstructed Tc-99m image is then used as an input for an efficient MC-based down-scatter simulation of Tc-99m photons into the Tl-201 window. This down-scatter estimate is finally used in the Tl-201 reconstruction to correct the crosstalk between the two isotopes. The mathematical 4D NCAT phantom and physical cardiac phantoms were used to optimise the number of OS-EM iterations at which the scatter estimate is updated and the number of MC-simulated photons. The results showed that two scatter update iterations and 10^5 simulated photons are enough for the Tc-99m and Tl-201 reconstructions, whereas 10^6 simulated photons are needed to generate good quality down-scatter estimates. With these parameters, the entire Tl-201/Tc-99m dual isotope reconstruction can be accomplished in less than 3 minutes.
Kangasmaa, Tuija; Kuikka, Jyrki; Sohlberg, Antti
2012-01-01
Simultaneous Tl-201/Tc-99m dual isotope myocardial perfusion SPECT is seriously hampered by down-scatter from Tc-99m into the Tl-201 energy window. This paper presents and optimises the ordered-subsets-expectation-maximisation (OS-EM) based reconstruction algorithm, which corrects the down-scatter using an efficient Monte Carlo (MC) simulator. The algorithm starts by first reconstructing the Tc-99m image with attenuation, collimator response, and MC-based scatter correction. The reconstructed Tc-99m image is then used as an input for an efficient MC-based down-scatter simulation of Tc-99m photons into the Tl-201 window. This down-scatter estimate is finally used in the Tl-201 reconstruction to correct the crosstalk between the two isotopes. The mathematical 4D NCAT phantom and physical cardiac phantoms were used to optimise the number of OS-EM iterations where the scatter estimate is updated and the number of MC simulated photons. The results showed that two scatter update iterations and 10^5 simulated photons are enough for the Tc-99m and Tl-201 reconstructions, whereas 10^6 simulated photons are needed to generate good quality down-scatter estimates. With these parameters, the entire Tl-201/Tc-99m dual isotope reconstruction can be accomplished in less than 3 minutes.
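The reconstruction loop described above (forward-project, add the MC down-scatter estimate, compare to measured data, update) can be sketched as a toy 1-D ML-EM with a known additive scatter term. Ordered subsets and the actual MC simulator are omitted; the system matrix, activity, and scatter level here are invented for illustration.

```python
import numpy as np

def mlem_with_scatter(y, A, scatter, n_iter=500):
    """ML-EM with a known additive scatter term in the forward model:
    x <- x * A^T(y / (Ax + s)) / A^T 1.  The MC-simulated down-scatter
    estimate enters as `scatter`."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])          # sensitivity image
    for _ in range(n_iter):
        proj = A @ x + scatter                # forward project + scatter
        x *= (A.T @ (y / np.maximum(proj, 1e-12))) / sens
    return x

rng = np.random.default_rng(1)
A = rng.uniform(0.1, 1.0, size=(8, 4))        # toy system matrix
x_true = np.array([1.0, 4.0, 2.0, 0.5])       # toy activity distribution
scatter = np.full(8, 0.3)                     # down-scatter estimate
y = A @ x_true + scatter                      # noiseless projection data
x_hat = mlem_with_scatter(y, A, scatter)
```

The multiplicative update keeps the estimate non-negative, which is why the additive crosstalk term belongs in the forward projection rather than being subtracted from the data.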
Pan, J.; Durand, M. T.; Vanderjagt, B. J.
2015-12-01
The Markov Chain Monte Carlo (MCMC) method is a retrieval algorithm based on Bayes' rule: it starts from an initial state of snow/soil parameters and updates it through a series of new states by comparing the posterior probability of simulated snow microwave signals before and after each random-walk step. It thereby approximates the probability of the snow/soil parameters conditioned on the measured microwave TB signals at different bands. Although this method can solve for all snow parameters, including depth, density, snow grain size, and temperature, simultaneously, it still needs prior information on these parameters to compute the posterior probability, and how the priors influence the SWE retrieval is a major concern. Therefore, in this paper a sensitivity test is first carried out to study how accurate the snow emission models and how explicit the snow priors need to be to keep the SWE error within a given bound. Synthetic TB simulated from the measured snow properties, plus a 2-K observation error, is used for this purpose, aiming to provide guidance for MCMC applications under different circumstances. The method is then applied to snowpits at different sites, including Sodankyla, Finland; Churchill, Canada; and Colorado, USA, using TB measured by ground-based radiometers at different bands. Building on the previous work, the errors in these practical cases are studied, and the error sources are separated and quantified.
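The random-walk MCMC retrieval described above can be sketched with a one-parameter toy problem: sample snow depth from the posterior given a single simulated TB observation with the 2-K error mentioned in the abstract. The emission model `tb_model`, the Gaussian prior, and the step size are illustrative assumptions, not the paper's models.

```python
import math
import random

def tb_model(depth_m):
    """Toy emission model: brightness temperature falls with snow depth."""
    return 260.0 - 40.0 * (1.0 - math.exp(-depth_m / 0.5))

def metropolis_depth(tb_obs, sigma_obs=2.0, prior_mu=0.6, prior_sd=0.3,
                     n_samples=20_000, step=0.05, seed=0):
    """Random-walk Metropolis sampling of snow depth given one TB
    observation with a 2-K error, plus a Gaussian prior on depth."""
    rng = random.Random(seed)

    def log_post(d):
        if d <= 0.0:
            return -math.inf
        return (-0.5 * ((tb_obs - tb_model(d)) / sigma_obs) ** 2
                - 0.5 * ((d - prior_mu) / prior_sd) ** 2)

    d = prior_mu
    lp = log_post(d)
    samples = []
    for _ in range(n_samples):
        d_new = d + rng.gauss(0.0, step)
        lp_new = log_post(d_new)
        if lp_new >= lp or rng.random() < math.exp(lp_new - lp):
            d, lp = d_new, lp_new
        samples.append(d)
    return samples[n_samples // 2:]           # crude burn-in removal

posterior = metropolis_depth(tb_obs=tb_model(0.8))  # truth: 0.8 m
mean_depth = sum(posterior) / len(posterior)
```

Shifting `prior_mu` away from the truth pulls `mean_depth` with it, which is exactly the prior-sensitivity effect the sensitivity test in the abstract quantifies.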
Reply to "Comment on 'A study on tetrahedron-based inhomogeneous Monte-Carlo optical simulation'".
Shen, Haiou; Wang, Ge
2011-04-19
We compare the accuracy of TIM-OS and MMCM in response to the recent analysis made by Fang [Biomed. Opt. Express 2, 1258 (2011)]. Our results show that the tetrahedron-based energy deposition algorithm used in TIM-OS is more accurate than the node-based energy deposition algorithm used in MMCM.
Nichiporov, D; Coutinho, L; Klyachko, A V
2016-04-21
Accurate, high-spatial resolution dosimetry in proton therapy is a time consuming task, and may be challenging in the case of small fields, due to the lack of adequate instrumentation. The purpose of this work is to develop a novel dose imaging detector with high spatial resolution and tissue equivalent response to dose in the Bragg peak, suitable for beam commissioning and quality assurance measurements. A scintillation gas electron multiplier (GEM) detector based on a double GEM amplification structure with optical readout was filled with a He/CF4 gas mixture and evaluated in pristine and modulated proton beams of several penetration ranges. The detector's performance was characterized in terms of linearity in dose rate, spatial resolution, short- and long-term stability and tissue-equivalence of response at different energies. Depth-dose profiles measured with the GEM detector in the 115-205 MeV energy range were compared with the profiles measured under similar conditions using the PinPoint 3D small-volume ion chamber. The GEM detector filled with a He-based mixture has a nearly tissue equivalent response in the proton beam and may become an attractive and efficient tool for high-resolution 2D and 3D dose imaging in proton dosimetry, and especially in small-field applications. PMID:26992243
Nichiporov, D.; Coutinho, L.; Klyachko, A. V.
2016-04-01
Accurate, high-spatial resolution dosimetry in proton therapy is a time consuming task, and may be challenging in the case of small fields, due to the lack of adequate instrumentation. The purpose of this work is to develop a novel dose imaging detector with high spatial resolution and tissue equivalent response to dose in the Bragg peak, suitable for beam commissioning and quality assurance measurements. A scintillation gas electron multiplier (GEM) detector based on a double GEM amplification structure with optical readout was filled with a He/CF4 gas mixture and evaluated in pristine and modulated proton beams of several penetration ranges. The detector’s performance was characterized in terms of linearity in dose rate, spatial resolution, short- and long-term stability and tissue-equivalence of response at different energies. Depth-dose profiles measured with the GEM detector in the 115-205 MeV energy range were compared with the profiles measured under similar conditions using the PinPoint 3D small-volume ion chamber. The GEM detector filled with a He-based mixture has a nearly tissue equivalent response in the proton beam and may become an attractive and efficient tool for high-resolution 2D and 3D dose imaging in proton dosimetry, and especially in small-field applications.
Design of a 10 MeV normal conducting CW proton linac based on equidistant multi-gap CH cavities
Li, Zhihui
2014-01-01
The continuous wave (CW) high-current proton linac has wide applications as the front end of high-power proton machines. The low-energy part is the most difficult, and there is no widely accepted solution yet. Based on an analysis of the focusing properties of the CW low-energy proton linac, a 10 MeV low-energy normal conducting proton linac based on equidistant seven-gap Cross-bar H-type (CH) cavities is proposed. The linac is composed of ten 7-gap CH cavities, and transverse focusing is maintained by quadrupole doublets located between the cavities. The total length of the linac is less than 6 meters and the average acceleration gradient is about 1.2 MeV/m. The electromagnetic properties of the cavities are investigated with Microwave Studio. At the nominal acceleration gradient, the maximum surface electric field in the cavities is less than 1.3 times the Kilpatrick limit, and the Ohmic loss of each cavity is less than 35 kW. The multi-particle beam dynamics simulations are performed with the help of the...
Proton Linear Energy Transfer measurement using Emulsion Cloud Chamber
Energy Technology Data Exchange (ETDEWEB)
Shin, Jae-ik [Proton Therapy Center, National Cancer Center (Korea, Republic of); Division of Heavy Ion Clinical Research, Korea Institute of Radiological & Medical Sciences (KIRAMS), Seoul (Korea, Republic of); Park, Seyjoon [Department of Radiation Oncology, Samsung Medical Center, Sungkyunkwan University, School of Medicine, Seoul (Korea, Republic of); Kim, Haksoo; Kim, Meyoung [Proton Therapy Center, National Cancer Center (Korea, Republic of); Jeong, Chiyoung [Department of Radiation Oncology, Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of); Cho, Sungkoo [Department of Radiation Oncology, Samsung Medical Center, Sungkyunkwan University, School of Medicine, Seoul (Korea, Republic of); Lim, Young Kyung; Shin, Dongho [Proton Therapy Center, National Cancer Center (Korea, Republic of); Lee, Se Byeong, E-mail: sblee@ncc.re.kr [Proton Therapy Center, National Cancer Center (Korea, Republic of); Morishima, Kunihiro; Naganawa, Naotaka; Sato, Osamu [Department of Physics, Nagoya University, Nagoya (Japan); Kwak, Jungwon [Department of Radiation Oncology, Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of); Kim, Sung Hyun [Center for Underground Physics, Institute for Basic Science (IBS), Daejeon (Korea, Republic of); Cho, Jung Sook [Department of refinement education, Dongseo University, Busan (Korea, Republic of); Ahn, Jung Keun [Department of Physics, Korea University, Seoul (Korea, Republic of); Kim, Ji Hyun; Yoon, Chun Sil [Gyeongsang National University, Jinju (Korea, Republic of); Incerti, Sebastien [CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France)
2015-04-15
This study proposes to determine the correlation between the Volume Pulse Height (VPH) measured by nuclear emulsion and the Linear Energy Transfer (LET) calculated by Monte Carlo simulation based on Geant4. The nuclear emulsion was irradiated at the National Cancer Center (NCC) with a therapeutic proton beam, installed at a distance of 5.2 m from the beam nozzle, with water-equivalent material (PMMA) blocks of various thicknesses used to place the films at specific positions along the Bragg curve. After beam exposure and development of the emulsion films, the films were scanned by the S-UTS system developed at Nagoya University. The proton tracks in the scanned films were reconstructed using the ‘NETSCAN’ method. Through this procedure, the VPH was derived for each reconstructed proton track at each position along the Bragg curve; the VPH value indicates the magnitude of energy loss along the proton track. By comparison with the simulation results obtained using Geant4, we found the correlation between the LET calculated by Monte Carlo simulation and the VPH measured by the nuclear emulsion.
Vila, Gabriela S.; Romero, Gustavo E.
2008-01-01
We present a model for high-energy emission in microquasars where the energy content of the jets is dominated by relativistic protons. We also include a primary leptonic component. Particles are accelerated up to relativistic energies in a compact region located near the base of the jet, where most of the emission is produced. We calculate the production spectrum due to proton and electron synchrotron radiation and photohadronic interactions. The target field for proton-photon collisions is p...
Simakov, S P; Moellendorff, U V; Schmuck, I; Konobeev, A Y; Korovin, Y A; Pereslavtsev, P
2002-01-01
A newly developed computational procedure is presented for the generation of d-Li source neutrons in Monte Carlo transport calculations based on the use of evaluated double-differential d + ⁶,⁷Li cross section data. A new code, McDeLicious, was developed as an extension to MCNP4C to enable neutronics design calculations for the d-Li based IFMIF neutron source making use of the evaluated deuteron data files. The McDeLicious code was checked against available experimental data and calculation results of McDeLi and MCNPX, both of which use built-in analytical models for the Li(d, xn) reaction. It is shown that McDeLicious along with the newly evaluated d + ⁶,⁷Li data is superior in predicting the characteristics of the d-Li neutron source. As this approach makes use of tabulated Li(d, xn) cross sections, the accuracy of the IFMIF d-Li neutron source term can be steadily improved with more advanced and validated data.
International Nuclear Information System (INIS)
During construction of the whole body counter (WBC) at the Children's Nutrition Research Center (CNRC), efficiency calibration was needed to translate acquired counts of 40K to actual grams of potassium for measurement of total body potassium (TBK) in a diverse subject population. The MCNP Monte Carlo n-particle simulation program was used to describe the WBC (54 detectors plus shielding), test individual detector counting response, and create a series of virtual anthropomorphic phantoms based on national reference anthropometric data. Each phantom included an outer layer of adipose tissue and an inner core of lean tissue. Phantoms were designed for both genders representing ages 3.5 to 18.5 years with body sizes from the 5th to the 95th percentile based on body weight. In addition, a spherical surface source surrounding the WBC was modeled in order to measure the effects of subject mass on room background interference. Individual detector measurements showed good agreement with the MCNP model. The background source model came close to agreement with empirical measurements, but showed a trend deviating from unity with increasing subject size. Results from the MCNP simulation of the CNRC WBC agreed well with empirical measurements using BOMAB phantoms. Individual detector efficiency corrections were used to improve the accuracy of the model. Nonlinear multiple regression efficiency calibration equations were derived for each gender. Room background correction is critical in improving the accuracy of the WBC calibration.
Directory of Open Access Journals (Sweden)
Qinming Liu
2012-01-01
Health management of a complex nonlinear system is becoming more important for condition-based maintenance and for minimizing the related risks and costs over the system's entire life. However, a complex nonlinear system often operates under dynamic operational and environmental conditions and is subject to high levels of uncertainty and unpredictability, so effective methods for online health management are still few. This paper combines the hidden semi-Markov model (HSMM) with sequential Monte Carlo (SMC) methods. The HSMM is used to obtain the transition probabilities among health states and the health-state durations of a complex nonlinear system, while the SMC method is adopted to decrease the computational and space complexity and to describe the probability relationships between multiple health states and the monitored observations of the system. This paper proposes a novel method of multi-step-ahead health recognition based on the joint probability distribution for health management of a complex nonlinear system. Moreover, a new online health prognostic method is developed. A real case study is used to demonstrate the implementation and potential applications of the proposed methods for online health management of complex nonlinear systems.
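The HSMM + SMC combination can be illustrated with a bootstrap particle filter over a small set of discrete health states: propagate particles through a transition model, weight them by the observation likelihood, and resample. The transition matrix and Gaussian observation models below are invented stand-ins for the parameters an HSMM would supply (explicit state durations are omitted).

```python
import math
import random

# Invented HSMM-style parameters: three health states (0 = healthy,
# 1 = degraded, 2 = faulty), degradation-only transitions, and a
# Gaussian observation model per state.
TRANS = {0: [(0, 0.95), (1, 0.05)],
         1: [(1, 0.95), (2, 0.05)],
         2: [(2, 1.0)]}
OBS_MEAN = {0: 0.0, 1: 1.0, 2: 2.0}

def likelihood(obs, state, sd=0.4):
    return math.exp(-0.5 * ((obs - OBS_MEAN[state]) / sd) ** 2)

def particle_filter(observations, n_particles=2000, seed=0):
    """Bootstrap SMC: propagate state particles through the transition
    model, weight by the observation likelihood, resample."""
    rng = random.Random(seed)
    particles = [0] * n_particles             # all start healthy
    for obs in observations:
        moved = []
        for s in particles:                   # propagate
            r, acc = rng.random(), 0.0
            for s2, p in TRANS[s]:
                acc += p
                if r <= acc:
                    moved.append(s2)
                    break
        weights = [likelihood(obs, s) for s in moved]
        particles = rng.choices(moved, weights=weights, k=n_particles)
    return {s: particles.count(s) / n_particles for s in (0, 1, 2)}

belief = particle_filter([0.1, 0.0, 0.9, 1.1, 1.0])  # drifting toward state 1
```

The returned belief is the posterior over health states; multi-step-ahead recognition then amounts to running the propagate step forward without the weighting.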
Shypailo, R J; Ellis, K J
2011-05-21
During construction of the whole body counter (WBC) at the Children's Nutrition Research Center (CNRC), efficiency calibration was needed to translate acquired counts of 40K to actual grams of potassium for measurement of total body potassium (TBK) in a diverse subject population. The MCNP Monte Carlo n-particle simulation program was used to describe the WBC (54 detectors plus shielding), test individual detector counting response, and create a series of virtual anthropomorphic phantoms based on national reference anthropometric data. Each phantom included an outer layer of adipose tissue and an inner core of lean tissue. Phantoms were designed for both genders representing ages 3.5 to 18.5 years with body sizes from the 5th to the 95th percentile based on body weight. In addition, a spherical surface source surrounding the WBC was modeled in order to measure the effects of subject mass on room background interference. Individual detector measurements showed good agreement with the MCNP model. The background source model came close to agreement with empirical measurements, but showed a trend deviating from unity with increasing subject size. Results from the MCNP simulation of the CNRC WBC agreed well with empirical measurements using BOMAB phantoms. Individual detector efficiency corrections were used to improve the accuracy of the model. Nonlinear multiple regression efficiency calibration equations were derived for each gender. Room background correction is critical in improving the accuracy of the WBC calibration.
Shypailo, R. J.; Ellis, K. J.
2011-05-01
During construction of the whole body counter (WBC) at the Children's Nutrition Research Center (CNRC), efficiency calibration was needed to translate acquired counts of 40K to actual grams of potassium for measurement of total body potassium (TBK) in a diverse subject population. The MCNP Monte Carlo n-particle simulation program was used to describe the WBC (54 detectors plus shielding), test individual detector counting response, and create a series of virtual anthropomorphic phantoms based on national reference anthropometric data. Each phantom included an outer layer of adipose tissue and an inner core of lean tissue. Phantoms were designed for both genders representing ages 3.5 to 18.5 years with body sizes from the 5th to the 95th percentile based on body weight. In addition, a spherical surface source surrounding the WBC was modeled in order to measure the effects of subject mass on room background interference. Individual detector measurements showed good agreement with the MCNP model. The background source model came close to agreement with empirical measurements, but showed a trend deviating from unity with increasing subject size. Results from the MCNP simulation of the CNRC WBC agreed well with empirical measurements using BOMAB phantoms. Individual detector efficiency corrections were used to improve the accuracy of the model. Nonlinear multiple regression efficiency calibration equations were derived for each gender. Room background correction is critical in improving the accuracy of the WBC calibration.
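The calibration chain in the records above, from counts of 40K to grams of potassium via a size-dependent detector efficiency, can be sketched as follows. The phantom data points and the exponential efficiency model are invented for illustration (the paper derives nonlinear multiple regression equations per gender); the 40K gamma yield and the specific activity of natural potassium are approximate physical constants.

```python
import math

# Invented calibration points: (body mass in kg, counting efficiency),
# such as MCNP phantom runs would produce; efficiency falls with
# subject size because of self-absorption.
phantoms = [(15, 0.0268), (25, 0.0249), (40, 0.0222),
            (60, 0.0191), (80, 0.0165)]

def fit_efficiency(data):
    """Fit eff = a * exp(-b * mass) by least squares on log(eff), a
    simple stand-in for the paper's nonlinear multiple regression."""
    n = len(data)
    xs = [m for m, _ in data]
    ys = [math.log(e) for _, e in data]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return math.exp(ybar - slope * xbar), -slope   # a, b

a_cal, b_cal = fit_efficiency(phantoms)

def grams_potassium(net_counts, mass_kg, live_time_s,
                    gamma_yield=0.1066, bq_per_g_k=31.0):
    """Convert net 40K counts to grams of potassium with the fitted
    size-dependent efficiency.  gamma_yield is the 1.46 MeV branching
    ratio of 40K; bq_per_g_k is the approximate specific activity of
    natural potassium."""
    eff = a_cal * math.exp(-b_cal * mass_kg)
    return net_counts / (eff * gamma_yield * bq_per_g_k * live_time_s)
```

A mass-dependent background term would enter the same way, as a correction to `net_counts`, which is why the abstract stresses room background correction.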
Recent improvements on Monte Carlo modelling at ATLAS
Soualah, Rachik; The ATLAS collaboration
2015-01-01
The most recent findings on the Monte Carlo simulation of proton-proton collisions at ATLAS are presented. In particular, the most recent combined MPI and parton-shower tunes performed using 7 TeV ATLAS data are reported, as well as improved modelling of electroweak processes and of processes containing top quarks using recent MC generators and PDF sets.
International Nuclear Information System (INIS)
Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of the approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user’s imaging requirements and automatically generates command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two sample software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body “step and shoot” acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry
Energy Technology Data Exchange (ETDEWEB)
Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, 31062 Toulouse (France); McKay, Erin [St George Hospital, Gray Street, Kogarah, New South Wales 2217 (Australia); Ferrer, Ludovic [ICO René Gauducheau, Boulevard Jacques Monod, St Herblain 44805 (France); Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila [European Institute of Oncology, Via Ripamonti 435, Milano 20141 (Italy); Bardiès, Manuel [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, Toulouse 31062 (France)
2015-12-15
Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of the approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user’s imaging requirements and automatically generates command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two sample software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body “step and shoot” acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry
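The "simulate each compartment once, then weight by pharmacokinetics" step that both TestDose records describe amounts to a weighted sum of per-compartment projections. A minimal sketch, with invented compartment names, projections, and activities:

```python
import numpy as np

def aggregate_projections(compartment_projs, activities):
    """Weight each compartment's unit-activity projection by its
    pharmacokinetic activity at the chosen time point and sum them into
    the final image (each compartment is simulated only once)."""
    image = np.zeros_like(next(iter(compartment_projs.values())), dtype=float)
    for name, proj in compartment_projs.items():
        image += activities[name] * proj
    return image

# Invented 2x2 "projections" and activities, purely to show the shape
# of the aggregation step.
projs = {"liver":  np.array([[1.0, 0.2], [0.1, 0.0]]),
         "kidney": np.array([[0.0, 0.5], [0.8, 0.1]])}
activity_1h = {"liver": 3.0, "kidney": 1.5}    # MBq at t = 1 h
img = aggregate_projections(projs, activity_1h)
```

Because the weighting is linear, one expensive GATE run per compartment suffices for any number of time points or kinetic models.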
Proton and electron deep dose profiles for retinoblastoma based on GEANT 4 code
Energy Technology Data Exchange (ETDEWEB)
Braga, Flavia V., E-mail: flaviafisica@gmail.co [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Programa de Pos-graduacao em Ciencias e Tecnicas Nucleares; Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Campos, Tarcisio P.R. de [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Programa de Pos-graduacao em Ciencias e Tecnicas Nucleares; Ribeiro, Kilder L., E-mail: kilderlr@gmail.co [Universidade Estadual de Feira de Santana (UEFS), BA (Brazil). Dept. de Fisica
2009-07-01
Herein, the dosimetric response of retinoblastoma to proton and electron radiation therapy was investigated. The computational tool applied in this simulation was the Geant4 code, version 4.9.1, which allows simulating the interaction of charged particles with eyeball tissue. In the present simulation, a water-filled box of 4 cm side represented the human eye. The simulation considered monoenergetic beams of protons and electrons, with energies of 57 to 70 MeV for protons and 2 to 8 MeV for electrons, and was guided by the advanced hadron therapy example distributed with the Geant4 code. The phantom was divided into voxels of 0.2 mm side, and the energy deposited in each voxel was evaluated with the beam incident on one face. The simulation results give the delivered energy, and therefore the dose deposited, in each voxel. Depth-dose profiles for protons and electrons were plotted. The well-known Bragg peak was reproduced for protons, with the maximum delivered dose defining the position at which the protons stopped. For electrons, the absorbed energy was delivered along the path, producing a more continuous distribution with water depth, with the electrons also stopping at the end of their path. (author)
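The voxelized depth-dose tally described above can be mimicked with a deliberately crude 1-D model: step each proton through 0.2 mm voxels with a stopping power that rises as the energy falls, so the tally reproduces a Bragg-peak shape. The dE/dx ~ k/E form, the constant k, and the straggling term are toy assumptions, not Geant4 physics.

```python
import random

def depth_dose(e0_mev, n_protons=200, voxel_mm=0.2, k=0.25, seed=0):
    """1-D toy transport: step protons through water voxels with a
    stopping power that grows as the energy falls (dE/dx ~ k/E with a
    small Gaussian straggling term), tallying deposited energy per
    voxel; the deposit per voxel peaks where the protons stop."""
    rng = random.Random(seed)
    dose = []
    for _ in range(n_protons):
        e, i = e0_mev, 0
        while e > 0.0:
            de = min(e, (k / max(e, 0.5)) * voxel_mm
                        * (1.0 + 0.05 * rng.gauss(0.0, 1.0)))
            if i >= len(dose):
                dose.append(0.0)
            dose[i] += de
            e -= de
            i += 1
    return dose

profile = depth_dose(10.0)                     # 10 MeV toy protons
peak_voxel = max(range(len(profile)), key=lambda i: profile[i])
```

Energy is conserved by construction (every proton deposits exactly its initial energy), and the dose maximum sits near the end of range, which is the Bragg-peak behaviour the abstract reports for protons.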
15 MeV proton irradiation effects on Bi-based high temperature superconductors
Energy Technology Data Exchange (ETDEWEB)
Alinejad, N.; Sohrabi, D. [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of). Plasma and Nuclear Fusion Research School; Bolori, F. [Karaj Agricultural, Medical, and Industrial Research School, Karaj (Iran, Islamic Republic of)
2015-11-15
Nowadays, superconducting magnetic coils are used in tokamaks such as EAST, KSTAR, JT-60, and T-15 to generate strong magnetic fields, and in ITER magnetic fields of about 13 tesla will be produced with the help of superconductors. Tokamak superconductors are exposed to a variety of radiations (neutrons, ion beams, and gamma rays) from plasma nuclear reactions, which affect some of the superconductor properties. Therefore, the study of irradiation effects on superconductor structure and properties is crucial from both a technological and a scientific point of view. One of the superconductor irradiation effects to be investigated under different conditions of energy and dosage is the potential resistance of the material used in tokamak reactor magnetic coils against activation by radiation. In this work, pellets of high-Tc Bi-based superconductors were prepared and, after measurement of their parameters, one pellet sample was irradiated with 15 MeV protons using the Karaj cyclotron facility. The sample's parameters were measured again after the irradiation treatment. X-ray diffraction patterns and SEM images of the sample before and after the irradiation treatment were studied.
Grand unification and proton stability based on a chiral SU(8) theory
Energy Technology Data Exchange (ETDEWEB)
Deshpande, N.G.; Mannheim, P.D.
1980-06-01
A grand-unified model of the strong, electromagnetic, and weak interactions is presented based on a local SU(8)_L × SU(8)_R gauge theory that possesses a global U(8)_L × U(8)_R invariance. The model is spontaneously broken by the recently introduced neutrino pairing mechanism, in which a Higgs field that transforms like a pair of right-handed neutrinos acquires a vacuum expectation value. This neutrino pairing breaks the model down to the standard Weinberg-Salam phenomenology. Further, the neutrino pairing causes the two initial global currents of the model, fermion number and axial fermion number, to mix with the non-Abelian local currents to leave unbroken two new global currents, namely, baryon number and a particular lepton number that counts charged leptons and left-handed neutrinos only. The exact conservations of these two resulting currents ensure the absolute stability of the proton, the masslessness of the observed left-handed neutrinos, and the standard lepton number conservation of the usual weak interactions. A further feature of the model is the simultaneous absence of both strong CP violations and of observable axions. The model has a testable prediction, namely, the existence of an absolutely stable, relatively light, massive neutral lepton generated entirely from the right-handed neutrino sector of the theory. 1 table.
Malhado, João Pedro; Hynes, James T.
2012-12-01
The topographical character of conical intersections (CIs)—either sloped or peaked—has played a fundamental and important role in the discussion of the efficiency of CIs as photochemical "funnels." Here this perspective is employed in connection with a recent study of a model protonated Schiff base (PSB) cis to trans photoisomerization in solution [Malhado et al., J. Phys. Chem. A 115, 3720 (2011), 10.1021/jp106096m]. In that study, the calculated reduced photochemical quantum yield for the successful production of trans product versus cis reactant in acetonitrile solvent compared to water was interpreted in terms of a dynamical solvent effect related to the dominance, for the acetonitrile case, of S1 to S0 nonadiabatic transitions prior to reaching the seam of CIs. The solvent influence on the quantum yield is here re-examined in the sloped/peaked CI topographical perspective via conversion of the model's two PSB internal coordinates and a nonequilibrium solvent coordinate into an effective branching space description, which is then used to re-analyze the generalized Langevin equation/surface hopping results. The present study supports the original interpretation and enriches it in terms of topographical detail.
Energy Technology Data Exchange (ETDEWEB)
Pardoe, H [School of Physics, University of Western Australia, Crawley, Perth, WA 6009 (Australia); Chua-anusorn, W [School of Physics, University of Western Australia, Crawley, Perth, WA 6009 (Australia); Pierre, T G St [School of Physics, University of Western Australia, Crawley, Perth, WA 6009 (Australia); Dobson, J [Department of Biomedical Engineering and Medical Physics, Centre for Science and Technology in Medicine, Keele University, Thornburrow Drive, Hartshill, Stoke-on-Trent, ST4 7QB (United Kingdom)
2003-03-21
A clinical magnetic resonance imaging (MRI) system was used to measure proton transverse relaxation rates (R{sub 2}) in agar gels with varying concentrations of ferrimagnetic iron oxide nanoparticles in a field strength of 1.5 T. The nanoparticles were prepared by coprecipitation of ferric and ferrous ions in the presence of either dextran or polyvinyl alcohol. The method of preparation resulted in loosely packed clusters (dextran) or branched chains (polyvinyl alcohol) of particles containing of the order of 600 and 400 particles, respectively. For both methods of particle preparation, concentrations of ferrimagnetic iron in agar gel less than 0.01 mg ml{sup -1} had no measurable effect on the value of R{sub 2} for the gel. The results indicate that MRI-based R{sub 2} measurements using 1.5 T clinical scanners are not quite sensitive enough to detect the very low concentrations of nanoparticulate biogenic magnetite reported in human brain tissue. (note)
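The R{sub 2} values reported above come from the decay of the MRI signal with echo time, S(TE) = S0·exp(-R2·TE). As a minimal, purely illustrative sketch of such a rate estimate (a log-linear fit on synthetic data with an assumed true rate of 20 s⁻¹; the numbers are not from the study):

```python
import numpy as np

rng = np.random.default_rng(8)

def fit_r2(te, signal):
    """Estimate the transverse relaxation rate R2 from a mono-exponential
    echo decay S(TE) = S0 * exp(-R2 * TE) via a log-linear least-squares fit."""
    slope, intercept = np.polyfit(te, np.log(signal), 1)
    return -slope, np.exp(intercept)

te = np.linspace(0.01, 0.2, 16)                      # echo times, s (hypothetical)
signal = 100.0 * np.exp(-20.0 * te)                  # assumed true R2 = 20 s^-1
signal *= np.exp(0.005 * rng.normal(size=te.size))   # small multiplicative noise
r2, s0 = fit_r2(te, signal)
```

In practice a nonlinear fit with a noise-floor term is preferred at low signal, but the log-linear form shows the estimator's structure.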
Javed, Kamran; Gouriveau, Rafael; Zerhouni, Noureddine; Hissel, Daniel
2016-08-01
Proton Exchange Membrane Fuel Cells (PEMFC) are considered the most versatile among available fuel cell technologies, qualifying for diverse applications. However, large-scale industrial deployment of PEMFCs is limited by their short life span and high exploitation costs. Ensuring fuel cell service for a long duration is therefore of vital importance, which has led to Prognostics and Health Management of fuel cells. More precisely, prognostics of PEMFCs is a major area of focus nowadays, which aims at identifying degradation of a PEMFC stack at early stages and estimating its Remaining Useful Life (RUL) for life cycle management. This paper presents a data-driven approach for prognostics of a PEMFC stack using an ensemble of constraint-based Summation Wavelet-Extreme Learning Machine (SW-ELM) models. This development aims at improving the robustness and applicability of PEMFC prognostics for an online application with limited learning data. The proposed approach is applied to real data from two different PEMFC stacks and compared with ensembles of well-known connectionist algorithms. The comparison of results on long-term prognostics of both PEMFC stacks validates our proposition.
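The SW-ELM base learner referenced above is a wavelet-enhanced variant of the Extreme Learning Machine. As a rough illustration of the underlying ELM idea only (random fixed hidden layer, analytic least-squares output weights; the wavelet activations, constraints, and ensembling of the paper are omitted, and the data below are a toy degradation-like trend, not fuel cell data):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=20):
    """Basic Extreme Learning Machine: input weights and biases are random
    and fixed; only the output weights are learned, in closed form."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                         # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # analytic output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# toy trend standing in for a slowly degrading stack voltage
t = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
v = 3.3 - 0.4 * t.ravel() + 0.01 * rng.normal(size=200)
model = elm_fit(t, v)
rmse = float(np.sqrt(np.mean((elm_predict(model, t) - v) ** 2)))
```

An ensemble, as used in the paper, would train several such models (different random hidden layers) and aggregate their predictions.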
Energy Technology Data Exchange (ETDEWEB)
Yu, Xudong, E-mail: 081022009@fudan.edu.cn [College of Chemistry and Material Sciences, Hebei Normal University, Yuhua Road 113, Shijiazhuang 050024 (China); College of Science and Hebei Research Center of Pharmaceutical and Chemical Engineering, Hebei University of Science and Technology, Yuhua Road 70, Shijiazhuang 050080 (China); Zhang, Ping [College of Chemistry and Material Sciences, Hebei Normal University, Yuhua Road 113, Shijiazhuang 050024 (China); Li, Yajuan; Zhen, Xiaoli; Geng, Lijun; Wang, Yanqiu [College of Science and Hebei Research Center of Pharmaceutical and Chemical Engineering, Hebei University of Science and Technology, Yuhua Road 70, Shijiazhuang 050080 (China); Ma, Zichuan, E-mail: ma7405@hebtu.edu.cn [College of Chemistry and Material Sciences, Hebei Normal University, Yuhua Road 113, Shijiazhuang 050024 (China)
2014-07-01
In this paper, a new phenol-based chemosensor L2, comprising a Schiff base and azo groups, was rationally designed and synthesized. It could selectively recognize the fluoride anion among tested anions such as F{sup −}, AcO{sup −}, H{sub 2}PO{sub 4}{sup −}, Cl{sup −}, Br{sup −}, and I{sup −}, with obvious color changes from yellow to fuchsia. The intramolecular proton transfer (PT) in L1 and L2 was responsible for the sensing ability, which was certified by {sup 1}H NMR and UV–vis experiments. - Highlights: • The phenol derivative L2 could selectively sense F{sup −} among tested anions. • Intramolecular proton transfer occurred when L2 was bonded with F{sup −}. • It is the first antipyrine-based anion receptor.
Shafiei, M.; Gharari, S.; Pande, S.; Bhulai, S.
2014-01-01
Posterior sampling methods are increasingly being used to describe parameter and model predictive uncertainty in hydrologic modelling. This paper proposes an alternative to random walk chains (such as DREAM-zs). We propose a sampler based on independence chains with an embedded feature of standardiz
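An independence chain differs from a random-walk chain in that candidates are drawn from a fixed proposal distribution regardless of the current state, with the proposal density entering the Metropolis-Hastings acceptance ratio. A minimal sketch on a toy 1-D Gaussian target (purely illustrative; not the hydrologic posterior, and without the standardization feature the abstract mentions):

```python
import numpy as np

rng = np.random.default_rng(1)

def independence_sampler(log_target, proposal_rvs, proposal_logpdf, n, x0):
    """Metropolis-Hastings with an independence proposal: candidates come
    from a fixed distribution, not from a walk around the current point."""
    x, lp = x0, log_target(x0)
    chain = []
    for _ in range(n):
        y = proposal_rvs()
        lq = log_target(y)
        # acceptance ratio includes the proposal density at both points
        log_a = (lq - lp) + (proposal_logpdf(x) - proposal_logpdf(y))
        if np.log(rng.uniform()) < log_a:
            x, lp = y, lq
        chain.append(x)
    return np.array(chain)

# target: N(2, 0.5^2); fixed broad proposal: N(0, 3^2)
log_target = lambda x: -0.5 * ((x - 2.0) / 0.5) ** 2
prop_rvs = lambda: rng.normal(0.0, 3.0)
prop_logpdf = lambda x: -0.5 * (x / 3.0) ** 2
chain = independence_sampler(log_target, prop_rvs, prop_logpdf, 20000, 0.0)
mean = chain[5000:].mean()
```

The proposal here dominates the target's tails, which is the condition for the chain to mix well.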
Proton-air and proton-proton cross sections
Ulrich Ralf
2013-01-01
Different attempts to measure hadronic cross sections with cosmic ray data are reviewed. The major results are compared to each other and the differences in the corresponding analyses are discussed. Besides some important differences, it is crucial to see that all analyses are based on the same fundamental relation of longitudinal air shower development to the observed fluctuation of experimental observables. Furthermore, the relation of the measured proton-air to the more fundamental proton-...
Energy Technology Data Exchange (ETDEWEB)
Moore, Stephen C. [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States)]. E-mail: scmoore@bwh.harvard.edu; Ouyang, Jinsong [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States); Park, Mi-Ae [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States); El Fakhri, Georges [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States)
2006-12-20
We have incorporated Monte Carlo (MC)-based estimates of patient scatter, detector scatter, and crosstalk into an iterative reconstruction algorithm, and compared its performance to that of a general spectral (GS) approach. We extended the MC-based reconstruction algorithm of de Jong et al. by (1) using the 'delta scattering' method to determine photon interaction points, (2) simulating scatter maps for many energy bins simultaneously, and (3) decoupling the simulation of the object and detector by using pre-stored point spread functions (PSFs) that included all collimator and detector effects. A numerical phantom was derived from a segmented CT scan of a torso phantom. The relative values of In-111 activity concentration simulated in soft tissue, liver, spine, left lung, right lung, and five spherical tumors (1.3-2.0 cm diam.) were 1.0, 1.5, 1.5, 0.3, 0.5, and 10.0, respectively. GS scatter projections were incorporated additively in an OSEM reconstruction (6 subsets × 10 projections × 2 photopeak windows). After three iterations, GS scatter projections were replaced by MC-estimated scatter projections for two additional iterations. MC-based compensation was quantitatively compared to GS-based compensation after five iterations. The bias of organ activity estimates ranged from -13% to -6.5% (GS), and from -1.4% to +5.0% (MC); tumor bias ranged from -20.0% to +10.0% for GS (mean ± std. dev. = -4.3 ± 11.9%), and from -2.2% to +18.8% for MC (+4.1 ± 8.6%). Image noise in all organs was less with MC than with GS.
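The 'delta scattering' (also called Woodcock or delta tracking) technique mentioned in step (1) samples interaction points in a heterogeneous medium by flying photons with a single majorant attenuation coefficient and rejecting the fictitious collisions. A 1-D toy sketch (illustrative coefficients, not the phantom of the study):

```python
import numpy as np

rng = np.random.default_rng(2)

def woodcock_free_path(mu_of_x, mu_max, x0=0.0):
    """Sample a photon interaction point in a heterogeneous 1-D medium by
    delta (Woodcock) tracking: step with the majorant coefficient mu_max,
    then accept the collision with probability mu(x)/mu_max; otherwise it
    was a fictitious 'delta' collision and the flight continues."""
    x = x0
    while True:
        x += -np.log(rng.uniform()) / mu_max    # exponential step, majorant
        if rng.uniform() < mu_of_x(x) / mu_max:
            return x                            # real interaction point

# two-region slab: mu = 0.2 /cm for x < 5 cm, mu = 1.0 /cm beyond
mu = lambda x: 0.2 if x < 5.0 else 1.0
depths = np.array([woodcock_free_path(mu, mu_max=1.0) for _ in range(20000)])
mean_depth = float(depths.mean())
```

The appeal in voxelized phantoms is that no boundary crossings need to be computed; only a point lookup of mu(x) is required per tentative collision.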
Ploykrachang, K.; Hasegawa, J.; Kondo, K.; Fukuda, H.; Oguri, Y.
2014-07-01
We have developed a micro-XRF system based on a proton-induced quasimonochromatic X-ray (QMXR) microbeam for in vivo measurement of biological samples. A 2.5-MeV proton beam impinged normally on a Cu foil target that was slightly thicker than the proton range. The emitted QMXR behind the Cu target was focused with a polycapillary X-ray half lens. For application to analysis of wet or aquatic samples, we prepared a QMXR beam with an incident angle of 45° with respect to the horizontal plane by using a dipole magnet in order to bend the primary proton beam downward by 45°. The focal spot size of the QMXR microbeam on a horizontal sample surface was evaluated to be 250 × 350 μm by a wire scanning method. A microscope camera with a long working distance was installed perpendicular to the sample surface to identify the analyzed position on the sample. The fluorescent radiation from the sample was collected by a Si-PIN photodiode X-ray detector. Using the setup above, we were able to successfully measure the accumulation and distribution of Co in the leaves of a free-floating aquatic plant on a dilute Co solution surface.
Energy Technology Data Exchange (ETDEWEB)
Ploykrachang, K., E-mail: ploykrachang.k.aa@m.titech.ac.jp [Research Laboratory for Nuclear Reactors, Tokyo Institute of Technology, 2-12-1 Ookayama, Meguro-ku, Tokyo 152-8550 (Japan); Hasegawa, J. [Department of Energy Sciences, Interdisciplinary Graduate School of Science and Engineering, Tokyo Institute of Technology, 4259 Nagatsuta-cho, Midori-ku, Yokohama 226-8502 (Japan); Kondo, K.; Fukuda, H.; Oguri, Y. [Research Laboratory for Nuclear Reactors, Tokyo Institute of Technology, 2-12-1 Ookayama, Meguro-ku, Tokyo 152-8550 (Japan)
2014-07-15
We have developed a micro-XRF system based on a proton-induced quasimonochromatic X-ray (QMXR) microbeam for in vivo measurement of biological samples. A 2.5-MeV proton beam impinged normally on a Cu foil target that was slightly thicker than the proton range. The emitted QMXR behind the Cu target was focused with a polycapillary X-ray half lens. For application to analysis of wet or aquatic samples, we prepared a QMXR beam with an incident angle of 45° with respect to the horizontal plane by using a dipole magnet in order to bend the primary proton beam downward by 45°. The focal spot size of the QMXR microbeam on a horizontal sample surface was evaluated to be 250 × 350 μm by a wire scanning method. A microscope camera with a long working distance was installed perpendicular to the sample surface to identify the analyzed position on the sample. The fluorescent radiation from the sample was collected by a Si-PIN photodiode X-ray detector. Using the setup above, we were able to successfully measure the accumulation and distribution of Co in the leaves of a free-floating aquatic plant on a dilute Co solution surface.
Energy Technology Data Exchange (ETDEWEB)
Kieseler, Jan
2015-12-15
In this thesis, measurements of the production cross sections for top-quark pairs and the determination of the top-quark mass are presented. Dileptonic decays of top-quark pairs (t anti t) with two oppositely charged lepton candidates (an electron and a muon) in the final state are considered. The data samples studied were collected in proton-proton collisions at the CERN Large Hadron Collider with the CMS detector and correspond to integrated luminosities of 5.0 fb{sup -1} and 19.7 fb{sup -1} at center-of-mass energies of √(s) = 7 TeV and √(s) = 8 TeV, respectively. The cross sections, σ{sub t} {sub anti} {sub t}, are measured in the fiducial detector volume (visible phase space), defined by the kinematics of the top-quark decay products, and are extrapolated to the full phase space. The visible cross sections are extracted in a simultaneous binned-likelihood fit to multi-differential distributions of final-state observables, categorized according to the multiplicity of jets associated to b quarks (b jets) and other jets in each event. The fit is performed with emphasis on a consistent treatment of correlations between systematic uncertainties and taking into account features of the t anti t event topology. By comparison with predictions from the Standard Model at next-to-next-to leading order (NNLO) accuracy, the top-quark pole mass, m{sub t}{sup pole}, is extracted from the measured cross sections for different state-of-the-art PDF sets. Furthermore, the top-quark mass parameter used in Monte-Carlo simulations, m{sub t}{sup MC}, is determined using the distribution of the invariant mass of a lepton candidate and the leading b jet in the event, m{sub lb}. Being defined by the kinematics of the top-quark decay, this observable is unaffected by the description of the top-quark production mechanism. Events are selected from the data collected at √(s) = 8 TeV that contain at least two jets and one b jet in addition to the lepton candidate pair. A novel technique is
Meesters, Christian; Pairet, Bruno; Rabenhorst, Anja; Decker, Heinz; Jaenicke, Elmar
2010-06-01
We present a modular, collaborative, open-source architecture for rigid body modelling based upon small-angle scattering data, named sas_rigid. It is designed to provide a fast and extensible scripting interface using the easy-to-learn Python programming language. Features include rigid body modelling that yields static structures and three-dimensional probability densities, using two different algorithms. PMID:20598639
Investigation of the CRT performance of a PET scanner based in liquid xenon: A Monte Carlo study
Gomez-Cadenas, J J; Ferrario, P; Monrabal, F; Rodríguez, J; Toledo, J F
2016-01-01
The measurement of the time of flight of the two 511 keV gammas recorded in coincidence in a PET scanner provides an effective way of reducing the random background and therefore increases the scanner sensitivity, provided that the coincidence resolving time (CRT) of the gammas is sufficiently good. Existing commercial systems based on LYSO crystals, such as the GEMINIS of Philips, reach CRT values of ~600 ps (FWHM). In this paper we present a Monte Carlo investigation of the CRT performance of a PET scanner exploiting the scintillating properties of liquid xenon. We find that an excellent CRT of 60-70 ps (depending on the PDE of the sensor) can be obtained if the scanner is instrumented with silicon photomultipliers (SiPMs) sensitive to the ultraviolet light emitted by xenon. Alternatively, a CRT of 120 ps can be obtained by instrumenting the scanner with (much cheaper) blue-sensitive SiPMs coated with a suitable wavelength shifter. These results show the excellent time of flight capabilities of a PET device b...
Energy Technology Data Exchange (ETDEWEB)
Brunetti, Antonio; Golosio, Bruno [Universita degli Studi di Sassari, Dipartimento di Scienze Politiche, Scienze della Comunicazione e Ingegneria dell' Informazione, Sassari (Italy); Melis, Maria Grazia [Universita degli Studi di Sassari, Dipartimento di Storia, Scienze dell' Uomo e della Formazione, Sassari (Italy); Mura, Stefania [Universita degli Studi di Sassari, Dipartimento di Agraria e Nucleo di Ricerca sulla Desertificazione, Sassari (Italy)
2014-11-08
X-ray fluorescence (XRF) is a well-known nondestructive technique. It is also applied to multilayer characterization, owing to its ability to estimate both the composition and the thickness of the layers. Several kinds of cultural heritage samples can be considered complex multilayers, such as paintings, decorated objects or some types of metallic samples. Furthermore, they often have rough surfaces, and this makes a precise determination of the structure and composition harder. The standard quantitative XRF approach does not take this aspect into account. In this paper, we propose a novel approach based on the combined use of X-ray measurements performed with a polychromatic beam and Monte Carlo simulations. All the information contained in an X-ray spectrum is used. This approach yields a very good estimate of the sample contents, both in terms of chemical elements and material thickness, and in this sense represents an improvement of the possibilities of XRF measurements. Some examples are examined and discussed. (orig.)
Joshi, Kaushik; Chaudhuri, Santanu
2016-10-01
The ability to accelerate the morphological evolution of nanoscale precipitates is a fundamental challenge for atomistic simulations. Kinetic Monte Carlo (KMC) methodology is an effective approach for accelerating the evolution of nanoscale systems that are dominated by so-called rare events. The quality and accuracy of the energy landscape used in KMC calculations can be significantly improved using DFT-informed interatomic potentials. Using a newly developed computational framework that embeds the molecular simulator LAMMPS as a library inside the KMC solver SPPARKS, we investigated the formation and growth of Guinier–Preston (GP) zones in dilute Al–Cu alloys at different temperatures and copper concentrations. The KMC simulations with the angular-dependent potential (ADP) predict the formation of coherent disc-shaped monolayers of copper atoms (GPI zones) in the early stage. Such monolayers are then gradually transformed into the energetically favored GPII phase, which has two aluminum layers sandwiched between copper layers. We analyzed the growth kinetics of the KMC trajectory using Johnson–Mehl–Avrami (JMA) theory and obtained a phase transformation index close to 1.0. In the presence of grain boundaries, the KMC calculations predict the segregation of copper atoms near the grain boundaries instead of the formation of GP zones. The computational framework presented in this work is based on open-source potentials and an MD simulator, and can predict morphological changes during the evolution of the alloys in the bulk and around grain boundaries.
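The core update in such rare-event KMC codes is the rejection-free (BKL, or n-fold way) step: pick an event with probability proportional to its rate and advance the clock by an exponentially distributed residence time. A toy two-event sketch (illustrative rates; not the ADP energy landscape or the SPPARKS event catalog of the study):

```python
import numpy as np

rng = np.random.default_rng(3)

def kmc_step(rates):
    """One rejection-free kinetic Monte Carlo step: choose an event with
    probability rate_i / sum(rates), then draw the waiting time from an
    exponential distribution with the total rate."""
    total = rates.sum()
    event = np.searchsorted(np.cumsum(rates), rng.uniform() * total)
    dt = -np.log(rng.uniform()) / total
    return event, dt

# toy catalog: a fast hop and a slow hop (rates in s^-1, hypothetical)
rates = np.array([10.0, 1.0])
events, t = [], 0.0
for _ in range(50000):
    e, dt = kmc_step(rates)
    events.append(e)
    t += dt
frac_fast = events.count(0) / len(events)
```

For the growth-kinetics analysis, the JMA fit mentioned in the abstract has the form X(t) = 1 - exp(-k tⁿ), where n is the transformation index reported as close to 1.0.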
Xiong, Chuan; Shi, Jiancheng
2014-01-01
To date, light scattering models of snow have taken little account of real snow microstructure. The assumption of ideal spherical or other single-shaped particles in previous snow light scattering models can cause errors in light scattering modeling of snow, and further cause errors in remote sensing inversion algorithms. This paper builds a polarized snow reflectance model based on a bicontinuous medium, in which the real snow microstructure is considered. The specific surface area of the bicontinuous medium can be derived analytically. The polarized Monte Carlo ray tracing technique is applied to the computer-generated bicontinuous medium. With proper algorithms, the snow surface albedo, bidirectional reflectance distribution function (BRDF) and polarized BRDF can be simulated. Validation of the model-predicted spectral albedo and bidirectional reflectance factor (BRF) against experimental data shows good results. The relationship between snow surface albedo and snow specific surface area (SSA) was predicted, and this relationship can be used for future improvement of SSA inversion algorithms. The model-predicted polarized reflectance is also validated and proved accurate, and can be further applied in polarized remote sensing.
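The Monte Carlo ray tracing idea can be illustrated in a drastically simplified scalar form: estimate the albedo of a semi-infinite, homogeneous, isotropically scattering half-space as the fraction of photons that escape back through the surface. This ignores polarization, anisotropic phase functions, and the bicontinuous geometry of the model above; it is a sketch of the sampling scheme only:

```python
import numpy as np

rng = np.random.default_rng(9)

def mc_albedo(omega, n_photons=20000):
    """Monte Carlo albedo of a semi-infinite isotropically scattering
    half-space; omega is the single-scattering albedo."""
    reflected = 0
    for _ in range(n_photons):
        tau, mu = 0.0, 1.0                          # optical depth; heading inward
        while True:
            tau += mu * (-np.log(rng.uniform()))    # exponential free flight
            if tau < 0.0:
                reflected += 1                      # escaped through the surface
                break
            if rng.uniform() > omega:
                break                               # absorbed
            mu = rng.uniform(-1.0, 1.0)             # isotropic re-scatter (cosine)
    return reflected / n_photons

bright = mc_albedo(0.95)   # weakly absorbing, snow-like medium
dark = mc_albedo(0.50)     # strongly absorbing medium
```

The monotone dependence of albedo on the single-scattering albedo is the scalar analogue of the albedo-SSA relationship discussed in the abstract.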
International Nuclear Information System (INIS)
A univel geometry, neutral particle Monte Carlo transport code, written entirely in the Java programming language, is under development for medical radiotherapy applications. The code uses ENDF-VI based continuous energy cross section data in a flexible XML format. Full neutron-photon coupling, including detailed photon production and photonuclear reactions, is included. Charged particle equilibrium is assumed within the patient model so that detailed transport of electrons produced by photon interactions may be neglected. External beam and internal distributed source descriptions for mixed neutron-photon sources are allowed. Flux and dose tallies are performed on a univel basis. A four-tap, shift-register-sequence random number generator is used. Initial verification and validation testing of the basic neutron transport routines is underway. The searchlight problem was chosen as a suitable first application because of the simplicity of the physical model. Results show excellent agreement with analytic solutions. Computation times for similar numbers of histories are comparable to other neutron MC codes written in C and FORTRAN
Scalability tests of R-GMA based Grid job monitoring system for CMS Monte Carlo data production
Bonacorsi, D; Field, L; Fisher, S; Grandi, C; Hobson, P R; Kyberd, P; MacEvoy, B; Nebrensky, J J; Tallini, H; Traylen, S
2004-01-01
High Energy Physics experiments such as CMS (Compact Muon Solenoid) at the Large Hadron Collider have unprecedented, large-scale data processing computing requirements, with data accumulating at around 1 Gbyte/s. The Grid distributed computing paradigm has been chosen as the solution to provide the requisite computing power. The demanding nature of CMS software and computing requirements, such as the production of large quantities of Monte Carlo simulated data, makes them an ideal test case for the Grid and a major driver for the development of Grid technologies. One important challenge when using the Grid for large-scale data analysis is the ability to monitor the large numbers of jobs that are being executed simultaneously at multiple remote sites. R-GMA is a monitoring and information management service for distributed resources based on the Grid Monitoring Architecture of the Global Grid Forum. In this paper we report on the first measurements of R-GMA as part of a monitoring architecture to be used for b...
Wierling, Christoph; Kühn, Alexander; Hache, Hendrik; Daskalaki, Andriani; Maschke-Dutz, Elisabeth; Peycheva, Svetlana; Li, Jian; Herwig, Ralf; Lehrach, Hans
2012-08-15
Cancer is known to be a complex disease and its therapy is difficult. Much information is available on molecules and pathways involved in cancer onset and progression and this data provides a valuable resource for the development of predictive computer models that can help to identify new potential drug targets or to improve therapies. Modeling cancer treatment has to take into account many cellular pathways usually leading to the construction of large mathematical models. The development of such models is complicated by the fact that relevant parameters are either completely unknown, or can at best be measured under highly artificial conditions. Here we propose an approach for constructing predictive models of such complex biological networks in the absence of accurate knowledge on parameter values, and apply this strategy to predict the effects of perturbations induced by anti-cancer drug target inhibitions on an epidermal growth factor (EGF) signaling network. The strategy is based on a Monte Carlo approach, in which the kinetic parameters are repeatedly sampled from specific probability distributions and used for multiple parallel simulations. Simulation results from different forms of the model (e.g., a model that expresses a certain mutation or mutation pattern or the treatment by a certain drug or drug combination) can be compared with the unperturbed control model and used for the prediction of the perturbation effects. This framework opens the way to experiment with complex biological networks in the computer, likely to save costs in drug development and to improve patient therapy.
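The sampling strategy described, repeatedly drawing kinetic parameters from probability distributions and running perturbed and control models in parallel, can be sketched with a one-reaction toy model. The log-normal prior, rate values, and "inhibition halves the rate" assumption below are all hypothetical stand-ins; the real EGF network is far larger:

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate(k, x0=1.0, t_end=5.0, dt=0.01):
    """Euler integration of a toy one-step decay x' = -k*x, standing in
    for a single reaction of a larger kinetic network."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (-k * x)
    return x

# Monte Carlo over the uncertain rate constant: sample k from an assumed
# log-normal prior and run many parallel simulations
ks = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=2000)
outcomes = np.array([simulate(k) for k in ks])
# compare a 'drug-inhibited' variant (rate halved) against the control
inhibited = np.array([simulate(k / 2) for k in ks])
effect = float(np.median(inhibited - outcomes))
```

Comparing the distributions of perturbed and control outcomes, rather than single runs, is what makes the prediction robust to the unknown parameters.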
Study on the Uncertainty of the Available Time Under Ship Fire Based on Monte Carlo Sampling Method
Institute of Scientific and Technical Information of China (English)
WANG Jin-hui; CHU Guan-quan; LI Kai-yuan
2013-01-01
Available safety egress time under ship fire (SFAT) is critical to ship fire safety assessment, design and emergency rescue. Although SFAT can be determined using fire models such as the two-zone fire model CFAST and the field model FDS, none of these models can address the uncertainties involved in the input parameters. To solve this problem, the current study presents a framework of uncertainty analysis for SFAT. Firstly, a deterministic model estimating SFAT is built. The uncertainties of the input parameters are regarded as random variables with given probability distribution functions. Subsequently, the deterministic SFAT model is coupled with a Monte Carlo sampling method to investigate the uncertainties of the SFAT. The Spearman rank-order correlation coefficient (SRCC) is used to examine the sensitivity of each uncertain input parameter on SFAT. To illustrate the proposed approach in detail, a case study is performed. Based on the proposed approach, the probability density function and cumulative density function of SFAT are obtained. Furthermore, sensitivity analysis with regard to SFAT is also conducted. The results give a high negative correlation between SFAT and the fire growth coefficient, whereas the effect of other parameters is so weak that they can be neglected.
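The Spearman rank-order correlation coefficient used for the sensitivity ranking is simply the Pearson correlation of the ranks of the two variables. A self-contained sketch on synthetic inputs (the inverse-square-root dependence of SFAT on the fire growth coefficient below is a caricature for illustration, not taken from the study; tie handling is omitted):

```python
import numpy as np

rng = np.random.default_rng(5)

def spearman(x, y):
    """Spearman rank-order correlation: Pearson correlation of the ranks
    (valid for tie-free samples; average ranks would be needed for ties)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# toy sensitivity study: SFAT shortens as the fire growth coefficient
# alpha grows; a second input is irrelevant noise
alpha = rng.uniform(0.01, 0.19, size=1000)        # fire growth coefficient
other = rng.uniform(0.0, 1.0, size=1000)          # weakly relevant input
sfat = 300.0 / np.sqrt(alpha) + 2.0 * rng.normal(size=1000)
rho_alpha = spearman(alpha, sfat)
rho_other = spearman(other, sfat)
```

A strongly negative rho for alpha and a near-zero rho for the other input reproduce the qualitative ranking reported in the abstract.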
Energy Technology Data Exchange (ETDEWEB)
Piccinini, M., E-mail: massimo.piccinini@enea.it; Ambrosini, F.; Ampollini, A.; Carpanese, M.; Picardi, L.; Ronsivalle, C.; Bonfigli, F.; Libera, S.; Vincenti, M.A.; Montereali, R.M.
2014-12-15
Proton beams of 3 and 7 MeV energies, produced by a linear accelerator, were used to irradiate lithium fluoride crystals and thermally evaporated LiF thin films in the fluence range of 10{sup 11}–10{sup 15} protons/cm{sup 2}. The irradiation induces the formation of stable colour centres, mainly the primary F centre and the aggregate F{sub 2} and F{sub 3}{sup +} defects. By optical pumping in the blue spectral region, the F{sub 2} and F{sub 3}{sup +} centres emit broad photoluminescence bands in the visible spectral range. By conventional fluorescence microscopy, the integrated photoluminescence intensity was carefully measured in LiF crystals and thin films as a function of the irradiation fluence: a linear optical response was obtained over a large range of fluence, which depends on the LiF samples used and the selected beam energy. It was possible to record the transverse proton beam intensity profile by acquiring the photoluminescence image of the irradiated spots on LiF films with a standard optical microscope. Using LiF films grown on silicon substrates irradiated in a particular geometry, the same optical reading microscopy technique allowed the distribution of colour-centre photoluminescence to be measured along the depth and the Bragg peak position to be imaged directly, which gives a rough estimate of the initial proton beam energy. - Highlights: • Photoluminescence of colour centres in LiF can be used for proton beam imaging. • Photoluminescence is linear over several orders of magnitude of H{sup +} fluence range. • Photoluminescence behaviour in crystals and thin films at two energies is discussed. • LiF thin films can directly image the Bragg peak to estimate proton beam energy. • LiF crystals and thin films are promising for proton dosimetry by photoluminescence.
Pattern-oriented Agent-based Monte Carlo simulation of Cellular Redox Environment
DEFF Research Database (Denmark)
Tang, Jiaowei; Holcombe, Mike; Boonen, Harrie C.M.
/CYSS) and mitochondrial redox couples. Evidence suggests that both intracellular and extracellular redox can affect overall cell redox state. How redox is communicated between extracellular and intracellular environments is still a matter of debate. Some researchers conclude based on experimental data...... will be the agents [7]. Additionally, the spatial distribution of enzymes and reactants, and diffusion of reactants will be considered as a contributing factor. To initially simplify the modeling, the redox change of intracellular compartments will be ignored or only the export and import of redox will be modeled...... for Autonomous Agents and Multiagent Systems: Toronto, Canada. p. 1633-1636....
Energy Technology Data Exchange (ETDEWEB)
Chung, Kwang Zoo; Han, Young Yih; Kim, Jin Sung [Dept. of Radiation Oncology, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of); and others
2015-12-15
The purpose of this report is to describe the proton therapy system at Samsung Medical Center (SMC-PTS) including the proton beam generator, irradiation system, patient positioning system, patient position verification system, respiratory gating system, and operating and safety control system, and review the current status of the SMC-PTS. The SMC-PTS has a cyclotron (230 MeV) and two treatment rooms: one treatment room is equipped with a multi-purpose nozzle and the other treatment room is equipped with a dedicated pencil beam scanning nozzle. The proton beam generator including the cyclotron and the energy selection system can lower the energy of protons down to 70 MeV from the maximum 230 MeV. The multi-purpose nozzle can deliver both wobbling proton beam and active scanning proton beam, and a multi-leaf collimator has been installed in the downstream of the nozzle. The dedicated scanning nozzle can deliver active scanning proton beam with a helium gas filled pipe minimizing unnecessary interactions with the air in the beam path. The equipment was provided by Sumitomo Heavy Industries Ltd., RayStation from RaySearch Laboratories AB is the selected treatment planning system, and data management will be handled by the MOSAIQ system from Elekta AB. The SMC-PTS located in Seoul, Korea, is scheduled to begin treating cancer patients in 2015.
Directory of Open Access Journals (Sweden)
S. Maiti
2011-03-01
The Koyna region is well known for its triggered seismic activity since the hazardous earthquake of M=6.3 that occurred around the Koyna reservoir on 10 December 1967. Understanding the shallow distribution of the resistivity pattern in such a seismically critical area is vital for mapping faults, fractures and lineaments. However, deducing the true resistivity distribution from apparent resistivity data lacks precise information due to intrinsic non-linearity in the data structures. Here we present a new technique based on Bayesian neural network (BNN) theory using the concept of Hybrid Monte Carlo (HMC)/Markov Chain Monte Carlo (MCMC) simulation. The new method is applied to invert one- and two-dimensional Direct Current (DC) vertical electrical sounding (VES) data acquired around the Koyna region in India. Prior to applying the method to actual resistivity data, it was tested on synthetic signals. In this approach the objective/cost function is optimized following the HMC/MCMC sampling-based algorithm, and each trajectory is updated by approximating the Hamiltonian differential equations through a leapfrog discretization scheme. The stability of the new inversion technique was tested in the presence of correlated red noise, and the uncertainty of the result was estimated using the BNN code. The estimated true resistivity distribution was compared with the results of singular value decomposition (SVD)-based conventional resistivity inversion. The HMC-based Bayesian Neural Network results are in good agreement with the existing model results; in some cases the method also provides more detailed and precise results, which appear to be justified by local geological and structural details. The new BNN approach based on HMC is faster and has proved to be a promising inversion scheme for interpreting complex and non-linear resistivity problems. The HMC-based BNN results
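The HMC update described above alternates momentum resampling with leapfrog integration of Hamiltonian dynamics, followed by a Metropolis accept/reject on the energy change. A minimal version on a 2-D Gaussian stand-in target (not the resistivity posterior or the BNN; step size and trajectory length are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

def hmc_sample(log_p, grad_log_p, x0, n, eps=0.1, n_leap=20):
    """Hybrid/Hamiltonian Monte Carlo: propose by leapfrog integration of
    Hamiltonian dynamics, then accept/reject on the energy difference."""
    x = np.asarray(x0, float)
    chain = []
    for _ in range(n):
        p = rng.normal(size=x.shape)                # resample momentum
        x_new, p_new = x.copy(), p.copy()
        p_new += 0.5 * eps * grad_log_p(x_new)      # initial half kick
        for _ in range(n_leap - 1):
            x_new += eps * p_new                    # drift
            p_new += eps * grad_log_p(x_new)        # full kick
        x_new += eps * p_new
        p_new += 0.5 * eps * grad_log_p(x_new)      # final half kick
        # Metropolis test on the change in total energy H = -log_p + K
        dH = (log_p(x_new) - 0.5 * p_new @ p_new) - (log_p(x) - 0.5 * p @ p)
        if np.log(rng.uniform()) < dH:
            x = x_new
        chain.append(x.copy())
    return np.array(chain)

# standard 2-D Gaussian as a stand-in for the posterior over model parameters
log_p = lambda x: -0.5 * x @ x
grad = lambda x: -x
chain = hmc_sample(log_p, grad, np.zeros(2), 3000)
```

The leapfrog scheme is the "approximation of the Hamiltonian differential equations" the abstract refers to; its time-reversibility and volume preservation are what make the Metropolis correction exact.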
International Nuclear Information System (INIS)
The purpose of this study was to examine the dose distribution of a skull base tumor and surrounding critical structures in response to high-dose intensity-modulated radiosurgery (IMRS) with Monte Carlo (MC) simulation using a dual-resolution sandwich phantom. The measurement-based Monte Carlo (MBMC) method (Lin et al., 2009) was adopted for the study. The major components of the MBMC technique are (1) the BEAMnrc code for beam transport through the treatment head of a Varian 21EX linear accelerator, (2) the DOSXYZnrc code for patient dose simulation and (3) an EPID-measured efficiency map that describes the non-uniform fluence distribution of the IMRS treatment beam. For the simulated case, five isocentric 6 MV photon beams were designed to deliver a total dose of 1200 cGy in two fractions to the skull base tumor. A sandwich phantom for the MBMC simulation was created based on the patient's CT scan of a skull base tumor [gross tumor volume (GTV) = 8.4 cm³] near the right 8th cranial nerve. The phantom consisted of a 1.2-cm-thick skull base region with a voxel resolution of 0.05×0.05×0.1 cm³, sandwiched between 0.05×0.05×0.3 cm³ slices of a head phantom. A coarser 0.2×0.2×0.3 cm³ single-resolution (SR) phantom was also created for comparison with the sandwich phantom. A particle history of 3×10⁸ for each beam was used in the simulations of both the SR and the sandwich phantoms to achieve a statistical uncertainty of <2%. Our study showed that the planning target volume (PTV) receiving at least 95% of the prescribed dose (VPTV95) was 96.9%, 96.7% and 99.9% for the TPS, SR and sandwich phantom, respectively. The maximum and mean doses to large organs such as the PTV, brain stem and parotid gland for the TPS, SR and sandwich MC simulations did not show any significant difference; however, significant dose differences were observed for very small structures like the right 8th cranial nerve, right cochlea, right malleus and right semicircular canal.
An automated Monte-Carlo based method for the calculation of cascade summing factors
Jackson, M. J.; Britton, R.; Davies, A. V.; McLarty, J. L.; Goodwin, M.
2016-10-01
A versatile method has been developed to calculate cascade summing factors for use in quantitative gamma-spectrometry analysis procedures. The proposed method is based solely on Evaluated Nuclear Structure Data File (ENSDF) nuclear data, an X-ray energy library, and accurate efficiency characterisations for single detector counting geometries. The algorithm, which accounts for γ-γ, γ-X, γ-511 and γ-e⁻ coincidences, can be applied to any design of gamma spectrometer and can be expanded to incorporate any number of nuclides. Efficiency characterisations can be derived from measured or mathematically modelled functions, and can accommodate both point and volumetric source types. The calculated results are shown to be consistent with an industry standard gamma-spectrometry software package. Additional benefits, including calculation of cascade summing factors for all gamma and X-ray emissions, not just the major emission lines, are also highlighted.
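A first-order summing-out correction of the kind such algorithms compute can be illustrated with a short sketch. This is our own simplification, assuming a single detector geometry and neglecting angular correlations and X-ray/annihilation contributions:

```python
def summing_out_factor(coincident, total_eff):
    """First-order cascade summing-out correction for one gamma line.

    coincident: list of (energy_keV, branching_prob) for gammas emitted
                in coincidence with the line of interest.
    total_eff:  callable mapping energy_keV -> total detection efficiency.
    Returns the factor by which the measured peak area is multiplied to
    recover the true (summing-free) count rate.
    """
    # Probability that at least one coincident gamma deposits any energy
    # in the detector, removing the event from the full-energy peak.
    loss = sum(p * total_eff(E) for E, p in coincident)
    return 1.0 / (1.0 - loss)
```

For example, a Co-60-like line in full coincidence with a single 1332.5 keV gamma seen with a total efficiency of 0.05 gives a correction factor of 1/0.95 ≈ 1.053 (the efficiency value here is illustrative, not from the paper).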
Free standing diamond-like carbon thin films by PLD for laser based electrons/protons acceleration
Energy Technology Data Exchange (ETDEWEB)
Thema, F.T.; Beukes, P.; Ngom, B.D. [UNESCO Africa Chair in Nanosciences-Nanotechnology, College of Graduate Studies, University of South Africa, Muckleneuk Ridge, PO Box 392, Pretoria (South Africa); Nanosciences African Network (NANOAFNET), iThemba LABS-National Research Foundation, 1 Old Faure Road, Somerset West, 7129, PO Box722, Western Cape Province (South Africa); Manikandan, E., E-mail: mani@tlabs.ac.za [UNESCO Africa Chair in Nanosciences-Nanotechnology, College of Graduate Studies, University of South Africa, Muckleneuk Ridge, PO Box 392, Pretoria (South Africa); Nanosciences African Network (NANOAFNET), iThemba LABS-National Research Foundation, 1 Old Faure Road, Somerset West, 7129, PO Box722, Western Cape Province (South Africa); Central Research Laboratory, Sree Balaji Medical College & Hospital (SBMCH), Chrompet, Bharath University, Chennai, 600044 (India); Maaza, M., E-mail: maaza@tlabs.ac.za [UNESCO Africa Chair in Nanosciences-Nanotechnology, College of Graduate Studies, University of South Africa, Muckleneuk Ridge, PO Box 392, Pretoria (South Africa); Nanosciences African Network (NANOAFNET), iThemba LABS-National Research Foundation, 1 Old Faure Road, Somerset West, 7129, PO Box722, Western Cape Province (South Africa)
2015-11-05
This study reports for the first time the synthesis and optical characteristics of free-standing diamond-like carbon (DLC) deposited by pulsed laser deposition (PLD) onto graphene buffer layers for ultrahigh-intensity laser-based electron/proton acceleration applications. Fingerprint micro-Raman, UV–VIS–NIR and IR spectroscopic investigations indicate the suitability of such free-standing DLC thin films within the laser window and long-wave infrared (LWIR) spectral range, and hence their appropriateness for the targeted applications. - Highlights: • We report for the first time the synthesis of free-standing diamond-like carbon. • Pulsed laser deposition onto graphene buffer layers. • Fingerprint micro-Raman, UV–VIS–NIR and IR spectroscopic investigations. • Ultrahigh-intensity laser-based electron/proton acceleration applications. • This material is suitable for the laser window and long-wave infrared (LWIR) spectral range.
Raby, Stuart
2002-01-01
We discuss the status of supersymmetric grand unified theories [SUSY GUTs] with regard to the observation of proton decay. In this talk we focus on SUSY GUTs in 4 dimensions. We outline the major theoretical uncertainties present in the calculation of the proton lifetime and then present our best estimate of an absolute upper bound on the predicted proton lifetime. Towards the end, we consider some new results in higher-dimensional GUTs and the ramifications for proton decay.
SU-E-J-148: Tools for Development of 4D Proton CT
International Nuclear Information System (INIS)
Purpose: To develop tools for performing 4D proton computed tomography (CT). Methods: A suitable patient with a tumor in the right lower lobe was selected from a set of 4D CT scans. The volumetric CT images formed the basis for calculating the parameters of a breathing model that allows reconstruction of a static reference CT and of CT images in each breathing phase. The images were imported into the TOPAS Monte Carlo simulation platform to simulate an experimental proton CT scan with 45 projections spaced at 4° intervals. Each projection acquired data for 2 seconds, followed by a gantry rotation for 2 seconds without acquisition. The scan covered 180°, with individual protons passing through a 9-cm slab of the patient's lung covering the moving tumor. An initial proton energy sufficient for penetrating the patient from all directions was determined. During the proton CT simulation, TOPAS provided output of the proton energy and coordinates registered in two planes before and after the patient, respectively. The set of projection data was then used with an iterative reconstruction algorithm to generate volumetric proton CT image sets of the static reference image and of the image obtained under breathing motion, respectively. Results: An initial proton energy of 230 MeV was found to be sufficient, while for an initial energy of 200 MeV a substantial number of protons did not penetrate the patient. The reconstruction of the static reference image set provided sufficient detail for treatment planning. Conclusion: We have developed tools to perform studies of proton CT in the presence of lung motion based on the TOPAS simulation toolkit. This will allow optimization of 4D reconstruction algorithms by synchronizing the acquired proton CT data with a breathing signal and utilizing a breathing model obtained prior to the proton CT scan. This research has been supported by the National Institute Of Biomedical Imaging And Bioengineering of the National
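The iterative reconstruction step can be illustrated with a minimal Kaczmarz-type ART sweep, a common choice for proton CT. This is a sketch under our own assumptions; the abstract does not specify which iterative algorithm was used:

```python
import numpy as np

def art_sweep(A, b, x, relax=0.5):
    """One ART (Kaczmarz) sweep over all proton histories.

    A: (n_protons, n_voxels) matrix of path lengths through each voxel.
    b: measured water-equivalent path length (WEPL) per proton.
    x: current estimate of relative stopping power per voxel.
    """
    for i in range(A.shape[0]):
        a = A[i]
        norm = a @ a
        if norm == 0:
            continue                       # proton missed the volume
        # Project x onto the hyperplane a.x = b[i], under-relaxed.
        x = x + relax * (b[i] - a @ x) / norm * a
    return x
```

Repeating the sweep drives the estimate toward consistency with all proton path integrals; for a consistent toy system the iterates converge to the exact voxel values.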
SU-E-J-148: Tools for Development of 4D Proton CT
Energy Technology Data Exchange (ETDEWEB)
Dou, T [University of California, Los Angeles, Los Angeles, CA (United States); Ramos-Mendez, J [University of California San Francisco, San Francisco, CA (United States); Piersimoni, P [Loma Linda University, Loma Linda, CA (United States); Giacometti, V [Center for Medical Radiation Physics, University of Wollongong, Sydney, NSW (Australia); Penfold, S [University of Adelaide, Adelaide, SA (Australia); Censor, Y [University of Haifa, Haifa (Israel); Faddegon, B [UC San Francisco, San Francisco, CA (United States); Low, D [Deparment of Radiation Oncology, University of California Los Angeles, Los Angeles, CA (United States); Schulte, R [Loma Linda Univ. Medical Ctr., Loma Linda, CA (United States)
2015-06-15
A magneto-optical scheme of a proton microscope based on the U-70 synchrotron radiograph complex
Maksimov, A. V.; Fedotov, Yu. S.
2016-07-01
A magneto-optical scheme of a proton microscope based on the active U-70 accelerator radiography complex is presented. The microscope is created within the magnetic structure of the radiography complex solely by changing the strengths of the quadrupole lenses, without changing their positions. The achieved magnification of the image of the studied object is no smaller than 10 at E = 60 GeV, which is sufficient for a spatial image resolution of 10-20 μm.
Skliarova, H.; Azzolini, O.; Dousset, O.; Johnson, R.R.; V. Palmieri
2013-01-01
Chemically inert coatings on the Havar entrance foils of targets for [18F] production via proton irradiation of enriched water under pressurized conditions are needed to decrease the amount of ionic contaminants released from the Havar. In order to find the most effective protective coatings, the microstructure and barrier properties of the Nb-based coatings have been correlated with deposition parameters such as substrate temperature, applied bias, deposition rate and sputtering gas pressure. Aluminated quartz...
Atriana Palma, Bianey; Ureba Sánchez, Ana; Salguero, Francisco Javier; Arráns, Rafael; Míguez Sánchez, Carlos; Walls Zurita, Amadeo; Romero Hermida, María Isabel; Leal, Antonio
2012-03-01
The purpose of this study was to present a Monte Carlo (MC)-based optimization procedure to improve conventional treatment plans for accelerated partial breast irradiation (APBI) using modulated electron beams, alone or combined with modulated photon beams, delivered by a single collimation device, i.e. a photon multi-leaf collimator (xMLC) already installed in a standard hospital. Five left-sided breast cases were retrospectively planned using modulated photon and/or electron beams with an in-house treatment planning system (TPS) called CARMEN, based on MC simulations. For comparison, the same cases were also planned with a PINNACLE TPS using conventional inverse intensity-modulated radiation therapy (IMRT). Normal tissue complication probabilities for pericarditis, pneumonitis and breast fibrosis were calculated. CARMEN plans showed acceptable planning target volume (PTV) coverage similar to that of conventional IMRT plans, with 90% of the PTV volume covered by the prescribed dose (Dp). The heart and ipsilateral lung volumes receiving 5% Dp and 15% Dp, respectively, were 3.2-3.6 times lower for CARMEN plans. The ipsilateral breast volume receiving 50% Dp and 100% Dp was on average 1.4-1.7 times lower for CARMEN plans. Skin and whole-body low-dose volumes were also reduced. Modulated photon and/or electron beams planned by the CARMEN TPS improve APBI treatments by increasing normal tissue sparing while maintaining the PTV coverage achieved by other techniques. The use of the xMLC, already installed in the linac, to collimate both photon and electron beams favors the clinical implementation of APBI with the highest efficiency.
Monte Carlo simulation of moderator and reflector in coal analyzer based on a D-T neutron generator.
Shan, Qing; Chu, Shengnan; Jia, Wenbao
2015-11-01
Coal is one of the most popular fuels in the world. The use of coal not only produces carbon dioxide, but also contributes to environmental pollution by heavy metals. In a prompt gamma-ray neutron activation analysis (PGNAA)-based coal analyzer, the characteristic gamma rays of C and O are mainly induced by fast neutrons, whereas thermal neutrons can be used to induce the characteristic gamma rays of H, Si and heavy metals. Therefore, an appropriate balance of thermal and fast neutrons improves the measurement accuracy for heavy metals while ensuring that the measurement accuracy for the main elements meets the requirements of the industry. Once the required yield of the deuterium-tritium (d-T) neutron generator is determined, appropriate thermal and fast neutron fluxes can be obtained by optimizing the neutron source term. In this article, the Monte Carlo N-Particle (MCNP) transport code and the Evaluated Nuclear Data File (ENDF) database are used to optimize the neutron source term of a PGNAA-based coal analyzer, including the material and shape of the moderator and neutron reflector. The optimization had two targets: (1) a thermal-to-fast neutron ratio of 1:1, and (2) an increase of the total neutron flux from the optimized neutron source in the sample of at least 100% compared with the initial one. The simulation results show that the total neutron flux in the sample increases by 102%, 102%, 85%, 72% and 62% with Pb, Bi, Nb, W and Be reflectors, respectively. The optimization targets are best met when the moderator is a 3-cm-thick lead layer coupled with a 3-cm-thick high-density polyethylene (HDPE) layer, and the neutron reflector is a 27-cm-thick hemispherical lead layer.
Li, Dong; Chen, Bin; Ran, Wei Yu; Wang, Guo Xiang; Wu, Wen Juan
2015-01-01
The voxel-based Monte Carlo method (VMC) is now a gold standard for simulating light propagation in turbid media. For complex tissue structures, however, the computational cost rises when small voxels are used to smooth tissue interfaces and a large number of photons are used to obtain accurate results. To reduce the computational cost, criteria were proposed for choosing the voxel size and photon number in 3-dimensional VMC simulations with acceptable accuracy and computation time. The selection of the voxel size can be expressed as a function of tissue geometry and optical properties, and the photon number should be at least 5 times the total voxel number. These criteria are further applied in developing a photon ray splitting scheme with local grid refinement to reduce the computational cost for a nonuniform tissue structure with strongly varying optical properties. In the proposed technique, a nonuniform refined grid system is used: fine grids for tissue with high absorption and complex geometry, and coarse grids for the rest. The total photon number is selected based on the voxel size of the coarse grid, and the photon-splitting scheme is developed to satisfy the statistical accuracy requirement in the dense grid area. Results show that the local grid refinement technique with the photon ray splitting scheme accelerates the computation by a factor of 7.6 (reducing the time from 17.5 to 2.3 h) in the simulation of laser light energy deposition in skin tissue containing port wine stain lesions.
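The weight bookkeeping behind such a photon ray splitting scheme can be sketched as follows. This is a simplified illustration; in particular, the volume-ratio split-factor heuristic is our own assumption, not the paper's rule:

```python
def split_photon(photon, n_split):
    """Split one photon packet into n_split copies, conserving total weight.

    Each copy then propagates independently in the fine-grid region,
    improving the per-voxel statistics there without biasing the result.
    """
    w = photon["weight"] / n_split
    return [dict(photon, weight=w) for _ in range(n_split)]

def refinement_split_factor(coarse_dx, fine_dx):
    """Heuristic split factor when a photon enters a refined region:
    keep per-voxel statistics comparable by splitting in proportion to
    the voxel volume ratio (an assumption for illustration)."""
    return max(1, round((coarse_dx / fine_dx) ** 3))
```

For example, going from 0.2 cm coarse voxels to 0.05 cm fine voxels would suggest splitting each entering photon packet into 64 sub-packets, each carrying 1/64 of the original weight.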
Zhang, Junlong; Li, Yongping; Huang, Guohe; Chen, Xi; Bao, Anming
2016-07-01
Without a realistic assessment of parameter uncertainty, decision makers may encounter difficulties in accurately describing hydrologic processes and assessing relationships between model parameters and watershed characteristics. In this study, a Markov-Chain-Monte-Carlo-based multilevel factorial analysis (MCMC-MFA) method is developed, which can not only generate samples of parameters from a well-constructed Markov chain and assess parameter uncertainties with straightforward Bayesian inference, but also investigate the individual and interactive effects of multiple parameters on model output by measuring the specific variations of hydrological responses. A case study is conducted to address parameter uncertainties in the Kaidu watershed of northwest China. Effects of multiple parameters and their interactions are quantitatively investigated using the MCMC-MFA with a three-level factorial experiment (81 runs in total). A variance-based sensitivity analysis method is used to validate the results of the parameters' effects. Results show that (i) the soil conservation service runoff curve number for moisture condition II (CN2) and the fraction of snow volume corresponding to 50% snow cover (SNO50COV) are the most significant factors for hydrological responses, implying that infiltration-excess overland flow and snow water equivalent represent important water inputs to the hydrological system of the Kaidu watershed; (ii) saturated hydraulic conductivity (SOL_K) and the soil evaporation compensation factor (ESCO) have obvious effects on hydrological responses, implying that the processes of percolation and evaporation impact the hydrological process in this watershed; (iii) the interactions of ESCO and SNO50COV as well as CN2 and SNO50COV have an obvious effect, implying that snow cover can impact the generation of runoff on the land surface and the extraction of soil evaporative demand in lower soil layers. These findings can help enhance the hydrological model
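The multilevel factorial part of such an analysis reduces, in its simplest form, to comparing mean responses across factor levels over a full factorial design. A minimal sketch (our own illustration, not the MCMC-MFA code):

```python
import itertools
import statistics

def main_effects(levels, response):
    """Main effect of each factor in a full factorial experiment.

    levels:   list of level lists, one per factor, e.g. [[0,1,2], [0,1,2]].
    response: callable mapping a run tuple to the model output.
    Returns, per factor, the range of response means across its levels.
    """
    runs = list(itertools.product(*levels))
    effects = []
    for f in range(len(levels)):
        means = []
        for lv in levels[f]:
            ys = [response(r) for r in runs if r[f] == lv]
            means.append(statistics.fmean(ys))
        effects.append(max(means) - min(means))
    return effects
```

With two three-level factors and the toy response 2·x0 + x1, the main effect of the first factor (4.0) is twice that of the second (2.0), as expected from the coefficients.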
Beam Dynamics Based Design of Solenoid Channel for TAC Proton Linac
Kisoglu, H F
2014-01-01
Today the linear particle accelerator (linac), in which electric and magnetic fields are of vital importance, is a key driver for applications such as Accelerator Driven Systems (ADS). A multipurpose proton linac with an energy of ~2 GeV, intended primarily for ADS, is planned within the Turkish Accelerator Center (TAC) project, a collaboration of more than 10 Turkish universities. A Low Energy Beam Transport (LEBT) channel with two solenoids is a subcomponent of this linac: it transports the proton beam extracted from an ion source and matches it to the Radio Frequency Quadrupole (RFQ), an essential part of the linac. The LEBT channel consists of two focusing solenoids and diagnostic elements such as a Faraday cup and beam current transformers. This paper presents a beam-dynamics design and optimization study of the LEBT channel for the TAC proton linac, performed with the beam dynamics simulation code PATH MANAGER, and compares the simulation results with theoretical expectations.
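The first-order focusing action of such LEBT solenoids can be estimated with the standard thin-lens formula 1/f = (B0/(2Bρ))²·L. A short sketch; the parameter values used below are illustrative assumptions, not TAC design values:

```python
import math

def brho(T_MeV, m0_MeV=938.272):
    """Magnetic rigidity B*rho [T m] of a proton with kinetic energy T_MeV."""
    E = T_MeV + m0_MeV                    # total energy [MeV]
    pc = math.sqrt(E**2 - m0_MeV**2)      # momentum * c [MeV]
    return pc * 1.0e6 / 2.99792458e8      # [T m] for unit charge

def solenoid_focal_length(B0, L, T_MeV):
    """Thin-lens focal length [m] of a solenoid with peak field B0 [T] and
    effective length L [m], for a proton of kinetic energy T_MeV."""
    k = B0 / (2.0 * brho(T_MeV))          # Larmor focusing strength [1/m]
    return 1.0 / (k * k * L)
```

For a 50 keV proton (a typical LEBT energy, assumed here), Bρ ≈ 0.032 T·m, so a 0.3 T, 0.3 m solenoid focuses at roughly 15 cm, which is the length scale on which such a channel is matched to the RFQ.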
Institute of Scientific and Technical Information of China (English)
张佳琦; 靳家玉; 张隽佶; 邹雷
2012-01-01
A simple method for the synthesis of new bithienylethenes bearing a functional group on the cyclopentene moiety is developed. Three new photochromic compounds (4a, 4b, 4c) have been successfully synthesized through this simple method, and they exhibit good photochromic properties under alternating irradiation with ultraviolet and visible light. Furthermore, the fluorescence of compound 4a, which bears a quinoline unit on the cyclopentene, can be modulated via optical and proton dual inputs. Upon excitation by 320 nm light, 4a emits strong fluorescence at 404 nm. After irradiation with 254 nm light, the emission intensity is reduced due to fluorescence resonance energy transfer (FRET) from the quinoline unit to the bithienylethene unit. Moreover, on addition of H⁺, the fluorescence is quenched completely due to protonation of the quinoline. In addition, both the FRET and the protonation process are reversible, which indicates a potential application in molecular switches and logic gates.
Tian, Zhen; Li, Yongbao; Shi, Feng; Jiang, Steve B; Jia, Xun
2015-01-01
We recently built an analytical source model for a GPU-based MC dose engine. In this paper, we present a sampling strategy to use this source model efficiently in GPU-based dose calculation. Our source model is based on the concept of a phase-space ring (PSR). This ring structure makes it effective to account for beam rotational symmetry, but it is not suitable for dose calculations with rectangular jaw settings. Hence, we first convert the PSR source model to its phase-space-let (PSL) representation. In dose calculation, the different types of sub-sources are sampled separately, and source sampling and particle transport are iterated, so that the particles being sampled and transported simultaneously are of the same type and close in energy, which alleviates GPU thread divergence. We also present an automatic commissioning approach that adjusts the model to represent a clinical linear accelerator well. Weighting factors were introduced to adjust the relative weights of the PSRs, determined by solving a quadratic minimization ...
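The divergence-reduction idea, transporting particles of the same type and similar energy together, can be sketched on the host side as a sort key. This is our own minimal illustration, not the gPMC implementation:

```python
import numpy as np

def batch_by_type_energy(ptype, energy, n_bins=8):
    """Order particle indices by (type, energy bin) so that adjacent GPU
    threads (warps) process like particles together, reducing divergence.

    ptype:  integer array of particle type codes.
    energy: float array of particle energies.
    """
    # Quantile-based energy bins keep batches roughly equal in size.
    edges = np.quantile(energy, np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
    e_bin = np.digitize(energy, edges)
    # lexsort: last key is primary, so sort by type first, then energy bin.
    return np.lexsort((e_bin, ptype))
```

Launching transport kernels over the reordered indices means each warp sees one particle type and a narrow energy range, so the threads follow nearly the same code path.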
Energy Technology Data Exchange (ETDEWEB)
Aaron, F.D.; Alexa, C.; Rotaru, M.; Stoicea, G. [National Inst. for Physics and Nuclear Engineering, Bucharest (Romania); Andreev, V.; Belousov, A.; Eliseev, A.; Fomenko, A.; Gogitidze, N.; Lebedev, A.; Malinovski, E.; Rusakov, S.; Shtarkov, L.N.; Soloviev, Y.; Vazdik, Y. [Lebedev Physical Inst., Moscow (Russian Federation); Backovic, S.; Dubak, A.; Lastovicka-Medin, G.; Picuric, I.; Raicevic, N. [Univ. of Montenegro, Faculty of Science, Podgorica (ME); Baghdasaryan, A.; Baghdasaryan, S.; Zohrabyan, H. [Yerevan Physics Inst., Yerevan (Armenia); Barrelet, E. [CNRS/IN2P3, LPNHE, Univ. Pierre et Marie Curie Paris 6, Univ. Denis Diderot Paris 7, Paris (France); Bartel, W.; Belov, P.; Brandt, G.; Brinkmann, M.; Britzger, D.; Campbell, A.J.; Eckerlin, G.; Elsen, E.; Felst, R.; Fischer, D.J.; Fleischer, M.; Gayler, J.; Ghazaryan, S.; Glazov, A.; Gouzevitch, M.; Grebenyuk, A.; Grell, B.R.; Habib, S.; Haidt, D.; Helebrant, C.; Kleinwort, C.; Kogler, R.; Kraemer, M.; Levonian, S.; Lipka, K.; List, B.; List, J.; Meyer, A.B.; Meyer, J.; Niebuhr, C.; Nowak, K.; Olsson, J.E.; Pahl, P.; Panagoulias, I.; Papadopoulou, T.; Petrukhin, A.; Piec, S.; Pitzl, D.; Schmitt, S.; Sefkow, F.; Shushkevich, S.; South, D.; Steder, M.; Wuensch, E. [DESY, Hamburg (Germany); Begzsuren, K.; Ravdandorj, T.; Tseepeldorj, B. [Inst. of Physics and Technology of the Mongolian Academy of Sciences, Ulaanbaatar (Mongolia); Bizot, J.C.; Brisson, V.; Delcourt, B.; Jacquet, M.; Pascaud, C.; Tran, T.H.; Zhang, Z.; Zomer, F. [CNRS/IN2P3, LAL, Univ. Paris-Sud, Orsay (France); Boudry, V.; Moreau, F.; Specka, A. [CNRS/IN2P3, LLR, Ecole Polytechnique, Palaiseau (France); Bozovic-Jelisavcic, I.; Mudrinic, M.; Pandurovic, M.; Smiljanic, I. [Univ. of Belgrade, Vinca Inst. of Nuclear Sciences, Belgrade (RS); Bracinik, J.; Kenyon, I.R.; Newman, P.R.; Thompson, P.D. [Univ. of Birmingham (United Kingdom); Bruncko, D.; Cerny, V.; Ferencei, J. [Slovak Academy of Sciences, Kosice (Slovakia)] [and others
2012-04-15
The cross section of diffractive deep-inelastic scattering ep → eXp is measured, where the system X contains at least two jets and the leading final-state proton is detected in the H1 Forward Proton Spectrometer. The measurement is performed for fractional proton longitudinal momentum loss x_P < 0.1 and covers the range 0.1 < |t| < 0.7 GeV² in squared four-momentum transfer at the proton vertex and photon virtualities Q² > 4 GeV². The data are compared with QCD predictions based on diffractive parton distribution functions extracted from measurements of inclusive and dijet cross sections in diffractive deep-inelastic scattering. The data are also compared with leading order Monte Carlo models. (orig.)
Determination of Magnet Specification of 13 MeV Proton Cyclotron Based on Opera 3D
Directory of Open Access Journals (Sweden)
Taufik
2014-08-01
The magnet is one of the main components of a cyclotron; it forms the circular particle beam trajectories and provides focusing of the beam. To support the mastery of 13-MeV proton cyclotron technologies, the cyclotron magnet must be designed to satisfy the cyclotron magnet requirements. This research was conducted by studying the important parameters in cyclotron magnet design, which were then used to determine the design requirements. The magnet design was based on the results of 3D simulations using the Opera 3D software, developed by Cobham plc to solve physical problems in 3D, such as magnetostatics, using finite element methods. The simulation started with drawing a 3D model of the magnet in the modeller, followed by magnetic field calculations with the Tosca module of Opera 3D. Simulation results were analyzed with the Genspeo software to determine whether the cyclotron magnet parameters met the design requirements. The results indicate that the magnet design satisfies the requirements: B in the median plane of the magnetic pole approaches the isochronous curve; the magnet provides axial and radial beam focusing; the νr = 1 resonance line is crossed only at low particle energy and at particle energies above 13 MeV; and the phase shift is small enough, about 13°. The dimensions of the cyclotron magnet are 1.96 m × 1.30 m × 1.21 m; its weight is 17.3 t; its coil current is 88,024 ampere-turns; its central magnetic field is 1.27479 T; its maximum magnetic field is 1.942116 T; its minimum magnetic field is 0.7689 T; its valley gap is 120 mm; its hill gaps are 40 to 50.78 mm; and its hill angles are 35° to 44°.
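The isochronous curve that the median-plane field is tuned toward follows B(r) = γ(r)·B0, with γ(r) = 1/√(1 − (ωr/c)²) for a fixed orbital angular frequency ω. A minimal sketch; the orbital frequency used in the test below is an illustrative estimate derived from the quoted central field, not a design value from the paper:

```python
import math

def isochronous_field(r, B0, f_orb):
    """Azimuthally averaged isochronous field [T] at radius r [m] for a
    cyclotron with central field B0 [T] and fixed orbital frequency f_orb [Hz].

    On an isochronous orbit the particle speed is v = 2*pi*f_orb*r, so the
    required field grows with the relativistic gamma factor at that radius.
    """
    beta = 2.0 * math.pi * f_orb * r / 2.99792458e8   # v/c on the orbit
    return B0 / math.sqrt(1.0 - beta * beta)
```

At r = 0 the function returns B0 itself, and the field rises slowly with radius, which is the gentle positive gradient the hill/valley shaping must reproduce on average.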
Study of proton-nucleus collisions at high energies based on the hydrodynamical model
International Nuclear Information System (INIS)
We study proton-nucleus collisions at high energies using the one-dimensional hydrodynamical model of Landau, with special emphasis on the effect of the size of the target nucleus and of the magnitude of the velocity of sound of excited hadronic matter. We convert the collision problem of a proton and a nucleus with a spherical shape into that of a proton and a one-dimensional nuclear tunnel whose length is determined from the average impact parameter. By extending the methods developed by Milekhin and Emelyanov, we obtain the solutions of the hydrodynamical equations of proton-nucleus collisions for arbitrary target tunnel length and arbitrary velocity of sound. The connection between these solutions and observable physical quantities is established as in the work of Cooper, Frye, and Schonberg. Extensive numerical analyses are made at E_lab = 200 GeV for the velocity of sound u = 1/√3 of a relativistic ideal Bose gas and u = 1/(7.5)^(1/2) of an interacting Bose gas. In order to compare proton-nucleus collisions with proton-proton collisions, all the analyses are made in the equal-velocity frame. We find the following results. (1) In comparing the number of secondary particles produced in p-A collisions, N_pA, with that in p-p collisions, N_pp, while most of the excess of N_pA over N_pp is concentrated in the backward rapidity region, there is also an increase of N_pA with A in the forward rapidity region. This result is at variance with the predictions of the energy-flux-cascade model and of the coherent-production model. (2) The excess energies are contained exclusively in the backward region. We also find evidence for new phenomena in proton-nucleus collisions. (3) There exists an asymmetry of the average energies of secondary particles between the forward and backward regions; in particular, ⟨E⟩_backward ≫ ⟨E⟩_forward for larger nuclear targets. Thus, energetic particles are predominantly produced in the backward region.
Detection of Explosives by Using a Neutron Source Based on a Proton Linac
Dolya, S N
2016-01-01
The paper considers the possibility of detecting explosives by means of radiative capture of neutrons by nitrogen nuclei. A proton LINAC is proposed as the neutron source, with the following parameters: proton energy 5 MeV, beam pulse current 1.7 mA, current pulse duration 200 μs, repetition rate 50 Hz. The neutrons are produced in the Li(p,n)Be reaction. It is shown that this neutron source will have an intensity of 10¹² neutrons per second, which will allow one to detect explosives the size of a tennis ball.
Kalaiselvimary, J.; Pradeepa, P.; Sowmya, G.; Edwinraj, S.; Prabhu, M. Ramesh
2016-05-01
This study describes biodegradable acid-doped films composed of chitosan and perchloric acid in different ratios (2.5 wt%, 5 wt%, 7.5 wt%, 10 wt%), prepared by the solution casting technique. The temperature dependence of the proton conductivity of the complex electrolytes obeys the Arrhenius relationship. The proton conductivity of the prepared acid-doped biopolymer electrolyte was measured to be approximately 5.90 × 10⁻⁴ S cm⁻¹. The dielectric data were analyzed using the complex impedance Z*, dielectric loss ε′ and loss tangent for the highest-conductivity samples at various temperatures.
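An Arrhenius analysis of the kind invoked above fits ln σ against 1/T to extract the activation energy. A minimal sketch; the data in the usage example are synthetic, not the authors' measured values:

```python
import math

def arrhenius_fit(T_K, sigma):
    """Least-squares fit of ln(sigma) = ln(sigma0) - Ea/(kB*T).

    T_K:   temperatures [K]; sigma: conductivities [S/cm].
    Returns (sigma0, Ea_eV): pre-exponential factor and activation energy.
    """
    kB = 8.617333e-5                       # Boltzmann constant [eV/K]
    x = [1.0 / T for T in T_K]
    y = [math.log(s) for s in sigma]
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
        / sum((xi - xbar) ** 2 for xi in x)
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope * kB   # slope = -Ea/kB
```

Fitting synthetic data generated with σ0 = 1 S/cm and Ea = 0.3 eV recovers both parameters, which is a quick sanity check before applying the fit to measured conductivities.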
Proton transfer in organic scaffolds
Basak, Dipankar
This dissertation focuses on developing a fundamental understanding of the proton transfer process and translating that knowledge into the design and development of new organic materials for efficient non-aqueous proton transport. For example, what controls the shuttling of a proton between two basic sites: the distance between the two groups, or their basicity? And what is the impact of protonation on molecular conformation when the basic sites are attached to rigid scaffolds? For this purpose, we developed several tunable proton sponges and studied proton transfer in these scaffolds both theoretically and experimentally. We then turned our attention to long-range proton conduction, or proton transport. We introduced a liquid crystalline (LC) proton conductor based on a triphenylene molecule and established that the activation energy barrier for proton transport is lower in the LC phase than in the crystalline phase. Furthermore, we investigated the impact of several critical factors on long-range proton transport: the choice of proton-transferring groups, the mobility of the charge carriers, intrinsic versus extrinsic charge carrier concentrations, and the molecular architecture. The outcome of this research will lead to a deeper understanding of the non-aqueous proton transfer process and aid the design of next-generation proton exchange membranes (PEMs) for fuel cells.
The underlying event in proton-proton collisions
Energy Technology Data Exchange (ETDEWEB)
Bechtel, F.
2009-05-15
In this thesis, studies of the underlying event in proton-proton collisions at a center-of-mass energy of √s = 10 TeV are presented. A crucial ingredient of underlying event models is multiple parton-parton scattering in single proton-proton collisions. The feasibility of measuring the underlying event was investigated with the Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC) using charged particles and charged-particle jets. Systematic uncertainties of the underlying event measurement due to detector misalignment and imperfect track reconstruction are found to be negligible after ∫L dt = 1 pb⁻¹ of data are available. Different model predictions are compared with each other using fully simulated Monte Carlo samples. It is found that distinct models differ strongly enough to tell them apart with early data.
Directory of Open Access Journals (Sweden)
Sandeep Chakraborty
The pathway of proton abstraction (PA), a key step in most catalytic reactions, is often controversial and highly debated. Ultrahigh-resolution diffraction studies, molecular dynamics, and quantum mechanics/molecular mechanics simulations are often adopted to gain insight into PA mechanisms in enzymes. These methods require expertise and effort to set up and can be computationally intensive. We present a push-button methodology, Proton abstraction Simulation (PRISM), to enumerate the possible pathways of PA in a protein with known 3D structure, based on the spatial and electrostatic properties of residues in the proximity of a given nucleophilic residue. Proton movements are evaluated in the vicinity of this nucleophilic residue based on distances, potential differences, spatial channels, and the characteristics of the individual residues (polarity, acidity, basicity, etc.). Modulating these parameters eliminates their empirical nature and may also reveal pathways that originate from conformational changes. We have validated our method using serine proteases and concurred with the dichotomy in PA in Class A β-lactamases, both of which are hydrolases. The PA mechanism in a transferase has also been corroborated. The source code is available at www.sanchak.com/prism.
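The abstract does not spell out PRISM's algorithm; the sketch below only illustrates the general idea under stated assumptions: residues become graph nodes, edges connect residues whose relevant atoms lie within a proton-hop distance cutoff, and a depth-first search enumerates candidate PA pathways from the nucleophile to a solvent-exposed residue. The residue names and coordinates are hypothetical.

```python
import math
from itertools import combinations

# Hypothetical residues: name -> (x, y, z) of the relevant titratable atom (angstroms)
residues = {
    "Ser195": (0.0, 0.0, 0.0),   # nucleophile
    "His57":  (2.8, 0.5, 0.0),
    "Asp102": (5.3, 1.0, 0.2),
    "Wat401": (3.1, 3.4, 0.1),
    "Glu70":  (9.0, 9.0, 9.0),   # too far to participate
}
solvent_exposed = {"Asp102", "Wat401"}
CUTOFF = 3.5  # typical hydrogen-bond / proton-hop distance, angstroms

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

# Build adjacency: residues close enough for a proton hop
adj = {name: set() for name in residues}
for a, b in combinations(residues, 2):
    if dist(residues[a], residues[b]) <= CUTOFF:
        adj[a].add(b)
        adj[b].add(a)

def pathways(start, path=None):
    """Depth-first enumeration of simple proton-hop paths from the starting
    residue to any solvent-exposed residue."""
    path = path or [start]
    found = []
    if start in solvent_exposed:
        found.append(list(path))
    for nxt in sorted(adj[start] - set(path)):
        found += pathways(nxt, path + [nxt])
    return found

for p in pathways("Ser195"):
    print(" -> ".join(p))
```

A real implementation would additionally weight edges by electrostatic potential differences and residue type, as the abstract describes.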
Directory of Open Access Journals (Sweden)
Ah Reum Choi
A homologue of type I rhodopsin was found in the unicellular cyanobacterium Gloeobacter violaceus PCC7421, which is believed to be primitive because it lacks thylakoids and has a peculiar phycobilisome morphology. The Gloeobacter rhodopsin (GR) gene encodes a polypeptide of 298 amino acids. Unlike the cyanobacterium Anabaena opsin gene, which is clustered with a 14 kDa transducer gene, this gene stands alone in the genome. Amino acid sequence comparison of GR with other type I rhodopsins shows several conserved residues important for retinal binding and H+ pumping. In this study, the gene was expressed in Escherichia coli, where it bound all-trans retinal to form a pigment (λmax = 544 nm at pH 7). The pKa of the proton acceptor for the Schiff base (Asp121) is approximately 5.9, so GR can translocate H+ under physiological conditions (pH 7.4). To demonstrate functional activity in the cell, pumping activity was measured in E. coli spheroplast membranes and in whole Gloeobacter cells. The efficient proton pumping and rapid photocycle of GR strongly suggest that Gloeobacter rhodopsin functions as a proton pump in its natural environment, probably compensating for the shortage of energy generated by chlorophyll-based photosynthesis without thylakoids.
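The claim that a proton acceptor with pKa ≈ 5.9 is functional at pH 7.4 follows from the Henderson-Hasselbalch relation: the fraction of the group in its deprotonated (acceptor-competent) form is 1 / (1 + 10^(pKa − pH)). A quick check using only the numbers in the abstract:

```python
def deprotonated_fraction(pka, ph):
    """Henderson-Hasselbalch: fraction of an acid group in its deprotonated form."""
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

# Asp121 proton acceptor, pKa ~5.9, at physiological pH 7.4
frac = deprotonated_fraction(5.9, 7.4)
print(f"deprotonated fraction at pH 7.4: {frac:.3f}")  # ~0.97
```

So roughly 97% of the Asp121 population is deprotonated at pH 7.4 and thus available to accept the Schiff-base proton.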
Nuclear interaction cross sections for proton radiotherapy
Chadwick, M B; Arendse, G J; Cowley, A A; Richter, W A; Lawrie, J J; Newman, R T; Pilcher, J V; Smit, F D; Steyn, G F; Koen, J W; Stander, J A
1999-01-01
Model calculations of proton-induced nuclear reaction cross sections are described for biologically-important targets. Measurements made at the National Accelerator Centre are presented for double-differential proton, deuteron, triton, helium-3 and alpha particle spectra, for 150 and 200 MeV protons incident on C, N, and O. These data are needed for Monte Carlo simulations of radiation transport and absorbed dose in proton therapy. Data relevant to the use of positron emission tomography to locate the Bragg peak are also described.
International Nuclear Information System (INIS)
Sheet metal stamping is one of the most commonly used manufacturing processes, and hence much research has been carried out on it for economic gain. Searching the literature, however, one finds that many problems remain unsolved. For example, it is well known that for the same press, the same workpiece material, and the same set of dies, product quality may vary owing to a number of factors, such as inhomogeneity of the workpiece material, loading error, and lubrication. At present, few methods can predict this quality variation, let alone identify what contributes to it. As a result, trial-and-error is still needed on the shop floor, causing additional cost and time delays. This paper introduces a new approach to predict product quality variation and identify the sensitive design and process parameters. The new approach is based on a combination of inverse finite element modeling (FEM) and Monte Carlo simulation, more specifically the Latin Hypercube Sampling (LHS) approach. With acceptable accuracy, the inverse FEM (also called one-step FEM) requires much less computation than the usual incremental FEM and hence can be used to predict quality variations under various conditions. LHS is a statistical method through which the sensitivity analysis can be carried out. The result of the sensitivity analysis has a clear physical meaning and can be used to optimize the die design and/or the process design. Two simulation examples are presented: drawing a rectangular box and drawing a two-step rectangular box.
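Latin Hypercube Sampling stratifies each input dimension into n equal-probability bins and places exactly one sample in each bin per dimension, giving better input-space coverage than plain random sampling for the same number of FEM runs. A minimal sketch of the sampler (not the paper's implementation):

```python
import random

def latin_hypercube(n, d, rng=random.Random(0)):
    """Draw n samples in [0, 1)^d such that, in every dimension, each of the
    n equal-width strata contains exactly one sample (Latin Hypercube design)."""
    samples = [[0.0] * d for _ in range(n)]
    for j in range(d):
        perm = list(range(n))     # one stratum index per sample ...
        rng.shuffle(perm)         # ... assigned in random order per dimension
        for i in range(n):
            samples[i][j] = (perm[i] + rng.random()) / n  # jitter within stratum
    return samples

# 8 runs over 3 uncertain inputs (e.g. material property, load, friction)
design = latin_hypercube(8, 3)
for row in design:
    print(["%.3f" % v for v in row])
```

Each row would then be mapped through the inverse-CDF of the actual input distributions before being fed to the one-step FEM.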
International Nuclear Information System (INIS)
This paper presents a novel decision-support tool for assessing future generation portfolios in an increasingly uncertain electricity industry. The tool combines optimal generation mix concepts with Monte Carlo simulation and portfolio analysis techniques to determine expected overall industry costs, associated cost uncertainty, and expected CO2 emissions for different generation portfolio mixes. The tool can incorporate complex and correlated probability distributions for estimated future fossil-fuel costs, carbon prices, plant investment costs, and demand, including price elasticity impacts. The intent of this tool is to facilitate risk-weighted generation investment and associated policy decision-making given the uncertainties facing the electricity industry. Applications of the tool are demonstrated through a case study of an electricity industry with coal, CCGT, and OCGT generation facing future uncertainties. Results highlight some significant generation investment challenges, including the impacts of uncertain and correlated carbon and fossil-fuel prices, the role of future demand changes in response to electricity prices, and the impact of construction cost uncertainties on capital-intensive generation. The tool can incorporate virtually any type of input probability distribution, and supports sophisticated risk assessments of different portfolios, including downside economic risks. It can also assess portfolios against multi-criterion objectives such as greenhouse emissions as well as overall industry costs.
Highlights:
- Presents a decision-support tool to assist generation investment and policy making under uncertainty.
- Generation portfolios are assessed based on their expected costs, risks, and CO2 emissions.
- There is a tradeoff among the expected costs, risks, and CO2 emissions of generation portfolios.
- Investment challenges include the economic impact of uncertainties and the effect of price elasticity.
- CO2 emissions reduction depends on the mix of
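As an illustration of the tool's core idea (not its actual model), the sketch below Monte Carlo samples weakly correlated gas and carbon prices and computes the expected cost and cost spread of a toy two-technology portfolio; every parameter here is invented.

```python
import random
import statistics

def simulate_portfolio_cost(coal_share, n_trials=20000, seed=42):
    """Expected cost and std-dev ($/MWh) of a coal/CCGT mix under uncertain,
    correlated gas and carbon prices. Toy model; all parameters illustrative."""
    rng = random.Random(seed)
    ccgt_share = 1.0 - coal_share
    costs = []
    for _ in range(n_trials):
        carbon = max(0.0, rng.gauss(25.0, 10.0))             # $/tCO2
        gas = max(1.0, rng.gauss(8.0, 2.0) + 0.05 * carbon)  # $/GJ, correlated
        coal_cost = 30.0 + 0.9 * carbon                      # $/MWh, emission-heavy
        ccgt_cost = 10.0 + 7.0 * gas + 0.37 * carbon         # $/MWh, fuel-exposed
        costs.append(coal_share * coal_cost + ccgt_share * ccgt_cost)
    return statistics.mean(costs), statistics.stdev(costs)

for share in (0.0, 0.5, 1.0):
    mean, sd = simulate_portfolio_cost(share)
    print(f"coal share {share:.1f}: expected {mean:6.2f} $/MWh, sigma {sd:5.2f}")
```

Plotting expected cost against sigma over many mixes traces out the cost-risk tradeoff frontier that the paper's portfolio analysis explores.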
A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).
Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun
2015-10-01
Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence can be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom, and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by
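The comparison metric used above, the average dose difference restricted to voxels receiving more than 10% of the maximum dose, is easy to state precisely; a small sketch (the array contents are illustrative, not data from the paper):

```python
def avg_rel_dose_diff(dose_a, dose_b, threshold_frac=0.10):
    """Mean relative difference |Da - Db| / Dmax over voxels where the reference
    dose exceeds threshold_frac * Dmax (cf. the goMC vs gDPM comparison metric)."""
    dmax = max(dose_a)
    region = [(a, b) for a, b in zip(dose_a, dose_b) if a > threshold_frac * dmax]
    if not region:
        return 0.0
    return sum(abs(a - b) for a, b in region) / (dmax * len(region))

# Illustrative 1D depth-dose arrays from two hypothetical MC engines
engine_a = [0.02, 0.5, 0.8, 1.0, 0.9, 0.05]
engine_b = [0.02, 0.505, 0.795, 1.004, 0.897, 0.05]
print(f"avg dose difference in >10% region: {100 * avg_rel_dose_diff(engine_a, engine_b):.2f}%")
```

Restricting to the high-dose region avoids the metric being dominated by large relative errors in near-zero-dose voxels.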
The Quest for Evidence for Proton Therapy : Model-Based Approach and Precision Medicine
Widder, Joachim; van der Schaaf, Arjen; Lambin, Philippe; Marijnen, Corrie A. M.; Pignol, Jean-Philippe; Rasch, Coen R.; Slotman, Ben J.; Verheij, Marcel; Langendijk, Johannes A.
2016-01-01
Purpose: Reducing dose to normal tissues is the advantage of protons versus photons. We aimed to describe a method for translating this reduction into a clinically relevant benefit. Methods and Materials: Dutch scientific and health care governance bodies have recently issued landmark reports regard
pK(a) based protonation states and microspecies for protein-ligand docking.
ten Brink, Tim; Exner, Thomas E
2010-11-01
In this paper we present our reworked approach to generating ligand protonation states with our structure preparation tool SPORES (Structure PrOtonation and REcognition System). SPORES can be used for the preprocessing of proteins and protein-ligand complexes, e.g. those taken from the Protein Data Bank, as well as for the setup of 3D ligand databases. It automatically assigns atom and bond types and generates different protonation states, tautomers, and stereoisomers. In the revised version, pKa calculations with the ChemAxon software MARVIN are used either to determine the likelihood of a combinatorially generated protonation state or to determine the titratable atoms used in the combinatorial approach. Additionally, the MARVIN software is used to predict microspecies distributions of ligand molecules. Docking studies were performed with our recently introduced program PLANTS (Protein-Ligand ANT System) on all protomers resulting from the three different selection methods for the well-established CCDC/ASTEX clean data set, demonstrating the usefulness of especially the latter approach.
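A combinatorial protonation-state generator of the kind SPORES prunes with pKa information can be sketched as follows: each titratable site is either protonated or not, and each of the 2^n combinations is scored by a simple independent-site Henderson-Hasselbalch weight. This scoring is a stand-in for illustration, not MARVIN's actual microspecies model, and the ligand is hypothetical.

```python
from itertools import product

def enumerate_protomers(sites, ph=7.4):
    """sites: list of (name, pKa). Returns all 2^n protonation states, each
    scored by an independent-site Henderson-Hasselbalch weight (stand-in
    scoring -- not the MARVIN microspecies model)."""
    states = []
    for combo in product([False, True], repeat=len(sites)):
        weight = 1.0
        for (name, pka), protonated in zip(sites, combo):
            p_prot = 1.0 / (1.0 + 10.0 ** (ph - pka))  # P(site holds its proton)
            weight *= p_prot if protonated else (1.0 - p_prot)
        states.append((combo, weight))
    return sorted(states, key=lambda s: -s[1])

# Hypothetical ligand with a carboxylic acid (pKa 4.2) and an amine (pKa 9.5)
sites = [("COOH", 4.2), ("NH3+", 9.5)]
for combo, w in enumerate_protomers(sites):
    labels = [f"{s[0]}:{'H' if p else '-'}" for s, p in zip(sites, combo)]
    print(f"{' '.join(labels)}  weight={w:.4f}")
```

At pH 7.4 the dominant protomer is, as expected, the zwitterion (deprotonated acid, protonated amine); docking only the top-weighted protomers is the essence of the pKa-based selection.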
An assessment of radiation damage in space-based germanium detectors due to solar proton events
Owens, Alan; Brandenburg, S.; Buis, E. -J.; Kliewiet, H.; Kraft, S.; Ostendorf, R. W.; Peacock, A.; Quarati, F.; Quirin, P.
2007-01-01
Radiation effects caused by solar proton events will be a common problem for many types of sensors on missions to the inner solar system because of the long cruise phases coupled with the inverse square scaling of solar particle events. As part of a study in support of the BepiColombo mission to Mer
Damage induced by proton irradiation in carbonate based natural painting pigments
International Nuclear Information System (INIS)
The so-called 'dark spot' phenomenon produced during proton irradiation of pigments is an important factor in determining the experimental conditions for ion beam analysis of pigments in paintings, miniatures, pottery, and other art objects. Recently it has been suggested that this phenomenon could be due to the formation of colour centres during irradiation, but little is known about the characteristics and reversibility of the damage. In this work a representative set of natural carbonate minerals, traditionally used as pigments, was exposed to proton irradiation in an external-beam set-up, in order to simulate routine external proton induced X-ray emission (PIXE) analysis of an art object. During irradiation, ionoluminescence (IL) combined with PIXE was employed to identify the microscopic processes involved in the proton damage. After irradiation, two well-established techniques for the study of colour centres, thermoluminescence (TL) and optical absorption, were used. TL in particular is a very sensitive technique for detecting very low concentrations of radiation-induced defects.
Energy deposition in selected-mammalian cell for several-MeV single-proton beam
Ding, K.; Yu, Z.
2007-05-01
The phenomena resulting from the interaction between an ion beam and mammalian cells pose important problems for biological applications. A study based on classic Bethe-Bloch theory, using attached V79 mammalian cells, was conducted to establish the stopping powers of the mammalian cell for a several-MeV single-proton microbeam. Based on the biological structure of the mammalian cell, a physical model is proposed which treats the attached cell as a simple MWM model. Using this model and Monte Carlo simulation, we studied the energy deposition, and its ratio, in the selected attached mammalian cell for MeV proton implantation.
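For orientation, the Bethe stopping power underlying such calculations can be evaluated directly. The sketch below implements the standard Bethe formula (no shell or density corrections) for protons in water, which is a reasonable proxy for cellular material; it reproduces tabulated values to within a few percent in the MeV range.

```python
import math

ME_C2 = 0.511      # electron rest energy, MeV
MP_C2 = 938.272    # proton rest energy, MeV
K = 0.307075       # 4*pi*N_A*r_e^2*m_e*c^2, MeV cm^2 / mol
Z_OVER_A = 0.5551  # <Z/A> for water
I_WATER = 75e-6    # mean excitation energy of water, MeV

def proton_stopping_power_water(t_mev):
    """Mass stopping power -dE/dx (MeV cm^2/g) of water for protons of kinetic
    energy t_mev, via the Bethe formula without shell/density corrections."""
    gamma = 1.0 + t_mev / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    bg2 = beta2 * gamma**2
    # Maximum energy transfer to a free electron in a single collision
    tmax = 2.0 * ME_C2 * bg2 / (1.0 + 2.0 * gamma * ME_C2 / MP_C2 + (ME_C2 / MP_C2) ** 2)
    log_term = 0.5 * math.log(2.0 * ME_C2 * bg2 * tmax / I_WATER**2)
    return K * Z_OVER_A / beta2 * (log_term - beta2)

for t in (10.0, 100.0, 200.0):
    print(f"{t:6.1f} MeV protons in water: {proton_stopping_power_water(t):6.2f} MeV cm^2/g")
```

The strong increase of stopping power as the proton slows (the 1/β² factor) is what produces the Bragg peak and drives the localized energy deposition studied in the abstract.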
Energy Technology Data Exchange (ETDEWEB)
Testa, M; Paganetti, H; Lu, H-M [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States)]; Doolan, P [University College London (United Kingdom)]; Bentefour, E H [IBA, Warrenville, IL (United States)]
2014-06-01
Purpose: To use proton radiography (i) for in-vivo range verification of the brain fields of medulloblastoma patients, in order to reduce the exit dose to the cranial skin and thus the risk of permanent alopecia; and (ii) for performing patient-specific optimization of the calibration from CT Hounsfield units to proton relative stopping power, in order to minimize uncertainties in proton range. Methods: We developed and tested a prototype proton radiography system based on a single-plane scintillation screen coupled with a fast CCD camera (1 ms sampling rate, 0.29×0.29 mm{sup 2} pixel size, 30×30 cm{sup 2} field of view). The method is based on the principle that, for passively scattered beams, the radiological depth of any point in the plateau of a spread-out Bragg peak (SOBP) can be inferred from the time pattern of the dose rate measurements. We performed detector characterization measurements using complex-shape homogeneous phantoms and an Alderson phantom. Results: Detector characterization tests confirmed the robustness of the technique. The results of the phantom measurements are encouraging in terms of the achievable accuracy of the water equivalent thickness. A technique to minimize the degradation of spatial resolution due to multiple Coulomb scattering is discussed. Our novel radiographic technique is rapid (100 ms) and simultaneous over the whole field. The dose required to produce one radiograph, with the current settings, is ∼3 cGy. Conclusion: The results obtained with this simple and innovative radiography method are promising and motivate further development of the technique. The system requires only a single-plane 2D dosimeter and it uses the clinical beam for a fraction of a second with low dose to the patient.
Phantom based evaluation of CT to CBCT image registration for proton therapy dose recalculation
Landry, Guillaume; Dedes, George; Zöllner, Christoph; Handrack, Josefine; Janssens, Guillaume; Orban de Xivry, Jonathan; Reiner, Michael; Paganelli, Chiara; Riboldi, Marco; Kamp, Florian; Söhn, Matthias; Wilkens, Jan J.; Baroni, Guido; Belka, Claus; Parodi, Katia
2015-01-01
The ability to perform dose recalculation on the anatomy of the day is important in the context of adaptive proton therapy. The objective of this study was to investigate the use of deformable image registration (DIR) and cone beam CT (CBCT) imaging to generate the daily stopping power distribution of the patient. We investigated the deformation of the planning CT scan (pCT) onto daily CBCT images to generate a virtual CT (vCT) using a deformable phantom designed for the head and neck (H & N) region. The phantom was imaged at a planning CT scanner in planning configuration, yielding a pCT and in deformed, treatment day configuration, yielding a reference CT (refCT). The treatment day configuration was additionally scanned at a CBCT scanner. A Morphons DIR algorithm was used to generate a vCT. The accuracy of the vCT was evaluated by comparison to the refCT in terms of corresponding features as identified by an adaptive scale invariant feature transform (aSIFT) algorithm. Additionally, the vCT CT numbers were compared to those of the refCT using both profiles and regions of interest and the volumes and overlap (DICE coefficients) of various phantom structures were compared. The water equivalent thickness (WET) of the vCT, refCT and pCT were also compared to evaluate proton range differences. Proton dose distributions from the same initial fluence were calculated on the refCT, vCT and pCT and compared in terms of proton range. The method was tested on a clinical dataset using a replanning CT scan acquired close in time to a CBCT scan as reference using the WET evaluation. Results from the aSIFT investigation suggest a deformation accuracy of 2-3 mm. The use of the Morphon algorithm did not distort CT number intensity in uniform regions and WET differences between vCT and refCT were of the order of 2% of the proton range. This result was confirmed by proton dose calculations. The patient results were consistent with phantom observations. In conclusion, our phantom
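One of the evaluation metrics above, the DICE coefficient, measures the volume overlap between a structure delineated on the vCT and on the refCT; a minimal sketch on binary voxel masks:

```python
def dice_coefficient(mask_a, mask_b):
    """DICE = 2|A ∩ B| / (|A| + |B|) for two binary voxel masks (iterables of 0/1).
    Returns 1.0 for two empty masks by convention."""
    a = [bool(v) for v in mask_a]
    b = [bool(v) for v in mask_b]
    inter = sum(x and y for x, y in zip(a, b))
    size = sum(a) + sum(b)
    return 2.0 * inter / size if size else 1.0

# Illustrative 1D masks standing in for a structure on the vCT and the refCT
vct_mask = [0, 1, 1, 1, 1, 0, 0]
ref_mask = [0, 0, 1, 1, 1, 1, 0]
print(f"DICE = {dice_coefficient(vct_mask, ref_mask):.3f}")  # 2*3/(4+4) = 0.75
```

A DICE of 1.0 indicates perfect overlap; values toward 0 indicate that the deformable registration failed to align the structure.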
Dosimetric impact of intrafraction motion for compensator-based proton therapy of lung cancer
Zhao, Li; Sandison, George A.; Farr, Jonathan B.; Chien Hsi, Wen; Li, X. Allen
2008-06-01
Compensator-based proton therapy of lung cancer using an un-gated treatment, while allowing the patient to breathe freely, requires a compensator design that ensures tumor coverage throughout respiration. Our investigation had two purposes: to investigate the dosimetric impact of applying, or not applying, a composite compensator correction, and to evaluate the significance of using different respiratory phases as the reference computed tomography (CT) for treatment planning dose calculations. A 4D-CT-based phantom study and a real-patient treatment planning study were performed. A 3D maximum intensity projection (MIP) dataset generated over all phases of the acquired 4D-CT scans was used to design the field-specific composite aperture and compensator. In the phantom study, the MIP-based compensator design (plan D) was compared to three other plans, in which average intensity projection (AIP) images, in conjunction with the composite target volume contour copied from the MIP images, were used. Relative electron densities within the target envelope were assigned either their original values from the AIP image dataset (plan A) or predetermined values of 0.8 (plan B) and 0.9 (plan C). In the patient study, the dosimetric impact of a compensator design based on the MIP images (plan ITV_MIP) was compared to designs based on end-of-inhale (EOI) (plan ITV_EOI) and middle-of-exhale (MOE) CT images (plan ITV_MOE). The dose distributions were recalculated for each phase. Across the ten phases, D_GTV,min changed only slightly, from 86% to 89% (SD = 0.9%) of the prescribed dose (PD), in the MIP plan, while varying greatly from 10% to 79% (SD = 26.7%) in plan A, 17% to 73% (SD = 22.5%) in plan B, and 53% to 73% (SD = 6.8%) in plan C. The same trend was observed for D_GTV,mean and V95 with smaller amplitude. In the MIP-based plan ITV_MIP, D_GTV,mean was almost identically equal to 95% in each phase (SD = 0.5%). The patient study verified that the MIP approach increased the minimum
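The MIP dataset above is, conceptually, a voxel-wise maximum over all respiratory phases of the 4D-CT, so the compensator is designed against the densest geometry the moving anatomy ever presents. A minimal sketch with toy 1D "phase images":

```python
def maximum_intensity_projection(phases):
    """Voxel-wise maximum over respiratory-phase images (here flat lists of CT
    numbers): a moving tumor appears at every position it ever occupies."""
    return [max(voxels) for voxels in zip(*phases)]

# Toy 1D phase images: a dense tumor (value 60) moving over a lung background (-700)
phase_0 = [-700, 60, 60, -700, -700]
phase_5 = [-700, -700, 60, 60, -700]
mip = maximum_intensity_projection([phase_0, phase_5])
print(mip)  # [-700, 60, 60, 60, -700]
```

Because the tumor's full excursion is "baked into" the MIP, a compensator designed on it maintains target coverage in every phase, which is what the reported D_GTV,min stability reflects.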
Directory of Open Access Journals (Sweden)
Jenine K Sanzari
Immune system adaptation during spaceflight is a concern in space medicine. Decreased circulating leukocytes observed during and after space flight suggest suppressed immune responses and susceptibility to infection. The microgravity aspect of the space environment has been simulated on Earth to study adverse biological effects in astronauts. In this report, the hindlimb unloading (HU) model was employed to investigate the combined effects of solar particle event-like proton radiation and simulated microgravity on immune cell parameters, including lymphocyte subtype populations and activity. Lymphocytes are a type of white blood cell critical for adaptive immune responses, and T lymphocytes are regulators of cell-mediated immunity, controlling the entire immune response. Mice were suspended prior to and after proton radiation exposure (2 Gy dose), and total leukocyte numbers and splenic lymphocyte functionality were evaluated on days 4 or 21 after combined HU and radiation exposure. Total white blood cell (WBC), lymphocyte, neutrophil, and monocyte counts were reduced by approximately 65%, 70%, 55%, and 70%, respectively, compared to the non-treated control group at 4 days after combined exposure. Splenic lymphocyte subpopulations were altered at both time points investigated. At 21 days post-exposure to combined HU and proton radiation, T cell activation and proliferation were assessed in isolated lymphocytes. Cell surface expression of the early activation marker CD69 was decreased by 30% in the combined treatment group compared to the non-treated control group, and cell proliferation was suppressed by approximately 50% compared to the non-treated control group. These findings reveal that the combined stressors (HU and proton radiation exposure) result in decreased leukocyte numbers and function, which could contribute to immune system dysfunction in crew members. This investigation is one of the first to report on combined proton radiation and
International Nuclear Information System (INIS)
In a tokamak-type DT nuclear fusion reactor, there are various slits and ducts in the blanket and the vacuum vessel. Helium production at the reweld locations of the blanket and the vacuum vessel, and the nuclear properties of the superconducting TF coil, e.g. the nuclear heating rate in the coil winding pack, are enhanced by radiation streaming through these slits and ducts, and are critical concerns in the shielding design. The decay gamma-ray dose rate around ducts penetrating the blanket and the vacuum vessel is also enhanced by radiation streaming through the ducts, and is a critical concern from the viewpoint of human access to the cryostat during maintenance. Evaluating these nuclear properties with good accuracy requires three-dimensional Monte Carlo calculation, which demands long computation times. Therefore, the development of an effective, simple design evaluation method for radiation streaming is of substantial importance. This study aims to establish a systematic evaluation method for the nuclear properties of the blanket, the vacuum vessel, and the toroidal field (TF) coil, taking into account radiation streaming through various types of slits and ducts, based on three-dimensional Monte Carlo calculation using the MCNP code, and for the decay gamma-ray dose rates around the penetrating ducts. The thesis covers three topics in five chapters, as follows. 1) In Chapter 2, results calculated by the Monte Carlo code MCNP are compared with those from the Sn code DOT3.5 for radiation streaming in a tokamak-type fusion reactor, to validate the Sn calculations. From this comparison, the uncertainties of the Sn results arising from the ray effect and from the geometric approximations are investigated, to determine whether two-dimensional Sn calculation can be applied instead of Monte Carlo calculation. Through the study, it can be concluded that the
Yeh, Chi-Yuan; Tung, Chuan-Jung; Chao, Tsi-Chain; Lin, Mu-Han; Lee, Chung-Chi
2014-11-01
The purpose of this study was to examine the dose distribution in a skull base tumor and surrounding critical structures in response to high-dose intensity-modulated radiosurgery (IMRS), with Monte Carlo (MC) simulation using a dual-resolution sandwich phantom. The measurement-based Monte Carlo (MBMC) method (Lin et al., 2009) was adopted for the study. The major components of the MBMC technique are (1) the BEAMnrc code for beam transport through the treatment head of a Varian 21EX linear accelerator, (2) the DOSXYZnrc code for patient dose simulation, and (3) an EPID-measured efficiency map which describes the non-uniform fluence distribution of the IMRS treatment beam. For the simulated case, five isocentric 6 MV photon beams were designed to deliver a total dose of 1200 cGy in two fractions to the skull base tumor. A sandwich phantom for the MBMC simulation was created based on the patient's CT scan of a skull base