Proton Upset Monte Carlo Simulation
O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.
2009-01-01
The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) caused by proton bombardment, based on the results of heavy-ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (>200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled, and the nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.
GPU-based fast Monte Carlo dose calculation for proton therapy.
Jia, Xun; Schümann, Jan; Paganetti, Harald; Jiang, Steve B
2012-12-07
Accurate radiation dose calculation is essential for successful proton radiotherapy. Monte Carlo (MC) simulation is considered to be the most accurate method. However, the long computation time limits it from routine clinical applications. Recently, graphics processing units (GPUs) have been widely used to accelerate computationally intensive tasks in radiotherapy. We have developed a fast MC dose calculation package, gPMC, for proton dose calculation on a GPU. In gPMC, proton transport is modeled by the class II condensed history simulation scheme with a continuous slowing down approximation. Ionization, elastic and inelastic proton nucleus interactions are considered. Energy straggling and multiple scattering are modeled. Secondary electrons are not transported and their energies are locally deposited. After an inelastic nuclear interaction event, a variety of products are generated using an empirical model. Among them, charged nuclear fragments are terminated with energy locally deposited. Secondary protons are stored in a stack and transported after finishing transport of the primary protons, while secondary neutral particles are neglected. gPMC is implemented on the GPU under the CUDA platform. We have validated gPMC using the TOPAS/Geant4 MC code as the gold standard. For various cases including homogeneous and inhomogeneous phantoms as well as a patient case, good agreements between gPMC and TOPAS/Geant4 are observed. The gamma passing rate for the 2%/2 mm criterion is over 98.7% in the region with dose greater than 10% maximum dose in all cases, excluding low-density air regions. With gPMC it takes only 6-22 s to simulate 10 million source protons to achieve ∼1% relative statistical uncertainty, depending on the phantoms and energy. This is an extremely high efficiency compared to the computational time of tens of CPU hours for TOPAS/Geant4. Our fast GPU-based code can thus facilitate the routine use of MC dose calculation in proton therapy.
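The gamma passing rate quoted above (2%/2 mm criterion, over 98.7%) comes from a gamma index analysis comparing two dose distributions. As a rough illustration of how such a test works, here is a minimal 1-D sketch; clinical tools operate on 3-D grids, and the function and parameter names below are illustrative, not gPMC's API:

```python
import numpy as np

def gamma_pass_rate(ref, eval_, coords, dose_crit=0.02, dist_crit=2.0, threshold=0.10):
    """1-D global gamma analysis (simplified sketch of the 2%/2 mm test).

    ref, eval_ : dose arrays sampled on the same grid
    coords     : positions in mm
    """
    d_max = ref.max()
    dd = dose_crit * d_max              # global dose criterion (2% of max dose)
    gammas = []
    for x, d_ref in zip(coords, ref):
        if d_ref < threshold * d_max:   # skip the low-dose region (<10% of max)
            continue
        # gamma = minimum over evaluated points of the combined dose/distance metric
        g = np.sqrt(((eval_ - d_ref) / dd) ** 2 +
                    ((coords - x) / dist_crit) ** 2).min()
        gammas.append(g)
    return (np.array(gammas) <= 1.0).mean()
```

For two identical distributions every point has gamma 0, so the passing rate is 1.0; shifting or rescaling one distribution beyond the criteria pushes points above 1.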
A fast Monte Carlo code for proton transport in radiation therapy based on MCNPX.
Jabbari, Keyvan; Seuntjens, Jan
2014-07-01
An important requirement for proton therapy is software for dose calculation. Monte Carlo is the most accurate method for dose calculation, but it is very slow. In this work, a method is developed to improve the speed of dose calculation. The method is based on pre-generated tracks for particle transport, with the MCNPX code used to generate the tracks. A set of data including the particle track was produced for each material of interest (water, air, lung tissue, bone, and soft tissue). The code can transport protons over a wide range of energies (up to 200 MeV). The validity of the fast Monte Carlo (MC) code was evaluated using MCNPX as the reference code. While the analytical pencil-beam transport algorithm shows large errors (up to 10%) near small high-density heterogeneities, our dose calculations and isodose distributions deviated from the MCNPX results by less than 2%. In terms of speed, the code runs 200 times faster than MCNPX: the fast MC code developed in this work takes less than 2 minutes to calculate the dose for 10^6 particles on an Intel Core 2 Duo 2.66 GHz desktop computer.
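The pre-generated-track idea can be sketched as follows: a full MC engine (MCNPX here) is run offline once per material to store particle tracks, and the fast code then replays randomly chosen stored tracks at run time instead of sampling the physics anew. This is a minimal illustration of the concept, not the authors' implementation; the track format below is an assumption:

```python
import random

# Each stored track is a list of (step_length_cm, energy_deposit_MeV) pairs,
# produced offline by a full MC engine for one material (hypothetical format).

def deposit_along_track(track, entry_depth, dose_grid, voxel_cm):
    """Replay one stored track, binning its energy deposits into a 1-D grid."""
    depth = entry_depth
    for step, edep in track:
        depth += step
        idx = int(depth / voxel_cm)
        if 0 <= idx < len(dose_grid):
            dose_grid[idx] += edep

def fast_mc(tracks, n_histories, n_voxels=300, voxel_cm=0.1, seed=1):
    """Run-time cost per history is only a lookup plus replay,
    not full physics sampling."""
    rng = random.Random(seed)
    dose = [0.0] * n_voxels
    for _ in range(n_histories):
        deposit_along_track(rng.choice(tracks), 0.0, dose, voxel_cm)
    return dose
```

The speedup reported in the abstract comes precisely from moving the expensive interaction sampling into the offline track-generation phase.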
Qin, Nan; Botas, Pablo; Giantsoudi, Drosoula; Schuemann, Jan; Tian, Zhen; Jiang, Steve B.; Paganetti, Harald; Jia, Xun
2016-10-01
Monte Carlo (MC) simulation is commonly considered the most accurate dose calculation method for proton therapy. Aiming at achieving fast MC dose calculations for clinical applications, we have previously developed a graphics processing unit (GPU)-based MC tool, gPMC. In this paper, we report our recent updates to gPMC in terms of its accuracy, portability, and functionality, as well as comprehensive tests of this tool. The new version, gPMC v2.0, was developed under the OpenCL environment to enable portability across different computational platforms. Physics models of nuclear interactions were refined to improve calculation accuracy. Scoring functions of gPMC were expanded to enable tallying particle fluence, dose deposited by different particle types, and dose-averaged linear energy transfer (LETd). A multiple-counter approach was employed to improve efficiency by reducing the frequency of memory-writing conflicts at scoring. For dose calculation, accuracy improvements over gPMC v1.0 were observed in both water phantom cases and a patient case. For a prostate cancer case planned using high-energy proton beams, dose discrepancies in the beam entrance and target region seen in gPMC v1.0 with respect to the gold-standard tool for proton Monte Carlo simulations (TOPAS) were substantially reduced, and the gamma test passing rate (1%/1 mm) was improved from 82.7% to 93.1%. The average relative difference in LETd between gPMC and TOPAS was 1.7%. The average relative differences in the dose deposited by primary, secondary, and other heavier particles were within 2.3%, 0.4%, and 0.2%, respectively. Depending on source proton energy and phantom complexity, it took 8-17 s on an AMD Radeon R9 290x GPU to simulate 10^7 source protons, achieving less than 1% average statistical uncertainty. As the beam size was reduced from 10 × 10 cm² to 1 × 1 cm², the time spent on scoring increased by only 4.8% with eight counters, in contrast to a 40% increase using only
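The multiple-counter approach mentioned above can be pictured as replicating the scoring array so that concurrent writes are spread over several copies and merged at the end. Here is a CPU-side Python sketch of the idea; on the GPU the per-replica writes would be atomic adds, and the names are ours, not gPMC's:

```python
import numpy as np

def score_with_counters(voxel_ids, edeps, n_voxels, n_counters=8):
    """Sketch of multiple-counter scoring: replicate the dose grid
    n_counters times, spread concurrent writes across the replicas
    (here round-robin by particle index), and reduce at the end.
    On a GPU this lowers the rate of atomic-add collisions when many
    particles hit the same voxel, e.g. for a narrow 1 x 1 cm2 beam.
    """
    counters = np.zeros((n_counters, n_voxels))
    for i, (v, e) in enumerate(zip(voxel_ids, edeps)):
        counters[i % n_counters, v] += e   # stands in for atomicAdd on replica i % n
    return counters.sum(axis=0)            # final reduction to a single dose array
```

The trade-off is memory (eight copies of the grid) against write contention, which explains why the scoring time grows so little when the beam shrinks.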
A Monte Carlo-based treatment planning tool for proton therapy
Mairani, A.; Böhlen, T. T.; Schiavi, A.; Tessonnier, T.; Molinelli, S.; Brons, S.; Battistoni, G.; Parodi, K.; Patera, V.
2013-04-01
In the field of radiotherapy, Monte Carlo (MC) particle transport calculations are recognized for their superior accuracy in predicting dose and fluence distributions in patient geometries compared to analytical algorithms, which are generally used for treatment planning because of their shorter execution times. In this work, a newly developed MC-based treatment planning (MCTP) tool for proton therapy is proposed to support treatment planning studies and research applications. It allows for single-field and simultaneous multiple-field optimization in realistic treatment scenarios and is based on the MC code FLUKA. Relative biological effectiveness (RBE)-weighted dose is optimized either with the common approach of a constant RBE of 1.1 or with a variable RBE according to radiobiological input tables. A validated reimplementation of the local effect model was used in this work to generate the radiobiological input tables. Examples of treatment plans in water phantoms and in patient CT geometries, together with an experimental dosimetric validation of the plans, are presented for clinical treatment parameters as used at the Italian National Center for Oncological Hadron Therapy. In conclusion, a versatile MCTP tool for proton therapy was developed and validated for realistic patient treatment scenarios against dosimetric measurements and commercial analytical TP calculations. It is intended to be used in the future for research and to support treatment planning at state-of-the-art ion beam therapy facilities.
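The two RBE options described, a constant RBE of 1.1 versus a variable RBE read from radiobiological input tables, can be sketched directly. The table values below are placeholders, not the authors' local-effect-model output:

```python
from bisect import bisect_left

def rbe_weighted_dose(physical_dose, rbe=1.1):
    """Constant-RBE weighting: RBE-weighted dose = 1.1 x physical dose."""
    return [d * rbe for d in physical_dose]

def variable_rbe(let_d, table):
    """Piecewise-linear lookup of RBE versus dose-averaged LET, standing in
    for a radiobiological input table (hypothetical numbers; a real table
    would come from a model such as the local effect model)."""
    xs, ys = zip(*table)
    i = min(max(bisect_left(xs, let_d), 1), len(xs) - 1)
    t = (let_d - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])
```

In an MCTP optimizer the second form replaces the fixed 1.1 factor with a value that varies along the beam path.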
Dosimetric investigation of proton therapy on CT-based patient data using Monte Carlo simulation
Chongsan, T.; Liamsuwan, T.; Tangboonduangjit, P.
2016-03-01
The aim of radiotherapy is to deliver a high radiation dose to the tumor with a low radiation dose to healthy tissues. Protons exhibit a Bragg peak that delivers a high dose to the tumor but a low exit dose, or dose tail. Proton therapy is therefore promising for treating deep-seated tumors and tumors located close to organs at risk. Moreover, the physical characteristics of protons are suitable for treating cancer in pediatric patients. This work developed a computational platform for calculating proton dose distributions using the Monte Carlo (MC) technique and the patient's anatomical data. The studied case is a pediatric patient with a primary brain tumor. PHITS is used for the MC simulation, so patient-specific CT-DICOM files were converted to the PHITS input format. A MATLAB optimization program was developed to create a beam delivery control file for this study. The optimization program requires the proton beam data. All these data were calculated in this work using analytical formulas, and the calculation accuracy was tested before the beam delivery control file was used for the MC simulation. This study will be useful for researchers who aim to investigate proton dose distributions in patients but do not have access to proton therapy machines.
Tseung, H Wan Chan; Beltran, C
2014-01-01
Purpose: Very fast Monte Carlo (MC) simulations of proton transport have been implemented recently on GPUs. However, these usually use simplified models for non-elastic (NE) proton-nucleus interactions. Our primary goal is to build a GPU-based proton transport MC with detailed modeling of elastic and NE collisions. Methods: Using CUDA, we implemented GPU kernels for these tasks: (1) Simulation of spots from our scanning nozzle configurations, (2) Proton propagation through CT geometry, considering nuclear elastic scattering, multiple scattering, and energy loss straggling, (3) Modeling of the intranuclear cascade stage of NE interactions, (4) Nuclear evaporation simulation, and (5) Statistical error estimates on the dose. To validate our MC, we performed: (1) Secondary particle yield calculations in NE collisions, (2) Dose calculations in homogeneous phantoms, (3) Re-calculations of head and neck plans from a commercial treatment planning system (TPS), and compared with Geant4.9.6p2/TOPAS. Results: Yields, en...
Li, Yongbao; Tian, Zhen; Song, Ting; Wu, Zhaoxia; Liu, Yaqiang; Jiang, Steve; Jia, Xun
2017-01-01
Monte Carlo (MC)-based spot dose calculation is highly desired for inverse treatment planning in proton therapy because of its accuracy. Recent studies on biological optimization have also indicated the use of MC methods to compute relevant quantities of interest, e.g. linear energy transfer. Although GPU-based MC engines have been developed to address inverse optimization problems, their efficiency still needs to be improved. Also, the use of a large number of GPUs in MC calculation is not favorable for clinical applications. The previously proposed adaptive particle sampling (APS) method can improve the efficiency of MC-based inverse optimization by using the computationally expensive MC simulation more effectively. This method is more efficient than the conventional approach that performs spot dose calculation and optimization in two sequential steps. In this paper, we propose a computational library to perform MC-based spot dose calculation on GPU with the APS scheme. The implemented APS method performs a non-uniform sampling of the particles from pencil beam spots during the optimization process, favoring those from the high intensity spots. The library also conducts two computationally intensive matrix-vector operations frequently used when solving an optimization problem. This library design allows a streamlined integration of the MC-based spot dose calculation into an existing proton therapy inverse planning process. We tested the developed library in a typical inverse optimization system with four patient cases. The library achieved the targeted functions by supporting inverse planning in various proton therapy schemes, e.g. single field uniform dose, 3D intensity modulated proton therapy, and distal edge tracking. The efficiency was 41.6 ± 15.3% higher than the use of a GPU-based MC package in a conventional calculation scheme. The total computation time ranged between 2 and 50 min on a single GPU card depending on the problem size.
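The core of the APS scheme, as we read it from the abstract, is to draw MC histories non-uniformly across pencil-beam spots in proportion to their current intensities, so the high-weight spots receive most of the statistics during optimization. A minimal sketch of that sampling step (function and parameter names are ours):

```python
import random

def sample_spots(intensities, n_particles, seed=0):
    """Allocate MC histories to pencil-beam spots in proportion to their
    current optimization weights (sketch of non-uniform spot sampling)."""
    rng = random.Random(seed)
    total = sum(intensities)
    weights = [w / total for w in intensities]
    counts = [0] * len(intensities)
    for _ in range(n_particles):
        counts[rng.choices(range(len(weights)), weights=weights)[0]] += 1
    return counts
```

As the optimizer updates the spot intensities, the sampling weights are refreshed, which is what makes the expensive MC budget track the spots that actually matter for the plan.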
Energy Technology Data Exchange (ETDEWEB)
Wuerl, Matthias
2016-08-01
Matthias Wuerl presents two essential steps toward implementing offline PET monitoring of proton dose delivery at a clinical facility: setting up an accurate Monte Carlo model of the clinical beamline, and experimentally validating positron-emitter production cross sections. In the first part, the field-size dependence of the dose output is described for scanned proton beams. Both the Monte Carlo model and an analytical computational beam model were able to accurately predict the target dose, while the latter tended to overestimate the dose in normal tissue. In the second part, the author presents PET measurements of different phantom materials that were activated by the proton beam. The results indicate that for irradiations with a high number of protons, chosen for the sake of good statistics, dead-time losses of the PET scanner may become important and lead to an underestimation of positron-emitter production yields.
Kohno, R; Hotta, K; Nishioka, S; Matsubara, K; Tansho, R; Suzuki, T
2011-11-21
We implemented the simplified Monte Carlo (SMC) method on a graphics processing unit (GPU) under the Compute Unified Device Architecture (CUDA) platform developed by NVIDIA. The GPU-based SMC was clinically applied to four patients with head and neck, lung, or prostate cancer. The results were compared to those obtained by a traditional CPU-based SMC with respect to computation time and discrepancy. In both the CPU- and GPU-based SMC calculations, the estimated mean statistical errors of the calculated doses in the planning target volume region were within 0.5% rms. The dose distributions calculated by the GPU- and CPU-based SMCs agreed within statistical errors. The GPU-based SMC was 12.30-16.00 times faster than the CPU-based SMC, and its computation time per beam arrangement for the clinical cases ranged from 9 to 67 s. These results demonstrate the successful application of the GPU-based SMC to clinical proton treatment planning.
Validation of a GPU-based Monte Carlo code (gPMC) for proton radiation therapy: clinical cases study
Giantsoudi, Drosoula; Schuemann, Jan; Jia, Xun; Dowdell, Stephen; Jiang, Steve; Paganetti, Harald
2015-03-01
Monte Carlo (MC) methods are recognized as the gold standard for dose calculation; however, they have not yet replaced analytical methods owing to their lengthy calculation times. GPU-based applications allow MC dose calculations to be performed on time scales comparable to those of conventional analytical algorithms. This study focuses on validating our GPU-based MC code for proton dose calculation (gPMC) against an experimentally validated multi-purpose MC code (TOPAS) and comparing their performance for clinical patient cases. Clinical cases from five treatment sites were selected, covering the full range from very homogeneous patient geometries (liver) to patients with high geometrical complexity (air cavities and density heterogeneities in head-and-neck and lung patients), and from short beam range (breast) to large beam range (prostate). Both gPMC and TOPAS were used to calculate 3D dose distributions for all patients. Comparisons were performed based on target coverage indices (mean dose, V95, D98, D50, D02) and gamma index distributions. Dosimetric indices differed by less than 2% between TOPAS and gPMC dose distributions for most cases. Gamma index analysis with a 1%/1 mm criterion resulted in a passing rate of more than 94% of all patient voxels receiving more than 10% of the mean target dose, for all patients except the prostate cases. Although clinically insignificant, gPMC systematically underestimated the target dose for prostate cases by 1-2% compared to TOPAS. Correspondingly, the gamma index analysis with the 1%/1 mm criterion failed for most beams for this site, while for the 2%/1 mm criterion passing rates of more than 94.6% of all patient voxels were observed. For the same initial number of simulated particles, the calculation time for a single beam of a typical head-and-neck patient plan decreased from 4 CPU hours per million particles (2.8-2.9 GHz Intel X5600) for TOPAS to 2.4 s per million particles (NVIDIA TESLA C2075) for gPMC. Excellent agreement was
Energy Technology Data Exchange (ETDEWEB)
Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)
2014-10-01
Although three general-purpose Monte Carlo (MC) simulation tools, Geant4, FLUKA, and PHITS, have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. In order to achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed with simple systems such as a bare water phantom. Since particle beams undergo transport, interaction, and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influence of the customizable parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter list showed characteristics different from the results obtained with the simple system. This led to the conclusion that the physical models, particle transport mechanics, and geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.
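Proton-range comparisons like those in this study are typically made by reading a range metric off the PDD curve, for example the depth where the distal falloff crosses 80% of the peak dose (R80). A simple sketch assuming linearly interpolated samples; the exact range definition the authors used may differ:

```python
def distal_range(depths, pdd, level=0.8):
    """Estimate the proton range from a percentage-depth-dose curve as the
    depth on the distal falloff where the dose crosses `level` x peak (R80
    by default), using linear interpolation between bracketing samples."""
    peak = max(pdd)
    i_peak = pdd.index(peak)
    target = level * peak
    for i in range(i_peak, len(pdd) - 1):
        if pdd[i] >= target > pdd[i + 1]:
            # interpolate within the bracketing interval
            f = (pdd[i] - target) / (pdd[i] - pdd[i + 1])
            return depths[i] + f * (depths[i + 1] - depths[i])
    return None
```

Comparing R80 values between GATE/PHITS and the FLUKA reference gives a single scalar per parameter set, which is what makes the parameter optimization tractable.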
Benchmarking of Proton Transport in Super Monte Carlo Simulation Program
Wang, Yongfeng; Li, Gui; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Wu, Yican
2014-06-01
The Monte Carlo (MC) method has traditionally been applied in nuclear design and analysis owing to its capability of dealing with complicated geometries and multi-dimensional physics problems while obtaining accurate results. The Super Monte Carlo Simulation Program (SuperMC) is developed by the FDS Team in China for fusion, fission, and other nuclear applications. Simulations of radiation transport, isotope burn-up, material activation, radiation dose, and biological damage can be performed with SuperMC. Complicated geometries and the whole physical process of various types of particles over a broad energy scale are well handled. Bi-directional automatic conversion between general CAD models and fully formed SuperMC input files is supported by MCAM, a CAD/image-based automatic modeling program for neutronics and radiation transport simulation. Mixed visualization of dynamic 3D datasets and geometry models is supported by RVIS, a nuclear radiation virtual simulation and assessment system. Continuous-energy cross-section data from the hybrid evaluated nuclear data library HENDL are used to support the simulations. Neutronics fixed-source and criticality calculations for reactors of complex geometry and material distribution, based on the transport of neutrons and photons, were achieved in the former version of SuperMC. Recently, proton transport has also been integrated into SuperMC for energies up to 10 GeV. The physical processes considered for proton transport include electromagnetic and hadronic processes. The electromagnetic processes include ionization, multiple scattering, bremsstrahlung, and pair production; public evaluated data from HENDL are used in some of these processes. In hadronic physics, the Bertini intranuclear cascade model with excitons, a preequilibrium model, a nucleus explosion model, a fission model, and an evaporation model are incorporated to treat the intermediate energy nuclear
Directory of Open Access Journals (Sweden)
Biniam Yohannes Tesfamicael
2014-03-01
Purpose: To construct a dose monitoring system based on an endorectal balloon coupled to thin scintillating fibers to study the dose to the rectum in proton therapy of prostate cancer. Method: The Geant4 Monte Carlo toolkit was used to simulate proton therapy of prostate cancer, with an endorectal balloon and a set of scintillating fibers for immobilization and dosimetry measurements, respectively. Results: A linear response of the fibers to the delivered dose was observed to within less than 2%. The results show that fibers close to the prostate recorded a higher dose, with the closest fiber recording about one third of the dose to the target. A 1/r² decrease (r is defined as the center-to-center distance between the prostate and the fibers) was observed toward the frontal and distal regions. A very low dose was recorded by the fibers beneath the balloon, a clear indication that the overall volume of the rectal wall exposed to a higher dose is relatively small. Further analysis showed a relatively linear relationship between the dose to the target and the dose to the top fibers (17 in total), with a slope of (-0.07 ± 0.07) at a large number of events per degree of rotation of the modulator wheel (i.e., dose). Conclusion: Thin (1 mm × 1 mm), long (1 m) scintillating fibers were found to be ideal for real-time in vivo dose measurement to the rectum during proton therapy of prostate cancer. The linear response of the fibers to the delivered dose makes them good candidates as dosimeters. With thorough calibration, and with the ability to establish a good correlation between the dose to the target and the dose to the fibers, such dosimeters can be used for real-time dose verification to the target.
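The reported ~1/r² falloff of fiber dose with center-to-center distance implies a simple scaling check: given one measured fiber dose at a reference distance, the expected dose at another distance follows directly. A one-line sketch (illustrative only, not the authors' analysis code):

```python
def inverse_square_dose(d_ref, r_ref, r):
    """Expected relative dose at center-to-center distance r, given a
    measured dose d_ref at reference distance r_ref and assuming the
    ~1/r^2 falloff observed in the fiber measurements."""
    return d_ref * (r_ref / r) ** 2
```

Deviations of measured fiber readings from this prediction would flag positioning errors or scatter contributions not captured by the simple geometric model.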
Tseung, H Wan Chan; Kreofsky, C R; Ma, D; Beltran, C
2016-01-01
Purpose: To demonstrate the feasibility of fast Monte Carlo (MC)-based inverse biological planning for the treatment of head and neck tumors in spot-scanning proton therapy. Methods: Recently, a fast and accurate graphics processing unit (GPU)-based MC simulation of proton transport was developed and used as the dose calculation engine in a GPU-accelerated IMPT optimizer. Besides dose, the dose-averaged linear energy transfer (LETd) can be scored simultaneously, which makes biological dose (BD) optimization possible. To convert from LETd to BD, a linear relation was assumed. Using this novel optimizer, inverse biological planning was applied to 4 patients: 2 small and 1 large thyroid tumor targets, and 1 glioma case. To create these plans, constraints were placed to maintain the physical dose (PD) within 1.25 times the prescription while maximizing the target BD. For comparison, conventional IMRT and IMPT plans were created for each case in Eclipse (Varian, Inc.). The same critical structure PD constraints were use...
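The linear LETd-to-BD relation and the physical-dose cap described above can be written down directly. Both the functional form BD = PD·(1 + c·LETd) and the coefficient below are our reading of the abstract and a placeholder value, not the authors' calibrated model:

```python
def biological_dose(physical_dose, let_d, c=0.04):
    """Linear LETd-to-biological-dose conversion: BD = PD * (1 + c * LETd).
    The coefficient c is a hypothetical placeholder."""
    return [pd * (1.0 + c * l) for pd, l in zip(physical_dose, let_d)]

def satisfies_pd_constraint(physical_dose, prescription, cap=1.25):
    """Planning constraint from the abstract: keep the physical dose
    within 1.25 times the prescription everywhere in the target."""
    return all(pd <= cap * prescription for pd in physical_dose)
```

In the optimizer, the objective maximizes target BD while this constraint keeps the underlying physical dose clinically acceptable.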
Monte Carlo calculations of positron emitter yields in proton radiotherapy.
Seravalli, E; Robert, C; Bauer, J; Stichelbaut, F; Kurz, C; Smeets, J; Van Ngoc Ty, C; Schaart, D R; Buvat, I; Parodi, K; Verhaegen, F
2012-03-21
Positron emission tomography (PET) is a promising tool for monitoring the three-dimensional dose distribution in charged particle radiotherapy. PET imaging during or shortly after proton treatment is based on the detection of annihilation photons following the β(+)-decay of radionuclides resulting from nuclear reactions in the irradiated tissue. Therapy monitoring is achieved by comparing the measured spatial distribution of irradiation-induced β(+)-activity with the predicted distribution based on the treatment plan. The accuracy of the calculated distribution depends on the correctness of the computational models, implemented in the employed Monte Carlo (MC) codes, that describe the interactions of the charged particle beam with matter and the production of radionuclides and secondary particles. However, no well-established theoretical models exist for predicting the nuclear interactions and so phenomenological models are typically used based on parameters derived from experimental data. Unfortunately, the experimental data presently available are insufficient to validate such phenomenological hadronic interaction models. Hence, a comparison among the models used by the different MC packages is desirable. In this work, starting from a common geometry, we compare the performances of MCNPX, GATE and PHITS MC codes in predicting the amount and spatial distribution of proton-induced activity, at therapeutic energies, to the already experimentally validated PET modelling based on the FLUKA MC code. In particular, we show how the amount of β(+)-emitters produced in tissue-like media depends on the physics model and cross-sectional data used to describe the proton nuclear interactions, thus calling for future experimental campaigns aiming at supporting improvements of MC modelling for clinical application of PET monitoring.
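At the crudest level, the β(+)-emitter yield along depth is a product of proton fluence, reaction cross section, and target nuclide density, which is why the predicted yields are so sensitive to the cross-section data discussed above. A toy depth profile (all numbers and the exponential fluence loss are illustrative; real codes fold in energy-dependent cross sections along the slowing-down path):

```python
import math

def emitter_yield_depth(fluence0, sigma_cm2, n_density, depths_cm, atten_cm=25.0):
    """Toy depth profile of beta+ emitter production per cm:
    yield(z) = fluence(z) * cross-section * target nuclide density,
    with the proton fluence decaying exponentially as the beam loses
    particles to nuclear reactions (illustrative attenuation length)."""
    return [fluence0 * math.exp(-z / atten_cm) * sigma_cm2 * n_density
            for z in depths_cm]
```

Swapping in a different cross-section value rescales the whole activity profile, which mirrors the inter-code differences the study reports.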
A generic algorithm for Monte Carlo simulation of proton transport
Energy Technology Data Exchange (ETDEWEB)
Salvat, Francesc, E-mail: francesc.salvat@ub.edu
2013-12-01
A mixed (class II) algorithm for Monte Carlo simulation of the transport of protons, and other heavy charged particles, in matter is presented. The emphasis is on the electromagnetic interactions (elastic and inelastic collisions) which are simulated using strategies similar to those employed in the electron–photon code PENELOPE. Elastic collisions are described in terms of numerical differential cross sections (DCSs) in the center-of-mass frame, calculated from the eikonal approximation with the Dirac–Hartree–Fock–Slater atomic potential. The polar scattering angle is sampled by employing an adaptive numerical algorithm which allows control of interpolation errors. The energy transferred to the recoiling target atoms (nuclear stopping) is consistently described by transformation to the laboratory frame. Inelastic collisions are simulated from DCSs based on the plane–wave Born approximation (PWBA), making use of the Sternheimer–Liljequist model of the generalized oscillator strength, with parameters adjusted to reproduce (1) the electronic stopping power read from the input file, and (2) the total cross sections for impact ionization of inner subshells. The latter were calculated from the PWBA including screening and Coulomb corrections. This approach provides quite a realistic description of the energy-loss distribution in single collisions, and of the emission of X-rays induced by proton impact. The simulation algorithm can be readily modified to include nuclear reactions, when the corresponding cross sections and emission probabilities are available, and bremsstrahlung emission.
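The essence of a class II (mixed) scheme is that "hard" interactions are sampled individually from a mean free path, while "soft" energy losses are applied continuously along the flight. A bare-bones sketch of one transport step under that scheme; the constant mean free path and stopping power are illustrative, whereas PENELOPE-style codes use energy-dependent tables and also sample angular deflections:

```python
import math
import random

def transport_step(energy_mev, rng, hard_mfp_cm=1.0, stopping_mev_per_cm=5.0):
    """One step of a class II (mixed) scheme: sample the free flight to the
    next *hard* collision from an exponential distribution with the hard-event
    mean free path, and apply continuous slowing down (soft energy loss)
    along the flight. Returns (path_length_cm, energy_after_flight_MeV)."""
    s = -hard_mfp_cm * math.log(rng.random())          # distance to hard event
    e_after = max(0.0, energy_mev - stopping_mev_per_cm * s)
    return s, e_after
```

A full simulation would then sample the hard collision itself (elastic or inelastic, from the tabulated DCSs) at the end of each flight and loop until the proton stops or leaves the geometry.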
Proton therapy Monte Carlo SRNA-VOX code
Directory of Open Access Journals (Sweden)
Ilić Radovan D.
2012-01-01
The most powerful feature of the Monte Carlo method is the possibility of simulating all individual particle interactions in three dimensions and performing numerical experiments with a preset error. These facts were the motivation behind the development of the general-purpose Monte Carlo SRNA program for proton transport simulation in technical systems described by standard geometrical forms (plane, sphere, cone, cylinder, cube). Some of the possible applications of the SRNA program are: (a) a general code for proton transport modeling, (b) design of accelerator-driven systems, (c) simulation of proton scattering and degrading shapes and composition, (d) research on proton detectors; and (e) radiation protection at accelerator installations. This wide range of possible applications of the program demands the development of various versions of SRNA-VOX codes for proton transport modeling in voxelized geometries and has, finally, resulted in the ISTAR package for the calculation of deposited energy distribution in patients on the basis of CT data in radiotherapy. All of the said codes are capable of using 3-D proton sources with an arbitrary energy spectrum in an interval of 100 keV to 250 MeV.
Energy Technology Data Exchange (ETDEWEB)
Kurosu, Keita [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiology, Osaka University Hospital, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, St. Jude Children’s Research Hospital, Memphis, TN 38105 (United States)
2016-01-15
Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for highly precise particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters in the GATE, PHITS, and FLUKA MC simulation codes, previously examined for a uniform scanning proton beam, needs to be evaluated for spot scanning. This means that the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and the PHITS codes by using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and computational time minimization. Our results indicate that the optimized parameters are different from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently since the impact of the parameter settings depends on the proton irradiation
Energy Technology Data Exchange (ETDEWEB)
Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics & Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)
2014-06-01
Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; they are studied here for PDD and proton range, compared against the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physical and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to the calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDD results obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physical model, particle transport mechanics, and the different geometry-based descriptions need accurate customization in the three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health
Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries
Directory of Open Access Journals (Sweden)
Ilić Radovan D.
2002-01-01
This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of the proton beam characterization by means of the Multi-Layer Faraday Cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice.
Energy Technology Data Exchange (ETDEWEB)
Tesfamicael, B; Gueye, P; Lyons, D [Hampton University, Hampton, VA (United States); Mahesh, M [Johns Hopkins Univ, Baltimore, MD (United States); Avery, S [University of Pennsylvania, Sicklerville, NJ (United States)
2014-06-01
Purpose: To construct a dose monitoring system based on an endorectal balloon coupled to thin scintillating fibers to study the dose delivered to the rectum during prostate cancer proton therapy. Methods: The Geant4 Monte Carlo toolkit version 9.6p02 was used to simulate prostate cancer proton therapy treatments of an endorectal balloon (for immobilization of a 2.9 cm diameter prostate gland) and a set of 34 scintillating fibers symmetrically placed around the balloon and perpendicular to the proton beam direction (for dosimetry measurements). Results: A linear response of the fibers to the dose delivered was observed to within 2%, a property that makes them good candidates for real-time dosimetry. Results obtained show that the closest fiber recorded about 1/3 of the dose to the target, with a 1/r² decrease in the dose distribution as one goes toward the frontal and distal top fibers. A very low dose was recorded by the bottom fibers (about 45 times lower), which is a clear indication that the overall volume of the rectal wall that is exposed to a higher dose is relatively minimized. Further analysis indicated a simple scaling relationship between the dose to the prostate and the dose to the top fibers (a linear fit gave a slope of −0.07±0.07 MeV per treatment Gy). Conclusion: Thin (1 mm × 1 mm × 100 cm) long scintillating fibers were found to be ideal for real-time in vivo dose measurement to the rectum during prostate cancer proton therapy. The linear response of the fibers to the dose delivered makes them good candidates as dosimeters. With thorough calibration and the ability to define a good correlation between the dose to the target and the dose to the fibers, such dosimeters can be used for real-time dose verification to the target.
Monte Carlo calculations supporting patient plan verification in proton therapy
Directory of Open Access Journals (Sweden)
Thiago Viana Miranda Lima
2016-03-01
Patient treatment plan verification consumes a substantial amount of the quality assurance (QA) resources; this is especially true for Intensity Modulated Proton Therapy (IMPT). The use of Monte Carlo (MC) simulations in supporting QA has been widely discussed, and several methods have been proposed. In this paper we studied an alternative approach to the one currently applied clinically at Centro Nazionale di Adroterapia Oncologica (CNAO). We reanalysed previously published data (Molinelli et al. 2013), where 9 patient plans were investigated in which the warning QA threshold of 3% mean dose deviation was crossed. The possibility that these differences between measured and calculated dose were related to dose modelling (Treatment Planning System (TPS) vs MC), limitations of the dose delivery system, or detector mispositioning was originally explored, but other factors such as the geometric description of the detectors were not ruled out. For the purpose of this work we compared ionisation-chamber measurements with the results of different MC simulations. We also studied some physical effects introduced by this new approach, for example inter-detector interference and the delta-ray threshold. The simulations accounting for a detailed geometry are typically superior (statistical difference, p-value around 0.01) to most of the MC simulations used at CNAO (inferior only to the shift approach used). No real improvement was observed in reducing the current delta-ray threshold used (100 keV), and no significant interference between ion chambers in the phantom was detected (p-value 0.81). In conclusion, it was observed that the detailed geometrical description improves the agreement between measurements and MC calculations in some cases, but in other cases position uncertainty represents the dominant uncertainty. The inter-chamber disturbance was not detected for therapeutic proton energies, and the results from the current delta threshold are
Monte Carlo simulations of soft proton flares: testing the physics with XMM-Newton
Fioretti, Valentina; Bulgarelli, Andrea; Malaguti, Giuseppe; Spiga, Daniele; Tiengo, Andrea
2016-07-01
, we conclude that an average spectrum, based on the analysis of a full season of soft proton events is required to compare Monte Carlo simulations with real events.
Srna-Monte Carlo codes for proton transport simulation in combined and voxelized geometries
Ilic, R D; Stankovic, S J
2002-01-01
This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtaine...
Monte Carlo simulation in proton computed tomography: a study of image reconstruction technique
Energy Technology Data Exchange (ETDEWEB)
Inocente, Guilherme Franco; Stenico, Gabriela V.; Hormaza, Joel Mesa [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Inst. de Biociencias. Dept. de Fisica e Biofisica
2012-07-01
Radiation is one of the most widely used methods for cancer treatment. In this context, therapy with proton beams arises as an alternative to conventional radiotherapy. It is known that proton therapy offers advantages to the treated patient when compared with more conventional methods: the dose deposited along the path, especially in the healthy tissues neighboring the tumor, is smaller, and the accuracy of the treatment is much better. To carry out the treatment, the patient undergoes planning based on images for visualization and localization of the target volume. The main method for obtaining these images is X-ray computed tomography (XCT). For treatment with a proton beam, this imaging technique can generate some uncertainties. The purpose of this project is to study the feasibility of reconstructing images generated from irradiation with proton beams, thereby reducing some inaccuracies, since the imaging radiation will be of the same type as that used in treatment, and also drastically reducing some localization errors, since the planning can be done at the same place, and just before, where the patient is treated. This study aims to obtain a relationship between the intrinsic properties of the interaction of photons and of protons with matter. For this we use computational simulation based on the Monte Carlo method with the codes SRIM 2008 and MCNPX v.2.5.0, to reconstruct images using the technique used in conventional computed tomography. (author)
Yepes, Pablo; Randeniya, Sharmalee; Taddei, Phillip J; Newhauser, Wayne D
2009-01-07
The Monte Carlo method is used to provide accurate dose estimates in proton radiation therapy research. While it is more accurate than commonly used analytical dose calculations, it is computationally intense. The aim of this work was to characterize for a clinical setup the fast dose calculator (FDC), a Monte Carlo track-repeating algorithm based on GEANT4. FDC was developed to increase computation speed without diminishing dosimetric accuracy. The algorithm used a database of proton trajectories in water to calculate the dose of protons in heterogeneous media. The extrapolation from water to 41 materials was achieved by scaling the proton range and the scattering angles. The scaling parameters were obtained by comparing GEANT4 dose distributions with those calculated with FDC for homogeneous phantoms. The FDC algorithm was tested by comparing dose distributions in a voxelized prostate cancer patient as calculated with well-known Monte Carlo codes (GEANT4 and MCNPX). The track-repeating approach reduced the CPU time required for a complete dose calculation in a voxelized patient anatomy by more than two orders of magnitude, while on average reproducing the results from the Monte Carlo predictions within 2% in terms of dose and within 1 mm in terms of distance.
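The range- and angle-scaling at the heart of a track-repeating algorithm can be illustrated with a toy sketch. The stored step data and the per-material scaling factors below are invented for illustration; they are not FDC's database of water trajectories or its fitted parameters for 41 materials.

```python
import numpy as np

# One precomputed proton track in water: per-step path lengths (cm) and
# polar deflections (rad). Purely illustrative numbers.
water_steps = np.array([0.5, 0.5, 0.5, 0.5])
water_deflections = np.array([0.010, 0.012, 0.015, 0.020])

# Hypothetical scaling factors relative to water:
# (step-length scale ~ inverse relative stopping power,
#  angle scale ~ relative scattering strength)
MATERIALS = {
    "water": (1.00, 1.00),
    "bone":  (0.60, 1.30),
    "lung":  (4.00, 0.70),
}

def repeat_track(material):
    """Replay the stored water track in another medium by scaling
    each step length and each deflection angle."""
    step_scale, angle_scale = MATERIALS[material]
    return water_steps * step_scale, water_deflections * angle_scale

steps, deflections = repeat_track("bone")  # shorter steps, larger deflections
```

Because no new transport physics is computed per step, only two multiplications, this is how the replay beats full condensed-history simulation by orders of magnitude.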
Biegun, Aleksandra; Takatsu, Jun; Nakaji, Taku; van Goethem, Marc-Jan; van der Graaf, Emiel; Koffeman, E.; Visser, Jan; Brandenburg, Sijtze
2016-01-01
The novel proton radiography imaging technique has a large potential to be used in direct measurement of the proton energy loss (proton stopping power, PSP) in various tissues in the patient. The uncertainty of PSPs, currently obtained from translation of X-ray Computed Tomography (xCT) images, shou
ELRADGEN: Monte Carlo generator for radiative events in elastic electron-proton scattering
Afanasiev, A V; Ilyichev, A N; Niczyporuk, B B
2003-01-01
We discuss the theoretical approach and practical algorithms for simulation of radiative events in elastic ep-scattering. A new Monte Carlo generator for real photon emission events in the process of elastic electron-proton scattering is presented. We perform a few consistency checks and present numerical results.
Lai, Bo-Lun; Sheu, Rong-Jiun; Lin, Uei-Tyng
2015-05-01
Monte Carlo simulations are generally considered the most accurate method for complex accelerator shielding analysis. Simplified models based on point-source line-of-sight approximation are often preferable in practice because they are intuitive and easy to use. A set of shielding data, including source terms and attenuation lengths for several common targets (iron, graphite, tissue, and copper) and shielding materials (concrete, iron, and lead) were generated by performing Monte Carlo simulations for 100-300 MeV protons. Possible applications and a proper use of the data set were demonstrated through a practical case study, in which shielding analysis on a typical proton treatment room was conducted. A thorough and consistent comparison between the predictions of our point-source line-of-sight model and those obtained by Monte Carlo simulations for a 360° dose distribution around the room perimeter showed that the data set can yield fairly accurate or conservative estimates for the transmitted doses, except for those near the maze exit. In addition, this study demonstrated that appropriate coupling between the generated source term and empirical formulae for radiation streaming can be used to predict a reasonable dose distribution along the maze. This case study proved the effectiveness and advantage of applying the data set to a quick shielding design and dose evaluation for proton therapy accelerators.
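The point-source line-of-sight approximation underlying such shielding data reduces to a source term attenuated exponentially through the shield and diluted by inverse-square distance. A sketch, with placeholder numbers rather than the paper's fitted source terms and attenuation lengths:

```python
import math

def transmitted_dose_equivalent(H0, attenuation_length_cm, shield_cm, distance_m):
    """Point-source line-of-sight estimate: H = H0 * exp(-d/lambda) / r**2.
    H0 is the unshielded source term at 1 m for the given target, proton
    energy, and emission angle (hypothetical value here); d is the shield
    thickness along the line of sight; r the source-to-point distance."""
    return H0 * math.exp(-shield_cm / attenuation_length_cm) / distance_m ** 2

# Illustrative case: 2 m of concrete, assumed attenuation length 50 cm,
# scoring point 5 m from the target
h = transmitted_dose_equivalent(H0=1.0e-2, attenuation_length_cm=50.0,
                                shield_cm=200.0, distance_m=5.0)
```

Each additional attenuation length of shield cuts the transmitted dose by a factor of e, which is why the model is attractive for quick design iterations before a full Monte Carlo run.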
Biegun, A. K.; Takatsu, J.; Nakaji, T.; van Goethem, M. J.; van der Graaf, E. R.; Koffeman, E. N.; Visser, J.; Brandenburg, S.
2016-12-01
The novel proton radiography imaging technique has a large potential to be used in direct measurement of the proton energy loss (proton stopping power, PSP) in various tissues in the patient. The uncertainty of PSPs, currently obtained from translation of X-ray Computed Tomography (xCT) images, should be minimized from 3-5% or higher to less than 1%, to make treatment plans with proton beams more accurate and thereby provide better treatment for the patient. With Geant4 we simulated a proton radiography detection system with two position-sensitive and residual energy detectors. A complex phantom filled with various materials (including tissue surrogates) was placed between the position-sensitive detectors. The phantom was irradiated with 150 MeV protons, and the energy-loss radiograph and scattering angles were studied. Protons passing through different materials in the phantom lose energy, which was used to create a radiography image of the phantom. The multiple Coulomb scattering of a proton traversing different materials causes blurring of the image. To improve image quality and material identification in the phantom, we selected protons with small scattering angles. A good-quality proton radiography image, in which various materials can be recognized accurately, in combination with xCT can lead to more accurate relative stopping power predictions.
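The small-scattering-angle selection can be sketched with a toy event list. The angular and energy distributions below are invented stand-ins, not the Geant4 output of the study; the point is only that cutting on exit angle sharpens the energy-loss signal.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Toy proton-radiography events: exit scattering angle (rad) and residual
# energy (MeV). Large-angle protons are given a noisier energy signal to
# mimic multiple-Coulomb-scattering blur (illustrative model only).
angle = np.abs(rng.normal(0.0, 0.02, n))
energy = 60.0 - 200.0 * angle + rng.normal(0.0, 1.0, n)

# Keep only small-angle protons before forming the radiograph
selected = angle < 0.01
spread_all = energy.std()
spread_cut = energy[selected].std()  # sharper energy-loss distribution
```

The trade-off, as in the paper, is statistics: the tighter the angular cut, the fewer protons contribute per pixel.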
Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study
Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk
2014-12-01
Three alpha particles are emitted from the point of reaction between a proton and boron. The alpha particles are effective in inducing the death of a tumor cell. After boron is accumulated in the tumor region, a proton beam delivered from outside the body can react with the boron in the tumor region. The boron causes an increase in the proton's maximum dose level, so that only the tumor cells are damaged more critically. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. Here, we show that the effectiveness of proton boron fusion therapy was verified using Monte Carlo simulations. We found that a dramatic increase, by more than half, of the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was positively detected. This therapy method features advantages such as the application of the Bragg peak to the therapy, accurate targeting of the tumor, improved therapy effects, and monitoring of the treated region during treatment.
The proton therapy nozzles at Samsung Medical Center: A Monte Carlo simulation study using TOPAS
Chung, Kwangzoo; Kim, Jinsung; Kim, Dae-Hyun; Ahn, Sunghwan; Han, Youngyih
2015-07-01
To expedite the commissioning process of the proton therapy system at Samsung Medical Center (SMC), we have developed a Monte Carlo simulation model of the proton therapy nozzles by using TOol for PArticle Simulation (TOPAS). At the SMC proton therapy center, we have two gantry rooms with different types of nozzles: a multi-purpose nozzle and a dedicated scanning nozzle. Each nozzle has been modeled in detail following the geometry information provided by the manufacturer, Sumitomo Heavy Industries, Ltd. For this purpose, the novel features of TOPAS, such as the time feature or the ridge filter class, have been used, and the appropriate physics models for proton nozzle simulation have been defined. Dosimetric properties, like percent depth dose curve, spread-out Bragg peak (SOBP), and beam spot size, have been simulated and verified against measured beam data. Beyond the Monte Carlo nozzle modeling, we have developed an interface between TOPAS and the treatment planning system (TPS), RayStation. An exported radiotherapy (RT) plan from the TPS is interpreted by using an interface and is then translated into the TOPAS input text. The developed Monte Carlo nozzle model can be used to estimate the non-beam performance, such as the neutron background, of the nozzles. Furthermore, the nozzle model can be used to study the mechanical optimization of the design of the nozzle.
Comparison of some popular Monte Carlo solution for proton transportation within pCT problem
Energy Technology Data Exchange (ETDEWEB)
Evseev, Ivan; Assis, Joaquim T. de; Yevseyeva, Olga [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico], E-mail: evseev@iprj.uerj.br, E-mail: joaquim@iprj.uerj.br, E-mail: yevseyeva@iprj.uerj.br; Lopes, Ricardo T.; Cardoso, Jose J.B.; Silva, Ademir X. da [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE). Lab. de Instrumentacao Nuclear], E-mail: ricardo@lin.ufrj.br, E-mail: jjbrum@oi.com.br, E-mail: ademir@con.ufrj.br; Vinagre Filho, Ubirajara M. [Instituto de Engenharia Nuclear IEN/CNEN-RJ, Rio de Janeiro, RJ (Brazil)], E-mail: bira@ien.gov.br; Hormaza, Joel M. [UNESP, Botucatu, SP (Brazil). Inst. de Biociencias], E-mail: jmesa@ibb.unesp.br; Schelin, Hugo R.; Paschuk, Sergei A.; Setti, Joao A.P.; Milhoretto, Edney [Universidade Tecnologica Federal do Parana, Curitiba, PR (Brazil)], E-mail: schelin@cpgei.cefetpr.br, E-mail: sergei@utfpr.edu.br, E-mail: jsetti@gmail.com, E-mail: edneymilhoretto@yahoo.com
2007-07-01
The proton transport in matter is described by the Boltzmann kinetic equation for the proton flux density. This equation, however, does not have a general analytical solution. Some approximate analytical solutions have been developed under a number of significant simplifications. Alternatively, Monte Carlo simulations are widely used. The current work is devoted to the discussion of the proton energy spectra obtained by simulation with the SRIM2006, GEANT4 and MCNPX packages. The simulations have been performed considering some further applications of the obtained results in computed tomography with proton beams (pCT). Thus the initial and outgoing proton energies (3-300 MeV), as well as the thickness of the irradiated target (water and aluminum phantoms within 90% of the full range for a given proton beam energy), were considered in the interval of values typical for pCT applications. One of the most interesting results of this comparison is that, while the MCNPX spectra are in good agreement with the analytical description within the Fokker-Planck approximation and the GEANT4 simulated spectra are slightly shifted from them, the SRIM2006 simulations predict a notably higher mean energy loss for protons. (author)
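In the Fokker-Planck (diffusion) approximation referenced above, the outgoing spectrum of a monoenergetic beam behind a target is approximately Gaussian around the mean energy loss. A sketch with an invented mean loss and straggling width, not the SRIM/GEANT4/MCNPX values of the study:

```python
import numpy as np

def gaussian_exit_spectrum(e0_mev, mean_loss_mev, straggling_mev,
                           n=100_000, seed=0):
    """Sample outgoing proton energies assuming a Gaussian energy-loss
    distribution, as in the Fokker-Planck approximation."""
    rng = np.random.default_rng(seed)
    return e0_mev - rng.normal(mean_loss_mev, straggling_mev, n)

# Illustrative: 250 MeV protons, 12 MeV mean loss, 1.5 MeV straggling width
spectrum = gaussian_exit_spectrum(250.0, 12.0, 1.5)
```

Comparing a code's simulated spectrum against such a Gaussian is exactly the kind of check the abstract describes: a shifted peak signals a different mean energy loss, a wider peak different straggling.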
Energy Technology Data Exchange (ETDEWEB)
Cirrone, G.A.P., E-mail: cirrone@lns.infn.it [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Bucciolini, M. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Bruzzi, M. [Energetic Department, University of Florence, Via S. Marta 3, I-50139 Florence (Italy); Candiano, G. [Laboratorio di Tecnologie Oncologiche HSR, Giglio Contrada, Pietrapollastra-Pisciotto, 90015 Cefalu, Palermo (Italy); Civinini, C. [National Institute for Nuclear Physics INFN, Section of Florence, Via G. Sansone 1, Sesto Fiorentino, I-50019 Florence (Italy); Cuttone, G. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Guarino, P. [Nuclear Engineering Department, University of Palermo, Via... Palermo (Italy); Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Lo Presti, D. [Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Mazzaglia, S.E. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Pallotta, S. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Randazzo, N. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Sipala, V. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Stancampiano, C. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); and others
2011-12-01
In this paper the use of the Filtered Back Projection (FBP) Algorithm, in order to reconstruct tomographic images using the high energy (200-250 MeV) proton beams, is investigated. The algorithm has been studied in detail with a Monte Carlo approach and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus, developed by our group, has been fully simulated to exploit the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using the FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, spatial and density resolution.
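A minimal numpy version of the FBP step can illustrate the reconstruction technique named above. The plain ramp filter and the toy disk sinogram are textbook stand-ins, not the group's Geant4-based pCT processing chain.

```python
import numpy as np

def fbp(sinogram, angles_deg):
    """Filtered back projection: ramp-filter each parallel projection in
    Fourier space, then back-project it across the image grid."""
    n = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))  # |frequency| ramp filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    coords = np.arange(n) - n // 2
    X, Y = np.meshgrid(coords, coords)
    image = np.zeros((n, n))
    for proj, ang in zip(filtered, np.deg2rad(angles_deg)):
        # Detector coordinate seen by each pixel for this view
        t = X * np.cos(ang) + Y * np.sin(ang) + n // 2
        image += np.interp(t.ravel(), np.arange(n), proj).reshape(n, n)
    return image * np.pi / len(angles_deg)

# Toy sinogram of a centered uniform disk (radius 10 pixels, density 1):
# every parallel projection equals the chord length through the disk.
n = 64
s = np.arange(n) - n // 2
projection = 2.0 * np.sqrt(np.clip(10.0 ** 2 - s ** 2, 0.0, None))
angles = np.linspace(0.0, 180.0, 90, endpoint=False)
sinogram = np.tile(projection, (len(angles), 1))
image = fbp(sinogram, angles)  # bright disk on a near-zero background
```

In pCT the sinogram entries would be water-equivalent path lengths from the measured proton energy loss rather than X-ray line integrals, but the reconstruction step is the same.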
Pia, Maria Grazia; Lechner, Anton; Quintieri, Lina; Saracco, Paolo
2010-01-01
The issue of how epistemic uncertainties affect the outcome of Monte Carlo simulation is discussed by means of a concrete use case: the simulation of the longitudinal energy deposition profile of low energy protons. A variety of electromagnetic and hadronic physics models is investigated, and their effects are analyzed. Possible systematic effects are highlighted. The results identify requirements for experimental measurements capable of reducing epistemic uncertainties in the physics models.
A new PET prototype for proton therapy: comparison of data and Monte Carlo simulations
Rosso, V.; Battistoni, G.; Belcari, N.; Camarlinghi, N.; Ferrari, A.; Ferretti, S.; Kraan, A.; Mairani, A.; Marino, N.; Ortuño, J. E.; Pullia, M.; Sala, P.; Santos, A.; Sportelli, G.; Straub, K.; Del Guerra, A.
2013-03-01
Ion beam therapy is a valuable method for the treatment of deep-seated and radio-resistant tumors thanks to the favorable depth-dose distribution characterized by the Bragg peak. Hadrontherapy facilities take advantage of the specific ion range, resulting in a highly conformal dose in the target volume, while the dose in critical organs is reduced as compared to photon therapy. The necessity to monitor the delivery precision, i.e. the ion range, is unquestionable, thus different approaches have been investigated, such as the detection of prompt photons or annihilation photons of positron emitter nuclei created during the therapeutic treatment. Based on the measurement of the induced β+ activity, our group has developed various in-beam PET prototypes: the one under test is composed by two planar detector heads, each one consisting of four modules with a total active area of 10 × 10 cm2. A single detector module is made of a LYSO crystal matrix coupled to a position sensitive photomultiplier and is read-out by dedicated frontend electronics. A preliminary data taking was performed at the Italian National Centre for Oncological Hadron Therapy (CNAO, Pavia), using proton beams in the energy range of 93-112 MeV impinging on a plastic phantom. The measured activity profiles are presented and compared with the simulated ones based on the Monte Carlo FLUKA package.
The Proton Therapy Nozzles at Samsung Medical Center: A Monte Carlo Simulation Study using TOPAS
Chung, Kwangzoo; Kim, Dae-Hyun; Ahn, Sunghwan; Han, Youngyih
2015-01-01
To expedite the commissioning process of the proton therapy system at Samsung Medical Center (SMC), we have developed a Monte Carlo simulation model of the proton therapy nozzles using TOPAS. At the SMC proton therapy center, we have two gantry rooms with different types of nozzles: a multi-purpose nozzle and a dedicated scanning nozzle. Each nozzle has been modeled in detail following the geometry information provided by the manufacturer, Sumitomo Heavy Industries, Ltd. For this purpose, novel features of TOPAS, such as the time feature or the ridge filter class, have been used, and the appropriate physics models for proton nozzle simulation were defined. Dosimetric properties, like the percent depth dose curve, the spread-out Bragg peak (SOBP), and the beam spot size, have been simulated and verified against measured beam data. Beyond the Monte Carlo nozzle modeling, we have developed an interface between TOPAS and the treatment planning system (TPS), RayStation. An exported RT plan data from the TPS has been interpreted by th...
Zheng, Yuanshui; Newhauser, Wayne; Klein, Eric; Low, Daniel
2009-11-21
Neutron production is of principal concern when designing proton therapy vault shielding. Conventionally, neutron calculations are based on analytical methods, which do not accurately consider beam shaping components and nozzle shielding. The goal of this study was to calculate, using Monte Carlo modeling, the neutron spectral fluence and neutron dose equivalent generated by a realistic proton therapy nozzle and evaluate how these data could be used in shielding calculations. We modeled a contemporary passive scattering proton therapy nozzle in detail with the MCNPX simulation code. The neutron spectral fluence and dose equivalent at various locations in the treatment room were calculated and compared to those obtained from a thick iron target bombarded by parallel proton beams, the simplified geometry on which analytical methods are based. The neutron spectral fluence distributions were similar for both methods, with deeply penetrating high-energy neutrons (E > 10 MeV) being most prevalent along the beam central axis, and low-energy neutrons dominating the spectral fluence in the lateral region. However, unlike the inverse-square falloff used in conventional analytical methods, this study shows that the neutron dose equivalent per therapeutic dose in the treatment room decreased with distance approximately following a power law, with an exponent of about -1.63 in the lateral region and -1.73 in the downstream region. Based on the simulated data from the detailed nozzle modeling, we developed an empirical equation to estimate the neutron dose equivalent at any location and distance in the treatment vault, e.g. for cases in which detailed Monte Carlo modeling is not feasible. We applied the simulated neutron spectral fluence and dose equivalent to a shielding calculation as an example.
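The power-law falloff reported here lends itself to a quick estimator. The sketch below uses only the exponents quoted in the abstract (-1.63 lateral, -1.73 downstream); the normalization `h0` is a hypothetical placeholder, since the fitted amplitude of the empirical equation is not reproduced in this summary.

```python
def neutron_dose_equivalent(distance_m, region="lateral", h0=1.0):
    """Estimate neutron dose equivalent per therapeutic dose (mSv/Gy)
    at a given distance via the power law H/D ~ h0 * d**n, with
    n = -1.63 (lateral) or -1.73 (downstream) as reported in the
    abstract.  h0 is an illustrative normalization at 1 m, not the
    paper's fitted amplitude.
    """
    exponents = {"lateral": -1.63, "downstream": -1.73}
    if region not in exponents:
        raise ValueError(f"unknown region: {region}")
    return h0 * distance_m ** exponents[region]

# Ratio of H/D at 1 m to H/D at 2 m in the lateral region.
ratio = neutron_dose_equivalent(1.0) / neutron_dose_equivalent(2.0)
```

Doubling the distance then reduces H/D by a factor of about 3.1 laterally, noticeably less than the factor of 4 an inverse-square rule would predict.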
Farah, J.; Martinetti, F.; Sayah, R.; Lacoste, V.; Donadille, L.; Trompier, F.; Nauraye, C.; De Marzi, L.; Vabre, I.; Delacroix, S.; Hérault, J.; Clairand, I.
2014-06-01
Monte Carlo calculations are increasingly used to assess stray radiation dose to healthy organs of proton therapy patients and estimate the risk of secondary cancer. Among the secondary particles, neutrons are of primary concern due to their high relative biological effectiveness. The validation of Monte Carlo simulations for out-of-field neutron doses nonetheless remains a major challenge to the community. This work therefore focused on developing a global experimental approach to test the reliability of the MCNPX models of two proton therapy installations operating at 75 and 178 MeV for ocular and intracranial tumor treatments, respectively. The method consists of comparing Monte Carlo calculations against experimental measurements of: (a) neutron spectrometry inside the treatment room, (b) neutron ambient dose equivalent at several points within the treatment room, (c) secondary organ-specific neutron doses inside the Rando-Alderson anthropomorphic phantom. Results have proven that the Monte Carlo models correctly reproduce secondary neutrons within the two proton therapy treatment rooms. Noticeable differences between experimental measurements and simulations were nonetheless observed, especially at the highest beam energy. The study demonstrated the need for improved measurement tools, especially in the high neutron energy range, and for more accurate physical models and cross sections within the Monte Carlo code to correctly assess secondary neutron doses in proton therapy applications.
A Monte Carlo pencil beam scanning model for proton treatment plan simulation using GATE/GEANT4
Energy Technology Data Exchange (ETDEWEB)
Grevillot, L; Freud, N; Sarrut, D [Universite de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Universite Lyon 1, Centre Leon Berard, Lyon (France); Bertrand, D; Dessy, F, E-mail: loic.grevillot@creatis.insa-lyon.fr [IBA, B-1348, Louvain-la Neuve (Belgium)
2011-08-21
This work proposes a generic method for modeling scanned ion beam delivery systems, without simulation of the treatment nozzle and based exclusively on beam data library (BDL) measurements required for treatment planning systems (TPS). To this aim, new tools dedicated to treatment plan simulation were implemented in the Gate Monte Carlo platform. The method was applied to a dedicated nozzle from IBA for proton pencil beam scanning delivery. Optical and energy parameters of the system were modeled using a set of proton depth-dose profiles and spot sizes measured at 27 therapeutic energies. For further validation of the beam model, specific 2D and 3D plans were produced and then measured with appropriate dosimetric tools. Dose contributions from secondary particles produced by nuclear interactions were also investigated using field size factor experiments. Pristine Bragg peaks were reproduced with 0.7 mm range and 0.2 mm spot size accuracy. A 32 cm range spread-out Bragg peak with 10 cm modulation was reproduced with 0.8 mm range accuracy and a maximum point-to-point dose difference of less than 2%. A 2D test pattern consisting of a combination of homogeneous and high-gradient dose regions passed a 2%/2 mm gamma index comparison for 97% of the points. In conclusion, the generic modeling method proposed for scanned ion beam delivery systems was applicable to an IBA proton therapy system. The key advantage of the method is that it only requires BDL measurements of the system. The validation tests performed so far demonstrated that the beam model achieves clinical performance, paving the way for further studies toward TPS benchmarking. The method involves new sources that are available in the new Gate release V6.1 and could be further applied to other particle therapy systems delivering protons or other types of ions like carbon.
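A gamma-index comparison like the 2%/2 mm test quoted above can be illustrated in one dimension. This is a simplified global-gamma sketch applied to a toy profile, not the dosimetric software used in the study; the criterion values are passed in as parameters.

```python
import numpy as np

def gamma_index_1d(x, dose_ref, dose_eval, dd=0.02, dta=2.0):
    """Global 1D gamma index (simplified sketch).
    dd: dose-difference criterion as a fraction of the reference maximum;
    dta: distance-to-agreement in the same units as x (here mm)."""
    d_max = dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        # Gamma at each reference point is the minimum combined
        # dose/distance metric over all evaluated points.
        g2 = ((x - xi) / dta) ** 2 + ((dose_eval - di) / (dd * d_max)) ** 2
        gamma[i] = np.sqrt(g2.min())
    return gamma

# Identical profiles pass everywhere (gamma == 0 at every point).
x = np.linspace(0.0, 100.0, 201)          # positions in mm
ref = np.exp(-((x - 50.0) / 10.0) ** 2)   # toy Gaussian dose profile
gam = gamma_index_1d(x, ref, ref)
passing = np.mean(gam <= 1.0)             # fraction of points with gamma <= 1
```

A passing rate is then simply the fraction of points with gamma at or below 1, as reported in the abstracts above.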
Proton microbeam radiotherapy with scanned pencil-beams--Monte Carlo simulations.
Kłodowska, M; Olko, P; Waligórski, M P R
2015-09-01
Irradiation, delivered by a synchrotron facility, using a set of highly collimated, narrow and parallel photon beams spaced by 1 mm or less, has been termed Microbeam Radiation Therapy (MRT). The tolerance of healthy tissue after MRT was found to be better than after standard broad X-ray beams, together with a more pronounced response of malignant tissue. The microbeam spacing and the transverse peak-to-valley dose ratio (PVDR) are considered the relevant biological MRT parameters. We investigated the MRT concept for proton microbeams, where we expected different depth-dose profiles and PVDR dependences, resulting in skin sparing and homogeneous dose distributions at larger depths, due to differences between the interactions of proton and photon beams in tissue. Using the FLUKA Monte Carlo code we simulated PVDR distributions for differently spaced 0.1 mm (sigma) pencil-beams of entrance energies 60, 80, 100 and 120 MeV irradiating a cylindrical water phantom, with and without a bone layer, representing a human head. We calculated PVDR distributions and evaluated the uniformity of target irradiation at the distal ranges of the 60-120 MeV microbeams. We also calculated PVDR distributions for a 60 MeV spread-out Bragg peak microbeam configuration. Application of optimised proton MRT in terms of spot size, pencil-beam distribution, entrance beam energy and multiport irradiation, combined with relevant radiobiological investigations, could pave the way for hypofractionation scenarios in which tissue sparing at the entrance, better malignant tissue response and better dose conformity of target volume irradiation could be achieved, compared with present proton beam radiotherapy configurations.
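The peak-to-valley dose ratio (PVDR) itself is a simple quantity to extract from a lateral dose profile. The toy below assumes Gaussian pencil beams of sigma 0.1 mm spaced 1 mm apart on a regular grid, loosely mirroring the beam geometry of the abstract; it is an illustration, not the FLUKA scoring used in the study.

```python
import numpy as np

def pvdr(lateral_dose, spacing_idx):
    """Peak-to-valley dose ratio from a lateral dose profile on a
    regular grid.  Peaks are assumed at multiples of spacing_idx
    starting at index 0, valleys midway between peaks (a toy
    geometry, not the simulation setup of the paper)."""
    peaks = lateral_dose[::spacing_idx]
    valleys = lateral_dose[spacing_idx // 2::spacing_idx]
    n = min(len(peaks), len(valleys))
    return peaks[:n].mean() / valleys[:n].mean()

# Toy entrance profile: Gaussian pencil beams (sigma = 0.1 mm), 1 mm apart.
x = np.arange(0.0, 5.0, 0.05)                       # mm grid, 0.05 mm step
centers = np.arange(0.0, 5.0, 1.0)                  # beam center positions
dose = sum(np.exp(-((x - c) / 0.1) ** 2 / 2) for c in centers)
ratio = pvdr(dose, spacing_idx=20)                  # 1 mm / 0.05 mm = 20
```

With such narrow beams at the entrance the valleys receive almost no dose, so the PVDR is very large; it collapses with depth as scattering broadens the beams, which is the effect the study quantifies.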
Monte Carlo calculations of relativistic solar proton propagation in interplanetary space
Lumme, M.; Torsti, J. J.; Vainikka, E.; Peltonen, J.; Nieminen, M.; Valtonen, E.; Arvelta, H.
1985-01-01
Particle fluxes and pitch angle distributions of relativistic solar protons at 1 AU were determined by Monte Carlo calculations. The analysis covers two hours after the release of the particles from the Sun, and a total of eight sets of 100 000 particle trajectories were simulated. The pitch angle scattering was assumed to be isotropic, and the scattering mean free path was varied from 0.1 to 4 AU. As an application, the solar injection time and interplanetary scattering mean free path of the particles that gave rise to the GLE of May 1978 were determined. Assuming an exponential form, the injection decay time was found to be about 11 minutes. The mean free path of pitch angle scattering during the event was about 1 AU.
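The transport scheme described, isotropic pitch-angle scattering with a prescribed mean free path, can be sketched as a toy Monte Carlo. All numbers and the crude inner-boundary treatment below are illustrative assumptions, not the authors' 1985 code.

```python
import random

def arrival_times(n, mfp_au=1.0, speed_au_per_hr=7.2, seed=42):
    """Toy Monte Carlo of relativistic solar-proton transport along a
    field line: each particle streams with pitch-angle cosine mu and
    is scattered isotropically after exponential free paths of mean
    mfp_au, until its coordinate reaches 1 AU.  7.2 AU/hr is roughly
    the speed of light; the reflecting boundary at the Sun is a crude
    simplification."""
    random.seed(seed)
    times = []
    for _ in range(n):
        s, t, mu = 0.0, 0.0, 1.0        # start at the Sun, streaming outward
        while s < 1.0:
            path = random.expovariate(1.0 / mfp_au)
            if mu > 0.0 and s + path * mu >= 1.0:
                path = (1.0 - s) / mu   # clip the final step at 1 AU
                s = 1.0
            else:
                s = max(0.0, s + path * mu)      # reflect crudely at the Sun
                mu = random.uniform(-1.0, 1.0)   # isotropic scattering
            t += path / speed_au_per_hr
        times.append(t)
    return times

times = arrival_times(200)
```

Histogramming `times` for different `mfp_au` values reproduces the qualitative behavior studied in the paper: shorter mean free paths delay and broaden the arrival profile at 1 AU.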
Energy Technology Data Exchange (ETDEWEB)
Wang, Z [Reading Hospital, West Reading, PA (United States); Gao, M [ProCure Treatment Centers, Warrenville, IL (United States)
2014-06-01
Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to the few large proton centers that can afford a computer cluster. We studied the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4-based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single-spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the StarCluster software developed at MIT, a Linux cluster with 2-100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm{sup 2}, 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirement, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and worker nodes were created as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulations. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.
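The 1/sqrt(N) scaling behind these numbers is easy to make explicit. The helper below scales the reported 10-million-event, <2% run to other target uncertainties and converts the result into 500k-event jobs; taking the 2% baseline as exact is an illustrative assumption.

```python
import math

def events_for_uncertainty(target_rel_unc, base_events=1e7, base_rel_unc=0.02):
    """MC statistical uncertainty scales as 1/sqrt(N), so reaching a
    target relative uncertainty from a known (N, uncertainty) baseline
    requires N * (base/target)**2 events."""
    return base_events * (base_rel_unc / target_rel_unc) ** 2

# Halving the uncertainty from 2% to 1% quadruples the event count.
events = events_for_uncertainty(0.01)   # 4e7 events
jobs = math.ceil(events / 5e5)          # at 500k events per job, 80 jobs
```

Scaling job counts this way, together with per-node hourly pricing, gives the kind of cost projection quoted for the 40- and 100-node clusters.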
Energy Technology Data Exchange (ETDEWEB)
Krause, Claudius
2012-04-15
High energy proton-proton collisions lead to a large number of secondary particles to be measured in a detector. A final state containing top quarks is of particular interest, but top quarks are produced in only a small fraction of the collisions. Hence, criteria must be defined to separate events containing top quarks from the background. From the detectors, we record signals, for example hits in the tracker system or deposits in the calorimeters. In order to obtain the momenta of the particles, we apply algorithms to reconstruct tracks in space. More sophisticated algorithms are needed to identify the flavour of quarks, such as b-tagging. Several steps are needed to test these algorithms. Collision products of proton-proton events are generated using Monte Carlo techniques and their passage through the detector is simulated. After that, the algorithms are applied and the signal efficiency and the mistagging rate can be obtained. There are, however, many different approaches and algorithms realized in programs, so the question arises whether the choice of the Monte Carlo generator influences the measured quantities. In this thesis, two commonly used Monte Carlo generators, SHERPA and MadGraph/MadEvent, are compared and the differences in the selection efficiency of semimuonic tt events are estimated. In addition, the distributions of kinematic variables are shown. A special chapter on the matching of matrix elements with parton showers is included. The main algorithms, CKKW for SHERPA and MLM for MadGraph/MadEvent, are introduced.
Energy Technology Data Exchange (ETDEWEB)
Tran, H.N., E-mail: tranngochoang@tdt.edu.vn [Division of Nuclear Physics, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Faculty of Applied Sciences, Ton Duc Thang University, Tan Phong Ward, District 7, Ho Chi Minh City (Viet Nam); Karamitros, M. [Notre Dame Radiation Laboratory, University of Notre-Dame, IN 46556 (United States); Ivanchenko, V.N. [Geant4 Associates International Ltd, Hebden Bridge (United Kingdom); Guatelli, S.; McKinnon, S. [Centre For Medical Radiation Physics, University of Wollongong (Australia); Illawarra Health and Medical Research, University of Wollongong, NSW (Australia); Murakami, K.; Sasaki, T.; Okada, S. [Computing Research Center, High Energy Accelerator Organization, KEK, Tsukuba City (Japan); Bordage, M.C. [INSERM, UMR 1037, CRCT, F-31000 Toulouse (France); Univ. Toulouse III-Paul Sabatier, UMR 1037, CRCT, F-31000 Toulouse (France); Francis, Z. [Saint Joseph University, Faculty of Sciences, Department of Physics, Beirut (Lebanon); El Bitar, Z. [Institut Pluridisciplinaire Hubert Curien/IN2P3/CNRS, Strasbourg (France); Bernal, M.A. [Instituto de Física Gleb Wataghin, Universidade Estadual de Campinas, SP (Brazil); Shin, J.I. [Division of Heavy Ion Clinical Research, Korea Institute of Radiological and Medical Science, 75, Nowon-ro, Nowon-gu, Seoul (Korea, Republic of); Lee, S.B. [Proton Therapy Center, National Cancer Center, 323, Ilsan-ro, Ilsandong-gu, Goyang-si, Gyeonggi-do (Korea, Republic of); Barberet, Ph. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Tran, T.T. [VNUHCM-University of Science (Viet Nam); Brown, J.M.C. [School of Mathematics and Physics, Queen’s University Belfast, Belfast, Northern Ireland (United Kingdom); and others
2016-04-15
Gold nanoparticles have been reported as a possible radio-sensitizing agent in radiation therapy due to their ability to increase energy deposition, and the subsequent direct damage to cells and DNA, within their local vicinity. Moreover, this increase in energy deposition also results in an increase of the radiochemical yields. In this work we present, for the first time, an in silico investigation, based on the general purpose Monte Carlo simulation toolkit Geant4, into energy deposition and radical species production around a spherical gold nanoparticle 50 nm in diameter under proton irradiation. Simulations were performed for incident proton energies ranging from 2 to 170 MeV, which are of interest for clinical proton therapy.
Wu, D; Yu, W; Fritzsche, S
2016-01-01
A Monte-Carlo approach to proton stopping in warm dense matter is implemented into an existing particle-in-cell code. The model is based on multiple binary collisions among electron-electron, electron-ion and ion-ion pairs, takes into account contributions from both free and bound electrons, and allows particle stopping to be calculated in a much more natural manner. In the low-temperature limit, when ``all'' electrons are bound to the nucleus, the stopping power converges to the predictions of Bethe-Bloch theory, which shows good consistency with data provided by NIST. As the temperature rises, more and more bound electrons are ionized, giving rise to a stopping power increased over that of cold matter, consistent with a recent experimental measurement [Phys. Rev. Lett. 114, 215002 (2015)]. When the temperature is increased further and ionization reaches its maximum, a lowered stopping power is observed, due to the suppression of the collision frequency between the projected proton beam and h...
Wu, D.; He, X. T.; Yu, W.; Fritzsche, S.
2017-02-01
A Monte Carlo approach to proton stopping in warm dense matter is implemented into an existing particle-in-cell code. This approach is based on multiple electron-electron, electron-ion, and ion-ion binary collisions and accounts for both the free and the bound electrons in the plasma. It enables one to calculate the stopping of particles in a more natural manner than existing theoretical treatments. In the low-temperature limit, when "all" electrons are bound to the nucleus, the stopping power coincides with the predictions of the Bethe-Bloch formula and is consistent with the data from the National Institute of Standards and Technology database. At higher temperatures, some of the bound electrons are ionized, and this increases the stopping power in the plasma, as demonstrated by A. B. Zylstra et al. [Phys. Rev. Lett. 114, 215002 (2015)], 10.1103/PhysRevLett.114.215002. At even higher temperatures, the degree of ionization reaches a maximum and the stopping power then decreases, due to the suppression of the collision frequency between the projected proton beam and the hot plasma in the target.
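The Bethe-Bloch limit mentioned above can be written down compactly. The sketch below implements the bare Bethe formula for protons (no shell, density-effect, or Barkas corrections, and none of the plasma physics of the paper), with liquid-water defaults; it reproduces the cold-matter stopping power the model converges to at low temperature, and is not the PIC implementation itself.

```python
import math

def bethe_stopping_power(T_MeV, Z_over_A=0.5551, I_eV=75.0):
    """Mass stopping power (MeV cm^2/g) of a proton of kinetic energy
    T_MeV from the bare Bethe formula.  Defaults are for liquid water
    (Z/A = 0.5551, mean excitation energy I = 75 eV)."""
    K = 0.307075          # MeV cm^2 / mol (4 pi N_A r_e^2 m_e c^2)
    me_c2 = 0.510999      # electron rest energy, MeV
    mp_c2 = 938.272       # proton rest energy, MeV
    gamma = 1.0 + T_MeV / mp_c2
    beta2 = 1.0 - 1.0 / gamma ** 2
    # Argument of the logarithm: 2 m_e c^2 beta^2 gamma^2 / I.
    arg = 2.0 * me_c2 * beta2 * gamma ** 2 / (I_eV * 1e-6)
    return K * Z_over_A / beta2 * (math.log(arg) - beta2)

# A 100 MeV proton in water: about 7.3 MeV cm^2/g, close to NIST PSTAR.
s = bethe_stopping_power(100.0)
```

The temperature dependence described in the abstract enters through the partitioning into free and bound electrons, which this cold-matter formula deliberately omits.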
Energy Technology Data Exchange (ETDEWEB)
Shin, J; Park, S; Jeong, J; Jeong, C [National Cancer Center, Goyang, Gyeonggi-do (Korea, Republic of); Lim, Y; Lee, S [National Cancer Center in Korea, Goyang, Gyeonggi-do (Korea, Republic of); SHIN, D [National Cancer Center, Goyangsi, Gyeonggi-do (Korea, Republic of); Incerti, S [Universite Bordeaux 1, CNRS.IN2P3, Centres d’Etudes Nucleaires de Bordeau, Gradignan, Gradignan (France)
2014-06-01
Purpose: In particle therapy and radiobiology, the investigation of mechanisms leading to the death of target cancer cells induced by ionising radiation is an active field of research. Recently, several studies based on Monte Carlo simulation codes have been initiated in order to simulate the physical interactions of ionising particles at the cellular scale and in DNA. Geant4-DNA is one of them; it is an extension of the general purpose Geant4 Monte Carlo simulation toolkit for the simulation of physical interactions at the sub-micrometre scale. In this study, we present Geant4-DNA Monte Carlo simulations for the prediction of DNA strand breakage using a geometrical modelling of the DNA structure. Methods: For the simulation of DNA strand breakage, we developed a specific DNA geometrical structure. This structure consists of DNA components, such as the deoxynucleotide pairs, the DNA double helix, the nucleosomes and the chromatin fibre. Each component is made of water because the cross section models currently available in Geant4-DNA for protons apply to liquid water only. Also, at the macroscopic scale, protons were generated with the various energies available for proton therapy at the National Cancer Center, obtained using validated proton beam simulations developed in previous studies. These multi-scale simulations were combined for the validation of Geant4-DNA in radiobiology. Results: In the double helix structure, the energy deposited in a strand allowed us to determine direct DNA damage from physical interactions. In other words, the amount of dose and the frequency of damage in microscopic geometries were related to the direct radiobiological effect. Conclusion: In this report, we calculated the frequency of DNA strand breakage using Geant4-DNA physics processes for liquid water. This study is ongoing in order to develop geometries which use realistic DNA material instead of liquid water. This will be tested as soon as cross sections for DNA material become available in Geant4.
Accelerated GPU based SPECT Monte Carlo simulations.
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-07
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor of up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency.
Lin, Yuting; McMahon, Stephen J; Scarpelli, Matthew; Paganetti, Harald; Schuemann, Jan
2014-12-21
Gold nanoparticles (GNPs) have shown potential to be used as a radiosensitizer for radiation therapy. Despite extensive research activity to study GNP radiosensitization using photon beams, only a few studies have been carried out using proton beams. In this work Monte Carlo simulations were used to assess the dose enhancement of GNPs for proton therapy. The enhancement effect was compared between a clinical proton spectrum, a clinical 6 MV photon spectrum, and a kilovoltage photon source similar to those used in many radiobiology lab settings. We showed that the mechanism by which GNPs can lead to dose enhancements in radiation therapy differs when comparing photon and proton radiation. The GNP dose enhancement using protons can be up to a factor of 14 and is independent of proton energy, while the dose enhancement is highly dependent on the photon energy used. For the same amount of energy absorbed in the GNP, interactions with protons, kVp photons and MV photons produce similar doses within several nanometers of the GNP surface, and differences are below 15% for the first 10 nm. However, secondary electrons produced by kilovoltage photons have the longest range in water compared to those from protons and MV photons; e.g. they cause a dose enhancement 20 times higher than the one caused by protons 10 μm away from the GNP surface. We conclude that GNPs have the potential to enhance radiation therapy depending on the type of radiation source. Proton therapy can be enhanced significantly only if the GNPs are in close proximity to the biological target.
Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study
Kim, Jin Sung; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih
2015-01-01
Two full rotating gantries with different nozzles (a multipurpose nozzle with MLC and a dedicated scanning nozzle) and a conventional cyclotron system have been installed and are under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate the neutron dose equivalent per therapeutic dose, H/D, to the x-ray imaging equipment under various treatment conditions with Monte Carlo simulation. First, we investigated H/D with various modifications of the beam line devices (scattering, scanning, multi-leaf collimator, aperture, compensator) at the isocenter and at 20, 40 and 60 cm distance from the isocenter, and compared the results with those of other research groups. Next, we investigated the neutron dose at the x-ray equipment used for real-time imaging under various treatment conditions. Our investigation showed 0.07-0.19 mSv/Gy at the x-ray imaging equipment for the various treatment options and, interestingly, a 50% neutron dose reduction effect of the flat panel detector was observed due to multi-lea...
Energy Technology Data Exchange (ETDEWEB)
Cho, S; Shin, E H; Kim, J; Ahn, S H; Chung, K; Kim, D-H; Han, Y; Choi, D H [Samsung Medical Center, Seoul (Korea, Republic of)
2015-06-15
Purpose: To evaluate the shielding wall design that protects patients, staff and members of the general public from secondary neutrons, using a simple analytic solution and the multi-Monte-Carlo codes MCNPX, ANISN and FLUKA. Methods: Analytical and multi-Monte-Carlo calculations were performed for the proton facility (Sumitomo Heavy Industries, Ltd.) at Samsung Medical Center in Korea. The NCRP-144 analytical evaluation methods, which produce conservative estimates of the dose equivalent values for the shielding, were used for the analytical evaluations. Then, the radiation transport was simulated with the multi-Monte-Carlo codes. The neutron dose at each evaluation point is obtained as the product of the simulated value and the neutron dose conversion coefficient introduced in ICRP-74. Results: The evaluation points at the accelerator control room and the control room entrance are mainly influenced by the proton beam loss point. The neutron dose equivalent at the accelerator control room evaluation point is 0.651, 1.530, 0.912 and 0.943 mSv/yr, and at the entrance of the cyclotron room 0.465, 0.790, 0.522 and 0.453 mSv/yr, as calculated by the NCRP-144 formalism, ANISN, FLUKA and MCNPX, respectively. Most of the MCNPX and FLUKA results, which used the complicated geometry, were smaller than the ANISN results. Conclusion: The neutron shielding for a proton therapy facility has been evaluated by the analytic model and multi-Monte-Carlo methods. We confirmed that the shielding adequately protects the areas readily accessible to people when the proton facility is operated.
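The step of multiplying simulated neutron fluence by ICRP-74 conversion coefficients is a simple folding operation. The sketch below uses a made-up three-bin spectrum with placeholder coefficients; the real ICRP-74 coefficients are energy-dependent tabulated data not reproduced here.

```python
def fold_fluence_to_dose(fluence_per_bin, conv_coeff_pSv_cm2):
    """Fold a binned neutron fluence (n/cm^2 per energy bin) with
    fluence-to-ambient-dose-equivalent conversion coefficients
    (pSv cm^2, in the style of ICRP-74) to obtain H* in pSv."""
    if len(fluence_per_bin) != len(conv_coeff_pSv_cm2):
        raise ValueError("fluence and coefficient binnings must match")
    return sum(f * h for f, h in zip(fluence_per_bin, conv_coeff_pSv_cm2))

# Hypothetical 3-bin spectrum (thermal / evaporation / high-energy):
fluence = [1e4, 5e3, 2e3]        # n/cm^2 per bin (illustrative)
coeff = [10.0, 200.0, 450.0]     # pSv cm^2 (placeholders, not ICRP-74 data)
h_star_pSv = fold_fluence_to_dose(fluence, coeff)
```

In practice the transport code scores the fluence per energy bin at each evaluation point, and the folded result is converted to mSv/yr using the annual workload.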
Eslami, M.; Kakavand, T.; Mirzaii, M.; Rajabifar, S.
2015-01-01
The 22Ne(p,n)22Na reaction is optimal for the cyclotron production of 22Na. This work aims to model the proton-induced production of 22Na in a gas-cell target containing natural and enriched neon gas, using Monte Carlo methods. The excitation functions of the reactions are calculated with both the TALYS-1.6 and ALICE/ASH codes, and the optimum projectile energy range for high-yield production is then selected. A free gaseous environment of neon at a given pressure and temperature is set up and the proton beam is transported through it using the Monte Carlo codes MCNPX and SRIM. The beam monitoring performed with each of these codes indicates that the gas cell has to be designed as a conical frustum to achieve the desired interactions. MCNPX is also employed to calculate the energy distribution of protons in the designed target and to estimate the residual nuclei produced during irradiation. The production yields of 22Na from the 22Ne(p,n)22Na and natNe(p,x)22Na reactions are estimated and show good agreement with the experimental results. The results demonstrate that Monte Carlo provides a useful means of designing and optimizing gas targets, as well as of calibrating detectors, for radionuclide production purposes.
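Once the excitation function and the proton energy loss in the gas are known, the production yield follows from a standard thick-target integral, Y = n ∫ σ(E)/S(E) dE over the slowing-down range. The sketch below uses trapezoidal integration with made-up numbers; it illustrates the estimate, not the TALYS/MCNPX data of the paper.

```python
def thick_target_yield(energies_MeV, sigma_mb, stopping_MeV_cm2_g, n_per_g):
    """Thick-target production yield per incident proton:
    Y = n * integral( sigma(E) / S(E) dE ), with sigma converted from
    mb to cm^2, S the mass stopping power (MeV cm^2/g) and n the number
    of target nuclei per gram.  Trapezoidal integration over the grid."""
    y = 0.0
    for i in range(len(energies_MeV) - 1):
        f0 = sigma_mb[i] * 1e-27 / stopping_MeV_cm2_g[i]
        f1 = sigma_mb[i + 1] * 1e-27 / stopping_MeV_cm2_g[i + 1]
        y += 0.5 * (f0 + f1) * (energies_MeV[i + 1] - energies_MeV[i])
    return n_per_g * y

# Toy 3-point excitation function between 10 and 20 MeV (made-up values,
# not evaluated 22Ne(p,n)22Na data):
E = [10.0, 15.0, 20.0]
sig = [50.0, 120.0, 80.0]       # cross section, mb
S = [40.0, 30.0, 25.0]          # mass stopping power, MeV cm^2/g
n = 6.022e23 / 20.18            # nuclei per gram of natural neon
y = thick_target_yield(E, sig, S, n)
```

Multiplying the per-proton yield by the beam current and irradiation time, and correcting for decay, gives the activity produced in the gas cell.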
Mizutani, Shohei; Takada, Yoshihisa; Kohno, Ryosuke; Hotta, Kenji; Tansho, Ryohei; Akimoto, Tetsuo
2016-03-01
Full Monte Carlo (FMC) calculation of dose distributions has been recognized to have superior accuracy compared with the pencil beam algorithm (PBA). However, since FMC methods require long calculation times, it is difficult to apply them to routine treatment planning at present. In order to improve the situation, a simplified Monte Carlo (SMC) method has been introduced to the dose kernel calculation applicable to the dose optimization procedure for proton pencil beam scanning. We have evaluated the accuracy of the SMC calculation by comparing a result of the dose kernel calculation using the SMC method with that using the FMC method in an inhomogeneous phantom. The dose distribution obtained by the SMC method was in good agreement with that obtained by the FMC method. To assess the usefulness of SMC calculation in clinical situations, we have compared results of the dose calculation using the SMC method with those using the PBA method for three clinical cases of tumor treatment. The dose distributions calculated with the PBA dose kernels appear to be homogeneous in the planning target volumes (PTVs). In practice, the dose distributions calculated with the SMC dose kernels with the spot weights optimized with the PBA method show largely inhomogeneous dose distributions in the PTVs, while those with the spot weights optimized with the SMC method have moderately homogeneous distributions in the PTVs. Calculation using the SMC method is faster than that using GEANT4 by three orders of magnitude. In addition, the graphics processing unit (GPU) boosts the calculation speed by 13 times for treatment planning using the SMC method. Hence, the SMC method will be applicable to routine clinical treatment planning for reproduction of the complex dose distribution more accurately than the PBA method in a reasonably short time by use of the GPU-based calculation engine. PACS number(s): 87.55.Gh.
Energy Technology Data Exchange (ETDEWEB)
Eslami, M., E-mail: mohammad.eslami25@yahoo.com [Department of Physics, Faculty of Science, University of Zanjan, Zengan (Zanjan) (Iran, Islamic Republic of); Kakavand, T. [Department of Physics, Faculty of Science, University of Zanjan, Zengan (Zanjan) (Iran, Islamic Republic of); Department of Physics, Faculty of Science, Imam Khomeini International University, Qazvin (Iran, Islamic Republic of); Mirzaii, M.; Rajabifar, S. [Agricultural, Medical and Industrial Research School, Nuclear Science and Technology Research Institute, AEOI, Karaj (Iran, Islamic Republic of)
2015-01-01
Highlights: • Angular distribution of the proton beam in a gaseous environment. • Particle energy distribution profile and proton flux within a gas-cell target with MCNPX. • Detection of the residual nuclei produced during the nuclear reactions. • Estimation of production yields for the ²²,ⁿᵃᵗNe(p,x)²²Na reactions. - Abstract: The ²²Ne(p,n)²²Na reaction is optimal for the cyclotron production of ²²Na. This work monitors the proton-induced production of ²²Na in a gas-cell target, containing natural or enriched neon gas, using the Monte Carlo method. The excitation functions of the reactions are calculated with both the TALYS-1.6 and ALICE/ASH codes, and the optimum projectile energy range for high-yield production is then selected. A free gaseous environment of neon at a given pressure and temperature is set up, and the proton beam is transported through it using the Monte Carlo codes MCNPX and SRIM. The beam monitoring performed with each of these codes indicates that the gas cell has to be designed as a conical frustum to achieve the desired interactions. MCNPX is also employed to calculate the energy distribution of protons in the designed target and to estimate the residual nuclei produced during irradiation. The production yields of ²²Na in the ²²Ne(p,n)²²Na and ⁿᵃᵗNe(p,x)²²Na reactions are estimated and show good agreement with experimental results. The results demonstrate that the Monte Carlo method provides a useful way to design and optimize gas targets, as well as to calibrate detectors, for radionuclide production purposes.
CERN Summer Student Report 2016 Monte Carlo Data Base Improvement
Caciulescu, Alexandru Razvan
2016-01-01
During my Summer Student project I worked on improving the Monte Carlo Data Base and MonALISA services for the ALICE Collaboration. The project included learning the infrastructure for tracking and monitoring of the Monte Carlo productions as well as developing a new RESTful API for seamless integration with the JIRA issue tracking framework.
Hydrogen-bonded proton transfer in the protonated guanine-cytosine (GC+H)+ base pair.
Lin, Yuexia; Wang, Hongyan; Gao, Simin; Schaefer, Henry F
2011-10-13
Single proton transfer at the different sites of the Watson-Crick (WC) guanine-cytosine (GC) DNA base pair is studied here using density functional methods. The conventional protonated structures, transition state (TS) structures, and proton-transferred product (PT) structures of every relevant species are optimized. Each transition state and proton-transferred product structure has been compared with the corresponding conventional protonated structure to demonstrate the process of proton transfer and the change of geometrical structures. The relative energies of the protonated tautomers and the proton-transfer energy profiles in the gas phase and in solvent are analyzed. The proton-transferred product structure G(+H(+))-H(+)C(N3)(-H(+))(PT) has the lowest relative energy, with only two hydrogen bonds present. Almost all 14 isomers of the protonated GC base pair undergo hydrogen-bonded proton transfer following the three pathways, with the exception of structure G-H(+)C(O2). When the positive charge is primarily "located" on the guanine moiety (H(+)G-C, G-H(+)C(C4), and G-H(+)C(C6)), the H(1) proton transfers from the N(1) site of guanine to the N(3) site of cytosine. The structures G-H(+)C(C5) and G-H(+)C(C4) involve H(4a) proton transfer from the N(4) of cytosine to the O(6) site of guanine. H(2a) proton transfer from the N(2) site of guanine to the O(2) site of cytosine is found only for the structure G-H(+)C(C4). The structures protonated at the six-centered sites adjoining the hydrogen bonds are more prone to proton transfer in the gas phase, whereas in the aqueous phase proton transfer is energetically favorable when the proton is added on the minor groove and the sites adjoining the hydrogen bonds.
Monte carlo computation of the energy deposited by protons in water, bone and adipose
Küçer, Rahmi; Küçer, Nermin; Türemen, Görkem
2013-02-01
Protons are most suitable for treating deeply-seated tumors due to their unique depth dose distribution. The maximum dose of protons is a pronounced peak, called the Bragg peak, with zero dose behind the peak. The objective of radiation therapy with protons is to deliver the dose to the target volume by using this type of distribution. This is achieved with a finite number of Bragg peaks at the depth of the target volume. The location of the peak in terms of depth depends on the energy of the protons. Simulations are used to determine the depth dose distribution of proton beams passing through tissue, so it is important that experimental data agree with the simulation data. In this study, we used the FLUKA computer code to determine the correct position of the Bragg peak for proton beams passing through water, bone and adipose, and the results were compared with experimental data.
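The dependence of the Bragg peak depth on beam energy noted above is often summarized by the Bragg-Kleeman range-energy rule R ≈ αE^p. A minimal sketch, using commonly quoted fit coefficients for protons in water (the coefficients are textbook values, not taken from this study):

```python
def bragg_kleeman_range_cm(energy_mev, alpha=0.0022, p=1.77):
    """Approximate range in water (cm) of a proton of the given energy
    (MeV), using the Bragg-Kleeman rule R = alpha * E**p.  The fit
    coefficients are widely used values for water, assumed here rather
    than taken from the paper."""
    return alpha * energy_mev ** p

# A ~150 MeV beam places the Bragg peak at roughly 15-16 cm depth,
# which is why deep-seated tumors require energies in this range.
depth_cm = bragg_kleeman_range_cm(150.0)
```

Higher energies push the peak deeper, so a spread-out Bragg peak is built by superposing beams of several energies, each stopping at a slightly different depth.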
Proton Dose Assessment to the Human Eye Using Monte Carlo N-Particle Transport Code (MCNPX)
2006-08-01
objective of this project was to develop a simple MCNPX model of the human eye to approximate dose delivered from proton therapy. The calculated dose...computer code MCNPX that approximates dose delivered during proton therapy. The calculations considered proton interactions and secondary interactions...Volume Calculation The MCNPX code has limited ability to compute the volumes of defined cells. The dosimetric volumes in the outer wall of the eye are
Energy Technology Data Exchange (ETDEWEB)
Palmans, H. [Ghent Univ. (Belgium). Dept. of Biomedical Physics; Verhaegen, F.
1995-12-01
In the last decade, several clinical proton beam therapy facilities have been developed. To satisfy the demand for uniformity in clinical (routine) proton beam dosimetry, two dosimetry protocols (ECHED and AAPM) have been published. Both protocols neglect the influence of ion-chamber-dependent parameters on dose determination in proton beams because of the scatter properties of these beams, although the problem has not yet been studied thoroughly. A comparison between water calorimetry and ionisation chamber dosimetry showed a discrepancy of 2.6% between the former method and ionometry following the ECHED protocol. Possibly, a small part of this difference can be attributed to chamber-dependent correction factors. Indications for this possibility are found in ionometry measurements. To allow the simulation of the complex geometries with different media necessary for the study of those corrections, an existing proton Monte Carlo code (PTRAN, Berger) has been modified. The original code, which applies Molière's multiple scattering theory and Vavilov's energy straggling theory, calculates depth dose profiles, energy distributions and radial distributions for pencil beams in water. Comparisons with measurements and calculations reported in the literature are done to test the program's accuracy. Preliminary results of the influence of chamber design and chamber materials on dose-to-water determination are presented.
Energy Technology Data Exchange (ETDEWEB)
Bourhaleb, F; Givehchi, N; Iliescu, S; Rosa, A La; Pecka, A; Peroni, C [Dipartimento di Fisica Sperimentale, Universita' di Torino, Via P. Giuria 1, Torino 10125 (Italy); Attili, A; Cirio, R; Marchetto, F; Donetti, M; Garella, M A; Giordanengo, S; Pardo, J [INFN, Sezione di Torino, Via P. Giuria 1, Torino 10125 (Italy); Cirrone, P [INFN, Laboratori Nazionali del Sud, Via S.Sofia 62, Catania 95125 (Italy)], E-mail: bourhaleb@to.infn.it
2008-02-01
Proton and carbon ion beams have a very sharp Bragg peak. For proton beams of energies below 100 MeV, fitting the region around the maximum of the Bragg peak with a Gaussian gives a sigma along the beam direction smaller than 1 mm, while for carbon ion beams the sigma derived with the same technique is smaller than 1 mm for energies up to 360 MeV. In order to use low-energy proton and carbon ion beams in hadrontherapy and to achieve an acceptable homogeneity of the spread-out Bragg peak (SOBP), either the peak positions along the beam have to be quite close to each other or the longitudinal peak shape needs to be broadened by at least a few millimeters by means of a properly designed ripple filter. With a synchrotron accelerator used in conjunction with active scanning techniques, a ripple filter is necessary to reduce the number of energy switches needed to obtain a smooth SOBP, which also leads to shorter overall irradiation times. We studied the impact of the ripple filter design on the dose uniformity in the SOBP region by means of Monte Carlo simulations implemented with the Geant4 package. We simulated the beam delivery line, supporting both proton and carbon ion beams, at different beam energies. We compared the effects of different kinds of ripple filters and their advantages.
Gomà, Carles; Andreo, Pedro; Sempau, Josep
2016-03-01
This work calculates beam quality correction factors (k_Q) in monoenergetic proton beams using detailed Monte Carlo simulation of ionization chambers. It uses the Monte Carlo code PENH and the electronic stopping powers resulting from the adoption of two different sets of mean excitation energy values for water and graphite: (i) the values currently recommended by ICRU 37 and ICRU 49, I_w = 75 eV and I_g = 78 eV, and (ii) the recently proposed I_w = 78 eV and I_g = 81.1 eV. Twelve different ionization chambers were studied. The k_Q factors calculated using the two different sets of I-values were found to agree with each other within 1.6% or better. k_Q factors calculated using the current ICRU I-values were found to agree within 2.3% or better with the k_Q factors tabulated in IAEA TRS-398, and within 1% or better with experimental values published in the literature. k_Q factors calculated using the new I-values were also found to agree within 1.1% or better with the experimental values. This work concludes that perturbation correction factors in proton beams, currently assumed to be equal to unity, are in fact significantly different from unity for some of the ionization chambers studied.
Kalantzis, Georgios; Tachibana, Hidenobu
2014-01-01
For microdosimetric calculations, event-by-event Monte Carlo (MC) methods are considered the most accurate. The main shortcoming of these methods is their extensive computational time requirement. In this work we present an event-by-event MC code for low-energy electron and proton tracks, for accelerated microdosimetric MC simulations on a graphics processing unit (GPU). Additionally, a hybrid implementation scheme was realized by employing OpenMP and CUDA in such a way that both the GPU and the multi-core CPU were utilized simultaneously. The two implementation schemes were tested and compared with the sequential single-threaded MC code on the CPU. Performance comparison was based on the speedup for a set of benchmarking cases of electron and proton tracks. A maximum speedup of 67.2 was achieved for the GPU-based MC code, while the hybrid approach improved the speedup by a further 20%. The results indicate the capability of our CPU-GPU implementation for accelerated MC microdosimetric calculations of both electron and proton tracks without loss of accuracy.
Comparison of Monte Carlo simulations with proton experiment for a thick Au absorber
Energy Technology Data Exchange (ETDEWEB)
Yevseyeva, Olga; Assis, Joaquim T. de, E-mail: yevseveva@iprj.uerj.b, E-mail: joaquim@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico; Evseev, Ivan; Schelin, Hugo R.; Paschuk, Sergei A.; Milhoretto, Edney; Setti, Joao A.P., E-mail: evseev@utfpr.edu.b, E-mail: schelin@utfpr.edu.b, E-mail: sergei@utfpr.edu.b, E-mail: edneymilhoretto@yahoo.co, E-mail: jsetti@gmail.co [Universidade Tecnologica Federal do Parana, Curitiba, PR (Brazil); Diaz, Katherin S., E-mail: kshtejer@infomed.sld.c [Centro de Aplicaciones Tecnologicas y Desarrollo Nuclear, Havana (Cuba); Hormaza, Joel M., E-mail: jmesa@ibb.unesp.b [UNESP, Botucatu, SP (Brazil). Inst. de Biociencias; Lopes, Ricardo T., E-mail: ricardo@lin.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear
2009-07-01
Proton therapy applications deal with relatively thick targets like the human head or trunk. Therefore, relatively small differences in the total proton stopping power given, for example, by the different models provided by GEANT4 could lead to significant disagreement in the final proton energy spectra when integrated along lengthy proton trajectories. This work presents a comparison of proton energy spectra for 49.1 MeV protons passing through Au absorbers of different thicknesses, obtained by GEANT4.8.2 simulations using the ICRU49, Ziegler1985 and Ziegler2000 models. The comparison was made with the experimental data of Tschalaer, with TRIM/SRIM2008 and MCNPX2.4.0 simulations, and with the Payne analytical solution for the transport equation in the Fokker-Planck approximation. It is shown that the simulations reproduce the experimental spectra with some detectable discrepancies. It should be noted that all the spectra lie at proton energies significantly above 2 MeV, i.e. in the so-called 'Bethe-Bloch region'. Therefore the observed disagreements among GEANT4 results simulated with different models are somewhat unexpected. Further studies are necessary for a better understanding and to reach definitive conclusions. (author)
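For reference, the 'Bethe-Bloch region' mentioned above is the energy range in which the mean electronic stopping power follows the Bethe formula; its standard form (conventional notation, not quoted in the abstract) is

```latex
-\left\langle \frac{dE}{dx} \right\rangle
  = K z^{2} \frac{Z}{A} \frac{1}{\beta^{2}}
    \left[ \frac{1}{2}\ln\!\frac{2 m_e c^{2}\beta^{2}\gamma^{2} T_{\max}}{I^{2}}
           - \beta^{2} - \frac{\delta(\beta\gamma)}{2} \right]
```

where z is the projectile charge, Z and A are the absorber atomic number and mass, T_max is the maximum energy transfer to a single electron, δ is the density-effect correction, and I is the mean excitation energy, the parameter whose tabulated value differs between stopping-power models such as ICRU49 and the Ziegler compilations and can thus drive the kind of model-to-model differences discussed above.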
Monte-Carlo simulation-based statistical modeling
Chen, John
2017-01-01
This book brings together expert researchers engaged in Monte-Carlo simulation-based statistical modeling, offering them a forum to present and discuss recent issues in methodological development as well as public health applications. It is divided into three parts, with the first providing an overview of Monte-Carlo techniques, the second focusing on missing data Monte-Carlo methods, and the third addressing Bayesian and general statistical modeling using Monte-Carlo simulations. The data and computer programs used here will also be made publicly available, allowing readers to replicate the model development and data analysis presented in each chapter, and to readily apply them in their own research. Featuring highly topical content, the book has the potential to impact model development and data analyses across a wide spectrum of fields, and to spark further research in this direction.
A Monte Carlo tool for combined photon and proton treatment planning verification
Energy Technology Data Exchange (ETDEWEB)
Seco, J [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Jiang, H [University of Arkansas for Medical Sciences, 4301 W. Markham Street, Little Rock, Arkansas 72202 USA (United States); Herrup, D [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Kooy, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Paganetti, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States)
2007-06-15
Photons and protons are usually used independently to treat cancer. However, at MGH patients can be treated with both photons and protons, since both modalities are available on site. A combined therapy can be advantageous in cancer therapy due to the skin-sparing ability of photons and the sharp Bragg peak fall-off of protons beyond the tumor. In the present work, we demonstrate how to implement a combined 3D MC toolkit for photon and proton (ph-pr) therapy, which can be used for verification of the treatment plan. The commissioning of a MC system for combined ph-pr initially involves the development of a MC model of both the photon and proton treatment heads. The MC dose tool was evaluated on a head and neck patient treated with combined photon and proton beams. The combined ph-pr dose agreed with measurements in a solid water phantom to within 3%mm. Comparison with the commercial planning system's pencil beam prediction agrees within 3% (except for air cavities and bone regions).
Energy Technology Data Exchange (ETDEWEB)
Farah, J; Bonfrate, A; Donadille, L; Dubourg, N; Lacoste, V; Martinetti, F; Sayah, R; Trompier, F; Clairand, I [IRSN - Institute for Radiological Protection and Nuclear Safety, Fontenay-aux-roses (France); Caresana, M [Politecnico di Milano, Milano (Italy); Delacroix, S; Nauraye, C [Institut Curie - Centre de Protontherapie d Orsay, Orsay (France); Herault, J [Centre Antoine Lacassagne, Nice (France); Piau, S; Vabre, I [Institut de Physique Nucleaire d Orsay, Orsay (France)
2014-06-01
Purpose: Measure stray radiation inside a passive scattering proton therapy facility, compare values to Monte Carlo (MC) simulations, and identify the actual needs and challenges. Methods: Measurements and MC simulations were used to assess the neutron exposure associated with 75 MeV ocular or 180 MeV intracranial passively scattered proton treatments. First, using a specifically designed high-sensitivity Bonner Sphere system, neutron spectra were measured at different positions inside the treatment rooms. Next, measurement-based mapping of neutron ambient dose equivalent was carried out using several TEPCs and rem-meters. Finally, photon and neutron organ doses were measured using TLDs, RPLs and PADCs set inside anthropomorphic phantoms (Rando, 1- and 5-year-old CIRS). All measurements were also simulated with MCNPX to investigate the efficiency of MC models in predicting stray neutrons under different nuclear cross sections and models. Results: Knowledge of the neutron fluence and energy distribution inside a proton therapy room is critical for stray radiation dosimetry. However, since spectrometry unfolding is initiated from an MC guess spectrum and suffers from algorithmic limits, a 20% spectrometry uncertainty is expected. H*(10) mapping with TEPCs and rem-meters showed good agreement between the detectors. Differences within measurement uncertainty (10-15%) were observed and are inherent to the energy, fluence and directional response of each detector. For a typical ocular and intracranial treatment respectively, neutron doses outside the clinical target volume of 0.4 and 11 mGy were measured inside the Rando phantom. Photon doses were 2-10 times lower depending on organ position. High uncertainties (40%) are inherent to TLD and PADC measurements due to the need for neutron spectra at the detector position. Finally, stray neutron prediction with MC simulations proved to be extremely dependent on the proton beam energy and the nuclear models used
DEFF Research Database (Denmark)
Palmans, Hugo; Al-Sulaiti, L; Andreo, P
2013-01-01
-to-graphite in a graphite phantom to dose-to-water in a water phantom for 60 MeV mono-energetic protons were calculated using an analytical model and five different Monte Carlo codes (Geant4, FLUKA, MCNPX, SHIELD-HIT and McPTRAN.MEDIA). In general the fluence correction factors are found to be close to unity...... in energy at equivalent depths in water and graphite can be described by kfl = 0.9964 + 0.0024 ⋅ zw-eq with a relative standard uncertainty of 0.2%. Fluence correction factors derived from a ratio of calculated doses at equivalent depths in water and graphite can be described by kfl = 0.9947 + 0.0024 ⋅ zw...
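The two linear fits quoted in the (truncated) abstract above can be written down directly. A minimal Python sketch, assuming z_w-eq is the water-equivalent depth in cm and completing the truncated second fit with the same depth variable (an assumption, since the record cuts off):

```python
def kfl_from_energy(z_w_eq_cm):
    """Fluence correction factor from the fit based on fluence
    distributions in energy at equivalent depths in water and graphite
    (fit quoted in the abstract; stated relative standard
    uncertainty 0.2%)."""
    return 0.9964 + 0.0024 * z_w_eq_cm

def kfl_from_dose_ratio(z_w_eq_cm):
    """Fluence correction factor from the fit based on a ratio of
    calculated doses at equivalent depths (same abstract; the depth
    variable is assumed to be z_w-eq, as the record is truncated)."""
    return 0.9947 + 0.0024 * z_w_eq_cm

# Both fits stay within about 1% of unity over the first few cm,
# consistent with the abstract's "close to unity" remark.
```

The two fits share the same slope and differ only by a small constant offset, so the choice between them shifts the correction by about 0.2% at any depth.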
Depauw, Nicolas; Seco, Joao
2011-04-21
The imaging sensitivity of proton radiography has been studied and compared with kV and MV x-ray imaging using Monte Carlo simulations. A phantom was specifically modeled using 21 different material inserts with densities ranging from 0.001 to 1.92 g cm(-3). These simulations were run using the MGH double-scattered proton beam, scanned pencil proton beams from 200 to 490 MeV, as well as pure 50 keV, 100 keV, 1 MeV and 2 MeV x-ray beams. In order to compare the physics involved in both proton and photon radiography without being biased by the current state of the art in detector technology, the detectors were considered perfect. Along with spatial resolution, the contrast-to-noise ratio was evaluated and compared for each material. These analyses were performed using radiographic images that took into account the following: only primary protons, both primary and secondary protons, and both contributions while performing angular and energy cuts. Additionally, tissue-to-tissue contrasts in an actual lung cancer patient case were studied for simulated proton radiographs and compared against the original kV x-ray image, which corresponds to the current patient set-up image in the proton clinic. This study highlights the poorer spatial resolution of protons versus x-rays for radiographic imaging purposes, and the excellent density resolution of proton radiography. Contrasts around the tumor are higher using protons in a lung cancer patient case. The high density resolution of proton radiography is of great importance for specific tumor diagnostics, such as in lung cancer, where x-ray radiography performs poorly. Furthermore, the use of daily proton radiography prior to proton therapy would improve patient set-up while reducing the absorbed dose delivered through imaging.
Monte Carlo simulations of soft proton flares: testing the physics with XMM-Newton
Fioretti, Valentina; Malaguti, Giuseppe; Spiga, Daniele; Tiengo, Andrea
2016-01-01
Low-energy protons (<100-300 keV) in the Van Allen belt and the outer regions can enter the field of view of X-ray focusing telescopes, interact with the Wolter-I optics, and reach the focal plane. The use of special filters protects the XMM-Newton focal plane below an altitude of 70000 km, but above this limit the effect of soft protons is still present in the form of sudden flares in the count rate of the EPIC instruments, causing the loss of large amounts of observing time. We try to characterize the input proton population and the interaction physics by simulating, using the BoGEMMS framework, the proton interaction with a simplified model of the X-ray mirror module and the focal plane, and comparing the result with a real observation. The analysis of ten orbits of observations with the EPIC/pn instrument shows that the detection of flares in regions far outside the radiation belt is largely influenced by the varying orientation of the Earth's magnetosphere with respect to XMM-Newton's orbit, confirming th...
Jenkins, C. M.; Godang, R.; Cavaglia, M.; Cremaldi, L.; Summers, D.
2008-10-01
The 14 TeV center-of-mass proton-proton collisions at the LHC open the possibility for new physics, including the possible formation of microscopic black holes. A Fortran-based Monte Carlo event generator program called CATFISH (Collider grAviTational FIeld Simulator for black Holes) has been developed at the University of Mississippi to study signatures of microscopic black hole production (http://www.phy.olemiss.edu/GR/catfish). This black hole event generator includes many of the currently accepted theoretical results for microscopic black hole formation. High energy physics data analysis is shifting from Fortran to C++ as the CERN data analysis packages HBOOK and PAW are no longer supported; the C++-based ROOT is replacing these packages. Work done at the University of South Alabama has resulted in a successful inclusion of CATFISH into ROOT. The methods used to interface the Fortran-based CATFISH with the C++-based ROOT will be presented. Benchmark histograms will be presented demonstrating the conversion. Preliminary results will be presented for selecting black hole candidate events in 14 TeV center-of-mass proton-proton collisions.
Afanasiev, Alexandr; Vainio, Rami
2016-01-01
Context. Solar energetic particles observed in association with coronal mass ejections (CMEs) are produced by CME-driven shock waves. The acceleration of particles is considered to be due to diffusive shock acceleration (DSA). Aims. We aim at a better understanding of DSA in the case of quasi-parallel shocks, in which self-generated turbulence in the shock vicinity plays a key role. Methods. We have developed and applied a new Monte Carlo simulation code for the acceleration of protons in parallel coronal shocks. The code performs a self-consistent calculation of resonant interactions of particles with Alfvén waves based on quasi-linear theory. In contrast to the existing Monte Carlo codes of DSA, the new code features the full quasi-linear resonance condition of particle pitch-angle scattering. This allows us to take the anisotropy of particle pitch-angle scattering into account, while the older codes implement an approximate resonance condition leading to isotropic scattering. We performed simulations with...
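The distinction between the full and the approximate resonance condition can be made explicit. In standard quasi-linear-theory notation (not spelled out in the abstract, so the symbols here are the conventional ones), a proton of speed v, pitch-angle cosine μ and Lorentz factor γ resonates with a parallel-propagating Alfvén wave of frequency ω and wavenumber k when

```latex
\omega - k\, v\, \mu = \pm \frac{\Omega}{\gamma},
```

where Ω is the proton cyclotron frequency. Older codes drop ω (a good approximation when v is much larger than the Alfvén speed), which gives the approximate resonant wavenumber k_res ≈ Ω / (γ v |μ|) and, as the abstract notes, leads in those implementations to isotropic pitch-angle scattering; retaining the full condition preserves the scattering anisotropy.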
Monte Carlo Predictions of Proton SEE Cross-Sections from Heavy Ion Test Data
Xi, Kai; Zhang, Zhan-Gang; Hou, Ming-Dong; Sun, You-Mei; Luo, Jie; Liu, Tian-Qi; Wang, Bin; Ye, Bing; Yin, Ya-Nan; Liu, Jie
2015-01-01
The limits of previous methods prompted us to design a new approach (named PRESTAGE) to predict proton single event effect (SEE) cross-sections using heavy-ion test data. To simulate the SEE mechanisms more realistically, we adopt Geant4 and a location-dependent strategy to describe the physics processes and the sensitivity of the device. Cross-sections predicted by PRESTAGE for over twenty devices are compared with measured data. The evidence shows that PRESTAGE can calculate not only single event upsets induced by proton indirect ionization, but also direct ionization effects and single event latch-ups. Most of the PRESTAGE results agree with the experimental data to within a factor of 2-3.
Elmekawy, Ahmed Farouk
The distal edge of therapeutic proton radiation beams was investigated by different methods. Proton beams produced at the Hampton University Proton Therapy Institute (HUPTI) were used to irradiate a polymethylmethacrylate (PMMA) phantom at three different ranges (13.5, 17.0 and 21.0 cm) to investigate the distal-slope dependence of the Bragg peak. The activation of ¹¹C was studied by scanning the phantom less than 10 minutes post-irradiation with a Philips Big Bore Gemini PET/CT. The DICOM images were imported into the Varian Eclipse Treatment Planning System (TPS) and then analyzed with ImageJ. The distal slope ranged from -0.1671 +/- 0.0036 to -0.1986 +/- 0.0052 (pixel intensity/slice number) for ranges 13.5 to 21.0 cm, respectively. A realistic description of the setup was modeled using the GATE 7.0 Monte Carlo simulation tool and compared to the experimental data. The results show the distal slope ranged from -0.1158 +/- 0.0133 to -0.0787 +/- 0.002 (Gy/mm). Additionally, low-activity ¹¹C sources were simulated to study the dependence of the reconstructed ¹¹C half-life on the initial activity for six ranges chosen around the previous activation study. The results for the expected/nominal half-life vs. activity ranged from -5 x 10^-4 +/- 2.8104 x 10^-4 to 1.6 x 10^-3 +/- 9.44 x 10^-4 (% diff./Bq). The comparison between two experiments with proton beams on a PMMA phantom and a multi-layer ion chamber, and two GATE simulations of a proton beam incident on a water phantom and a ¹¹C PET study, shows that: (i) the variations of the steepness of the distal fall-off slopes are found to be similar, thus validating the sensitivity of the PET technique to range degradation, and (ii) the average of the super-ratio differences observed between all studies is primarily due to the difference in the dose deposited in the media.
Energy Technology Data Exchange (ETDEWEB)
Guan, Fada; Peeler, Christopher; Taleei, Reza; Randeniya, Sharmalee; Ge, Shuaiping; Mirkovic, Dragan; Mohan, Radhe; Titt, Uwe, E-mail: UTitt@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States); Bronk, Lawrence [Department of Experimental Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States); Geng, Changran [Department of Nuclear Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China and Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States); Grosshans, David [Department of Experimental Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 (United States)
2015-11-15
Purpose: The motivation of this study was to find and eliminate the cause of errors in dose-averaged linear energy transfer (LET) calculations from therapeutic protons in small targets, such as biological cell layers, calculated using the Geant4 Monte Carlo code, and to provide a recommendation for selecting an appropriate LET quantity from Geant4 simulations to correlate with the biological effectiveness of therapeutic protons. Methods: The authors developed a particle-tracking-step-based strategy to calculate the average LET quantities (track-averaged LET, LET_t, and dose-averaged LET, LET_d) using Geant4 for different tracking step size limits. A step size limit refers to the maximum allowable tracking step length. The authors investigated how the tracking step size limit influenced the calculated LET_t and LET_d of protons with six different step limits ranging from 1 to 500 μm in a water phantom irradiated by a 79.7-MeV clinical proton beam. In addition, the authors analyzed the detailed stochastic energy deposition information, including fluence spectra and dose spectra of the energy deposition per step of protons. As a reference, the authors also calculated the averaged LET and analyzed the LET spectra by combining the Monte Carlo method and the deterministic method. Relative biological effectiveness (RBE) calculations were performed to illustrate the impact of different LET calculation methods on the RBE-weighted dose. Results: Simulation results showed that the step limit effect was small for LET_t but significant for LET_d. This resulted from differences in the energy deposition per step between the fluence spectra and dose spectra at different depths in the phantom. Using the Monte Carlo particle tracking method in Geant4 can result in incorrect LET_d calculation results in the dose plateau region for small step limits. The erroneous LET_d results can be attributed to the algorithm to
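The two averages discussed above have simple definitions over a set of tracking steps. A minimal sketch of the standard definitions (the function name and the per-step input lists are illustrative, not from the paper):

```python
def averaged_lets(step_energies_kev, step_lengths_um):
    """Track-averaged and dose-averaged LET from per-step energy
    deposits e_i (keV) and step lengths l_i (um).

    LET_t weights each step's LET (e_i / l_i) by track length, which
    reduces to total energy over total length; LET_d weights it by the
    energy deposited in the step, so steps with a large
    energy-deposition-per-step dominate.  That weighting is why LET_d
    is the more step-limit-sensitive of the two quantities.
    """
    lets = [e / l for e, l in zip(step_energies_kev, step_lengths_um)]
    let_t = sum(L * l for L, l in zip(lets, step_lengths_um)) / sum(step_lengths_um)
    let_d = sum(L * e for L, e in zip(lets, step_energies_kev)) / sum(step_energies_kev)
    return let_t, let_d

# Two steps of equal length but unequal deposits: LET_d > LET_t
let_t, let_d = averaged_lets([1.0, 4.0], [1.0, 1.0])  # -> (2.5, 3.4)
```

Because LET_d weights by energy deposited, any step-size artifact in the energy-deposition-per-step spectrum propagates directly into LET_d while largely cancelling in LET_t, consistent with the step-limit sensitivity reported above.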
Range verification of passively scattered proton beams based on prompt gamma time patterns
Testa, Mauro; Min, Chul Hee; Verburg, Joost M.; Schümann, Jan; Lu, Hsiao-Ming; Paganetti, Harald
2014-07-01
We propose a proton range verification technique for passive scattering proton therapy systems where spread out Bragg peak (SOBP) fields are produced with rotating range modulator wheels. The technique is based on the correlation of time patterns of the prompt gamma ray emission with the range of protons delivering the SOBP. The main feature of the technique is the ability to verify the proton range with a single point of measurement and a simple detector configuration. We performed four-dimensional (time-dependent) Monte Carlo simulations using TOPAS to show the validity and accuracy of the technique. First, we validated the hadronic models used in TOPAS by comparing simulations and prompt gamma spectrometry measurements published in the literature. Second, prompt gamma simulations for proton range verification were performed for the case of a water phantom and a prostate cancer patient. In the water phantom, the proton range was determined with 2 mm accuracy with a full ring detector configuration for a dose of ~2.5 cGy. For the prostate cancer patient, 4 mm accuracy on range determination was achieved for a dose of ~15 cGy. The results presented in this paper are encouraging in view of a potential clinical application of the technique.
Bauer, J.; Unholtz, D.; Kurz, C.; Parodi, K.
2013-08-01
We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation-induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying the positron-emitting radionuclide yield experimentally in homogeneous phantoms provides access to the fundamental production channels. To this end, five different materials were irradiated by monoenergetic proton pencil beams at various energies and the induced β+ activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow modelling of the absolute level of measured β+ activity induced in the investigated
Parallel proton transfer pathways in aqueous acid-base reactions
Cox, M. J.; Bakker, H.J.
2008-01-01
We study the mechanism of proton transfer (PT) between the photoacid 8-hydroxy-1,3, 6-pyrenetrisulfonic acid (HPTS) and the base chloroacetate in aqueous solution. We investigate both proton and deuteron transfer reactions in solutions with base concentrations ranging from 0.25M to 4M. Using femtosecond midinfrared spectroscopy, we probe the vibrational responses of HPTS, its conjugate photobase, the hydrated proton/deuteron, and chloroacetate. The measurement of these four resonances allows ...
Wet-based glaciation in Phlegra Montes, Mars.
Gallagher, Colman; Balme, Matt
2016-04-01
Eskers are sinuous landforms composed of sediments deposited from meltwaters in ice-contact glacial conduits. This presentation describes the first definitive identification of eskers on Mars still physically linked with their parent system (1), a Late Amazonian-age glacier (~150 Ma) in Phlegra Montes. Previously described Amazonian-age glaciers on Mars are generally considered to have been dry based, having moved by creep in the absence of subglacial water required for sliding, but our observations indicate significant sub-glacial meltwater routing. The confinement of the Phlegra Montes glacial system to a regionally extensive graben is evidence that the esker formed due to sub-glacial melting in response to an elevated, but spatially restricted, geothermal heat flux rather than climate-induced warming. Now, however, new observations reveal the presence of many assemblages of glacial abrasion forms and associated channels that could be evidence of more widespread wet-based glaciation in Phlegra Montes, including the collapse of several distinct ice domes. This landform assemblage has not been described in other glaciated, mid-latitude regions of the martian northern hemisphere. Moreover, Phlegra Montes are flanked by lowlands displaying evidence of extensive volcanism, including contact between plains lava and piedmont glacial ice. These observations provide a rationale for investigating non-climatic forcing of glacial melting and associated landscape development on Mars, and can build on insights from Earth into the importance of geothermally-induced destabilisation of glaciers as a key amplifier of climate change. (1) Gallagher, C. and Balme, M. (2015). Eskers in a complete, wet-based glacial system in the Phlegra Montes region, Mars, Earth and Planetary Science Letters, 431, 96-109.
Fuel-Cell Electrolytes Based on Organosilica Hybrid Proton Conductors
Narayan, Sri R.; Yen, Shiao-Pin S.
2008-01-01
A new membrane composite material has been developed that combines an organosilica proton conductor with perfluorinated Nafion material to achieve good proton conductivity and high-temperature performance for fuel-cell membranes in stationary, transportation, and portable applications. A composite membrane based on a new class of mesoporous, proton-conducting, hydrogen-bonded organosilica, used with Nafion, is intended to achieve high proton conductivities of the order of 10^-1 S/cm and to allow for water retention and high proton conductivity over a wider range of temperatures than currently offered by Nafion alone. At the time of this reporting, this innovation is at the concept level. Some of the materials and processes investigated have shown good proton conductivity, but membranes have not yet been prepared and demonstrated.
Shin, Wook-Geun; Min, Chul Hee; Shin, Jae-Ik; Jeong, Jong Hwi; Lee, Se Byeong
2015-07-01
For in-vivo range verification in proton therapy, attempts have been made to measure the spatial distribution of the prompt gammas generated by the proton-induced interactions and to determine the proton dose distribution. However, the high energies of prompt gammas and background gammas are still problematic in measuring the distribution. In this study, we suggested a new method for determining the in-vivo range by utilizing the time structure of the prompt gammas formed during the rotation of a range modulation wheel (RMW) in passive scattering proton therapy. To validate the Monte Carlo code simulating the proton beam nozzle, we compared the axial percent depth doses (PDDs) with the measured PDDs for varying beam range from 4.73 to 24.01 cm. Also, we assessed the relationship between the proton dose rate and the time structure of the prompt gammas in a water phantom. The results of the PDD showed agreement within relative errors of 1.1% in the distal range and 2.9% in the modulation width. The average dose difference in the modulation was assessed as less than 1.3% by comparison with the measurements. The time structure of prompt gammas was well-matched, within 0.39 ms, with the proton dose rate, and this enabled an accurate prediction of the in-vivo range.
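The correspondence between the prompt-gamma time structure and the proton dose rate can be illustrated with a simple linear-correlation sketch; the signal shapes below are invented placeholders, not measured data:

```python
# Hypothetical sketch: comparing the time structure of prompt-gamma
# counts with the proton dose rate over one RMW rotation period.
# All signal values are invented for illustration only.

def pearson(x, y):
    """Pearson linear correlation coefficient of two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Time-binned signals over one wheel rotation (arbitrary units).
dose_rate    = [0.0, 0.2, 0.8, 1.0, 0.7, 0.3, 0.1, 0.0]
gamma_counts = [1.0, 3.0, 9.0, 11.0, 8.0, 4.0, 2.0, 1.0]  # incl. flat background

print(round(pearson(dose_rate, gamma_counts), 3))  # 1.0
```

Because the toy gamma signal is an exact affine transform of the dose rate (scaled plus a constant background), the correlation is 1; in practice background and statistics degrade this, which is the measurement challenge the abstract describes.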
Energy Technology Data Exchange (ETDEWEB)
Sahoo, G.S. [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Tripathy, S.P., E-mail: sam.tripathy@gmail.com [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Homi Bhabha National Institute, Mumbai 400094 (India); Molokanov, A.G.; Aleynikov, V.E. [Joint Institute for Nuclear Research, Dubna 141980 (Russian Federation); Sharma, S.D. [Homi Bhabha National Institute, Mumbai 400094 (India); Radiological Physics & Advisory Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Bandyopadhyay, T. [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Homi Bhabha National Institute, Mumbai 400094 (India)
2016-05-11
In this work, we have used CR-39 detectors to estimate the LET (linear energy transfer) spectrum of secondary particles due to a 171 MeV proton beam at different depths of water, including the Bragg peak region. The measured LET spectra were compared with those obtained from FLUKA Monte Carlo simulation. The absorbed dose (D{sub LET}) and dose equivalent (H{sub LET}) were estimated using the LET spectra. The values of D{sub LET} and H{sub LET} per incident proton fluence were found to increase with increasing depth of water and were maximum at the Bragg peak. - Highlights: • Measurement of LET spectra using CR-39 detectors at different depths of water. • Comparison of measured spectra with FLUKA Monte Carlo simulation. • Absorbed dose and dose equivalent were found to increase with depth of water.
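Folding an LET spectrum into D{sub LET} and H{sub LET} amounts to weighting each LET bin by a quality factor; the sketch below uses the ICRP 60 Q(L) relation, with invented spectrum values:

```python
# Sketch under stated assumptions: fold an LET spectrum into absorbed
# dose and dose equivalent using the ICRP 60 quality factor Q(L).
# The spectrum values below are invented for illustration.

def q_icrp60(let):
    """ICRP 60 quality factor; let in keV/um."""
    if let < 10.0:
        return 1.0
    if let <= 100.0:
        return 0.32 * let - 2.2
    return 300.0 / let ** 0.5

def dose_and_dose_equivalent(lets, doses):
    """lets: bin LET values (keV/um); doses: absorbed dose per bin (Gy).
    Returns (total absorbed dose in Gy, dose equivalent in Sv)."""
    d = sum(doses)
    h = sum(q_icrp60(l) * dd for l, dd in zip(lets, doses))
    return d, h

lets  = [5.0, 20.0, 150.0]   # keV/um (illustrative bins)
doses = [1e-3, 5e-4, 1e-4]   # Gy per bin (illustrative)

d, h = dose_and_dose_equivalent(lets, doses)
print(d, h)
```

The high-LET bins dominate H{sub LET} even when their absorbed-dose contribution is small, which is why both quantities peak near the Bragg peak where the LET spectrum hardens.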
Energy Technology Data Exchange (ETDEWEB)
Lekadir, H.; Abbas, I.; Champion, C. [Universite Paul Verlaine Metz, Laboratoire de Physique Moleculaire et des Collisions, Institut J. Barriol FR CNRS 2843, 1 Bd Arago, 57078 Metz Cedex3 (France); Hanssen, J. [Universite Paul Verlaine Metz, Laboratoire de Physique Moleculaire et des Collisions, Institut J. Barriol FR CNRS 2843, 1 Bd Arago, 57078 Metz Cedex3 (France)], E-mail: jocelyn@univ-metz.fr
2009-03-15
In the current work, we present a study of ionizing interactions between protons and molecular targets of biological interest, such as water vapour and DNA bases. Total cross sections for single and multiple ionizing processes are calculated in the independent electron model and compared to existing theoretical and experimental results for impact energies ranging from 10 keV/amu to 10 MeV/amu. The theoretical approach combines some characteristics of the classical trajectory Monte Carlo method with the classical over-barrier framework. In this 'mixed' approach, all the particles are described classically, by assuming that the target electrons are involved in the collision only when their binding energy is greater than the maximum of the potential energy of the projectile-target system. We test our theoretical approach on the water molecule, and the obtained results are compared to a large set of data; reasonable agreement is generally observed, especially for impact energies greater than 100 keV, except for the double-ionization process, for which large discrepancies are reported. For the DNA bases, the obtained results are given without any comparison, since the literature to date contains very few cross-section measurements.
Energy Technology Data Exchange (ETDEWEB)
Tesfamicael, B; Gueye, P; Lyons, D [Hampton University, Hampton, VA (United States); Avery, S [University of Pennsylvania, Sicklerville, NJ (United States); Mahesh, M [Johns Hopkins Univ, Baltimore, MD (United States)
2014-06-01
Purpose: To monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate prostate cancer proton therapy based treatments. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm{sup 3} DuPont™ Delrin blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and the same vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were used to extract the energy deposited in each fiber and in the scintillating block. Results: The transverse dose distributions from secondary particles in both cases agree within 5% and show very good symmetry. The energy deposited not only gradually increases as one moves from the peripheral row fibers towards the center of the block (aligned with the center of the prostate) but also decreases as one goes from the frontal to the distal region of the block. The ratio of the doses in the prostate to those in the middle two rows of fibers showed a linear relationship with a slope of (−3.55 ± 2.26) × 10{sup −5} MeV per treatment Gy. The distal detectors recorded very little deposited energy owing to attenuation in the water. Conclusion: With a good calibration and the ability to define a good correlation between the dose to the external fibers and that to the prostate, such fibers can be used for real-time dose verification of the target.
Shin, Wook-Geun; Shin, Jae-Ik; Jeong, Jong Hwi; Lee, Se Byeong
2015-01-01
For in vivo range verification in proton therapy, attempts have been made to measure the spatial distribution of the prompt gammas generated by proton-induced interactions, which is closely related to the proton dose distribution. However, the high energy of the prompt gammas and background gammas are still problematic in measuring the distribution. In this study, we suggested a new method for determining the in vivo range by utilizing the time structure of the prompt gammas formed during the rotation of a range modulation wheel (RMW) in passive scattering proton therapy. To validate the Monte Carlo code simulating the proton beam nozzle, simulated axial percent depth doses (PDDs) were compared with the measured PDDs for beam ranges varying from 4.73 to 24.01 cm. The relationship between the proton dose rate and the time structure of the prompt gammas was then assessed in a water phantom. The results of the PDD showed accurate agreement within relative errors of 1.1% in the distal range and 2.9% in...
Monte Carlo based radial shield design of typical PWR reactor
Energy Technology Data Exchange (ETDEWEB)
Gul, Anas; Khan, Rustam; Qureshi, M. Ayub; Azeem, Muhammad Waqar; Raza, S.A. [Pakistan Institute of Engineering and Applied Sciences, Islamabad (Pakistan). Dept. of Nuclear Engineering]; Stummer, Thomas [Technische Univ. Wien (Austria). Atominst.]
2016-11-15
Neutron and gamma flux and dose equivalent rate distributions are analysed in the radial shield of a typical PWR type reactor based on the Monte Carlo radiation transport computer code MCNP5. The ENDF/B-VI continuous energy cross-section library has been employed for the criticality and shielding analysis. The computed results are in good agreement with the reference results (maximum difference is less than 56 %). This implies that MCNP5 is a good tool for accurate prediction of neutron and gamma flux and dose rates in the radial shield around the core of PWR type reactors.
Bznuni, S A; Zhamkochyan, V M; Polanski, A; Sosnin, A N; Khudaverdyan, A H
2001-01-01
Parameters are studied of a subcritical cascade reactor driven by a proton accelerator and based on a primary lead-bismuth target, a main reactor constructed analogously to the molten-salt breeder reactor (MSBR) core, and a booster reactor analogous to the core of the BN-350 liquid-metal-cooled fast breeder reactor (LMFBR). It is shown by means of Monte Carlo modelling that the reactor under study provides safe operation modes (k_{eff}=0.94-0.98), is capable of effectively transmuting radioactive nuclear waste, and reduces the requirements on the accelerator beam current by an order of magnitude. Calculations show that the maximal neutron flux in the thermal zone is 10^{14} cm^{-2}·s^{-1} and in the fast booster zone is 5.12·10^{15} cm^{-2}·s^{-1} at k_{eff}=0.98 and a proton beam current I=2.1 mA.
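The order-of-magnitude reduction in beam-current requirements follows from the standard source-multiplication relation for subcritical systems; a minimal illustration (not taken from the paper itself):

```python
# Sketch of why a higher k_eff relaxes the accelerator requirement:
# in a subcritical core, source neutrons are multiplied by
# M = 1 / (1 - k_eff), so fewer source protons are needed for the
# same fission power. Values below match the k_eff range quoted above.

def source_multiplication(k_eff):
    """Neutron source multiplication factor for a subcritical system."""
    assert 0.0 < k_eff < 1.0, "relation applies to subcritical systems only"
    return 1.0 / (1.0 - k_eff)

print(source_multiplication(0.94))  # ~16.7
print(source_multiplication(0.98))  # ~50
```

Moving from k_eff = 0.94 to 0.98 triples the multiplication, consistent with the reported order-of-magnitude relaxation of the beam-current requirement relative to less multiplying configurations.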
The general base in the thymidylate synthase catalyzed proton abstraction.
Ghosh, Ananda K; Islam, Zahidul; Krueger, Jonathan; Abeysinghe, Thelma; Kohen, Amnon
2015-12-14
The enzyme thymidylate synthase (TSase), an important chemotherapeutic drug target, catalyzes the formation of 2'-deoxythymidine-5'-monophosphate (dTMP), a precursor of one of the DNA building blocks. TSase catalyzes a multi-step mechanism that includes the abstraction of a proton from the C5 of the substrate 2'-deoxyuridine-5'-monophosphate (dUMP). Previous studies on ecTSase proposed that an active-site residue, Y94, serves as the general base abstracting this proton. However, since Y94 is neither very basic, nor connected to basic residues, nor located close enough to the pyrimidine proton to be abstracted, the actual identity of this base remains enigmatic. Based on crystal structures, an alternative hypothesis is that the nearest potential proton acceptor of the C5 of dUMP is a water molecule that is part of a hydrogen-bond (H-bond) network comprising several water molecules and several protein residues, including H147, E58, N177, and Y94. Here, we examine the role of the residue Y94 in the proton abstraction step by removing its hydroxyl group (Y94F mutant). We investigated the effect of the mutation on the temperature dependence of intrinsic kinetic isotope effects (KIEs) and found that these KIEs are more temperature dependent than those of the wild-type enzyme (WT). These results suggest that the phenolic -OH of Y94 is a component of the transition state for the proton abstraction step. The findings further support the hypothesis that no single functional group is the general base; rather, a network of bases and hydroxyls (from water molecules and tyrosine) sharing H-bonds across the active site can serve the role of the general base to remove the pyrimidine proton.
Sunil, C
2016-04-01
The neutron ambient dose equivalent outside the radiation shield of a proton therapy cyclotron vault is estimated using the unshielded dose equivalent rates and the attenuation lengths obtained from the literature and by simulations carried out with the FLUKA Monte Carlo radiation transport code. The source terms derived from the literature and that obtained from the FLUKA calculations differ by a factor of 2-3, while the attenuation lengths obtained from the literature differ by 20-40%. The instantaneous dose equivalent rates outside the shield differ by a few orders of magnitude, not only in comparison with the Monte Carlo simulation results, but also with the results obtained by line of sight attenuation calculations with the different parameters obtained from the literature. The attenuation of neutrons caused by the presence of bulk iron, such as magnet yokes is expected to reduce the dose equivalent by as much as a couple of orders of magnitude outside the shield walls.
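The line-of-sight calculation mentioned above is essentially a point-source attenuation model; the sketch below, with illustrative parameter values (h0, r, d and lam are all assumptions, not the paper's numbers), shows how sensitive the shielded dose rate is to the attenuation length:

```python
# Sketch of a line-of-sight shielding estimate: dose equivalent rate
# behind a shield of thickness d from an unshielded source term h0
# (defined at 1 m), attenuation length lam, and source-to-point
# distance r. All parameter values are illustrative only.
import math

def dose_rate_behind_shield(h0, r, d, lam):
    """h0: unshielded dose equivalent rate at 1 m; r: distance (m);
    d: shield thickness (cm); lam: attenuation length (cm)."""
    return h0 / r ** 2 * math.exp(-d / lam)

# A 20-40% spread in lam compounds into large differences behind a
# thick shield, echoing the discrepancies reported above.
h_a = dose_rate_behind_shield(1e6, 5.0, 250.0, 40.0)  # lam = 40 cm
h_b = dose_rate_behind_shield(1e6, 5.0, 250.0, 50.0)  # lam = 50 cm
print(h_b / h_a)  # exp(1.25) ~ 3.5x from the attenuation length alone
```

Because the shield thickness enters exponentially, modest uncertainties in the source term (factor 2-3) and attenuation length (20-40%) readily explain order-of-magnitude spreads in the final dose rate.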
Institute of Scientific and Technical Information of China (English)
殷雯; 张国锋; 杜建红; 梁九卿
2003-01-01
The Monte Carlo simulation and finite element methods have been used to calculate the heat deposition and temperature distribution in a tungsten plate target bombarded by high-energy protons from an accelerator with a beam power of 100 kW. The results show that the heat deposition in the target, reflector and shield will be 48 kW, 15 kW and 11 kW, respectively, and that the highest temperature in the target plates will remain below 100 °C when the surfaces of the plates are cooled by water.
Quantitative Monte Carlo-based holmium-166 SPECT reconstruction
Energy Technology Data Exchange (ETDEWEB)
Elschot, Mattijs; Smits, Maarten L. J.; Nijsen, Johannes F. W.; Lam, Marnix G. E. H.; Zonnenberg, Bernard A.; Bosch, Maurice A. A. J. van den; Jong, Hugo W. A. M. de [Department of Radiology and Nuclear Medicine, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands); Viergever, Max A. [Image Sciences Institute, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands)
2013-11-15
Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 ({sup 166}Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative {sup 166}Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum. Methods: A fast Monte Carlo (MC) simulator was developed for simulation of {sup 166}Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full {sup 166}Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (A{sup est}) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six {sup 166}Ho RE patients. Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80
Energy Technology Data Exchange (ETDEWEB)
Pignol, J.-P. [Toronto-Sunnybrook Regional Cancer Centre, Radiotherapy Dept., Toronto, Ontario (Canada); Slabbert, J. [National Accelerator Centre, Faure (South Africa)
2001-02-01
Fast neutrons (FN) have a higher radio-biological effectiveness (RBE) compared with photons; however, the mechanism of this increase remains a controversial issue. RBE variations are seen among various FN facilities and at the same facility when different tissue depths or thicknesses of hardening filters are used. These variations lead to uncertainties in dose reporting as well as in comparisons of clinical results. Besides radiobiology and microdosimetry, another powerful method for the characterization of FN beams is the calculation of total proton and heavy ion kerma spectra. The FLUKA and MCNP Monte Carlo codes were used to simulate these kerma spectra following a set of microdosimetry measurements performed at the National Accelerator Centre. The calculated spectra confirmed major classical statements: the RBE increase is linked to both low-energy protons and alpha particles yielded by (n,{alpha}) reactions on carbon and oxygen nuclei. The low-energy protons are produced by neutrons with energies between 10 keV and 10 MeV, while the alpha particles are produced by neutrons with energies between 10 keV and 15 MeV. Looking at the heavy-ion kerma from neutrons <15 MeV and the proton kerma from neutrons <10 MeV, it is possible to anticipate y* and RBE trends. (author)
GPU-Monte Carlo based fast IMRT plan optimization
Directory of Open Access Journals (Sweden)
Yongbao Li
2014-03-01
Purpose: Intensity-modulated radiation treatment (IMRT) plan optimization needs pre-calculated beamlet dose distributions. Pencil-beam or superposition/convolution type algorithms are typically used because of their high computation speed. However, inaccurate beamlet dose distributions, particularly in cases with high levels of inhomogeneity, may mislead the optimization and hinder the resulting plan quality. It is desirable to use Monte Carlo (MC) methods for beamlet dose calculations, yet the long computational time from repeated dose calculations for a large number of beamlets prevents this application. Our objective is to integrate a GPU-based MC dose engine in lung IMRT optimization using a novel two-step workflow. Methods: A GPU-based MC code, gDPM, is used. Each particle is tagged with the index of the beamlet from which the source particle originates. Deposited doses are stored separately for each beamlet based on this index. Because of the limited GPU memory size, a pyramid space is allocated for each beamlet, and dose outside the space is neglected. A two-step optimization workflow is proposed for fast MC-based optimization. In the first step, a rough dose calculation is conducted with only a small number of particles per beamlet. Plan optimization then yields an approximate fluence map. In the second step, more accurate beamlet doses are calculated, where the number of particles sampled for a beamlet is proportional to the intensity determined previously. A second round of optimization is conducted, yielding the final result. Results: For a lung case with 5317 beamlets, 10^5 particles per beamlet in the first round and 10^8 particles per beam in the second round are enough to obtain a good plan quality. The total simulation time is 96.4 s. Conclusion: A fast GPU-based MC dose calculation method along with a novel two-step optimization workflow has been developed. The high efficiency allows the use of MC for IMRT optimizations.
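The fluence-proportional sampling in the second round can be sketched as follows (an assumed illustration of the allocation rule, not gDPM code):

```python
# Sketch (assumed, not gDPM code) of second-round particle allocation:
# sample counts per beamlet proportional to the fluence map obtained
# from the first, coarse optimization round. Values are illustrative.

def allocate_particles(intensities, total_particles):
    """Return per-beamlet particle counts proportional to intensity."""
    s = sum(intensities)
    return [round(total_particles * w / s) for w in intensities]

fluence = [0.0, 1.0, 3.0, 6.0]             # approximate fluence map (round 1)
counts = allocate_particles(fluence, 1_000_000)
print(counts)  # [0, 100000, 300000, 600000]
```

Concentrating particles on high-intensity beamlets reduces the statistical noise exactly where it matters for the final dose, which is what makes the second, refined optimization round converge with a modest total particle budget.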
Proton radiography and proton computed tomography based on time-resolved dose measurements
Testa, Mauro; Verburg, Joost M.; Rose, Mark; Min, Chul Hee; Tang, Shikui; Hassane Bentefour, El; Paganetti, Harald; Lu, Hsiao-Ming
2013-11-01
We present a proof of principle study of proton radiography and proton computed tomography (pCT) based on time-resolved dose measurements. We used a prototype, two-dimensional, diode-array detector capable of fast dose rate measurements to acquire proton radiographic images expressed directly in water equivalent path length (WEPL). The technique is based on the time dependence of the dose distribution delivered by a proton beam traversing a range modulator wheel in passive scattering proton therapy systems. The dose rate produced in the medium by such a system is periodic and has a unique pattern in time at each point along the beam path, and thus encodes the WEPL. By measuring the time-dose pattern at the point of interest, the WEPL to this point can be decoded. If one measures the time-dose patterns at points on a plane behind the patient for a beam with sufficient energy to penetrate the patient, the obtained 2D distribution of the WEPL forms an image. The technique requires only a 2D dosimeter array and it uses only the clinical beam for a fraction of a second with negligible dose to the patient. We first evaluated the accuracy of the technique in determining the WEPL for static phantoms, aiming at beam range verification of the brain fields of medulloblastoma patients. Accurate beam ranges for these fields can significantly reduce the dose to the cranial skin of the patient and thus the risk of permanent alopecia. Second, we investigated the potential of the technique for real-time imaging of a moving phantom. Real-time tumor tracking by proton radiography could provide more accurate validations of tumor motion models due to the more sensitive dependence of proton beams on tissue density compared to x-rays. Our radiographic technique is rapid (~100 ms) and simultaneous over the whole field, so it can image mobile tumors without the interplay effect that is inherently challenging for methods based on pencil beams. Third, we present the reconstructed p
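Decoding WEPL from a time-dose pattern can be viewed as a nearest-pattern lookup against a pre-computed library; a toy sketch with synthetic patterns (all values invented, not measured data):

```python
# Illustrative sketch: decode WEPL by matching a measured time-dose
# pattern against a pre-computed library of reference patterns, one
# per WEPL value. Patterns here are synthetic stand-ins.

def decode_wepl(measured, library):
    """library: dict mapping WEPL (mm) -> reference time pattern.
    Returns the WEPL whose pattern best matches (least squares)."""
    def sq_err(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(library, key=lambda w: sq_err(measured, library[w]))

library = {
    100: [1.0, 0.8, 0.4, 0.1],
    150: [0.9, 1.0, 0.6, 0.2],
    200: [0.5, 0.9, 1.0, 0.6],
}
measured = [0.52, 0.88, 0.97, 0.61]
print(decode_wepl(measured, library))  # 200
```

Because each depth along the beam path sees a distinct periodic dose-rate pattern from the rotating wheel, one such lookup per detector pixel turns the 2D array of time traces directly into a WEPL image.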
Jie, Binbin; Sah, Chihtang
Pure water has been characterized empirically for nearly a century as dissociating into hydronium (H3O)+ and hydroxide (HO)− ions. Last March, we reported that the ~40-year experimental industrial standard of the chemical equilibrium reaction constant, the ion product, can be accounted for by a statistical-physics-based concentration product of two electrical charge carriers, the positively charged protons, p+, and the negatively charged proton holes or prohols, p−, with a thermal activation energy or proton trapping well depth of E(p+/p−) = 576 meV, in 0-100 °C pure liquid water. We now report that the empirically fitted industrial-standard experimental data (1985, 1987, 2005) of the two dc ion mobilities in liquid water can also be accounted for by trapping-limited drift of protons and prohols through proton channels of lower proton electrical potential valleys, E(p+/0), following the Pauling statistical model using the 1933 Bernal-Fowler water rule.
Proton-Exchange Membranes Based on Sulfonated Polymers
Directory of Open Access Journals (Sweden)
Yulia Sergeevna Sedesheva
2016-10-01
This review discusses different types of proton-exchange membranes used in fuel cells (FCs). One of the most promising electrolytes is the polymer electrolyte membrane (PEM). In recent years, researchers have paid great attention to various non-fluorinated or partially fluorinated hydrocarbon polymers, which may become a real alternative to Nafion. Typical examples are sulfonated polyetheretherketones, polyarylene ethers, polysulphones, and polyimides. The class of polyimide-based hydrocarbon proton-exchange membranes is considered separately as promising for widespread use in fuel cells; such membranes are of interest for further experimental development.
Proton-beam writing channel based on an electrostatic accelerator
Lapin, A. S.; Rebrov, V. A.; Kolin'ko, S. V.; Salivon, V. F.; Ponomarev, A. G.
2016-09-01
We have described the structure of the proton-beam writing channel as a continuation of a nuclear scanning microprobe channel. The problem of the accuracy of positioning a probe by constructing a new high-frequency electrostatic scanning system has been solved. Special attention has been paid to designing the probe-forming system and its various configurations have been considered. The probe-forming system that best corresponds to the conditions of the lithographic process has been found based on solving the problem of optimizing proton beam formation. A system for controlling beam scanning using multifunctional module of integrated programmable logic systems has been developed.
Proton Conductivity and Operational Features Of PBI-Based Membranes
DEFF Research Database (Denmark)
Qingfeng, Li; Jensen, Jens Oluf; Precht Noyé, Pernille;
2005-01-01
As an approach to high temperature operation of PEMFCs, acid-doped PBI membranes are under active development. The membrane exhibits high proton conductivity at low water contents at temperatures up to 200°C. Mechanisms of proton conduction for the membranes have been proposed. Based on the membranes, fuel cell tests have been demonstrated. Operating features of the PBI cell include no humidification, high CO tolerance, better heat utilization and possible integration with fuel processing units. Issues for further development are also discussed.
Peeler, Christopher R; Titt, Uwe
2012-06-21
In spot-scanning intensity-modulated proton therapy, numerous unmodulated proton beam spots are delivered over a target volume to produce a prescribed dose distribution. To accurately model field size-dependent output factors for beam spots, the energy deposition at positions radial to the central axis of the beam must be characterized. In this study, we determined the difference in the central-axis dose of spot-scanned fields arising from secondary-particle doses by investigating the energy deposition radial to the proton beam central axis from primary protons and secondary particles, for mathematical point-source and distributed-source models. The largest difference in the central-axis dose from secondary particles between the mathematical point source and the distributed source model was approximately 0.43%. Thus, we conclude that the central-axis dose of a spot-scanned field is effectively independent of the source model used to calculate the secondary-particle dose.
Presiado, Itay; Erez, Yuval; Huppert, Dan
2010-12-30
Steady-state and time-resolved techniques were employed to study the excited-state proton transfer (ESPT) from d-luciferin, the natural substrate of the firefly luciferase, to the mild acetate base in aqueous solutions. We found that in 1 M aqueous solutions of acetate or higher, a proton transfer (PT) process to the acetate takes place within 30 ps in both H(2)O and D(2)O solutions. The time-resolved emission signal is composed of three components. We found that the short-time component decay time is 300 and 600 fs in H(2)O and D(2)O, respectively. This component is attributed either to a PT process via the shortest water bridged complex available, ROH··H(2)O··Ac(-), or to PT taking place within a contact ion pair. The second time component of 2000 and 3000 fs for H(2)O and D(2)O, respectively, is attributed to ROH* acetate complex, whose proton wire is longer by one water molecule. The decay rate of the third, long-time component is proportional to the acetate concentration. We attribute it to the diffusion-assisted reaction as well as to PT process to the solvent.
Cortés-Giraldo, M A; Carabe, A
2015-04-07
We compare unrestricted dose average linear energy transfer (LET) maps calculated with three different Monte Carlo scoring methods in voxelized geometries irradiated with proton therapy beams. Simulations were done with the Geant4 (Geometry ANd Tracking) toolkit. The first method corresponds to a step-by-step computation of LET which has been reported previously in the literature. We found that this scoring strategy is influenced by spurious high-LET components, whose relative contribution to the dose average LET calculations increases significantly as the voxel size becomes smaller. Dose average LET values calculated for primary protons in water with a voxel size of 0.2 mm were a factor of ~1.8 higher than those obtained with a size of 2.0 mm at the plateau region for a 160 MeV beam. Such high-LET components are a consequence of proton steps in which the condensed-history algorithm determines an energy transfer to an electron of the material close to the maximum value, while the step length remains limited due to voxel boundary crossing. Two alternative methods were derived to overcome this problem. The second scores LET along the entire path described by each proton within the voxel. The third follows the same approach as the first method, but the LET is evaluated at each step from stopping power tables according to the proton kinetic energy. We carried out microdosimetry calculations with the aim of deriving reference dose average LET values from microdosimetric quantities. Significant differences between the methods were found for both pristine and spread-out Bragg peaks (SOBPs). The first method reported values systematically higher than the other two at depths proximal to the SOBP, by about 15% for a 5.9 cm wide SOBP and about 30% for an 11.0 cm one. At the distal SOBP, the second method gave values about 15% lower than the others. Overall, we found that the third method gave the most consistent
Pignol, J P; Slabbert, J
2001-02-01
Fast neutrons (FN) have a higher radiobiological effectiveness (RBE) than photons; however, the mechanism of this increase remains controversial. RBE variations are seen among various FN facilities and at the same facility when different tissue depths or thicknesses of hardening filters are used. These variations lead to uncertainties in dose reporting as well as in comparisons of clinical results. Besides radiobiology and microdosimetry, another powerful method for the characterization of FN beams is the calculation of total proton and heavy ion kerma spectra. The FLUKA and MCNP Monte Carlo codes were used to simulate these kerma spectra following a set of microdosimetry measurements performed at the National Accelerator Centre. The calculated spectra confirmed major classical statements: the RBE increase is linked to both low-energy protons and alpha particles yielded by (n,alpha) reactions on carbon and oxygen nuclei. The low-energy protons are produced by neutrons with energies between 10 keV and 10 MeV, while the alpha particles are produced by neutrons with energies between 10 keV and 15 MeV. Looking at the heavy ion kerma from neutrons <10 MeV, it is possible to anticipate y* and RBE trends.
A Monte Carlo-based model of gold nanoparticle radiosensitization
Lechtman, Eli Solomon
The goal of radiotherapy is to operate within the therapeutic window - delivering doses of ionizing radiation to achieve locoregional tumour control, while minimizing normal tissue toxicity. A greater therapeutic ratio can be achieved by utilizing radiosensitizing agents designed to enhance the effects of radiation at the tumour. Gold nanoparticles (AuNP) represent a novel radiosensitizer with unique and attractive properties. AuNPs enhance local photon interactions, thereby converting photons into localized damaging electrons. Experimental reports of AuNP radiosensitization reveal this enhancement effect to be highly sensitive to irradiation source energy, cell line, and AuNP size, concentration and intracellular localization. This thesis explored the physics and some of the underlying mechanisms behind AuNP radiosensitization. A Monte Carlo simulation approach was developed to investigate the enhanced photoelectric absorption within AuNPs, and to characterize the escaping energy and range of the photoelectric products. Simulations revealed a 10³-fold increase in the rate of photoelectric absorption using low-energy brachytherapy sources compared to megavolt sources. For low-energy sources, AuNPs released electrons with ranges of only a few microns in the surrounding tissue. For higher energy sources, longer ranged photoelectric products travelled orders of magnitude farther. A novel radiobiological model called the AuNP radiosensitization predictive (ARP) model was developed based on the unique nanoscale energy deposition pattern around AuNPs. The ARP model incorporated detailed Monte Carlo simulations with experimentally determined parameters to predict AuNP radiosensitization. This model compared well to in vitro experiments involving two cancer cell lines (PC-3 and SK-BR-3), two AuNP sizes (5 and 30 nm) and two source energies (100 and 300 kVp). The ARP model was then used to explore the effects of AuNP intracellular localization using 1.9 and 100 nm Au
Mairani, A.; Dokic, I.; Magro, G.; Tessonnier, T.; Bauer, J.; Böhlen, T. T.; Ciocca, M.; Ferrari, A.; Sala, P. R.; Jäkel, O.; Debus, J.; Haberer, T.; Abdollahi, A.; Parodi, K.
2017-02-01
Proton therapy treatment planning systems (TPSs) are based on the assumption of a constant relative biological effectiveness (RBE) of 1.1, without taking into account the variations of the RBE found in in vitro experiments as a function of tissue type, linear energy transfer (LET) and dose. The phenomenological RBE models available in the literature are based on the dose-averaged LET (LET_D) as an indicator of the physical properties of the proton radiation field. The LET_D values are typically calculated taking into account primary and secondary protons, neglecting the biological effect of heavier secondaries. In this work, we have introduced a phenomenological RBE approach which considers the biological effect of primary protons and of secondary protons, deuterons, tritons (Z = 1) and He fragments (3He and 4He, Z = 2). The calculation framework, coupled with a Monte Carlo (MC) code, has been successfully benchmarked against clonogenic in vitro data measured in this work for two cell lines, and then applied to determine biological quantities for spread-out Bragg peaks and a prostate and a head case. The introduced RBE formalism, which depends on the mixed radiation field, the dose and the ratio (α/β)_ph of the linear-quadratic model parameters for the reference radiation, predicts, when integrated in an MC code, higher RBE values in comparison to LET_D-based parameterizations. This effect is particularly enhanced in the entrance channel of the proton field and for low-(α/β)_ph tissues. For the prostate and the head case, we found RBE-weighted dose values higher by up to about 5% in the entrance channel when including rather than neglecting the Z = 2 secondaries in the RBE calculation. TPSs able to properly account for the mixed radiation field in proton therapy are thus recommended for an accurate determination of the RBE in the whole treatment field.
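The RBE in the linear-quadratic (LQ) framework referenced above is the ratio of the photon dose to the proton dose producing equal effect. A minimal sketch of that isoeffect calculation (illustrative Python; the LQ parameter values are hypothetical and this is not the authors' specific parameterization):

```python
import math

def rbe_from_lq(dose_gy, alpha_p, beta_p, alpha_beta_ph, beta_ph=0.05):
    """RBE under the linear-quadratic model: find the photon dose D_x
    giving the same effect, i.e. solving
        alpha_ph*D_x + beta_ph*D_x**2 = alpha_p*D + beta_p*D**2,
    then RBE = D_x / D.  alpha_beta_ph is the photon (alpha/beta) ratio.
    """
    alpha_ph = alpha_beta_ph * beta_ph
    effect = alpha_p * dose_gy + beta_p * dose_gy ** 2
    d_x = (-alpha_ph + math.sqrt(alpha_ph ** 2 + 4 * beta_ph * effect)) / (2 * beta_ph)
    return d_x / dose_gy

# Hypothetical parameters identical to the reference radiation give
# RBE = 1 by construction.
print(rbe_from_lq(2.0, alpha_p=0.1, beta_p=0.05, alpha_beta_ph=2.0))  # -> 1.0
# A modestly raised alpha (as at elevated LET) raises the RBE,
# more so for low (alpha/beta)_ph tissues.
print(rbe_from_lq(2.0, alpha_p=0.15, beta_p=0.05, alpha_beta_ph=2.0))
```

The phenomenological models differ in how alpha_p and beta_p are made to depend on the mixed field; the isoeffect step itself is common to them.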
Smol'janinova, T I; Zhidkov, V A; Sokolov, G V
1982-01-01
Titration curves of the nitrogen bases and the fractions of disordered nucleotide pairs were obtained during DNA protonation. It is shown that purine bases are the first protonation sites of the DNA double helix. Cytosine protonation is due to a proton-induced conformational transition within GC pairs, with sequential proton transfer from N-7 of guanine to N-3 of cytosine. Within DNA with unwound regions, the bases are protonated in the following order: cytosine, adenine, guanine. It is shown that GC pairs are the primary centres at which the unwinding of protonated DNA occurs. PMID:7079177
Presiado, Itay; Gepshtein, Rinat; Erez, Yuval; Huppert, Dan
2011-07-07
We studied the direct proton transfer (PT) from electronically excited D-luciferin to several mild bases. The fluorescence up-conversion technique was used to measure the rise and decay of the fluorescence signals of the protonated and deprotonated species of D-luciferin. At base concentrations of 0.25 M or higher, the proton transfer rates to the fluoride, dihydrogen phosphate and acetate bases are fast and comparable. The fluorescence signals are nonexponential and complex. We suggest that the fastest decay component arises from a direct proton transfer process from the hydroxyl group of D-luciferin to the mild base. The proton donor and acceptor molecules form an ion pair prior to photoexcitation. Upon photoexcitation, solvent rearrangement occurs on a 1 ps timescale. The PT reaction time constant is ∼2 ps for all three bases. A second decay component of about 10 ps is attributed to proton transfer in a contact pair bridged by one water molecule. The longest decay component is due both to the excited-state proton transfer (ESPT) to the solvent and to the diffusion-assisted PT process between a photoacid and a base pair positioned remotely from each other prior to photoexcitation.
A global reaction route mapping-based kinetic Monte Carlo algorithm
Mitchell, Izaac; Irle, Stephan; Page, Alister J.
2016-07-01
We propose a new on-the-fly kinetic Monte Carlo (KMC) method that is based on exhaustive potential energy surface searching carried out with the global reaction route mapping (GRRM) algorithm. Starting from any given equilibrium state, this GRRM-KMC algorithm performs a one-step GRRM search to identify all surrounding transition states. Intrinsic reaction coordinate pathways are then calculated to identify potential subsequent equilibrium states. Harmonic transition state theory is used to calculate rate constants for all potential pathways, before a standard KMC accept/reject selection is performed. The selected pathway is then used to propagate the system forward in time, which is calculated on the basis of 1st order kinetics. The GRRM-KMC algorithm is validated here in two challenging contexts: intramolecular proton transfer in malonaldehyde and surface carbon diffusion on an iron nanoparticle. We demonstrate that in both cases the GRRM-KMC method is capable of reproducing the 1st order kinetics observed during independent quantum chemical molecular dynamics simulations using the density-functional tight-binding potential.
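The KMC selection and time-propagation steps described above can be sketched compactly. A minimal rejection-free KMC step (illustrative Python; the rate constants are hypothetical, and in GRRM-KMC they would come from harmonic transition state theory for the pathways found by the one-step GRRM search):

```python
import math
import random

def kmc_step(rates, rng=random.Random(42)):
    """One rejection-free kinetic Monte Carlo step.

    rates -- rate constants k_i (s^-1) for every pathway leaving the
             current equilibrium state (e.g. from harmonic TST,
             k = (kT/h) * exp(-dE_barrier / kT)).
    Returns (chosen_pathway_index, time_increment_s).
    """
    k_total = sum(rates)
    # Select a pathway with probability proportional to its rate.
    r = rng.random() * k_total
    acc = 0.0
    chosen = len(rates) - 1  # fallback against floating-point round-off
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            chosen = i
            break
    # First-order kinetics: waiting times are exponentially
    # distributed with mean 1 / k_total.
    dt = -math.log(rng.random()) / k_total
    return chosen, dt

pathways = [1.0e9, 5.0e8, 5.0e8]  # hypothetical rate constants, s^-1
step, dt = kmc_step(pathways)
print(step, dt)  # dt is on the order of 1/k_total = 0.5 ns
```

The exponential waiting time is what makes the trajectory reproduce first-order kinetics, the property validated against the DFTB molecular dynamics runs.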
Romano, F.; Cirrone, G. A. P.; Cuttone, G.; Di Rosa, F.; Mazzaglia, S. E.; Petrovic, I.; Ristic Fira, A.; Varisano, A.
2014-06-01
Fluence, depth absorbed dose and linear energy transfer (LET) distributions of proton and carbon ion beams have been investigated using the Monte Carlo code Geant4 (GEometry ANd Tracking). An open source application was developed with the aim of simulating two typical transport beam lines, one used for ocular therapy and cell irradiations with protons and the other for cell irradiations with carbon ions. This tool allows evaluation of the primary and total dose-averaged LET and prediction of their spatial distributions in voxelized or sliced geometries. In order to reproduce the LET distributions in a realistic way, the secondary particles' contributions due to nuclear interactions were also considered in the computations. Pristine and spread-out Bragg peaks were taken into account for both proton and carbon ion beams, with a maximum energy of 62 MeV/n. Depth dose distributions were compared with experimental data, showing good agreement. Primary and total LET distributions were analysed in order to study the influence of the secondary-particle contributions in regions at different depths. A non-negligible influence of high-LET components was found in the entrance channel for proton beams, making the total dose-averaged LET a factor of 3 higher than the primary one. A completely different situation was obtained for carbon ions: in this case, secondary particles mainly contributed to the tail after the peak. The results showed how the weight of light and heavy secondary ions can considerably influence the computation of LET depth distributions. This has an important role in the interpretation of results coming from radiobiological experiments and, therefore, in hadron treatment planning procedures.
A GEM-based dose imaging detector with optical readout for proton radiotherapy
Energy Technology Data Exchange (ETDEWEB)
Klyachko, A.V., E-mail: aklyachk@indiana.edu [Indiana University Cyclotron Operations, Indiana University Integrated Science and Accelerator Technology Hall, 2401 Milo. B. Sampson Ln., Bloomington, IN 47408 (United States); Moskvin, V. [Department of Radiation Oncology, School of Medicine, Indiana University, Indianapolis, IN 46202 (United States); Nichiporov, D.F.; Solberg, K.A. [Indiana University Cyclotron Operations, Indiana University Integrated Science and Accelerator Technology Hall, 2401 Milo. B. Sampson Ln., Bloomington, IN 47408 (United States)
2012-12-01
New techniques in proton radiation therapy and advances in beam delivery system design, such as beam scanning, require accurate 2D dosimetry systems to verify the delivered dose distribution. Dose imaging detectors based on gas electron multipliers (GEMs) are capable of providing high sensitivity, improved dose measurement linearity, position resolution, fast response and accurate characterization of depth-dose distributions. In this work, we report on the development of a GEM-based dose imaging detector with optical readout using a CCD camera. A 10 × 10 cm² detector has been tested in a 205 MeV proton beam in single- and double-GEM configurations. The detector demonstrates linearity in dose rate up to 100 Gy/min and a position resolution (σ) of 0.42 mm. Transverse non-uniformity of the detector response is ≤10% before correction, and the stability of the detector output throughout the day is within ±1%, with day-to-day reproducibility of about 10%. The depth-dose response of the detector is close to that of a wide-aperture air-filled ionization chamber and is in good agreement with Monte Carlo simulations.
Zhou, Qi-Dong; Menjo, Hiroaki; Sako, Takashi
2016-01-01
Very forward (VF) detectors in hadron colliders, having unique sensitivity to diffractive processes, can be a powerful tool for studying diffractive dissociation by combining them with central detectors. Several Monte Carlo simulation samples in $p$-$p$ collisions at $\\sqrt s = 13$ TeV were analyzed, and different nondiffractive and diffractive contributions were clarified through differential cross sections of forward neutral particles. Diffraction selection criteria in the VF-triggered-event samples were determined by using the central track information. The corresponding selection applicable in real experiments has $\\approx$100% purity and 30%-70% efficiency. Consequently, the central information enables classification of the forward productions into diffraction and nondiffraction categories; in particular, most of the surviving events from the selection belong to low-mass diffraction events at $\\log_{10}(\\xi_{x}) < -5.5$. Therefore, the combined method can uniquely access the low-mass diffraction regim...
Burris-Mog, Trevor J.
The interaction of intense laser light (I > 10¹⁸ W/cm²) with a thin target foil leads to the Target Normal Sheath Acceleration (TNSA) mechanism. TNSA is responsible for the generation of high-current, ultra-low-emittance proton beams, which may allow for the development of a compact and cost-effective proton therapy system for the treatment of cancer. Before this application can be realized, control is needed over the large divergence and the 100% kinetic energy spread that are characteristic of TNSA proton beams. The work presented here demonstrates control over the divergence and energy spread using strong magnetic fields generated by a pulsed-power solenoid. The solenoidal field results in a parallel proton beam with a kinetic energy spread ΔE/E = 10%. Assuming that next-generation lasers will be able to operate at 10 Hz, the 10% spread in the kinetic energy along with the 23% capture efficiency of the solenoid yield enough protons per laser pulse to, for the first time, consider applications in radiation oncology. Current lasers can generate proton beams with kinetic energies up to 67.5 MeV, but for therapy applications the proton kinetic energy must reach 250 MeV. Since the maximum kinetic energy E_max of the proton scales with laser light intensity as E_max ∝ I^0.5, next-generation lasers may very well accelerate 250 MeV protons. As the kinetic energy of the protons is increased, the magnetic field strength of the solenoid will need to increase. The scaling of the magnetic field B with the kinetic energy E of the protons follows B ∝ E^1/2. Therefore, the field strength of the solenoid presented in this work will need to be increased by a factor of 2.4 in order to accommodate 250 MeV protons. This scaling factor seems reasonable, even with present technology. This work not only demonstrates control over beam divergence and energy spread, it also allows us to perform feasibility studies to further research what a laser-based proton therapy system
MCHITS: Monte Carlo based Method for Hyperlink Induced Topic Search on Networks
Directory of Open Access Journals (Sweden)
Zhaoyan Jin
2013-10-01
Hyperlink Induced Topic Search (HITS) is among the most widely used personalized ranking algorithms on networks. The HITS algorithm ranks nodes on networks by power iteration and has high computational complexity. This paper models the HITS algorithm with the Monte Carlo method and proposes Monte Carlo based algorithms for the HITS computation. Theoretical analysis and experiments show that Monte Carlo based approximate computation of the HITS ranking substantially reduces computing resources while maintaining high accuracy, and significantly outperforms related work.
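One way to replace full power iteration with sampling is to estimate hub and authority visit frequencies from short random walks over the link structure. A minimal sketch in that spirit (illustrative Python; this is a generic Monte Carlo approximation of HITS-style scores, not the authors' exact MCHITS algorithm):

```python
import random
from collections import defaultdict

def mc_hits(edges, num_walks=20000, walk_len=4, rng=random.Random(1)):
    """Estimate authority scores by Monte Carlo sampling: short walks
    alternate hub steps (follow an out-link) and authority steps
    (follow an in-link), and visit counts replace power iteration.
    """
    out_links, in_links, nodes = defaultdict(list), defaultdict(list), set()
    for u, v in edges:
        out_links[u].append(v)
        in_links[v].append(u)
        nodes.update((u, v))
    nodes = sorted(nodes)

    auth = defaultdict(int)
    for _ in range(num_walks):
        u = rng.choice(nodes)
        for step in range(walk_len):
            if step % 2 == 0:          # hub step: follow an out-link
                if not out_links[u]:
                    break
                u = rng.choice(out_links[u])
                auth[u] += 1           # landing via a hub raises authority
            else:                      # authority step: follow an in-link
                if not in_links[u]:
                    break
                u = rng.choice(in_links[u])
    total = float(sum(auth.values())) or 1.0
    return {n: auth[n] / total for n in nodes}

# Toy graph: node 3 is pointed to by everyone, so it should dominate.
scores = mc_hits([(0, 3), (1, 3), (2, 3), (0, 1)])
print(max(scores, key=scores.get))  # -> 3
```

The accuracy/cost trade-off is controlled by `num_walks`, which is the source of the resource savings over exact power iteration.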
US Fish and Wildlife Service, Department of the Interior — This is the protocol for conducting waterfowl production surveys based on duck nest transects for the Monte Vista National Wildlife Refuge. The basic approach, field...
Taal, A; van der Kooij, A; Okx, W J C
2016-11-01
Monte Carlo simulations were performed with MCNPX to determine the neutron dose equivalent in thick concrete behind a metal shield, a double-layered shielding configuration. In the simulations, a 230-MeV proton beam impinging on a copper target was used to produce the neutrons. For forward angles up to 30° with respect to the proton beam, it is found that the neutron dose equivalent in thick concrete behind a metal layer can be expressed in a single formula: the neutron dose equivalent formula for a single thick concrete shield, enhanced with an additional exponential term. The exponent of this additional term is related to the relative macroscopic neutron removal cross section of the metal with respect to the concrete. The formula fits MCNPX data for the neutron dose equivalent in thick concrete behind layers of metal ranging from beryllium to lead. First attempts were made to make this shortcut formula applicable to alloys and compounds of metals.
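The shape of the shortcut formula can be sketched as a product of two attenuation factors. A hedged illustration (Python; all coefficients and the exact single-shield form are hypothetical stand-ins, not the fitted MCNPX parameterization):

```python
import math

def dose_equivalent_double_layer(d_concrete_cm, t_metal_cm, h0,
                                 lam_concrete_cm, sigma_rel_per_cm):
    """Sketch of the double-layer shortcut formula: a single-shield
    concrete attenuation H0 * exp(-d / lambda), multiplied by an extra
    exponential whose exponent reflects the relative macroscopic
    neutron removal cross section of the metal layer.
    """
    concrete_term = h0 * math.exp(-d_concrete_cm / lam_concrete_cm)
    metal_term = math.exp(-sigma_rel_per_cm * t_metal_cm)
    return concrete_term * metal_term

# Hypothetical numbers: 100 cm of concrete (attenuation length 50 cm)
# behind 20 cm of metal with relative removal cross section 0.05 /cm.
h = dose_equivalent_double_layer(100.0, 20.0, h0=1.0,
                                 lam_concrete_cm=50.0, sigma_rel_per_cm=0.05)
print(h)  # exp(-2) * exp(-1) = exp(-3), roughly 0.05 of H0
```

The single extra exponential is what makes the formula a one-parameter-per-metal extension of the plain concrete-shield expression.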
Energy Technology Data Exchange (ETDEWEB)
Lindsay, C; Jirasek, A [University of Victoria (Australia); Blackmore, E; Hoehr, C; Schaffer, P; Trinczek, M [TRIUMF (Canada); Sossi, V [University of British Columbia (Canada)
2014-08-15
Uveal melanoma is a rare and deadly tumour of the eye, with primary metastases in the liver resulting in an 8% 2-year survival rate upon detection. Large growths, or those in close proximity to the optic nerve, pose a particular challenge to the commonly employed eye-sparing technique of eye-plaque brachytherapy. In these cases, external beam charged particle therapy offers improved odds of avoiding catastrophic side effects such as neuropathy or blindness. Since 1995, the British Columbia Cancer Agency, in partnership with the TRIUMF national laboratory, has offered proton therapy for the treatment of difficult ocular tumors. Having seen 175 patients, yielding 80% globe preservation and 82% metastasis-free survival as of 2010, this modality has proven to be highly effective. Despite this success, there have been few studies into the use of the world's largest cyclotron in patient care. Here we describe first efforts at modeling the TRIUMF dose delivery system using the FLUKA Monte Carlo package. Details of the geometry, estimation of beam parameters, measurement of primary dose and simulation of PET isotope production are discussed. Proton depth dose in both modulated and pristine beams is successfully simulated to sub-millimeter precision in range (within limits of measurement) and to 2% agreement with measurement within a treatment volume. With the goal of using PET signals for in vivo dosimetry (alignment), a first look at PET isotope depth distribution is presented, comparing favourably to a naive method of approximating simulated PET slice activity in a Lucite phantom.
Sun, Wenjuan; Jia, Xianghong; Xie, Tianwu; Xu, Feng; Liu, Qian
2013-03-01
With the rapid development of China's space industry, the importance of radiation protection is increasingly prominent. To provide relevant dose data, we first developed the Visible Chinese Human adult Female (VCH-F) phantom, and performed further modifications to generate the VCH-F Astronaut (VCH-FA) phantom, incorporating statistical body characteristics data from the first batch of Chinese female astronauts as well as reference organ mass data from the International Commission on Radiological Protection (ICRP; both within 1% relative error). Based on cryosection images, the original phantom was constructed via Non-Uniform Rational B-Spline (NURBS) boundary surfaces to strengthen its deformability for fitting the body parameters of Chinese female astronauts. The VCH-FA phantom was voxelized at a resolution of 2 × 2 × 4 mm³ for particle transport simulations of isotropic protons with energies of 5000-10 000 MeV in the Monte Carlo N-Particle eXtended (MCNPX) code. To investigate discrepancies caused by anatomical variations and other factors, the obtained doses were compared with corresponding values from other phantoms and with sex-averaged doses. Dose differences were observed among the phantom calculation results, especially for the effective dose with low-energy protons. Local skin thickness shifts the breast dose curve toward high energy, but has little impact on inner organs. Under a shielding layer, the organ dose reduction is greater for skin than for other organs. The calculated skin dose per day closely approximates measurement data obtained in low-Earth orbit (LEO).
Ulmer, W
2010-01-01
We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure for the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei, as well as lateral scattering of these contributions. The buildup, which is experimentally observed in higher-energy depth dose curves, is modeled by including two different origins: (1) secondary reaction protons, contributing ca. 65% of the buildup (for monoenergetic protons), and (2) Landau tails as well as Gaussian-type fluctuations from range-straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked against them. Furthermore, there are a few parameters which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline - the most important b...
Jiang, Xu; Deng, Yong; Luo, Zhaoyang; Wang, Kan; Lian, Lichao; Yang, Xiaoquan; Meglinski, Igor; Luo, Qingming
2014-12-29
The path-history-based fluorescence Monte Carlo method used for fluorescence tomography imaging reconstruction has attracted increasing attention. In this paper, we first validate the standard fluorescence Monte Carlo (sfMC) method by experimenting with a cylindrical phantom. Then, we describe a path-history-based decoupled fluorescence Monte Carlo (dfMC) method, analyze different perturbation fluorescence Monte Carlo (pfMC) methods, and compare the calculation accuracy and computational efficiency of the dfMC and pfMC methods using the sfMC method as a reference. The results show that the dfMC method is more accurate and efficient than the pfMC method in heterogeneous medium.
Haibo, Xu
2014-01-01
A version of Geant4 has been developed to model high-energy proton radiography. This article presents the results of calculations simulating the effects of nuclear elastic scattering for various test step wedges; comparisons with experimental data are also presented. The traditional expressions for the transmission are correct if the angular distribution of the scattering is Gaussian multiple Coulomb scattering. The mean free path, which depends on the collimator angle, and the radiation length are treated as empirical parameters, determined from the transmission as a function of thickness obtained by simulation. These results benefit density reconstruction, which depends on the transmission expressions.
Energy Technology Data Exchange (ETDEWEB)
Pietrzak, Robert [Department of Nuclear Physics and Its Applications, Institute of Physics, University of Silesia, Katowice (Poland); Konefał, Adam, E-mail: adam.konefal@us.edu.pl [Department of Nuclear Physics and Its Applications, Institute of Physics, University of Silesia, Katowice (Poland); Sokół, Maria; Orlef, Andrzej [Department of Medical Physics, Maria Sklodowska-Curie Memorial Cancer Center, Institute of Oncology, Gliwice (Poland)
2016-08-01
The success of proton therapy depends strongly on the precision of treatment planning. Dose distribution in biological tissue may be obtained from Monte Carlo simulations using various scientific codes, making it possible to perform very accurate calculations. However, there are many factors affecting the accuracy of modeling. One of them is the structure of the objects, called bins, registering a dose. In this work, the influence of the bin structure on the dose distributions was examined. The MCNPX calculations of the Bragg curve for a 60 MeV proton beam were done in two ways: using simple logical detectors, i.e. volumes defined in water, and using a precise model of an ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in a water phantom with a Markus ionization chamber. The average local difference between the relative doses measured in the water phantom and those calculated by means of the logical detectors was 1.4% over the first 25 mm, whereas over the full depth range this difference was 1.6%, for a maximum uncertainty in the calculations of less than 2.4% and a maximum measuring error of 1%. In the case of the relative doses calculated with the ionization chamber model, this average difference was somewhat greater: 2.3% at depths up to 25 mm and 2.4% over the full range of depths, for a maximum calculation uncertainty of 3%. In the dose calculations, the ionization chamber model does not offer any additional advantages over the logical detectors. The results provided by both models are similar and in good agreement with the measurements; however, the logical detector approach is a more time-effective method. - Highlights: • Influence of the bin structure on the proton dose distributions was examined for the MC simulations. • The considered relative proton dose distributions in water correspond to the clinical application. • MC simulations performed with the logical detectors and the
Geng, Changran; Tang, Xiaobin; Gong, Chunhui; Guan, Fada; Johns, Jesse; Shu, Diyun; Chen, Da
2015-12-01
The active shielding technique has great potential for radiation protection in space exploration because it has the advantage of a significant mass saving compared with the passive shielding technique. This paper demonstrates a Monte Carlo-based approach to evaluating the shielding effectiveness of the active shielding technique using confined magnetic fields (CMFs). The International Commission on Radiological Protection reference anthropomorphic phantom, as well as the toroidal CMF, was modeled using the Monte Carlo toolkit Geant4. The penetrating primary particle fluence, organ-specific dose equivalent, and male effective dose were calculated for particles in galactic cosmic radiation (GCR) and solar particle events (SPEs). Results show that the SPE protons can be easily shielded against, even almost completely deflected, by the toroidal magnetic field. GCR particles can also be more effectively shielded against by increasing the magnetic field strength. Our results also show that the introduction of a structural Al wall in the CMF did not provide additional shielding for GCR; in fact it can weaken the total shielding effect of the CMF. This study demonstrated the feasibility of accurately determining the radiation field inside the environment and evaluating the organ dose equivalents for astronauts under active shielding using the CMF.
Composite proton exchange membrane based on sulfonated organic nanoparticles
Pitia, Emmanuel Sokiri
exchange was characterized with solid state 13C NMR spectroscopy, FTIR spectroscopy, TGA, elemental analysis, and titration. The results indicate the extent of ion exchange was ~70-80%. Due to its mass, the remaining QAA reduced the IEC of the nanoparticles. Nanoparticles added to polystyrene were solution cast in a continuous process with and without an electric field. The electric field had no effect on the water uptake. Based on the morphology and the proton conductivity, it appears that orientation of the nanoparticles did not occur. We hypothesize the lack of orientation was caused by swelling of the particles with the solvent. The solvent inside the particles minimized polarizability and thus prevented orientation. The composite membranes were limited to a low proton conductivity of ~10⁻⁵ S/cm due to the low IEC of the nanoparticles, but good dispersion of the nanoparticles was achieved. Future work should look into eliminating the QAA during synthesis and developing a rigid core for the nanoparticles.
Energy Technology Data Exchange (ETDEWEB)
Moskvin, V; Tsiamas, P; Axente, M; Farr, J [St. Jude Children’s Research Hospital, Memphis, TN (United States); Stewart, R [University of Washington, Seattle, WA. (United States)
2015-06-15
Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587-602, 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3, helium-4 ions and delta electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent ⁶⁰Co γ-ray dose for representative proton beams incident on cells in aerobic and anoxic environments. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases laterally away from the beam axis in the region of the Bragg peak. At the distal edge, the RBE is in the range 1.3-1.4 for cells irradiated under aerobic conditions and may be as large as 1.5-1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to the total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
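Combining per-particle RBE coefficients into a single value for the mixed field amounts to a dose-weighted average. A minimal sketch (illustrative Python; the dose fractions and RBE values below are hypothetical, not the MCDS/FLUKA results):

```python
def field_averaged_rbe(components):
    """Dose-weighted RBE of a mixed radiation field:
    RBE_field = sum(D_i * RBE_i) / sum(D_i), combining primary protons
    with secondaries (protons, deuterons, tritons, He ions, delta
    electrons), each with its own dose and DSB-induction RBE.

    components -- list of (dose_gy, rbe) pairs; values hypothetical.
    """
    total_dose = sum(d for d, _ in components)
    return sum(d * r for d, r in components) / total_dose

# Hypothetical Bragg-peak composition: mostly primary protons plus a
# small high-RBE contribution from heavier secondaries.
mixed_field = [(1.80, 1.15),   # primary protons
               (0.15, 1.40),   # secondary protons/deuterons
               (0.05, 2.00)]   # heavier secondaries
print(field_averaged_rbe(mixed_field))  # a field-averaged RBE near 1.2
```

Multiplying this field-averaged RBE by the physical dose in each voxel gives the equivalent photon dose used for biological optimization.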
Pietrzak, Robert; Konefał, Adam; Sokół, Maria; Orlef, Andrzej
2016-08-01
The success of proton therapy depends strongly on the precision of treatment planning. Dose distributions in biological tissue may be obtained from Monte Carlo simulations using various scientific codes, making very accurate calculations possible. However, many factors affect the accuracy of modeling; one of them is the structure of the objects, called bins, that register the dose. In this work the influence of bin structure on the calculated dose distributions was examined. MCNPX calculations of the Bragg curve for a 60 MeV proton beam were performed in two ways: using simple logical detectors defined as volumes in water, and using a precise model of an ionization chamber used in clinical dosimetry. The results of the simulations were verified experimentally in a water phantom with a Markus ionization chamber. The average local difference between the relative doses measured in the water phantom and those calculated with the logical detectors was 1.4% over the first 25 mm, and 1.6% over the full depth range, with a maximum calculation uncertainty below 2.4% and a maximum measurement error of 1%. For the relative doses calculated with the ionization chamber model this average difference was somewhat greater: 2.3% at depths up to 25 mm and 2.4% over the full range of depths, with a maximum calculation uncertainty of 3%. In the dose calculations the ionization chamber model offers no additional advantage over the logical detectors. The results provided by both models are similar and in good agreement with the measurements; however, the logical detector approach is the more time-effective method.
Obot, I. B.; Kaya, Savaş; Kaya, Cemal; Tüzün, Burak
2016-06-01
DFT and Monte Carlo simulations were performed on three Schiff bases, namely 4-(4-bromophenyl)-N‧-(4-methoxybenzylidene)thiazole-2-carbohydrazide (BMTC), 4-(4-bromophenyl)-N‧-(2,4-dimethoxybenzylidene)thiazole-2-carbohydrazide (BDTC), and 4-(4-bromophenyl)-N‧-(4-hydroxybenzylidene)thiazole-2-carbohydrazide (BHTC), recently studied as corrosion inhibitors for steel in acid medium. Electronic parameters relevant to their inhibition activity, such as EHOMO, ELUMO, energy gap (ΔE), hardness (η), softness (σ), absolute electronegativity (χ), proton affinity (PA) and nucleophilicity (ω), were computed and discussed. Monte Carlo simulations were applied to search for the most stable configuration and adsorption energies for the interaction of the inhibitors with the Fe (110) surface. The theoretical data obtained are in most cases in agreement with experimental results.
Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.
2015-09-01
This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases except very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.
Molinelli, S.; Mairani, A.; Mirandola, A.; Vilches Freixas, G.; Tessonnier, T.; Giordanengo, S.; Parodi, K.; Ciocca, M.; Orecchia, R.
2013-06-01
During one year of clinical activity at the Italian National Center for Oncological Hadron Therapy 31 patients were treated with actively scanned proton beams. Results of patient-specific quality assurance procedures are presented here which assess the accuracy of a three-dimensional dose verification technique with the simultaneous use of multiple small-volume ionization chambers. To investigate critical cases of major deviations between treatment planning system (TPS) calculated and measured data points, a Monte Carlo (MC) simulation tool was implemented for plan verification in water. Starting from MC results, the impact of dose calculation, dose delivery and measurement set-up uncertainties on plan verification results was analyzed. All resulting patient-specific quality checks were within the acceptance threshold, which was set at 5% for both mean deviation between measured and calculated doses and standard deviation. The mean deviation between TPS dose calculation and measurement was less than ±3% in 86% of the cases. When all three sources of uncertainty were accounted for, simulated data sets showed a high level of agreement, with mean and maximum absolute deviation lower than 2.5% and 5%, respectively.
Application of Photon Transport Monte Carlo Module with GPU-based Parallel System
Energy Technology Data Exchange (ETDEWEB)
Park, Chang Je [Sejong University, Seoul (Korea, Republic of); Shon, Heejeong [Golden Eng. Co. LTD, Seoul (Korea, Republic of); Lee, Donghak [CoCo Link Inc., Seoul (Korea, Republic of)
2015-05-15
In general, it takes a great deal of computing time to obtain reliable results in Monte Carlo simulations, especially in deep penetration problems with a thick shielding medium. To mitigate this weakness of Monte Carlo methods, many variance reduction algorithms have been proposed, including geometry splitting and Russian roulette, weight windows, exponential transform, and forced collision. At the same time, advanced computing hardware such as GPU (Graphics Processing Unit)-based parallel machines is used to improve the performance of Monte Carlo simulation. A GPU is much easier to access and to manage than a CPU cluster system, and it has also become less expensive thanks to advances in computer technology. Many engineering areas have therefore adopted GPU-based massively parallel computation techniques. In this work, such a system was applied to a GPU-based photon transport Monte Carlo method. It provides almost a 30-fold speedup without any optimization, and an almost 200-fold speedup is expected with a fully supported GPU system. It is expected that a GPU system with an advanced parallelization algorithm will contribute successfully to the development of Monte Carlo modules requiring quick and accurate simulations.
Proton therapy for tumors of the skull base
Energy Technology Data Exchange (ETDEWEB)
Munzenrider, J.E.; Liebsch, N.J. [Dept. of Radiation Oncology, Harvard Univ. Medical School, Boston, MA (United States)
1999-06-01
Charged particle beams are ideal for treating skull base and cervical spine tumors: dose can be focused in the target, while achieving significant sparing of the brain, brain stem, cervical cord, and optic nerves and chiasm. For skull base tumors, 10-year local control rates with combined proton-photon therapy are highest for chondrosarcomas, intermediate for male chordomas, and lowest for female chordomas (94%, 65%, and 42%, respectively). For cervical spine tumors, 10-year local control rates are not significantly different for chordomas and chondrosarcomas (54% and 48%, respectively), nor is there any difference in local control between males and females. Observed treatment-related morbidity has been judged acceptable, in view of the major morbidity and mortality which accompany uncontrolled tumor growth. (orig.)
Xue, Pengchong; Chen, Peng; Jia, Junhui; Xu, Qiuxia; Sun, Jiabao; Yao, Boqi; Zhang, Zhenqi; Lu, Ran
2014-03-11
A triphenylamine-based benzoxazole derivative exhibits a low contrast piezofluorochromic behavior under external pressure, and a high-contrast fluorescence change induced by protonation can be observed.
Accuracy Analysis for 6-DOF PKM with Sobol Sequence Based Quasi Monte Carlo Method
Institute of Scientific and Technical Information of China (English)
Jianguang Li; Jian Ding; Lijie Guo; Yingxue Yao; Zhaohong Yi; Huaijing Jing; Honggen Fang
2015-01-01
To improve the precision of pose error analysis for a 6-DOF parallel kinematic mechanism (PKM) during assembly quality control, a Sobol-sequence-based quasi-Monte Carlo (QMC) method is introduced and implemented for pose accuracy analysis of the PKM in this paper. Owing to the regularity and uniformity of its samples in high dimensions, the Sobol-sequence-based QMC method outperforms the traditional Monte Carlo method, with up to 98.59% and 98.25% enhancement in the computational precision of the pose error statistics. A PKM tolerance design system integrating this method was then developed, and with it the pose error distributions of the PKM within a prescribed workspace were obtained and analyzed.
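The Sobol-based QMC idea can be sketched in a few lines. This is a hedged illustration, not the paper's actual PKM model: a hypothetical two-variable pose-error function is averaged once with plain pseudo-random sampling and once with a scrambled Sobol sequence via `scipy.stats.qmc`; the function, sample size, and seeds are all assumptions.

```python
import numpy as np
from scipy.stats import qmc

def pose_error(u):
    """Hypothetical smooth pose-error function of two normalized tolerances."""
    return np.sin(6 * u[:, 0]) * np.cos(4 * u[:, 1]) + u.sum(axis=1)

m = 12
n = 2 ** m

# Plain pseudo-random Monte Carlo estimate of the mean pose error
rng = np.random.default_rng(0)
mc_mean = pose_error(rng.random((n, 2))).mean()

# Sobol-sequence quasi-Monte Carlo: low-discrepancy points fill the unit
# square more evenly, which typically shrinks the integration error
sobol = qmc.Sobol(d=2, scramble=True, seed=0)
qmc_mean = pose_error(sobol.random_base2(m=m)).mean()

print(mc_mean, qmc_mean)
```

For the same sample count, the Sobol estimate generally sits closer to the true mean than the pseudo-random one, which is the source of the precision gains the abstract reports.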
Energy Technology Data Exchange (ETDEWEB)
Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.
2013-07-01
Acceptance testing for clinical use of the Monaco computerized treatment planning system has been performed. The system is based on a virtual model of the energy yield of the linear electron accelerator head and calculates the dose with an X-ray voxel Monte Carlo (XVMC) algorithm. (Author)
A Muon Source Proton Driver at JPARC-based Parameters
Energy Technology Data Exchange (ETDEWEB)
Neuffer, David [Fermilab
2016-06-01
An "ultimate" high intensity proton source for neutrino factories and/or muon colliders was projected to be a ~4 MW multi-GeV proton source providing short, intense proton pulses at ~15 Hz. The JPARC ~1 MW accelerators provide beam at parameters that in many respects overlap these goals. Proton pulses from the JPARC Main Ring can readily meet the pulsed intensity goals. We explore these parameters, describe the overlap, and consider extensions that may take a JPARC-like facility toward this "ultimate" source. JPARC itself could serve as a stage 1 source for such a facility.
Sampling uncertainty evaluation for data acquisition board based on Monte Carlo method
Ge, Leyi; Wang, Zhongyu
2008-10-01
Evaluating the sampling uncertainty of a data acquisition board is a difficult problem in the field of signal sampling. This paper first analyzes the sources of data acquisition board sampling uncertainty, then introduces a simulation theory for data acquisition board sampling uncertainty evaluation based on the Monte Carlo method, and puts forward a relational model linking the sampling uncertainty results, the number of samples, and the number of simulation runs. For different sample numbers and different signal scopes, a random sampling uncertainty evaluation program for a PCI-6024E data acquisition board was established to execute the simulation. The results of the proposed Monte Carlo simulation method are in good agreement with those of the GUM approach, demonstrating the validity of the Monte Carlo method.
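A minimal sketch of this kind of Monte Carlo uncertainty evaluation, under assumed board parameters (a hypothetical 12-bit board over a 10 V range with Gaussian front-end noise, not the actual PCI-6024E specifications): noise plus quantization is propagated by simulation and compared with the analytic GUM combination.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 200_000

v_true = 2.5             # volts, hypothetical input level
lsb = 10.0 / 4096        # assumed 12-bit board over a 10 V range
noise_sd = 0.002         # volts, assumed Gaussian front-end noise

# Monte Carlo: add noise, quantize to the nearest code, repeat many times
readings = np.round((v_true + rng.normal(0.0, noise_sd, n_trials)) / lsb) * lsb
u_mc = readings.std(ddof=1)

# GUM analytic combination: quantization error is uniform over +/- lsb / 2
u_gum = np.sqrt(noise_sd**2 + lsb**2 / 12)
print(u_mc, u_gum)
```

As in the abstract, the simulated standard uncertainty agrees closely with the GUM root-sum-of-squares value when the model assumptions hold.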
A new Monte Carlo simulation model for laser transmission in smokescreen based on MATLAB
Lee, Heming; Wang, Qianqian; Shan, Bin; Li, Xiaoyang; Gong, Yong; Zhao, Jing; Peng, Zhong
2016-11-01
A new Monte Carlo simulation model of laser transmission in a smokescreen is proposed in this paper. In the traditional Monte Carlo simulation model, all particles are assigned the same radius and the initial direction cosine of the photons is also fixed, which yields only approximate results. The new model is implemented in MATLAB and can simulate laser transmittance in a smokescreen containing particles of different sizes, so its output is closer to real scenarios. In order to alleviate the influence of laser divergence while traveling in air, we changed the initial direction cosine of the photons relative to the traditional Monte Carlo model. The mixed-radius particle smoke simulation results agree with the transmittance measured under the same experimental conditions, with a 5.42% error rate.
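A stripped-down photon Monte Carlo of slab transmission illustrates the general technique: exponential free flights, an absorption test, and direction-cosine updates after each scattering event. The Henyey-Greenstein phase function, albedo, asymmetry, and optical depth below are assumptions for illustration, not the paper's MATLAB model.

```python
import math, random

random.seed(1)

def transmittance(n_photons, tau, albedo=0.9, g=0.8, slab=1.0):
    """Fraction of photons crossing a slab of optical depth tau.
    Scattering uses the Henyey-Greenstein phase function (assumed model)."""
    mfp = slab / tau                      # mean free path
    passed = 0
    for _ in range(n_photons):
        z, mu = 0.0, 1.0                  # depth and direction cosine
        while True:
            z += mu * (-mfp * math.log(1.0 - random.random()))  # free flight
            if z >= slab:                 # photon exits the far side
                passed += 1
                break
            if z < 0 or random.random() > albedo:  # back-escape or absorption
                break
            # sample the scattering angle from Henyey-Greenstein
            s = (1 - g * g) / (1 - g + 2 * g * random.random())
            cos_t = (1 + g * g - s * s) / (2 * g)
            sin_t = math.sqrt(max(0.0, 1 - cos_t * cos_t))
            phi = 2 * math.pi * random.random()
            # rotate the direction cosine relative to the slab normal
            mu = mu * cos_t + math.sqrt(max(0.0, 1 - mu * mu)) * sin_t * math.cos(phi)
    return passed / n_photons

t = transmittance(20_000, tau=2.0)
print(t)
```

Varying the particle radius, as the paper does, would translate into per-collision variation of the scattering parameters rather than the fixed `g` used here.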
Directory of Open Access Journals (Sweden)
Biniam Tesfamicael
2016-03-01
Purpose: The main purpose of this study was to monitor the secondary dose distribution originating from a water phantom during proton therapy of prostate cancer using scintillating fibers. Methods: The Geant4 Monte Carlo toolkit version 9.6.p02 was used to simulate a proton therapy of prostate cancer. Two cases were studied. In the first case, 8 × 8 = 64 equally spaced fibers inside three 4 × 4 × 2.54 cm³ Delrin® blocks were used to monitor the emission of secondary particles in the transverse (left and right) and distal regions relative to the beam direction. In the second case, a scintillating block with a thickness of 2.54 cm and the same vertical and longitudinal dimensions as the water phantom was used. Geometrical cuts were implemented to extract the energy deposited in each fiber and inside the scintillating block. Results: The transverse dose distributions from the detected secondary particles in both cases are symmetric and agree to within <3.6%. The energy deposited gradually increases, by a factor of approximately 5, as one moves from the peripheral row of fibers towards the center of the block (aligned with the center of the prostate). The energy deposited was also observed to decrease from the frontal to the distal region of the block. The ratio of the energy deposited in the prostate to the energy deposited in the middle two rows of fibers showed a linear relationship with a slope of (−3.55 ± 2.26) × 10⁻⁵ MeV per treatment Gy delivered. The distal detectors recorded a negligible amount of deposited energy due to the higher attenuation of the secondary particles by the water in that direction. Conclusion: With a good calibration and the ability to define a good correlation between the radiation flux recorded by the external fibers and the dose delivered to the prostate, such fibers can be used for real time dose verification to the target. The system was also observed to respond to the series of Bragg Peaks used to generate the
Monte-Carlo based prediction of radiochromic film response for hadrontherapy dosimetry
Energy Technology Data Exchange (ETDEWEB)
Frisson, T. [Universite de Lyon, F-69622 Lyon (France); CREATIS-LRMN, INSA, Batiment Blaise Pascal, 7 avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France)], E-mail: frisson@creatis.insa-lyon.fr; Zahra, N. [Universite de Lyon, F-69622 Lyon (France); IPNL - CNRS/IN2P3 UMR 5822, Universite Lyon 1, Batiment Paul Dirac, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France); Lautesse, P. [Universite de Lyon, F-69622 Lyon (France); IPNL - CNRS/IN2P3 UMR 5822, Universite Lyon 1, Batiment Paul Dirac, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Sarrut, D. [Universite de Lyon, F-69622 Lyon (France); CREATIS-LRMN, INSA, Batiment Blaise Pascal, 7 avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France)
2009-07-21
A model has been developed to calculate MD-55-V2 radiochromic film response to ion irradiation. This model is based on photon film response and film saturation by high local energy deposition computed by Monte-Carlo simulation. We have studied the response of the film to photon irradiation and we proposed a calculation method for hadron beams.
Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation
Minasny, B.; Vrugt, J.A.; McBratney, A.B.
2011-01-01
This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior distribution
Panda, Tamas; Kundu, Tanay; Banerjee, Rahul
2013-07-14
Proton conductivity has been studied thoroughly in two isomeric In(III)-isophthalate based MOFs. In-IA-2D-1 is capable of showing proton conductivity (3.4 × 10⁻³ S cm⁻¹) under humidified conditions (98% RH), whereas In-IA-2D-2 can conduct protons (2.6 × 10⁻⁵ S cm⁻¹) under humidified as well as anhydrous conditions.
Eskers and other evidence of wet-based glaciation in Phlegra Montes, Mars.
Gallagher, Colman; Balme, Matt
2016-04-01
Although glacial landsystems produced under warm/wet based conditions are very common on Earth, glaciological and landform evidence indicates that glaciation on Mars during the Amazonian period (3 Ga to present) has been characterised by cold/dry based glaciers, consistent with the prevailing cold, hyperarid conditions. However, this presentation describes a system of sinuous ridges, interpreted as eskers (1), emerging from the degraded piedmont terminus of a Late Amazonian (˜150 Ma) glacier in the southern Phlegra Montes region of Mars. This is probably the first identification of martian eskers that can be directly linked to their parent glacier. Together with their contextual landform assemblage, the eskers are indicative of glacial melting and subglacial meltwater routing but the confinement of the system to a well-defined, regionally significant graben, and the absence of eskers elsewhere in the region, suggests that melting was a response to locally enhanced geothermal heat flux, rather than regional, climate-induced warming. Now, however, new observations reveal the presence of many assemblages of glacial abrasion forms and associated channels that could be evidence of more widespread wet-based glaciation in Phlegra Montes, including the collapse of several distinct ice domes. This landform assemblage has not been described in other glaciated, mid-latitude regions of the martian northern hemisphere. Moreover, Phlegra Montes are flanked by lowlands displaying evidence of extensive volcanism, including contact between plains lava and piedmont glacial ice. These observations suggest that the glaciation of Phlegra Montes might have been strongly conditioned by both volcanism and more restricted forms of ground-heating. These are important new insights both to the forcing of glacial dynamic and melting behaviour on Mars by factors other than climate and to the production of liquid water on Mars during the Late Amazonian. (1) Gallagher, C. and Balme, M. (2015
Collider design issues based on proton-driven plasma wakefield acceleration
Xia, G; Aimidula, A; Welsch, C; Chattopadhyay, S; Mandry, S; Wing, M
2014-01-01
Recent simulations have shown that a high-energy proton bunch can excite strong plasma wakefields and accelerate a bunch of electrons to the energy frontier in a single stage of acceleration. It therefore paves the way towards a compact future collider design using the proton beams from existing high-energy proton machines, e.g. Tevatron or the LHC. This paper addresses some key issues in designing a compact electron-positron linear collider and an electron-proton collider based on existing CERN accelerator infrastructure.
Fung, Wing K; Yu, Kexin; Yang, Yingrui; Zhou, Ji-Yuan
2016-08-08
Monte Carlo evaluation of resampling-based tests is often conducted in statistical analysis. However, this procedure is generally computationally intensive. The pooling resampling-based method has been developed to reduce the computational burden but the validity of the method has not been studied before. In this article, we first investigate the asymptotic properties of the pooling resampling-based method and then propose a novel Monte Carlo evaluation procedure namely the n-times pooling resampling-based method. Theorems as well as simulations show that the proposed method can give smaller or comparable root mean squared errors and bias with much less computing time, thus can be strongly recommended especially for evaluating highly computationally intensive hypothesis testing procedures in genetic epidemiology.
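A minimal version of the kind of Monte Carlo evaluation discussed in this record (repeatedly simulating data and running a resampling-based test to estimate its type-I error rate) might look like the sketch below. It uses a plain two-sample permutation test, not the authors' pooling method, and the sample sizes, resample counts, and simulation counts are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def perm_pvalue(x, y, n_perm=200):
    """Two-sample permutation test on the absolute difference in means."""
    obs = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        count += abs(pooled[: x.size].mean() - pooled[x.size :].mean()) >= obs
    return (count + 1) / (n_perm + 1)   # add-one correction keeps p in (0, 1]

# Monte Carlo evaluation of the test's type-I error rate under the null
n_sim, alpha = 500, 0.05
rejections = sum(
    perm_pvalue(rng.standard_normal(15), rng.standard_normal(15)) < alpha
    for _ in range(n_sim)
)
rate = rejections / n_sim
print(rate)
```

The nested loops (simulations times permutations) are exactly the computational burden the pooling method aims to reduce: the estimated error rate should hover near the nominal level, but each extra decimal of precision multiplies the cost.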
Molecular modeling of protonic acid doping of emeraldine base polyaniline for chemical sensors
Chen, X.; Yuan, C.A.; Wong, C.K.Y.; Ye, H.; Leung, S.Y.Y.; Zhang, G.
2012-01-01
We propose a molecular modeling methodology to study the protonic acid doping of emeraldine base polyaniline, which can be used in gas detection. The commercial forcefield COMPASS was used for the polymer and protonic acid molecules. The molecular model, which is capable of representing the polyaniline
Long-range proton transfer in aqueous acid-base reactions
Siwick, B.J.; Cox, M.J.; Bakker, H.J.
2008-01-01
We study the mechanism of proton transfer (PT) in the aqueous acid−base reaction between the photoacid 8-hydroxy-1,3,6-pyrenetrisulfonic acid (HPTS) and acetate by probing the vibrational resonances of HPTS, acetate, and the hydrated proton with femtosecond mid-infrared laser pulses. We find that PT
A measurement-based generalized source model for Monte Carlo dose simulations of CT scans
Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun
2017-03-01
The goal of this study is to develop a generalized source model for accurate Monte Carlo dose simulations of CT scans based solely on the measurement data without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at x-ray target level with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution respectively with a Levenberg–Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model was tested on a GE LightSpeed and a Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and dose profiles along lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameters) indicated a better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients’ CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in the diagnostic and therapeutic radiology.
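The spectrum-derivation step described in this record (fitting energy-spectrum weights to measured PDD curves with a Levenberg-Marquardt algorithm) can be sketched on a toy problem. This is a hedged stand-in, not the authors' implementation: the exponential "depth-dose" basis, attenuation coefficients, hidden weights, and noise level are all invented.

```python
import numpy as np
from scipy.optimize import least_squares

depth = np.linspace(0.0, 10.0, 50)

# Toy mono-energetic "depth-dose" basis: exponential attenuation curves with
# assumed coefficients standing in for per-energy PDDs
mus = np.array([0.10, 0.25, 0.60])
basis = np.exp(-np.outer(mus, depth))        # shape (3, 50)

# Synthesize a noisy "measured" PDD from hidden spectrum weights
w_true = np.array([0.5, 0.3, 0.2])
pdd_meas = w_true @ basis + np.random.default_rng(0).normal(0.0, 1e-3, depth.size)

# Levenberg-Marquardt fit of the spectrum weights to the measured curve
fit = least_squares(lambda w: w @ basis - pdd_meas,
                    x0=np.ones(3) / 3, method="lm")
print(fit.x)
```

With a well-conditioned basis and low measurement noise, the fitted weights recover the hidden ones closely; in the real problem the basis curves come from Monte Carlo transport of mono-energetic beams rather than analytic exponentials.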
Proton pump inhibitors in cirrhosis: Tradition or evidence based practice?
Institute of Scientific and Technical Information of China (English)
Francesca Lodato; Francesco Azzaroli; Maria Di Girolamo; Valentina Feletti; Paolo Cecinato; Andrea Lisotti; Davide Festi; Enrico Roda; Giuseppe Mazzella
2008-01-01
Proton Pump Inhibitors (PPIs) are very effective in inhibiting acid secretion and are extensively used in many acid related diseases. They are also often used in patients with cirrhosis, sometimes in the absence of a specific acid related disease, with the aim of preventing peptic complications in patients with variceal or hypertensive gastropathic bleeding receiving multidrug treatment. Contradictory reports support their use in cirrhosis, and evidence of their efficacy in this condition is poor. Moreover, there are convincing papers suggesting that acid secretion is reduced in patients with liver cirrhosis. With regard to H pylori infection, its prevalence in patients with cirrhosis varies largely among different studies, and it seems that H pylori eradication does not prevent gastro-duodenal ulcer formation and bleeding. With regard to the prevention and treatment of oesophageal complications after banding or sclerotherapy of oesophageal varices, there is little evidence for a protective role of PPIs. Moreover, due to the liver metabolism of PPIs, the dose of most available PPIs should be reduced in cirrhotics. In conclusion, the use of this class of drugs seems more habit related than evidence based, eventually leading to an increase in health costs.
Effects of CT based Voxel Phantoms on Dose Distribution Calculated with Monte Carlo Method
Institute of Scientific and Technical Information of China (English)
Chen Chaobin; Huang Qunying; Wu Yican
2005-01-01
A few CT-based voxel phantoms were produced to investigate the sensitivity of Monte Carlo simulations of X-ray beam and electron beam to the proportions of elements and the mass densities of the materials used to express the patient's anatomical structure. The human body can be well outlined by air, lung, adipose, muscle, soft bone and hard bone to calculate the dose distribution with Monte Carlo method. The effects of the calibration curves established by using various CT scanners are not clinically significant based on our investigation. The deviation from the values of cumulative dose volume histogram derived from CT-based voxel phantoms is less than 1% for the given target.
A Markov Chain Monte Carlo Based Method for System Identification
Energy Technology Data Exchange (ETDEWEB)
Glaser, R E; Lee, C L; Nitao, J J; Hanley, W G
2002-10-22
This paper describes a novel methodology for the identification of mechanical systems and structures from vibration response measurements. It combines prior information, observational data and predictive finite element models to produce configurations and system parameter values that are most consistent with the available data and model. Bayesian inference and a Metropolis simulation algorithm form the basis for this approach. The resulting process enables the estimation of distributions of both individual parameters and system-wide states. Attractive features of this approach include its ability to: (1) provide quantitative measures of the uncertainty of a generated estimate; (2) function effectively when exposed to degraded conditions including: noisy data, incomplete data sets and model misspecification; (3) allow alternative estimates to be produced and compared, and (4) incrementally update initial estimates and analysis as more data becomes available. A series of test cases based on a simple fixed-free cantilever beam is presented. These results demonstrate that the algorithm is able to identify the system, based on the stiffness matrix, given applied force and resultant nodal displacements. Moreover, it effectively identifies locations on the beam where damage (represented by a change in elastic modulus) was specified.
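The Metropolis-based identification idea can be illustrated on a deliberately tiny problem. This is a hedged sketch, not the paper's finite-element setup: a single stiffness k is inferred from noisy displacement "measurements" x = F/k, and every number (true stiffness, force, noise level, proposal width) is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measurements": displacement x = F / k with Gaussian noise (assumed)
k_true, force, sigma = 2.0e3, 10.0, 1.0e-4
x_obs = force / k_true + rng.normal(0.0, sigma, size=20)

def log_post(k):
    """Flat prior on k > 0 plus a Gaussian likelihood for each measurement."""
    if k <= 0:
        return -np.inf
    return -0.5 * np.sum((x_obs - force / k) ** 2) / sigma**2

# Metropolis random walk over the stiffness parameter
samples, k = [], 1.5e3            # deliberately poor starting guess
lp = log_post(k)
for _ in range(20_000):
    k_new = k + rng.normal(0.0, 20.0)
    lp_new = log_post(k_new)
    if np.log(rng.random()) < lp_new - lp:   # accept/reject step
        k, lp = k_new, lp_new
    samples.append(k)

post = np.array(samples[5_000:])  # discard burn-in
print(post.mean(), post.std())
```

The posterior standard deviation is the quantitative uncertainty measure the abstract highlights; in the paper the scalar k is replaced by stiffness-matrix parameters and the forward model by a finite element solve.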
In silico prediction of the β-cyclodextrin complexation based on Monte Carlo method.
Veselinović, Aleksandar M; Veselinović, Jovana B; Toropov, Andrey A; Toropova, Alla P; Nikolić, Goran M
2015-11-10
In this study QSPR models were developed to predict the complexation of structurally diverse compounds with β-cyclodextrin based on SMILES notation optimal descriptors using Monte Carlo method. The predictive potential of the applied approach was tested with three random splits into the sub-training, calibration, test and validation sets and with different statistical methods. Obtained results demonstrate that Monte Carlo method based modeling is a very promising computational method in the QSPR studies for predicting the complexation of structurally diverse compounds with β-cyclodextrin. The SMILES attributes (structural features both local and global), defined as molecular fragments, which are promoters of the increase/decrease of molecular binding constants were identified. These structural features were correlated to the complexation process and their identification helped to improve the understanding for the complexation mechanisms of the host molecules.
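The Monte Carlo optimization of SMILES-attribute correlation weights can be caricatured as a random-perturbation hill climb. Everything below is invented for illustration (toy fragments, toy binding constants) and shows only the shape of the technique, not the authors' actual descriptor scheme or software.

```python
import random

random.seed(0)

# Invented training set: SMILES fragments per molecule and "measured" constants
mols = [
    ({"C", "O", "c1"}, 2.1), ({"C", "N"}, 1.4), ({"O", "c1"}, 2.6),
    ({"C", "O", "N"}, 1.9), ({"c1", "N"}, 1.7), ({"C", "c1"}, 2.0),
]
attrs = sorted(set().union(*(frags for frags, _ in mols)))
cw = {a: 1.0 for a in attrs}          # correlation weights to be optimized

def corr(cw):
    """Pearson correlation between summed fragment weights and observations."""
    xs = [sum(cw[a] for a in frags) for frags, _ in mols]
    ys = [y for _, y in mols]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5 if sxx > 0 and syy > 0 else 0.0

# Monte Carlo optimization: keep a random perturbation only if it improves r
start = best = corr(cw)
for _ in range(5000):
    a = random.choice(attrs)
    old = cw[a]
    cw[a] += random.uniform(-0.1, 0.1)
    new = corr(cw)
    if new > best:
        best = new
    else:
        cw[a] = old

print(start, best)
```

The optimized weights then act as the "promoters of increase/decrease" the abstract refers to: a large positive weight flags a fragment associated with stronger binding in the training data.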
Simulation model based on Monte Carlo method for traffic assignment in local area road network
Institute of Scientific and Technical Information of China (English)
Yuchuan DU; Yuanjing GENG; Lijun SUN
2009-01-01
For a local area road network, the available traffic data are the flow volumes at key intersections, not the complete OD matrix. Considering the circumstantial characteristics and the data availability of a local area road network, a new model for traffic assignment based on Monte Carlo simulation of intersection turning movements is provided in this paper. Because of its good stability over time, the turning ratio is adopted as the key parameter of this model. The formulation for local area road network assignment problems is proposed on the assumption of random turning behavior. The traffic assignment model based on the Monte Carlo method has been used in traffic analysis for an actual urban road network. A comparison of surveyed traffic flow data with the flows determined by the model verifies the applicability and validity of the proposed methodology.
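The turning-movement simulation can be sketched on an invented two-intersection corridor: each vehicle is walked through the network and turns at each node according to assumed turning ratios, and movement flows are accumulated. The network topology and ratios below are hypothetical.

```python
import random

random.seed(0)

# Hypothetical two-intersection corridor with assumed turning ratios
turn_ratios = {
    "A": {"left": 0.2, "through": 0.7, "right": 0.1},
    "B": {"left": 0.3, "through": 0.5, "right": 0.2},
}

def assign(n_vehicles):
    """Walk each vehicle through the corridor, turning at random per the ratios."""
    flows = {(node, t): 0 for node in turn_ratios for t in turn_ratios[node]}
    for _ in range(n_vehicles):
        for node, ratios in turn_ratios.items():
            turn = random.choices(list(ratios), weights=list(ratios.values()))[0]
            flows[(node, turn)] += 1
            if turn != "through":      # vehicle leaves the corridor here
                break
    return flows

flows = assign(10_000)
print(flows)
```

The simulated movement counts converge to the products of the entry volume and the turning ratios along each path, which is how link flows can be reconstructed without a full OD matrix.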
Final Project Report for project titled "Fluoroalkylphosphonic-acid-based proton conductors"
Energy Technology Data Exchange (ETDEWEB)
Stephen Creager
2011-12-08
The overall objective of this research was to create new proton-conducting polymer electrolytes for use in energy conversion devices including hydrogen fuel cells that could operate at high temperatures (95–130 °C) and under low relative humidity (< 50% RH) conditions. The new polymers were based on the fluoroalkylphosphonic and phosphinic acid (FPA) groups (see illustration below) which offer prospects for rapid proton transport by a proton-hopping mechanism similar to that which operates in phosphoric acid, a well-known proton-transporting electrolyte that is used in a class of hydrogen fuel cells that work well under the conditions noted above and are already commercially successful. The two specific project objectives were as follows: (1) synthesize and characterize new proton-conducting electrolytes based on the fluoroalkylphosphonic and phosphinic acid (FPA) functional groups; and (2) create and apply new computer models to study protonic conduction in FPA-based electrolytes. The project was successful in creating the desired polymer electrolytes and also a series of molecular model compounds which were used to study proton transport in FPA electrolytes in general. Computer models were created to study both structure and proton-transport dynamics in the electrolytes, particularly the molecular model compounds. Rapid proton transport by a hopping mechanism was found in many of the model compounds and correlations of transport rates with molecular structure were identified. Several polymeric analogs of FPA model compounds were prepared and studied; however, FPA-based polymeric materials having very high protonic conductivities under either wet or dry conditions were not obtained. Several possible reasons for the failure of polymeric materials to exhibit the expected high protonic conductivities were identified, including a failure of the polymers to adopt the phase-separated secondary structure/morphology necessary for high proton conductivity, and an
Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes
Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R
2001-01-01
This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...
Hu, Xingzhi; Chen, Xiaoqian; Parks, Geoffrey T.; Yao, Wen
2016-10-01
Ever-increasing demands of uncertainty-based design, analysis, and optimization in aerospace vehicles motivate the development of Monte Carlo methods with wide adaptability and high accuracy. This paper presents a comprehensive review of typical improved Monte Carlo methods and summarizes their characteristics to aid uncertainty-based multidisciplinary design optimization (UMDO). Among them, Bayesian inference tackles problems for which prior information, such as measurement data, is available. Importance sampling (IS) addresses inconvenient sampling and difficult propagation by incorporating an intermediate importance distribution or sequential distributions. Optimized Latin hypercube sampling (OLHS) is a stratified sampling approach that achieves better space-filling and non-collapsing characteristics. Meta-modeling approximation based on Monte Carlo saves computational cost by using cheap meta-models for the output response. All the reviewed methods are illustrated by corresponding aerospace applications, which are compared to show their techniques and usefulness in UMDO, thus providing a beneficial reference for future theoretical and applied research.
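The importance-sampling idea summarized above can be sketched in a few lines of Python. The standard-normal tail probability and the shifted proposal below are illustrative choices for demonstration, not taken from the review:

```python
import math
import random

def tail_probability_is(threshold=4.0, shift=4.0, n=100_000, seed=1):
    """Estimate the rare-event probability P(X > threshold) for X ~ N(0, 1).

    Draws from a proposal N(shift, 1) centred on the rare region, then
    reweights each hit by the likelihood ratio p(x)/q(x), so far fewer
    samples are wasted than with plain Monte Carlo.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)  # sample from the proposal q
        if x > threshold:
            # ratio of N(0,1) to N(shift,1) densities at x
            total += math.exp(-0.5 * x * x + 0.5 * (x - shift) ** 2)
    return total / n
```

For threshold 4 the exact value is about 3.2 × 10^-5; plain Monte Carlo would need millions of samples to see a handful of hits, while the shifted proposal lands in the rare region roughly half the time.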
Energy Technology Data Exchange (ETDEWEB)
Jones, Kevin C.; Solberg, Timothy D.; Avery, Stephen, E-mail: Stephen.Avery@uphs.upenn.edu [Department of Radiation Oncology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States); Vander Stappen, François; Janssens, Guillaume; Prieels, Damien [Ion Beam Applications SA, Louvain-la-Neuve 1348 (Belgium); Bawiec, Christopher R.; Lewin, Peter A. [School of Biomedical Engineering, Drexel University, Philadelphia, Pennsylvania 19104 (United States); Sehgal, Chandra M. [Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104 (United States)
2015-12-15
Purpose: To measure the acoustic signal generated by a pulsed proton spill from a hospital-based clinical cyclotron. Methods: An electronic function generator modulated the IBA C230 isochronous cyclotron to create a pulsed proton beam. The acoustic emissions generated by the proton beam were measured in water using a hydrophone. The acoustic measurements were repeated with increasing proton current and increasing distance between detector and beam. Results: The cyclotron generated proton spills with rise times of 18 μs and a maximum measured instantaneous proton current of 790 nA. Acoustic emissions generated by the proton energy deposition were measured to be on the order of mPa. The origin of the acoustic wave was identified as the proton beam based on the correlation between acoustic emission arrival time and distance between the hydrophone and proton beam. The acoustic frequency spectrum peaked at 10 kHz, and the acoustic pressure amplitude increased monotonically with increasing proton current. Conclusions: The authors report the first observation of acoustic emissions generated by a proton beam from a hospital-based clinical cyclotron. When modulated by an electronic function generator, the cyclotron is capable of creating proton spills with fast rise times (18 μs) and high instantaneous currents (790 nA). Measurements of the proton-generated acoustic emissions in a clinical setting may provide a method for in vivo proton range verification and patient monitoring.
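The range-verification argument above rests on arrival time scaling linearly with source-detector distance at the speed of sound in water. A minimal sketch (hypothetical positions and noise-free times, not the paper's data) recovers the sound speed from simulated arrival times by a least-squares fit through the origin:

```python
def fit_sound_speed(distances_m, times_s):
    """Least-squares slope through the origin for the model t = d / c.

    Minimising sum((t_i - d_i / c)^2) over 1/c gives
    c = sum(d_i^2) / sum(d_i * t_i).
    """
    num = sum(d * d for d in distances_m)
    den = sum(d * t for d, t in zip(distances_m, times_s))
    return num / den

# hypothetical hydrophone offsets and noise-free arrival times
distances = [0.05, 0.10, 0.15, 0.20]      # metres from the beam axis
times = [d / 1482.0 for d in distances]   # seconds, c_water ~ 1482 m/s
```

With measured (noisy) arrival times the recovered slope serves as the consistency check that identifies the proton beam as the acoustic source.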
Molecular Design of Ionization-Induced Proton Switching Element Based on Fluorinated DNA Base Pair.
Tachikawa, Hiroto; Kawabata, Hiroshi
2016-03-10
To theoretically design a high-performance proton-switching element based on a DNA base pair, the effects of fluorine substitution on the rate of proton transfer (PT) in a model DNA base pair have been investigated by means of the direct ab initio molecular dynamics (AIMD) method. The 2-aminopyridine dimer, (AP)2, was used as the model of the DNA base pair. One of the hydrogen atoms of an AP molecule in the dimer was substituted by a fluorine (F) atom, and the structures of the resulting dimer, denoted F-(AP)2, were fully optimized at the MP2/6-311++G(d,p) level. The direct AIMD calculations showed that the proton is transferred within the base pair after vertical ionization. The rates of PT in F-(AP)2(+) were calculated and compared with that of (AP)2(+) without an F atom. It was found that the PT rate is accelerated by the F-substitution. Also, the direction of PT between the F-AP and AP molecules can be clearly controlled by the position of F-substitution in the (AP)2 dimer.
Pair correlations in iron-based superconductors: Quantum Monte Carlo study
Energy Technology Data Exchange (ETDEWEB)
Kashurnikov, V.A.; Krasavin, A.V., E-mail: avkrasavin@gmail.com
2014-08-01
A new generalized quantum continuous-time world-line Monte Carlo algorithm was developed to calculate pair correlation functions for two-dimensional FeAs clusters modeling iron-based superconductors, using a two-orbital model. The data obtained for clusters with sizes up to 10×10 FeAs cells favor the possibility of an effective attraction between charge carriers corresponding to the A1g symmetry at some interaction parameters. The analysis of pair correlations depending on the cluster size, temperature, interaction, and the type of symmetry of the order parameter is carried out. - Highlights: • A new generalized quantum continuous-time world-line Monte Carlo algorithm is developed. • Pair correlation functions for two-dimensional FeAs clusters are calculated. • Parameters of the two-orbital model corresponding to carrier attraction are defined.
Electron density of states of Fe-based superconductors: Quantum trajectory Monte Carlo method
Kashurnikov, V. A.; Krasavin, A. V.; Zhumagulov, Ya. V.
2016-03-01
The spectral and total electron densities of states in two-dimensional FeAs clusters, which simulate iron-based superconductors, have been calculated using the generalized quantum Monte Carlo algorithm within the full two-orbital model. Spectra have been reconstructed by solving the integral equation relating the Matsubara Green's function and spectral density by the method combining the gradient descent and Monte Carlo algorithms. The calculations have been performed for clusters with dimensions up to 10 × 10 FeAs cells. The profiles of the Fermi surface for the entire Brillouin zone have been presented in the quasiparticle approximation. Data for the total density of states near the Fermi level have been obtained. The effect of the interaction parameter, size of the cluster, and temperature on the spectrum of excitations has been studied.
Directory of Open Access Journals (Sweden)
He Deyu
2016-09-01
Assessing the risks of steering system faults in underwater vehicles is a human-machine-environment (HME) systematic safety field that studies faults in the steering system itself, the driver's human reliability (HR) and various environmental conditions. This paper proposes a fault risk assessment method for an underwater vehicle steering system based on virtual prototyping and Monte Carlo simulation. A virtual steering system prototype was established and validated to rectify a lack of historical fault data. Fault injection and simulation were conducted to acquire fault simulation data. A Monte Carlo simulation was adopted that integrated randomness due to the human operator and environment. Randomness and uncertainty of the human, machine and environment were integrated in the method to obtain a probabilistic risk indicator. To verify the proposed method, a case of stuck rudder fault (SRF) risk assessment was studied. This method may provide a novel solution for fault risk assessment of a vehicle or other general HME system.
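A toy version of such a probabilistic risk indicator can be sketched as follows. The fault rate, operator recovery model and sea-state dependence are invented for illustration and are not the paper's values:

```python
import random

def stuck_rudder_risk(n=50_000, seed=7, p_fault=0.02, p_recover_base=0.9):
    """Monte Carlo estimate of a mission-loss risk indicator.

    Each trial draws a steering fault (machine), a sea-state severity
    (environment) and an operator recovery outcome (human reliability);
    all rates here are hypothetical placeholders.
    """
    rng = random.Random(seed)
    losses = 0
    for _ in range(n):
        if rng.random() >= p_fault:
            continue                     # no steering fault this mission
        sea_state = rng.random()         # environment severity in [0, 1)
        # rougher seas degrade the operator's chance of recovering
        p_recover = p_recover_base * (1.0 - 0.5 * sea_state)
        if rng.random() > p_recover:
            losses += 1
    return losses / n
```

The returned frequency plays the role of the probabilistic risk indicator; in the paper the machine term comes from fault-injected virtual-prototype simulations rather than a fixed rate.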
ERSN-OpenMC, a Java-based GUI for OpenMC Monte Carlo code
Directory of Open Access Journals (Sweden)
Jaafar EL Bakkali
2016-07-01
OpenMC is a new Monte Carlo particle transport simulation code focused on solving two types of neutronic problems: k-eigenvalue criticality fission source problems and external fixed fission source problems. OpenMC does not have a graphical user interface; our Java-based application, ERSN-OpenMC, provides one. The main feature of this application is to give users an easy-to-use and flexible graphical interface to build better and faster simulations, with less effort and great reliability. Additionally, this graphical tool was developed with several features, such as the ability to automate the build process of the OpenMC code and related libraries; users are also given the freedom to customize their installation of this Monte Carlo code. A full description of the ERSN-OpenMC application is presented in this paper.
GPU-accelerated Monte Carlo simulation of particle coagulation based on the inverse method
Wei, J.; Kruis, F. E.
2013-09-01
Simulating particle coagulation using Monte Carlo methods is in general a challenging computational task due to its numerical complexity and computing cost. Currently, the lowest computing costs are obtained by applying a graphics processing unit (GPU), originally developed for speeding up graphics processing in the consumer market. In this article we present a GPU implementation of a Monte Carlo method based on the inverse scheme for simulating particle coagulation. The abundant data parallelism embedded within the Monte Carlo method is explained, as it allows an efficient parallelization of the MC code on the GPU. Furthermore, the computational accuracy of the MC on the GPU was validated against a benchmark, a CPU-based discrete-sectional method. To evaluate the performance gains of using the GPU, the computing time on the GPU was compared against that of its sequential counterpart on the CPU. The measured speedups show that the GPU can accelerate the execution of the MC code by a factor of 10-100, depending on the chosen number of simulation particles. The algorithm shows a linear dependence of computing time on the number of simulation particles, which is a remarkable result in view of the n² dependence of coagulation.
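An event-driven Monte Carlo coagulation step can be sketched as below. Note this uses the simpler acceptance-rejection variant of pair selection; the paper's inverse scheme instead inverts cumulative event rates, but the event-driven structure is the same:

```python
import random

def coagulation_event(volumes, kernel, k_max, rng):
    """Perform one coagulation event by acceptance-rejection pair selection.

    A candidate pair (i, j) is drawn uniformly and accepted with
    probability K(v_i, v_j) / k_max, where k_max majorises the kernel;
    the accepted pair coalesces, conserving total volume.
    """
    n = len(volumes)
    while True:
        i, j = rng.randrange(n), rng.randrange(n)
        if i != j and rng.random() < kernel(volumes[i], volumes[j]) / k_max:
            volumes[i] += volumes[j]   # merge particle j into particle i
            volumes.pop(j)
            return volumes
```

With the constant kernel K = 1 every candidate pair is accepted, and each event reduces the particle count by one while conserving total volume. On a GPU, many such events (or the kernel evaluations behind them) are evaluated in parallel, which is the data parallelism the article exploits.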
Oxide-based protonic conductors: Point defects and transport properties
DEFF Research Database (Denmark)
Bonanos, N.
2001-01-01
that determine the protonic concentrations are considered, with emphasis on the regime of low oxygen partial pressure. The measurement of the thermoelectric power (TEP) and of the H+/D+ isotope effect in conductivity are discussed as a means of characterising the conduction process.
Monte Carlo Methods in Materials Science Based on FLUKA and ROOT
Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor
2003-01-01
A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application, and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlos can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade) to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which Monte Carlos are particularly suited is the study of secondary radiation produced as albedos in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials employing the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although the possible improvement of the DPMJET event generator for energies 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy ion interactions below 3 GeV/A. The ROOT interface is being developed in conjunction with the
Ulmer, W
2010-01-01
We have developed a model for proton depth dose and lateral distributions based on Monte Carlo calculations (GEANT4) and an integration procedure for the Bethe-Bloch equation (BBE). The model accounts for the transport of primary and secondary protons, the creation of recoil protons and heavy recoil nuclei, and the lateral scattering of these contributions. The buildup, which is experimentally observed in higher-energy depth dose curves, is modeled by inclusion of two different origins: (1) secondary reaction protons, contributing ca. 65% of the buildup (for monoenergetic protons); (2) Landau tails as well as Gaussian-type fluctuations for range-straggling effects. All parameters of the model for initially monoenergetic proton beams have been obtained from Monte Carlo calculations or checked by them. Furthermore, there are a few parameters which can be obtained by fitting the model to measured depth dose curves in order to describe individual characteristics of the beamline - the most important b...
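The integration of the Bethe-Bloch equation mentioned above can be illustrated by computing a proton CSDA range in water from a simplified Bethe stopping-power formula (no shell or density corrections; standard water parameters Z/A ≈ 0.555 and I ≈ 75 eV are assumed). This bare integration reproduces only the mean range, not the straggling or secondary-proton buildup terms the model adds:

```python
import math

def stopping_power_water(T_mev):
    """Simplified Bethe stopping power for protons in water (MeV/cm).

    No shell or density corrections, so it is only trustworthy above
    roughly 1 MeV; water is taken as Z/A = 0.555, I = 75 eV, rho = 1 g/cm^3.
    """
    m_p, m_e = 938.272, 0.511          # proton / electron rest energy (MeV)
    I = 75e-6                          # mean excitation energy (MeV)
    k_za = 0.307075 * 0.555            # K * Z/A  (MeV cm^2 / g)
    gamma = 1.0 + T_mev / m_p
    beta2 = 1.0 - 1.0 / gamma ** 2
    arg = 2.0 * m_e * beta2 * gamma ** 2 / I   # 2 m_e c^2 beta^2 gamma^2 / I
    return k_za / beta2 * (math.log(arg) - beta2)

def csda_range(T_mev, T_min=1.0, steps=2000):
    """CSDA range (cm) via trapezoidal integration of dT / S(T)."""
    f = lambda T: 1.0 / stopping_power_water(T)
    h = (T_mev - T_min) / steps
    total = 0.5 * (f(T_min) + f(T_mev))
    for k in range(1, steps):
        total += f(T_min + k * h)
    return total * h
```

A 150 MeV proton comes out near 16 cm of water, close to tabulated CSDA ranges; the residual range below T_min contributes only a few tens of micrometres.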
Comparing analytical and Monte Carlo optical diffusion models in phosphor-based X-ray detectors
Kalyvas, N.; Liaparinos, P.
2014-03-01
Luminescent materials are employed as radiation-to-light converters in detectors of medical imaging systems, often referred to as phosphor screens. Several processes affect the light transfer properties of phosphors; amongst the most important is the interaction of light. Light attenuation (absorption and scattering) can be described either through "diffusion" theory in theoretical models or "quantum" theory in Monte Carlo methods. Although analytical methods, based on photon diffusion equations, have been preferentially employed to investigate optical diffusion in the past, Monte Carlo simulation models can overcome several of the analytical modelling assumptions. The present study aimed to compare both methodologies and investigate the dependence of the analytical model's optical parameters on particle size. It was found that the optical photon attenuation coefficients calculated by analytical modeling decrease with particle size (in the region 1-12 μm). In addition, for particle sizes smaller than 6 μm there is no simultaneous agreement between the theoretical modulation transfer function and light escape values with respect to the Monte Carlo data.
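A minimal Monte Carlo of optical photon transport in a slab conveys the "quantum" treatment the study compares against diffusion theory. The geometry, mean free path and scattering albedo below are illustrative, not the study's phosphor parameters:

```python
import math
import random

def light_escape_fraction(thickness=10.0, mfp=1.0, albedo=0.95,
                          n_photons=20_000, seed=5):
    """1-D Monte Carlo of optical photons born at mid-depth of a phosphor slab.

    Photons take exponentially distributed free paths, scatter isotropically
    with survival probability `albedo`, and are scored when they cross the
    exit face (z <= 0).  All parameters are illustrative.
    """
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_photons):
        z = thickness / 2.0                    # creation depth
        mu = rng.uniform(-1.0, 1.0)            # isotropic direction cosine
        while True:
            z += mu * -mfp * math.log(1.0 - rng.random())  # free path
            if z <= 0.0:
                escaped += 1                   # reached the exit face
                break
            if z >= thickness:
                break                          # lost through the far face
            if rng.random() > albedo:
                break                          # absorbed
            mu = rng.uniform(-1.0, 1.0)        # isotropic re-emission
    return escaped / n_photons
```

Running this for several thicknesses shows the expected drop in light escape with screen thickness, the kind of quantity the analytical diffusion model must also predict.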
Energy Technology Data Exchange (ETDEWEB)
Murata, Isao [Osaka Univ., Suita (Japan); Mori, Takamasa; Nakagawa, Masayuki; Itakura, Hirofumi
1996-03-01
A method to calculate neutronics parameters of a core composed of randomly distributed spherical fuels has been developed, based on a statistical geometry model with a continuous-energy Monte Carlo method. This method was implemented in the general-purpose Monte Carlo code MCNP, and a new code, MCNP-CFP, has been developed. This paper describes the model and method, how to use it, and the validation results. In the Monte Carlo calculation, the location of a spherical fuel is sampled probabilistically along the particle flight path from the spatial probability distribution of spherical fuels, called the nearest neighbor distribution (NND). This sampling method was validated through the following two comparisons: (1) calculations of the inventory of coated fuel particles (CFPs) in a fuel compact by both the track length estimator and the direct evaluation method, and (2) criticality calculations for ordered packed geometries. The method was also confirmed by applying it to an analysis of the critical assembly experiment at VHTRC. The method established in the present study is unique in that it treats a probabilistic model of a geometry with a great number of randomly distributed spherical fuels. With future speed-ups from vector or parallel computation, it is expected to be widely used in calculations of nuclear reactor cores, especially HTGR cores. (author).
Proton conductive membranes based on doped sulfonated polytriazole
Energy Technology Data Exchange (ETDEWEB)
Boaventura, M.; Brandao, L.; Mendes, A. [Laboratorio de Engenharia de Processos, Ambiente e Energia (LEPAE), Faculdade de Engenharia da Universidade do Porto, Rua Roberto Frias, 4200-465 Porto (Portugal); Ponce, M.L.; Nunes, S.P. [GKSS Research Centre Geesthacht GmbH, Max Planck Str. 1, D-21502, Geesthacht (Germany)
2010-11-15
This work reports the preparation and characterization of proton-conducting sulfonated polytriazole membranes doped with three different agents: 1H-benzimidazole-2-sulfonic acid, benzimidazole and phosphoric acid. The modified membranes were characterized by scanning electron microscopy (SEM), infrared spectra, thermogravimetric analysis (TGA), dynamical mechanical thermal analysis (DMTA) and electrochemical impedance spectroscopy (EIS). The addition of doping agents resulted in a decrease of the glass transition temperature. For membranes doped with 85 wt.% phosphoric acid solution, proton conductivity increased up to 2 × 10^-3 S cm^-1 at 120 °C and 5% relative humidity. The performance of the phosphoric acid doped membranes was evaluated in a fuel cell set-up at 120 °C and 2.5% relative humidity. (author)
Proton exchange membranes based on PVDF/SEBS blends
Energy Technology Data Exchange (ETDEWEB)
Mokrini, A.; Huneault, M.A. [Industrial Materials Institute, National Research Council of Canada, 75 de Mortagne Blvd., Boucherville, Que. (Canada J4B 6Y4)
2006-03-09
Proton-conductive polymer membranes are used as the electrolyte in so-called proton exchange membrane fuel cells. Current commercially available membranes are perfluorosulfonic acid polymers, a class of high-cost ionomers. This paper examines the potential of polymer blends, namely those of styrene-(ethylene-butylene)-styrene block copolymer (SEBS) and polyvinylidene fluoride (PVDF), in the proton exchange membrane application. SEBS/PVDF blends were prepared by twin-screw extrusion and the membranes were formed by calendering. SEBS is a phase-segregated material whose polystyrene blocks can be selectively functionalized, offering high ionic conductivity, while PVDF ensures good dimensional stability and chemical resistance of the films. Proton conductivity of the films was obtained by solid-state grafting of sulfonic acid moieties. The obtained membranes were characterized in terms of conductivity, ion exchange capacity and water uptake. In addition, the membranes were characterized in terms of morphology, microstructure and thermo-mechanical properties to establish the blends' morphology-property relationships. Modification of the interfacial properties between SEBS and PVDF was found to be key to optimizing the blends' performance. Addition of a methyl methacrylate-butyl acrylate-methyl methacrylate block copolymer (MMA-BA-MMA) was found to compatibilize the blend by reducing the segregation scale and improving the blend homogeneity. Mechanical resistance of the membranes was also improved through the addition of this compatibilizer. As little as 2 wt.% compatibilizer was sufficient for complete interfacial coverage and led to improved mechanical properties. Compatibilized blend membranes also showed higher conductivities, 1.9 × 10^-2 to 5.5 × 10^-3 S cm^-1, and improved water management. (author)
Thermal transport in nanocrystalline Si and SiGe by ab initio based Monte Carlo simulation.
Yang, Lina; Minnich, Austin J
2017-03-14
Nanocrystalline thermoelectric materials based on Si have long been of interest because Si is earth-abundant, inexpensive, and non-toxic. However, a poor understanding of phonon grain boundary scattering and its effect on thermal conductivity has impeded efforts to improve the thermoelectric figure of merit. Here, we report an ab-initio based computational study of thermal transport in nanocrystalline Si-based materials using a variance-reduced Monte Carlo method with the full phonon dispersion and intrinsic lifetimes from first-principles as input. By fitting the transmission profile of grain boundaries, we obtain excellent agreement with experimental thermal conductivity of nanocrystalline Si [Wang et al. Nano Letters 11, 2206 (2011)]. Based on these calculations, we examine phonon transport in nanocrystalline SiGe alloys with ab-initio electron-phonon scattering rates. Our calculations show that low energy phonons still transport substantial amounts of heat in these materials, despite scattering by electron-phonon interactions, due to the high transmission of phonons at grain boundaries, and thus improvements in ZT are still possible by disrupting these modes. This work demonstrates the important insights into phonon transport that can be obtained using ab-initio based Monte Carlo simulations in complex nanostructured materials.
The precision of respiratory-gated delivery of synchrotron-based pulsed beam proton therapy
Energy Technology Data Exchange (ETDEWEB)
Tsunashima, Yoshikazu; Vedam, Sastry; Dong Lei; Balter, Peter; Mohan, Radhe [Department of Radiation Physics, Unit 94, University of Texas M D Anderson Cancer Center, 1515 Holcombe Blvd, Houston, TX 77030 (United States); Umezawa, Masumi, E-mail: ytsunash@mdanderson.or [Accelerator System Group Medical System Project, Hitachi, Ltd, Energy and Environmental Systems Laboratory, 2-1, Omika-cho 7-chome, Hitachi-shi, Ibaraki-ken 319-1221 (Japan)
2010-12-21
A synchrotron-based proton therapy system operates in a low-repetition-rate pulsed beam delivery mode. Unlike cyclotron-based beam delivery, there is no guarantee that a synchrotron beam can be delivered effectively or precisely under the respiratory-gated mode. To evaluate the performance of gated synchrotron treatment, we simulated proton beam delivery in the synchrotron-based respiratory-gated mode using realistic patient breathing signals. Parameters used in the simulation were respiratory motion traces (70 traces from 24 patients), respiratory gate levels (10%, 20% and 30% duty cycles at the exhalation phase) and synchrotron magnet excitation cycles (T_cyc) (fixed T_cyc mode: 2.7 s, 3.0-6.0 s and each patient's breathing cycle; and variable T_cyc mode). For each breathing trace, the simulation computed when the proton beams were delivered. In the shorter fixed T_cyc (<4 s), most of the proton beams were delivered uniformly to the target during the entire expiration phase of the respiratory cycle. In the longer fixed T_cyc (>4 s) and the variable T_cyc mode, the proton beams were not consistently delivered during the end-expiration phase of the respiratory cycle. However, we found that the longer and variable T_cyc operation modes delivered proton beams more precisely during irregular breathing.
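The interplay between gate windows and magnet excitation cycles can be sketched with a toy simulation of perfectly periodic breathing. The gate model and spill length below are simplifying assumptions; the paper's simulations use measured, irregular breathing traces:

```python
def gated_delivery_efficiency(breath_period=4.0, duty=0.3, t_cyc=3.0,
                              spill_len=0.5, t_total=600.0, dt=0.01):
    """Fraction of available synchrotron spill time falling inside the gate.

    The gate is open for the `duty` fraction at the start of each (perfectly
    periodic) breathing cycle; the synchrotron offers beam for `spill_len`
    seconds once per magnet excitation cycle `t_cyc`.  All numbers are
    illustrative.
    """
    delivered = possible = 0.0
    for k in range(int(t_total / dt)):
        t = k * dt
        gate_open = (t % breath_period) / breath_period < duty
        beam_on = (t % t_cyc) < spill_len
        if beam_on:
            possible += dt
            if gate_open:
                delivered += dt
    return delivered / possible
```

Even in this idealized setting the efficiency depends on how the breathing period beats against T_cyc, which is the effect the paper quantifies for realistic traces.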
GFS algorithm based on batch Monte Carlo trials for solving global optimization problems
Popkov, Yuri S.; Darkhovskiy, Boris S.; Popkov, Alexey Y.
2016-10-01
A new method for global optimization of Hölder goal functions over compact sets given by inequalities is proposed. All functions are defined only algorithmically. The method is based on performing simple Monte Carlo trials and constructing the sequence of records and the sequence of their decrements. A procedure for estimating the Hölder constants is proposed. A probability estimate for the neighborhood of the exact global minimum, based on the Hölder constant estimates, is presented. Results on some analytical and algorithmic test problems illustrate the method's performance.
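The record-and-decrement bookkeeping at the heart of the method can be sketched as follows; the test function and uniform sampling are illustrative, and the Hölder-constant estimation step is omitted:

```python
import random

def mc_records(f, sample, n_trials=10_000, seed=3):
    """Batch Monte Carlo trials tracking records (running minima) and their
    decrements, the two sequences the method builds its estimates from."""
    rng = random.Random(seed)
    records, decrements = [], []
    best = float("inf")
    for _ in range(n_trials):
        v = f(sample(rng))
        if v < best:
            if records:
                decrements.append(best - v)
            best = v
            records.append(best)
    return records, decrements

# usage: a 2-D quadratic test function over the box [-1, 1]^2
f = lambda x: (x[0] - 0.3) ** 2 + (x[1] + 0.2) ** 2
sample = lambda rng: (rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0))
```

The decrement sequence shrinks as the record sequence approaches the global minimum, which is what the probabilistic neighborhood estimate exploits.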
Pair correlation functions of FeAs-based superconductors: Quantum Monte Carlo study
Kashurnikov, V. A.; Krasavin, A. V.
2015-01-01
A new generalized quantum continuous-time world-line Monte Carlo algorithm was developed to calculate pair correlation functions for two-dimensional FeAs clusters modeling iron-based superconductors within the framework of the two-orbital model. The analysis of pair correlations depending on the cluster size, temperature, interaction, and the type of symmetry of the order parameter is carried out. The data obtained for clusters with sizes up to 10×10 FeAs cells favor the possibility of an effective attraction between charge carriers corresponding to the A1g symmetry at some interaction parameters.
PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation
Energy Technology Data Exchange (ETDEWEB)
Espana, S; Herraiz, J L; Vicente, E; Udias, J M [Grupo de Fisica Nuclear, Departmento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, Madrid (Spain); Vaquero, J J; Desco, M [Unidad de Medicina y CirugIa Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es
2009-03-21
Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.
GPU-based Monte Carlo radiotherapy dose calculation using phase-space sources
Townson, Reid; Tian, Zhen; Graves, Yan Jiang; Zavgorodni, Sergei; Jiang, Steve B
2013-01-01
A novel phase-space source implementation has been designed for GPU-based Monte Carlo dose calculation engines. Due to the parallelized nature of GPU hardware, it is essential to simultaneously transport particles of the same type and similar energies but separated spatially to yield a high efficiency. We present three methods for phase-space implementation that have been integrated into the most recent version of the GPU-based Monte Carlo radiotherapy dose calculation package gDPM v3.0. The first method is to sequentially read particles from a patient-dependent phase-space and sort them on-the-fly based on particle type and energy. The second method supplements this with a simple secondary collimator model and fluence map implementation so that patient-independent phase-space sources can be used. Finally, as the third method (called the phase-space-let, or PSL, method) we introduce a novel strategy to pre-process patient-independent phase-spaces and bin particles by type, energy and position. Position bins l...
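The pre-processing idea behind the third (PSL) method, binning particles by type and energy so each GPU batch stays coherent, can be sketched on the CPU; the particle record layout below is hypothetical:

```python
from collections import defaultdict

def bin_phase_space(particles, e_bin_width=1.0):
    """Group phase-space records by (particle type, energy bin).

    Launching one GPU batch per bin keeps threads transporting the same
    particle type at similar energies, avoiding the divergence that a raw,
    sequentially read phase-space would cause.
    """
    bins = defaultdict(list)
    for p in particles:        # hypothetical record: (type, energy, x, y, z)
        key = (p[0], int(p[1] // e_bin_width))
        bins[key].append(p)
    return bins
```

Position binning (the full phase-space-let scheme) simply extends the key with spatial indices.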
A Monte-Carlo based model of the AX-PET demonstrator and its experimental validation.
Solevi, P; Oliver, J F; Gillam, J E; Bolle, E; Casella, C; Chesi, E; De Leo, R; Dissertori, G; Fanti, V; Heller, M; Lai, M; Lustermann, W; Nappi, E; Pauss, F; Rudge, A; Ruotsalainen, U; Schinzel, D; Schneider, T; Séguinot, J; Stapnes, S; Weilhammer, P; Tuna, U; Joram, C; Rafecas, M
2013-08-21
AX-PET is a novel PET detector based on axially oriented crystals and orthogonal wavelength shifter (WLS) strips, both individually read out by silicon photo-multipliers. Its design decouples sensitivity and spatial resolution, by reducing the parallax error due to the layered arrangement of the crystals. Additionally the granularity of AX-PET enhances the capability to track photons within the detector yielding a large fraction of inter-crystal scatter events. These events, if properly processed, can be included in the reconstruction stage further increasing the sensitivity. Its unique features require dedicated Monte-Carlo simulations, enabling the development of the device, interpreting data and allowing the development of reconstruction codes. At the same time the non-conventional design of AX-PET poses several challenges to the simulation and modeling tasks, mostly related to the light transport and distribution within the crystals and WLS strips, as well as the electronics readout. In this work we present a hybrid simulation tool based on an analytical model and a Monte-Carlo based description of the AX-PET demonstrator. It was extensively validated against experimental data, providing excellent agreement.
X-ray imaging plate performance investigation based on a Monte Carlo simulation tool
Energy Technology Data Exchange (ETDEWEB)
Yao, M., E-mail: philippe.duvauchelle@insa-lyon.fr [Laboratoire Vibration Acoustique (LVA), INSA de Lyon, 25 Avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Duvauchelle, Ph.; Kaftandjian, V. [Laboratoire Vibration Acoustique (LVA), INSA de Lyon, 25 Avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Peterzol-Parmentier, A. [AREVA NDE-Solutions, 4 Rue Thomas Dumorey, 71100 Chalon-sur-Saône (France); Schumm, A. [EDF R& D SINETICS, 1 Avenue du Général de Gaulle, 92141 Clamart Cedex (France)
2015-01-01
Computed radiography (CR) based on imaging plate (IP) technology is a potential replacement for traditional film-based industrial radiography. To investigate IP performance, especially at high energies, a Monte Carlo simulation tool based on PENELOPE has been developed. This tool tracks direct and secondary radiation separately and monitors the behavior of the different particles. The simulation output provides the 3D distribution of deposited energy in the IP and an evaluation of radiation spectrum propagation, allowing us to visualize the behavior of different particles and the influence of different elements. A detailed analysis of the spectral and spatial responses of the IP at different energies, up to the MeV range, has been performed. - Highlights: • A Monte Carlo tool for imaging plate (IP) performance investigation is presented. • The tool outputs 3D maps of energy deposition in the IP due to different signals. • The tool also provides the transmitted spectra along the radiation propagation. • An industrial imaging case is simulated with the presented tool. • A detailed analysis of the spectral and spatial responses of the IP is presented.
Excited State Potential Energy Surfaces of Polyenes and Protonated Schiff Bases.
Send, Robert; Sundholm, Dage; Johansson, Mikael P; Pawłowski, Filip
2009-09-08
The potential energy surfaces of the 1Bu and 1A′ states of all-trans-polyenes and the corresponding protonated Schiff bases have been studied at density functional theory and coupled cluster levels. Linear polyenes and protonated Schiff bases with 4 to 12 heavy atoms have been investigated. The calculations show remarkable differences between the excited-state potential energy surfaces of the polyenes and those of the protonated Schiff bases. The excited states of the polyenes exhibit high torsion barriers for single-bond twists and low torsion barriers for double-bond twists. The protonated Schiff bases, on the other hand, are very flexible molecules in the first excited state, with low or vanishing torsion barriers for both single and double bonds. Calculations at density functional theory and coupled cluster levels yield qualitatively similar potential energy surfaces. However, significant differences are found for some single-bond torsions in the longer protonated Schiff bases, indicating a flaw in the employed time-dependent density functional theory methods. The close agreement between the approximate second- and third-order coupled cluster levels indicates that, for these systems, calculations at the second-order coupled cluster level are useful for validating results based on time-dependent density functional theory.
The first proton sponge-based amino acids: synthesis, acid-base properties and some reactivity.
Ozeryanskii, Valery A; Gorbacheva, Anastasia Yu; Pozharskii, Alexander F; Vlasenko, Marina P; Tereznikov, Alexander Yu; Chernov'yants, Margarita S
2015-08-21
The first hybrid base constructed from 1,8-bis(dimethylamino)naphthalene (proton sponge, or DMAN) and glycine, N-methyl-N-(8-dimethylamino-1-naphthyl)aminoacetic acid, was synthesised in high yield; its hydrobromide was structurally characterised and used to determine the acid-base properties via potentiometric titration. It was found that the basic strength of the DMAN-glycine base (pKa = 11.57, H2O) is on the level of amidine amino acids such as arginine and creatine, and its structure, zwitterionic vs. neutral, shows a strong preference for the zwitterionic form according to both spectroscopic (IR, NMR, mass) and theoretical (DFT) approaches. Unlike glycine, the DMAN-glycine zwitterion is N-chiral and is hydrolytically cleaved, with loss of glycolic acid, on heating in DMSO. This reaction, together with the mild decarboxylative conversion of proton sponge-based amino acids into 2,3-dihydroperimidinium salts under air-oxygen, was monitored with the help of the DMAN-alanine amino acid. The newly devised amino acids are unique in that they combine fluorescent, strongly basic and redox-active properties.
Proton radiography to improve proton therapy treatment
Takatsu, J.; van der Graaf, E. R.; Van Goethem, M. -J.; van Beuzekom, M.; Klaver, T.; Visser, J.; Brandenburg, S.; Biegun, A. K.
2016-01-01
The quality of cancer treatment with protons critically depends on an accurate prediction of the proton stopping powers for the tissues traversed by the protons. Today, treatment planning in proton radiotherapy is based on stopping power calculations from densities of X-ray Computed Tomography (CT)
Energy Technology Data Exchange (ETDEWEB)
Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.
2013-07-01
The use of Monte Carlo (MC) simulation has been shown to improve the accuracy of dose calculation compared to the analytical algorithms installed in commercial treatment planning systems, especially for the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, CARMEN, is based on full MC simulation of both the beam transport in the accelerator head and in the patient, and is designed for efficient operation in terms of both the accuracy of the estimate and the required computation time. (Author)
Comment on "A study on tetrahedron-based inhomogeneous Monte-Carlo optical simulation".
Fang, Qianqian
2011-04-19
The Monte Carlo (MC) method is a popular approach to modeling photon propagation inside general turbid media, such as human tissue. Progress has been made in the past year with the independent proposals of two mesh-based Monte Carlo methods employing ray-tracing techniques. Both methods have shown improvements in accuracy and efficiency in modeling complex domains. A recent paper by Shen and Wang [Biomed. Opt. Express 2, 44 (2011)] reported preliminary results towards the cross-validation of the two mesh-based MC algorithms and software implementations, showing a 3-6 fold speed difference between the two software packages. In this comment, we share our views on unbiased software comparisons and discuss additional issues such as the use of pre-computed data, interpolation strategies, the impact of compiler settings, the use of Russian roulette, memory cost and potential pitfalls in measuring algorithm performance. Despite key differences between the two algorithms in the handling of non-tetrahedral meshes, we found that they share similar structure and performance for tetrahedral meshes. A significant fraction of the speed difference observed in the mentioned article was the result of inconsistent use of compilers and libraries.
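Among the implementation details this comment raises, Russian roulette is easy to get subtly wrong in a cross-code comparison. Below is a minimal sketch of the standard unbiased form; the weight threshold and survival probability are illustrative choices, not values taken from either software package:

```python
import random

def russian_roulette(weight, w_min=1e-3, survival_prob=0.1, rng=random):
    """Standard Russian roulette: photons below the weight threshold are
    terminated with probability (1 - survival_prob); survivors get their
    weight boosted so that the expected weight is conserved (unbiased)."""
    if weight >= w_min:
        return weight                      # photon continues unchanged
    if rng.random() < survival_prob:
        return weight / survival_prob      # survivor, weight increased
    return 0.0                             # terminated

# Check unbiasedness empirically: the mean surviving weight should
# equal the input weight.
random.seed(2)
n = 200000
w0 = 1e-4
mean = sum(russian_roulette(w0) for _ in range(n)) / n
```

Because the expected weight is preserved, enabling or disabling this step changes variance and speed but not the estimator's mean, which is exactly why benchmarks must state whether it is on.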
Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz Ram model
Morin, Mario A.; Ficarazzo, Francesco
2006-04-01
Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications, including increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables, such as rock mass properties, site geology, in situ fracturing and blasting parameters, and as such has no complete theoretical solution for its prediction. However, empirical models for estimating the size distribution of rock fragments have been developed. In this study, a Monte Carlo-based blast fragmentation simulator, built on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact rock and joint properties, the type and properties of the explosives, and the drilling pattern. Results produced by this simulator compared quite favorably with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as our confidence in these empirical models. This understanding will translate into improvements in blasting operations, their corresponding costs, and the overall economics of open pit mines and rock quarries.
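A Monte Carlo driver of the kind described can be sketched in a few lines. The mean-size and passing-fraction expressions below follow the usual textbook forms of the Kuz-Ram model (Kuznetsov equation plus Rosin-Rammler curve), but every numerical input (rock factor distribution, charge, uniformity index) is a hypothetical placeholder, not data from this study:

```python
import math
import random

def kuz_ram_x50(rock_factor, burden_vol_m3, charge_kg, rws=100.0):
    """Kuznetsov mean fragment size X50 in cm. rws is the relative
    weight strength of the explosive (ANFO = 100). Inputs are
    illustrative, not site data."""
    return (rock_factor * (burden_vol_m3 / charge_kg) ** 0.8
            * charge_kg ** (1.0 / 6.0) * (115.0 / rws) ** (19.0 / 20.0))

def fraction_passing(x_cm, x50_cm, n_uniformity):
    """Rosin-Rammler cumulative fraction of fragments smaller than x."""
    return 1.0 - math.exp(-0.693 * (x_cm / x50_cm) ** n_uniformity)

random.seed(3)
# Propagate uncertainty in the rock factor A through the model.
samples = [kuz_ram_x50(random.gauss(9.0, 1.0), 50.0, 120.0)
           for _ in range(5000)]
mean_x50 = sum(samples) / len(samples)
# Probability-weighted fraction of fragments finer than 30 cm.
p_pass_30cm = sum(fraction_passing(30.0, x, 1.3)
                  for x in samples) / len(samples)
```

Sampling the uncertain inputs rather than using point values is what turns the deterministic Kuz-Ram prediction into a distribution of fragmentation curves.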
Diffenderfer, Eric S; Dolney, Derek; Schaettler, Maximilian; Sanzari, Jenine K; McDonough, James; Cengel, Keith A
2014-03-01
The space radiation environment imposes increased dangers of exposure to ionizing radiation, particularly during a solar particle event (SPE). These events consist primarily of low energy protons that produce a highly inhomogeneous dose distribution. Due to this inherent dose heterogeneity, experiments designed to investigate the radiobiological effects of SPE radiation present difficulties in evaluating and interpreting dose to sensitive organs. To address this challenge, we used the Geant4 Monte Carlo simulation framework to develop dosimetry software that uses computed tomography (CT) images and provides radiation transport simulations incorporating all relevant physical interaction processes. We found that this simulation accurately predicts measured data in phantoms and can be applied to model dose in radiobiological experiments with animal models exposed to charged particle (electron and proton) beams. This study clearly demonstrates the value of Monte Carlo radiation transport methods for two critically interrelated uses: (i) determining the overall dose distribution and dose levels to specific organ systems for animal experiments with SPE-like radiation, and (ii) interpreting the effect of random and systematic variations in experimental variables (e.g. animal movement during long exposures) on the dose distributions and consequent biological effects from SPE-like radiation exposure. The software developed and validated in this study represents a critically important new tool that allows integration of computational and biological modeling for evaluating the biological outcomes of exposures to inhomogeneous SPE-like radiation dose distributions, and has potential applications for other environmental and therapeutic exposure simulations.
GPU-based high performance Monte Carlo simulation in neutron transport
Energy Technology Data Exchange (ETDEWEB)
Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Inteligencia Artificial Aplicada], e-mail: cmnap@ien.gov.br
2009-07-01
Graphics Processing Units (GPUs) are high-performance co-processors originally intended to improve the use and quality of computer graphics applications. Once researchers and practitioners realized the potential of GPUs for general-purpose computing, their application was extended to fields outside the scope of computer graphics. The main objective of this work is to evaluate the impact of using GPUs in neutron transport simulation by the Monte Carlo method. To accomplish this, GPU- and CPU-based (single- and multi-core) approaches were developed and applied to a simple but time-consuming problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)
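The abstract does not specify its "simple but time-consuming problem"; a representative stand-in, and a plausible CPU baseline for such a GPU comparison, is analog one-group neutron transport through a homogeneous slab. Cross sections and geometry below are invented for illustration:

```python
import math
import random

def neutron_transmission(sigma_t, sigma_a, thickness_cm, n_histories, seed=0):
    """Analog Monte Carlo estimate of the fraction of neutrons
    transmitted through a 1D slab (one energy group, isotropic
    scattering). sigma_t / sigma_a are the total and absorption
    macroscopic cross sections in 1/cm."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_histories):
        x, mu = 0.0, 1.0                      # start at surface, forward
        while True:
            # Sample a free flight from the exponential distribution.
            x += mu * (-math.log(1.0 - rng.random()) / sigma_t)
            if x >= thickness_cm:
                transmitted += 1              # escaped through the slab
                break
            if x < 0.0:
                break                         # leaked back out the front
            if rng.random() < sigma_a / sigma_t:
                break                         # absorbed at the collision
            mu = 2.0 * rng.random() - 1.0     # isotropic scatter
    return transmitted / n_histories

t = neutron_transmission(1.0, 0.5, 2.0, 20000)
```

Each history is independent, which is exactly why this workload maps well onto one-thread-per-history GPU execution.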
IMPROVED ALGORITHM FOR ROAD REGION SEGMENTATION BASED ON SEQUENTIAL MONTE-CARLO ESTIMATION
Directory of Open Access Journals (Sweden)
Zdenek Prochazka
2014-12-01
In recent years, many researchers and car makers have put intensive effort into the development of autonomous driving systems. Since visual information is the main modality used by human drivers, a camera mounted on a moving platform is a very important kind of sensor, and various computer vision algorithms for handling the vehicle's surroundings are under intensive research. Our final goal is to develop a vision-based lane detection system able to handle various types of road shapes, working on both structured and unstructured roads, ideally in the presence of shadows. This paper presents a modified road region segmentation algorithm based on sequential Monte-Carlo estimation. A detailed description of the algorithm is given, and evaluation results show that the proposed algorithm outperforms both the segmentation algorithm developed in our previous work and a conventional algorithm based on colour histograms.
Huang, Guanghui; Wan, Jianping; Chen, Hui
2013-02-01
Nonlinear stochastic differential equation models with unobservable state variables are now widely used in the analysis of PK/PD data. Unobservable state variables are usually estimated with the extended Kalman filter (EKF), and the unknown pharmacokinetic parameters are usually estimated by the maximum likelihood estimator (MLE). However, the EKF is inadequate for nonlinear PK/PD models, and the MLE is known to be biased downwards. In this paper, a density-based Monte Carlo filter (DMF) is proposed to estimate the unobservable state variables, and a simulation-based M-estimator is proposed to estimate the unknown parameters, with a genetic algorithm designed to search for the optimal values of the pharmacokinetic parameters. The performances of the EKF and DMF are compared through simulations for discrete-time and continuous-time systems, respectively, and the results based on the DMF are found to be more accurate than those given by the EKF with respect to mean absolute error.
FFAG-BASED HIGH-INTENSITY PROTON DRIVERS.
Energy Technology Data Exchange (ETDEWEB)
RUGGIERO, A.G.
2004-10-13
This paper summarises a feasibility study of a Fixed-Field Alternating-Gradient (FFAG) accelerator for protons in the one-to-few GeV energy range with an average beam power of several MW. The example adopted here is a beam energy of 1 GeV and an average power of 10 MW, but the same design approach can of course be used with other beam parameters. The design principles, merits and limitations of FFAG accelerators have been described previously. In particular, more advanced techniques to minimize magnet dimensions and field strength have recently been proposed. The design makes use of a novel concept, the Adjusted Field Profile, by which chromatic effects can be cancelled, making the betatron tunes and functions independent of the particle momentum. The example given here assumes a pulsed mode of operation at a repetition rate of 1.0 kHz.
Possible magnetism based on orbital motion of protons in ice
Yen, Fei; Liu, Yongsheng; Berlie, Adam
2016-01-01
A peak anomaly is observed in the magnetic susceptibility of solid H2O as a function of temperature near Tp = 60 K. At external magnetic fields below 2 kOe, the susceptibility becomes positive in the temperature range between 45 and 66 K. The magnetic field dependence of the susceptibility in the same temperature range exhibits an inverted ferromagnetic hysteresis loop superimposed on the diamagnetic signature of ice at fields below 600 Oe. We suggest that a fraction of the protons, capable of undergoing correlated tunneling along a hexagonal path without disrupting the stoichiometry of the lattice, create an induced magnetic field opposite to that created by the electrons upon application of an external field, countering the overall diamagnetism of the material.
SU-E-T-761: TOMOMC, A Monte Carlo-Based Planning VerificationTool for Helical Tomotherapy
Energy Technology Data Exchange (ETDEWEB)
Chibani, O; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States)
2015-06-15
Purpose: To present a new Monte Carlo code (TOMOMC) to calculate 3D dose distributions for patients undergoing helical tomotherapy treatments. TOMOMC performs CT-based dose calculations using the actual dynamic variables of the machine (couch motion, gantry rotation, and MLC sequences). Methods: TOMOMC is based on the GEPTS (Gamma Electron and Positron Transport System) general-purpose Monte Carlo system (Chibani and Li, Med. Phys. 29, 2002, 835). First, beam models for the Hi-Art tomotherapy machine were developed for the different beam widths (1, 2.5 and 5 cm). The beam model accounts for the exact geometry and composition of the different components of the linac head (target, primary collimator, jaws and MLCs). The beam models were benchmarked by comparing calculated PDDs and lateral/transversal dose profiles with ionization chamber measurements in water. See figures 1–3. The MLC model was tuned so that the tongue-and-groove effect and inter-leaf and intra-leaf transmission are modeled correctly. See figure 4. Results: By simulating the exact patient anatomy and the actual treatment delivery conditions (couch motion, gantry rotation and MLC sinogram), TOMOMC is able to calculate the 3D patient dose distribution, which is in principle more accurate than that from the treatment planning system (TPS) since it relies on the Monte Carlo method (the gold standard). Dose-volume parameters based on the Monte Carlo dose distribution can also be compared to those produced by the TPS. The attached figures show isodose lines for a H&N patient calculated by TOMOMC (transverse and sagittal views). Analysis of the differences between TOMOMC and the TPS is ongoing work for different anatomic sites. Conclusion: A new Monte Carlo code (TOMOMC) was developed for tomotherapy patient-specific QA. The next step in this project is implementing GPU computing to speed up the Monte Carlo simulation and make Monte Carlo-based treatment verification a practical solution.
Fujii, R.; Imahori, Y.; Nakakmura, M.; Takada, M.; Kamada, S.; Hamano, T.; Hoshi, M.; Sato, H.; Itami, J.; Abe, Y.; Fuse, M.
2012-12-01
The neutron source for Boron Neutron Capture Therapy (BNCT) is in a transition stage from nuclear reactors to accelerator-based neutron sources. Low-energy neutrons can be generated via the 7Li(p,n)7Be reaction using an accelerator-based neutron source, so the development of a small-scale, safe neutron source is within reach. However, the melting point of the lithium used for the target is low, and its durability under extended use at high proton beam currents has been questioned. To test its durability, we irradiated lithium with a proton beam at the same level as the actual current density and found no deterioration after 3 hours of continuous irradiation. This suggests that a lithium target can withstand proton irradiation at high current, confirming its suitability as an accelerator-based neutron source for BNCT.
2012-01-01
Proton Therapy Physics goes beyond current books on proton therapy to provide an in-depth overview of the physics aspects of this radiation therapy modality, eliminating the need to dig through information scattered in the medical physics literature. After tracing the history of proton therapy, the book summarizes the atomic and nuclear physics background necessary for understanding proton interactions with tissue. It describes the physics of proton accelerators, the parameters of clinical proton beams, and the mechanisms to generate a conformal dose distribution in a patient. The text then covers detector systems and measuring techniques for reference dosimetry, outlines basic quality assurance and commissioning guidelines, and gives examples of Monte Carlo simulations in proton therapy. The book moves on to discussions of treatment planning for single- and multiple-field uniform doses, dose calculation concepts and algorithms, and precision and uncertainties for nonmoving and moving targets. It also exami...
DYNAMIC PARAMETERS ESTIMATION OF INTERFEROMETRIC SIGNALS BASED ON SEQUENTIAL MONTE CARLO METHOD
Directory of Open Access Journals (Sweden)
M. A. Volynsky
2014-05-01
The paper deals with the sequential Monte Carlo method applied to the problem of estimating the parameters of interferometric signals. The method is based on a statistical approximation of the posterior probability density of the parameters. A detailed description of the algorithm is given. We show that the minimum residual between prediction and observation can be used as a criterion for selecting the ensemble elements generated at each step of the algorithm. An analysis of the influence of the input parameters on the performance of the algorithm has been conducted. It was found that the standard deviation of the amplitude estimation error for typical signals is about 10% of the maximum amplitude value, and the phase estimation error was shown to have a normal distribution. In particular, the influence of the number of selected parameter vectors on the estimation results was analysed: on the basis of simulation results for the considered class of signals, selecting 30% of the generated vectors is recommended, and increasing the number of generated vectors beyond 150 does not significantly improve the quality of the obtained estimates. The sequential Monte Carlo method is recommended for the dynamic processing of interferometric signals in cases where high robustness against non-linear changes of signal parameters and the influence of random noise is required.
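As a concrete illustration of the approach, here is a bootstrap sequential Monte Carlo (particle) filter tracking a slowly varying fringe amplitude. The signal model, particle count, drift and noise levels are assumptions made for this sketch, not the paper's settings:

```python
import math
import random

def smc_amplitude(observations, n_particles=300, noise_std=0.1, seed=0):
    """Bootstrap SMC estimate of a slowly varying fringe amplitude.
    Assumed model: y_k = a_k * cos(0.2 * k) + Gaussian noise, with a_k
    following a slow random walk."""
    rng = random.Random(seed)
    particles = [rng.uniform(0.0, 2.0) for _ in range(n_particles)]
    estimates = []
    for k, y in enumerate(observations):
        # Predict: diffuse the amplitude particles (random-walk prior).
        particles = [a + rng.gauss(0.0, 0.02) for a in particles]
        # Weight: Gaussian likelihood of the observation given each particle.
        pred = math.cos(0.2 * k)
        w = [math.exp(-0.5 * ((y - a * pred) / noise_std) ** 2)
             for a in particles]
        total = sum(w)
        if total == 0.0:                      # guard against weight underflow
            w = [1.0 / n_particles] * n_particles
        else:
            w = [x / total for x in w]
        estimates.append(sum(a * wi for a, wi in zip(particles, w)))
        # Resample (multinomial) to combat weight degeneracy.
        particles = rng.choices(particles, weights=w, k=n_particles)
    return estimates

rng = random.Random(42)
obs = [1.0 * math.cos(0.2 * k) + rng.gauss(0.0, 0.1) for k in range(100)]
est = smc_amplitude(obs)
```

The resampling step is the "selection of ensemble elements" the abstract refers to; the residual-based selection criterion it proposes would replace the multinomial resampling used here.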
Fission yield calculation using toy model based on Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Jubaidah, E-mail: jubaidah@student.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia); Physics Department, Faculty of Mathematics and Natural Science – State University of Medan. Jl. Willem Iskandar Pasar V Medan Estate – North Sumatera, Indonesia 20221 (Indonesia); Kurniadi, Rizal, E-mail: rijalk@fi.itb.ac.id [Nuclear Physics and Biophysics Division, Department of Physics, Bandung Institute of Technology. Jl. Ganesa No. 10 Bandung – West Java, Indonesia 40132 (Indonesia)
2015-09-30
The toy model is a new approximation for predicting fission yield distributions. It treats the nucleus as an elastic toy consisting of marbles, where the number of marbles represents the number of nucleons, A. This toy nucleus is able to imitate real nuclear properties. In this research, the toy nucleons are influenced only by a central force. A heavy toy nucleus induced by a toy nucleon splits into two fragments, which are called the fission yield; energy entanglement is neglected. The fission process in the toy model is illustrated by two Gaussian curves intersecting each other, described by five parameters: the scission point of the two curves (R_c), the means of the left and right curves (μ_L and μ_R), and their deviations (σ_L and σ_R). The fission yield distribution is analysed with a Monte Carlo simulation. The results show that variation in σ or μ can significantly shift the average frequency of asymmetric fission yields and also varies the range of the fission yield probability distribution, whereas variation in the iteration coefficient only changes the frequency of the fission yields. The Monte Carlo simulation of fission yields using the toy model successfully reproduces the tendency seen in experimental results, where the average light fission yield is in the range of 90
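The two intersecting Gaussians can be turned into a sampling sketch directly. In the abstract's notation these are μ_L, μ_R, σ_L, σ_R; the compound mass and all numerical values below are illustrative placeholders, not the paper's fitted parameters:

```python
import random

def sample_fission_yields(n_events, mu_l, mu_r, sigma_l, sigma_r, seed=0):
    """Toy-model fission yields: the light-fragment mass is drawn from
    either the light-peak Gaussian or (via mass conservation) the
    heavy-peak Gaussian; the complementary fragment carries the rest."""
    rng = random.Random(seed)
    a_total = 236                        # e.g. 235U + n compound nucleus
    light, heavy = [], []
    for _ in range(n_events):
        if rng.random() < 0.5:
            a_light = round(rng.gauss(mu_l, sigma_l))
        else:
            a_light = a_total - round(rng.gauss(mu_r, sigma_r))
        light.append(a_light)
        heavy.append(a_total - a_light)  # mass conservation
    return light, heavy

light, heavy = sample_fission_yields(10000, 96, 140, 5.0, 5.0)
mean_light = sum(light) / len(light)
```

Histogramming `light + heavy` reproduces the characteristic double-humped mass yield curve; widening σ_L or σ_R broadens the humps, matching the sensitivity described in the abstract.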
A method based on Monte Carlo simulation for the determination of the G(E) function.
Chen, Wei; Feng, Tiancheng; Liu, Jun; Su, Chuanying; Tian, Yanjie
2015-02-01
The G(E) function method is a spectrometric method for exposure dose estimation; this paper describes a Monte Carlo-based method to determine the G(E) function of a 4″ × 4″ × 16″ NaI(Tl) detector. Simulated spectra of various monoenergetic gamma rays in the region of 40–3200 keV, and the corresponding energy deposited in an air sphere in the energy region of the full-energy peak, were obtained using the Monte Carlo N-Particle Transport Code. The absorbed dose rate in air was obtained from the deposited energy and divided by the counts of the corresponding full-energy peak to get the G(E) function value at energy E in the spectra. The curve-fitting software 1stOpt was used to determine the coefficients of the G(E) function. Experimental results show that the dose rates calculated using the G(E) function determined by the authors' method agree well with the values obtained by an ionisation chamber, with a maximum deviation of 6.31%.
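Once the coefficients are fitted, applying the G(E) function is a channel-by-channel weighted sum over the measured spectrum. The polynomial-in-log(E) form below is one common parameterisation, and all coefficients and spectrum values are invented for illustration:

```python
import math

def g_function(e_kev, coeffs):
    """G(E) as a polynomial in log(E). The coefficients would come from
    curve fitting against Monte Carlo data; these are placeholders."""
    x = math.log(e_kev)
    return sum(c * x ** i for i, c in enumerate(coeffs))

def dose_rate(spectrum, coeffs):
    """Spectrum-to-dose conversion of the G(E) method: sum over channels
    of count rate times the dose-per-count weight G(E)."""
    return sum(counts * g_function(e, coeffs) for e, counts in spectrum)

# Hypothetical measured spectrum: (energy in keV, count rate in cps).
spectrum = [(662.0, 120.0), (1173.0, 40.0), (1332.0, 35.0)]
coeffs = [1.0e-6, 2.0e-7, 5.0e-8]      # illustrative only
d = dose_rate(spectrum, coeffs)
```

The conversion is linear in the counts, so doubling the count rate in a channel doubles that channel's dose contribution, which is what makes the method robust to spectrum shape.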
GMC: a GPU implementation of a Monte Carlo dose calculation based on Geant4.
Jahnke, Lennart; Fleckenstein, Jens; Wenz, Frederik; Hesser, Jürgen
2012-03-07
We present a GPU implementation called GMC (GPU Monte Carlo) of a low-energy, Geant4-based dose calculation using the CUDA programming interface. The classes for electron and photon interactions, as well as a new parallel particle transport engine, were implemented. A particle is processed not in a history-by-history manner but rather by an interaction-by-interaction method: every history is divided into steps that are then calculated in parallel by different kernels. The geometry package is currently limited to voxelized geometries. A modified parallel Mersenne twister was used to generate random numbers, and a random number repetition method on the GPU was introduced. All phantom results showed very good agreement between GPU and CPU simulations, with gamma passing rates of >97.5% for a 2%/2 mm gamma criterion. The mean acceleration on one GTX 580 for all cases, compared to Geant4 on one CPU core, was 4860. The mean number of histories per millisecond on the GPU for all cases was 658, leading to a total simulation time for one intensity-modulated radiation therapy dose distribution of 349 s. In conclusion, Geant4-based Monte Carlo dose calculations were significantly accelerated on the GPU.
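The 2%/2 mm gamma criterion used for validation here (and in several other abstracts above) can be illustrated with a toy 1D implementation. This is a brute-force global gamma; the grid spacing and test curves are made up:

```python
import math

def gamma_index_1d(ref, evl, spacing_mm, dose_tol=0.02, dist_mm=2.0):
    """1D global gamma index (default 2%/2 mm): for each reference point,
    the minimum over evaluated points of the combined dose-difference /
    distance-to-agreement metric. Dose tolerance is taken relative to
    the reference maximum (global normalisation)."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evl):
            dd = (de - dr) / (dose_tol * d_max)        # dose axis
            dx = (j - i) * spacing_mm / dist_mm        # distance axis
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

ref = [math.exp(-0.05 * i) for i in range(50)]   # toy depth-dose curve
evl = [d * 1.01 for d in ref]                    # 1%-rescaled copy
g = gamma_index_1d(ref, evl, spacing_mm=1.0)
passing = sum(1 for x in g if x <= 1.0) / len(g)
```

A point passes when its gamma value is at most 1; the "gamma passing rate" quoted in these abstracts is the fraction of points that pass.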
A Strategy for Finding the Optimal Scale of Plant Core Collection Based on Monte Carlo Simulation
Directory of Open Access Journals (Sweden)
Jiancheng Wang
2014-01-01
A core collection is an ideal resource for genome-wide association studies (GWAS), and a subcore collection is a subset of a core collection. A strategy is proposed for finding the optimal sampling percentage for a plant subcore collection based on Monte Carlo simulation. A cotton germplasm group of 168 accessions with 20 quantitative traits was used to construct subcore collections. A mixed linear model approach was used to eliminate environment and GE (genotype × environment) effects. The least distance stepwise sampling (LDSS) method, combining 6 commonly used genetic distances with the unweighted pair-group average (UPGMA) cluster method, was adopted to construct subcore collections. A homogeneous population assessing method was adopted to assess the validity of 7 evaluating parameters of the subcore collections. Monte Carlo simulation was conducted on the sampling percentage, the number of traits, and the evaluating parameters, and a method for "distilling free-form natural laws from experimental data" was adopted to find the best formula for determining the optimal sampling percentages. The results showed that the coincidence rate of range (CR) was the most valid evaluating parameter and was suitable to serve as a threshold for finding the optimal sampling percentage. Principal component analysis showed that subcore collections constructed with the optimal sampling percentages calculated by the present strategy were well representative.
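The coincidence rate of range (CR) singled out by this study is straightforward to compute: per trait, the ratio of the subcore's range to the full collection's range, averaged over traits. The synthetic accession data below are random placeholders, not the cotton germplasm traits:

```python
import random

def coincidence_rate_of_range(full, sub):
    """CR in percent: mean over traits of (subcore range / full range).
    `full` and `sub` are lists of accessions; each accession is a list
    of quantitative trait values."""
    n_traits = len(full[0])
    ratios = []
    for t in range(n_traits):
        col_full = [acc[t] for acc in full]
        col_sub = [acc[t] for acc in sub]
        r_full = max(col_full) - min(col_full)
        r_sub = max(col_sub) - min(col_sub)
        ratios.append(r_sub / r_full if r_full else 1.0)
    return 100.0 * sum(ratios) / n_traits

random.seed(7)
# 168 synthetic accessions with 5 made-up quantitative traits.
full = [[random.gauss(50.0, 10.0) for _ in range(5)] for _ in range(168)]
sub = random.sample(full, 30)           # a random 30-accession "subcore"
cr = coincidence_rate_of_range(full, sub)
```

In the Monte Carlo strategy of the abstract, CR would be evaluated over many random subcores at each sampling percentage, and the percentage at which CR stays above a chosen threshold is taken as optimal.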
CAD-based Monte Carlo Program for Integrated Simulation of Nuclear System SuperMC
Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin
2014-06-01
The Monte Carlo (MC) method has distinct advantages for simulating complicated nuclear systems and is envisioned as a routine method for nuclear design and analysis in the future. High-fidelity MC simulation coupled with multi-physics simulation has a significant impact on the safety, economy and sustainability of nuclear systems. However, great challenges to current MC methods and codes prevent their application in real engineering projects. SuperMC is a CAD-based Monte Carlo program for the integrated simulation of nuclear systems, developed by the FDS Team, China, making use of a hybrid MC-deterministic method and advanced computer technologies. The design aims, architecture and main methodology of SuperMC are presented in this paper. SuperMC2.1, the latest version, for neutron, photon and coupled neutron-photon transport calculations, has been developed and validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model. SuperMC is still evolving toward a general and routine tool for nuclear systems.
Simulation of Cone Beam CT System Based on Monte Carlo Method
Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing
2014-01-01
Adaptive Radiation Therapy (ART) was developed from Image-Guided Radiation Therapy (IGRT) and is the trend in photon radiation therapy. To make better use of Cone Beam CT (CBCT) images for ART, a CBCT system model was established with a Monte Carlo program and validated against measurement. The BEAMnrc program was adopted to model the kV x-ray tube, and both ISOURCE-13 and ISOURCE-24 were chosen to simulate the paths of the beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles at 1 cm depth in water were compared with the dose calculated by the DOSXYZnrc program. The calculated PDD agreed to better than 1% within a depth of 10 cm, and more than 85% of the points on the calculated lateral dose profiles were within 2%. The validated CBCT system model helps to improve CBCT image quality for dose verification in ART and to assess the concomitant dose risk of CBCT imaging.
Comparative Monte Carlo analysis of InP- and GaN-based Gunn diodes
García, S.; Pérez, S.; Íñiguez-de-la-Torre, I.; Mateos, J.; González, T.
2014-01-01
In this work, we report on Monte Carlo simulations studying the capability of InP- and GaN-based diodes with active regions of around 1 μm to generate Gunn oscillations. We compare the power spectral density of current sequences in diodes with and without a notch, for different lengths and two doping profiles. It is found that InP structures provide 400 GHz current oscillations at the fundamental harmonic in structures without a notch and around 140 GHz in notched diodes. GaN diodes, on the other hand, can operate up to 300 GHz at the fundamental harmonic, and when the notch is effective, a larger number of harmonics, reaching the terahertz range, are generated with higher spectral purity than in InP diodes. GaN-based diodes therefore offer a high-power alternative for sub-millimeter wave Gunn oscillators.
Quasi-Monte Carlo Simulation-Based SFEM for Slope Reliability Analysis
Institute of Scientific and Technical Information of China (English)
Yu Yuzhen; Xie Liquan; Zhang Bingyin
2005-01-01
Considering the stochastic spatial variation of geotechnical parameters over a slope, a Stochastic Finite Element Method (SFEM) is established by combining the Shear Strength Reduction (SSR) concept with quasi-Monte Carlo simulation. The shear-strength-reduction FEM is superior in many ways to the slice method based on limit equilibrium theory, so it is more powerful for assessing the reliability of global slope stability when combined with probability theory. To illustrate the performance of the proposed method, it is applied to a simple slope example. The simulation results show that the proposed method is effective for the reliability analysis of global slope stability without presupposing a potential slip surface.
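The quasi-Monte Carlo ingredient can be sketched independently of the FEM: replace pseudo-random draws with a low-discrepancy (Halton) sequence and count limit-state violations. The linear response surface, parameter distributions and the crude logistic approximation to the normal inverse CDF below are acknowledged simplifications for illustration, not the paper's slope model:

```python
import math

def halton(i, base):
    """i-th element of the van der Corput sequence in the given base;
    pairing bases 2 and 3 gives a 2D low-discrepancy point set."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def norm_ppf(p):
    """Crude logistic approximation to the standard normal inverse CDF,
    used only to keep this sketch short (a real code would use a proper
    rational approximation)."""
    return math.log(p / (1.0 - p)) / 1.702

def failure_probability(n=4096):
    """Quasi-MC estimate of P[factor of safety < 1] for a hypothetical
    slope with FoS = 0.8*c' + 0.5*tan(phi'), c' ~ N(1.2, 0.15) and
    tan(phi') ~ N(0.6, 0.08), independent."""
    fails = 0
    for i in range(1, n + 1):
        u1, u2 = halton(i, 2), halton(i, 3)     # low-discrepancy pair
        c = 1.2 + 0.15 * norm_ppf(u1)           # cohesion sample
        t = 0.6 + 0.08 * norm_ppf(u2)           # friction sample
        if 0.8 * c + 0.5 * t < 1.0:
            fails += 1
    return fails / n

pf = failure_probability()
```

In the SFEM of the abstract, the linear expression would be replaced by an SSR finite-element run per sample; the quasi-random sampling of the strength parameters is unchanged.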
Sign learning kink-based (SiLK) quantum Monte Carlo for molecular systems
Ma, Xiaoyao; Hall, Randall W.; Löffler, Frank; Kowalski, Karol; Bhaskaran-Nair, Kiran; Jarrell, Mark; Moreno, Juana
2015-01-01
The Sign Learning Kink (SiLK) based Quantum Monte Carlo (QMC) method is used to calculate the ab initio ground state energies for multiple geometries of the H$_{2}$O, N$_2$, and F$_2$ molecules. The method is based on Feynman's path integral formulation of quantum mechanics and has two stages. The first stage is called the learning stage and reduces the well-known QMC minus sign problem by optimizing the linear combinations of Slater determinants which are used in the second stage, a conventional QMC simulation. The method is tested using different vector spaces and compared to the results of other quantum chemical methods and to exact diagonalization. Our findings demonstrate that the SiLK method is accurate and reduces or eliminates the minus sign problem.
Niccolini, G.; Alcolea, J.
Solving the radiative transfer problem is common to many fields in astrophysics. With the increasing angular resolution of space- and ground-based telescopes (VLTI, HST), and with the instruments of the next decade (NGST, ALMA, ...), astrophysical objects reveal, and will certainly continue to reveal, complex spatial structures. It is therefore necessary to develop numerical tools able to solve the radiative transfer equation in three dimensions in order to model and interpret these observations. I present a 3D radiative transfer program using a new method for the construction of an adaptive spatial grid, based on the Monte Carlo method. With this tool, one can solve the continuum radiative transfer problem (e.g. in a dusty medium), compute the temperature structure of the considered medium and obtain the flux of the object (SED and images).
GPU-based Monte Carlo radiotherapy dose calculation using phase-space sources
Townson, Reid W.; Jia, Xun; Tian, Zhen; Graves, Yan Jiang; Zavgorodni, Sergei; Jiang, Steve B.
2013-06-01
A novel phase-space source implementation has been designed for graphics processing unit (GPU)-based Monte Carlo dose calculation engines. Short of full simulation of the linac head, using a phase-space source is the most accurate method to model a clinical radiation beam in dose calculations. However, in GPU-based Monte Carlo dose calculations where the computation efficiency is very high, the time required to read and process a large phase-space file becomes comparable to the particle transport time. Moreover, due to the parallelized nature of GPU hardware, it is essential to simultaneously transport particles of the same type and similar energies but separated spatially to yield a high efficiency. We present three methods for phase-space implementation that have been integrated into the most recent version of the GPU-based Monte Carlo radiotherapy dose calculation package gDPM v3.0. The first method is to sequentially read particles from a patient-dependent phase-space and sort them on-the-fly based on particle type and energy. The second method supplements this with a simple secondary collimator model and fluence map implementation so that patient-independent phase-space sources can be used. Finally, as the third method (called the phase-space-let, or PSL, method) we introduce a novel source implementation utilizing pre-processed patient-independent phase-spaces that are sorted by particle type, energy and position. Position bins located outside a rectangular region of interest enclosing the treatment field are ignored, substantially decreasing simulation time with little effect on the final dose distribution. The three methods were validated in absolute dose against BEAMnrc/DOSXYZnrc and compared using gamma-index tests (2%/2 mm above the 10% isodose). It was found that the PSL method has the optimal balance between accuracy and efficiency and thus is used as the default method in gDPM v3.0. Using the PSL method, open fields of 4 × 4, 10 × 10 and 30 × 30 cm
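The on-the-fly sorting idea, grouping particles by type and energy so that GPU threads in a batch transport similar particles together, can be sketched roughly as follows; the field names and bin edges are illustrative, not the gDPM phase-space format:

```python
import numpy as np

# Toy phase-space: particle type (0 = photon, 1 = electron) and energy (MeV).
rng = np.random.default_rng(1)
n = 10000
ptype = rng.integers(0, 2, n)
energy = rng.uniform(0.01, 6.0, n)
ebin = np.digitize(energy, np.linspace(0.0, 6.0, 25))   # coarse energy bins

# Lexicographic sort: primary key = particle type, secondary key = energy bin
# (np.lexsort treats the LAST key as primary), so each contiguous chunk of
# the sorted arrays is homogeneous in (type, energy bin).
order = np.lexsort((ebin, ptype))
sorted_type, sorted_bin = ptype[order], ebin[order]
print("types grouped:", bool(np.all(np.diff(sorted_type) >= 0)))
```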
Modelling of scintillator based flat-panel detectors with Monte-Carlo simulations
Reims, N.; Sukowski, F.; Uhlmann, N.
2011-01-01
Scintillator-based flat-panel detectors are state of the art in industrial X-ray imaging applications. Choosing the proper system and setup parameters for the vast range of different applications can be a time-consuming task, especially when developing new detector systems. Since the system behaviour cannot always be foreseen easily, Monte Carlo (MC) simulations are key to gaining further knowledge of system components and their behaviour under different imaging conditions. In this work we used two Monte Carlo based models to examine an indirect-converting flat-panel detector, specifically the Hamamatsu C9312SK. We focused on the signal generation in the scintillation layer and its influence on the spatial resolution of the whole system. The models differ significantly in their level of complexity. The first model gives a global description of the detector based on parameters characterizing the spatial resolution; with relatively small effort, a simulation model can be developed that matches the real detector in terms of signal transfer. The second model allows a more detailed insight into the system. It is based on the well-established cascade theory, i.e. it describes the detector as a cascade of elementary gain and scattering stages representing the built-in components and their signal transfer behaviour. Compared with the first model, the influence of single components, in particular the important light-spread behaviour in the scintillator, can be analysed in a more differentiated way. Although the implementation of the second model is more time consuming, both models have in common that only a relatively small number of system manufacturer parameters are needed. The results of both models were in good agreement with the measured parameters of the real system.
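In cascade theory, the system MTF is the product of the transfer functions of the individual stages. A hedged sketch with invented Gaussian stage widths (not measured C9312SK parameters):

```python
import numpy as np

# System MTF as a product of per-stage MTFs (cascade of blur stages).
freq = np.linspace(0.0, 5.0, 101)          # spatial frequency (lp/mm)

def gaussian_mtf(f, sigma_mm):
    """MTF of a Gaussian blur stage with spread sigma (in mm)."""
    return np.exp(-2.0 * (np.pi * f * sigma_mm) ** 2)

mtf_scintillator = gaussian_mtf(freq, 0.10)  # light spread in the scintillator
mtf_aperture = gaussian_mtf(freq, 0.04)      # pixel aperture / optical coupling
mtf_system = mtf_scintillator * mtf_aperture
print(f"system MTF at 1 lp/mm: {mtf_system[20]:.3f}")
```

Because the stages multiply, the cascaded MTF never exceeds the worst single stage, which is why the scintillator light spread dominates the spatial resolution.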
TPSPET—A TPS-based approach for in vivo dose verification with PET in proton therapy
Frey, K.; Bauer, J.; Unholtz, D.; Kurz, C.; Krämer, M.; Bortfeld, T.; Parodi, K.
2014-01-01
Since interest in ion irradiation for tumour therapy has increased significantly over the last few decades, intensive investigations are being performed to improve the accuracy of this form of patient treatment. One major goal is the development of methods for in vivo dose verification. In proton therapy, a PET (positron emission tomography)-based approach measuring the irradiation-induced tissue activation inside the patient has already been clinically implemented. The acquired PET images can be compared to an expectation, derived under the assumption of a correct treatment application, to validate the particle range and the lateral field position in vivo. In this work, TPSPET is introduced as a new approach to predict proton-irradiation-induced three-dimensional positron emitter distributions by means of the same algorithms as the clinical treatment planning system (TPS). In order to perform the additional activity calculations, reaction-channel-dependent input positron emitter depth distributions are necessary; these are determined by applying a modified filtering approach to the TPS reference depth dose profiles in water. This paper presents the implementation of TPSPET on the basis of the research treatment planning software 'treatment planning for particles'. The results are validated in phantom and patient studies against Monte Carlo simulations, and compared to β+-emitter distributions obtained from a slightly modified version of the originally proposed one-dimensional filtering approach applied to three-dimensional dose distributions. In contrast to previously introduced methods, TPSPET provides a faster implementation, its results show no sensitivity to lateral field extension, and the predicted β+-emitter densities are fully consistent with the planned treatment dose, as they are calculated by the same pencil beam algorithms. These findings suggest a large potential for the application of TPSPET to in vivo dose verification in the daily
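The filtering approach amounts to convolving a reference depth dose profile with a reaction-channel kernel to predict a β+-emitter depth profile. A toy version (the dose curve and kernel shape are invented for illustration, not the TPSPET kernels):

```python
import numpy as np

# Toy depth dose with a Bragg-peak-like maximum at 15 cm depth.
depth = np.linspace(0.0, 20.0, 401)                 # cm, 0.05 cm spacing
dose = np.exp(-((depth - 15.0) ** 2) / 0.5)

# Toy reaction-channel filter: a smoothing kernel whose off-centre placement
# shifts the predicted activity slightly upstream of the dose peak, mimicking
# the fall-off of emitter production before the Bragg peak.
kernel = np.exp(-np.linspace(-1.0, 2.0, 61) ** 2 / 0.3)
kernel /= kernel.sum()                              # normalize the filter
activity = np.convolve(dose, kernel, mode="same")

print(f"dose peak at {depth[np.argmax(dose)]:.2f} cm, "
      f"activity peak at {depth[np.argmax(activity)]:.2f} cm")
```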
Proton induced tautomeric switching in N-rich aromatics with tunable acid-base character
Centore, Roberto; Manfredi, Carla; Fusco, Sandra; Maglione, Cira; Carella, Antonio; Capobianco, Amedeo; Peluso, Andrea; Colonna, Daniele; Di Carlo, Aldo
2015-08-01
The acid-base properties of selected derivatives of the [1,2,4]triazolo[3,2-c][1,2,4]triazole fused aromatic system have been investigated by UV-vis spectroscopy. The neutral heterobicycles (HL) exhibit amphoteric behavior (they can deliver the N-H proton, forming the conjugate base L-, and can accept up to two protons, forming the species H2L+ and H3L++) and show an unprecedented tautomeric switching upon protonation, as revealed by single-crystal X-ray analysis and confirmed by theoretical calculations. By varying the groups attached to the heterocycle, a remarkable shift of the pKa values, by up to 5-6 units, is observed. In particular, with strongly electron-withdrawing groups at position 7 (e.g. p-nitrophenyl or pentafluorophenyl), the neutral compounds are stronger acids than phenol or p-nitrophenol.
Hydrogen bonding: a channel for protons to transfer through acid-base pairs.
Wu, Liang; Huang, Chuanhui; Woo, Jung-Je; Wu, Dan; Yun, Sung-Hyun; Seo, Seok-Jun; Xu, Tongwen; Moon, Seung-Hyeon
2009-09-10
Different from H(3)O(+) transport as in the vehicle mechanism, protons find another channel to transfer through the poorly hydrophilic interlayers in a hydrated multiphase membrane. This membrane was prepared from sulfonated poly(phthalazinone ether sulfone ketone) (SPPESK) and H(+)-form perfluorosulfonic resin (FSP); poorly hydrophilic, electrostatically interacting acid-base pairs constitute the interlayer between the two hydrophilic phases (FSP and SPPESK). Through hydrogen bonds forming and breaking between acid-base pairs and water molecules, protons transport directly through these poorly hydrophilic zones. Owing to this unique transfer mechanism, the multiphase membrane exhibits better electrochemical performance in fuel cell tests than pure FSP and Nafion-112 membranes: a proton conductivity of 0.09-0.12 S cm(-1) at 25 degrees C and a maximum power density of 990 mW cm(-2) at a current density of 2600 mA cm(-2) and a cell voltage of 0.38 V.
The models of proton assisted and the unassisted formation of CGC base triplets.
Medhi, Chitrani
2002-01-01
The triple helix is formed by combining double-stranded and single-stranded DNA at low pH and dissociates at high pH. Under such conditions, protonation of cytosine in the single strand is necessary for triplex formation, where the cytosine-guanine-cytosine (CGC+) base triplet stabilizes the triple helix. The mechanism of CGC+ triplet formation from guanine-cytosine (GC) and a protonated cytosine (C+) shows the importance of the N3 proton. Similarly, in the case of the unprotonated CGC triplet, the donor-acceptor H-bond at the N3 hydrogen of the cytosine analogue (C) initiates the interaction with GC. The correspondence between the two triplet models, CGC+ and CGC, unambiguously assigns protonation at N3 of cytosine at low pH as the first step in triplet formation, but a donor-acceptor triplet (CGC) can be designed without involving a proton in the Hoogsteen H-bond. Furthermore, the bases of the cytosine analogue are also capable of forming Watson-Crick (WC) H-bonds with guanine.
Lee, Pei-Yi; Liu, Yuan-Hao; Jiang, Shiang-Huei
2014-10-01
The (7)Li(p,xn)(7)Be nuclear reaction, driven by low-energy protons, can produce soft-spectrum neutrons for accelerator-based boron neutron capture therapy (AB-BNCT). Because the induced neutron field is relatively divergent, the relationship between the incident angle of the proton beam and the neutron beam quality was evaluated in this study. To provide an intense epithermal neutron beam, a beam-shaping assembly (BSA) was designed, and a modified Snyder head phantom was used in the calculations to evaluate the dosimetric performance. According to the calculated results, the intensity of epithermal neutrons increases with the proton incident angle; hence either the irradiation time or the required proton current can be reduced. When the incident angle of the 2.5-MeV proton beam is 120°, the required proton current is ∼13.3 mA for an irradiation time of half an hour.
Pattern-oriented Agent-based Monte Carlo simulation of Cellular Redox Environment
DEFF Research Database (Denmark)
Tang, Jiaowei; Holcombe, Mike; Boonen, Harrie C.M.
Research suggests that cellular redox environment could affect the phenotype and function of cells through a complex reaction network[1]. In cells, redox status is mainly regulated by several redox couples, such as Glutathione/glutathione disulfide (GSH/GSSG), Cysteine/ Cystine (CYS......, that there is a connection between extracellular and intracellular redox [2], whereas others oppose this view [3]. In general however, these experiments lack insight into the dynamics, complex network of reactions and transportation through cell membrane of redox. Therefore, current experimental results reveal......] could be very important factors. In our project, an agent-based Monte Carlo modeling [6] is offered to study the dynamic relationship between extracellular and intracellular redox and complex networks of redox reactions. In the model, pivotal redox-related reactions will be included, and the reactants...
A Monte Carlo-based treatment-planning tool for ion beam therapy
Böhlen, T T; Dosanjh, M; Ferrari, A; Haberer, T; Parodi, K; Patera, V; Mairani, A
2013-01-01
Ion beam therapy, as an emerging radiation therapy modality, requires continuous efforts to develop and improve tools for patient treatment planning (TP) and research applications. Dose and fluence computation algorithms using the Monte Carlo (MC) technique have served for decades as reference tools for accurate dose computations for radiotherapy. In this work, a novel MC-based treatment-planning (MCTP) tool for ion beam therapy using the pencil beam scanning technique is presented. It allows single-field and simultaneous multiple-field optimization for realistic patient treatment conditions and for dosimetric quality assurance for irradiation conditions at state-of-the-art ion beam therapy facilities. It employs iterative procedures that allow for the optimization of absorbed dose and relative biological effectiveness (RBE)-weighted dose using radiobiological input tables generated by external RBE models. Using a re-implementation of the local effect model (LEM), the MCTP tool is able to perform TP studies u...
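The iterative optimization step, finding pencil-beam weights whose summed dose matches a prescription, can be sketched as projected gradient descent on a least-squares objective; the influence matrix here is random, a stand-in for MC-scored dose contributions:

```python
import numpy as np

# Influence matrix A[v, b]: dose to voxel v per unit weight of pencil beam b.
# Random entries stand in for Monte Carlo-scored dose contributions.
rng = np.random.default_rng(2)
n_vox, n_beams = 60, 20
A = rng.uniform(0.0, 1.0, (n_vox, n_beams))
d_presc = np.full(n_vox, 2.0)                # uniform prescribed dose (Gy)

# Projected gradient descent on ||A w - d||^2 with nonnegative beam weights.
w = np.ones(n_beams)
for _ in range(2000):
    grad = A.T @ (A @ w - d_presc)
    w = np.maximum(w - 2e-3 * grad, 0.0)     # project back onto w >= 0

residual = np.linalg.norm(A @ w - d_presc) / np.linalg.norm(d_presc)
print(f"relative dose residual: {residual:.3f}")
```

A real tool would optimize RBE-weighted dose through radiobiological tables rather than a plain quadratic objective; the projection step is what keeps the beam weights physical.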
Ma, X. B.; Qiu, R. M.; Chen, Y. X.
2017-02-01
Uncertainties regarding fission fractions are essential in understanding antineutrino flux predictions in reactor antineutrino experiments. A new Monte Carlo-based method to evaluate the covariance coefficients between isotopes is proposed. The covariance coefficients are found to vary with reactor burnup and may change from positive to negative because of balance effects in fissioning. For example, between 235U and 239Pu, the covariance coefficient changes from 0.15 to -0.13. Using the equation relating fission fraction and atomic density, consistent uncertainties in the fission fraction and covariance matrix were obtained. The antineutrino flux uncertainty is 0.55%, which does not vary with reactor burnup. The new value is about 8.3% smaller.
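The Monte Carlo evaluation of covariance coefficients can be mimicked in miniature: perturb the fission fractions, renormalize them so they sum to one (the balance constraint), and compute correlations. All numbers below are illustrative, not the paper's reactor data:

```python
import numpy as np

# Nominal fission fractions for four isotopes (e.g. 235U, 239Pu, 238U, 241Pu).
nominal = np.array([0.55, 0.30, 0.07, 0.08])

# Perturb, then renormalize so fractions sum to one: this balance constraint
# is what pushes the large fractions toward negative mutual correlation.
rng = np.random.default_rng(3)
samples = np.abs(nominal + 0.02 * rng.standard_normal((20000, 4)))
samples /= samples.sum(axis=1, keepdims=True)

corr = np.corrcoef(samples.T)
print(f"corr(fraction 1, fraction 2) = {corr[0, 1]:+.2f}")
```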
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
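The proposed power analysis nests a percentile-bootstrap test of the indirect effect inside a Monte Carlo loop. A stripped-down sketch in the same spirit (effect sizes, sample size and replication counts are arbitrary choices, and only the simple mediation model is shown):

```python
import numpy as np

# Monte Carlo power estimate for the indirect effect a*b in a simple
# mediation model x -> m -> y, tested with a 95% percentile bootstrap CI.
rng = np.random.default_rng(4)
n, a, b = 100, 0.39, 0.39          # sample size and medium effect sizes
n_sim, n_boot = 100, 200

def slope(x, y):
    """OLS slope of y on x."""
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

rejections = 0
for _ in range(n_sim):
    x = rng.standard_normal(n)
    m = a * x + rng.standard_normal(n)
    y = b * m + rng.standard_normal(n)
    ab = np.empty(n_boot)
    for j in range(n_boot):
        idx = rng.integers(0, n, n)                    # resample cases
        ab[j] = slope(x[idx], m[idx]) * slope(m[idx], y[idx])
    lo, hi = np.percentile(ab, [2.5, 97.5])
    rejections += (lo > 0.0) or (hi < 0.0)             # CI excludes zero

power = rejections / n_sim
print(f"estimated power: {power:.2f}")
```

The bmem package additionally handles latent mediators, multiple groups and longitudinal models; the nesting of bootstrap inside simulation is the common core.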
Monte Carlo-based Noise Compensation in Coil Intensity Corrected Endorectal MRI
Lui, Dorothy; Haider, Masoom; Wong, Alexander
2015-01-01
Background: Prostate cancer is one of the most common forms of cancer found in males making early diagnosis important. Magnetic resonance imaging (MRI) has been useful in visualizing and localizing tumor candidates and with the use of endorectal coils (ERC), the signal-to-noise ratio (SNR) can be improved. The coils introduce intensity inhomogeneities and the surface coil intensity correction built into MRI scanners is used to reduce these inhomogeneities. However, the correction typically performed at the MRI scanner level leads to noise amplification and noise level variations. Methods: In this study, we introduce a new Monte Carlo-based noise compensation approach for coil intensity corrected endorectal MRI which allows for effective noise compensation and preservation of details within the prostate. The approach accounts for the ERC SNR profile via a spatially-adaptive noise model for correcting non-stationary noise variations. Such a method is useful particularly for improving the image quality of coil i...
Development of a Monte-Carlo based method for calculating the effect of stationary fluctuations
DEFF Research Database (Denmark)
Pettersen, E. E.; Demazire, C.; Jareteg, K.;
2015-01-01
This paper deals with the development of a novel method for performing Monte Carlo calculations of the effect, on the neutron flux, of stationary fluctuations in macroscopic cross-sections. The basic principle relies on the formulation of two equivalent problems in the frequency domain: one...... equivalent problems nevertheless requires the possibility to modify the macroscopic cross-sections, and we use the work of Kuijper, van der Marck and Hogenbirk to define group-wise macroscopic cross-sections in MCNP [1]. The method is illustrated in this paper at a frequency of 1 Hz, for which only the real...... part of the neutron balance plays a significant role and for driving fluctuations leading to neutron sources having the same sign in the two equivalent sub-critical problems. A semi-analytical diffusion-based solution is used to verify the implementation of the method on a test case representative...
Auxiliary-field based trial wave functions in quantum Monte Carlo simulations
Chang, Chia-Chen; Rubenstein, Brenda; Morales, Miguel
We propose a simple scheme for generating correlated multi-determinant trial wave functions for quantum Monte Carlo algorithms. The method is based on the Hubbard-Stratonovich transformation which decouples a two-body Jastrow-type correlator into one-body projectors coupled to auxiliary fields. We apply the technique to generate stochastic representations of the Gutzwiller wave function, and present benchmark results for the ground state energy of the Hubbard model in one dimension. Extensions of the proposed scheme to chemical systems will also be discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, 15-ERD-013.
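A closely related decoupling, the standard discrete Hubbard-Stratonovich transformation for a Hubbard U term (not the paper's Jastrow correlator specifically), can be verified numerically for scalar occupation numbers:

```python
import numpy as np

# Discrete Hubbard-Stratonovich identity for occupations n_up, n_dn in {0, 1}
# and U > 0:
#   exp(-dtau*U*n_up*n_dn)
#     = (1/2) exp(-dtau*U*(n_up + n_dn)/2) * sum_{s=+-1} exp(lam*s*(n_up - n_dn)),
# with cosh(lam) = exp(dtau*U/2). The two-body term becomes one-body factors
# coupled to an Ising-like auxiliary field s.
dtau, U = 0.05, 4.0
lam = np.arccosh(np.exp(dtau * U / 2.0))

for n_up in (0, 1):
    for n_dn in (0, 1):
        lhs = np.exp(-dtau * U * n_up * n_dn)
        rhs = 0.5 * np.exp(-dtau * U * (n_up + n_dn) / 2.0) * sum(
            np.exp(lam * s * (n_up - n_dn)) for s in (+1, -1))
        assert abs(lhs - rhs) < 1e-12, (n_up, n_dn)
print("discrete HS identity holds for all four occupations")
```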
Considerable variation in NNT - A study based on Monte Carlo simulations
DEFF Research Database (Denmark)
Wisloff, T.; Aalen, O. O.; Sønbø Kristiansen, Ivar
2011-01-01
Objective: The aim of this analysis was to explore the variation in measures of effect, such as the number-needed-to-treat (NNT) and the relative risk (RR). Study Design and Setting: We performed Monte Carlo simulations of therapies using binomial distributions based on different true absolute...... risk reductions (ARR), number of patients (n), and the baseline risk of adverse events (p(0)) as parameters and presented results in histograms with NNT and RR. We also estimated the probability of observing no or a negative treatment effect, given that the true effect is positive. Results: When RR...... is used to express treatment effectiveness, it has a regular distribution around the expected value for various values of true ARR, n, and p(0). The equivalent distribution of NNT is by definition nonconnected at zero and is also irregular. The probability that the observed treatment effectiveness is zero...
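The simulation setup can be reproduced in miniature: draw event counts from binomial distributions and inspect the observed NNT = 1/ARR, which is non-connected at zero and can even change sign. The parameters below are illustrative:

```python
import numpy as np

# Simulate treated/control event counts and look at the observed NNT = 1/ARR.
rng = np.random.default_rng(5)
n, p0, arr_true, reps = 200, 0.20, 0.05, 20000   # true NNT = 1/0.05 = 20

events_ctrl = rng.binomial(n, p0, reps)
events_trt = rng.binomial(n, p0 - arr_true, reps)
arr_obs = (events_ctrl - events_trt) / n
nnt_obs = np.divide(1.0, arr_obs, out=np.full(reps, np.inf), where=arr_obs != 0)

# Observed NNT blows up near zero ARR and goes negative whenever sampling
# variation reverses the apparent treatment effect.
p_reversed = np.mean(arr_obs < 0.0)
print(f"P(observed ARR < 0): {p_reversed:.3f}")
```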
GPU-based fast Monte Carlo simulation for radiotherapy dose calculation
Jia, Xun; Graves, Yan Jiang; Folkerts, Michael; Jiang, Steve B
2011-01-01
Monte Carlo (MC) simulation is commonly considered to be the most accurate dose calculation method in radiotherapy. However, its efficiency still requires improvement for many routine clinical applications. In this paper, we present our recent progress towards the development of a GPU-based MC dose calculation package, gDPM v2.0. It utilizes the parallel computation ability of a GPU to achieve high efficiency, while maintaining the same particle transport physics as in the original DPM code and hence the same level of simulation accuracy. In GPU computing, divergence of execution paths between threads can considerably reduce the efficiency. Since photons and electrons undergo different physics and hence attain different execution paths, we use a simulation scheme where photon transport and electron transport are separated to partially relieve the thread divergence issue. A high-performance random number generator and hardware linear interpolation are also utilized. We have also developed various components to hand...
Energy Technology Data Exchange (ETDEWEB)
Park, Peter C. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Schreibmann, Eduard; Roper, Justin; Elder, Eric; Crocker, Ian [Department of Radiation Oncology, Winship Cancer Institute of Emory University, Atlanta, Georgia (United States); Fox, Tim [Varian Medical Systems, Palo Alto, California (United States); Zhu, X. Ronald [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Dong, Lei [Scripps Proton Therapy Center, San Diego, California (United States); Dhabaan, Anees, E-mail: anees.dhabaan@emory.edu [Department of Radiation Oncology, Winship Cancer Institute of Emory University, Atlanta, Georgia (United States)
2015-03-15
Purpose: Computed tomography (CT) artifacts can severely degrade dose calculation accuracy in proton therapy. Prompted by the recently increased popularity of magnetic resonance imaging (MRI) in the radiation therapy clinic, we developed an MRI-based CT artifact correction method for improving the accuracy of proton range calculations. Methods and Materials: The proposed method replaces corrupted CT data by mapping CT Hounsfield units (HU number) from a nearby artifact-free slice, using a coregistered MRI. MRI and CT volumetric images were registered with use of 3-dimensional (3D) deformable image registration (DIR). The registration was fine-tuned on a slice-by-slice basis by using 2D DIR. Based on the intensity of paired MRI pixel values and HU from an artifact-free slice, we performed a comprehensive analysis to predict the correct HU for the corrupted region. For a proof-of-concept validation, metal artifacts were simulated on a reference data set. Proton range was calculated using reference, artifactual, and corrected images to quantify the reduction in proton range error. The correction method was applied to 4 unique clinical cases. Results: The correction method resulted in substantial artifact reduction, both quantitatively and qualitatively. On respective simulated brain and head and neck CT images, the mean error was reduced from 495 and 370 HU to 108 and 92 HU after correction. Correspondingly, the absolute mean proton range errors of 2.4 cm and 1.7 cm were reduced to less than 2 mm in both cases. Conclusions: Our MRI-based CT artifact correction method can improve CT image quality and proton range calculation accuracy for patients with severe CT artifacts.
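The HU-mapping step can be caricatured as learning a regression from MRI intensity to Hounsfield units on an artifact-free slice and using it to replace corrupted values. The linear tissue model and noise levels below are invented for illustration, not the paper's analysis:

```python
import numpy as np

# Learn HU = f(MRI) on an artifact-free slice (linear toy tissue model).
rng = np.random.default_rng(6)
mri_clean = rng.uniform(0.0, 1.0, 2000)
hu_clean = -200.0 + 400.0 * mri_clean + 5.0 * rng.standard_normal(2000)
coeffs = np.polyfit(mri_clean, hu_clean, deg=1)

# Apply the mapping to a corrupted region of a neighbouring slice.
mri_corrupt = rng.uniform(0.0, 1.0, 500)
hu_true = -200.0 + 400.0 * mri_corrupt                 # unknown ground truth
hu_artifact = hu_true + rng.normal(400.0, 100.0, 500)  # metal-artifact offset
hu_corrected = np.polyval(coeffs, mri_corrupt)

err_before = np.mean(np.abs(hu_artifact - hu_true))
err_after = np.mean(np.abs(hu_corrected - hu_true))
print(f"mean |HU error| before: {err_before:.0f}, after: {err_after:.1f}")
```

The real method pairs pixels through deformable registration and a comprehensive intensity analysis rather than a single global polynomial fit.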
Phase behavior of a PVAL-based polymer proton conductor
Energy Technology Data Exchange (ETDEWEB)
Vargas, M.A. [Univ. Popular del Cesar, Valledupar (Colombia). Dept. de Fisica; Vargas, R.A. [Dept. de Fisica, Univ. del Valle, Cali (Colombia); Mellander, B.-E. [Physics Dept., Chalmers Univ. of Technology, Gothenburg (Sweden)
2000-07-01
Solid protonic conductor gels were synthesized using poly(vinyl alcohol) (PVAL), hypophosphorous acid (H{sub 3}PO{sub 2}) and water as prime chemicals. The samples were characterized by means of impedance spectroscopy, fuel cell measurements, differential scanning calorimetry (DSC), thermogravimetry (TG) and X-ray diffraction. The electrical conductivity of the samples at room temperature showed a sensitive variation between 10{sup -6} and 10{sup -1} S/cm as the acid concentration was increased. Using the raw membranes as the electrolytic separator in a fuel cell, voltages up to 726 mV were obtained. DSC thermograms showed a well-resolved step anomaly associated with a glass transition for the samples with the highest acid concentrations: at about -130 C for the first set of samples and at about -120 C for the other sets, which indicates the amorphous character of the samples. TG traces confirmed that the membranes with higher acid concentrations have higher water contents and that the maximum rate of water removal is at about 50 C for all samples. X-ray spectra of the raw samples at room temperature show a large peak at about 2{theta} = 20, which is smaller for the higher acid content samples and increases when the samples are annealed at 70 C, indicating that the amorphousness of the PVAL complexes increases with the H{sub 3}PO{sub 2} content and drops with water removal. The results thus indicated the presence of a separate acid/water phase in the raw samples and an increasing polymer-chain involvement in the ionic mobility as the samples are thermally treated. (orig.)
Radiosurgery with photons or protons for benign and malignant tumours of the skull base: a review.
Amichetti, Maurizio; Amelio, Dante; Minniti, Giuseppe
2012-12-14
Stereotactic radiosurgery (SRS) is an important treatment option for intracranial lesions. Many studies have shown the effectiveness of photon-SRS for the treatment of skull base (SB) tumours; however, limited data are available for proton-SRS. Several photon-SRS techniques, including Gamma Knife, modified linear accelerators (Linac) and CyberKnife, have been developed, and several studies have compared treatment plan characteristics between protons and photons. The principles of classical radiobiology are similar for protons and photons, even though the two differ in their physical properties and interaction with matter, resulting in different dose distributions. Protons have special characteristics that allow normal tissues to be spared better than with photons, although their potential clinical superiority remains to be demonstrated. A critical analysis of the fundamental radiobiological principles, dosimetric characteristics, clinical results, and toxicity of proton- and photon-SRS for SB tumours is provided and discussed in an attempt to define the advantages and limits of each radiosurgical technique.
A research plan based on high intensity proton accelerator Neutron Science Research Center
Energy Technology Data Exchange (ETDEWEB)
Mizumoto, Motoharu [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment
1997-03-01
A plan called the Neutron Science Research Center (NSRC) has been proposed at JAERI. The center is a complex composed of research facilities based on a proton linac with an energy of 1.5 GeV and an average current of 10 mA. The research facilities will consist of a Thermal/Cold Neutron Facility, Neutron Irradiation Facility, Neutron Physics Facility, OMEGA/Nuclear Energy Facility, Spallation RI Beam Facility, Meson/Muon Facility and Medium Energy Experiment Facility, where the high-intensity proton beam and secondary particle beams such as neutron, pion, muon and unstable radioisotope (RI) beams generated from the proton beam will be utilized for innovative research in the fields of nuclear engineering and basic science. (author)
Picchioni, F.; Tricoli, V.; Carretta, N.
2000-01-01
Homogeneously sulfonated poly(styrene) (SPS) was prepared with various concentrations of sulfonic acid groups in the base polymer. Membranes cast from these materials were investigated with respect to proton conductivity and methanol permeability in the temperature range from 20°C to 60°C. It was foun
Full modelling of the MOSAIC animal PET system based on the GATE Monte Carlo simulation code
Merheb, C.; Petegnief, Y.; Talbot, J. N.
2007-02-01
within 9%. For a 410-665 keV energy window, the measured sensitivity for a centred point source was 1.53%, and the mouse and rat scatter fractions were 12.0% and 18.3%, respectively. The scattered photons produced outside the rat and mouse phantoms contributed 24% and 36% of the total simulated scattered coincidences. Simulated and measured single and prompt count rates agreed well for activities up to the electronic saturation at 110 MBq for the mouse and rat phantoms. Volumetric spatial resolution was 17.6 µL at the centre of the FOV, with differences of less than 6% between experimental and simulated spatial resolution values. The comprehensive evaluation of the Monte Carlo modelling of the Mosaic™ system demonstrates that the GATE package is adequately versatile and appropriate to accurately describe the response of an Anger-logic-based animal PET system.
An image-guided precision proton radiation platform for preclinical in vivo research
Ford, E.; Emery, R.; Huff, D.; Narayanan, M.; Schwartz, J.; Cao, N.; Meyer, J.; Rengan, R.; Zeng, J.; Sandison, G.; Laramore, G.; Mayr, N.
2017-01-01
There are many unknowns in the radiobiology of proton beams and other particle beams. We describe the development and testing of an image-guided low-energy proton system optimized for radiobiological research applications. A 50 MeV proton beam from an existing cyclotron was modified to produce collimated beams (as small as 2 mm in diameter). Ionization chamber and radiochromic film measurements were performed and benchmarked with Monte Carlo simulations (TOPAS). The proton beam was aligned with a commercially available CT image-guided x-ray irradiator device (SARRP, Xstrahl Inc.). To examine the alternative possibility of adapting a clinical proton therapy system, we performed Monte Carlo simulations of a range-shifted 100 MeV clinical beam. The proton beam exhibits a pristine Bragg peak at a depth of 21 mm in water with a dose rate of 8.4 Gy min-1 (3 mm depth). The energy of the incident beam can be modulated to lower energies while preserving the Bragg peak. The LET was 2.0 keV µm-1 (water surface), 16 keV µm-1 (Bragg peak), and 27 keV µm-1 (10% peak dose). Alignment of the proton beam with the SARRP system isocenter was within 0.24 mm. The width of the beam changes very little with depth. Monte Carlo-based calculations of dose using the CT image data set as input demonstrate the feasibility of in vivo use. Monte Carlo simulations of the modulated 100 MeV clinical proton beam show a significantly reduced Bragg peak. We demonstrate the feasibility of a proton beam integrated with a commercial x-ray image-guidance system for preclinical in vivo studies. To our knowledge this is the first description of an experimental image-guided proton beam for preclinical radiobiology research. It will enable in vivo investigations of radiobiological effects in proton beams.
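The quoted Bragg peak depth can be sanity-checked with the Bragg-Kleeman rule, a textbook power-law approximation relating proton energy to range in water; the rule and its constants are standard approximations, not taken from this study.

```python
def proton_range_water_cm(energy_mev, alpha=0.0022, p=1.77):
    """Approximate CSDA range of protons in water via the
    Bragg-Kleeman rule R = alpha * E**p. The constants alpha and p
    are common empirical values for water, used here only as a
    rough cross-check of the reported depths."""
    return alpha * energy_mev ** p
```

For a 50 MeV beam this gives roughly 2.2 cm, consistent with the 21 mm Bragg peak depth reported in the abstract.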
Doolan, P J; Testa, M; Sharp, G; Bentefour, E H; Royle, G; Lu, H-M
2015-03-07
A simple robust optimizer has been developed that can produce patient-specific calibration curves to convert x-ray computed tomography (CT) numbers to relative stopping powers (HU-RSP) for proton therapy treatment planning. The difference between a digitally reconstructed radiograph water-equivalent path length (DRRWEPL) map through the x-ray CT dataset and a proton radiograph (set as the ground truth) is minimized by optimizing the HU-RSP calibration curve. The function of the optimizer is validated with synthetic datasets that contain no noise, and its robustness is shown against CT noise. Application of the procedure is then demonstrated on a plastic and a real tissue phantom, with proton radiographs produced using a single detector. The mean errors between the DRRWEPL map and the proton radiograph using generic/optimized calibration curves were 1.8/0.4% for the plastic phantom and -2.1/-0.2% for the real tissue phantom. It was then demonstrated that these optimized calibration curves offer a better prediction of the water-equivalent path length at a therapeutic depth. We believe these promising results suggest that a single proton radiograph could be used to generate a patient-specific calibration curve as part of the current proton treatment planning workflow.
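A key observation behind this kind of optimizer is that each ray's WEPL is linear in the unknown per-HU-bin stopping powers: WEPL_r = sum_b L_rb * RSP_b, where L_rb is the ray's path length in HU bin b. The sketch below exploits that linearity with a plain least-squares solve over a 1-D slab geometry; this is a simplification assumed for illustration, not the paper's robust optimizer.

```python
import numpy as np

def optimize_hu_rsp(rays, wepl_measured, n_bins, hu_edges):
    """Recover per-HU-bin relative stopping powers from measured
    proton-radiograph WEPLs. Each ray is a list of (hu, thickness)
    slabs it traverses. Because WEPL is linear in the bin RSPs,
    the calibration curve follows from linear least squares
    (a simplified stand-in for the paper's optimizer)."""
    # Path-length matrix L: one row per ray, one column per HU bin.
    L = np.zeros((len(rays), n_bins))
    for r, slabs in enumerate(rays):
        for hu, thickness in slabs:
            b = np.clip(np.digitize(hu, hu_edges) - 1, 0, n_bins - 1)
            L[r, b] += thickness
    rsp, *_ = np.linalg.lstsq(L, np.asarray(wepl_measured), rcond=None)
    return rsp
```

With enough rays sampling every HU bin, the solve is well-conditioned; the real procedure additionally constrains the curve to stay physically monotonic.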
A zero-variance based scheme for Monte Carlo criticality simulations
Christoforou, S.
2010-01-01
The ability of the Monte Carlo method to solve particle transport problems by simulating the particle behaviour makes it a very useful technique in nuclear reactor physics. However, the statistical nature of Monte Carlo implies that there will always be a variance associated with the estimate obtain
Efficiency of respiratory-gated delivery of synchrotron-based pulsed proton irradiation
Energy Technology Data Exchange (ETDEWEB)
Tsunashima, Yoshikazu; Vedam, Sastry; Dong, Lei; Bues, Martin; Balter, Peter; Smith, Alfred; Mohan, Radhe [Department of Radiation Physics, University of Texas M D Anderson Cancer Center, 1515 Holcombe Blvd., Unit 94, Houston, TX 77030 (United States); Umezawa, Masumi [Hitachi America Ltd, PTC-H Construction Site, 7707 Fannin Street, Suite 203, Houston, TX 77054 (United States); Sakae, Takeji [Proton Medical Research Center, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki, 305-0801 (Japan)], E-mail: svedam@mdanderson.org
2008-04-07
Significant differences exist in respiratory-gated proton beam delivery with a synchrotron-based accelerator system when compared to photon therapy with a conventional linear accelerator. Delivery of protons with a synchrotron accelerator is governed by a magnet excitation cycle pattern. Optimal synchronization of the magnet excitation cycle pattern with the respiratory motion pattern is critical to the efficiency of respiratory-gated proton delivery. There has been little systematic analysis to optimize the accelerator's operational parameters to improve gated treatment efficiency. The goal of this study was to estimate the overall efficiency of respiratory-gated synchrotron-based proton irradiation through realistic simulation. Using 62 respiratory motion traces from 38 patients, we simulated respiratory gating for duty cycles of 30%, 20% and 10% around peak exhalation for various fixed and variable magnet excitation patterns. In each case, the time required to deliver 100 monitor units in both non-gated and gated irradiation scenarios was determined. Based on results from this study, the minimum time required to deliver 100 MU was 1.1 min for non-gated irradiation. For respiratory-gated delivery at a 30% duty cycle around peak exhalation, corresponding average delivery times were typically three times longer with a fixed magnet excitation cycle pattern. However, when a variable excitation cycle was allowed in synchrony with the patient's respiratory cycle, the treatment time only doubled. Thus, respiratory-gated delivery of synchrotron-based pulsed proton irradiation is feasible and more efficient when a variable magnet excitation cycle pattern is used.
Proton-Pump Mechanism in Retinal Schiff Base: On the molecular structure of the M-state
Datta, A; Datta, Ayan; Pati, Swapan K.
2005-01-01
Theoretical characterizations of the various intermediates in the proton pump cycle of the retinal Schiff base in Halobacterium salinarium have been performed. Contrary to the general belief over the years that the most stable intermediate, the M-state, is a non-protonated cis-isomer, we find that the M-state is a polarized cis-isomer stabilized by interactions of the dissociating proton with the pi-electrons. The role of the proton in the pump cycle is found to be profound, leading to stabilization or, in certain cases, destabilization of the intermediates. We propose the chemical structure of the M-state for the first time.
Light activation of the isomerization and deprotonation of the protonated Schiff base retinal.
Kubli-Garfias, Carlos; Salazar-Salinas, Karim; Perez-Angel, Emily C; Seminario, Jorge M
2011-10-01
We perform an ab initio analysis of the photoisomerization of the protonated Schiff base of retinal (PSB-retinal) from 11-cis to 11-trans rotating the C10-C11=C12-C13 dihedral angle from 0° (cis) to -180° (trans). We find that the retinal molecule shows the lowest rotational barrier (0.22 eV) when its charge state is zero as compared to the barrier for the protonated molecule which is ∼0.89 eV. We conclude that rotation most likely takes place in the excited state of the deprotonated retinal. The addition of a proton creates a much larger barrier implying a switching behavior of retinal that might be useful for several applications in molecular electronics. All conformations of the retinal compound absorb in the green region with small shifts following the dihedral angle rotation; however, the Schiff base of retinal (SB-retinal) at trans-conformation absorbs in the violet region. The rotation of the dihedral angle around the C11=C12 π-bond affects the absorption energy of the retinal and the binding energy of the SB-retinal with the proton at the N-Schiff; the binding energy is slightly lower at the trans-SB-retinal than at other conformations of the retinal.
CT based treatment planning system of proton beam therapy for ocular melanoma
Energy Technology Data Exchange (ETDEWEB)
Nakano, Takashi E-mail: tnakano@med.gunma-u.ac.jp; Kanai, Tatsuaki; Furukawa, Shigeo; Shibayama, Kouichi; Sato, Sinichiro; Hiraoka, Takeshi; Morita, Shinroku; Tsujii, Hirohiko
2003-09-01
A computed tomography (CT)-based treatment planning system for proton beam therapy was established specifically for ocular melanoma treatment. Collimated proton beams with a maximum energy of 70 MeV are applied for the treatment of ocular melanoma. The vertical proton beam line has a range modulator for spreading out the beam, a multi-leaf collimator, an aperture, a light-beam localizer, a field light, and an X-ray verification system. The treatment planning program includes: an eye model; selection of the best direction of gaze; design of the aperture shape; determination of the proton range and range modulation necessary to encompass the target volume; and indication of the relative positions of the eyes, the beam center, and the created beam aperture. Tumor contours are extracted from CT/MRI images of 1 mm thickness with the assistance of information from fundus photography and ultrasonography. The CT image-based treatment system for ocular melanoma is useful for Japanese patients, who tend to have a thick choroid membrane, in terms of dose sparing to the skin and normal organs in the eye. The characteristics of the system and its merits and demerits are reported.
Ultrafast cone-beam CT scatter correction with GPU-based Monte Carlo simulation
Directory of Open Access Journals (Sweden)
Yuan Xu
2014-03-01
Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to finish the whole process, including both scatter correction and reconstruction, automatically within 30 seconds. Methods: The method consists of six steps: (1) FDK reconstruction using raw projection data; (2) rigid registration of the planning CT to the FDK results; (3) MC scatter calculation at sparse view angles using the planning CT; (4) interpolation of the calculated scatter signals to the other angles; (5) removal of scatter from the raw projections; (6) FDK reconstruction using the scatter-corrected projections. In addition to using a GPU to accelerate the MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce computation time. A novel denoising algorithm eliminates the MC noise in the simulated scatter images caused by the low photon numbers. The method is validated on a simulated head-and-neck case with 364 projection angles. Results: We examined the variation of the scatter signal among projection angles using Fourier analysis. Scatter images at 31 angles are found to be sufficient to restore those at all angles with < 0.1% error. For the simulated patient case with a resolution of 512 × 512 × 100, we simulated 5 × 10^6 photons per angle. The total computation time is 20.52 seconds on an Nvidia GTX Titan GPU, and the time for each step is 2.53, 0.64, 14.78, 0.13, 0.19, and 2.25 seconds, respectively. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. It accomplishes the whole procedure of scatter correction and reconstruction within 30 seconds.
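Steps 4 and 5 of the pipeline rest on the scatter signal varying smoothly with gantry angle, so sparse-angle MC estimates can be interpolated to every projection and subtracted. A minimal sketch, with scatter reduced to one scalar per angle for brevity (per-pixel scatter maps interpolate the same way); the periodic wrap-around handling is an implementation assumption:

```python
import numpy as np

def interpolate_scatter(sparse_angles_deg, sparse_scatter, all_angles_deg):
    """Step 4: interpolate MC scatter estimates from sparse gantry
    angles to all projection angles, with periodic extension so the
    interpolation wraps correctly across 360 degrees."""
    a = np.concatenate([sparse_angles_deg, [sparse_angles_deg[0] + 360.0]])
    s = np.concatenate([sparse_scatter, [sparse_scatter[0]]])
    return np.interp(np.asarray(all_angles_deg) % 360.0, a, s)

def correct_projections(raw, scatter):
    """Step 5: subtract the estimated scatter from the raw
    projections, clipping at zero to avoid negative intensities."""
    return np.maximum(np.asarray(raw) - np.asarray(scatter), 0.0)
```

The corrected projections then feed straight into the second FDK reconstruction (step 6).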
A Multiple Scattering Theory for Proton Penetration
Institute of Scientific and Technical Information of China (English)
YANG Dai-Lun; WU Zhang-Wen; JIANG Steve-Bin; LUO Zheng-Ming
2004-01-01
We extend the electron small-angle multiple scattering theory to proton penetration. After introducing the concept of narrow energy spectra, the proton energy loss process is included in the proton deep penetration theory, which then precisely describes the whole process of proton penetration. Compared to the Monte Carlo method, this method maintains comparable precision and possesses much higher computational efficiency. Thus, it shows the real feasibility of applying this algorithm to proton clinical radiation therapy.
Arai, Kazuhiro; Kadoya, Noriyuki; Kato, Takahiro; Endo, Hiromitsu; Komori, Shinya; Abe, Yoshitomo; Nakamura, Tatsuya; Wada, Hitoshi; Kikuchi, Yasuhiro; Takai, Yoshihiro; Jingu, Keiichi
2017-01-01
The aim of this study was to confirm On-Board Imager cone-beam computed tomography (CBCT) using the histogram-matching algorithm as a useful method for proton dose calculation. We studied one head and neck phantom, one pelvic phantom, and ten patients with head and neck cancer treated using intensity-modulated radiation therapy (IMRT) and proton beam therapy. We modified Hounsfield unit (HU) values of CBCT and generated two modified CBCTs (mCBCT-RR, mCBCT-DIR) using the histogram-matching algorithm: modified CBCT with rigid registration (mCBCT-RR) and that with deformable image registration (mCBCT-DIR). Rigid and deformable image registration were applied to match the CBCT to planning CT. To evaluate the accuracy of the proton dose calculation, we compared dose differences in the dosimetric parameters (D2% and D98%) for clinical target volume (CTV) and planning target volume (PTV). We also evaluated the accuracy of the dosimetric parameters (Dmean and D2%) for some organs at risk, and compared the proton ranges (PR) between planning CT (reference) and CBCT or mCBCTs, and the gamma passing rates of CBCT and mCBCTs. For patients, the average dose and PR differences of mCBCTs were smaller than those of CBCT. Additionally, the average gamma passing rates of mCBCTs were larger than those of CBCT (e.g., 94.1±3.5% in mCBCT-DIR vs. 87.8±7.4% in CBCT). We evaluated the accuracy of the proton dose calculation in CBCT and mCBCTs for two phantoms and ten patients. Our results showed that HU modification using the histogram-matching algorithm could improve the accuracy of the proton dose calculation.
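The HU modification step rests on classic histogram matching: CBCT values are remapped so their cumulative distribution matches the planning CT's. The sketch below shows the core CDF-matching operation (registration steps omitted); the study's exact variant may differ, so treat this as a generic illustration.

```python
import numpy as np

def histogram_match(cbct, planning_ct):
    """Remap CBCT voxel values so their cumulative histogram matches
    that of the planning CT (generic CDF matching; the registration
    to the planning CT is assumed already done)."""
    src = np.asarray(cbct, float).ravel()
    ref = np.asarray(planning_ct, float).ravel()
    src_sorted = np.sort(src)
    # Percentile of each CBCT voxel within the CBCT histogram...
    ranks = np.searchsorted(src_sorted, src, side="right") / src.size
    # ...mapped to the planning-CT value at the same percentile.
    matched = np.quantile(ref, np.clip(ranks, 0.0, 1.0))
    return matched.reshape(np.shape(cbct))
```

After this remapping, the modified CBCT (mCBCT) carries planning-CT-like HU values, which is what makes the downstream proton dose calculation more accurate.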
Proton beam therapy; Cancer - proton therapy; Radiation therapy - proton therapy; Prostate cancer - proton therapy ... that use x-rays to destroy cancer cells, proton therapy uses a beam of special particles called ...
Monte carlo method-based QSAR modeling of penicillins binding to human serum proteins.
Veselinović, Jovana B; Toropov, Andrey A; Toropova, Alla P; Nikolić, Goran M; Veselinović, Aleksandar M
2015-01-01
The binding of penicillins to human serum proteins was modeled with optimal descriptors based on the Simplified Molecular Input-Line Entry System (SMILES). The concentrations of protein-bound drug for 87 penicillins expressed as percentage of the total plasma concentration were used as experimental data. The Monte Carlo method was used as a computational tool to build up the quantitative structure-activity relationship (QSAR) model for penicillins binding to plasma proteins. One random data split into training, test and validation set was examined. The calculated QSAR model had the following statistical parameters: r(2) = 0.8760, q(2) = 0.8665, s = 8.94 for the training set and r(2) = 0.9812, q(2) = 0.9753, s = 7.31 for the test set. For the validation set, the statistical parameters were r(2) = 0.727 and s = 12.52, but after removing the three worst outliers, the statistical parameters improved to r(2) = 0.921 and s = 7.18. SMILES-based molecular fragments (structural indicators) responsible for the increase and decrease of penicillins binding to plasma proteins were identified. The possibility of using these results for the computer-aided design of new penicillins with desired binding properties is presented.
The development of GPU-based parallel PRNG for Monte Carlo applications in CUDA Fortran
Kargaran, Hamed; Minuchehr, Abdolhamid; Zolfaghari, Ahmad
2016-04-01
The implementation of Monte Carlo simulation in CUDA Fortran requires fast random number generation with good statistical properties on the GPU. In this study, a GPU-based parallel pseudo-random number generator (GPPRNG) has been proposed for use in high-performance computing systems. According to the type of GPU memory usage, the GPU scheme is divided into two work modes, GLOBAL_MODE and SHARED_MODE. To generate parallel random numbers based on the independent sequence method, a combination of the middle-square method and a chaotic map, along with the Xorshift PRNG, has been employed. Implementation of the developed PRNG on a single GPU showed speedups of 150x and 470x (with respect to the speed of the PRNG on a single CPU core) for GLOBAL_MODE and SHARED_MODE, respectively. To evaluate the accuracy of the developed GPPRNG, its performance was compared to that of some other commercially available PRNGs, such as those of MATLAB and FORTRAN and the Miller-Park algorithm, by employing specific standard tests. The results of this comparison showed that the GPPRNG developed in this study can be used as a fast and accurate tool for computational science applications.
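The Xorshift core of such a generator is a few bit operations per draw, which is what makes it attractive on a GPU. Below is a standard 32-bit xorshift (Marsaglia's 13/17/5 shifts) plus a middle-square step of the kind the abstract says is combined with a chaotic map to seed independent per-thread sequences; how exactly the paper combines them is not specified, so the seeding helper is an assumption.

```python
def xorshift32(seed):
    """Minimal 32-bit xorshift generator (Marsaglia's 13/17/5 variant)
    of the kind the GPPRNG builds on; yields uniform floats in [0, 1).
    The state must be nonzero, hence the `or 1` guard."""
    state = (seed & 0xFFFFFFFF) or 1
    while True:
        state ^= (state << 13) & 0xFFFFFFFF
        state ^= state >> 17
        state ^= (state << 5) & 0xFFFFFFFF
        yield state / 2**32

def middle_square_seed(seed, digits=8):
    """Middle-square step to decorrelate per-sequence seeds (an
    assumed sketch of the paper's seed-mixing, which also involves
    a chaotic map)."""
    sq = str(seed * seed).zfill(2 * digits)
    mid = len(sq) // 2
    return int(sq[mid - digits // 2: mid + digits // 2]) or 1
```

On a GPU each thread would run its own xorshift state seeded through the mixing step, which is the "independent sequence" parallelization the abstract describes.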
Zhai, Xue; Fei, Cheng-Wei; Choy, Yat-Sze; Wang, Jian-Jun
2017-01-01
To improve the accuracy and efficiency of computational models for complex structures, a stochastic model updating (SMU) strategy is proposed that combines an improved response surface model (IRSM) and an advanced Monte Carlo (MC) method, based on experimental static tests, prior information and uncertainties. First, the IRSM and its mathematical model are developed with emphasis on the moving least-squares method, and the advanced MC simulation method is studied based on the Latin hypercube sampling method. The SMU procedure is then presented with an experimental static test for a complex structure. SMU of a simply-supported beam and an aeroengine stator system (casings) was implemented to validate the proposed IRSM and advanced MC simulation method. The results show that (1) the SMU strategy holds high computational precision and efficiency for the SMU of complex structural systems; (2) the IRSM is demonstrated to be an effective model, since its SMU time is far less than that of the traditional response surface method, which promises to improve the computational speed and accuracy of SMU; and (3) the advanced MC method observably decreases the number of samples drawn from finite element simulations and the elapsed time of SMU. The efforts of this paper provide a promising SMU strategy for complex structures and enrich the theory of model updating.
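The sample savings in point (3) come from Latin hypercube sampling: each dimension is split into equal-probability strata, one point is drawn per stratum, and strata are randomly paired across dimensions, so the design covers the space far more evenly than plain random sampling. A minimal sketch on the unit hypercube (the mapping to physical parameter ranges is omitted):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin hypercube sample on [0, 1)^n_dims: exactly one point per
    stratum in every dimension, with stratum order shuffled
    independently per dimension."""
    rng = np.random.default_rng(rng)
    # One uniform draw inside each of the n_samples strata, per dim.
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_dims))) / n_samples
    # Shuffle the stratum assignment independently in every dimension.
    for d in range(n_dims):
        u[:, d] = u[rng.permutation(n_samples), d]
    return u
```

Each row is then scaled to the physical parameter ranges and fed to a finite element run, which is why far fewer simulations are needed than with crude MC.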
Saha, Krishnendu; Straus, Kenneth J; Chen, Yu; Glick, Stephen J
2014-08-28
To maximize sensitivity, it is desirable that ring positron emission tomography (PET) systems dedicated to imaging the breast have a small bore. Unfortunately, parallax error then causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo simulation software was utilized to accurately model the system matrix for a breast PET system. A strategy for increasing the count statistics in the system matrix computation and for reducing the system element storage space was used by calculating only a subset of matrix elements and then estimating the rest of the elements by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at a 45% reduced noise level and a 1.5- to 3-fold resolution performance improvement when compared to MLEM reconstruction using a simple line-integral model. The GATE-based system matrix reconstruction technique promises to improve resolution and noise performance and reduce image distortion at the FOV periphery compared to line-integral-based system matrix reconstruction.
Energy Technology Data Exchange (ETDEWEB)
Baba, Justin S [ORNL; John, Dwayne O [ORNL; Koju, Vijay [ORNL
2015-01-01
The propagation of light in turbid media is an active area of research with relevance to numerous investigational fields, e.g., biomedical diagnostics and therapeutics. The statistical random-walk nature of photon propagation through turbid media is ideal for computation-based modeling and simulation. Ready access to supercomputing resources provides a means for attaining brute-force solutions to stochastic light-matter interactions entailing scattering, by facilitating timely propagation of a sufficient number (>10 million) of photons while tracking characteristic parameters based on the incorporated physics of the problem. One such model, which works well for isotropic scatter but fails for anisotropic scatter (the case for many biomedical scattering problems), is the diffusion approximation. In this report, we address this by utilizing Berry phase (BP) evolution as a means for capturing the anisotropic scattering characteristics of samples at the shallow depths where the diffusion approximation fails. We extend the polarization-sensitive Monte Carlo method of Ramella-Roman et al. to include the computationally intensive tracking of photon trajectory in addition to polarization state at every scattering event. To speed up the computations, which entail the appropriate rotations of reference frames, the code was parallelized using OpenMP. The results presented reveal that BP is strongly correlated with the photon penetration depth, thus potentiating the possibility of polarimetric depth-resolved characterization of highly scattering samples, e.g., biological tissues.
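The random-walk skeleton underneath such a simulation is simple: exponentially distributed free paths between scattering events and a new direction at each event. The sketch below uses isotropic scattering and tracks each photon's maximum depth; the polarization-state and Berry-phase bookkeeping that the report adds on top (and any anisotropic phase function) are omitted.

```python
import numpy as np

def photon_random_walk(n_photons, mu_s, n_steps, rng=None):
    """Isotropic-scatter photon random walk in an infinite turbid
    medium. Step lengths follow the exponential free-path
    distribution with scattering coefficient mu_s (inverse length);
    returns each photon's maximum z penetration. Skeleton only:
    polarization, absorption and boundaries are omitted."""
    rng = np.random.default_rng(rng)
    pos = np.zeros((n_photons, 3))
    max_depth = np.zeros(n_photons)
    for _ in range(n_steps):
        step = rng.exponential(1.0 / mu_s, n_photons)
        # Isotropic new direction at every scattering event.
        cos_t = rng.uniform(-1.0, 1.0, n_photons)
        phi = rng.uniform(0.0, 2 * np.pi, n_photons)
        sin_t = np.sqrt(1.0 - cos_t**2)
        pos += step[:, None] * np.column_stack(
            [sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
        max_depth = np.maximum(max_depth, pos[:, 2])
    return max_depth
```

A polarized variant would additionally rotate a Stokes vector and the local reference frame at every event, which is exactly the per-event bookkeeping the report parallelizes with OpenMP.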
Seismic wavefield imaging based on the replica exchange Monte Carlo method
Kano, Masayuki; Nagao, Hiromichi; Ishikawa, Daichi; Ito, Shin-ichi; Sakai, Shin'ichi; Nakagawa, Shigeki; Hori, Muneo; Hirata, Naoshi
2017-01-01
Earthquakes sometimes cause serious disasters not only directly by ground motion itself but also secondarily by infrastructure damage, particularly in densely populated urban areas that have capital functions. To reduce the number and severity of secondary disasters, it is important to evaluate seismic hazards rapidly by analysing the seismic responses of individual structures to input ground motions. We propose a method that integrates physics-based and data-driven approaches in order to obtain a seismic wavefield for use as input to a seismic response analysis. The new contribution of this study is the use of the replica exchange Monte Carlo (REMC) method, which is one of the Markov chain Monte Carlo (MCMC) methods, for estimation of a seismic wavefield, together with a 1-D local subsurface structure and source information. Numerical tests were conducted to verify the proposed method, using synthetic observation data obtained from analytical solutions for two horizontally layered subsurface structure models. The geometries of the observation sites were determined from the dense seismic observation array called the Metropolitan Seismic Observation network, which has been in operation in the Tokyo metropolitan area in Japan since 2007. The results of the numerical tests show that the proposed method is able to search the parameters related to the source and the local subsurface structure in a broader parameter space than the Metropolis method, which is an ordinary MCMC method. The proposed method successfully reproduces a seismic wavefield consistent with a true wavefield. In contrast, ordinary kriging, which is a classical data-driven interpolation method for spatial data, is hardly able to reproduce a true wavefield, even in the low frequency bands. This suggests that it is essential to employ both physics-based and data-driven approaches in seismic wavefield imaging, utilizing seismograms from a dense seismic array. The REMC method, which provides not only
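The broader parameter-space search that REMC achieves over the plain Metropolis method comes from running one chain per temperature and periodically swapping states between neighbouring temperatures, so hot replicas ferry the cold chain out of local optima. A generic 1-D sketch of the algorithm (the study applies it to source and subsurface parameters; everything here, including the temperature ladder, is illustrative):

```python
import numpy as np

def remc_sample(log_post, init, temps, n_iters, step=0.5, rng=None):
    """Replica exchange Monte Carlo for a scalar parameter: one
    Metropolis chain per temperature in `temps`, plus neighbour swap
    attempts each iteration. Returns the coldest chain's samples.
    Generic sketch, not the study's implementation."""
    rng = np.random.default_rng(rng)
    x = np.full(len(temps), float(init))
    samples = []
    for _ in range(n_iters):
        for k, T in enumerate(temps):          # within-chain Metropolis
            prop = x[k] + step * rng.normal()
            if np.log(rng.random()) < (log_post(prop) - log_post(x[k])) / T:
                x[k] = prop
        for k in range(len(temps) - 1):        # neighbour swap attempts
            d = (1.0 / temps[k] - 1.0 / temps[k + 1]) * (
                log_post(x[k + 1]) - log_post(x[k]))
            if np.log(rng.random()) < d:
                x[k], x[k + 1] = x[k + 1], x[k]
        samples.append(x[0])                   # coldest chain is the estimate
    return np.array(samples)
```

The swap acceptance exp((1/T_k - 1/T_{k+1})(logp(x_{k+1}) - logp(x_k))) preserves each tempered target, so the T=1 chain still samples the true posterior while benefiting from the hot chains' exploration.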
Nievaart, V. A.; Daquino, G. G.; Moss, R. L.
2007-06-01
Boron Neutron Capture Therapy (BNCT) is a bimodal form of radiotherapy for the treatment of tumour lesions. Since the cancer cells in the treatment volume are targeted with 10B, a higher dose is delivered to these cancer cells through the 10B(n,α)7Li reaction than to the surrounding healthy cells. In Petten (The Netherlands), at the High Flux Reactor, a specially tailored neutron beam has been designed and installed. Over 30 patients have been treated with BNCT in 2 clinical protocols: a phase I study for the treatment of glioblastoma multiforme and a phase II study on the treatment of malignant melanoma. Furthermore, activities concerning the extracorporeal treatment of metastases in the liver (from colorectal cancer) are in progress. The irradiation beam at the HFR contains both neutrons and gammas, which, together with the complex geometries of both patient and beam set-up, demands very detailed treatment planning calculations. A well-designed Treatment Planning System (TPS) should obey the following general scheme: (1) a pre-processing phase (CT and/or MRI scans to create the geometric solid model, cross-section files for neutrons and/or gammas); (2) calculations (3D radiation transport, estimation of neutron and gamma fluences, macroscopic and microscopic dose); (3) a post-processing phase (display of the results, iso-doses and iso-fluences). Treatment planning in BNCT is performed using Monte Carlo codes incorporated in a framework that also includes the pre- and post-processing phases. In particular, the glioblastoma multiforme protocol used BNCT_rtpe, while the melanoma metastases protocol uses NCTPlan. In addition, an ad hoc Positron Emission Tomography (PET)-based treatment planning system (BDTPS) has been implemented in order to integrate the real macroscopic boron distribution obtained from PET scanning. BDTPS is patented and uses MCNP as the calculation engine. The precision obtained by the Monte Carlo based TPSs exploited at Petten
Nanostructure-based proton exchange membrane for fuel cell applications at high temperature.
Li, Junsheng; Wang, Zhengbang; Li, Junrui; Pan, Mu; Tang, Haolin
2014-02-01
As a clean and highly efficient energy source, the proton exchange membrane fuel cell (PEMFC) has been considered an ideal alternative to traditional fossil energy sources. Great efforts have been devoted to realizing the commercialization of the PEMFC in the past decade. To eliminate some technical problems associated with low-temperature operation (such as catalyst poisoning and poor water management), PEMFCs are usually operated at elevated temperatures (e.g., >100 °C). However, traditional proton exchange membranes (PEMs) show poor performance at elevated temperatures. To achieve high-performance PEMs for high-temperature fuel cell applications, novel PEMs based on nanostructures have been developed recently. In this review, we discuss and summarize the methods for fabricating nanostructure-based PEMs for PEMFCs operated at elevated temperatures and the high-temperature performance of these PEMs. We also give an outlook on the rational design and development of nanostructure-based PEMs.
Kuusk, Priit, 1938-
2001-01-01
In November, the Swiss city of Basel lives under the sign of the "European Music Month". The young Norwegian pianist Leif Ove Andsnes was invited to perform in London. Competition prizes from various competitions. The American singer Monte Pederson has died.
Energy Technology Data Exchange (ETDEWEB)
Metzkes, J.; Kraft, S. D.; Sobiella, M.; Stiller, N.; Zeil, K.; Schramm, U. [Helmholtz-Zentrum Dresden-Rossendorf, Bautzner Landstr. 400, 01328 Dresden (Germany); Karsch, L.; Schuerer, M. [OncoRay - National Center for Radiation Research in Oncology, TU Dresden, Fetscherstr. 74, 01307 Dresden (Germany); Pawelke, J.; Richter, C. [Helmholtz-Zentrum Dresden-Rossendorf, Bautzner Landstr. 400, 01328 Dresden (Germany); OncoRay - National Center for Radiation Research in Oncology, TU Dresden, Fetscherstr. 74, 01307 Dresden (Germany)
2012-12-15
In recent years, a new generation of high repetition rate (~10 Hz), high power (~100 TW) laser systems has stimulated intense research on laser-driven sources for fast protons. Considering experimental instrumentation, this development requires online diagnostics for protons to be added to the established offline detection tools such as solid state track detectors or radiochromic films. In this article, we present the design and characterization of a scintillator-based online detector that gives access to the angularly resolved proton distribution along one spatial dimension and resolves 10 different proton energy ranges. Conceived as an online detector for key parameters in laser-proton acceleration, such as the maximum proton energy and the angular distribution, the detector features a spatial resolution of ~1.3 mm and a spectral resolution better than 1.5 MeV for a maximum proton energy above 12 MeV in the current design. Regarding its areas of application, we consider the detector a useful complement to radiochromic films and Thomson parabola spectrometers, capable of giving immediate feedback on the experimental performance. The detector was characterized at an electrostatic Van de Graaff tandetron accelerator and tested in a laser-proton acceleration experiment, proving its suitability as a diagnostic device for laser-accelerated protons.
Development and validation of MCNPX-based Monte Carlo treatment plan verification system
Iraj Jabbari; Shahram Monadi
2015-01-01
A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in digital imaging and communications in medicine-radiation ...
Tanaka, H; Sakurai, Y; Suzuki, M; Takata, T; Masunaga, S; Kinashi, Y; Kashino, G; Liu, Y; Mitsumoto, T; Yajima, S; Tsutsui, H; Takada, M; Maruhashi, A; Ono, K
2009-07-01
In order to generate epithermal neutrons for boron neutron capture therapy (BNCT), we proposed the method of filtering and moderating fast neutrons, which are emitted from the reaction between a beryllium target and 30 MeV protons accelerated by a cyclotron, using an optimum moderator system composed of iron, lead, aluminum, calcium fluoride, and an enriched ⁶LiF ceramic filter. The epithermal-neutron source has been under construction since June 2008 at the Kyoto University Research Reactor Institute. This system consists of a cyclotron to supply a proton beam of about 1 mA at 30 MeV, a beam transport system, a beam scanner system for heat reduction on the beryllium target, a target cooling system, a beam shaping assembly, and an irradiation bed for patients. In this article, an overview of the cyclotron-based neutron source (CBNS) and the properties of the treatment neutron beam optimized using the MCNPX Monte Carlo code are presented. The distribution of the RBE (relative biological effectiveness) dose in a phantom shows that, assuming a ¹⁰B concentration of 13 ppm for normal tissue, this beam could be employed to treat a patient with an irradiation time of less than 30 min and a dose of less than 12.5 Gy-eq to normal tissue. The CBNS might be an alternative to reactor-based neutron sources for BNCT treatments.
A GPU-based Monte Carlo dose calculation code for photon transport in a voxel phantom
Energy Technology Data Exchange (ETDEWEB)
Bellezzo, M.; Do Nascimento, E.; Yoriyaz, H., E-mail: mbellezzo@gmail.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)
2014-08-15
As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo method has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for routine clinical applications. In this paper, we present the CUBMC code, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture platform. The simulation of physical events is based on the algorithm used in PENELOPE, and the cross-section table used is the one generated by the material routine also present in the PENELOPE code. Photons are transported in voxel-based geometries with different compositions. To demonstrate the capabilities of the algorithm developed in the present work, four 128 x 128 x 128 voxel phantoms have been considered: one composed of a homogeneous water-based medium, the second of bone, the third of lung, and the fourth of a heterogeneous bone-and-vacuum geometry. Simulations were done considering a 6 MeV monoenergetic photon point source. Two distinct approaches were used for transport simulation. The first forces the photon to stop at every voxel frontier; the second is the Woodcock method, in which a stop at a frontier depends on whether the material changes along the photon's travel line. Dose calculations using these methods are compared for validation with the PENELOPE and MCNP5 codes. Speed-up factors are compared using an NVIDIA GTX 560-Ti GPU card against a 2.27 GHz Intel Xeon CPU. (Author)
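The two stepping schemes compared in this abstract differ in how material boundaries are handled. A minimal sketch of the second, the Woodcock (delta-tracking) method, in Python; this is an illustrative one-dimensional toy, not the CUBMC implementation, and the function names and majorant-based sampling loop are our own:

```python
import math
import random

def woodcock_step(x, mu_of, mu_max, rng=random):
    """One Woodcock (delta-tracking) step along a 1D ray.

    x        : current position along the ray
    mu_of(x) : attenuation coefficient of the material at x
    mu_max   : majorant (maximum attenuation over the whole geometry)

    Returns the position of the next *real* interaction. Tentative
    collisions are sampled with the majorant mu_max; each is accepted
    with probability mu(x)/mu_max, so voxel boundaries never have to be
    crossed explicitly.
    """
    while True:
        # Free flight sampled against the majorant cross section.
        x += -math.log(1.0 - rng.random()) / mu_max
        # Accept as a real collision, otherwise it is a virtual one.
        if rng.random() < mu_of(x) / mu_max:
            return x
```

Because virtual collisions are rejected with probability 1 - mu(x)/mu_max, the sampled free paths follow the correct heterogeneous attenuation law without any boundary-crossing computation, which is what makes the scheme attractive on GPUs.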
Energy Technology Data Exchange (ETDEWEB)
Remetti, Romolo, E-mail: romolo.remetti@uniroma1.i ['Sapienza' University of Rome, Department of Energetics, via A. Scarpa 14, 00161 Rome (Italy); Burgio, Nunzio T. [ENEA, Casaccia Research Centre, Via Anguillarese 301, 00060 S. Maria Di Galeria, Rome (Italy); Maciocco, Luca; Arcese, Manuele; Filannino, M. Azzurra [Advanced Accelerator Applications (AAA), 20 rue du Diesel, 01630 Saint-Genis Pouilly (France)
2011-07-15
The aim of this work is to quantify the radionuclidic impurities of the irradiated [¹⁸O]water originated by the [¹⁸F]FDG synthesis process, and to characterize, from a radioprotection point of view, the waste streams produced. Two samples of 2.4 ml [¹⁸O]H₂O, contained in two different target cells, were irradiated with a proton current of 37 μA in a PETtrace cyclotron for about one hour each; after irradiation, without performing any chemical purification process but waiting only for the ¹⁸F decay, they were transferred into two vials and measured by HPGe gamma spectrometry and, subsequently, by liquid scintillation counting. Previously, Monte Carlo calculations had been carried out to estimate the radionuclides generated within the target components ([¹⁸O]H₂O, silver body and Havar® foil), with the aim of identifying the nuclides expected to be found in the irradiated water. Experimental results for the two samples, normalized to the same irradiation time, show practically the same value of tritium concentration (about 36 kBq/ml), while gamma-emitter activity concentrations exhibit a greater spread. Considering that tritium derives from water activation while the other pollutants are caused by activated cell materials released into the water through erosion/corrosion mechanisms, such a spread is likely attributable to differences in the proton beam shape and position (production of different natural circulation patterns inside the target and different erosion mechanisms of the target cell walls). Both tritium and the other radioactive pollutants exhibit absolute values of activity and activity concentrations below the exemption limits set down in EURATOM Council Directive 96/29.
Kudrolli, Haris A.
2001-04-01
A three dimensional (3D) reconstruction procedure for Positron Emission Tomography (PET) based on inverse Monte Carlo analysis is presented. PET is a medical imaging modality which employs a positron emitting radio-tracer to give functional images of an organ's metabolic activity. This makes PET an invaluable tool in the detection of cancer and for in-vivo biochemical measurements. There are a number of analytical and iterative algorithms for image reconstruction of PET data. Analytical algorithms are computationally fast, but the assumptions intrinsic in the line integral model limit their accuracy. Iterative algorithms can apply accurate models for reconstruction and give improvements in image quality, but at an increased computational cost. These algorithms require the explicit calculation of the system response matrix, which may not be easy to calculate. This matrix gives the probability that a photon emitted from a certain source element will be detected in a particular detector line of response. The ``Three Dimensional Stochastic Sampling'' (SS3D) procedure implements iterative algorithms in a manner that does not require the explicit calculation of the system response matrix. It uses Monte Carlo techniques to simulate the process of photon emission from a source distribution and interaction with the detector. This technique has the advantage of being able to model complex detector systems and also take into account the physics of gamma ray interaction within the source and detector systems, which leads to an accurate image estimate. A series of simulation studies was conducted to validate the method using the Maximum Likelihood - Expectation Maximization (ML-EM) algorithm. The accuracy of the reconstructed images was improved by using an algorithm that required a priori knowledge of the source distribution. Means to reduce the computational time for reconstruction were explored by using parallel processors and algorithms that had faster convergence rates
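The reconstruction described above is built around the ML-EM update. A compact reference version with an explicit system matrix, for orientation only: the SS3D procedure's point is precisely to replace these matrix-vector products with Monte Carlo sampling, so this sketch illustrates the update rule, not SS3D itself.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Standard ML-EM image update for emission tomography.

    A : (n_detectors, n_pixels) system matrix; A[i, j] is the probability
        that a photon emitted in pixel j is detected in line of response i
    y : measured counts per line of response

    Iterates x <- (x / s) * A^T (y / (A x)), with sensitivity s = A^T 1.
    """
    x = np.ones(A.shape[1])          # uniform initial estimate
    s = A.sum(axis=0)                # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                 # forward projection
        proj[proj == 0] = 1e-12      # guard against division by zero
        x *= (A.T @ (y / proj)) / s  # multiplicative EM update
    return x
```

SS3D's contribution is to estimate the forward projection A·x and the backprojection Aᵀ(·) stochastically by simulating photon emission and detection, so the full matrix A never has to be computed or stored.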
Li, Jing; Cao, Xue-Li; Wang, Yuan-Yuan; Zhang, Shu-Ran; Du, Dong-Ying; Qin, Jun-Sheng; Li, Shun-Li; Su, Zhong-Min; Lan, Ya-Qian
2016-06-27
Two novel polyoxometalate (POM)-based coordination polymers, namely [Co(bpz)(Hbpz)][Co(SO4)0.5(H2O)2(bpz)]4[PMo(VI)8Mo(V)4V(IV)4O42]·13H2O (NENU-530) and [Ni2(bpz)(Hbpz)3(H2O)2][PMo(VI)8Mo(V)4V(IV)4O44]·8H2O (NENU-531) (H2bpz = 3,3',5,5'-tetramethyl-4,4'-bipyrazole), were isolated by hydrothermal methods; they represent 3D networks constructed from POM units, the protonated ligand and sulfate groups. In contrast with most POM-based coordination polymers, these two compounds exhibit exceptional chemical and thermal stability. More importantly, NENU-530 shows a high proton conductivity of 1.5×10⁻³ S cm⁻¹ at 75 °C and 98 % RH, one order of magnitude higher than that of NENU-531. Furthermore, structural analysis and functional measurements successfully demonstrated that the introduction of the sulfate group is favorable for proton conductivity. Herein, the syntheses, crystal structures, proton conductivity, and the relationship between structure and property are presented.
Mesh-based Monte Carlo code for fluorescence modeling in complex tissues with irregular boundaries
Wilson, Robert H.; Chen, Leng-Chun; Lloyd, William; Kuo, Shiuhyang; Marcelo, Cynthia; Feinberg, Stephen E.; Mycek, Mary-Ann
2011-07-01
There is a growing need for the development of computational models that can account for complex tissue morphology in simulations of photon propagation. We describe the development and validation of a user-friendly, MATLAB-based Monte Carlo code that uses analytically-defined surface meshes to model heterogeneous tissue geometry. The code can use information from non-linear optical microscopy images to discriminate the fluorescence photons (from endogenous or exogenous fluorophores) detected from different layers of complex turbid media. We present a specific application of modeling a layered human tissue-engineered construct (Ex Vivo Produced Oral Mucosa Equivalent, EVPOME) designed for use in repair of oral tissue following surgery. Second-harmonic generation microscopic imaging of an EVPOME construct (oral keratinocytes atop a scaffold coated with human type IV collagen) was employed to determine an approximate analytical expression for the complex shape of the interface between the two layers. This expression can then be inserted into the code to correct the simulated fluorescence for the effect of the irregular tissue geometry.
Absorbed Dose Calculations Using Mesh-based Human Phantoms And Monte Carlo Methods
Kramer, Richard
2011-08-01
Health risks attributable to the exposure to ionizing radiation are considered to be a function of the absorbed or equivalent dose to radiosensitive organs and tissues. However, as human tissue cannot express itself in terms of equivalent dose, exposure models have to be used to determine the distribution of equivalent dose throughout the human body. An exposure model, be it physical or computational, consists of a representation of the human body, called a phantom, plus a method for transporting ionizing radiation through the phantom and measuring or calculating the equivalent dose to organs and tissues of interest. The FASH2 (Female Adult meSH) and the MASH2 (Male Adult meSH) computational phantoms have been developed at the University of Pernambuco in Recife/Brazil based on polygon mesh surfaces using open source software tools and anatomical atlases. Representing standing adults, FASH2 and MASH2 have organ and tissue masses, body height and body mass adjusted to the anatomical data published by the International Commission on Radiological Protection for the reference male and female adult. For the purposes of absorbed dose calculations the phantoms have been coupled to the EGSnrc Monte Carlo code, which can transport photons, electrons and positrons through arbitrary media. This paper reviews the development of the FASH2 and the MASH2 phantoms and presents dosimetric applications for X-ray diagnosis and for prostate brachytherapy.
IVF cycle cost estimation using Activity Based Costing and Monte Carlo simulation.
Cassettari, Lucia; Mosca, Marco; Mosca, Roberto; Rolando, Fabio; Costa, Mauro; Pisaturo, Valerio
2016-03-01
The authors present a new methodological approach, in a stochastic regime, to determine the actual costs of a healthcare process. The paper specifically shows the application of the methodology to the determination of the cost of an assisted reproductive technology (ART) treatment in Italy. The motivation for this research is that a deterministic regime is inadequate for an accurate estimate of the cost of this particular treatment: the durations of the different activities involved are not fixed and are described by frequency distributions. Hence the need to determine, in addition to the mean value of the cost, the interval within which it is expected to vary with a known confidence level. Consequently, the cost obtained for each type of cycle investigated (in vitro fertilization and embryo transfer with or without intracytoplasmic sperm injection) shows tolerance intervals around the mean value sufficiently narrow to make the data statistically robust and therefore usable as a reference for benchmarking against other countries. From a methodological point of view the approach was rigorous: Activity Based Costing was used to determine the cost of the individual activities of the process, and Monte Carlo simulation, with control of the experimental error, to construct the tolerance intervals on the final result.
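The combination of Activity Based Costing with Monte Carlo sampling can be sketched as follows, in Python. This is an illustrative toy: the activity tuples, rates, and the choice of normal duration distributions are hypothetical, since the abstract only states that durations follow empirical frequency distributions.

```python
import random
import statistics

def mc_cycle_cost(activities, n_trials=20000, rng=random):
    """Monte Carlo estimate of a process cost under Activity Based Costing.

    activities : list of (hourly_rate, mean_minutes, sd_minutes) tuples;
                 durations are drawn from normal distributions here as an
                 illustrative stand-in for empirical frequency distributions.
    Returns the mean cost and a ~95% interval from the empirical quantiles.
    """
    costs = []
    for _ in range(n_trials):
        total = 0.0
        for rate, mu, sd in activities:
            minutes = max(0.0, rng.gauss(mu, sd))  # durations cannot be negative
            total += rate * minutes / 60.0
        costs.append(total)
    costs.sort()
    lo = costs[int(0.025 * n_trials)]  # empirical 2.5th percentile
    hi = costs[int(0.975 * n_trials)]  # empirical 97.5th percentile
    return statistics.mean(costs), (lo, hi)
```

Reporting the empirical quantile interval alongside the mean is what distinguishes this stochastic approach from a deterministic cost roll-up.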
Monte Carlo based verification of a beam model used in a treatment planning system
Wieslander, E.; Knöös, T.
2008-02-01
Modern treatment planning systems (TPSs) usually separate the dose modelling into a beam modelling phase, describing the beam exiting the accelerator, followed by a subsequent dose calculation in the patient. The aim of this work is to use the Monte Carlo code system EGSnrc to study the modelling of head scatter as well as the transmission through the multi-leaf collimator (MLC) and diaphragms in the beam model used in a commercial TPS (MasterPlan, Nucletron B.V.). An Elekta Precise linear accelerator equipped with an MLC has been modelled in BEAMnrc, based on available information from the vendor regarding the material and geometry of the treatment head. The collimation in the MLC direction consists of leaves, which are complemented by a backup diaphragm. The characteristics of the electron beam, i.e., energy and spot size, impinging on the target have been tuned to match measured data. Phase spaces from simulations of the treatment head are used to extract the scatter from, e.g., the flattening filter and the collimating structures. Similar data for the source models used in the TPS are extracted from the treatment planning system, so that a comprehensive analysis is possible. Simulations in a water phantom, with DOSXYZnrc, are also used to study the modelling of the MLC and the diaphragms by the TPS. The results from this study will be helpful for understanding the limitations of the model in the TPS and provide knowledge for further improvements of the TPS source modelling.
Iravanian, Shahriar; Kanu, Uche B; Christini, David J
2012-07-01
Cardiac repolarization alternans is an electrophysiologic condition identified by a beat-to-beat fluctuation in action potential waveform. It has been mechanistically linked to instances of T-wave alternans, a clinically defined ECG alternation in T-wave morphology, and associated with the onset of cardiac reentry and sudden cardiac death. Many alternans detection algorithms have been proposed in the past, but the majority have been designed specifically for use with T-wave alternans. Action potential duration (APD) signals obtained from experiments (especially those derived from optical mapping) possess unique characteristics, which require the development and use of a more appropriate alternans detection method. In this paper, we present a new class of algorithms, based on the Monte Carlo method, for the detection and quantitative measurement of alternans. Specifically, we derive a set of algorithms (one an analytical and more efficient version of the other) and compare their performance with the standard spectral method and the generalized likelihood ratio test algorithm using synthetic APD sequences and optical mapping data obtained from an alternans control experiment. We demonstrate the benefits of the new algorithms in the presence of Gaussian and Laplacian noise and frame-shift errors. The proposed algorithms are well suited for experimental applications, and furthermore, have low complexity and are implementable using fixed-point arithmetic, enabling potential use with implantable cardiac devices.
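A Monte Carlo surrogate test in this spirit can be sketched in a few lines of Python. This is an illustrative permutation scheme of our own, not the paper's algorithm: both the alternation statistic and the shuffle-based null distribution are assumptions.

```python
import random

def alternans_pvalue(apd, n_surrogates=2000, rng=random):
    """Monte Carlo surrogate test for APD alternans.

    Statistic: |mean of (-1)^n * APD_n|, which is large when the beat
    sequence alternates long-short-long-short. Significance is estimated
    by comparing against randomly shuffled surrogate sequences.
    """
    def stat(seq):
        return abs(sum(((-1) ** i) * v for i, v in enumerate(seq)) / len(seq))

    observed = stat(apd)
    seq = list(apd)
    exceed = 0
    for _ in range(n_surrogates):
        rng.shuffle(seq)             # destroy beat ordering, keep values
        if stat(seq) >= observed:
            exceed += 1
    # Add-one correction avoids a p-value of exactly zero.
    return observed, (exceed + 1) / (n_surrogates + 1)
```

The shuffle null preserves the APD amplitude distribution while destroying beat-to-beat ordering, so the p-value reflects only the alternation pattern rather than the signal's overall variability.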
Geant4-based Monte Carlo simulations on GPU for medical applications.
Bert, Julien; Perez-Ponce, Hector; El Bitar, Ziad; Jan, Sébastien; Boursier, Yannick; Vintache, Damien; Bonissent, Alain; Morel, Christian; Brasse, David; Visvikis, Dimitris
2013-08-21
Monte Carlo simulation (MCS) plays a key role in medical applications, especially for emission tomography and radiotherapy. However, MCS is also associated with long calculation times that prevent its use in routine clinical practice. Recently, graphics processing units (GPUs) have become, in many domains, a low-cost alternative for the acquisition of high computational power. The objective of this work was to develop an efficient framework for the implementation of MCS on GPU architectures. Geant4 was chosen as the MCS engine given the large variety of physics processes available for targeting different medical imaging and radiotherapy applications. In addition, Geant4 is the MCS engine behind GATE, currently the most popular simulation platform for medical applications. We propose the definition of a global strategy and associated structures for such a GPU-based simulation implementation. Different photon and electron physics effects are resolved on the fly directly on the GPU without any approximations with respect to Geant4. Validations have shown equivalence in the underlying photon and electron physics processes between the Geant4 and the GPU codes, with a speedup factor of 80-90. More clinically realistic simulations in emission and transmission imaging led to acceleration factors of 400 and 800, respectively, compared to corresponding GATE simulations.
Auxiliary-field-based trial wave functions in quantum Monte Carlo calculations
Chang, Chia-Chen; Rubenstein, Brenda M.; Morales, Miguel A.
2016-12-01
Quantum Monte Carlo (QMC) algorithms have long relied on Jastrow factors to incorporate dynamic correlation into trial wave functions. While Jastrow-type wave functions have been widely employed in real-space algorithms, they have seen limited use in second-quantized QMC methods, particularly in projection methods that involve a stochastic evolution of the wave function in imaginary time. Here we propose a scheme for generating Jastrow-type correlated trial wave functions for auxiliary-field QMC methods. The method is based on decoupling the two-body Jastrow into one-body projectors coupled to auxiliary fields, which then operate on a single determinant to produce a multideterminant trial wave function. We demonstrate that intelligent sampling of the most significant determinants in this expansion can produce compact trial wave functions that reduce errors in the calculated energies. Our technique may be readily generalized to accommodate a wide range of two-body Jastrow factors and applied to a variety of model and chemical systems.
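The decoupling step described above can be written out explicitly. Assuming the two-body Jastrow has been brought to a diagonal quadratic form (the symbols below are generic placeholders, not the paper's notation), the Hubbard-Stratonovich transformation turns each squared one-body operator into an integral over an auxiliary field:

```latex
% Two-body Jastrow written as a sum of squared one-body operators:
e^{\hat J} \;=\; \exp\!\Big(\tfrac{1}{2}\sum_{\alpha}\lambda_{\alpha}\,\hat v_{\alpha}^{2}\Big),
\qquad \hat v_{\alpha}\ \text{one-body}.
% Hubbard--Stratonovich: each squared one-body operator becomes a
% Gaussian integral over an auxiliary field x_{\alpha} of one-body
% propagators:
e^{\frac{1}{2}\lambda_{\alpha}\hat v_{\alpha}^{2}}
 \;=\; \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty}\! dx_{\alpha}\,
 e^{-x_{\alpha}^{2}/2}\;
 e^{\sqrt{\lambda_{\alpha}}\,x_{\alpha}\hat v_{\alpha}}.
```

Since the exponential of a one-body operator maps a Slater determinant to another determinant, sampling the fields x_α acting on a single determinant yields the multideterminant trial wave function, and retaining only the most significant sampled determinants gives the compact expansions the abstract describes.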
Photoresponse of the protonated Schiff-base retinal chromophore in the gas phase
DEFF Research Database (Denmark)
Toker, Jonathan; Rahbek, Dennis Bo; Kiefer, H V
2013-01-01
The fragmentation, initiated by photoexcitation as well as collisionally-induced excitation, of several retinal chromophores was studied in the gas phase. The chromophore in the protonated Schiff-base form (RPSB), essential for mammalian vision, shows a remarkably selective photoresponse ... modifications of the chromophore. We propose that isomerizations play an important role in the photoresponse of gas-phase retinal chromophores and guide internal conversion through conical intersections. The role of protein interactions is then to control the specificity of the photoisomerization in the primary...
Paschos, O; Kunze, J; Stimming, U; Maglia, F
2011-06-15
The electrolytes currently used for proton exchange membrane fuel cells are mainly based on polymers such as Nafion which limits the operation regime of the cell to ∼80 °C. Solid oxide fuel cells operate at much elevated temperatures compared to proton exchange membrane fuel cells (∼1000 °C) and employ oxide electrolytes such as yttrium stabilized zirconia and gadolinium doped ceria. So far an intermediate temperature operation regime (300 °C) has not been widely explored which would open new pathways for novel fuel cell systems. In this review we summarize the potential use of phosphate compounds as electrolytes for intermediate temperature fuel cells. Various examples on ammonium polyphosphate, pyrophosphate, cesium phosphate and other phosphate based electrolytes are presented and their preparation methods, conduction mechanism and conductivity values are demonstrated.
Picchioni, F.; Tricoli,V.; Carretta, N.
2000-01-01
Homogeneously sulfonated poly(styrene) (SPS) was prepared with various concentrations of sulfonic acid groups in the base polymer. Membranes cast from these materials were investigated in relation to proton conductivity and methanol permeability in the temperature range from 20 °C to 60 °C. It was found that both these properties increase as the polymer is increasingly sulfonated, with abrupt jumps occurring at a concentration of sulfonic acid groups of about 15 mol%. The most extensively sulfon...
Distance-dependent proton transfer along water wires connecting acid-base pairs
Cox, M.J.; Timmer, R.L.A.; Bakker, H.J.; Park, S.; Agmon, N.
2009-01-01
We report time-resolved mid-IR kinetics for the ultrafast acid-base reaction between photoexcited 8-hydroxypyrene-1,3,6-trisulfonic acid trisodium salt (HPTS) and acetate at three concentrations (0.5, 1.0, and 2.0 M) and three temperatures (5, 30, and 65 °C) in liquid D2O. The observed proton-trans
Chi, Yujie; Tian, Zhen; Jia, Xun
2016-08-01
Monte Carlo (MC) particle transport simulation on a graphics-processing unit (GPU) platform has been extensively studied recently due to the efficiency advantage achieved via massive parallelization. Almost all of the existing GPU-based MC packages were developed for voxelized geometry, which limits their application scope. The purpose of this paper is to develop a module to model parametric geometry and integrate it into GPU-based MC simulations. In our module, each continuous region was defined by its bounding surfaces, which were parameterized by quadratic functions. Particle navigation functions in this geometry were developed. The module was incorporated into two previously developed GPU-based MC packages and was tested in two example problems: (1) low-energy photon transport simulation in a brachytherapy case with a shielded cylinder applicator and (2) MeV coupled photon/electron transport simulation in a phantom containing several inserts of different shapes. In both cases, the calculated dose distributions agreed well with those calculated in the corresponding voxelized geometry; the averaged dose differences were 1.03% and 0.29%, respectively. We also used the developed package to perform simulations of a Varian VS 2000 brachytherapy source and generated a phase-space file. The computation time under the parameterized geometry depended on the memory location storing the geometry data: the highest computational speed was achieved when the data were stored in the GPU's shared memory. Incorporation of parameterized geometry yielded a computation time ~3 times that of the corresponding voxelized geometry. We also developed a strategy that uses an auxiliary index array to reduce the frequency of geometry calculations and hence improve efficiency. With this strategy, the computational time ranged from 1.75 to 2.03 times that of the voxelized geometry for coupled photon/electron transport, depending on the voxel dimension of the auxiliary index array, and in 0
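Navigation through bounding surfaces parameterized by quadratic functions reduces to finding the nearest positive ray-surface intersection. A minimal sketch in Python, using the generic quadric form xᵀAx + b·x + c = 0; the function name and interface are our own, not the paper's API:

```python
import math

def ray_quadric_distance(p, d, Q):
    """Distance along the ray p + t*d (t > 0) to a quadric surface.

    The surface is x^T A x + b.x + c = 0, with Q = (A, b, c) and A a
    3x3 matrix. Returns the smallest positive root, or None if the ray
    misses the surface.
    """
    A, b, c = Q
    # Substituting x = p + t*d into the quadric gives a quadratic in t.
    qa = sum(d[i] * A[i][j] * d[j] for i in range(3) for j in range(3))
    qb = (sum(p[i] * A[i][j] * d[j] + d[i] * A[i][j] * p[j]
              for i in range(3) for j in range(3))
          + sum(b[i] * d[i] for i in range(3)))
    qc = (sum(p[i] * A[i][j] * p[j] for i in range(3) for j in range(3))
          + sum(b[i] * p[i] for i in range(3)) + c)
    if abs(qa) < 1e-12:                        # degenerate: linear in t
        return -qc / qb if qb and -qc / qb > 0 else None
    disc = qb * qb - 4 * qa * qc
    if disc < 0:
        return None                            # ray misses the surface
    roots = sorted(((-qb - math.sqrt(disc)) / (2 * qa),
                    (-qb + math.sqrt(disc)) / (2 * qa)))
    # Small epsilon avoids re-detecting the surface the particle sits on.
    return next((t for t in roots if t > 1e-9), None)
```

A particle stepper would evaluate this for every bounding surface of the current region and advance by the minimum of the sampled free path and the nearest boundary distance.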
A GPU implementation of a track-repeating algorithm for proton radiotherapy dose calculations
Yepes, Pablo P; Taddei, Phillip J
2010-01-01
An essential component of proton radiotherapy is the algorithm to calculate the radiation dose to be delivered to the patient. The most common dose algorithms are fast, but they are approximate analytical approaches whose level of accuracy is not always satisfactory, especially for heterogeneous anatomic areas such as the thorax. Monte Carlo techniques provide superior accuracy; however, they often require large computational resources, which renders them impractical for routine clinical use. Track-repeating algorithms, for example the Fast Dose Calculator, have shown promise for achieving the accuracy of Monte Carlo simulations for proton radiotherapy dose calculations in a fraction of the computation time. We report on the implementation of the Fast Dose Calculator for proton radiotherapy on a card equipped with graphics processing units (GPUs) rather than on a central processing unit architecture. This implementation reproduces the full Monte Carlo and CPU-based track-repeating dose calculations within 2%, w...
A Project of Boron Neutron Capture Therapy System based on a Proton Linac Neutron Source
Kiyanagi, Yoshikai; Asano, Kenji; Arakawa, Akihiro; Fukuchi, Shin; Hiraga, Fujio; Kimura, Kenju; Kobayashi, Hitoshi; Kubota, Michio; Kumada, Hiroaki; Matsumoto, Hiroshi; Matsumoto, Akira; Sakae, Takeji; Saitoh, Kimiaki; Shibata, Tokushi; Yoshioka, Masakazu
At present, clinical trials of Boron Neutron Capture Therapy (BNCT) are being performed at research reactor facilities. However, accelerator-based BNCT has the merit that it can be carried out in a hospital, so we have launched a development project for accelerator-based BNCT in order to establish and spread BNCT as an effective therapy in the near future. In the project, a compact proton linac installed in a hospital will be used as the neutron source, and the energy of the proton beam is planned to be less than about 10 MeV to reduce the radioactivity. BNCT requires an epithermal neutron beam with an intensity of around 1×10^9 n/cm^2/s to deliver the therapeutic dose to a deeper region of the body and to complete the irradiation within an hour. From this condition, the required average proton beam current is estimated to be a few mA. The enormous heat deposition in the target is a major issue. We are aiming at total optimization of the accelerator-based BNCT system from the linac to the irradiation position. Here, the outline of the project is introduced and the moderator design is presented.
Chiavassa, S; Aubineau-Lanièce, I; Bitar, A; Lisbona, A; Barbet, J; Franck, D; Jourdain, J R; Bardiès, M
2006-02-07
Dosimetric studies are necessary for all patients treated with targeted radiotherapy. In order to attain the precision required, we have developed Oedipe, a dosimetric tool based on the MCNPX Monte Carlo code. The anatomy of each patient is considered in the form of a voxel-based geometry created using computed tomography (CT) images or magnetic resonance imaging (MRI). Oedipe enables dosimetry studies to be carried out at the voxel scale. Validation of the results obtained by comparison with existing methods is complex because there are multiple sources of variation: calculation methods (different Monte Carlo codes, point kernel), patient representations (model or specific) and geometry definitions (mathematical or voxel-based). In this paper, we validate Oedipe by taking each of these parameters into account independently. Monte Carlo methodology requires long calculation times, particularly in the case of voxel-based geometries, and this is one of the limits of personalized dosimetric methods. However, our results show that the use of voxel-based geometry as opposed to a mathematically defined geometry decreases the calculation time two-fold, due to an optimization of the MCNPX2.5e code. It is therefore possible to envisage the use of Oedipe for personalized dosimetry in the clinical context of targeted radiotherapy.
Research on Calculating Definite Integral Method Based on Monte-Carlo
Institute of Scientific and Technical Information of China (English)
马海峰; 刘宇熹
2011-01-01
The Monte-Carlo method is a very important class of numerical methods guided by probability and statistics theory. This paper presents the algorithmic implementation of the Monte-Carlo method for calculating definite integrals and compares it with interpolation-based integration in terms of accuracy and time efficiency. The experimental results show that the Monte-Carlo method for calculating definite integrals is applicable to a wide range of problems and is computationally efficient.
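The basic mean-value Monte-Carlo estimator for a definite integral, as compared above with interpolation-based quadrature, can be sketched as follows (illustrative code, not the authors' implementation):

```python
import random

def mc_integral(f, a, b, n, seed=0):
    """Estimate the integral of f over [a, b] as (b - a) times the
    average of f evaluated at n uniform random samples."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Example: the integral of x^2 on [0, 1] is exactly 1/3.
print(mc_integral(lambda x: x * x, 0.0, 1.0, 200_000))
```

The statistical error shrinks as 1/sqrt(n) regardless of dimension, which is what makes the method so widely applicable.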
A comparison of Monte Carlo generators
Golan, Tomasz
2014-01-01
A comparison of the GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: proton multiplicity, total visible energy, momentum of the most energetic proton, and the $\pi^+$ two-dimensional energy vs cosine distribution.
Energy Technology Data Exchange (ETDEWEB)
Weathers, J.B. [Shock, Noise, and Vibration Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: James.Weathers@ngc.com; Luck, R. [Department of Mechanical Engineering, Mississippi State University, 210 Carpenter Engineering Building, P.O. Box ME, Mississippi State, MS 39762-5925 (United States)], E-mail: Luck@me.msstate.edu; Weathers, J.W. [Structural Analysis Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: Jeffrey.Weathers@ngc.com
2009-11-15
The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions of the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal pdf from random and systematic uncertainties is examined, and a procedure to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant-probability contours for comparison error results in a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
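A toy version of the Monte Carlo covariance estimation described above, with a shared systematic draw correlating the comparison errors of two quantities of interest (the uncertainty magnitudes below are invented purely for illustration):

```python
import random

def comparison_error_cov(n_draws=50_000, seed=1):
    """Monte Carlo estimate of the covariance matrix of the comparison
    errors E_i = D_i - S_i for two quantities of interest. A shared
    'systematic' draw correlates the two errors; the independent 'random'
    draws do not. All sigma values are illustrative."""
    rng = random.Random(seed)
    e1, e2 = [], []
    for _ in range(n_draws):
        shared = rng.gauss(0.0, 0.5)              # systematic, common to both
        e1.append(shared + rng.gauss(0.0, 0.3))   # random part, quantity 1
        e2.append(shared + rng.gauss(0.0, 0.4))   # random part, quantity 2
    m1, m2 = sum(e1) / n_draws, sum(e2) / n_draws
    var1 = sum((x - m1) ** 2 for x in e1) / (n_draws - 1)
    var2 = sum((y - m2) ** 2 for y in e2) / (n_draws - 1)
    cov12 = sum((x - m1) * (y - m2) for x, y in zip(e1, e2)) / (n_draws - 1)
    return [[var1, cov12], [cov12, var2]]

print(comparison_error_cov())  # off-diagonals near 0.5**2 = 0.25
```

From the estimated covariance Σ, an approximate 95% constant-probability contour for the bivariate normal error model is the ellipse e^T Σ^{-1} e = χ²(2, 0.95) ≈ 5.99.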
Monte Carlo-based revised values of dose rate constants at discrete photon energies
Directory of Open Access Journals (Sweden)
T Palani Selvam
2014-01-01
Absorbed dose rate to water at 0.2 cm and 1 cm due to a point isotropic photon source, as a function of photon energy, is calculated using the EDKnrc user-code of the EGSnrc Monte Carlo system. This code system utilizes the widely used XCOM photon cross-section dataset for the calculation of absorbed dose to water. Using the above dose rates, dose rate constants are calculated. The air-kerma strength S_k needed for deriving the dose rate constant is based on the mass-energy absorption coefficient compilations of Hubbell and Seltzer published in 1995. A comparison of absorbed dose rates in water at the above distances to the published values reflects the differences in photon cross-section datasets in the low-energy region (the difference is up to 2% in dose rate values at 1 cm in the energy range 30-50 keV, and up to 4% at 0.2 cm at 30 keV). A maximum difference of about 8% is observed in the dose rate value at 0.2 cm at 1.75 MeV when compared to the published value. S_k calculations based on the compilation of Hubbell and Seltzer show a difference of up to 2.5% in the low-energy region (20-50 keV) when compared to the published values. The deviations observed in the values of dose rate and S_k affect the values of the dose rate constants by up to 3%.
Directory of Open Access Journals (Sweden)
Vahid Moslemi
2011-03-01
Introduction: In brachytherapy, radioactive sources are placed close to the tumor; therefore, small changes in their positions can cause large changes in the dose distribution. This emphasizes the need for computerized treatment planning. The usual method for treatment planning of cervix brachytherapy uses conventional radiographs in the Manchester system. Nowadays, because of their advantages in locating the source positions and the surrounding tissues, CT and MRI images are replacing conventional radiographs. In this study, we used CT images in Monte Carlo based dose calculation for brachytherapy treatment planning, using interface software to create the geometry file required by the MCNP code. The aim of using the interface software is to facilitate and speed up the geometry set-up for simulations based on the patient's anatomy. This paper examines the feasibility of this method in cervix brachytherapy and assesses its accuracy and speed. Materials and Methods: For dosimetric measurements regarding the treatment plan, a pelvic phantom was made from polyethylene in which the treatment applicators could be placed. For simulations using CT images, the phantom was scanned at 120 kVp. Using interface software written in MATLAB, the CT images were converted into an MCNP input file and the simulation was then performed. Results: Using the interface software, the preparation time for the simulations of the applicator and surrounding structures was approximately 3 minutes; the corresponding time needed with the conventional MCNP geometry entry was approximately 1 hour. The discrepancy between the simulated and measured doses at point A was 1.7% of the prescribed dose. The corresponding dose differences between the two methods in the rectum and bladder were 3.0% and 3.7% of the prescribed dose, respectively. Comparing the results of simulation using the interface software with those of simulation using the standard MCNP geometry entry showed a less than 1
Directory of Open Access Journals (Sweden)
Joko Siswantoro
2014-11-01
Volume is an important issue in the production and processing of food products. Traditionally, volume can be measured using the water displacement method based on Archimedes' principle; however, this method is inaccurate and destructive. Computer vision offers an accurate and nondestructive method for measuring the volume of food products. This paper proposes an algorithm for volume measurement of irregularly shaped food products using computer vision based on the Monte Carlo method. Five images of the object were acquired from five different views and then processed to obtain the object's silhouettes. From these silhouettes, the Monte Carlo method was used to approximate the volume of the object. The simulation results show that the algorithm achieves high accuracy and precision in volume measurement.
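The silhouette-based Monte Carlo volume estimate can be sketched as follows: sample points in a bounding box and keep those whose projection falls inside the silhouette in every view. The toy object here is a unit sphere seen in three orthogonal views (the paper uses five views of real food products); note that silhouettes only bound the visual hull, which slightly overestimates the sphere:

```python
import random

def mc_volume(inside_silhouette, views, box, n, seed=0):
    """Approximate object volume: the fraction of random points in `box`
    whose projection lies inside the silhouette in every view, times the
    box volume. `views` map a 3-D point to its 2-D projection."""
    rng = random.Random(seed)
    (x0, x1), (y0, y1), (z0, z1) = box
    box_vol = (x1 - x0) * (y1 - y0) * (z1 - z0)
    hits = 0
    for _ in range(n):
        p = (rng.uniform(x0, x1), rng.uniform(y0, y1), rng.uniform(z0, z1))
        if all(inside_silhouette(v(p)) for v in views):
            hits += 1
    return box_vol * hits / n

# Toy object: unit sphere, which projects to a unit disc in three
# orthogonal views. The silhouettes bound the visual hull (the Steinmetz
# tricylinder, volume 8(2 - sqrt(2)) ~ 4.69), slightly larger than the
# true sphere volume 4*pi/3 ~ 4.19.
views = [lambda p: (p[0], p[1]), lambda p: (p[0], p[2]), lambda p: (p[1], p[2])]
in_disc = lambda q: q[0] ** 2 + q[1] ** 2 <= 1.0
print(mc_volume(in_disc, views, ((-1, 1), (-1, 1), (-1, 1)), 100_000))
```

Adding more views, as the paper does, tightens the visual hull toward the true shape.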
Sun, Wenping
2014-07-25
Yttrium and indium co-doped barium zirconate is investigated to develop a chemically stable and sintering-active proton conductor for solid oxide fuel cells (SOFCs). BaZr0.8Y0.2-xInxO3-δ possesses a pure cubic perovskite structure. The sintering activity of BaZr0.8Y0.2-xInxO3-δ increases significantly with In concentration. BaZr0.8Y0.15In0.05O3-δ (BZYI5) exhibits the highest total electrical conductivity among the sintered oxides. BZYI5 also retains high chemical stability against CO2, water vapor, and reduction by H2. The good sintering activity, high conductivity, and chemical stability of BZYI5 facilitate the fabrication of durable SOFCs based on a highly conductive BZYI5 electrolyte film by cost-effective ceramic processes. A fully dense BZYI5 electrolyte film was successfully prepared on the anode substrate by a facile drop-coating technique followed by co-firing at 1400 °C for 5 h in air. The BZYI5 film exhibits one of the highest conductivities among BaZrO3-based electrolyte films prepared with various sintering aids. BZYI5-based single cells deliver a very encouraging, and by far the highest, peak power density for BaZrO3-based proton-conducting SOFCs, reaching 379 mW cm-2 at 700 °C. The results demonstrate that Y and In co-doping is an effective strategy for exploring sintering-active and chemically stable BaZrO3-based proton conductors for high-performance proton-conducting SOFCs. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Monte-Carlo simulation of an ultra small-angle neutron scattering instrument based on Soller slits
Energy Technology Data Exchange (ETDEWEB)
Rieker, T. [Univ. of New Mexico, Albuquerque, NM (United States); Hubbard, P. [Sandia National Labs., Albuquerque, NM (United States)
1997-09-01
Monte Carlo simulations are used to investigate an ultra small-angle neutron scattering instrument for use at a pulsed source, based on a Soller slit collimator and analyzer. The simulations show that for a q_min of ~10^-4 Å^-1 (15 Å neutrons) a few tenths of a percent of the incident flux is transmitted through both collimators at q=0.
Analytic estimates of secondary neutron dose in proton therapy.
Anferov, V
2010-12-21
Proton beam losses in various components of a treatment nozzle generate secondary neutrons, which deliver unwanted out-of-field dose during treatments. The purpose of this study was to develop an analytic method for estimating the neutron dose to a distant organ at risk during proton therapy. Based on radiation shielding calculation methods proposed by Sullivan, we developed an analytical model for converting the proton beam losses in the nozzle components and in the treatment volume into the secondary neutron dose at a point of interest. Using the MCNPX Monte Carlo code, we benchmarked the neutron dose rates generated by the proton beam stopped in various media. The Monte Carlo calculations confirmed the validity of the analytical model for a simple beam stop geometry. The analytical model was then applied to neutron dose equivalent measurements performed on double scattering and uniform scanning nozzles at the Midwest Proton Radiotherapy Institute (MPRI). Good agreement was obtained between the model predictions and the data measured at MPRI. This work provides a method for estimating analytically the neutron dose equivalent to a distant organ at risk. This method can be used as a tool for optimizing dose delivery techniques in proton therapy.
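The abstract does not give Sullivan's parameterization; purely as an illustration of the structure of such an analytical model (a point source with inverse-square falloff and exponential attenuation, using an assumed attenuation length), one might write:

```python
import math

def neutron_dose_equivalent(source_term, r_cm, atten_length_cm=100.0):
    """Toy point-source estimate of neutron dose equivalent at distance
    r_cm: inverse-square geometric falloff times exponential attenuation.
    `source_term` (dose yield normalized at 1 cm) and `atten_length_cm`
    are hypothetical values, not Sullivan's or Anferov's fitted parameters."""
    return source_term * math.exp(-r_cm / atten_length_cm) / r_cm ** 2

# Doubling the distance cuts the dose by more than a factor of four
# (inverse square plus attenuation):
print(neutron_dose_equivalent(1.0, 50.0) / neutron_dose_equivalent(1.0, 100.0))
```

A full model of this kind would sum such terms over the individual beam-loss points in the nozzle and in the treatment volume.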
Monte Carlo based protocol for cell survival and tumour control probability in BNCT
Ye, Sung-Joon
1999-02-01
A mathematical model to calculate the theoretical cell survival probability (nominally, the cell survival fraction) is developed to evaluate preclinical treatment conditions for boron neutron capture therapy (BNCT). A treatment condition is characterized by the neutron beam spectrum, single or bilateral exposure, and the choice of boron carrier drug (boronophenylalanine (BPA) or boron sulfhydryl hydride (BSH)). The cell survival probability defined from Poisson statistics is expressed in terms of the cell-killing yield, the (n,α) reaction density, and the tolerable neutron fluence. The radiation transport calculation from the neutron source to tumours is carried out using Monte Carlo methods: (i) reactor-based BNCT facility modelling to yield the neutron beam library at an irradiation port; (ii) dosimetry to limit the neutron fluence below a tolerance dose (10.5 Gy-Eq); (iii) calculation of the (n,α) reaction density in tumours. A shallow surface tumour could be effectively treated by single exposure producing an average cell survival probability of - for probable ranges of the cell-killing yield for the two drugs, while a deep tumour will require bilateral exposure to achieve comparable cell kills at depth. With very pure epithermal beams, eliminating thermal, low-epithermal and fast neutrons, the cell survival can be decreased by factors of 2-10 compared with the unmodified neutron spectrum. The dominant effect of the cell-killing yield on tumour cell survival demonstrates the importance of the choice of boron carrier drug. However, these calculations do not indicate an unambiguous preference for one drug, due to the large overlap of tumour cell survival in the probable ranges of the cell-killing yield for the two drugs. The cell survival value averaged over a bulky tumour volume is used to predict the overall BNCT therapeutic efficacy, using a simple model of tumour control probability (TCP).
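The Poisson survival probability used above can be sketched as follows; treating the expected number of lethal events per cell as (cell-killing yield) × (reactions per cell) is a simplifying assumption for illustration, not the paper's exact formulation:

```python
import math

def cell_survival(kill_yield, reactions_per_cell):
    """Poisson cell survival: a cell survives if it suffers zero lethal
    events, so S = exp(-lambda). Taking lambda = (cell-killing yield per
    capture) * (boron capture reactions per cell) is a simplification,
    not the paper's exact formulation."""
    return math.exp(-kill_yield * reactions_per_cell)

# Same drug (same yield), ten times lower reaction density at depth:
print(cell_survival(0.05, 100.0))   # shallow tumour, heavily irradiated
print(cell_survival(0.05, 10.0))    # deep tumour, single-beam exposure
```

A Poisson tumour control model of the kind mentioned at the end would then take the form TCP = exp(-N·S) for N clonogenic cells, under standard assumptions.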
GPU-based fast Monte Carlo simulation for radiotherapy dose calculation.
Jia, Xun; Gu, Xuejun; Graves, Yan Jiang; Folkerts, Michael; Jiang, Steve B
2011-11-21
Monte Carlo (MC) simulation is commonly considered to be the most accurate dose calculation method in radiotherapy. However, its efficiency still requires improvement for many routine clinical applications. In this paper, we present our recent progress toward the development of a graphics processing unit (GPU)-based MC dose calculation package, gDPM v2.0. It utilizes the parallel computation ability of a GPU to achieve high efficiency, while maintaining the same particle transport physics as in the original dose planning method (DPM) code and hence the same level of simulation accuracy. In GPU computing, divergence of execution paths between threads can considerably reduce the efficiency. Since photons and electrons undergo different physics and hence attain different execution paths, we use a simulation scheme where photon transport and electron transport are separated to partially relieve the thread divergence issue. A high-performance random number generator and a hardware linear interpolation are also utilized. We have also developed various components to handle the fluence map and linac geometry, so that gDPM can be used to compute dose distributions for realistic IMRT or VMAT treatment plans. Our gDPM package is tested for its accuracy and efficiency in both phantoms and realistic patient cases. In all cases, the average relative uncertainties are less than 1%. A statistical t-test is performed and the dose difference between the CPU and the GPU results is not found to be statistically significant in over 96% of the high dose region and over 97% of the entire region. Speed-up factors of 69.1 ∼ 87.2 have been observed using an NVIDIA Tesla C2050 GPU card against a 2.27 GHz Intel Xeon CPU processor. For realistic IMRT and VMAT plans, MC dose calculation can be completed with less than 1% standard deviation in 36.1 ∼ 39.6 s using gDPM.
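The photon/electron separation scheme used above to relieve thread divergence can be illustrated with a serial sketch (toy physics, not gDPM's actual transport; on the GPU, each loop would become a kernel launch over a batch of same-type particles):

```python
import random

def transport_separated(photons, photon_step, electron_step):
    """Sketch of the divergence-reduction idea: transport all photons
    first, banking the secondary electrons they create, then transport
    all electrons. Because each batch runs only one kind of physics,
    GPU threads in a batch follow similar execution paths."""
    electron_bank = []
    for ph in photons:                        # photon physics only
        electron_bank.extend(photon_step(ph))
    deposits = [electron_step(e) for e in electron_bank]  # electron physics only
    return sum(deposits)

# Toy physics: each photon converts to two electrons of half its energy,
# and each electron deposits its energy locally, so energy is conserved.
rng = random.Random(0)
photons = [rng.uniform(1.0, 6.0) for _ in range(1000)]
print(transport_separated(photons, lambda e: [e / 2, e / 2], lambda e: e))
```

The same batching idea extends to any particle types whose physics differ enough to cause divergent branches within a warp.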
Monte Carlo based NMR simulations of open fractures in porous media
Lukács, Tamás; Balázs, László
2014-05-01
According to the basic principles of nuclear magnetic resonance (NMR), a measurement's free induction decay curve has an exponential characteristic whose parameter is the transversal relaxation time, T2, given by the Bloch equations in the rotating frame. In our simulations we consider the particular case in which the bulk volume is negligible relative to the whole system and vertical movement is essentially zero, so the diffusion term of the T2 relation can be omitted. Such small-aperture situations are common in sedimentary layers, and the smallness of the observed volume allows us to work with just the bulk relaxation and the surface relaxation. The simulation uses the Monte Carlo method: it is based on a random-walk generator that produces the Brownian motion of the particles from uniformly distributed pseudorandom numbers. An attached differential equation accounts for the bulk relaxation, and the initial and iterated conditions guarantee the simulation's replicability and enable consistent estimates. We generate an initial geometry of a plane segment with known height and a given number of particles; the spatial distribution is set equal in each simulation, and the surface-to-volume ratio remains constant. It follows that, for a given thickness of the open fracture, the surface relaxivity is determinable from the fitted curve's parameter. The calculated T2 distribution curves also indicate the inconstancy in the observed fracture situations. Varying the height of the lamina at a constant diffusion coefficient also produces a characteristic anomaly; for comparison, we have run the simulation with the same initial volume, number of particles and conditions in spherical bulks, whose profiles are clear and easy to understand. The surface relaxation enables us to estimate the interaction between the materials of the boundary with these two geometrically well-defined bulks, therefore the distribution takes as a
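A minimal random-walk sketch of the simulation described above, with bulk relaxation applied every step and a fixed kill probability standing in for surface relaxivity at each wall collision (all parameter values are illustrative, not the authors'):

```python
import math, random

def simulate_t2(n_walkers=2000, n_steps=200, dt=1e-4, height=10e-6,
                diff=2e-9, t2_bulk=2.0, kill_prob=0.02, seed=0):
    """Random-walk sketch of T2 decay in an open fracture of aperture
    `height` (m): walkers diffuse in 1-D between two walls; magnetization
    decays by bulk relaxation each step, and is lost with probability
    `kill_prob` (a stand-in for surface relaxivity) at each wall collision."""
    rng = random.Random(seed)
    step = math.sqrt(2.0 * diff * dt)         # rms 1-D displacement per step
    bulk = math.exp(-dt / t2_bulk)            # bulk relaxation factor per step
    total = 0.0
    for _ in range(n_walkers):
        z, m = rng.uniform(0.0, height), 1.0
        for _ in range(n_steps):
            z += step if rng.random() < 0.5 else -step
            if z < 0.0 or z > height:
                z = min(max(z, 0.0), height)  # reflect at the wall ...
                if rng.random() < kill_prob:  # ... with partial surface relaxation
                    m = 0.0
                    break
            m *= bulk
        total += m
    return total / n_walkers                  # surviving magnetization

# Narrower apertures mean more wall collisions and faster decay:
print(simulate_t2(height=10e-6), simulate_t2(height=2e-6))
```

Recording the decay over time and fitting exp(-t/T2), together with the fast-diffusion relation 1/T2 ≈ 1/T2_bulk + ρ·S/V, lets the surface relaxivity ρ be extracted for a known aperture, as the abstract describes.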
First tests for an online treatment monitoring system with in-beam PET for proton therapy
Kraan, Aafke C; Belcari, N; Camarlinghi, N; Cappucci, F; Ciocca, M; Ferrari, A; Ferretti, S; Mairani, A; Molinelli, S; Pullia, M; Retico, A; Sala, P; Sportelli, G; Del Guerra, A; Rosso, V
2014-01-01
PET imaging is a non-invasive technique for particle range verification in proton therapy. It is based on measuring the beta+ annihilations caused by nuclear interactions of the protons in the patient. In this work we present measurements for proton range verification in phantoms, performed at the CNAO particle therapy treatment center in Pavia, Italy, with our 10 x 10 cm^2 planar PET prototype DoPET. PMMA phantoms were irradiated with mono-energetic proton beams and clinical treatment plans, and PET data were acquired during and shortly after proton irradiation. We created 1-D profiles of the beta+ activity along the proton beam-axis, and evaluated the difference between the proximal rise and the distal fall-off position of the activity distribution. A good agreement with FLUKA Monte Carlo predictions was obtained. We also assessed the system response when the PMMA phantom contained an air cavity. The system was able to detect these cavities quickly after irradiation.
First tests for an online treatment monitoring system with in-beam PET for proton therapy
Kraan, A. C.; Battistoni, G.; Belcari, N.; Camarlinghi, N.; Cappucci, F.; Ciocca, M.; Ferrari, A.; Ferretti, S.; Mairani, A.; Molinelli, S.; Pullia, M.; Retico, A.; Sala, P.; Sportelli, G.; Del Guerra, A.; Rosso, V.
2015-01-01
PET imaging is a non-invasive technique for particle range verification in proton therapy. It is based on measuring the β+ annihilations caused by nuclear interactions of the protons in the patient. In this work we present measurements for proton range verification in phantoms, performed at the CNAO particle therapy treatment center in Pavia, Italy, with our 10 × 10 cm2 planar PET prototype DoPET. PMMA phantoms were irradiated with mono-energetic proton beams and clinical treatment plans, and PET data were acquired during and shortly after proton irradiation. We created 1-D profiles of the β+ activity along the proton beam-axis, and evaluated the difference between the proximal rise and the distal fall-off position of the activity distribution. A good agreement with FLUKA Monte Carlo predictions was obtained. We also assessed the system response when the PMMA phantom contained an air cavity. The system was able to detect these cavities quickly after irradiation.
Proton radiography to improve proton therapy treatment
Takatsu, J.; van der Graaf, E. R.; Van Goethem, M.-J.; van Beuzekom, M.; Klaver, T.; Visser, J.; Brandenburg, S.; Biegun, A. K.
2016-01-01
The quality of cancer treatment with protons critically depends on an accurate prediction of the proton stopping powers for the tissues traversed by the protons. Today, treatment planning in proton radiotherapy is based on stopping power calculations from densities of X-ray Computed Tomography (CT) images. This causes systematic uncertainties in the calculated proton range in a patient of typically 3-4%, which can reach 10% in bone regions [1,2,3,4,5,6,7,8]. This may lead to underdosage of parts of the tumor and overdosage of healthy tissue [1]. A direct measurement of proton stopping powers with high-energy protons would reduce these uncertainties and improve the quality of the treatment. Several studies have shown that a sufficiently accurate radiograph can be obtained by tracking individual protons traversing a phantom (patient) [4,6,10]. Our studies benefit from gas-filled time projection chambers based on GridPix technology [2], developed at Nikhef, capable of tracking a single proton. A BaF2 crystal was used to measure the residual energy of the protons. Proton radiographs of a phantom consisting of different tissue-like materials were measured with a 30×30 mm2, 150 MeV proton beam. The measurements were simulated with the Geant4 toolkit. The first experimental and simulated energy radiographs are in very good agreement [3]. In this paper we focus on simulation studies of the proton scattering angle, as it affects the position resolution of the proton energy-loss radiograph. By selecting protons with a small scattering angle, the image quality can be improved significantly.
Proton exchange membrane fuel cells modeling based on artificial neural networks
Institute of Scientific and Technical Information of China (English)
Yudong Tian; Xinjian Zhu; Guangyi Cao
2005-01-01
Because the mathematical models of a proton exchange membrane fuel cell (PEMFC) are complex and poorly suited to practical PEMFC control, the complex PEMFC mechanism and the existing PEMFC models are analyzed, and PEMFC modeling based on artificial neural networks is proposed. The structure, algorithm, training and simulation of PEMFC modeling based on improved BP (back-propagation) networks are presented in detail. Computer simulation and experiments verify that this model is fast and accurate, and can be used as a suitable operational model for PEMFC real-time control.
Proton tunneling in the A∙T Watson-Crick DNA base pair: myth or reality?
Brovarets', Ol'ha O; Hovorun, Dmytro M
2015-01-01
The results and conclusions reached by Godbeer et al. in their recent work, namely that proton tunneling in the A∙T(WC) Watson-Crick (WC) DNA base pair occurs according to Löwdin's (L) model but with a small (~10^-9) probability, are critically analyzed. It is shown here that this finding overestimates the possibility of proton tunneling at the A∙T(WC)↔A*∙T*(L) tautomerization, because this process cannot be implemented as a chemical reaction. Furthermore, those biologically important nucleobase mispairs (A∙A*↔A*∙A, G∙G*↔G*∙G, T∙T*↔T*∙T, C∙C*↔C*∙C, H∙H*↔H*∙H (H - hypoxanthine)), the players in the field of spontaneous point mutagenesis, are outlined for which proton tunneling is expected and for which the application of the model proposed by Godbeer et al. can be productive.
Singla, Nidhi; Bhadram, Venkata Srinu; Narayana, Chandrabhas; Chowdhury, Papia
2013-04-01
The motivation of the present work is to understand the optical, chemical, and electrical aspects of the proton transfer mechanism of indole (I) and some carbonyl-based indole derivatives, indole-3-carboxaldehyde (I3C) and indole-7-carboxaldehyde (I7C), both in powder form and in solution. Structural information for the indole derivatives (isolated molecule and in solution) is obtained with density functional theory (DFT) and time-dependent DFT (TD-DFT) methods. Calculated transition energies are used to generate UV-vis, FTIR, Raman, and NMR spectra, which are then verified against the experimental spectra. The occurrence of different conformers [cis (N(c)), trans (N(t)), and zwitterion (Z*)] has been interpreted by Mulliken charge, natural bond orbital (NBO) analysis, and polarization versus electric field (P-E loop) studies. 1H and 13C NMR and the molecular vibrational frequencies of the fundamental modes establish the stability of N(c) due to the presence of intramolecular hydrogen bonding (IHB) in the ground state (S0). Computed and experimental UV-vis absorption/emission studies reveal the creation of new species, zwitterion (Z*) and anion (A*), in the excited state (S1) due to excited-state intramolecular and intermolecular proton transfer (ESI(ra)PT and ESI(er)PT). Increased electrical conductivity (σ(ac)) with temperature and increased ferroelectric polarization at higher field verify proton conduction in I7C.
Monte Carlo-based multiphysics coupling analysis of x-ray pulsar telescope
Li, Liansheng; Deng, Loulou; Mei, Zhiwu; Zuo, Fuchang; Zhou, Hao
2015-10-01
The X-ray pulsar telescope (XPT) is a complex optical payload, which involves optical, mechanical, electrical and thermal disciplines. Multiphysics coupling analysis (MCA) plays an important role in improving the in-orbit performance. However, conventional MCA methods encounter two serious problems in dealing with the XPT. One is that the energy and reflectivity information of the X-rays cannot be taken into consideration, which misrepresents the essence of the XPT. The other is that the coupling data cannot be transferred automatically among the different disciplines, leading to computational inefficiency and high design cost. Therefore, a new MCA method for the XPT is proposed based on the Monte Carlo method and total-reflection theory. The main idea, procedures and operational steps of the proposed method are addressed in detail. Firstly, both the energy and reflectivity information of the X-rays are taken into consideration simultaneously, and the thermal-structural coupling equation and multiphysics coupling analysis model are formulated based on the finite element method; the thermal-structural coupling analysis under different working conditions is then implemented. Secondly, the mirror deformations are obtained using a construction geometry function, a polynomial function is adopted to fit the deformed mirror, and the fitting error is evaluated. Thirdly, the focusing performance of the XPT is evaluated by the RMS. Finally, a Wolter-I XPT is taken as an example to verify the proposed MCA method. The simulation results show that the thermal-structural coupling deformation is larger than the others, and the law governing the effect of deformation on focusing performance has been obtained. The focusing performance under thermal-structural, thermal and structural deformations degraded by 30.01%, 14.35% and 7.85%, respectively. The RMS of the dispersion spot is 2.9143 mm, 2.2038 mm and 2.1311 mm, respectively. As a result, the validity of the proposed method is verified through
The information-based complexity of approximation problem by adaptive Monte Carlo methods
Institute of Scientific and Technical Information of China (English)
2008-01-01
In this paper, we study the information-based complexity of the approximation problem on the multivariate Sobolev space with bounded mixed derivative MW^r_{p,α}(T^d), 1 < p < ∞, in the norm of L_q(T^d), 1 < q < ∞, by adaptive Monte Carlo methods. Applying the discretization technique and some properties of the pseudo-s-scale, we determine the exact asymptotic orders of this problem.
On the Assessment of Monte Carlo Error in Simulation-Based Statistical Analyses.
Koehler, Elizabeth; Brown, Elizabeth; Haneuse, Sebastien J-P A
2009-05-01
Statistical experiments, more commonly referred to as Monte Carlo or simulation studies, are used to study the behavior of statistical methods and measures under controlled situations. Whereas recent computing and methodological advances have permitted increased efficiency in the simulation process, known as variance reduction, such experiments remain limited by their finite nature and hence are subject to uncertainty; when a simulation is run more than once, different results are obtained. However, virtually no emphasis has been placed on reporting the uncertainty, referred to here as Monte Carlo error, associated with simulation results in the published literature, or on justifying the number of replications used. These deserve broader consideration. Here we present a series of simple and practical methods for estimating Monte Carlo error as well as determining the number of replications required to achieve a desired level of accuracy. The issues and methods are demonstrated with two simple examples, one evaluating operating characteristics of the maximum likelihood estimator for the parameters in logistic regression and the other in the context of using the bootstrap to obtain 95% confidence intervals. The results suggest that in many settings, Monte Carlo error may be more substantial than traditionally thought.
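The replication-based error estimate described above can be sketched in a few lines. The simulation model (means of Gaussian draws) and the accuracy target below are illustrative stand-ins, not quantities from the paper:

```python
import math
import random
import statistics

random.seed(1)

def simulate_once():
    # Hypothetical replicate: the mean of 50 draws from N(0, 1) stands in
    # for one run of a simulation study (e.g. one bootstrap or MLE fit).
    return statistics.fmean(random.gauss(0.0, 1.0) for _ in range(50))

# Monte Carlo error of the across-replication mean: se = sd / sqrt(R).
R = 500
estimates = [simulate_once() for _ in range(R)]
sd = statistics.stdev(estimates)
mc_error = sd / math.sqrt(R)

# Replications needed for a desired accuracy: solve sd / sqrt(R*) <= target.
target = 0.001
R_needed = math.ceil((sd / target) ** 2)
print(mc_error, R_needed)
```

The same sd-over-root-R logic applies whatever the per-replicate statistic is; only `simulate_once` changes.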
van der Graaf, E. R.; Limburg, J.; Koomans, R. L.; Tijs, M.
2011-01-01
The calibration of scintillation detectors for gamma radiation in a well characterized setup can be transferred to other geometries using Monte Carlo simulations to account for the differences between the calibration and the other geometry. In this study a calibration facility was used that is const
Institute of Scientific and Technical Information of China (English)
QIAO Li-gen; SHI Wen-fang
2012-01-01
A series of novel amphibious organic/inorganic hybrid proton exchange membranes doped with H3PO4, usable under both wet and dry conditions, was prepared through a sol-gel process based on acrylated triethoxysilane (A-TES) and benzyltetrazole-modified triethoxysilane (BT-TES). A dual-curing approach combining UV curing and thermal curing was used to obtain the crosslinked membranes. Polyethylene glycol (400) diacrylate (PEGDA) was used as an oligomer to form the polymeric matrix. The molecular structures of the precursors were characterized by 1H, 13C and 29Si NMR spectra. Thermogravimetric analysis (TGA) results show that the membranes exhibit acceptable thermal stability for application above 200 °C. Differential scanning calorimetry (DSC) indicates that the crosslinked membranes with mass ratios of BT-TES to A-TES below 1.6 and an H3PO4 loading equal in mass to the A-TES possess low Tgs, the lowest Tg (-28.9 °C) occurring for the membrane with a double mass of H3PO4 doped. High proton conductivity in the range of 9.4-17.3 mS/cm, with corresponding water uptake of 19.1%-32.8%, was measured at 90 °C under wet conditions. Meanwhile, the proton conductivity in a dry environment for the membrane with a BT-TES to A-TES mass ratio of 2.4 and double H3PO4 loading increases from 4.89×10⁻² mS/cm at 30 °C to 25.7 mS/cm at 140 °C. The excellent proton transport ability under both hydrous and anhydrous conditions demonstrates potential application in polymer electrolyte membrane fuel cells.
Institute of Scientific and Technical Information of China (English)
张向东; 董胜; 张磊; 张国伟
2012-01-01
The construction cost of breakwaters is large and, once destroyed, the consequences are very serious; correctly calculating breakwater reliability therefore has great significance. With the rapid development of artificial neural network theory, its application to breakwater reliability is attracting more and more attention. The probabilistic meaning is definite when the artificial neural network-based Monte Carlo method is used to calculate the failure probability of vertical breakwaters. The breakwater in Qinhuangdao is taken as an example to verify the artificial neural network-based Monte Carlo method. All parameters in the sliding failure limit state function and the overturning limit state function are taken as variables, and the failure probability and reliability index are calculated using the artificial neural network-based Monte Carlo method. The calculation results are compared with those of the variable-independent JC method and of Monte Carlo simulation (including both the direct sampling and importance sampling methods). It can be concluded that the reliability indexes calculated using the artificial neural network-based Monte Carlo method are similar to those calculated using Monte Carlo simulation, but are slightly lower than those calculated using the variable-independent JC method.
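The plain Monte Carlo step that the paper accelerates with a neural network surrogate can be sketched as follows. The Gaussian resistance/load parameters are invented for illustration (the real limit state functions use the Qinhuangdao breakwater data), and no ANN surrogate is included:

```python
import math
import random
from statistics import NormalDist

random.seed(42)

# Hypothetical sliding limit state g = resistance - load; the Gaussian
# parameters below are illustrative, not the Qinhuangdao data.
MU_R, SD_R = 10.0, 1.0
MU_S, SD_S = 6.0, 1.5

N = 200_000
failures = sum(
    random.gauss(MU_R, SD_R) - random.gauss(MU_S, SD_S) < 0.0
    for _ in range(N)
)
pf = failures / N                   # Monte Carlo failure probability
beta = -NormalDist().inv_cdf(pf)    # reliability index

# Closed form for this linear Gaussian case, for comparison:
beta_exact = (MU_R - MU_S) / math.sqrt(SD_R**2 + SD_S**2)
print(pf, beta, beta_exact)
```

In the paper's setting the subtraction inside the loop would be replaced by a call to the trained network approximating the limit state function, which is what makes the many samples affordable.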
Mechanical Reliability Simulation Based on Monte Carlo
Institute of Scientific and Technical Information of China (English)
连晋毅; 吴福忠; 李伟波; 林慕义
2005-01-01
This paper introduces the basic principles of reliability design and the Monte Carlo method, and proposes a Monte Carlo-based reliability simulation method. With this method, the reliability of mechanical parts and its confidence level can be obtained even when sample data are limited. An example applying the method to the reliability analysis of construction-machinery tires is given.
Reliability analysis of slope based on ANSYS and Monte-Carlo method
Institute of Scientific and Technical Information of China (English)
苏世毅; 梁波
2008-01-01
The basic theory and methods of reliability analysis are introduced. Using the probabilistic design capability of the general-purpose finite element software ANSYS, a reliability analysis of a slope is carried out, with the reliability computed by the Monte-Carlo method. A worked example verifies the feasibility of the approach, thereby combining finite element analysis with the Monte-Carlo method.
Wang, Song; Gardner, Joseph K; Gordon, John J; Li, Weidong; Clews, Luke; Greer, Peter B; Siebers, Jeffrey V
2009-08-01
The aim of this study is to present an efficient method to generate imager-specific Monte Carlo (MC)-based dose kernels for amorphous silicon-based electronic portal image device dose prediction and determine the effective backscattering thicknesses for such imagers. EPID field size-dependent responses were measured for five matched Varian accelerators from three institutions with 6 MV beams at the source to detector distance (SDD) of 105 cm. For two imagers, measurements were made with and without the imager mounted on the robotic supporting arm. Monoenergetic energy deposition kernels with 0-2.5 cm of water backscattering thicknesses were simultaneously computed by MC to a high precision. For each imager, the backscattering thickness required to match measured field size responses was determined. The monoenergetic kernel method was validated by comparing measured and predicted field size responses at 150 cm SDD, 10 x 10 cm2 multileaf collimator (MLC) sliding window fields created with 5, 10, 20, and 50 mm gaps, and a head-and-neck (H&N) intensity modulated radiation therapy (IMRT) patient field. Field size responses for the five different imagers deviated by up to 1.3%. When imagers were removed from the robotic arms, response deviations were reduced to 0.2%. All imager field size responses were captured by using between 1.0 and 1.6 cm backscatter. The predicted field size responses by the imager-specific kernels matched measurements for all involved imagers with the maximal deviation of 0.34%. The maximal deviation between the predicted and measured field size responses at 150 cm SDD is 0.39%. The maximal deviation between the predicted and measured MLC sliding window fields is 0.39%. For the patient field, gamma analysis yielded that 99.0% of the pixels have gamma < 1 by the 2%, 2 mm criteria with a 3% dose threshold. Tunable imager-specific kernels can be generated rapidly and accurately in a single MC simulation. The resultant kernels are imager position
Uncertainties in Monte Carlo-based absorbed dose calculations for an experimental benchmark.
Renner, F; Wulff, J; Kapsch, R-P; Zink, K
2015-10-01
There is a need to verify the accuracy of general purpose Monte Carlo codes like EGSnrc, which are commonly employed for investigations of dosimetric problems in radiation therapy. A number of experimental benchmarks have been published to compare calculated values of absorbed dose to experimentally determined values. However, there is a lack of absolute benchmarks, i.e. benchmarks without involved normalization which may cause some quantities to be cancelled. Therefore, at the Physikalisch-Technische Bundesanstalt a benchmark experiment was performed, which aimed at the absolute verification of radiation transport calculations for dosimetry in radiation therapy. A thimble-type ionization chamber in a solid phantom was irradiated by high-energy bremsstrahlung and the mean absorbed dose in the sensitive volume was measured per incident electron of the target. The characteristics of the accelerator and experimental setup were precisely determined and the results of a corresponding Monte Carlo simulation with EGSnrc are presented within this study. For a meaningful comparison, an analysis of the uncertainty of the Monte Carlo simulation is necessary. In this study uncertainties with regard to the simulation geometry, the radiation source, transport options of the Monte Carlo code and specific interaction cross sections are investigated, applying the general methodology of the Guide to the expression of uncertainty in measurement. Besides studying the general influence of changes in transport options of the EGSnrc code, uncertainties are analyzed by estimating the sensitivity coefficients of various input quantities in a first step. Secondly, standard uncertainties are assigned to each quantity which are known from the experiment, e.g. uncertainties for geometric dimensions. Data for more fundamental quantities such as photon cross sections and the I-value of electron stopping powers are taken from literature. The significant uncertainty contributions are identified as
Kumada, H; Saito, K; Nakamura, T; Sakae, T; Sakurai, H; Matsumura, A; Ono, K
2011-12-01
Treatment planning for boron neutron capture therapy generally utilizes Monte-Carlo methods for calculation of the dose distribution. The new treatment planning system JCDS-FX employs the multi-purpose Monte-Carlo code PHITS to calculate the dose distribution. JCDS-FX allows building a precise voxel model consisting of pixel-based voxel cells as small as 0.4×0.4×2.0 mm³ in order to perform high-accuracy dose estimation, e.g. for the purpose of calculating the dose distribution in a human body. However, such miniaturization of the voxel size increases the calculation time considerably. The aim of this study is to investigate sophisticated modeling methods which can perform Monte-Carlo calculations for human geometry efficiently. Thus, we devised a new voxel modeling method, the "Multistep Lattice-Voxel method," which can configure a voxel model combining different voxel sizes by utilizing the lattice function repeatedly. To verify the performance of calculations with this modeling method, several calculations for human geometry were carried out. The results demonstrated that the Multistep Lattice-Voxel method enabled the precise voxel model to reduce calculation time substantially while maintaining high accuracy of dose estimation.
Particle Swarm Optimization based predictive control of Proton Exchange Membrane Fuel Cell (PEMFC)
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
Proton Exchange Membrane Fuel Cells (PEMFCs) are a main focus of current fuel-cell development as power sources because they are capable of higher power density and faster start-up than other fuel cells. The humidification system and output performance of a PEMFC stack are briefly analyzed. Predictive control of the PEMFC based on a Support Vector Regression Machine (SVRM) is presented and the SVRM is constructed. The processing plant is modelled with the SVRM and the predictive control law is obtained using Particle Swarm Optimization (PSO). The simulation results show that the SVRM together with PSO receding-horizon optimization applied to PEMFC predictive control yields good performance.
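A toy particle swarm optimization loop of the kind used for the receding-horizon step can be sketched as below. The quadratic cost is a hypothetical stand-in for the SVRM-based predictive-control objective; all parameter values are generic PSO defaults, not from the paper:

```python
import random

random.seed(0)

def cost(u):
    # Hypothetical stand-in for the predictive-control objective; in the
    # paper this would be built from the SVRM plant model's predictions.
    return sum((x - 3.0) ** 2 for x in u)

def pso(dim=2, particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-10.0, 10.0) for _ in range(dim)]
           for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    gbest = min(pbest, key=cost)[:]
    gbest_cost = cost(gbest)
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                # Inertia plus cognitive and social pulls.
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest

u_opt = pso()   # settles near the minimizer of the cost
print(u_opt)
```

In receding-horizon use, this search would be rerun at every sampling instant over the control sequence, with only the first control move applied.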
Doronin, Alexander; Rushmeier, Holly E.; Meglinski, Igor; Bykov, Alexander V.
2016-03-01
We present a new Monte Carlo based approach for modelling the Bidirectional Scattering-Surface Reflectance Distribution Function (BSSRDF) for accurate rendering of human skin appearance. Variations in both skin tissue structure and the major chromophores are taken into account, corresponding to different ethnic and age groups. The computational solution utilizes HTML5, accelerated by graphics processing units (GPUs), and is therefore convenient for practical use on most modern computer-based devices and operating systems. Results of simulating human skin reflectance spectra, the corresponding skin colours, and examples of 3D face rendering are presented and compared with the results of phantom studies.
Monte Carlo study of single-barrier structure based on exclusion model full counting statistics
Institute of Scientific and Technical Information of China (English)
Chen Hua; Du Lei; Qu Cheng-Li; He Liang; Chen Wen-Hao; Sun Peng
2011-01-01
Unlike most full counting statistics work, which focuses on computing higher-order cumulants from a cumulant generating function in electrical structures, a Monte Carlo simulation of a single-barrier structure is performed to obtain time series for two widely applicable exclusion models: the counter-flows model and the tunnel model. Using high-order spectrum analysis in Matlab, the Monte Carlo method is validated: the first four cumulants extracted from the time series agree with those from the cumulant generating function. Comparison of the counter-flows and tunnel models in a single-barrier structure shows that the essential difference between them is that the Pauli principle holds strictly in the former but only statistically in the latter.
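The cumulant extraction can be illustrated with a toy time series. The binomial transfer process below is an assumed simplification of the exclusion models, and the spectrum-analysis step is replaced by direct central-moment formulas:

```python
import random

random.seed(7)

# Toy barrier: per time window, N_ATT independent transfer attempts each
# succeed with probability P -- an assumed simplification of the exclusion
# models, giving binomial (sub-Poissonian) counting statistics.
N_ATT, P, WINDOWS = 50, 0.2, 20_000
counts = [sum(random.random() < P for _ in range(N_ATT))
          for _ in range(WINDOWS)]

mean = sum(counts) / WINDOWS

def cm(k):
    # k-th central moment of the transfer-count time series.
    return sum((c - mean) ** k for c in counts) / WINDOWS

# First four cumulants from the central moments.
k1 = mean
k2 = cm(2)
k3 = cm(3)
k4 = cm(4) - 3 * cm(2) ** 2

# Binomial prediction: Fano factor k2/k1 = 1 - P, i.e. suppressed noise,
# the signature one expects when Pauli blocking limits transmission.
print(k1, k2 / k1)
```

With P = 0.2 the Fano factor should come out near 0.8, well below the Poissonian value of 1.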
Numerical Study of Light Transport in Apple Models Based on Monte Carlo Simulations
Directory of Open Access Journals (Sweden)
Mohamed Lamine Askoura
2015-12-01
This paper reports on the quantification of light transport in apple models using Monte Carlo simulations. To this end, the apple was modeled as a two-layer spherical model comprising skin and flesh bulk tissues. The optical properties of both tissue types used to generate the Monte Carlo data were collected from the literature, and selected to cover a range of values related to three apple varieties. Two different imaging-tissue setups were simulated in order to show the role of the skin in steady-state backscattering images, spatially-resolved reflectance profiles, and the assessment of flesh optical properties using an inverse nonlinear least-squares fitting algorithm. Simulation results suggest that apple skin cannot be ignored when a Visible/Near-Infrared (Vis/NIR) steady-state imaging setup is used for investigating quality attributes of apples. They also help to improve optical inspection techniques for horticultural products.
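A bare-bones Monte Carlo photon walk of the kind underlying such simulations can be sketched for a single semi-infinite layer with isotropic scattering. The optical coefficients are illustrative, not literature values for any apple variety, and refraction at the surface is ignored:

```python
import math
import random

random.seed(3)

# Illustrative optical properties of a single semi-infinite tissue layer.
MU_A, MU_S = 0.5, 10.0              # absorption / scattering, 1/cm
MU_T = MU_A + MU_S
ALBEDO = MU_S / MU_T

def one_photon():
    z, uz, w = 0.0, 1.0, 1.0        # depth, direction cosine, weight
    while w > 1e-4:                 # crude cutoff (no Russian roulette)
        # Free path length sampled from an exponential distribution.
        step = -math.log(1.0 - random.random()) / MU_T
        z += uz * step
        if z < 0.0:                 # escaped back through the surface
            return w
        w *= ALBEDO                 # deposit the absorbed fraction
        uz = random.uniform(-1.0, 1.0)  # isotropic redirection
    return 0.0

N = 20_000
R_diffuse = sum(one_photon() for _ in range(N)) / N
print(R_diffuse)                    # diffuse reflectance estimate
```

A two-layer apple model would add a skin/flesh interface with its own thickness and coefficients, which is precisely the effect the paper quantifies.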
Monte Carlo-Based Dosimetry of Beta-Emitters for Intravascular Brachytherapy
Energy Technology Data Exchange (ETDEWEB)
Choi, C.K.
2002-06-25
Monte Carlo simulations for radiation dosimetry, and experimental verifications of the simulations, have been developed for the treatment geometry of intravascular brachytherapy, a form of radionuclide therapy for occluded coronary disease (restenosis). The Monte Carlo code MCNP4C has been used to calculate the radiation dose from the encapsulated array of β-emitting seeds (Sr/Y source train). Solid water phantoms have been fabricated to measure the dose on radiochromic films exposed to the beta source train for both linear and curved coronary vessel geometries. While the dose difference for the 5-degree curved vessel at the prescription point of f+2.0 mm is within the 10% guideline set by the AAPM, the difference increases dramatically to 16.85% for the 10-degree case, which requires additional adjustment for acceptable dosimetry planning. The experimental dose measurements agree well with the simulation results
Institute of Scientific and Technical Information of China (English)
张乐成; 邵梅; 迟津愉; 宁宁宁
2012-01-01
The Monte-Carlo method is an important numerical method grounded in probability and statistics, and Monte-Carlo evaluation of definite integrals is a common approach to their approximate calculation. To compute the value of the mathematical constant e (the base of the natural logarithm), this paper selects a special definite integral and evaluates it both by the Monte-Carlo method and by the Newton-Leibniz formula; comparing and analyzing the two results yields a method for computing e. Experimental results show that the algorithm is effective and offers good accuracy and time efficiency.
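The idea is easy to reproduce: pick a definite integral whose analytic value involves e, estimate it by Monte Carlo, and compare with the Newton-Leibniz value. The paper does not specify which integral it uses; the one below is an assumed example:

```python
import math
import random

random.seed(0)

# Assumed example integral: the integral of e^x over [0, 1] equals e - 1,
# so e = 1 + integral. Plain Monte Carlo: average e^U over uniform U.
N = 200_000
mc_integral = sum(math.exp(random.random()) for _ in range(N)) / N
e_mc = 1.0 + mc_integral            # Monte Carlo estimate of e

e_exact = math.e                    # Newton-Leibniz (analytic) value
print(e_mc, abs(e_mc - e_exact))
```

The error shrinks like 1/sqrt(N), which is the comparison the abstract describes between the two evaluation routes.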
Institute of Scientific and Technical Information of China (English)
张杰梁; 黄洪; 姜苏娜; 杭晨哲; 余时帆
2016-01-01
In order to analyze the uncertainty of air compressor energy efficiency measurement, and in view of the complicated mathematical model and the difficulty of approximating it with a linear model, an uncertainty evaluation method based on Mathcad and the Monte Carlo method is proposed. The choice of distribution type is verified through Monte-Carlo simulation histograms, and the relative expanded uncertainty of the measurement results is obtained. Finally, the Monte-Carlo method is compared with the GUM method. The results show that the relative expanded uncertainties of the two methods are close and meet the "1/3" requirement of the air compressor energy-efficiency grade interval.
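The Monte-Carlo (GUM Supplement 1 style) procedure can be sketched as follows. The two-input model and its distributions are invented placeholders, not the actual compressor energy-efficiency model:

```python
import random
import statistics

random.seed(11)

def model(power_kw, flow_m3min):
    # Invented placeholder for the compressor energy-efficiency model:
    # specific power = electrical power / volume flow.
    return power_kw / flow_m3min

# Propagate the input distributions through the model by sampling.
M = 100_000
draws = sorted(
    model(random.gauss(55.0, 0.5),        # power: normal uncertainty
          random.uniform(9.8, 10.2))      # flow: rectangular uncertainty
    for _ in range(M)
)
y_mean = statistics.fmean(draws)
u_y = statistics.stdev(draws)             # standard uncertainty
# 95% coverage interval read directly from the sorted draws.
lo, hi = draws[int(0.025 * M)], draws[int(0.975 * M)]
U_rel = (hi - lo) / 2.0 / y_mean          # relative expanded uncertainty
print(round(y_mean, 4), round(100.0 * U_rel, 2))
```

A histogram of `draws` is what the paper uses to check that the assumed distribution types are plausible before quoting the expanded uncertainty.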
Green-Function-Based Monte Carlo Method for Classical Fields Coupled to Fermions
Weiße, Alexander
2009-01-01
Microscopic models of classical degrees of freedom coupled to non-interacting fermions occur in many different contexts. Prominent examples from solid state physics are descriptions of colossal magnetoresistance manganites and diluted magnetic semiconductors, or auxiliary field methods for correlated electron systems. Monte Carlo simulations are vital for an understanding of such systems, but notorious for requiring the solution of the fermion problem with each change in the classical field c...
Using a Monte-Carlo-based approach to evaluate the uncertainty on fringe projection technique
Molimard, Jérôme
2013-01-01
A complete uncertainty analysis of a given fringe projection set-up has been performed using a Monte-Carlo approach, taking the calibration procedure into account. Two applications are given: at the macroscopic scale, phase noise is predominant, whilst at the microscopic scale both phase noise and calibration errors are important. Finally, the uncertainty found at the macroscopic scale is close to some experimental tests (~100 μm).
The information-based complexity of approximation problem by adaptive Monte Carlo methods
Institute of Scientific and Technical Information of China (English)
FANG GenSun; DUAN LiQin
2008-01-01
In this paper, we study the information-based complexity of the approximation problem on the multivariate Sobolev space with bounded mixed derivative MWpr,α(Td), 1 < p < ∞, in the norm of Lq(Td), 1 < q < ∞, by adaptive Monte Carlo methods. Applying the discretization technique and some properties of pseudo-s-scale, we determine the exact asymptotic orders of this problem.
Monte Carlo tests of the Rasch model based on scalability coefficients
DEFF Research Database (Denmark)
Christensen, Karl Bang; Kreiner, Svend
2010-01-01
that summarizes the number of Guttman errors in the data matrix. These coefficients are shown to yield efficient tests of the Rasch model, with p-values computed via Markov chain Monte Carlo methods. The power of the tests of unequal item discrimination, and their ability to distinguish between local dependence and unequal item discrimination, are discussed. The methods are illustrated and motivated using a simulation study and a real data example.
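The Monte Carlo p-value machinery behind such tests is generic. Below, a toy coin-flip statistic stands in for the Guttman-error-based scalability coefficient, and the simulation is plain Monte Carlo rather than the Markov chain sampling needed for the conditional Rasch distribution:

```python
import random

random.seed(5)

def mc_pvalue(t_obs, simulate_t, B=4999):
    # Simulate the statistic under the model B times; the add-one
    # correction keeps the Monte Carlo p-value valid.
    hits = sum(simulate_t() >= t_obs for _ in range(B))
    return (1 + hits) / (B + 1)

# Toy statistic standing in for a Guttman-error count: |heads - n/2|
# for n fair coin flips under the null of a fair coin.
n = 100

def simulate_t():
    return abs(sum(random.random() < 0.5 for _ in range(n)) - n / 2)

p_typical = mc_pvalue(5.0, simulate_t)    # mild deviation from the null
p_extreme = mc_pvalue(25.0, simulate_t)   # five-sigma deviation
print(p_typical, p_extreme)
```

For the Rasch case, `simulate_t` would draw data matrices with the observed margins fixed (hence the Markov chain sampler) and return the scalability coefficient.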
Development and validation of MCNPX-based Monte Carlo treatment plan verification system.
Jabbari, Iraj; Monadi, Shahram
2015-01-01
A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for the conformal and intensity-modulated radiotherapy (IMRT) plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with TiGRT planning system and reads the information which is needed for Monte Carlo calculation transferred in digital image communications in medicine-radiation therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by the MCTPV was compared with that of TiGRT planning system. The results showed well implementation of the beams configuration and patient information in this system. For quantitative evaluation of MCTPV a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for total beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed a good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plan.
Energy Technology Data Exchange (ETDEWEB)
Doolan, P [University College London, London (United Kingdom); Massachusetts General Hospital, Boston, MA (United States); Sharp, G; Testa, M; Lu, H-M [Massachusetts General Hospital, Boston, MA (United States); Bentefour, E [Ion Beam Applications (IBA), Louvain la Neuve (Belgium); Royle, G [University College London, London (United Kingdom)
2014-06-15
Purpose: Beam range uncertainty in proton treatment comes primarily from converting the patient's X-ray CT (xCT) dataset to relative stopping power (RSP). Current practices use a single curve for this conversion, produced by a stoichiometric calibration based on tissue composition data for average, healthy, adult humans, but not for the individual in question. Proton radiographs produce water-equivalent path length (WEPL) maps, dependent on the RSP of tissues within the specific patient. This work investigates the use of such WEPL maps to optimize patient-specific calibration curves for reducing beam range uncertainty. Methods: The optimization procedure works on the principle of minimizing the difference between the known WEPL map, obtained from a proton radiograph, and a digitally-reconstructed WEPL map (DRWM) through an RSP dataset, by altering the calibration curve that is used to convert the xCT into an RSP dataset. DRWMs were produced with Plastimatch, an in-house developed software, and an optimization procedure was implemented in Matlab. Tests were made on a range of systems including simulated datasets with computed WEPL maps and phantoms (anthropomorphic and real biological tissue) with WEPL maps measured by single detector proton radiography. Results: For the simulated datasets, the optimizer showed excellent results. It was able to either completely eradicate or significantly reduce the root-mean-square-error (RMSE) in the WEPL for the homogeneous phantoms (to zero for individual materials or from 1.5% to 0.2% for the simultaneous optimization of multiple materials). For the heterogeneous phantom the RMSE was reduced from 1.9% to 0.3%. Conclusion: An optimization procedure has been designed to produce patient-specific calibration curves. Test results on a range of systems with different complexities and sizes have been promising for accurate beam range control in patients. This project was funded equally by the Engineering and Physical Sciences
Commissioning of a compact laser-based proton beam line for high intensity bunches around 10 MeV
Busold, S.; Schumacher, D.; Deppert, O.; Brabetz, C.; Kroll, F.; Blažević, A.; Bagnoud, V.; Roth, M.
2014-03-01
We report on the first results of experiments with a new laser-based proton beam line at the GSI accelerator facility in Darmstadt. It delivers high current bunches at proton energies around 9.6 MeV, containing more than 10⁹ particles in less than 10 ns and with tunable energy spread down to 2.7% (ΔE/E0 at FWHM). A target normal sheath acceleration stage serves as the proton source and a pulsed solenoid provides beam collimation and energy selection. Finally, a synchronous radio frequency (rf) field is applied via an rf cavity for energy compression at a synchronous phase of -90 deg. The proton bunch is characterized at the end of the very compact beam line, only 3 m behind the laser-matter interaction point, which defines the particle source.
Analysis of accelerator based neutron spectra for BNCT using proton recoil spectroscopy
Energy Technology Data Exchange (ETDEWEB)
Wielopolski, L.; Ludewig, H.; Powell, J.R.; Raparia, D.; Alessi, J.G.; Lowenstein, D.I.
1999-03-01
Boron Neutron Capture Therapy (BNCT) is a promising binary treatment modality for high-grade primary brain tumors (glioblastoma multiforme, GM) and other cancers. BNCT employs a boron-10 containing compound that preferentially accumulates in the cancer cells in the brain. Upon neutron capture by ¹⁰B, the energetic alpha particle and triton released at the absorption site kill the cancer cell. To gain penetration depth in the brain, Fairchild proposed for this purpose the use of energetic epithermal neutrons at about 10 keV. Phase 1/2 clinical trials of BNCT for GM are underway at the Brookhaven Medical Research Reactor (BMRR) and at the MIT Reactor, using these nuclear reactors as the source of epithermal neutrons. In light of the limitations of new reactor installations, e.g. cost, safety and licensing, and their limited capability for modulating the reactor-based neutron beam energy spectra, alternative neutron sources are being contemplated for wider implementation of this modality in a hospital environment. For example, accelerator-based neutron sources offer the possibility of tailoring the neutron beams, in terms of improved depth-dose distributions, to the individual, and offer, with relative ease, the capability of modifying the neutron beam energy and port size. New concepts for compact accelerator/target configurations were published in previous work. In this work, using a Van de Graaff accelerator, the authors explored different materials for filtering and reflecting neutron beams produced by irradiating a thick Li target with 1.8 to 2.5 MeV proton beams. However, since both the yield and the maximum neutron energy emerging from the Li-7(p,n)Be-7 reaction increase with the proton beam energy, the proton energy must be optimized against filter and shielding requirements to obtain the desired epithermal neutron beam. The MCNP-4A computer code was used for the initial design studies, which were verified with benchmark
Developing and understanding a hospital-based proton facility: bringing physics into medicine.
Slater, James M
2007-08-01
From October 18 to 20, 2006, a symposium, Developing and Understanding a Hospital-based Proton Facility: Bringing Physics Into Medicine, was held at the Renaissance Esmeralda Resort and Spa, Indian Wells, California. The event was offered by the Department of Radiation Medicine at Loma Linda University (LLU), supported by the Telemedicine and Advanced Technology Research Center (TATRC) and the United States Army Medical Research and Materiel Command (USAMRMC). The meeting was intended to discuss factors involved in planning, developing, and operating a hospital-based proton treatment center. It brought together some of the most distinguished physicists, radiation biologists, and radiation oncologists in the world, and more than 100 individuals participated in the three-day educational offering. This overview reports on the event and introduces several papers written by many of the speakers from their presentations, for publication in this issue of Technology in Cancer Research and Treatment. Both the symposium and the papers are appropriate for this journal: exploitation of technology was one of the underlying themes of the symposium.
Heavy Ion SEU Cross Section Calculation Based on Proton Experimental Data, and Vice Versa
Wrobel, F; Pouget, V; Dilillo, L; Ecoffet, R; Lorfèvre, E; Bezerra, F; Brugger, M; Saigné, F
2014-01-01
The aim of this work is to provide a method to calculate single event upset (SEU) cross sections by using experimental data. Valuable tools such as PROFIT and SIMPA already focus on the calculation of the proton cross section by using heavy-ion cross-section experiments. However, there is no available tool that calculates heavy ion cross sections based on measured proton cross sections with no knowledge of the technology. We based our approach on the diffusion-collection model, with the aim of analyzing the characteristics of the transient currents that trigger SEUs. We show that experimental cross sections can be used to characterize the pulses that trigger an SEU. The experimental results also allow an empirical rule to be defined that identifies the transient currents responsible for an SEU. The SEU cross section can then be calculated for any kind of particle at any energy, with no need to know the Spice model of the cell. We applied our method to several technologies (250 nm, 90 nm and 65 nm bulk SRAMs) and we sho...
Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy
Energy Technology Data Exchange (ETDEWEB)
Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)
2012-05-15
Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-{mu}m-wide microbeams spaced by 200-400 {mu}m) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, the CT image voxel grid (a few cubic millimeters per voxel) was decoupled from the dose bin grid, which has micrometer dimensions in the direction transverse to the microbeams. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at
Energy Technology Data Exchange (ETDEWEB)
Lewis, A.; Marcus, M.A.; Ehrenberg, B.; Crespi, H.
1978-10-01
Resonance Raman spectroscopy of the retinylidene chromophore in various isotopically labeled membrane environments together with spectra of isotopically labeled model compounds demonstrates that a secondary protein interaction is present at the protonated Schiff base linkage in bacteriorhodopsin. The data indicate that although the interaction is present in all protonated bacteriorhodopsin species, it is absent in unprotonated intermediates. Furthermore, kinetic resonance Raman spectroscopy has been used to monitor the dynamics of Schiff base deprotonation as a function of pH. All results are consistent with lysine as the interacting group. A structure for the interaction is proposed in which the interacting protein group in an unprotonated configuration is complexed through the Schiff base proton to the Schiff base nitrogen. These data suggest a molecular mechanism for proton pumping and ion gate molecular regulation. In this mechanism, light causes electron redistribution in the retinylidene chromophore, which results in the deprotonation of an amino acid side chain with pK > 10.2 ± 0.3 (e.g., arginine). This induces subsequent retinal and protein conformational transitions which eventually lower the pK of the Schiff base complex from > 12 before light absorption to 10.2 ± 0.3 in microseconds after photon absorption. Finally, in this low pK state the complex can reprotonate the proton-deficient high pK group generated by light, and the complex is then reprotonated from the opposite side of the membrane.
Schiff base protonation changes in Siberian hamster ultraviolet cone pigment photointermediates.
Mooney, Victoria L; Szundi, Istvan; Lewis, James W; Yan, Elsa C Y; Kliger, David S
2012-03-27
Molecular structure and function studies of vertebrate ultraviolet (UV) cone visual pigments are needed to understand the molecular evolution of these photoreceptors, which uniquely contain unprotonated Schiff base linkages between the 11-cis-retinal chromophore and the opsin proteins. In this study, the Siberian hamster ultraviolet cone pigment (SHUV) was expressed and purified in an n-dodecyl-β-D-maltoside suspension for optical characterization. Time-resolved absorbance measurements, over a spectral range from 300 to 700 nm, were taken for the purified pigment at time delays from 30 ns to 4.64 s after photoexcitation using 7 ns pulses of 355 nm light. The resulting data were fit globally to a sum of exponential functions after noise reduction using singular-value decomposition. Four exponentials best fit the data with lifetimes of 1.4 μs, 210 μs, 47 ms, and 1 s. The first photointermediate species characterized here is an equilibrated mixture similar to the one formed after rhodopsin's Batho intermediate decays into equilibrium with its successor, BSI. The extremely large red shift of the SHUV Batho component relative to the pigment suggests that SHUV Batho has a protonated Schiff base and that the SHUV cone pigment itself has an unprotonated Schiff base. In contrast to SHUV Batho, the portion of the equilibrated mixture's spectrum corresponding to SHUV BSI is well fit by a model spectrum with an unprotonated Schiff base. The spectra of the next two photointermediate species revealed that they both have unprotonated Schiff bases and suggest they are analogous to rhodopsin's Lumi I and Lumi II species. After decay of SHUV Lumi II, the correspondence with rhodopsin photointermediates breaks down and the next photointermediate, presumably including the G protein-activating species, is a mixture of protonated and unprotonated Schiff base photointermediate species.
Zhang, Hua-Yu; Guo, Guang-Can; Sun, Fang-Wen
2016-01-01
The nitrogen-vacancy (NV) center in diamond has been widely applied for quantum information and sensing in the last decade. Based on the laser-polarization-dependent excitation of fluorescence emission, we propose a super-resolution microscopy of the NV center. A series of wide-field images of NV centers is taken with different polarizations of the linearly polarized excitation laser. The fluorescence intensity of an NV center varies with the relative angle between the excitation laser polarization and the orientation of the NV center dipole. The images pumped by different excitation laser polarizations are analyzed with a Monte Carlo method. The symmetry axis and position of each NV center are then obtained with sub-diffraction resolution.
Directory of Open Access Journals (Sweden)
Kohei Arai
2013-01-01
Monte Carlo Ray Tracing (MCRT)-based sensitivity analysis of the geophysical parameters (the atmosphere and the ocean) affecting Top of the Atmosphere (TOA) radiance in the visible to near-infrared wavelength region is conducted. The results confirm that the influence of the atmosphere is greater than that of the ocean: scattering and absorption due to aerosol particles and molecules in the atmosphere are the major contributions, followed by water vapor and ozone, while scattering due to suspended solids is the dominant contribution among the ocean parameters.
Random vibration analysis of switching apparatus based on Monte Carlo method
Institute of Scientific and Technical Information of China (English)
ZHAI Guo-fu; CHEN Ying-hua; REN Wan-bin
2007-01-01
The performance in a vibration environment of switching apparatus containing a mechanical contact is an important element when judging the apparatus's reliability. A piecewise-linear two-degrees-of-freedom mathematical model considering contact loss was built in this work, and the vibration performance of the model under random external Gaussian white noise excitation was investigated using Monte Carlo simulation in Matlab/Simulink. The simulation showed that the spectral content and statistical characteristics of the contact force agreed well with reality. The random vibration behavior of the contact system was solved using time-domain (numerical) simulation in this paper. The conclusions reached here are of great importance for the reliability design of switching apparatus.
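The contact-loss mechanism described above can be sketched in a few lines. The following is a minimal, hypothetical single-DOF analogue of the piecewise-linear contact model (not the paper's two-DOF Matlab/Simulink system); all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Single-DOF contact oscillator: the contact spring acts only while the
# contact is closed (x < gap), giving a piecewise-linear stiffness. The
# system is driven by Gaussian white noise and integrated with
# semi-implicit Euler. All parameter values are illustrative assumptions.
m, c, k = 0.01, 2.0, 1.0e4   # kg, N*s/m, N/m
k_contact = 5.0e4            # N/m, extra stiffness while the contact is closed
gap = 1.0e-4                 # m, travel before contact loss
sigma = 0.05                 # white-noise force intensity, N/sqrt(s)
dt, n = 1.0e-5, 200_000

noise = sigma * rng.standard_normal(n) / np.sqrt(dt)  # discretized white noise, N
x, v = 0.0, 0.0
contact_force = np.empty(n)
for i in range(n):
    f_c = k_contact * (gap - x) if x < gap else 0.0   # piecewise-linear contact law
    contact_force[i] = f_c
    v += (-c * v - k * x + f_c + noise[i]) / m * dt   # semi-implicit Euler step
    x += v * dt

loss_fraction = float(np.mean(contact_force == 0.0))
print(f"fraction of time with contact loss: {loss_fraction:.3f}")
```

The statistics of `contact_force` (spectral content, loss fraction) are exactly the kind of quantities the abstract compares against measurement.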
Energy Technology Data Exchange (ETDEWEB)
Al-Subeihi, Ala' A.A., E-mail: subeihi@yahoo.com [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); BEN-HAYYAN-Aqaba International Laboratories, Aqaba Special Economic Zone Authority (ASEZA), P. O. Box 2565, Aqaba 77110 (Jordan); Alhusainy, Wasma; Kiwamoto, Reiko; Spenkelink, Bert [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Bladeren, Peter J. van [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Nestec S.A., Avenue Nestlé 55, 1800 Vevey (Switzerland); Rietjens, Ivonne M.C.M.; Punt, Ans [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands)
2015-03-01
The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions or by combining kinetic constants obtained for specific isoenzymes with literature reported human variation in the activity of these enzymes. The results of the study indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions on interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment. - Highlights: • Interindividual human differences in methyleugenol bioactivation were simulated. • This was done using in vitro incubations, PBK modeling
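The percentile-ratio derivation of a CSAF can be illustrated with a short simulation. The lognormal population distribution below is a stand-in assumption, not the paper's fitted PBK output; only the CSAF arithmetic (a high percentile divided by the median) follows the text.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population distribution of ultimate-metabolite formation;
# the lognormal shape and spread are assumptions for illustration only.
formation = rng.lognormal(mean=0.0, sigma=0.7, size=100_000)

p50, p90, p99 = np.percentile(formation, [50, 90, 99])
csaf_90 = p90 / p50  # CSAF covering 90% of the population
csaf_99 = p99 / p50  # CSAF covering 99% of the population

print(f"CSAF at 90th percentile: {csaf_90:.2f}")
print(f"CSAF at 99th percentile: {csaf_99:.2f}")
```

Comparing such a CSAF against the default kinetic uncertainty factor of 3.16 is then a one-line check.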
Colorimetric Sensors for Anion Recognition Based on the Proton Transfer Signaling Mechanism
Institute of Scientific and Technical Information of China (English)
HUANG Xiaohuan; HE Yongbing; CHEN Zhihong; HU Chenguang
2009-01-01
Phenolic-hydroxyl-based sensors N,N-bi(salicylidene)-1,2-phenylenediamine (1), 2-[(4-nitrophenylimino)methyl]phenol (2) and 2-[(p-tolylimino)methyl]phenol (3), bearing Schiff-base groups, can act as selective colorimetric sensors for anions. They exhibit distinct color changes in the presence of fluoride but show no response to other halide anions. They also respond to acetate, which is clearly visible to the naked eye. The selectivity can be rationalized by the proton transfer signaling mechanism. Sensor 1, containing plural phenolic groups, undergoes stepwise deprotonation of its two O-H fragments when it interacts with excess fluoride, as confirmed by UV-Vis, 1H NMR and 19F NMR spectroscopy.
Proton recoil telescope based on diamond detectors for measurement of fusion neutrons
Caiffi, B; Ripani, M; Pillon, M; Taiuti, M
2015-01-01
Diamonds are very promising candidates for neutron diagnostics in harsh environments such as fusion reactors. This is primarily because of their radiation hardness, which exceeds that of silicon by an order of magnitude. Also, in comparison to the standard on-line neutron diagnostics (fission chambers, silicon-based detectors, scintillators), diamonds are less sensitive to gamma rays, which represent a huge background in fusion devices. Finally, their low leakage current at high temperature suppresses the detector's intrinsic noise. In this talk a CVD diamond-based detector is proposed for the measurement of the 14 MeV neutrons from the D-T fusion reaction. The detector was arranged in a proton recoil telescope configuration, featuring a plastic converter in front of the sensitive volume in order to induce the (n,p) reaction. The segmentation of the sensitive volume, achieved by using two crystals, allowed coincidence measurements, which suppressed the neutron elastic scattering backg...
Proton-induced single electron capture on DNA/RNA bases.
Champion, C; Weck, P F; Lekadir, H; Galassi, M E; Fojón, O A; Abufager, P; Rivarola, R D; Hanssen, J
2012-05-21
In this work, we report total cross sections for the single electron capture process induced on DNA/RNA bases by high-energy protons. The calculations are performed within both the continuum distorted wave and the continuum distorted wave-eikonal initial state approximations. The biological targets are described within the framework of self-consistent methods based on the complete neglect of differential overlap model, whose accuracy was first checked for simpler bio-molecules such as water vapour. Furthermore, the multi-electronic problem investigated here is reduced to a mono-electronic one using a version of the independent electron approximation. Finally, the theoretical predictions obtained are confronted with the scarce experimental results available.
A lattice-based Monte Carlo evaluation of Canada Deuterium Uranium-6 safety parameters
Energy Technology Data Exchange (ETDEWEB)
Kim, Yong Hee; Hartanto, Donny; Kim, Woo Song [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of)
2016-06-15
Important safety parameters such as the fuel temperature coefficient (FTC) and the power coefficient of reactivity (PCR) of the CANada Deuterium Uranium (CANDU-6) reactor have been evaluated using the Monte Carlo method. For accurate analysis of the parameters, the Doppler broadening rejection correction scheme was implemented in the MCNPX code to account for the thermal motion of the heavy uranium-238 nucleus in the neutron-U scattering reactions. In this work, a standard fuel lattice has been modeled and the fuel is depleted using MCNPX. The FTC value is evaluated for several burnup points including the mid-burnup representing a near-equilibrium core. The Doppler effect has been evaluated using several cross-section libraries such as ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1.1, and JENDL-4.0. The PCR value is also evaluated at mid-burnup conditions to characterize the safety features of an equilibrium CANDU-6 reactor. To improve the reliability of the Monte Carlo calculations, we considered a huge number of neutron histories in this work and the standard deviation of the k-infinity values is only 0.5-1 pcm.
Energy Technology Data Exchange (ETDEWEB)
Burke, TImothy P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Martin, William R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-11-19
Kernel Density Estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. Kernel density estimators are an alternative to histogram tallies for obtaining global solutions in Monte Carlo tallies. With KDEs, a single event, either a collision or particle track, can contribute to the score at multiple tally points, with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance when compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed source shielding applications. However, little work has been done to obtain reaction rates using KDEs. This paper introduces a new form of the mean-free-path (MFP) KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies to the solution. An ad-hoc solution to these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.
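The advantage KDEs offer over histogram tallies can be illustrated in one dimension. The exponential "collision-site" distribution and the Gaussian kernel below are illustrative assumptions; the paper's MFP kernel and transport geometry are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D analogue: "collision sites" drawn from an exponential attenuation
# law with mean free path 1.0 (illustrative, not a transport code).
collisions = rng.exponential(scale=1.0, size=50_000)

# Tally points where the collision density is wanted
x = np.linspace(0.0, 5.0, 101)

# Gaussian-kernel estimate: every collision scores at all nearby tally
# points, so the per-point uncertainty does not grow as the grid is
# refined, unlike a histogram where each event lands in exactly one bin.
h = 0.1  # kernel bandwidth
z = (x[:, None] - collisions[None, :]) / h
kde = np.exp(-0.5 * z**2).sum(axis=1) / (collisions.size * h * np.sqrt(2.0 * np.pi))

print(f"estimated density at x=1: {kde[20]:.3f} (true value e^-1 = {np.exp(-1):.3f})")
```

Refining `x` to 1001 points leaves each point's statistical uncertainty essentially unchanged, which is the resolution-independence property the abstract describes.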
Yuan, L G; Tang, Y Z; Zhang, Y X; Sun, J; Luo, X Y; Zhu, L X; Zhang, Z; Wang, R; Liu, Y H
2015-08-01
The aim was to estimate the valnemulin pharmacokinetic profile in a swine population and to assess a dosage regimen that increases the likelihood of an optimal outcome. The study was performed in 22 culled sows by p.o. administration and in 80 growing-finishing pigs by i.v. administration, at a single dose of 10 mg/kg, to develop a population pharmacokinetic model and a Monte Carlo simulation. The relationships among plasma concentration, dose, and time of valnemulin in pigs were C(i.v.) = X(0)(8.4191 × 10^(-4) × e^(-0.2371t) + 1.2788 × 10^(-5) × e^(-0.0069t)) after i.v. administration and C(p.o.) = X(0)(-8.4964 × 10^(-4) × e^(-0.5840t) + 8.4195 × e^(-0.2371t) + 7.6869 × 10^(-6) × e^(-0.0069t)) after p.o. administration. Monte Carlo simulation showed that T(>MIC) exceeded 24 h when a single daily dose of 13.5 mg/kg BW was administered p.o. and the MIC was 0.031 mg/L. It was concluded that the current dosage regimen of 10-12 mg/kg BW leads to valnemulin underexposure if the MIC is above 0.031 mg/L, which could increase the risk of treatment failure and/or drug resistance.
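A T(>MIC) Monte Carlo assessment of the kind described can be sketched as follows. This uses a hypothetical one-compartment model with an assumed volume of distribution and assumed between-pig variability; only the MIC, the dose, and the dominant elimination exponent are taken from the abstract, so the resulting numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

mic = 0.031        # mg/L, MIC value from the abstract
dose = 13.5        # mg/kg BW, the single daily p.o. dose considered
n = 10_000         # simulated pigs

# Hypothetical one-compartment disposition with log-normal between-pig
# variability; ke matches the dominant exponent above, but the volume of
# distribution and the variability magnitudes are assumptions.
ke = rng.lognormal(np.log(0.2371), 0.25, n)   # 1/h, elimination rate constant
vd = rng.lognormal(np.log(0.9), 0.25, n)      # L/kg (assumed)

c0 = dose / vd                                 # peak concentration, mg/L
t_above = np.where(c0 > mic, np.log(c0 / mic) / ke, 0.0)  # hours above MIC

pta = float(np.mean(t_above >= 24.0))  # probability of attaining T>MIC >= 24 h
print(f"median T>MIC: {np.median(t_above):.1f} h, PTA(24 h): {pta:.2f}")
```

The probability-of-target-attainment (PTA) summary is the standard way such simulations translate a population PK model into a dosing recommendation.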
Graphene oxide based nanohybrid proton exchange membranes for fuel cell applications: An overview.
Pandey, Ravi P; Shukla, Geetanjali; Manohar, Murli; Shahi, Vinod K
2017-02-01
In the context of many applications, such as polymer composites, energy-related materials, sensors, 'paper'-like materials, field-effect transistors (FET), and biomedical applications, chemically modified graphene has been broadly studied during the last decade, due to its excellent electrical, mechanical, and thermal properties. The presence of reactive oxygen functional groups in graphene oxide (GO), which enable chemical functionalization, makes it a good candidate for diversified applications. The main objectives for developing a GO-based nanohybrid proton exchange membrane (PEM) include: improved self-humidification (water retention ability), reduced fuel crossover (electro-osmotic drag), improved stabilities (mechanical, thermal, and chemical), enhanced proton conductivity, and processability for the preparation of the membrane-electrode assembly. Research carried out on this topic may be divided into protocols for covalent grafting of functional groups on the GO matrix, preparation of free-standing PEMs or choice of a suitable polymer matrix, covalent or hydrogen bonding between GO and the polymer matrix, etc. Herein, we present a brief literature survey on GO-based nanohybrid PEMs for fuel cell applications. Different protocols were adopted to produce functionalized GO-based materials and prepare their free-standing films or disperse these materials in various polymer matrices with suitable interactions. This review article critically discusses the suitability of these PEMs for fuel cell applications in terms of their dependence on the intrinsic properties of the nanohybrid PEMs. Potential applications of these nanohybrid PEMs and current challenges are also provided, along with future guidelines for developing GO-based nanohybrid PEMs as promising materials for fuel cell applications.
Proton and carbon ion radiotherapy for primary brain tumors and tumors of the skull base
Energy Technology Data Exchange (ETDEWEB)
Combs, Stephanie E.; Kessel, Kerstin; Habermehl, Daniel; Debus, Jurgen [Univ. Hospital of Heidelberg, Dept. of Radiation Oncology, Heidelberg (Germany)], e-mail: Stephanie.Combs@med.uni-heidelberg.de; Haberer, Thomas [Heidelberger Ionenstrahl Therapiezentrum (HIT), Heidelberg (Germany); Jaekel, Oliver [Univ. Hospital of Heidelberg, Dept. of Radiation Oncology, Heidelberg (Germany); Heidelberger Ionenstrahl Therapiezentrum (HIT), Heidelberg (Germany)
2013-10-15
To analyze clinical concepts, toxicity and treatment outcome in patients with brain and skull base tumors treated with photons and particle therapy. Material and methods: In total, 260 patients with brain tumors and tumors of the skull base were treated at the Heidelberg Ion Therapy Center (HIT). Patients enrolled in and randomized within prospective clinical trials, as well as patients with bony or soft tissue tumors, are not included in this analysis. Treatment was delivered as protons, carbon ions, or combinations of photons and a carbon ion boost. All patients are included in a tight follow-up program. The median follow-up time is 12 months (range 2-39 months). Results: Main histologies included meningioma (n = 107) for skull base lesions, pituitary adenomas (n = 14), low-grade gliomas (n = 51) as well as high-grade gliomas (n = 55) for brain tumors. In all patients treatment could be completed without any unexpected severe toxicities. No side effects > CTC Grade III were observed. To date, no severe late toxicities have been observed; however, for endpoints such as secondary malignancies or neurocognitive side effects the follow-up time still remains too short. Local recurrences were mainly seen in the group of high-grade gliomas or atypical meningiomas; for benign skull base meningiomas, to date, no recurrences have been observed during follow-up. Conclusion: The specific benefit of particle therapy may lie in a reduced risk of secondary malignancies as well as improved neurocognitive outcome and quality of life (QOL); thus, longer follow-up will be necessary to confirm these endpoints. Indication-specific trials on meningiomas and gliomas are underway to elucidate the role of protons and carbon ions in these indications.
Neutrons in proton pencil beam scanning: parameterization of energy, quality factors and RBE
Schneider, Uwe; Hälg, Roger A.; Baiocco, Giorgio; Lomax, Tony
2016-08-01
The biological effectiveness of neutrons produced during proton therapy in inducing cancer is unknown, but potentially large. In particular, since neutron biological effectiveness is energy dependent, it is necessary to estimate not only the dose but also the energy spectra, in order to obtain quantities which could serve as a measure of the biological effectiveness and to test current models and new approaches against epidemiological studies on cancer induction after proton therapy. For patients treated with proton pencil beam scanning, this work aims to predict the spatially localized neutron energies, the effective quality factor, the weighting factor according to ICRP, and two RBE values, the first obtained from the saturation-corrected dose-mean lineal energy and the second from DSB cluster induction. A proton pencil beam was simulated with the GEANT Monte Carlo code. Based on the simulated neutron spectra for three different proton beam energies, a parameterization of energy, quality factors and RBE was calculated. The pencil beam algorithm used for treatment planning at PSI has been extended using the developed parameterizations in order to calculate the spatially localized neutron energy, quality factors and RBE for each treated patient. The parameterization reproduces the neutron energy (quantified in two energy bins), the quality factors and the RBE with satisfactory precision up to 85 cm away from the proton pencil beam when compared to the results of 3D Monte Carlo simulations. The root mean square error of the energy estimate between the Monte Carlo based results and the parameterization is 3.9%; for the quality factor and RBE estimates it is smaller than 0.9%. The model was successfully integrated into the PSI treatment planning system. The parameterizations for neutron energy, quality factors and RBE were found to be independent of proton energy in the investigated energy range of interest for proton therapy. The pencil beam algorithm has
Collimator scatter and 2D dosimetry in small proton beams
van Luijk, P.; van 't Veld, A.A.; Zelle, H.D.; Schippers, J.M.
2001-01-01
Monte Carlo simulations have been performed to determine the influence of collimator-scattered protons from a 150 MeV proton beam on the dose distribution behind a collimator. Slit-shaped collimators with apertures between 2 and 20 mm have been simulated. The Monte Carlo code GEANT 3.21 has been val
Manifestation of proton structure in ridge-like correlations in high-energy proton-proton collisions
Kubiczek, Patryk
2015-01-01
Recently, the CMS collaboration reported long-range-in-rapidity, near-side ('ridge-like') angular correlations in high-energy proton-proton collisions, the so-called ridge effect. This surprising observation suggests the presence of a collective flow that resembles the one believed to produce a similar correlation hydrodynamically in heavy-ion collisions. If the hydrodynamic description is valid, then the effect is triggered by the initial spatial anisotropy of the colliding matter. Estimating this anisotropy within different models of the proton's internal structure, in comparison with measured angular correlations in high-energy proton-proton collision data, could in principle discriminate between different proton models. Inspired by recent theoretical developments, we propose several phenomenological models of the proton structure. Subsequently, we calculate the anisotropy coefficients of the dense matter formed in proton-proton collisions within the formalism of the Monte Carlo Glauber model. We find that some p...
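A Monte Carlo Glauber estimate of initial-state anisotropy reduces to computing eccentricity coefficients from sampled transverse positions. The three-hot-spot proton below, with assumed widths and smearing, is one toy realization of such a model, not any of the specific models the paper proposes.

```python
import numpy as np

rng = np.random.default_rng(7)

def eccentricity(x, y, n=2):
    """n-th spatial anisotropy coefficient eps_n of a set of transverse points."""
    x = x - x.mean()          # recentre on the centre of mass
    y = y - y.mean()
    r2 = x**2 + y**2
    phi = np.arctan2(y, x)
    return np.hypot((r2 * np.cos(n * phi)).mean(),
                    (r2 * np.sin(n * phi)).mean()) / r2.mean()

# Toy "three hot-spot" proton: constituent positions as Gaussian blobs.
# Hot-spot spread (0.5 fm) and smearing (0.2 fm) are illustrative assumptions.
eps2 = []
for _ in range(2000):
    spots = rng.normal(0.0, 0.5, size=(3, 2))                         # hot-spot centres
    pts = np.concatenate([s + rng.normal(0.0, 0.2, size=(50, 2)) for s in spots])
    eps2.append(eccentricity(pts[:, 0], pts[:, 1]))

print(f"mean eps2 over events: {np.mean(eps2):.3f}")
```

Event-by-event fluctuations of the finite number of hot spots are what generate a nonzero mean eccentricity even for an on-average round proton.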
Directory of Open Access Journals (Sweden)
Anna Russo
Short peptides can be designed in silico and synthesized through automated techniques, making them advantageous and versatile protein binders. A number of docking-based algorithms allow for a computational screening of peptides as binders. Here we developed ex-novo peptides targeting the maltose site of the Maltose Binding Protein, the prototypical system for the study of protein-ligand recognition. We used a Monte Carlo based protocol to computationally evolve a set of octapeptides starting from a polyalanine sequence. We screened the candidate peptides in silico and characterized their binding abilities by surface plasmon resonance, fluorescence and electrospray ionization mass spectrometry assays. These experiments showed the designed binders to recognize their target with micromolar affinity. We finally discuss the obtained results in the light of further improvements in the ex-novo optimization of peptide-based binders.
Energy Technology Data Exchange (ETDEWEB)
Attili, A; Vignati, A; Giordanengo, S [Istituto Nazionale di Fisica Nucleare, Sez. Torino, Torino (Italy); Kraan, A [Istituto Nazionale di Fisica Nucleare, Sez. Pisa, Pisa (Italy); Universita degli Studi di Pisa, Pisa (Italy); Dalmasso, F [Istituto Nazionale di Fisica Nucleare, Sez. Torino, Torino (Italy); Universita degli Studi di Torino, Torino (Italy); Battistoni, G [Istituto Nazionale di Fisica Nucleare, Sez. Milano, Milano (Italy)
2015-06-15
Purpose: Ion beam therapy is sensitive to uncertainties from treatment planning and dose delivery. PET imaging of induced positron emitter distributions is a practical approach for in vivo, in situ verification of ion beam treatments. Treatment verification is usually done by comparing measured activity distributions with reference distributions evaluated in nominal conditions. Although such comparisons give valuable information on treatment quality, a proper clinical evaluation of the treatment ultimately relies on the knowledge of the actual delivered dose. Analytical deconvolution methods relating activity and dose have been studied in this context, but were not clinically applied. In this work we present a feasibility study of an alternative approach for dose reconstruction from activity data, which is based on relating variations in accumulated activity to tissue density variations. Methods: First, reference distributions of dose and activity were calculated from the treatment plan and CT data. Then, the actual measured activity data were cumulatively matched with the reference activity distributions to obtain a set of activity-equivalent path lengths (AEPLs) along the rays of the pencil beams. Finally, these AEPLs were used to deform the original dose distribution, yielding the actual delivered dose. The method was tested by simulating a proton therapy treatment plan delivering 2 Gy on a homogeneous water phantom (the reference), which was compared with the same plan delivered on a phantom containing inhomogeneities. Activity and dose distributions were calculated by means of the FLUKA Monte Carlo toolkit. Results: The main features of the observed dose distribution in the inhomogeneous situation were reproduced using the AEPL approach. Variations in particle range were reproduced and the positions where these deviations originated were properly identified. Conclusions: For a simple inhomogeneous phantom the 3D dose reconstruction from PET
GPU-based Monte Carlo dust radiative transfer scheme applied to AGN
Heymann, Frank
2012-01-01
A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing-time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computer power or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons (PAH). Anisotropic scattering is treated by applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon counting procedure and at high spatial resolution by a vectorized ray-tracer. The latter allows computation of high signal-to-noise images of the objects at any frequency and arbitrary viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust...
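Sampling scattering angles from the Henyey-Greenstein phase function is a standard step in such MC radiative transfer codes, and its cumulative distribution inverts in closed form. A minimal sketch:

```python
import numpy as np

def sample_hg_mu(g, size, rng):
    """Sample mu = cos(theta) from the Henyey-Greenstein phase function
    with asymmetry parameter g, via inversion of its CDF."""
    xi = rng.random(size)
    if abs(g) < 1e-6:
        return 2.0 * xi - 1.0                      # isotropic limit g -> 0
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)   # intermediate term
    return (1.0 + g * g - s * s) / (2.0 * g)

rng = np.random.default_rng(3)
mu = sample_hg_mu(0.6, 200_000, rng)
print(f"<cos theta> = {mu.mean():.3f}  (should approach g = 0.6)")
```

The mean of the sampled cosines converges to g, which is the defining property of the asymmetry parameter and a quick sanity check on the inversion.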
Monte Carlo based dosimetry for neutron capture therapy of brain tumors
Zaidi, Lilia; Belgaid, Mohamed; Khelifi, Rachid
2016-11-01
Boron Neutron Capture Therapy (BNCT) is a biologically targeted radiation therapy for cancer which combines neutron irradiation with a tumor-targeting agent labeled with boron-10, which has a high thermal neutron capture cross section. The tumor area is subjected to the neutron irradiation. After a thermal neutron capture, the excited 11B nucleus fissions into an alpha particle and a lithium recoil nucleus. The emitted high Linear Energy Transfer (LET) particles deposit their energy within a range of about 10 μm, which is of the same order as the cell diameter [1]; at the same time, other reactions are produced by neutron activation of body constituents. In-phantom measurement of the physical dose distribution is very important for BNCT planning validation. Determination of the total absorbed dose requires complex calculations, which were carried out using the Monte Carlo MCNP code [2].
Monte Carlo based geometrical model for efficiency calculation of an n-type HPGe detector.
Cabal, Fatima Padilla; Lopez-Pino, Neivy; Bernal-Castillo, Jose Luis; Martinez-Palenzuela, Yisel; Aguilar-Mena, Jimmy; D'Alessandro, Katia; Arbelo, Yuniesky; Corrales, Yasser; Diaz, Oscar
2010-12-01
A procedure to optimize the geometrical model of an n-type detector is described. Sixteen lines from seven point sources ((241)Am, (133)Ba, (22)Na, (60)Co, (57)Co, (137)Cs and (152)Eu) placed at three different source-to-detector distances (10, 20 and 30 cm) were used to calibrate a low-background gamma spectrometer between 26 and 1408 keV. Direct Monte Carlo techniques using the MCNPX 2.6 and GEANT 4 9.2 codes, and a semi-empirical procedure, were performed to obtain theoretical efficiency curves. Since discrepancies were found between experimental and calculated data using the manufacturer parameters of the detector, a detailed study of the crystal dimensions and the geometrical configuration was carried out. The relative deviation from experimental data decreases from a mean value of 18% to 4% after the parameters were optimized.
Monte Carlo based geometrical model for efficiency calculation of an n-type HPGe detector
Energy Technology Data Exchange (ETDEWEB)
Padilla Cabal, Fatima, E-mail: fpadilla@instec.c [Instituto Superior de Tecnologias y Ciencias Aplicadas, ' Quinta de los Molinos' Ave. Salvador Allende, esq. Luaces, Plaza de la Revolucion, Ciudad de la Habana, CP 10400 (Cuba); Lopez-Pino, Neivy; Luis Bernal-Castillo, Jose; Martinez-Palenzuela, Yisel; Aguilar-Mena, Jimmy; D' Alessandro, Katia; Arbelo, Yuniesky; Corrales, Yasser; Diaz, Oscar [Instituto Superior de Tecnologias y Ciencias Aplicadas, ' Quinta de los Molinos' Ave. Salvador Allende, esq. Luaces, Plaza de la Revolucion, Ciudad de la Habana, CP 10400 (Cuba)
2010-12-15
A procedure to optimize the geometrical model of an n-type detector is described. Sixteen lines from seven point sources ({sup 241}Am, {sup 133}Ba, {sup 22}Na, {sup 60}Co, {sup 57}Co, {sup 137}Cs and {sup 152}Eu) placed at three different source-to-detector distances (10, 20 and 30 cm) were used to calibrate a low-background gamma spectrometer between 26 and 1408 keV. Direct Monte Carlo techniques using the MCNPX 2.6 and GEANT 4 9.2 codes, and a semi-empirical procedure, were performed to obtain theoretical efficiency curves. Since discrepancies were found between experimental and calculated data using the manufacturer parameters of the detector, a detailed study of the crystal dimensions and the geometrical configuration was carried out. The relative deviation from experimental data decreases from a mean value of 18% to 4% after the parameters were optimized.
Experimental validation of a rapid Monte Carlo based micro-CT simulator
Colijn, A. P.; Zbijewski, W.; Sasov, A.; Beekman, F. J.
2004-09-01
We describe a newly developed, accelerated Monte Carlo simulator of a small-animal micro-CT scanner. Transmission measurements using aluminium slabs are employed to estimate the spectrum of the x-ray source. The simulator incorporating this spectrum is validated with micro-CT scans of physical water phantoms of various diameters, some containing stainless steel and Teflon rods. Good agreement is found between simulated and real data: the normalized error of simulated projections, as compared to the real ones, is typically smaller than 0.05. The reconstructions obtained from simulated and real data are also found to be similar. Thereafter, effects of scatter are studied using a voxelized software phantom representing a rat body. It is shown that the scatter fraction can reach tens of percent in specific areas of the body, and therefore scatter can significantly affect quantitative accuracy in small-animal CT imaging.
Web-Based Parallel Monte Carlo Simulation Platform for Financial Computation
Institute of Scientific and Technical Information of China (English)
[None listed]
2006-01-01
Using Java, Java-enabled Web and object-oriented programming technologies, a framework is designed to quickly organize a multicomputer system on an Intranet for parallel Monte Carlo simulation. The high-performance computing environment is embedded in a Web server so it can be accessed easily. Adaptive parallelism and an eager scheduling algorithm are used to realize load balancing, parallel processing and system fault tolerance. Independent-sequence pseudo-random number generator schemes keep the parallel simulation streams statistically independent. With three kinds of stock option pricing models as test instances, near-ideal speedup and accurate pricing results were obtained on the test bed. As a Web service, a high-performance financial derivative security-pricing platform has now been set up for training and study. The framework can also be used to develop other SPMD (single program, multiple data) applications. Robustness remains a major topic for further research.
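The kind of Monte Carlo kernel such a platform parallelizes is straightforward to sketch; each worker would run an instance like this on its own independent random-number stream. This is an illustrative European call pricing under geometric Brownian motion with made-up parameters, not the paper's code:

```python
import math
import numpy as np

def mc_european_call(s0, k, r, sigma, t, n_paths, seed=0):
    """Monte Carlo price of a European call under geometric Brownian
    motion -- the per-worker kernel in an SPMD pricing platform."""
    rng = np.random.default_rng(seed)        # independent stream per worker
    z = rng.standard_normal(n_paths)
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
    payoff = np.maximum(st - k, 0.0)
    return math.exp(-r * t) * payoff.mean()  # discounted expected payoff

# at-the-money call: S0 = K = 100, r = 5%, sigma = 20%, T = 1 year
price = mc_european_call(100.0, 100.0, 0.05, 0.2, 1.0, 500_000)
```

For these parameters the estimate converges to the Black-Scholes value of about 10.45; aggregating partial sums from many such workers is the load-balancing problem the framework addresses.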
McMillan, Kyle; McNitt-Gray, Michael; Ruan, Dan
2013-01-01
underestimated measurements by 1.35%–5.31% (mean difference = −3.42%, SD = 1.09%). Conclusions: This work demonstrates the feasibility of using a measurement-based kV CBCT source model to facilitate dose calculations with Monte Carlo methods for both the radiographic and CBCT mode of operation. While this initial work validates simulations against measurements for simple geometries, future work will involve utilizing the source model to investigate kV CBCT dosimetry with more complex anthropomorphic phantoms and patient specific models. PMID:24320440
Equilibrium Degree Determination Model Based on the Monte Carlo Method
Institute of Scientific and Technical Information of China (English)
朱颖; 程纪品
2012-01-01
The Monte Carlo method, also known as the statistical simulation method, is a very important class of numerical methods guided by probability and statistics theory, in which random numbers (or, more commonly, pseudo-random numbers) are used to solve many computational problems. This paper attempts to build an equilibrium-degree model for police service platforms and solve it with the Monte Carlo method; the experimental results satisfy general application requirements.
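The principle this abstract describes, using (pseudo-)random numbers to solve a computational problem, is easiest to see in the classic toy example of estimating π by rejection counting (an illustration of the method, not the paper's police-platform model):

```python
import random

def estimate_pi(n, seed=42):
    """Estimate pi by sampling points in the unit square and counting
    the fraction that fall inside the quarter unit circle."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n   # area ratio of quarter circle to square

pi_hat = estimate_pi(1_000_000)
```

The statistical error shrinks as 1/sqrt(n), which is why such methods scale well but need many samples for high precision.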
Proton exchange membrane fuel cell system diagnosis based on the signed directed graph method
Hua, Jianfeng; Lu, Languang; Ouyang, Minggao; Li, Jianqiu; Xu, Liangfei
The fuel-cell powered bus is becoming the favored choice for electric vehicles because of its extended driving range, zero emissions, and high energy conversion efficiency when compared with battery-operated electric vehicles. In China, a demonstration program for the fuel cell bus fleet operated at the Beijing Olympics in 2008 and the Shanghai Expo in 2010. It is necessary to develop comprehensive proton exchange membrane fuel cell (PEMFC) diagnostic tools to increase the reliability of these systems. It is especially critical for fuel-cell city buses serving large numbers of passengers using public transportation. This paper presents a diagnostic analysis and implementation study based on the signed directed graph (SDG) method for the fuel-cell system. This diagnostic system was successfully implemented in the fuel-cell bus fleet at the Shanghai Expo in 2010.
DEFF Research Database (Denmark)
Cleemann, Lars Nilausen; Buazar, F.; Li, Qingfeng;
2013-01-01
Degradation of carbon supported platinum catalysts is a major failure mode for the long term durability of high temperature proton exchange membrane fuel cells based on phosphoric acid doped polybenzimidazole membranes. With Vulcan carbon black as a reference, thermally treated carbon black...... and multi‐walled carbon nanotubes were used as supports for electrode catalysts and evaluated in accelerated durability tests under potential cycling at 150 °C. Measurements of open circuit voltage, area specific resistance and hydrogen permeation through the membrane were carried out, indicating little...... contribution of the membrane degradation to the performance losses during the potential cycling tests. As the major mechanism of the fuel cell performance degradation, the electrochemical active area of the cathodic catalysts showed a steady decrease in the cyclic voltammetric measurements, which was also...
DEFF Research Database (Denmark)
To achieve high temperature operation of proton exchange membrane fuel cells (PEMFC), preferably under ambient pressure, phosphoric acid doped polybenzimidazole (PBI) membrane represents an effective approach, which in recent years has motivated extensive research activities with great progress....... As a critical concern, issues of long term durability of PBI based fuel cells are addressed in this talk, including oxidative degradation of the polymer, mechanical failures of the membrane, acid leaching out, corrosion of carbon support and sintering of catalysts particles. Excellent polymer durability has...... observed under continuous operation with hydrogen and air at 150-160 °C, with a fuel cell performance degradation rate of 5-10 µV/h. Improvement of the membrane performance such as mechanical strength, swelling and oxidative stability has been achieved by exploring the polymer chemistry, i.e. covalently...
Institute of Scientific and Technical Information of China (English)
[None listed]
2001-01-01
The molecular first hyperpolarizabilities (β) and electronic properties of some azulenic retinal analogues and their derivatives have been investigated theoretically by employing semiempirical approaches. The results indicate that the protonated Schiff bases (PSB) of the 2-substituted azulenic retinal analogues possess extremely large negative β values and very good transparency. These can be attributed to the large difference between the ground-state dipole moment and the first excited-state dipole moment according to the electronic property analysis. The characteristic blue-shifted absorption in polar solvents of the 2-substituted PSB chromophores can be well explained by the negative solvatochromic effects. The largest calculated |μβ| value can reach the magnitude of 10⁻⁴⁴ esu, which is close to the highest reported values of synthesized chromophores.
DSC and conductivity studies on PVA based proton conducting gel electrolytes
Indian Academy of Sciences (India)
S L Agrawal; Arvind Awadhia
2004-12-01
An attempt has been made in the present work to prepare polyvinyl alcohol (PVA) based proton conducting gel electrolytes in ammonium thiocyanate (NH4SCN) solution and to characterize them. DSC studies affirm the formation of gels along with the presence of partial complexes. The Cole–Cole plots exhibit maximum ionic conductivity (2.58 × 10⁻³ S cm⁻¹) for gel samples containing 6 wt% of PVA. The conductivity of the gel electrolytes exhibits liquid-like behaviour at low polymer concentrations, while the behaviour is seen to be affected by the formation of PVA–NH4SCN complexes upon increase in polymer content beyond 5 wt%. The temperature dependence of the ionic conductivity exhibits Vogel–Tamman–Fulcher (VTF) behaviour.
High temperature proton exchange membranes based on polybenzimidazoles for fuel cells
DEFF Research Database (Denmark)
Li, Qingfeng; Jensen, Jens Oluf; Savinell, Robert F
2009-01-01
To achieve high temperature operation of proton exchange membrane fuel cells (PEMFC), preferably under ambient pressure, acid–base polymer membranes represent an effective approach. The phosphoric acid-doped polybenzimidazole membrane seems so far the most successful system in the field. It has...... in recent years motivated extensive research activities with great progress. This treatise is devoted to updating the development, covering polymer synthesis, membrane casting, physicochemical characterizations and fuel cell technologies. To optimize the membrane properties, high molecular weight polymers...... with synthetically modified or N-substituted structures have been synthesized. Techniques for membrane casting from organic solutions and directly from acid solutions have been developed. Ionic and covalent cross-linking as well as inorganic–organic composites has been explored. Membrane characterizations...
Proton-air and proton-proton cross sections
Directory of Open Access Journals (Sweden)
Ulrich Ralf
2013-06-01
Different attempts to measure hadronic cross sections with cosmic ray data are reviewed. The major results are compared to each other and the differences in the corresponding analyses are discussed. Besides some important differences, it is crucial to see that all analyses are based on the same fundamental relation of longitudinal air shower development to the observed fluctuation of experimental observables. Furthermore, the relation of the measured proton-air to the more fundamental proton-proton cross section is discussed. The current global picture combines hadronic proton-proton cross section data from accelerator and cosmic ray measurements and indicates a good consistency with predictions of models up to the highest energies.
Influence of Geant4 parameters on proton dose distribution
Directory of Open Access Journals (Sweden)
Asad Merouani
2015-09-01
Purpose: Proton therapy offers great precision in radiation dose delivery. It is useful when the tumor is located in a sensitive area such as the brain or eyes. Monte Carlo (MC) simulations are usually used in treatment planning systems (TPS) to estimate the radiation dose. In this paper we are interested in estimating the statistical uncertainty of the proton dose generated by the MC simulations. Methods: Geant4 was used to simulate the eye-treatment room for 62 MeV proton therapy installed in the Istituto Nazionale Fisica Nucleare Laboratori Nazionali del Sud (LNS-INFN) facility in Catania. This code is Monte Carlo-based software dedicated to simulating the passage of particles through matter. In this work, we are interested in optimizing the Geant4 parameters for the energy deposit distribution of protons, to achieve the spatial resolution of the dose distribution required for cancer therapy. We propose various simulations and compare the corresponding dose distributions inside water to evaluate the statistical uncertainties. Results: The simulated Bragg peak, based on the facility model, is in agreement with the experimental data. The calculations show that the mean statistical uncertainty is less than 1% for a simulation set with 5 × 10⁴ events, a 10⁻³ mm production threshold and a 10⁻² mm step limit. Conclusion: The Geant4 cut and step-limit values can be chosen in combination with the number of events to reach the precision recommended by the International Commission on Radiation Units and Measurements (ICRU) for Monte Carlo codes in proton therapy treatment.
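The mean statistical uncertainty quoted in such studies is typically estimated from batch statistics over the scored dose grid; a minimal sketch of that bookkeeping follows (the batch convention and the 50% dose threshold are common choices, assumed here rather than taken from this paper):

```python
import numpy as np

def mean_relative_uncertainty(batch_doses, threshold=0.5):
    """Mean per-voxel relative statistical uncertainty of a Monte Carlo
    dose calculation, averaged over voxels above a dose threshold.
    batch_doses has shape (n_batches, n_voxels): each row is one
    independent batch of histories scored on the same grid."""
    batch_doses = np.asarray(batch_doses, float)
    n = batch_doses.shape[0]
    mean = batch_doses.mean(axis=0)
    sem = batch_doses.std(axis=0, ddof=1) / np.sqrt(n)   # standard error of the mean
    hot = mean > threshold * mean.max()                  # report only high-dose voxels
    return float(np.mean(sem[hot] / mean[hot]))

# synthetic example: 20 batches over a 500-voxel depth-dose profile
rng = np.random.default_rng(3)
profile = np.linspace(50.0, 100.0, 500)
batches = profile + rng.normal(0.0, 5.0, (20, profile.size))
rel_unc = mean_relative_uncertainty(batches)   # ~ (5 / sqrt(20)) / mean dose
```

Doubling the number of histories (batches) halves the variance, so the relative uncertainty falls as one over the square root of the event count, which is why the event count, cut and step limit must be tuned together.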
Energy Technology Data Exchange (ETDEWEB)
Song, Wei [Nuclear and Radiation Safety Center, P. O. Box 8088, Beijing (China); Wu, Yuanyu [ITER Organization, Route de Vinon-sur-Verdon, 13115 Saint-Paul-lès-Durance (France); Hu, Wenjun [China Institute of Atomic Energy, P. O. Box 275(34), Beijing (China); Zuo, Jiaxu, E-mail: zuojiaxu@chinansc.cn [Nuclear and Radiation Safety Center, P. O. Box 8088, Beijing (China)
2015-11-15
Highlights: • Monte-Carlo principle coupling with fire dynamic code is adopted to perform sodium fire vulnerability assessment. • The method can be used to calculate the failure probability of sodium fire scenarios. • A calculation example and results are given to illustrate the feasibility of the methodology. • Some critical parameters and experience are shared. - Abstract: Sodium fire is a typical and distinctive hazard in sodium cooled fast reactors, which is significant for nuclear safety. In this paper, a method of sodium fire vulnerability assessment based on the Monte-Carlo principle was introduced, which could be used to calculate the probabilities of every failure mode in sodium fire scenarios. After that, the sodium fire scenario vulnerability assessment of primary cold trap room of China Experimental Fast Reactor was performed to illustrate the feasibility of the methodology. The calculation result of the example shows that the conditional failure probability of key cable is 23.6% in the sodium fire scenario which is caused by continuous sodium leakage because of the isolation device failure, but the wall temperature, the room pressure and the aerosol discharge mass are all lower than the safety limits.
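The Monte Carlo vulnerability principle described here, sampling the uncertain scenario parameters, running a consequence model and counting failures, can be sketched as follows. Every distribution, surrogate model and damage limit below is an invented placeholder for illustration, not data from the CEFR assessment:

```python
import random

def cable_failure_probability(n_trials, seed=1):
    """Monte Carlo failure-probability sketch: sample uncertain
    sodium-fire parameters and count how often a hypothetical cable
    damage threshold is exceeded."""
    rng = random.Random(seed)
    damage_temp = 200.0                              # assumed damage limit, deg C
    failures = 0
    for _ in range(n_trials):
        leak_rate = rng.lognormvariate(0.0, 0.5)     # kg/s, assumed distribution
        distance = rng.uniform(1.0, 5.0)             # m from fire to cable, assumed
        # toy surrogate standing in for a fire-dynamics code:
        # hotter cable for larger, closer leaks
        cable_temp = 40.0 + 400.0 * leak_rate / distance**2
        if cable_temp > damage_temp:
            failures += 1
    return failures / n_trials

p_fail = cable_failure_probability(100_000)
```

In the actual methodology each trial would invoke a fire dynamics code rather than a closed-form surrogate; the Monte Carlo outer loop is unchanged.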
van Goethem, M. J.; van der Meer, R.; Reist, H. W.; Schippers, J. M.
2009-01-01
Monte Carlo simulations based on the Geant4 simulation toolkit were performed for the carbon wedge degrader used in the beam line at the Center of Proton Therapy of the Paul Scherrer Institute (PSI). The simulations are part of the beam line studies for the development and understanding of the GANTR
Monte Carlo simulations of the radiation environment for the CMS Experiment
Mallows, Sophie
2015-01-01
Monte Carlo radiation transport codes are used by the CMS Beam Radiation Instrumentation and Luminosity (BRIL) project to estimate the radiation levels due to proton-proton collisions and machine-induced background. Results are used by the CMS collaboration for various applications: comparison with detector hit rates, pile-up studies, predictions of radiation damage based on various models (Dose, NIEL, DPA), shielding design, and estimations of the residual dose environment. Simulation parameters and the maintenance of the input files are summarised, and key results are presented. Furthermore, an overview of additional programs developed by the BRIL project to meet the specific needs of the CMS community is given.
Nasani, Narendar; Pukazhselvan, D.; Kovalevsky, Andrei V.; Shaula, Aliaksandr L.; Fagg, Duncan P.
2017-01-01
Owing to their high stability and good bulk proton conductivity, yttrium-doped barium zirconate-based materials are considered potential electrolytes for protonic ceramic fuel cell applications. Nonetheless, their refractory nature leads to problematic densification that can necessitate the addition of sintering additives. While these additives assist processing, undesirable, strong, negative impacts on proton conductivity have been regularly reported. The current work assesses the potential sintering additives NiO, BaNiOx and BaY2NiO5 and their influence on the subsequent electrochemical properties of BaZr0.85Y0.15O3-δ. All sintering additives allow dense electrolyte materials (>95%) to be formed at temperatures below 1450 °C with enhanced grain growth, the largest grain growth being offered by the BaNiOx additive. Degradation in overall electrical performance is shown to be bulk related, corresponding to large reductions in bulk conductivity of up to two orders of magnitude, whilst grain-boundary conductivities are less affected. Most importantly, the current article demonstrates that these large losses in bulk proton conductivity can be effectively reversed by redox cycling under relatively mild conditions (750 °C, cycling from N2 to H2 and back to N2), opening the way to improved processing of these materials whilst maintaining high levels of proton conductivity.
Das, Susanta K.; Berry, K. J.
A two-cell theory is developed to measure proton exchange membrane (PEM) resistance to proton flow during conduction through a PEM fuel cell. The theoretical framework developed herein is based upon fundamental thermodynamic principles and engineering laws. We made appropriate corrections to develop the theoretical model previously proposed by Babu and Nair (B.V. Babu, N. Nair, J. Energy Edu. Sci. Technol. 13 (2004) 13-20) for measuring membrane resistance to the flow of protons, the only ions that travel from one electrode to the other through the membrane. A simple experimental set-up and procedure are also developed to validate the theoretical model predictions. A widely used commercial membrane (Nafion®) and several in-house membranes are examined to compare relative resistance among membranes. According to the theory, the resistance of the proton exchange membrane is directly proportional to the time taken for a specific amount of protons to pass through the membrane. A second-order differential equation describes the entire process. The results show that theoretical predictions are in excellent agreement with experimental observations. We speculate that these results will open up a route to developing a simple device to measure resistance during membrane manufacturing, since electrolyte resistance is one of the key performance drivers for the advancement of fuel cell technology.
Directory of Open Access Journals (Sweden)
Daniel P Tonge
Next generation sequencing technology has revolutionised microbiology by allowing concurrent analysis of whole microbial communities. Here we developed and verified similar methods for the analysis of fungal communities using a proton release sequencing platform with the ability to sequence reads of up to 400 bp in length at significant depth. This read length permits the sequencing of amplicons from commonly used fungal identification regions and thereby taxonomic classification. Using the 400 bp sequencing capability, we have sequenced amplicons from the ITS1, ITS2 and LSU fungal regions to a depth of approximately 700,000 raw reads per sample. Representative operational taxonomic units (OTUs) were chosen by the USEARCH algorithm, and identified taxonomically through nucleotide BLAST (BLASTn). Combination of this sequencing technology with the bioinformatics pipeline allowed species recognition in two controlled fungal spore populations containing members of known identity and concentration. Each species included within the two controlled populations was found to correspond to a representative OTU, and these OTUs were found to be highly accurate representations of true biological sequences. However, the absolute number of reads attributed to each OTU differed among species. The majority of species were represented by an OTU derived from all three genomic regions although in some cases, species were only represented in two of the regions due to the absence of conserved primer binding sites or due to sequence composition. It is apparent from our data that proton release sequencing technologies can deliver a qualitative assessment of the fungal members comprising a sample. The fact that some fungi cannot be amplified by specific "conserved" primer pairs confirms our recommendation that a multi-region approach be taken for other amplicon-based metagenomic studies.
Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas
2003-01-01
The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.
Chevelkov, Veniamin; Habenstein, Birgit; Loquet, Antoine; Giller, Karin; Becker, Stefan; Lange, Adam
2014-05-01
Proton-detected solid-state NMR was applied to a highly deuterated insoluble, non-crystalline biological assembly, the Salmonella typhimurium type III secretion system (T3SS) needle. Spectra of very high resolution and sensitivity were obtained at a low protonation level of 10-20% at exchangeable amide positions. We developed efficient experimental protocols for resonance assignment tailored for this system and the employed experimental conditions. Using exclusively dipolar-based interspin magnetization transfers, we recorded two sets of 3D spectra allowing for an almost complete backbone resonance assignment of the needle subunit PrgI. The additional information provided by the well-resolved proton dimension revealed the presence of two sets of resonances in the N-terminal helix of PrgI, whereas in previous studies employing 13C detection only a single set of resonances was observed.
DEFF Research Database (Denmark)
Ali, Syed Talat; Li, Qingfeng; Pan, Chao;
2011-01-01
The effect of chloride as an air impurity and as a catalyst contaminant on the performance and durability of polybenzimidazole (PBI)-based high temperature proton exchange membrane fuel cell (HT-PEMFC) was studied. The ion chromatographic analysis reveals the existence of chloride contaminations...
Energy Technology Data Exchange (ETDEWEB)
Pavlou, Andrew T., E-mail: pavloa2@rpi.edu; Ji, Wei, E-mail: jiw2@rpi.edu
2016-06-15
Highlights: • Thermal scattering data are fit using linear least squares regression. • Mesh points are optimally selected from phonon frequency distributions. • New meshes give more accurate fits of thermal data than our previous work. • Coefficient data storage is significantly reduced compared to current methods. - Abstract: In a series of papers, we have introduced a new sampling method for Monte Carlo codes for the low-energy secondary scattering parameters that greatly reduces data storage requirements. The method is based on the temperature dependence of the energy transfer (beta) and squared momentum transfer (alpha) between a neutron and a target nuclide. Cumulative distribution functions (CDFs) in beta and alpha are constructed for a range of temperatures on a mesh of incident energies in the thermal range and temperature fits are created for beta and alpha at discrete CDF probability lines. The secondary energy and angle distributions generated from the fit coefficients showed good agreement with the standard Monte Carlo sampling. However, some discrepancies still existed because the CDF probability mesh values were selected uniformly and arbitrarily. In this paper, a physics-based approach for optimally selecting the CDF probability meshes for the on-the-fly sampling method is introduced, using bound carbon in graphite as the example nuclide. This approach is based on the structure of the phonon frequency distribution of thermal excitations. From the study, it was determined that low (<0.1) and high (>0.9) beta CDF probabilities are important to the structure of the beta probability density functions (PDFs) while very low (<1 × 10{sup −4}) alpha CDF probabilities are important to the structure of the alpha PDFs. The final meshes contain 200 probability values for both beta and alpha. This results in 14.5 MB of total data storage for the on-the-fly coefficients which are used for any temperature realization. This is a significant reduction in
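The sampling side of such a CDF-probability-mesh scheme reduces to inverting a tabulated CDF: store the value of beta at each probability line, then map a uniform random number onto beta by interpolation. A toy sketch, with an analytic exponential distribution standing in for the fitted thermal-scattering data:

```python
import numpy as np

def sample_from_cdf_mesh(prob_mesh, beta_at_prob, u):
    """On-the-fly-style sampling: beta_at_prob[i] is the value of the
    energy-transfer variable beta at CDF probability prob_mesh[i] (in the
    paper these come from temperature fits; here they are simply given).
    A uniform random number u is mapped onto beta by interpolating
    between the stored probability lines."""
    return np.interp(u, prob_mesh, beta_at_prob)

# toy example: beta distributed as a unit exponential, whose exact
# inverse CDF is -ln(1 - p)
prob_mesh = np.linspace(0.001, 0.999, 200)       # 200 probability lines
beta_lines = -np.log(1.0 - prob_mesh)

rng = np.random.default_rng(0)
u = rng.uniform(0.001, 0.999, 100_000)
beta = sample_from_cdf_mesh(prob_mesh, beta_lines, u)
```

The paper's optimization question is exactly where to place the mesh points: a uniform mesh wastes resolution where the PDF is smooth and underresolves the tails, which motivates the physics-based placement driven by the phonon frequency distribution.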
Performance evaluation of Biograph PET/CT system based on Monte Carlo simulation
Wang, Bing; Gao, Fei; Liu, Hua-Feng
2010-10-01
Combined lutetium oxyorthosilicate (LSO) Biograph PET/CT is developed by Siemens Company and has been introduced into medical practice. There is no septa between the scintillator rings, the acquisition mode is full 3D mode. The PET components incorporate three rings of 48 detector blocks which comprises a 13×13 matrix of 4×4×20mm3 elements. The patient aperture is 70cm, the transversal field of view (FOV) is 58.5cm, and the axial field of view is 16.2cm. The CT components adopt 16 slices spiral CT scanner. The physical performance of this PET/CT scanner has been evaluated using Monte Carlo simulation method according to latest NEMA NU 2-2007 standard and the results have been compared with real experiment results. For PET part, in the center FOV the average transversal resolution is 3.67mm, the average axial resolution is 3.94mm, and the 3D-reconstructed scatter fraction is 31.7%. The sensitivities of the PET scanner are 4.21kcps/MBq and 4.26kcps/MBq at 0cm and 10cm off the center of the transversal FOV. The peak NEC is 95.6kcps at a concentration of 39.2kBq/ml. The spatial resolution of CT part is up to 1.12mm at 10mm off the center. The errors between simulated and real results are permitted.
Monte Carlo based water/medium stopping-power ratios for various ICRP and ICRU tissues.
Fernández-Varea, José M; Carrasco, Pablo; Panettieri, Vanessa; Brualla, Lorenzo
2007-11-07
Water/medium stopping-power ratios, s(w,m), have been calculated for several ICRP and ICRU tissues, namely adipose tissue, brain, cortical bone, liver, lung (deflated and inflated) and spongiosa. The considered clinical beams were 6 and 18 MV x-rays and the field size was 10 × 10 cm². Fluence distributions were scored at a depth of 10 cm using the Monte Carlo code PENELOPE. The collision stopping powers for the studied tissues were evaluated employing the formalism of ICRU Report 37 (1984 Stopping Powers for Electrons and Positrons (Bethesda, MD: ICRU)). The Bragg-Gray values of s(w,m) calculated with these ingredients range from about 0.98 (adipose tissue) to nearly 1.14 (cortical bone), displaying a rather small variation with beam quality. Excellent agreement, to within 0.1%, is found with stopping-power ratios reported by Siebers et al (2000a Phys. Med. Biol. 45 983-95) for cortical bone, inflated lung and spongiosa. In the case of cortical bone, s(w,m) changes approximately 2% when either ICRP or ICRU compositions are adopted, whereas the stopping-power ratios of lung, brain and adipose tissue are less sensitive to the selected composition. The mass density of lung also influences the calculated values of s(w,m), reducing them by around 1% (6 MV) and 2% (18 MV) when going from deflated to inflated lung.
Institute of Scientific and Technical Information of China (English)
ZHANG Jun; GUO Fan
2015-01-01
Tooth modification is widely used in the gear industry to improve the meshing performance of gear systems. However, few of the present studies on tooth modification consider the influence of inevitable random errors on the modification effects. In order to investigate how uncertainties in the tooth modification amounts affect the dynamic behavior of a helical planetary gear system, an analytical dynamic model including tooth modification parameters is proposed to carry out a deterministic analysis of the dynamics of a helical planetary gear. The dynamic meshing forces as well as the dynamic transmission errors of the sun-planet 1 gear pair with and without tooth modifications are computed and compared to show the effectiveness of tooth modifications in enhancing gear dynamics. Using the response surface method, a fitted regression model for the dynamic transmission error (DTE) fluctuations is established to quantify the relationship between modification amounts and DTE fluctuations. By shifting the inevitable random errors arising from the manufacturing and installation process to variations of the tooth modification amounts, a statistical tooth modification model is developed, and a methodology combining Monte Carlo simulation and the response surface method is presented for uncertainty analysis of tooth modifications. The uncertainty analysis reveals that the system's dynamic behavior does not obey the normal distribution even though the design variables are normally distributed. In addition, a deterministic modification amount will not necessarily achieve an optimal result for both static and dynamic transmission error fluctuation reduction simultaneously.
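The combination of Monte Carlo simulation with a fitted response surface can be sketched compactly: propagate normally distributed modification errors through a quadratic regression surface and inspect the output statistics, which indeed come out non-normal. The surface coefficients below are invented for illustration; the paper fits its own regression model:

```python
import numpy as np

def dte_fluctuation(x1, x2):
    # Hypothetical fitted quadratic response surface: DTE fluctuation
    # as a function of two tooth-modification amounts (coefficients
    # are placeholders, not the paper's fit).
    return 5.0 + 0.8 * x1 - 1.2 * x2 + 0.3 * x1**2 + 0.5 * x2**2 - 0.2 * x1 * x2

rng = np.random.default_rng(0)
n = 200_000
x1 = rng.normal(1.0, 0.2, n)   # nominal amount 1.0 with random manufacturing error
x2 = rng.normal(2.0, 0.2, n)   # nominal amount 2.0 with random manufacturing error
y = dte_fluctuation(x1, x2)

mean, std = float(y.mean()), float(y.std())
skew = float(np.mean(((y - mean) / std) ** 3))   # nonzero: output is not normal
```

Because the surface is nonlinear, the output distribution acquires a nonzero skewness even though both inputs are Gaussian, which is exactly the abstract's observation that normality of the design variables does not carry over to the response.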
Kumar, A; Chauhan, S
2017-03-08
Obesity is one of the most pressing health burdens in developed countries. One of the strategies to prevent obesity is the inhibition of the pancreatic lipase enzyme. The aim of this study was to build QSAR models for natural lipase inhibitors by using the Monte Carlo method. The molecular structures were represented by the simplified molecular input line entry system (SMILES) notation and molecular graphs. Three sets (training, calibration and test) across three splits were examined and validated. The statistical quality of all the described models was very good. The best QSAR model showed the following statistical parameters: r(2) = 0.864 and Q(2) = 0.836 for the test set and r(2) = 0.824 and Q(2) = 0.819 for the validation set. Structural attributes responsible for increasing and decreasing the activity (expressed as pIC50) were also defined. Using the defined structural attributes, the design of new potential lipase inhibitors is also presented. Additionally, a molecular docking study was performed to determine the binding modes of the designed molecules.
Živković, Jelena V; Trutić, Nataša V; Veselinović, Jovana B; Nikolić, Goran M; Veselinović, Aleksandar M
2015-09-01
The Monte Carlo method was used for QSAR modeling of maleimide derivatives as glycogen synthase kinase-3β inhibitors. The first QSAR model was developed for a series of 74 3-anilino-4-arylmaleimide derivatives. The second QSAR model was developed for a series of 177 maleimide derivatives. The QSAR models were calculated with the molecular structure represented by the simplified molecular input-line entry system. Two splits were examined: one split into training and test sets for the first QSAR model, and one split into training, test and validation sets for the second. The statistical quality of the developed models is very good. The calculated model for 3-anilino-4-arylmaleimide derivatives had the following statistical parameters: r(2)=0.8617 for the training set; r(2)=0.8659 and r(m)(2)=0.7361 for the test set. The calculated model for maleimide derivatives had the following statistical parameters: r(2)=0.9435 for the training set; r(2)=0.9262 and r(m)(2)=0.8199 for the test set; and r(2)=0.8418, r(av)(m)(2)=0.7469 and ∆r(m)(2)=0.1476 for the validation set. Structural indicators considered as molecular fragments responsible for the increase and decrease in the inhibition activity have been defined. The computer-aided design of new potential glycogen synthase kinase-3β inhibitors is presented using the defined structural alerts.
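The r(2) statistics quoted in these QSAR abstracts are squared Pearson correlations between observed and predicted activities. A minimal computation of that figure of merit, on invented observed/predicted pIC50 values rather than either paper's data:

```python
# Squared Pearson correlation of the kind used to judge QSAR splits.

def r_squared(obs, pred):
    """r^2 between observed and predicted activity values."""
    n = len(obs)
    mo = sum(obs) / n
    mp = sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    vo = sum((o - mo) ** 2 for o in obs)
    vp = sum((p - mp) ** 2 for p in pred)
    return cov * cov / (vo * vp)

# Toy observed/predicted pIC50 values (illustrative only).
obs = [5.1, 6.3, 7.0, 5.8, 6.9, 7.4]
pred = [5.3, 6.0, 7.2, 5.6, 6.7, 7.5]
print(round(r_squared(obs, pred), 3))
```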
Lattice based Kinetic Monte Carlo Simulations of a complex chemical reaction network
Danielson, Thomas; Savara, Aditya; Hin, Celine
Lattice Kinetic Monte Carlo (KMC) simulations offer a powerful alternative to using ordinary differential equations for the simulation of complex chemical reaction networks. Lattice KMC provides the ability to account for local spatial configurations of species in the reaction network, resulting in a more detailed description of the reaction pathway. In KMC simulations with a large number of reactions, the range of transition probabilities can span many orders of magnitude, creating subsets of processes that occur more frequently or more rarely. Consequently, processes that have a high probability of occurring may be selected repeatedly without actually progressing the system (i.e. the forward and reverse process for the same reaction). In order to avoid the repeated occurrence of fast frivolous processes, it is necessary to throttle the transition probabilities in such a way that avoids altering the overall selectivity. Likewise, as the reaction progresses, new frequently occurring species and reactions may be introduced, making a dynamic throttling algorithm a necessity. We present a dynamic steady-state detection scheme with the goal of accurately throttling rate constants in order to optimize the KMC run time without compromising the selectivity of the reaction network. The algorithm has been applied to a large catalytic chemical reaction network, specifically that of methanol oxidative dehydrogenation, as well as additional pathways on CeO2(111) resulting in formaldehyde, CO, methanol, CO2, H2 and H2O as gas products.
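The throttling idea above can be illustrated in a few lines: a fast reversible pair dominates the rate list, so both its forward and reverse constants are scaled by the same factor, which leaves their ratio (and hence the selectivity between them) untouched while the slow channel gets sampled in a reasonable number of steps. The three-reaction system and all rate constants below are invented:

```python
import random

# Rejection-free KMC event selection with a throttled fast pair.
# Scaling A->B and B->A by the same factor preserves their ratio
# while letting the rare B->C channel occur within few steps.

random.seed(7)

def kmc(rates, steps):
    """Pick events with probability proportional to their rates."""
    counts = {k: 0 for k in rates}
    total = sum(rates.values())
    for _ in range(steps):
        u = random.uniform(0.0, total)
        acc = 0.0
        for k, r in rates.items():
            acc += r
            if u <= acc:
                counts[k] += 1
                break
    return counts

rates = {"A->B": 1e6, "B->A": 5e5, "B->C": 1.0}   # invented constants
throttle = 1e-5                                   # fast pair only
throttled = {"A->B": rates["A->B"] * throttle,
             "B->A": rates["B->A"] * throttle,
             "B->C": rates["B->C"]}

c = kmc(throttled, 50000)
ratio = c["A->B"] / c["B->A"]   # stays ~2, as in the unthrottled system
print(round(ratio, 2), c["B->C"])
```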
Energy Technology Data Exchange (ETDEWEB)
Abdel-Khalik, Hany S. [North Carolina State Univ., Raleigh, NC (United States); Zhang, Qiong [North Carolina State Univ., Raleigh, NC (United States)
2014-05-20
The development of hybrid Monte-Carlo-Deterministic (MC-DT) approaches over the past few decades has primarily focused on shielding and detection applications where the analysis requires a small number of responses, i.e., at the detector location(s). This work further develops a recently introduced global variance reduction approach, denoted the SUBSPACE approach, which is designed to allow the use of MC simulation, currently limited to benchmarking calculations, for routine engineering calculations. By way of demonstration, the SUBSPACE approach is applied to assembly-level calculations used to generate the few-group homogenized cross-sections. These models are typically expensive and need to be executed on the order of 10^3-10^5 times to properly characterize the few-group cross-sections for downstream core-wide calculations. Applicability to k-eigenvalue core-wide models is also demonstrated in this work. Given the favorable results obtained in this work, we believe the applicability of the MC method for reactor analysis calculations could be realized in the near future.
Markov chain Monte Carlo based analysis of post-translationally modified VDAC gating kinetics.
Tewari, Shivendra G; Zhou, Yifan; Otto, Bradley J; Dash, Ranjan K; Kwok, Wai-Meng; Beard, Daniel A
2014-01-01
The voltage-dependent anion channel (VDAC) is the main conduit for permeation of solutes (including nucleotides and metabolites) of up to 5 kDa across the mitochondrial outer membrane (MOM). Recent studies suggest that VDAC activity is regulated via post-translational modifications (PTMs). Yet the nature and effects of these modifications are not understood. Herein, single channel currents of wild-type, nitrosated, and phosphorylated VDAC are analyzed using a generalized continuous-time Markov chain Monte Carlo (MCMC) method. The developed method resolves three distinct conducting states (open, half-open, and closed) of VDAC activity. Lipid bilayer experiments are also performed to record single VDAC activity under un-phosphorylated and phosphorylated conditions, and are analyzed using the developed stochastic search method. Experimental data show significant alteration in VDAC gating kinetics and conductance as a result of PTMs. The effect of PTMs on VDAC kinetics is captured in the parameters associated with the identified Markov model. Stationary distributions of the Markov model suggest that nitrosation of VDAC not only decreased its conductance but also significantly locked VDAC in a closed state. On the other hand, stationary distributions of the model associated with un-phosphorylated and phosphorylated VDAC suggest a reversal in channel conformation from a relatively closed state to an open state. Model analyses of the nitrosated data suggest that faster reaction of nitric oxide with the Cys-127 thiol group might be responsible for the biphasic effect of nitric oxide on basal VDAC conductance.
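The stationary distribution of such a three-state gating chain can be estimated by simulating dwell times, Gillespie-style. The rate constants below are invented for illustration, not the fitted VDAC parameters:

```python
import random

# Three-state (open / half-open / closed) continuous-time Markov
# chain; occupancy fractions from a long simulation approximate the
# stationary distribution. All rates are hypothetical.

random.seed(3)

# rates[state] = list of (target_state, rate in 1/ms)
rates = {"open":   [("half", 2.0)],
         "half":   [("open", 4.0), ("closed", 1.0)],
         "closed": [("half", 0.5)]}

state = "open"
t_in = {"open": 0.0, "half": 0.0, "closed": 0.0}
for _ in range(200000):
    channels = rates[state]
    total = sum(r for _, r in channels)
    t_in[state] += random.expovariate(total)   # exponential dwell time
    u = random.uniform(0.0, total)
    acc = 0.0
    for target, r in channels:
        acc += r
        if u <= acc:
            state = target
            break

T = sum(t_in.values())
pi = {s: t / T for s, t in t_in.items()}
print({s: round(p, 2) for s, p in pi.items()})
```

For these rates the chain satisfies detailed balance, so the analytic stationary distribution (0.4, 0.2, 0.4) is available as a check.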
Calculation of Credit Valuation Adjustment Based on Least Square Monte Carlo Methods
Directory of Open Access Journals (Sweden)
Qian Liu
2015-01-01
Counterparty credit risk has become one of the highest-profile risks facing participants in the financial markets. Despite this, relatively little is known about how counterparty credit risk is actually priced mathematically. We examine this issue using interest rate swaps. This widely traded financial product allows us to identify clearly the risk profiles of both institutions and their counterparties. Concretely, the Hull-White model for the interest rate and a mean-reverting model for the default intensity have proven to correspond well with reality and to be well suited for financial institutions. Besides, we find that the least square Monte Carlo method is quite efficient in the calculation of the credit valuation adjustment (CVA), as it avoids the redundant step of generating inner scenarios. As a result, it accelerates the convergence speed of the CVA estimators. In the second part, we propose a new method to calculate bilateral CVA that avoids the double counting found in the existing literature, where several copula functions are adopted to describe the dependence of the two first-to-default times.
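The "avoids inner scenarios" point is the Longstaff-Schwartz regression step: instead of launching nested simulations at each exposure date, the realized future value is regressed on the state observed at that date. A reduced sketch with toy dynamics and payoff (not the paper's Hull-White / intensity setup):

```python
import random

# Least-squares Monte Carlo step: regress a noisy future positive
# exposure on the state at the exposure date, then read the expected
# exposure off the fitted polynomial. State dynamics and payoff are
# toy choices for illustration only.

random.seed(11)

def fit_quadratic(xs, ys):
    """Least-squares fit y ~ a + b x + c x^2 via normal equations."""
    sx = [sum(x ** k for x in xs) for k in range(5)]
    sy = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[sx[i + j] for j in range(3)] for i in range(3)]
    rhs = sy[:]
    for i in range(3):                       # forward elimination
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            rhs[j] -= f * rhs[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        coef[i] = (rhs[i] - sum(A[i][j] * coef[j]
                                for j in range(i + 1, 3))) / A[i][i]
    return coef

# State at the exposure date and noisy positive exposure seen later.
x1 = [random.gauss(0.0, 1.0) for _ in range(5000)]
v2 = [max(x + random.gauss(0.0, 0.5), 0.0) for x in x1]

a, b, c = fit_quadratic(x1, v2)
# Conditional expected exposure at state x = 1, from the regression
# alone; no inner simulation at x = 1 was needed.
epe_at_1 = a + b * 1.0 + c * 1.0
print(round(epe_at_1, 2))
```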
Monte Carlo based unit commitment procedures for the deregulated market environment
Energy Technology Data Exchange (ETDEWEB)
Granelli, G.P.; Marannino, P.; Montagna, M.; Zanellini, F. [Universita di Pavia, Pavia (Italy). Dipartimento di Ingegneria Elettrica
2006-12-15
The unit commitment problem, originally conceived in the framework of short-term operation of vertically integrated utilities, needs a thorough re-examination in the light of the ongoing transition towards the open electricity market environment. In this work the problem is re-formulated to adapt unit commitment to the viewpoint of a generation company (GENCO) which is no longer bound to satisfy its load, but is willing to maximize its profits. Moreover, with reference to the present-day situation in many countries, the presence of a GENCO (the former monopolist) which is in the position of exerting market power requires a careful analysis considering the different perspectives of a price taker and of a price maker GENCO. Unit commitment is thus shown to lead to two distinct, yet slightly different problems. The unavoidable uncertainties in load profile and price behaviour over the time period of interest are also taken into account by means of a Monte Carlo simulation. Both the forecasted loads and prices are handled as random variables with a normal multivariate distribution. The correlation between the random input variables corresponding to successive hours of the day was considered by carrying out a statistical analysis of actual load and price data. The whole procedure was tested making use of reasonable approximations of the actual data of the thermal generation units available to some actual GENCOs operating in Italy.
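The scenario-generation step behind such a simulation amounts to drawing correlated normal variates, which for two variables reduces to a 2x2 Cholesky factor. The load/price numbers and the correlation below are invented, not the Italian market data of the paper:

```python
import random, math

# Correlated bivariate normal load/price scenarios via the 2x2
# Cholesky factor of [[1, rho], [rho, 1]]. All parameters invented.

random.seed(5)

mu_load, sd_load = 1000.0, 80.0     # MW
mu_price, sd_price = 60.0, 12.0     # EUR/MWh
rho = 0.7                           # assumed load-price correlation

def scenario():
    z1 = random.gauss(0.0, 1.0)
    z2 = random.gauss(0.0, 1.0)
    load = mu_load + sd_load * z1
    price = mu_price + sd_price * (rho * z1 + math.sqrt(1 - rho ** 2) * z2)
    return load, price

pairs = [scenario() for _ in range(20000)]
ml = sum(l for l, _ in pairs) / len(pairs)
mp = sum(p for _, p in pairs) / len(pairs)
cov = sum((l - ml) * (p - mp) for l, p in pairs) / len(pairs)
sl = math.sqrt(sum((l - ml) ** 2 for l, _ in pairs) / len(pairs))
sp = math.sqrt(sum((p - mp) ** 2 for _, p in pairs) / len(pairs))
corr = cov / (sl * sp)
print(round(corr, 2))   # empirical correlation recovers rho
```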
Comparison of polynomial approximations to speed up planewave-based quantum Monte Carlo calculations
Parker, William D; Alfè, Dario; Hennig, Richard G; Wilkins, John W
2013-01-01
The computational cost of quantum Monte Carlo (QMC) calculations of realistic periodic systems depends strongly on the method of storing and evaluating the many-particle wave function. Previous work [A. J. Williamson et al., Phys. Rev. Lett. 87, 246406 (2001); D. Alfè and M. J. Gillan, Phys. Rev. B 70, 161101 (2004)] has demonstrated the reduction of the O(N^3) cost of evaluating the Slater determinant with planewaves to O(N^2) using localized basis functions. We compare four polynomial approximations as basis functions: interpolating Lagrange polynomials, interpolating piecewise-polynomial-form (pp-) splines, and basis-form (B-) splines (interpolating and smoothing). All these basis functions provide a similar speedup relative to the planewave basis. The pp-splines have eight times the memory requirement of the other methods. To test the accuracy of the basis functions, we apply them to the ground state structures of Si, Al, and MgO. The polynomial approximations differ in accuracy most strongly for MgO ...
Balashov, S P; Petrovskaya, L E; Lukashev, E P; Imasheva, E S; Dioumaev, A K; Wang, J M; Sychev, S V; Dolgikh, D A; Rubin, A B; Kirpichnikov, M P; Lanyi, J K
2012-07-24
One of the distinctive features of eubacterial retinal-based proton pumps, proteorhodopsins, xanthorhodopsin, and others, is hydrogen bonding of the key aspartate residue, the counterion to the retinal Schiff base, to a histidine. We describe properties of the recently found eubacterium proton pump from Exiguobacterium sibiricum (named ESR) expressed in Escherichia coli, especially features that depend on Asp-His interaction, the protonation state of the key aspartate, Asp85, and its ability to accept a proton from the Schiff base during the photocycle. Proton pumping by liposomes and E. coli cells containing ESR occurs in a broad pH range above pH 4.5. Large light-induced pH changes indicate that ESR is a potent proton pump. Replacement of His57 with methionine or asparagine strongly affects the pH-dependent properties of ESR. In the H57M mutant, a dramatic decrease in the quantum yield of chromophore fluorescence emission and a 45 nm blue shift of the absorption maximum with an increase in the pH from 5 to 8 indicate deprotonation of the counterion with a pK(a) of 6.3, which is also the pK(a) at which the M intermediate is observed in the photocycle of the protein solubilized in detergent [dodecyl maltoside (DDM)]. This is in contrast with the case for the wild-type protein, for which the same experiments show that the major fraction of Asp85 is deprotonated at pH >3 and that it protonates only at low pH, with a pK(a) of 2.3. The M intermediate in the wild-type photocycle accumulates only at high pH, with an apparent pK(a) of 9, via deprotonation of a residue interacting with Asp85, presumably His57. In liposomes reconstituted with ESR, the pK(a) values for M formation and spectral shifts are 2-3 pH units lower than in DDM. The distinctively different pH dependencies of the protonation of Asp85 and the accumulation of the M intermediate in the wild-type protein versus the H57M mutant indicate that there is strong Asp-His interaction, which substantially lowers the pK(a) of Asp85 by stabilizing its deprotonated state.
Energy Loss of Proton in Extraction Window
Institute of Scientific and Technical Information of China (English)
LIU Bao-jie; ZENG Zi-qiang
2015-01-01
Particles are transported in vacuum in an accelerator and are extracted through exit windows. A Kapton foil is used as the extraction window of a 3 MeV proton accelerator. The energy loss of 3 MeV protons passing through Kapton foils of different thicknesses is calculated with the Monte Carlo method.
Albin, T.; Koschny, D.; Soja, R.; Srama, R.; Poppe, B.
2016-01-01
The Canary Islands Long-Baseline Observatory (CILBO) is a double-station meteor camera system (Koschny et al., 2013; Koschny et al., 2014) that consists of 5 cameras. The two cameras considered in this report, ICC7 and ICC9, are installed on Tenerife and La Palma. They point to the same atmospheric volume between both islands, allowing stereoscopic observation of meteors. Since its installation in 2011 and the start of operation in 2012, CILBO has detected over 15000 simultaneously observed meteors. Koschny and Diaz (2002) developed the Meteor Orbit and Trajectory Software (MOTS) to compute the trajectories of such meteors. The software uses the astrometric data from the detection software MetRec (Molau, 1998) and determines the trajectory in geodetic coordinates. This work presents a Monte-Carlo based extension of the MOTS code to compute the orbital elements of meteors simultaneously detected by CILBO.
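The Monte Carlo flavour of such an extension, reduced to a 2-D cartoon: two stations a known baseline apart each measure an elevation angle to the same meteor point, and Gaussian astrometric noise on the angles is propagated to a spread in the triangulated height. Geometry and noise levels are invented for illustration:

```python
import random, math

# Monte Carlo propagation of angular measurement noise through a
# two-station triangulation; all numbers are hypothetical.

random.seed(9)

BASELINE = 140.0e3          # m, roughly an inter-island baseline
H_TRUE = 95.0e3             # m, true meteor height
SIGMA = math.radians(0.03)  # per-measurement angular noise

# Noise-free elevation angles; meteor above the baseline midpoint.
a1 = math.atan2(H_TRUE, BASELINE / 2)
a2 = math.atan2(H_TRUE, BASELINE / 2)

def triangulate(t1, t2):
    """Height of the crossing point of the two lines of sight."""
    # x measured from station 1; station 2 sits at x = BASELINE.
    x = BASELINE * math.tan(t2) / (math.tan(t1) + math.tan(t2))
    return x * math.tan(t1)

heights = [triangulate(a1 + random.gauss(0.0, SIGMA),
                       a2 + random.gauss(0.0, SIGMA))
           for _ in range(5000)]
mean_h = sum(heights) / len(heights)
spread = (sum((h - mean_h) ** 2 for h in heights) / len(heights)) ** 0.5
print(round(mean_h / 1000, 1), round(spread, 1))
```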
Ye, Hong-zhou; Jiang, Hong
2014-01-01
Materials with spin-crossover (SCO) properties hold great potential for information storage and have therefore received considerable attention in recent decades. The hysteresis phenomena accompanying SCO are attributed to intermolecular cooperativity whose underlying mechanism may have a vibronic origin. In this work, a new vibronic Ising-like model, in which the elastic coupling between SCO centers is included by considering harmonic stretching and bending (SAB) interactions, is proposed and solved by Monte Carlo simulations. The key parameters in the new model, $k_1$ and $k_2$, corresponding to the elastic constants of the stretching and bending modes, respectively, can be directly related to the macroscopic bulk and shear moduli of the material under study, which can be readily estimated either from experimental measurements or first-principles calculations. The convergence issue in the MC simulations of the thermal hysteresis has been carefully checked, and it was found that the stable hysteresis loop can...
Determination of low-energy structures of a small RNA hairpin using Monte Carlo–based techniques
Indian Academy of Sciences (India)
Sudhanshu Shanker; Pradipta Bandyopadhyay
2012-07-01
The energy landscape of RNA is known to be extremely rugged, and hence finding low-energy structures starting from a random structure is a challenging task for any optimization algorithm. In the current work, we have investigated the ability of one Monte Carlo–based optimization algorithm, Temperature Basin Paving, to explore the energy landscape of a small RNA T-loop hairpin. In this method, the history of the simulation is used to increase the probability of states less visited in the simulation. It has been found that using both energy and end-to-end distance as the biasing parameters in the simulation, the partially folded structure of the hairpin starting from random structures could be obtained.
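The history-biased idea can be caricatured on a toy rugged landscape: every visit to a histogram bin adds a small penalty to that bin, so frequently visited basins are progressively disfavoured and the walker escapes local minima. Note this additive penalty is a simplified stand-in for Temperature Basin Paving, which modulates an effective temperature, and the landscape is invented:

```python
import random, math

# History-biased Metropolis walk on a rugged 1-D double well; the
# walker starts in the wrong basin and the visit-count penalty
# drives it over the barrier toward the global minimum near x = 2.

random.seed(13)

def energy(x):
    """Rugged double well: global minimum on the positive side."""
    return 0.05 * (x - 2.0) ** 2 * (x + 2.0) ** 2 + 0.3 * math.sin(6.0 * x)

hist = {}
x = -2.0
best = (energy(x), x)
beta, k_penalty = 4.0, 0.02

for _ in range(20000):
    xn = x + random.uniform(-0.4, 0.4)
    # Effective energies include the accumulated visit penalties.
    e_old = energy(x) + k_penalty * hist.get(round(x, 1), 0)
    e_new = energy(xn) + k_penalty * hist.get(round(xn, 1), 0)
    if e_new <= e_old or random.random() < math.exp(-beta * (e_new - e_old)):
        x = xn
    hist[round(x, 1)] = hist.get(round(x, 1), 0) + 1
    if energy(x) < best[0]:
        best = (energy(x), x)

print(round(best[1], 1))   # lowest-energy x found
```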
Monte Carlo based approach to the LS–NaI 4πβ–γ anticoincidence extrapolation and uncertainty.
Fitzgerald, R
2016-03-01
The 4πβ–γ anticoincidence method is used for the primary standardization of β−, β+, electron capture (EC), α, and mixed-mode radionuclides. Efficiency extrapolation using one or more γ ray coincidence gates is typically carried out by a low-order polynomial fit. The approach presented here is to use a Geant4-based Monte Carlo simulation of the detector system to analyze the efficiency extrapolation. New code was developed to account for detector resolution, direct γ ray interaction with the PMT, and implementation of experimental β-decay shape factors. The simulation was tuned to 57Co and 60Co data, then tested with 99mTc data, and used in measurements of 18F, 129I, and 124I. The analysis method described here offers a more realistic activity value and uncertainty than those indicated from a least-squares fit alone.
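The extrapolation step itself is a low-order fit of the observed rate against an inefficiency parameter, with the intercept at zero inefficiency estimating the activity. A sketch on synthetic data with a known activity (the linear response, slope and noise level are invented; the paper's point is precisely that a Monte Carlo model can replace this bare fit):

```python
import random

# Linear efficiency extrapolation: fit observed rate vs (1-eff)/eff
# and read the activity off the intercept at zero inefficiency.

random.seed(2)

N0 = 5000.0                                  # true activity (1/s)
effs = [0.70, 0.75, 0.80, 0.85, 0.90, 0.95]  # gated beta efficiencies
x = [(1.0 - e) / e for e in effs]
# Synthetic linear response with a small slope plus counting noise.
y = [N0 * (1.0 - 0.12 * xi) + random.gauss(0.0, 5.0) for xi in x]

# Least-squares straight line y = a + b x; intercept a -> activity.
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
    sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
print(round(a, 1))
```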
Gong, Y.; Yu, Y. J.; Zhang, W. Y.
2016-08-01
This study establishes a methodological system for optimizing watershed non-point source pollution control by simulating pollution loads and analyzing the integrity of optimization strategies. First, the sources of watershed agricultural non-point source pollution are divided into four categories: agricultural land, natural land, livestock breeding, and rural residential land. Secondly, different pollution control measures at the source, midway and ending stages are chosen. Thirdly, the optimization effect of pollution load control in the three stages is simulated, based on Monte Carlo simulation. The method described above is applied to the Ashi River watershed in Heilongjiang Province of China. Case study results indicate that the three types of control measures combined can be implemented only if the government promotes the optimized plan and gradually improves implementation efficiency. This method for optimizing strategy integrity for watershed non-point source pollution control has significant reference value.
Townson, Reid W.; Zavgorodni, Sergei
2014-12-01
In GPU-based Monte Carlo simulations for radiotherapy dose calculation, source modelling from a phase-space source can be an efficiency bottleneck. Previously, this has been addressed using phase-space-let (PSL) sources, which provided significant efficiency enhancement. We propose that additional speed-up can be achieved through the use of a hybrid primary photon point source model combined with a secondary PSL source. A novel phase-space derived and histogram-based implementation of this model has been integrated into gDPM v3.0. Additionally, a simple method for approximately deriving target photon source characteristics from a phase-space that does not contain inheritable particle history variables (LATCH) has been demonstrated to succeed in selecting over 99% of the true target photons with only ~0.3% contamination (for a Varian 21EX 18 MV machine). The hybrid source model was tested using an array of open fields for various Varian 21EX and TrueBeam energies, and all cases achieved greater than 97% chi-test agreement (the mean was 99%) above the 2% isodose with 1% / 1 mm criteria. The root mean square deviations (RMSDs) were less than 1%, with a mean of 0.5%, and the source generation time was 4-5 times faster. A seven-field intensity modulated radiation therapy patient treatment achieved 95% chi-test agreement above the 10% isodose with 1% / 1 mm criteria, 99.8% for 2% / 2 mm, a RMSD of 0.8%, and source generation speed-up factor of 2.5. Presented as part of the International Workshop on Monte Carlo Techniques in Medical Physics
Energy Technology Data Exchange (ETDEWEB)
Park, Yong-il; Nagai, Masayuki [Advanced Research Center for Energy and Environment, Musashi Institute of Technology, 1-28-1 Tamazutsumi, Tokyo 158-8557 Setagaya (Japan)
2001-12-01
Novel fast proton-conducting GPTS-STA-SiO2 and GPTS-STA-ZrP composites were successfully fabricated. The polymer matrix obtained through hydrolysis and condensation reactions of 3-glycidoxypropyltrimethoxysilane (GPTS) showed apparent proton conduction at high relative humidity, with conductivity from 1.0x10^-7 to 3.6x10^-6 S/cm, although no proton donor was incorporated. The proton conductivities of the fabricated composites were high, and increased up to 1.9x10^-2 S/cm by addition of silicotungstic acid (STA). By incorporating α-zirconium phosphate (ZrP) into the GPTS-STA polymer matrix, the composite showed increased conductivity at low temperature (80 °C), indicating weak dependence on humidity due to molecular water in ZrP. The high proton conductivity of the composites is due to the proton-conducting path through the GPTS-derived 'pseudo-polyethylene oxide (pseudo-PEO)' networks, which also contain a trapped solid acid (silicotungstic acid) as a proton donor.
Performance Analysis of Korean Liquid metal type TBM based on Monte Carlo code
Energy Technology Data Exchange (ETDEWEB)
Kim, C. H.; Han, B. S.; Park, H. J.; Park, D. K. [Seoul National Univ., Seoul (Korea, Republic of)
2007-01-15
The objective of this project is to analyze the nuclear performance of the Korean HCML (Helium Cooled Molten Lithium) TBM (Test Blanket Module), which will be installed in ITER (International Thermonuclear Experimental Reactor). The project analyzes the neutronic design and nuclear performance of the Korean HCML ITER TBM through transport calculations with MCCARD. In detail, we conduct numerical experiments to analyze the neutronic design of the Korean HCML TBM and the DEMO fusion blanket and to improve their nuclear performance. The results of the numerical experiments performed in this project will be utilized further for design optimization of the Korean HCML TBM. In this project, Monte Carlo transport calculations evaluating the TBR (tritium breeding ratio) and EMF (energy multiplication factor) were conducted to analyze the nuclear performance of the Korean HCML TBM. The activation characteristics and shielding performance of the Korean HCML TBM were analyzed using ORIGEN and MCCARD. We proposed neutronic methodologies for analyzing the nuclear characteristics of the fusion blanket, which were applied to the blanket analysis of a DEMO fusion reactor. In the results, the TBR of the Korean HCML ITER TBM is 0.1352 and the EMF is 1.362. Taking into account the limitation on the Li amount in an ITER TBM, it is expected that the tritium self-sufficiency condition can be satisfied through a change of the Li quantity and enrichment. In the activation and shielding analysis, the activity drops to 1.5% of the initial value and the decay heat to 0.02% of the initial amount 10 years after plasma shutdown.
Monte-Carlo based Uncertainty Analysis For CO2 Laser Microchanneling Model
Prakash, Shashi; Kumar, Nitish; Kumar, Subrata
2016-09-01
CO2 laser microchanneling has emerged as a potential technique for the fabrication of microfluidic devices on PMMA (poly-methyl-methacrylate). PMMA directly vaporizes when subjected to a high-intensity focused CO2 laser beam. This process results in a clean cut and acceptable surface finish on the microchannel walls. Overall, the CO2 laser microchanneling process is cost effective and easy to implement. While fabricating microchannels on PMMA using a CO2 laser, the maximum depth of the fabricated microchannel is the key feature. A few analytical models are available to predict the maximum depth of the microchannels and the cut channel profile on a PMMA substrate using a CO2 laser. These models depend upon the values of the thermophysical properties of PMMA and the laser beam parameters. A number of variants of transparent PMMA are available in the market with different values of thermophysical properties. Therefore, for applying such analytical models, the values of these thermophysical properties must be known exactly. Although the values of the laser beam parameters are readily available, extensive experiments must be conducted to determine the values of the thermophysical properties of PMMA. The unavailability of exact values of these property parameters restricts proper control over the microchannel dimensions for a given power and scanning speed of the laser beam. In order to have dimensional control over the maximum depth of fabricated microchannels, it is necessary to have an idea of the uncertainty associated with the predicted microchannel depth. In this research work, the uncertainty associated with the maximum depth dimension has been determined using the Monte Carlo method (MCM). The propagation of uncertainty with different powers and scanning speeds has been predicted. The relative impact of each thermophysical property has been determined using sensitivity analysis.
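The MCM step can be sketched by sampling the uncertain thermophysical properties and pushing each draw through a depth model. The generic energy-balance form below, and all nominal values and tolerances, are invented stand-ins, not the paper's specific analytical model:

```python
import random, math

# Monte Carlo propagation of property uncertainty into a predicted
# channel depth, d ~ P / (v * w * rho * (c*dT + L)); this generic
# form and every number in it are hypothetical.

random.seed(4)

P = 3.0           # laser power, W
v = 0.05          # scanning speed, m/s
w = 2.0e-4        # beam width at surface, m

def depth(rho, c, L, dT):
    return P / (v * w * rho * (c * dT + L))

samples = []
for _ in range(20000):
    rho = random.gauss(1180.0, 20.0)    # density, kg/m^3
    c = random.gauss(1470.0, 50.0)      # specific heat, J/(kg K)
    L = random.gauss(8.0e5, 0.5e5)      # effective latent heat, J/kg
    dT = random.gauss(340.0, 15.0)      # temperature rise, K
    samples.append(depth(rho, c, L, dT))

mean = sum(samples) / len(samples)
sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))
print(round(1e6 * mean, 1), "um +/-", round(100 * sd / mean, 1), "%")
```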
Graves, Yan Jiang; Jia, Xun; Jiang, Steve B
2013-03-21
The γ-index test has been commonly adopted to quantify the degree of agreement between a reference dose distribution and an evaluation dose distribution. Monte Carlo (MC) simulation has been widely used for radiotherapy dose calculation for both clinical and research purposes. The goal of this work is to investigate both theoretically and experimentally the impact of the MC statistical fluctuation on the γ-index test when the fluctuation exists in the reference, the evaluation, or both dose distributions. To first-order approximation, we theoretically demonstrated in a simplified model that the statistical fluctuation tends to overestimate γ-index values when it exists in the reference dose distribution and underestimate γ-index values when it exists in the evaluation dose distribution, given that the original γ-index is relatively large compared to the statistical fluctuation. Our numerical experiments using realistic clinical photon radiation therapy cases have shown that (1) when performing a γ-index test between an MC reference dose and a non-MC evaluation dose, the average γ-index is overestimated and the gamma passing rate decreases with increasing statistical noise level in the reference dose; (2) when performing a γ-index test between a non-MC reference dose and an MC evaluation dose, the average γ-index is underestimated when the doses are within the clinically relevant range and the gamma passing rate increases with increasing statistical noise level in the evaluation dose; (3) when performing a γ-index test between an MC reference dose and an MC evaluation dose, the gamma passing rate is overestimated due to the statistical noise in the evaluation dose and underestimated due to the statistical noise in the reference dose. We conclude that the γ-index test should be used with caution when comparing dose distributions computed with MC simulation.
Rizzo, Robert C; Udier-Blagović, Marina; Wang, De-Ping; Watkins, Edward K; Kroeger Smith, Marilyn B; Smith, Richard H; Tirado-Rives, Julian; Jorgensen, William L
2002-07-04
Results of Monte Carlo (MC) simulations for more than 200 nonnucleoside inhibitors of HIV-1 reverse transcriptase (NNRTIs) representing eight diverse chemotypes have been correlated with their anti-HIV activities in an effort to establish simulation protocols and methods that can be used in the development of more effective drugs. Each inhibitor was modeled in a complex with the protein and by itself in water, and potentially useful descriptors of binding affinity were collected during the MC simulations. A viable regression equation was obtained for each data set using an extended linear response approach, which yielded r(2) values between 0.54 and 0.85 and an average unsigned error of only 0.50 kcal/mol. The most common descriptors confirm that a good geometrical match between the inhibitor and the protein is important and that the net loss of hydrogen bonds with the inhibitor upon binding is unfavorable. Other physically reasonable descriptors of binding are needed on a chemotype case-by-case basis. By including descriptors in common from the individual fits, combination regressions that include multiple data sets were also developed. This procedure led to a refined "master" regression for 210 NNRTIs with an r(2) of 0.60 and a cross-validated q(2) of 0.55. The computed activities show an rms error of 0.86 kcal/mol in comparison with experiment and an average unsigned error of 0.69 kcal/mol. Encouraging results were obtained for the predictions of 27 NNRTIs, representing a new chemotype not included in the development of the regression model. Predictions for this test set using the master regression yielded a q(2) value of 0.51 and an average unsigned error of 0.67 kcal/mol. Finally, additional regression analysis reveals that use of ligand-only descriptors leads to models with much diminished predictive ability.
GPU-based Monte Carlo Dust Radiative Transfer Scheme Applied to Active Galactic Nuclei
Heymann, Frank; Siebenmorgen, Ralf
2012-05-01
A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing-time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core processors or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons. Anisotropic scattering is treated by applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon-counting procedure and at high spatial resolution by a vectorized ray tracer. The latter allows computation of high signal-to-noise images of the objects at arbitrary frequencies and viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust temperatures of one- and two-dimensional benchmarks are reproduced at high precision. The parallelization capability of various MC algorithms is analyzed and included in our treatment. We utilize the Lucy algorithm for the optically thin case where the Poisson noise is high, the iteration-free Bjorkman & Wood method to reduce the calculation time, and the Fleck & Canfield diffusion approximation for extremely optically thick cells. The code is applied to model the appearance of active galactic nuclei (AGNs) at optical and infrared wavelengths. The AGN torus is clumpy and includes fluffy composite grains of various sizes made up of silicates and carbon. The dependence of the SED on the number of clumps in the torus and on the viewing angle is studied. The appearance of the 10 μm silicate feature in absorption or emission is discussed. The SED of the radio-loud quasar 3C 249.1 is fit by the AGN model plus a cirrus component to account for the far-infrared emission.
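The Henyey-Greenstein phase function used for anisotropic scattering has a closed-form inverse CDF, so scattering-angle cosines can be drawn directly in an MC code. A minimal sketch (the asymmetry parameter g = 0.6 is an arbitrary choice, not a value from the paper):

```python
import numpy as np

def sample_hg_costheta(g, u):
    """Draw cos(theta) from the Henyey-Greenstein phase function by
    inverting its CDF; g is the asymmetry parameter in (-1, 1) and u is
    a uniform random number (or array) in [0, 1)."""
    if abs(g) < 1e-6:                       # isotropic limit
        return 2.0 * u - 1.0
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

rng = np.random.default_rng(1)
g = 0.6                                     # arbitrary forward-scattering value
mu = sample_hg_costheta(g, rng.random(200_000))
print(mu.mean())   # the expectation value of cos(theta) equals g
```

A quick consistency check on such a sampler is that the sample mean of cos(theta) converges to g, the defining property of the asymmetry parameter.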
Messina, Luca; Castin, Nicolas; Domain, Christophe; Olsson, Pär
2017-02-01
The quality of kinetic Monte Carlo (KMC) simulations of microstructure evolution in alloys relies on the parametrization of point-defect migration rates, which are complex functions of the local chemical composition and can be calculated accurately with ab initio methods. However, constructing reliable models that ensure the best possible transfer of physical information from ab initio to KMC is a challenging task. This work presents an innovative approach, where the transition rates are predicted by artificial neural networks trained on a database of 2000 migration barriers, obtained with density functional theory (DFT) in place of interatomic potentials. The method is tested on copper precipitation in thermally aged iron alloys, by means of a hybrid atomistic-object KMC model. For the object part of the model, the stability and mobility properties of copper-vacancy clusters are analyzed by means of independent atomistic KMC simulations, driven by the same neural networks. The cluster diffusion coefficients and mean free paths are found to increase with size, confirming the dominant role of coarsening of medium- and large-sized clusters in the precipitation kinetics. The evolution under thermal aging is in better agreement with experiments with respect to a previous interatomic-potential model, especially concerning the experiment time scales. However, the model underestimates the solubility of copper in iron due to the excessively high solution energy predicted by the chosen DFT method. Nevertheless, this work proves the capability of neural networks to transfer complex ab initio physical properties to higher-scale models, and facilitates the extension to systems with increasing chemical complexity, setting the ground for reliable microstructure evolution simulations in a wide range of alloys and applications.
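A residence-time (Gillespie-type) KMC step of the kind driven by the predicted rates can be sketched as below. The Arrhenius form, attempt frequency, and barrier values are illustrative assumptions, with the barriers standing in for neural-network predictions:

```python
import math
import random

def predict_rate(barrier_eV, temperature_K, attempt_freq=6.0e12):
    """Arrhenius migration rate from a barrier. In the scheme above the
    barrier would be predicted by a neural network from the local chemical
    environment; here it is simply passed in."""
    kB = 8.617333e-5  # Boltzmann constant, eV/K
    return attempt_freq * math.exp(-barrier_eV / (kB * temperature_K))

def kmc_step(barriers_eV, temperature_K, rng=random):
    """One residence-time KMC step: choose an event with probability
    proportional to its rate, then advance the clock by -ln(u)/R_total."""
    rates = [predict_rate(b, temperature_K) for b in barriers_eV]
    total = sum(rates)
    target = rng.random() * total          # roulette-wheel event selection
    acc = 0.0
    chosen = len(rates) - 1
    for i, r in enumerate(rates):
        acc += r
        if acc >= target:
            chosen = i
            break
    dt = -math.log(1.0 - rng.random()) / total
    return chosen, dt

random.seed(0)
event, dt = kmc_step([0.55, 0.62, 0.70, 0.58], temperature_K=563.0)
print(event, dt)
```

The quality of the whole simulation hinges on the rates fed into this loop, which is exactly why the paper replaces interatomic-potential barriers with DFT-trained neural-network predictions.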
Proton Linear Energy Transfer measurement using Emulsion Cloud Chamber
Energy Technology Data Exchange (ETDEWEB)
Shin, Jae-ik [Proton Therapy Center, National Cancer Center (Korea, Republic of); Division of Heavy Ion Clinical Research, Korea Institute of Radiological & Medical Sciences (KIRAMS), Seoul (Korea, Republic of); Park, Seyjoon [Department of Radiation Oncology, Samsung Medical Center, Sungkyunkwan University, School of Medicine, Seoul (Korea, Republic of); Kim, Haksoo; Kim, Meyoung [Proton Therapy Center, National Cancer Center (Korea, Republic of); Jeong, Chiyoung [Department of Radiation Oncology, Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of); Cho, Sungkoo [Department of Radiation Oncology, Samsung Medical Center, Sungkyunkwan University, School of Medicine, Seoul (Korea, Republic of); Lim, Young Kyung; Shin, Dongho [Proton Therapy Center, National Cancer Center (Korea, Republic of); Lee, Se Byeong, E-mail: sblee@ncc.re.kr [Proton Therapy Center, National Cancer Center (Korea, Republic of); Morishima, Kunihiro; Naganawa, Naotaka; Sato, Osamu [Department of Physics, Nagoya University, Nagoya (Japan); Kwak, Jungwon [Department of Radiation Oncology, Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of); Kim, Sung Hyun [Center for Underground Physics, Institute for Basic Science (IBS), Daejeon (Korea, Republic of); Cho, Jung Sook [Department of refinement education, Dongseo University, Busan (Korea, Republic of); Ahn, Jung Keun [Department of Physics, Korea University, Seoul (Korea, Republic of); Kim, Ji Hyun; Yoon, Chun Sil [Gyeongsang National University, Jinju (Korea, Republic of); Incerti, Sebastien [CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Université Bordeaux 1, CENBG, UMR 5797, F-33170 Gradignan (France)
2015-04-15
This study proposes to determine the correlation between the Volume Pulse Height (VPH) measured by nuclear emulsion and the Linear Energy Transfer (LET) calculated by Monte Carlo simulation based on Geant4. The nuclear emulsion was irradiated at the National Cancer Center (NCC) with a therapeutic proton beam and was installed 5.2 m from the beam nozzle; water-equivalent material (PMMA) blocks of various thicknesses were used to position it at specific points along the Bragg curve. After the beam exposure and development of the emulsion films, the films were scanned by the S-UTS system developed at Nagoya University. The proton tracks in the scanned films were reconstructed using the ‘NETSCAN’ method. Through this procedure, the VPH can be derived for each reconstructed proton track at each position along the Bragg curve. The VPH value indicates the magnitude of energy loss along the proton track. By comparison with simulation results obtained using Geant4, we found the correlation between the LET calculated by Monte Carlo simulation and the VPH measured by the nuclear emulsion.
Wang, Chunmei
Proton exchange membrane (PEM) fuel cells are regarded as highly promising energy conversion systems for future transportation and stationary power generation and have been under intensive investigation for the last decade. Unfortunately, cutting-edge PEM fuel cell designs and components still do not allow economical commercial implementation of this technology. The main obstacles are the high cost of proton-conductive membranes, low proton conductivity at low relative humidity (RH), and dehydration and degradation of polymer membranes at high temperatures. The objective of this study was to develop a systematic approach to designing a highly proton-conductive composite membrane that can provide a conductivity of approximately 100 mS cm-1 under hot and dry conditions (120°C and 50% RH). The approach was based on fundamental and experimental studies of the proton conductivity of inorganic additives and composite membranes. We synthesized and investigated a variety of organic-inorganic Nafion-based composite membranes. In particular, we analyzed their fundamental properties, which included thermal stability, morphology, the interaction between the inorganic network and Nafion clusters, and the effect of the inorganic phase on membrane conductivity. A wide range of inorganic materials was studied in advance in order to select the proton-conductive inorganic additives for the composite membranes. We developed a conductivity measurement method, with which the proton conductivity characteristics of solid acid materials, zirconium phosphates, sulfated zirconia (S-ZrO2), phosphosilicate gels, and Santa Barbara Amorphous silica (SBA-15) were discussed in detail. Composite membranes containing Nafion and different amounts of functionalized inorganic additives (sulfated inorganics such as S-ZrO2, SBA-15, Mobil Composition of Matter MCM-41, and S-SiO2, and the phosphonated inorganic P-SiO2) were synthesized with different methods. We incorporated inorganic particles within Nafion clusters
Design of a 10 MeV normal conducting CW proton linac based on equidistant multi-gap CH cavities
Li, Zhihui
2014-01-01
The continuous wave (CW) high-current proton linac has wide applications as the front end of high-power proton machines. The low-energy part is the most difficult, and there is no widely accepted solution yet. Based on an analysis of the focusing properties of the CW low-energy proton linac, a 10 MeV low-energy normal conducting proton linac based on equidistant seven-gap Cross-bar H-type (CH) cavities is proposed. The linac is composed of ten 7-gap CH cavities and the transverse focusing is maintained by the quadrupole doublets located between cavities. The total length of the linac is less than 6 meters and the average acceleration gradient is about 1.2 MeV/m. The electromagnetic properties of the cavities are investigated by Microwave Studio. At the nominal acceleration gradient the maximum surface electric field in the cavities is less than 1.3 times the Kilpatrick limit, and the Ohmic loss of each cavity is less than 35 kW. The multi-particle beam dynamics simulations are performed with the help of the...
Design of a 10 MeV normal conducting CW proton linac based on equidistant multi-gap CH cavities
Li, Zhi-Hui
2015-09-01
Continuous wave (CW) high-current proton linacs have wide applications as the front end of high-power proton machines. The low-energy part of such a linac is the most difficult, and there is currently no widely accepted solution. Based on an analysis of the focusing properties of the CW low-energy proton linac, a 10 MeV low-energy normal conducting proton linac based on equidistant seven-gap Cross-bar H-type (CH) cavities is proposed. The linac is composed of ten 7-gap CH cavities, and the transverse focusing is maintained by quadrupole doublets located between the cavities. The total length of the linac is less than 6 meters and the average acceleration gradient is about 1.2 MeV/m. The electromagnetic properties of the cavities are investigated with Microwave Studio. At the nominal acceleration gradient the maximum surface electric field in the cavities is less than 1.3 times the Kilpatrick limit, and the Ohmic loss of each cavity is less than 35 kW. Multi-particle beam dynamics simulations are performed with the TraceWin code, and the results show that the beam dynamics of the linac are quite stable and that it can accelerate beams of up to 30 mA with acceptable dynamics behavior. Supported by National Natural Science Foundation of China (11375122, 91126003)
Nichiporov, D.; Coutinho, L.; Klyachko, A. V.
2016-04-01
Accurate, high-spatial resolution dosimetry in proton therapy is a time consuming task, and may be challenging in the case of small fields, due to the lack of adequate instrumentation. The purpose of this work is to develop a novel dose imaging detector with high spatial resolution and tissue equivalent response to dose in the Bragg peak, suitable for beam commissioning and quality assurance measurements. A scintillation gas electron multiplier (GEM) detector based on a double GEM amplification structure with optical readout was filled with a He/CF4 gas mixture and evaluated in pristine and modulated proton beams of several penetration ranges. The detector’s performance was characterized in terms of linearity in dose rate, spatial resolution, short- and long-term stability and tissue-equivalence of response at different energies. Depth-dose profiles measured with the GEM detector in the 115-205 MeV energy range were compared with the profiles measured under similar conditions using the PinPoint 3D small-volume ion chamber. The GEM detector filled with a He-based mixture has a nearly tissue equivalent response in the proton beam and may become an attractive and efficient tool for high-resolution 2D and 3D dose imaging in proton dosimetry, and especially in small-field applications.
Harrison, Kevin W.
This research endeavor began with the design and construction of a new hydrogen test facility at the National Renewable Energy Laboratory (NREL). To improve the electrical link of wind-based electrolysis, a proton exchange membrane (PEM) electrolyzer was characterized under varying input power at NREL's new test facility. The commercially available electrolyzer from Proton Energy Systems (PES) was characterized using constant direct current (DC), sinusoidally varying DC, photovoltaics, and variable-magnitude and -frequency energy from a 10 kW wind turbine. At rated stack current and ˜40°C, the system efficiency of the commercial electrolyzer was measured to be 55%. At lower stack current it was shown that the commercial electrolyzer's system efficiency falls because of the continuous hydrogen purge (˜0.1 Nm3 hr-1) used to maintain the hydrogen desiccant drying system. A novel thermoelectric-based dew point controller is designed and modeled to reduce the penalty to renewable sources, because they do not always operate at 100% of rated stack current. It is predicted that the thermoelectric design, when operated 100% of the time at full current to the thermoelectric modules, would consume 3.1 kWh kg-1 of hydrogen. Using the higher heating value of hydrogen and a stack efficiency of 60% to produce the hydrogen that is continuously vented, the desiccant system consumes about 5.7 kWh kg-1. Design of the UND electrolyzer sub-systems responsible for all aspects of water, power to the stack, and hydrogen conditioning enables more flexible and precise experimental data to be obtained than from an off-the-shelf system. Current-voltage (IV) characteristic curves were obtained on the UND system at temperatures between 7-70°C. The anode and cathode exchange current densities are fitted to 2.0E-06 e^(0.043T) and 0.12 e^(0.026T) A cm-2, respectively. Stack conductivity was fitted to 0.001T + 0.03 S cm-1. The three coefficients represent physical stack parameters and are
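The fitted stack expressions quoted above can be evaluated directly at an operating point; this sketch assumes temperature enters in degrees Celsius (the fits span 7-70°C) and that the stated units apply:

```python
import math

def stack_parameters(T_celsius):
    """Evaluate the fitted expressions quoted in the abstract; temperature
    is assumed to enter in degrees Celsius, as the fits span 7-70 C."""
    i0_anode = 2.0e-6 * math.exp(0.043 * T_celsius)   # anode exchange current density, A cm-2
    i0_cathode = 0.12 * math.exp(0.026 * T_celsius)   # cathode exchange current density, A cm-2
    sigma = 0.001 * T_celsius + 0.03                  # stack conductivity, S cm-1
    return i0_anode, i0_cathode, sigma

ia, ic, s = stack_parameters(50.0)
print(f"i0_anode = {ia:.2e} A cm-2, i0_cathode = {ic:.3f} A cm-2, sigma = {s:.3f} S cm-1")
```

Such closed-form fits let a system model predict stack voltage and efficiency across the measured temperature range without re-running the experiments.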
Institute of Scientific and Technical Information of China (English)
徐克; 何华刚; 朱毅川
2012-01-01
Most current gas dispersion simulations use computational fluid dynamics methods to analyze the dynamics of the dispersion process. Methods such as the finite volume method (FVM) and the finite element method (FEM) require meshing the entire accident area, and their computational efficiency cannot meet the emergency-response requirements of long-distance pipeline accidents, which span multiple regions, varied meteorological conditions, and complex terrain. By simulating the random walk of a finite number of gas particles in the mean wind field predicted by the RAMS model, the Monte Carlo method avoids the performance degradation caused by the conflict between computing efficiency and meshing accuracy. In addition, the HAVEGE method, which collects hardware entropy from the computer to form a random source, corrects the pseudorandom-number problem and improves the accuracy of the Monte Carlo computation. The results show that Monte Carlo-based gas dispersion simulation satisfies the requirements of emergency decision-making for long-distance pipeline disasters.
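The particle random walk in a mean wind field described above can be sketched as a simple Lagrangian dispersion model; the constant wind vector and turbulence amplitude here are placeholders for a RAMS-predicted field:

```python
import numpy as np

rng = np.random.default_rng(2)

def disperse(n_particles, n_steps, dt, wind, sigma):
    """Lagrangian random-walk dispersion: each particle is advected by the
    mean wind (a constant 2-D vector standing in for the RAMS field) and
    perturbed by an independent Gaussian turbulent step."""
    pos = np.zeros((n_particles, 2))
    for _ in range(n_steps):
        pos += wind * dt + rng.normal(scale=sigma * np.sqrt(dt), size=pos.shape)
    return pos

pos = disperse(n_particles=5000, n_steps=200, dt=1.0,
               wind=np.array([3.0, 0.5]), sigma=1.5)
print(pos.mean(axis=0))   # plume centre near (600, 100) after 200 s of advection
```

Because each particle is independent, no mesh of the accident area is needed, which is the efficiency advantage the abstract claims over FVM/FEM approaches.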
Monte Carlo-based diode design for correction-less small field dosimetry
Charles, P. H.; Crowe, S. B.; Kairn, T.; Knight, R. T.; Hill, B.; Kenny, J.; Langton, C. M.; Trapp, J. V.
2013-07-01
Due to their small collecting volume, diodes are commonly used in small field dosimetry. However, the relative sensitivity of a diode increases with decreasing small field size. Conversely, small air gaps have been shown to cause a significant decrease in the sensitivity of a detector as the field size is decreased. Therefore, this study uses Monte Carlo simulations to investigate introducing air upstream of diodes such that they measure with a constant sensitivity across all field sizes in small field dosimetry. Varying thicknesses of air were introduced onto the upstream end of two commercial diodes (PTW 60016 photon diode and PTW 60017 electron diode), as well as a theoretical unenclosed silicon chip, using field sizes as small as 5 mm × 5 mm. The metric D_{w,Q}/D_{Det,Q} used in this study represents the ratio of the dose to a point of water to the dose to the diode active volume, for a particular field size and location. The optimal thickness of air required to provide a constant sensitivity across all small field sizes was found by plotting D_{w,Q}/D_{Det,Q} as a function of introduced air gap size for various field sizes, and finding the intersection point of these plots; that is, the point at which D_{w,Q}/D_{Det,Q} was constant for all field sizes. The optimal thickness of air was calculated to be 3.3, 1.15 and 0.10 mm for the photon diode, electron diode and unenclosed silicon chip, respectively. The variation in these results was due to the different design of each detector. When calculated with the new diode design incorporating the upstream air gap, the correction factor k_{Q_clin,Q_msr}^{f_clin,f_msr} was equal to unity to within statistical uncertainty (0.5%) for all three diodes. Cross-axis profile measurements were also improved with the new detector design. The upstream air gap could be implemented on the commercial diodes via a cap consisting of the air cavity surrounded by water-equivalent material. The
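The intersection procedure described above, finding the air-gap thickness at which the dose ratio is the same for every field size, can be sketched with synthetic ratio curves. The slopes and field sizes below are invented; only the 1.15 mm crossing point echoes the abstract's electron-diode result:

```python
import numpy as np

# Synthetic stand-ins for the Monte Carlo-calculated dose ratios D_w/D_det:
# one curve per field size, each roughly linear in the introduced air gap,
# constructed here to cross at a gap of 1.15 mm.
gaps = np.linspace(0.0, 3.0, 61)             # mm of air added upstream
slopes = {5: 0.040, 10: 0.025, 30: 0.010}    # field sizes in mm (hypothetical slopes)
curves = {f: 1.00 + s * (gaps - 1.15) for f, s in slopes.items()}

# The optimal gap is where the curves intersect, i.e. where the spread of
# the ratio across field sizes is smallest.
stack = np.vstack(list(curves.values()))
spread = stack.max(axis=0) - stack.min(axis=0)
best_gap = gaps[spread.argmin()]
print(best_gap)   # by construction, 1.15 mm
```

Minimizing the spread across field sizes generalizes the "find the intersection point" reading of the plots to noisy Monte Carlo data, where the curves never cross at exactly one point.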
Evaluation of a commercial electron treatment planning system based on Monte Carlo techniques (eMC).
Pemler, Peter; Besserer, Jürgen; Schneider, Uwe; Neuenschwander, Hans
2006-01-01
A commercial electron beam treatment planning system based on a Monte Carlo algorithm (Varian Eclipse, eMC V7.2.35) was evaluated. Measured dose distributions were used for comparison with dose distributions predicted by eMC calculations. Tests were carried out for various applicators and field sizes, irregularly shaped cut-outs and an inhomogeneity phantom, for energies between 6 MeV and 22 MeV. Monitor units were calculated for all applicator/energy combinations and field sizes down to 3 cm diameter and source-to-surface distances of 100 cm and 110 cm. A mass-density-to-Hounsfield-units calibration was performed to compare dose distributions calculated with a default and an individual calibration. The relationship between calculation parameters of the eMC and the resulting dose distribution was studied in detail. Finally, the algorithm was also applied to a clinical case (boost treatment of the breast) to reveal possible problems in the implementation. For standard geometries there was good agreement between measurements and calculations, except for profiles at low energies (6 MeV) and high energies (18 MeV, 22 MeV), in which cases the algorithm overestimated the dose off-axis in the high-dose region. For energies of 12 MeV and higher there were oscillations in the plateau region of the corresponding depth dose curves calculated with a grid size of 1 mm. With irregular cut-outs, an overestimation of the dose was observed for small slits and low energies (4% for 6 MeV), as well as for asymmetric cases and extended source-to-surface distances (12% for SSD = 120 cm). While all monitor unit calculations for SSD = 100 cm were within 3% of measurements, there were large deviations for small cut-outs and source-to-surface distances larger than 100 cm (7% for a 3 cm diameter cut-out at a source-to-surface distance of 110 cm).
Energy Technology Data Exchange (ETDEWEB)
Kumta, Prashant N.; Kadakia, Karan Sandeep; Datta, Moni Kanchan; Velikokhatnyi, Oleg
2017-02-07
The invention provides electro-catalyst compositions for an anode electrode of a proton exchange membrane-based water electrolysis system. The compositions include a noble metal component selected from the group consisting of iridium oxide, ruthenium oxide, rhenium oxide and mixtures thereof, and a non-noble metal component selected from the group consisting of tantalum oxide, tin oxide, niobium oxide, titanium oxide, tungsten oxide, molybdenum oxide, yttrium oxide, scandium oxide, copper oxide, zirconium oxide, nickel oxide and mixtures thereof. Further, the non-noble metal component can include a dopant. The dopant can be at least one element selected from Groups III, V, VI and VII of the Periodic Table. The compositions can be prepared using a surfactant approach or a sol-gel approach. Further, the compositions are prepared using noble metal and non-noble metal precursors. Furthermore, a thin film containing the compositions can be deposited onto a substrate to form the anode electrode.
Proton and electron deep dose profiles for retinoblastoma based on GEANT 4 code
Energy Technology Data Exchange (ETDEWEB)
Braga, Flavia V., E-mail: flaviafisica@gmail.co [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Programa de Pos-graduacao em Ciencias e Tecnicas Nucleares; Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Campos, Tarcisio P.R. de [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Programa de Pos-graduacao em Ciencias e Tecnicas Nucleares; Ribeiro, Kilder L., E-mail: kilderlr@gmail.co [Universidade Estadual de Feira de Santana (UEFS), BA (Brazil). Dept. de Fisica
2009-07-01
Herein, the dosimetric responses to retinoblastoma proton and electron radiation therapy were investigated. The computational tool applied to this simulation was the Geant4 code, version 4.9.1. The code allows simulating the interaction of charged particles with eyeball tissue. In the present simulation, a water-filled box of 4 cm side represented the human eye. The simulation was performed considering monoenergetic beams of protons and electrons, with spectra of 57 to 70 MeV for protons and 2 to 8 MeV for electrons. The simulation was guided by the advanced hadron therapy example distributed with the Geant4 code. The phantom was divided into voxels of 0.2 mm side. The energy deposited in each voxel was evaluated with the beam incident on one face. The simulation results show the delivered energy and therefore the dose deposited in each voxel. The depth-dose profiles for protons and electrons were plotted. The well-known Bragg peak was reproduced for protons, and the position of the maximum delivered dose defined where the protons stopped. For electrons, however, the absorbed energy was delivered along the path, producing a more continuous distribution with water depth, with the electrons also stopping at the end of their path. (author)
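The voxel-based energy scoring described above can be sketched as a 1-D histogram along the beam axis; the deposition events below are synthetic stand-ins for Geant4 steps, not actual simulation output:

```python
import numpy as np

def depth_dose(depths_cm, energies_MeV, phantom_cm=4.0, voxel_cm=0.02):
    """Accumulate energy depositions into voxels along the beam axis
    (a 1-D projection of the abstract's 0.2 mm voxel scoring)."""
    n_vox = int(round(phantom_cm / voxel_cm))
    edges = np.linspace(0.0, phantom_cm, n_vox + 1)
    profile, _ = np.histogram(depths_cm, bins=edges, weights=energies_MeV)
    return edges, profile

# Hypothetical deposition events (depth, energy): crude stand-ins for
# Geant4 proton steps, with most energy released near the track end.
rng = np.random.default_rng(3)
depths = np.clip(rng.normal(loc=3.1, scale=0.1, size=10_000), 0.0, 3.999)
energies = rng.uniform(0.5, 1.0, size=10_000)
edges, profile = depth_dose(depths, energies)
peak_depth = edges[profile.argmax()]
print(peak_depth)   # near 3.1 cm, the assumed peak position
```

Locating the maximum of the scored profile is exactly how the abstract identifies the proton stopping depth from the Bragg peak.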
AUTHOR|(SzGeCERN)655637
The measurement of a prompt photon associated with a b-jet in proton-proton interactions can provide insight into the inner structure of the proton, because such a measurement can improve the precision of the determination of the b-quark and gluon parton distribution functions. The measurement of the cross-section of a prompt photon associated with a b-jet (the process $pp \longrightarrow \gamma + b + X$) at $\sqrt{s}$ = 8 TeV with the ATLAS detector is presented. The full 8 TeV dataset collected by ATLAS during the year 2012 was used in this analysis; the corresponding integrated luminosity is 20.3 $fb^{-1}$. The fiducial differential cross-section as a function of photon transverse momentum at particle level was extracted from the data and compared with the prediction of the leading-order event generator Pythia 8. The cross-section extracted from data is normalised independently of the Monte Carlo prediction. The values of the data distribution lie above the Monte Carlo values. The difference can be explained by the presence of higher-order effects not ...
Energy Technology Data Exchange (ETDEWEB)
Yu, Xudong, E-mail: 081022009@fudan.edu.cn [College of Chemistry and Material Sciences, Hebei Normal University, Yuhua Road 113, Shijiazhuang 050024 (China); College of Science and Hebei Research Center of Pharmaceutical and Chemical Engineering, Hebei University of Science and Technology, Yuhua Road 70, Shijiazhuang 050080 (China); Zhang, Ping [College of Chemistry and Material Sciences, Hebei Normal University, Yuhua Road 113, Shijiazhuang 050024 (China); Li, Yajuan; Zhen, Xiaoli; Geng, Lijun; Wang, Yanqiu [College of Science and Hebei Research Center of Pharmaceutical and Chemical Engineering, Hebei University of Science and Technology, Yuhua Road 70, Shijiazhuang 050080 (China); Ma, Zichuan, E-mail: ma7405@hebtu.edu.cn [College of Chemistry and Material Sciences, Hebei Normal University, Yuhua Road 113, Shijiazhuang 050024 (China)
2014-07-01
In this paper, a new phenol-based chemosensor, L2, comprising a Schiff base and azo groups, was rationally designed and synthesized. It could selectively recognize the fluoride anion among tested anions such as F{sup −}, AcO{sup −}, H{sub 2}PO{sub 4}{sup −}, Cl{sup −}, Br{sup −}, and I{sup −}, with obvious color changes from yellow to fuchsia. The intramolecular proton transfer (PT) in L1 and L2 was responsible for the sensing ability, which was verified by {sup 1}H NMR and UV–vis experiments. - Highlights: • The phenol derivative L2 could selectively sense F{sup −} among tested anions. • Intramolecular proton transfer occurred when L2 was bonded with F{sup −}. • It is the first antipyrine-based anion receptor.
Proton exchange membranes based on the short-side-chain perfluorinated ionomer
Ghielmi, A.; Vaccarono, P.; Troglia, C.; Arcella, V.
Due to the renewed availability of the base monomer for the synthesis of the short-side-chain (SSC) perfluorinated ionomer, fuel cell membrane development is being pursued using this well-known ionomer structure, which was originally developed by Dow in the 1980s. The new membranes under development have the trade name Hyflon Ion. After briefly reviewing the literature on the Dow ionomer, new characterization data are reported on extruded Hyflon Ion membranes. The data are compared to those available in the literature on the Dow SSC ionomer and membranes. Comparison is also made with data obtained in this work or available in the literature on the long-side-chain (LSC) perfluorinated ionomer (Nafion). Thermal, visco-elastic, water absorption and mechanical properties of Hyflon Ion are studied. While the general behavior is similar to that shown in the past by the Dow membranes, slight differences are evident in the hydration behavior at equal equivalent weight (EW) and in the glass transition temperature compared to Nafion, which makes it a more promising material for high temperature proton exchange membrane (PEM) fuel cell operation (T > 100 °C). Beginning-of-life fuel cell performance has also been confirmed to be higher than that given by a Nafion membrane of equal thickness.
Photoisomerization action spectrum of retinal protonated Schiff base in the gas phase
Energy Technology Data Exchange (ETDEWEB)
Coughlan, N. J. A.; Catani, K. J.; Adamson, B. D.; Wille, U.; Bieske, E. J., E-mail: evanjb@unimelb.edu.au [School of Chemistry, The University of Melbourne, Victoria 3010 (Australia)
2014-04-28
The photophysical behaviour of the isolated retinal protonated n-butylamine Schiff base (RPSB) is investigated in the gas phase using a combination of ion mobility spectrometry and laser spectroscopy. The RPSB cations are introduced by electrospray ionisation into an ion mobility mass spectrometer where they are exposed to tunable laser radiation in the region of the S{sub 1} ← S{sub 0} transition (420–680 nm range). Four peaks are observed in the arrival time distribution of the RPSB ions. On the basis of predicted collision cross sections with nitrogen gas, the dominant peak is assigned to the all-trans isomer, whereas the subsidiary peaks are assigned to various single, double and triple cis geometric isomers. RPSB ions that absorb laser radiation undergo photoisomerization, leading to a detectable change in their drift speed. By monitoring the photoisomer signal as a function of laser wavelength an action spectrum, extending from 480 to 660 nm with a clear peak at 615 ± 5 nm, is obtained. The photoisomerization action spectrum is related to the absorption spectrum of isolated retinal RPSB molecules and should help benchmark future electronic structure calculations.
Energy Technology Data Exchange (ETDEWEB)
Malhado, Joao Pedro [Instituto de Fisica, Universidade de Sao Paulo, CP 66318, 05314-970 Sao Paulo, SP (Brazil); Hynes, James T. [Department of Chemistry and Biochemistry, University of Colorado, Boulder, Colorado 80309-0215 (United States); Chemistry Department, Ecole Normale Superieure, UMR ENS-CNRS-UPMC 8640, 24 rue Lhomond, 75005 Paris (France)
2012-12-14
The topographical character of conical intersections (CIs), either sloped or peaked, has played a fundamental and important role in the discussion of the efficiency of CIs as photochemical 'funnels.' Here this perspective is employed in connection with a recent study of a model protonated Schiff base (PSB) cis-to-trans photoisomerization in solution [Malhado et al., J. Phys. Chem. A 115, 3720 (2011)]. In that study, the reduced photochemical quantum yield calculated for the successful production of the trans product versus the cis reactant in acetonitrile solvent, compared to water, was interpreted in terms of a dynamical solvent effect related to the dominance, in the acetonitrile case, of S{sub 1} to S{sub 0} nonadiabatic transitions prior to reaching the seam of CIs. The solvent influence on the quantum yield is here re-examined in the sloped/peaked CI topographical perspective via conversion of the model's two PSB internal coordinates and a nonequilibrium solvent coordinate into an effective branching-space description, which is then used to re-analyze the generalized Langevin equation/surface hopping results. The present study supports the original interpretation and enriches it with topographical detail.
Seyyedmajidi, Mohammadreza; Ahmadi, Anahita; Hajiebrahimi, Shahin; Seyedmajidi, Seyedali; Rajabikashani, Majid; Firoozabadi, Mona; Vafaeimanesh, Jamshid
2016-01-01
Objective: Proton pump inhibitor-based triple therapy with two antibiotics for Helicobacter pylori eradication is widely accepted, but this combination fails in a considerable number of cases. Some studies have shown that cranberry inhibits the adhesion of a wide range of microbial pathogens, including H. pylori. The aim of this study was to assess the effect of cranberry on H. pylori eradication with a standard therapy including lansoprazole, clarithromycin, and amoxicillin (LCA) in patients with peptic ulcer disease (PUD). Methods: In this study, H. pylori-positive patients with PUD were randomized into two groups: Group A: a 14-day LCA triple therapy with 30 mg lansoprazole bid, 1000 mg amoxicillin bid, and 500 mg clarithromycin bid; Group B: 14 days of 500 mg cranberry capsules bid plus the LCA triple therapy. A 13C-urea breath test was performed for eradication assessment 6 weeks after the completion of the treatment. Findings: Two hundred patients (53.5% males, between 23 and 77 years; mean age ± standard deviation: 50.29 ± 17.79 years) completed the treatment protocols and underwent 13C-urea breath testing. H. pylori eradication was achieved in 74% in Group A (LCA without cranberry) and 89% in Group B (LCA with cranberry) (P = 0.042). Conclusion: The addition of cranberry to LCA triple therapy for H. pylori achieves a significantly higher eradication rate (89% vs. 74%) than the standard regimen alone. PMID:27843960
15 MeV proton irradiation effects on Bi-based high temperature superconductors
Energy Technology Data Exchange (ETDEWEB)
Alinejad, N.; Sohrabi, D. [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of). Plasma and Nuclear Fusion Research School; Bolori, F. [Karaj Agricultural, Medical, and Industrial Research School, Karaj (Iran, Islamic Republic of)
2015-11-15
Nowadays, superconducting magnetic coils are used in some tokamaks such as EAST, KSTAR, JT-60, and T-15 to generate strong magnetic fields, and in ITER magnetic fields of about 13 tesla will be produced with the help of superconductors. Tokamak superconductors are exposed to a variety of radiation (neutrons, ion beams, and gamma rays) from plasma nuclear reactions, which affects some of the superconductor properties. Therefore, the study of irradiation effects on superconductor structure and properties is crucial from a technological and scientific point of view. One irradiation effect to be investigated under different conditions of energy and dosage is the potential resistance of the material used in tokamak reactor magnetic coils against activation by radiation. In this work, pellets of high-T{sub c} Bi-based superconductors were prepared and, after measurement of their parameters, a sample pellet was irradiated with 15 MeV protons using the Karaj cyclotron facility. The sample's parameters were measured again after the irradiation treatment. X-ray diffraction patterns and SEM images of the sample before and after irradiation have been studied.
Proton conducting sodium alginate electrolyte laterally coupled low-voltage oxide-based transistors
Energy Technology Data Exchange (ETDEWEB)
Liu, Yang Hui; Wan, Qing, E-mail: wanqing@nju.edu.cn [Ningbo Institute of Materials Technology and Engineering, Chinese Academy of Sciences, Ningbo 315201 (China); School of Electronic Science and Engineering, Nanjing University, Nanjing 210093 (China); Qiang Zhu, Li, E-mail: lqzhu@nimte.ac.cn [Ningbo Institute of Materials Technology and Engineering, Chinese Academy of Sciences, Ningbo 315201 (China); Shi, Yi [School of Electronic Science and Engineering, Nanjing University, Nanjing 210093 (China)
2014-03-31
Solution-processed sodium alginate electrolyte film shows a high proton conductivity of ∼5.5 × 10{sup −3} S/cm and a high lateral electric-double-layer (EDL) capacitance of ∼2.0 μF/cm{sup 2} at room temperature at a relative humidity of 57%. Low-voltage in-plane-gate indium-zinc-oxide-based EDL transistors laterally gated by sodium alginate electrolytes are fabricated on glass substrates. The field-effect mobility, current ON/OFF ratio, and subthreshold swing of such EDL transistors are estimated to be 4.2 cm{sup 2} V{sup −1} s{sup −1}, 2.8 × 10{sup 6}, and 130 mV/decade, respectively. Finally, a low-voltage resistor-loaded inverter is also demonstrated. Such in-plane-gate EDL transistors have potential applications in portable electronics and low-cost biosensors.
Modeling the Earth's radiation belts. A review of quantitative data based electron and proton models
Vette, J. I.; Teague, M. J.; Sawyer, D. M.; Chan, K. W.
1979-01-01
The evolution of quantitative models of the trapped radiation belts is traced to show how knowledge of the various features has developed, or been clarified, by performing the required analysis and synthesis. The Starfish electron injection introduced problems in the time behavior of the inner zone, but this residue decayed away, and a good model of this depletion now exists. The outer zone electrons were handled statistically by a log normal distribution such that above 5 Earth radii there are no long-term changes over the solar cycle. The transition region between the two zones presents the most difficulty; therefore, the behavior of individual substorms as well as long-term changes must be studied. The latest corrections to the electron environment based on new data are outlined. The proton models have evolved to the point where the solar cycle effect at low altitudes is included. Trends for new models are discussed; the feasibility of predicting substorm injections and solar wind high-speed streams makes the modeling of individual events a topical activity.
Javed, Kamran; Gouriveau, Rafael; Zerhouni, Noureddine; Hissel, Daniel
2016-08-01
Proton Exchange Membrane Fuel Cells (PEMFCs) are considered the most versatile among available fuel cell technologies, qualifying them for diverse applications. However, large-scale industrial deployment of PEMFCs is limited by their short life span and high exploitation costs. Ensuring fuel cell service for a long duration is therefore of vital importance, which has led to Prognostics and Health Management of fuel cells. More precisely, prognostics of PEMFCs is a major area of focus nowadays, which aims at identifying degradation of a PEMFC stack at early stages and estimating its Remaining Useful Life (RUL) for life cycle management. This paper presents a data-driven approach for prognostics of a PEMFC stack using an ensemble of constraint-based Summation Wavelet-Extreme Learning Machine (SW-ELM) models. This development aims at improving the robustness and applicability of PEMFC prognostics in an online application with limited learning data. The proposed approach is applied to real data from two different PEMFC stacks and compared with ensembles of well-known connectionist algorithms. The comparison of results on long-term prognostics of both PEMFC stacks validates our proposition.
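The SW-ELM models in the ensemble build on the Extreme Learning Machine idea: hidden-layer weights are drawn at random and only the output weights are solved, in closed form, by least squares. A minimal NumPy sketch of a plain ELM fitted to a toy degradation trend follows; the wavelet activation, constraints, and real stack data of the paper are omitted, and all names and values here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=30):
    """Basic Extreme Learning Machine: random input weights,
    output weights solved in closed form by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form output layer
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy "stack voltage degradation" series: slow linear drift plus noise.
t = np.linspace(0.0, 1.0, 200)[:, None]
v = 3.3 - 0.4 * t.ravel() + 0.01 * rng.normal(size=200)

model = elm_fit(t, v)
rmse = np.sqrt(np.mean((elm_predict(model, t) - v) ** 2))
```

Because the hidden layer is never trained, fitting reduces to one linear solve, which is what makes ELM-type models attractive for online prognostics.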
Automatic Monte-Carlo Tuning for Minimum Bias Events at the LHC
Kama, Sami; Kolanoski, Hermann
The Large Hadron Collider near Geneva, Switzerland will ultimately collide protons at a center-of-mass energy of 14 TeV and a 40 MHz bunch crossing rate with a luminosity of 10^{34} cm^{-2} s^{-1}. At each bunch crossing about 20 soft proton-proton interactions are expected to happen. In order to study new phenomena and improve our current knowledge of the physics, these events must be understood. However, the physics of soft interactions is not completely known at such high energies. Different phenomenological models trying to explain these interactions are implemented in several Monte-Carlo (MC) programs such as PYTHIA, PHOJET and EPOS. Some parameters in such MC programs can be tuned to improve the agreement with the data. In this thesis a new method for tuning the MC programs, based on Genetic Algorithms and distributed analysis techniques, is presented. This method represents the first fully automated MC tuning technique based on true MC distributions. It ...
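The genetic-algorithm tuning loop described above can be sketched in miniature: candidate parameter sets are scored against reference data, the fittest survive, and children are produced by crossover plus Gaussian mutation. The quadratic "data" and all settings below are stand-ins for illustration, not the thesis' actual tune of PYTHIA/PHOJET/EPOS parameters:

```python
import random

random.seed(1)

# Reference points the "model" should reproduce (stand-in for MC distributions).
data = [(x, 2.0 * x + 1.0) for x in range(10)]

def fitness(params):
    a, b = params
    return -sum((a * x + b - y) ** 2 for x, y in data)   # higher is better

def evolve(pop_size=40, generations=60, sigma=0.3):
    pop = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 4]            # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            child = tuple((u + v) / 2 + random.gauss(0.0, sigma)
                          for u, v in zip(p1, p2))  # crossover + mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In a real tune the fitness would be a chi-square-like distance between generator output and measured distributions, and each evaluation would itself be a (distributed) MC production run.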
Hikosaka, Koki
2002-01-01
We discuss the status of supersymmetric grand unified theories [SUSY GUTs] with regard to the observation of proton decay. In this talk we focus on SUSY GUTs in 4 dimensions. We outline the major theoretical uncertainties present in the calculation of the proton lifetime and then present our best estimate of an absolute upper bound on the predicted proton lifetime. Towards the end, we consider some new results in higher-dimensional GUTs and the ramifications for proton decay.
Kangasmaa, Tuija; Kuikka, Jyrki; Sohlberg, Antti
2012-01-01
Simultaneous Tl-201/Tc-99m dual isotope myocardial perfusion SPECT is seriously hampered by down-scatter from Tc-99m into the Tl-201 energy window. This paper presents and optimises an ordered-subsets expectation-maximisation (OS-EM) based reconstruction algorithm, which corrects the down-scatter using an efficient Monte Carlo (MC) simulator. The algorithm starts by first reconstructing the Tc-99m image with attenuation, collimator response, and MC-based scatter correction. The reconstructed Tc-99m image is then used as an input for an efficient MC-based down-scatter simulation of Tc-99m photons into the Tl-201 window. This down-scatter estimate is finally used in the Tl-201 reconstruction to correct the crosstalk between the two isotopes. The mathematical 4D NCAT phantom and physical cardiac phantoms were used to optimise the number of OS-EM iterations at which the scatter estimate is updated and the number of MC simulated photons. The results showed that two scatter update iterations and 10^5 simulated photons are enough for the Tc-99m and Tl-201 reconstructions, whereas 10^6 simulated photons are needed to generate good quality down-scatter estimates. With these parameters, the entire Tl-201/Tc-99m dual isotope reconstruction can be accomplished in less than 3 minutes.
An OpenCL-based Monte Carlo dose calculation engine (oclMC) for coupled photon-electron transport
Tian, Zhen; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun
2015-01-01
The Monte Carlo (MC) method is recognized as the most accurate dose calculation method for radiotherapy. However, its extremely long computation time impedes clinical applications. Recently, many efforts have been made to realize fast MC dose calculation on GPUs. Nonetheless, most of the GPU-based MC dose engines were developed in the NVIDIA CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a fast cross-platform MC dose engine, oclMC, using the OpenCL environment for external beam photon and electron radiotherapy in the MeV energy range. Coupled photon-electron MC simulation was implemented with analogue simulation for photon transport and a Class II condensed history scheme for electron transport. To test the accuracy and efficiency of our dose engine oclMC, we compared dose calculation results of oclMC and gDPM, our previously developed GPU-based MC code, for a 15 MeV electron ...
Pan, J.; Durand, M. T.; Vanderjagt, B. J.
2015-12-01
The Markov Chain Monte Carlo (MCMC) method is a retrieval algorithm based on Bayes' rule: it starts from an initial state of snow/soil parameters and updates it to a series of new states by comparing the posterior probability of simulated snow microwave signals before and after each random-walk step. It approximates the probability of the snow/soil parameters conditioned on the measured microwave TB signals at different bands. Although this method can solve for all snow parameters, including depth, density, snow grain size, and temperature, at the same time, it still needs prior information on these parameters for the posterior probability calculation. How the priors influence the SWE retrieval is a major concern. Therefore, in this paper, a sensitivity test is first carried out to study how accurate the snow emission models and how explicit the snow priors need to be to keep the SWE error within a certain amount. Synthetic TB simulated from the measured snow properties plus a 2-K observation error is used for this purpose, aiming to provide guidance on MCMC application under different circumstances. The method is then applied to snowpits at different sites, including Sodankyla, Finland; Churchill, Canada; and Colorado, USA, using measured TB from ground-based radiometers at different bands. Based on the previous work, the error in these practical cases is studied, and the error sources are separated and quantified.
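The random-walk update can be made concrete with a one-parameter Metropolis sampler. A toy linear "emission model" plays the role of the forward simulation, and a Gaussian prior together with the 2-K observation error from the text defines the posterior; every other number here is invented for illustration:

```python
import math
import random

random.seed(7)

TB_MEAS, OBS_SIGMA = 250.0, 2.0     # K; 2-K observation error as in the text
PRIOR_MU, PRIOR_SIGMA = 1.0, 0.5    # hypothetical prior on a snow parameter

def forward(theta):
    return 240.0 + 8.0 * theta      # stand-in for a snow emission model

def log_post(theta):
    like = -0.5 * ((forward(theta) - TB_MEAS) / OBS_SIGMA) ** 2
    prior = -0.5 * ((theta - PRIOR_MU) / PRIOR_SIGMA) ** 2
    return like + prior             # log posterior up to a constant

samples, theta = [], PRIOR_MU
for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.3)          # random-walk proposal
    delta = log_post(prop) - log_post(theta)
    if random.random() < math.exp(min(0.0, delta)):
        theta = prop                               # accept the move
    samples.append(theta)

post_mean = sum(samples[5000:]) / 15000.0          # discard burn-in
```

For this linear-Gaussian toy case the posterior mean is available analytically (precision-weighted average of prior and likelihood), which makes the sampler easy to check.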
Lysak, Y. V.; Klimanov, V. A.; Narkevich, B. Ya
2017-01-01
One of the most difficult problems of modern radionuclide therapy (RNT) is control of the absorbed dose in the pathological volume. This research presents a new approach based on estimating the accumulated activity of the radiopharmaceutical (RP) in the tumor volume from planar scintigraphic images of the patient, with the radiation transport, including absorption and scattering in the biological tissues of the patient and in elements of the gamma camera itself, calculated by the Monte Carlo method. To obtain the data, we modeled scintigraphy of a vial containing the RP activity administered to the patient, with the vial placed at a certain distance from the collimator, and then performed a similar study in identical geometry with the same RP activity in the pathological target in the body of the patient. For correct calculation results, an adapted Fisher-Snyder human phantom was simulated in the MCNP program. Within this technique, calculations were performed for different sizes of pathological targets and various tumor depths inside the patient's body, using radiopharmaceuticals based on mixed β-γ-emitting (131I, 177Lu) and pure β-emitting (89Sr, 90Y) therapeutic radionuclides. The presented method can be implemented in clinical practice to estimate absorbed doses in regions of interest from planar scintigraphy of the patient with sufficient accuracy.
Jin, Shengye; Tamura, Masayuki
2013-10-01
The Monte Carlo Ray Tracing (MCRT) method is a versatile tool for simulating the radiative transfer regime of the Solar-Atmosphere-Landscape system. Moreover, it can be used to compute the radiation distribution over a complex landscape configuration, such as a forest area. Due to its robustness to the complexity of the 3-D scene, the MCRT method is also employed for simulating the canopy radiative transfer regime as a validation source for other radiative transfer models. In MCRT modeling of vegetation, one basic step is setting up the canopy scene. 3-D scanning can represent canopy structure as accurately as possible, but it is time consuming. A botanical growth function can model single-tree growth but cannot express the interaction among trees. The L-System is also a functionally controlled tree-growth simulation model, but it costs a large amount of computing memory and only models current tree patterns rather than tree growth while the radiative transfer regime is simulated. Therefore, it is much more practical to use regular solids such as ellipsoids, cones, and cylinders to represent single canopies. Considering the allelopathy phenomenon seen in some open-forest optical images, each tree repels other trees within its own 'domain.' Based on this assumption, a stochastic circle-packing algorithm is developed in this study to generate the 3-D canopy scene. The canopy coverage (%) and the tree count (N) of the 3-D scene are declared first, similar to a random open-forest image. Accordingly, each canopy radius (rc) is generated randomly. The circle centers are then set on the XY-plane while the circle-packing algorithm keeps the circles separate from each other. To model individual trees, Ishikawa's tree-growth regression model is employed to set tree parameters including DBH (dt) and tree height (H). However, the relationship between canopy height (Hc) and trunk height (Ht) is
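The stochastic circle-packing step can be sketched with simple rejection sampling: draw a radius and a centre, and keep the circle only if it clears every circle already placed, so each "tree" retains its own domain. Scene size, radii, and counts below are arbitrary placeholders, not the parameters used in the study:

```python
import math
import random

random.seed(3)

def pack_circles(n_trees, scene=100.0, r_min=2.0, r_max=5.0, max_tries=10000):
    """Stochastic circle packing: random canopy radii and centres,
    rejecting any circle that overlaps an already placed one."""
    circles, tries = [], 0
    while len(circles) < n_trees and tries < max_tries:
        tries += 1
        r = random.uniform(r_min, r_max)
        x = random.uniform(r, scene - r)        # keep circle inside the scene
        y = random.uniform(r, scene - r)
        if all(math.hypot(x - cx, y - cy) >= r + cr for cx, cy, cr in circles):
            circles.append((x, y, r))           # non-overlapping "domain"
    return circles

canopy = pack_circles(n_trees=30)
coverage = sum(math.pi * r * r for _, _, r in canopy) / 100.0 ** 2
```

In the full scene generator, each accepted circle would then be replaced by a regular solid (ellipsoid, cone, cylinder) whose dimensions come from the tree-growth regression model.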
Energy Technology Data Exchange (ETDEWEB)
Kieseler, Jan
2015-12-15
In this thesis, measurements of the production cross sections for top-quark pairs and the determination of the top-quark mass are presented. Dileptonic decays of top-quark pairs (t anti t) with two oppositely charged lepton (electron and muon) candidates in the final state are considered. The studied data samples were collected in proton-proton collisions at the CERN Large Hadron Collider with the CMS detector and correspond to integrated luminosities of 5.0 fb{sup -1} and 19.7 fb{sup -1} at center-of-mass energies of √(s) = 7 TeV and √(s) = 8 TeV, respectively. The cross sections, σ{sub t} {sub anti} {sub t}, are measured in the fiducial detector volume (visible phase space), defined by the kinematics of the top-quark decay products, and are extrapolated to the full phase space. The visible cross sections are extracted in a simultaneous binned-likelihood fit to multi-differential distributions of final-state observables, categorized according to the multiplicity of jets associated with b quarks (b jets) and other jets in each event. The fit is performed with emphasis on a consistent treatment of correlations between systematic uncertainties and takes into account features of the t anti t event topology. By comparison with predictions from the Standard Model at next-to-next-to-leading order (NNLO) accuracy, the top-quark pole mass, m{sub t}{sup pole}, is extracted from the measured cross sections for different state-of-the-art PDF sets. Furthermore, the top-quark mass parameter used in Monte-Carlo simulations, m{sub t}{sup MC}, is determined using the distribution of the invariant mass of a lepton candidate and the leading b jet in the event, m{sub lb}. Being defined by the kinematics of the top-quark decay, this observable is unaffected by the description of the top-quark production mechanism. Events are selected from the data collected at √(s) = 8 TeV that contain at least two jets and one b jet in addition to the lepton candidate pair. A novel technique is
España, Samuel; Paganetti, Harald
2010-12-21
The advantages of the finite range of proton beams can only be fully exploited in radiation therapy if the range can be predicted accurately in the patient anatomy; proton-induced PET imaging aims at ∼2 mm accuracy in range verification. The latter is done using Monte Carlo predicted PET images. Monte Carlo methods are based on CT images to describe the patient anatomy. The dose calculation algorithm and the CT resolution/artifacts might affect dose calculation accuracy. Additionally, when using Monte Carlo for PET range verification, the biological decay model and the cross sections for positron emitter production affect the predicted PET images. The goal of this work is to study the effect of uncertainties in the CT conversion on the proton beam range predicted by Monte Carlo dose calculations and on proton-induced PET signals. Conversion schemes that assign density and elemental composition based on a CT image of the patient define a unique Hounsfield unit (HU) to tissue parameters relationship. Uncertainties are introduced because there is no unique relationship between HU and tissue parameters. In this work, different conversion schemes based on a stoichiometric calibration method, as well as different numbers of tissue bins, were considered in three head and neck patients. For Monte Carlo dose calculation, the results show that proton dose distributions based on Monte Carlo calculation are only slightly affected by the uncertainty in density and elemental composition introduced by a unique assignment to each HU if a stoichiometric calibration is used. Calculated PET images used for range verification are more sensitive to conversion uncertainties, causing an intrinsic limitation due to CT conversion alone of at least 1 mm.
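A CT conversion scheme of the kind discussed reduces, in sketch form, to a piecewise-linear HU-to-density curve plus a binning of HU into tissue-composition classes. The calibration points and bin edges below are illustrative placeholders, not the stoichiometric calibration of this work:

```python
# Illustrative (HU, density in g/cm^3) calibration points, air through bone-like.
CALIB = [(-1000, 0.00121), (0, 1.000), (1000, 1.600), (3000, 2.800)]

def hu_to_density(hu):
    """Linear interpolation between calibration points; clamp at the ends."""
    if hu <= CALIB[0][0]:
        return CALIB[0][1]
    for (h0, d0), (h1, d1) in zip(CALIB, CALIB[1:]):
        if hu <= h1:
            return d0 + (d1 - d0) * (hu - h0) / (h1 - h0)
    return CALIB[-1][1]

def hu_to_bin(hu, edges=(-1000, -200, 150, 1000, 3000)):
    """Assign a tissue-composition bin (lung-like / soft-tissue-like / bone-like);
    the number and placement of bins is exactly the uncertainty studied above."""
    for i, edge in enumerate(edges[1:]):
        if hu <= edge:
            return i
    return len(edges) - 2
```

Varying the number of bins (and the calibration points) while recomputing dose and PET predictions is, in essence, the sensitivity study the abstract describes.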
Energy Technology Data Exchange (ETDEWEB)
Chung, Kwang Zoo; Han, Young Yih; Kim, Jin Sung [Dept. of Radiation Oncology, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of); and others
2015-12-15
The purpose of this report is to describe the proton therapy system at Samsung Medical Center (SMC-PTS), including the proton beam generator, irradiation system, patient positioning system, patient position verification system, respiratory gating system, and operating and safety control system, and to review the current status of the SMC-PTS. The SMC-PTS has a cyclotron (230 MeV) and two treatment rooms: one equipped with a multi-purpose nozzle and the other with a dedicated pencil beam scanning nozzle. The proton beam generator, including the cyclotron and the energy selection system, can lower the proton energy from the maximum 230 MeV down to 70 MeV. The multi-purpose nozzle can deliver both a wobbling proton beam and an active scanning proton beam, and a multi-leaf collimator has been installed downstream of the nozzle. The dedicated scanning nozzle can deliver an active scanning proton beam with a helium-gas-filled pipe minimizing unnecessary interactions with the air in the beam path. The equipment was provided by Sumitomo Heavy Industries Ltd., RayStation from RaySearch Laboratories AB is the selected treatment planning system, and data management will be handled by the MOSAIQ system from Elekta AB. The SMC-PTS, located in Seoul, Korea, is scheduled to begin treating cancer patients in 2015.
Intermediate temperature fuel cells based on proton conducting electrolytes. Final report
Energy Technology Data Exchange (ETDEWEB)
Duval, S.; Holtappels, P.
2006-03-15
Solid oxide proton conductors can offer a new intermediate temperature fuel cell technology combining the advantages of polymeric fuel cells and solid oxide fuel cells. Among potential proton conductor materials, Y-doped barium zirconate (BZY) was found to be a promising candidate. This material was synthesised and characterised at EMPA. The synthesis study shows the possibility of using up-scalable methods to produce BZY. It was demonstrated that BZY can take up protons and that the protons are the mobile charge carriers that dominate the conductivity. The conductivity of the grain interior (log {sigma} {approx} -3 S.cm{sup -1} at 300 {sup o}C) competes with the conductivity of the best proton conductors. A correlation between the bulk conductivity and the cubic lattice parameter was observed. It was found that controlling the lattice parameter during synthesis enables tuning of the conductivity. The total conductivity of the test material was found to be dominated by a large resistive grain boundary contribution. Neither a clear microstructure/conductivity relationship could be identified nor could a blocking secondary phase be found. Only an exceptional thermal treatment (annealing up to 2200 {sup o}C) showed an improvement of the grain boundary conductivity. A first interpretation presumes an electronic effect arising from the shearing of crystallographic planes that depresses either the proton concentration or the proton mobility in the vicinity of the grain boundaries (i.e., in the so-called 'space charge region'). Consequences for the further development of BZY for fuel cell application are discussed. (author)
Jana, Sankar; Dalapati, Sasanka; Guchhait, Nikhil
2012-11-15
Photochromic Schiff bases 5-diethylamino-2-[(4-diethylamino-benzylidene)-hydrazonomethyl]-phenol (DDBHP) and N,N'-bis(4-N,N-diethylaminosalisalidene) hydrazine (DEASH), bearing both proton and charge transfer moieties, have been synthesized, and their photophysical properties, such as excited-state intramolecular charge transfer (ICT) and proton transfer (ESIPT) processes, have been reported on the basis of steady-state and time-resolved spectral measurements in various solvents. The ground-state six-membered intramolecular hydrogen bonding network at the proton transfer site accelerates the ESIPT process for these compounds. Both compounds show large Stokes-shifted emission bands for the proton transfer and charge transfer processes. Hydrogen bonding solvents play a crucial role in these photophysical processes. The excited-state dipole moments of DDBHP and DEASH calculated by the solvatochromic method support the polar character of the charge transfer excited state. Introduction of -NEt(2) groups to the reported salicylaldehyde azine (SAA) Schiff base results in an increase in fluorescence lifetime from the femtosecond to the picosecond time scale for the proton transfer process.
Reply to "Comment on 'A study on tetrahedron-based inhomogeneous Monte-Carlo optical simulation'".
Shen, Haiou; Wang, Ge
2011-04-19
We compare the accuracy of TIM-OS and MMCM in response to the recent analysis made by Fang [Biomed. Opt. Express 2, 1258 (2011)]. Our results show that the tetrahedron-based energy deposition algorithm used in TIM-OS is more accurate than the node-based energy deposition algorithm used in MMCM.
Multi-period mean–variance portfolio optimization based on Monte-Carlo simulation
Cong, F.; Oosterlee, C.W.
2016-01-01
We propose a simulation-based approach for solving the constrained dynamic mean–variance portfolio management problem. For this dynamic optimization problem, we first consider a sub-optimal strategy, called the multi-stage strategy, which can be utilized in a forward fashion. Then, based on this fa
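A forward, simulation-based evaluation of one fixed multi-stage strategy can be sketched as follows; the rebalancing fraction, return parameters, and horizon are invented for illustration, and the paper's constrained optimization over strategies is not reproduced:

```python
import random

random.seed(5)

# Each period, a fraction W of wealth goes into a risky asset with i.i.d.
# Gaussian returns; the remainder earns the risk-free rate R_F.
R_F, MU, SIGMA, PERIODS, W = 0.01, 0.05, 0.2, 10, 0.4

def simulate_terminal_wealth(n_paths=20000):
    wealths = []
    for _ in range(n_paths):
        wealth = 1.0
        for _ in range(PERIODS):
            risky = random.gauss(MU, SIGMA)              # risky-asset return
            wealth *= 1.0 + W * risky + (1.0 - W) * R_F  # portfolio return
        wealths.append(wealth)
    return wealths

paths = simulate_terminal_wealth()
mean_w = sum(paths) / len(paths)
var_w = sum((x - mean_w) ** 2 for x in paths) / len(paths)
```

Monte-Carlo estimates of the terminal mean and variance like these are the building blocks a mean–variance objective would then trade off against each other.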
Demidov, A.; Eschlböck-Fuchs, S.; Kazakov, A. Ya.; Gornushkin, I. B.; Kolmhofer, P. J.; Pedarnig, J. D.; Huber, N.; Heitz, J.; Schmid, T.; Rössler, R.; Panne, U.
2016-11-01
The improved Monte-Carlo (MC) method for standard-less analysis in laser induced breakdown spectroscopy (LIBS) is presented. Concentrations in MC LIBS are found by fitting model-generated synthetic spectra to experimental spectra. The current version of MC LIBS is based on graphics processing unit (GPU) computation and reduces the analysis time down to several seconds per spectrum/sample. The previous version of MC LIBS, which was based on central processing unit (CPU) computation, required unacceptably long analysis times of tens of minutes per spectrum/sample. The reduction of the computational time is achieved through massively parallel computing on the GPU, which embeds thousands of co-processors. It is shown that the number of iterations on the GPU exceeds that on the CPU by a factor > 1000 for the 5-dimensional parameter space and yet requires > 10-fold shorter computational time. The improved GPU-MC LIBS outperforms the CPU-MC LIBS in terms of accuracy, precision, and analysis time. The performance is tested on LIBS spectra obtained from pelletized powders of metal oxides consisting of CaO, Fe2O3, MgO, and TiO2 that simulate by-products of the steel industry, steel slags. It is demonstrated that GPU-based MC LIBS is capable of rapid multi-element analysis with relative errors between 1 and a few tens of percent, which is sufficient for industrial applications (e.g. steel slag analysis). The results of the improved GPU-based MC LIBS compare favorably to those of the CPU-based MC LIBS as well as to the results of standard calibration-free (CF) LIBS based on the Boltzmann plot method.
Free standing diamond-like carbon thin films by PLD for laser based electrons/protons acceleration
Energy Technology Data Exchange (ETDEWEB)
Thema, F.T.; Beukes, P.; Ngom, B.D. [UNESCO Africa Chair in Nanosciences-Nanotechnology, College of Graduate Studies, University of South Africa, Muckleneuk Ridge, PO Box 392, Pretoria (South Africa); Nanosciences African Network (NANOAFNET), iThemba LABS-National Research Foundation, 1 Old Faure Road, Somerset West, 7129, PO Box722, Western Cape Province (South Africa); Manikandan, E., E-mail: mani@tlabs.ac.za [UNESCO Africa Chair in Nanosciences-Nanotechnology, College of Graduate Studies, University of South Africa, Muckleneuk Ridge, PO Box 392, Pretoria (South Africa); Nanosciences African Network (NANOAFNET), iThemba LABS-National Research Foundation, 1 Old Faure Road, Somerset West, 7129, PO Box722, Western Cape Province (South Africa); Central Research Laboratory, Sree Balaji Medical College & Hospital (SBMCH), Chrompet, Bharath University, Chennai, 600044 (India); Maaza, M., E-mail: maaza@tlabs.ac.za [UNESCO Africa Chair in Nanosciences-Nanotechnology, College of Graduate Studies, University of South Africa, Muckleneuk Ridge, PO Box 392, Pretoria (South Africa); Nanosciences African Network (NANOAFNET), iThemba LABS-National Research Foundation, 1 Old Faure Road, Somerset West, 7129, PO Box722, Western Cape Province (South Africa)
2015-11-05
This study reports for the first time on the synthesis and optical characteristics of free-standing diamond-like carbon (DLC) deposited by pulsed laser deposition (PLD) onto graphene buffer layers for ultrahigh-intensity laser based electron/proton acceleration applications. The fingerprint techniques of micro-Raman, UV–VIS–NIR and IR spectroscopic investigations indicate the suitability of such free-standing DLC thin films within the laser window and long-wave infrared (LWIR) spectral range and hence their appropriateness for the targeted applications. - Highlights: • We report for the first time the synthesis of free-standing diamond-like carbon. • Pulsed laser deposition onto graphene buffer layers. • Fingerprint techniques of micro-Raman, UV–VIS–NIR and IR spectroscopic investigations. • Ultrahigh-intensity laser based electron/proton acceleration applications. • This material is suitable for the laser window and long-wave infrared (LWIR) spectral range.
Shypailo, R. J.; Ellis, K. J.
2011-05-01
During construction of the whole body counter (WBC) at the Children's Nutrition Research Center (CNRC), efficiency calibration was needed to translate acquired counts of 40K to actual grams of potassium for measurement of total body potassium (TBK) in a diverse subject population. The MCNP Monte Carlo n-particle simulation program was used to describe the WBC (54 detectors plus shielding), test individual detector counting response, and create a series of virtual anthropomorphic phantoms based on national reference anthropometric data. Each phantom included an outer layer of adipose tissue and an inner core of lean tissue. Phantoms were designed for both genders representing ages 3.5 to 18.5 years with body sizes from the 5th to the 95th percentile based on body weight. In addition, a spherical surface source surrounding the WBC was modeled in order to measure the effects of subject mass on room background interference. Individual detector measurements showed good agreement with the MCNP model. The background source model came close to agreement with empirical measurements, but showed a trend deviating from unity with increasing subject size. Results from the MCNP simulation of the CNRC WBC agreed well with empirical measurements using BOMAB phantoms. Individual detector efficiency corrections were used to improve the accuracy of the model. Nonlinear multiple regression efficiency calibration equations were derived for each gender. Room background correction is critical in improving the accuracy of the WBC calibration.
Simakov, S P; Moellendorff, U V; Schmuck, I; Konobeev, A Y; Korovin, Y A; Pereslavtsev, P
2002-01-01
A newly developed computational procedure is presented for the generation of d-Li source neutrons in Monte Carlo transport calculations based on the use of evaluated double-differential d+{sup 6,7}Li cross section data. A new code, McDeLicious, was developed as an extension to MCNP4C to enable neutronics design calculations for the d-Li based IFMIF neutron source making use of the evaluated deuteron data files. The McDeLicious code was checked against available experimental data and calculation results of McDeLi and MCNPX, both of which use built-in analytical models for the Li(d, xn) reaction. It is shown that McDeLicious along with the newly evaluated d+{sup 6,7}Li data is superior in predicting the characteristics of the d-Li neutron source. As this approach makes use of tabulated Li(d, xn) cross sections, the accuracy of the IFMIF d-Li neutron source term can be steadily improved with more advanced and validated data.
Stamenkovic, Dragan D.; Popovic, Vladimir M.
2015-02-01
Warranty is a powerful marketing tool, but it always involves additional costs to the manufacturer. In order to reduce these costs and make use of warranty's marketing potential, the manufacturer needs to master techniques for warranty cost prediction based on the reliability characteristics of the product. In this paper, a combined free-replacement and pro-rata warranty policy is analysed as a warranty model for one type of light bulb. Since operating conditions have a great impact on product reliability, they need to be considered in such an analysis. A neural network model is used to predict light bulb reliability characteristics from data obtained by testing light bulbs under various operating conditions. Compared with the linear regression model used in the literature for similar tasks, the neural network model proved to be the more accurate method for such prediction. The reliability parameters obtained in this way are then used in a Monte Carlo simulation to predict the times to failure needed for warranty cost calculation. The results of the analysis make it possible for the manufacturer to choose the optimal warranty policy based on expected product operating conditions. In this way, the manufacturer can lower costs and increase profit.
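The Monte Carlo step described above, sampling times to failure and applying the warranty terms, can be sketched as follows. Weibull lifetimes and a single failure per unit are simplifying assumptions for illustration; the paper's reliability parameters come from the neural network model instead.

```python
import math
import random

def expected_warranty_cost(shape, scale, w_free, w_prorata, unit_cost,
                           n_sim=100_000, seed=1):
    """Monte Carlo estimate of the expected per-unit warranty cost under a
    combined free-replacement / pro-rata policy. Time to failure is sampled
    from a Weibull(shape, scale) distribution via inverse-transform sampling."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sim):
        # Weibull sample: t = scale * (-ln U)^(1/shape)
        t = scale * (-math.log(rng.random())) ** (1.0 / shape)
        if t < w_free:
            total += unit_cost                       # free replacement period
        elif t < w_prorata:
            frac = (w_prorata - t) / (w_prorata - w_free)
            total += unit_cost * frac                # pro-rata rebate
        # failures after w_prorata cost the manufacturer nothing
    return total / n_sim
```

Comparing this expectation across candidate (w_free, w_prorata) pairs is how the optimal policy in the abstract would be selected.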
Bruser, Christoph; Strutz, Nils; Winter, Stefan; Leonhardt, Steffen; Walter, Marian
2015-06-01
Unobtrusive, long-term monitoring of cardiac (and respiratory) rhythms using only non-invasive vibration sensors mounted in beds promises to unlock new applications in home and low-acuity monitoring. This paper presents a novel concept for such a system based on an array of near-infrared (NIR) sensors placed underneath a regular bed mattress. We focus on modeling and analyzing the underlying technical measurement principle with the help of a 2D model of a polyurethane foam mattress and Monte Carlo simulations of the opto-mechanical interaction responsible for signal genesis. Furthermore, a test rig to automatically and repeatably impress mechanical vibrations onto a mattress is introduced and used to characterize a prototype implementation of the proposed measurement principle. Results show that NIR-based sensing is capable of registering minuscule deformations of the mattress with high spatial specificity. As a final outlook, proof-of-concept measurements with the sensor array are presented which demonstrate that cardiorespiratory movements of the body can be detected and that automatic heart rate estimation at competitive error levels is feasible with the proposed approach.
Quan, Guotao; Gong, Hui; Deng, Yong; Fu, Jianwei; Luo, Qingming
2011-02-01
High-speed fluorescence molecular tomography (FMT) reconstruction for 3-D heterogeneous media is still one of the most challenging problems in diffusive optical fluorescence imaging. In this paper, we propose a fast FMT reconstruction method that is based on Monte Carlo (MC) simulation and accelerated by a cluster of graphics processing units (GPUs). Based on the Message Passing Interface standard, we modified the MC code for fast FMT reconstruction, and the different Green's functions representing the flux distribution in the media are calculated simultaneously by different GPUs in the cluster. A load-balancing method was also developed to increase the computational efficiency. By applying the Fréchet derivative, a Jacobian matrix is formed to reconstruct the distribution of the fluorochromes from the calculated Green's functions. Phantom experiments have shown that only 10 min are required to obtain reconstruction results with a cluster of 6 GPUs, rather than 6 h with a cluster of multiple dual-Opteron CPU nodes. Because of the MC simulation's advantages of high accuracy and suitability for 3-D heterogeneous media with refractive-index-mismatched boundaries, the GPU-cluster-accelerated method provides a reliable approach to high-speed reconstruction for FMT imaging.
Institute of Scientific and Technical Information of China (English)
张利; 周志革; 黄文振
2001-01-01
In actual production, parts cannot be located accurately because of worn, loose, bent, or broken locating elements in fixtures, and this is a main cause of assembly tolerance problems in body-in-white assembly. Existing fault diagnosis methods follow only the 3-2-1 fixturing principle, and under redundant (over-constrained) locating there is no clear geometric relationship between locating-element displacement and measurement-point variation. In this paper, a statistical method for fixture fault diagnosis is proposed: the Monte Carlo method is used to simulate the positions of a part, and a model, requiring only the geometric information of the workpiece, is presented to diagnose faults from the statistical data. The model also gives the probability of detecting faults and the probability of false alarms. The results of the Monte Carlo simulation show that the method is sound.
Energy Technology Data Exchange (ETDEWEB)
Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, 31062 Toulouse (France); McKay, Erin [St George Hospital, Gray Street, Kogarah, New South Wales 2217 (Australia); Ferrer, Ludovic [ICO René Gauducheau, Boulevard Jacques Monod, St Herblain 44805 (France); Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila [European Institute of Oncology, Via Ripamonti 435, Milano 20141 (Italy); Bardiès, Manuel [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, Toulouse 31062 (France)
2015-12-15
Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software handles the whole pipeline from virtual patient generation to the resulting planar and SPECT images and dosimetry calculations. The originality of the approach lies in the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time points. The Monte Carlo simulation toolkit GATE offers the possibility of accurately simulating scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce any imaging protocol precisely and to provide reference dosimetry. For image generation, TestDose stores the user's imaging requirements and automatically generates the command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. The resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and the relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two sample software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for Octreoscan™ therapeutic treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times post-injection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry
Otto, H; Marti, T; Holz, M; Mogi, T; Lindau, M; Khorana, H G; Heyn, M P
1989-01-01
Above pH 8 the decay of the photocycle intermediate M of bacteriorhodopsin splits into two components: the usual millisecond pH-independent component and an additional slower component with a rate constant proportional to the molar concentration of H+, [H+]. In parallel, the charge translocation signal associated with the reprotonation of the Schiff base develops a similar slow component. These observations are explained by a two-step reprotonation mechanism. An internal donor first reprotonates the Schiff base in the decay of M to N and is then reprotonated from the cytoplasm in the N→O transition. The decay rate of N is proportional to [H+]. By postulating a back reaction from N to M, the M decay splits into two components, with the slower one having the same pH dependence as the decay of N. Photocycle, photovoltage, and pH-indicator experiments with mutants in which aspartic acid-96 is replaced by asparagine or alanine, which we call D96N and D96A, suggest that Asp-96 is the internal proton donor involved in the re-uptake pathway. In both mutants the stoichiometry of proton pumping is the same as in wild type. However, the M decay is monophasic, with the logarithm of the decay time, log(τ), linearly dependent on pH, suggesting that the internal donor is absent and that the Schiff base is directly reprotonated from the cytoplasm. Like H+, azide increases the M decay rate in D96N. The rate constant is proportional to the azide concentration and can become more than 100 times greater than in wild type. Thus, azide functions as a mobile proton donor directly reprotonating the Schiff base in a bimolecular reaction. Both the proton and azide effects, which are absent in wild type, indicate that the internal donor is removed and that the reprotonation pathway in these mutants is different from wild type. PMID:2556706
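The two-step scheme M ⇌ N → O, with the N → O reprotonation rate proportional to [H+], can be illustrated numerically; the rate constants below are illustrative round numbers, not the fitted values from the paper.

```python
def m_decay(h_conc, k_mn=1000.0, k_nm=500.0, k_no=1e9, t_end=0.05, dt=1e-6):
    """Euler integration of M <-> N -> O. The back reaction N -> M couples
    the M decay to the slow, [H+]-proportional N -> O step, producing the
    biphasic M decay described in the abstract. Returns M remaining at t_end."""
    m, n_pop = 1.0, 0.0
    t = 0.0
    while t < t_end:
        dm = (-k_mn * m + k_nm * n_pop) * dt
        dn = (k_mn * m - k_nm * n_pop - k_no * h_conc * n_pop) * dt
        m += dm
        n_pop += dn
        t += dt
    return m
```

With these numbers the M ⇌ N equilibration is fast, after which the pool drains through N at a rate proportional to [H+]: at pH 7 the pool is nearly gone after 50 ms, while at pH 9 most of it remains, reproducing the pH-dependent slow component.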
The underlying event in proton-proton collisions
Energy Technology Data Exchange (ETDEWEB)
Bechtel, F.
2009-05-15
In this thesis, studies of the underlying event in proton-proton collisions at a center-of-mass energy of √s = 10 TeV are presented. A crucial ingredient of underlying-event models is multiple parton-parton scattering in single proton-proton collisions. The feasibility of measuring the underlying event was investigated with the Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC) using charged particles and charged-particle jets. Systematic uncertainties of the underlying event measurement due to detector misalignment and imperfect track reconstruction are found to be negligible once ∫L dt = 1 pb⁻¹ of data are available. Different model predictions are compared with each other using fully simulated Monte Carlo samples. It is found that distinct models differ strongly enough to tell them apart with early data. (orig.)
MC-Net: a method for the construction of phylogenetic networks based on the Monte-Carlo method
Directory of Open Access Journals (Sweden)
Eslahchi Changiz
2010-08-01
Abstract. Background: A phylogenetic network is a generalization of phylogenetic trees that allows the representation of conflicting signals or alternative evolutionary histories in a single diagram. There are several methods for constructing these networks; in practice, the distance-based methods are faster than the alternatives. Neighbor-Net (N-Net) is a distance-based method: it produces a circular ordering from a distance matrix and then constructs a collection of weighted splits from that ordering, from which the SplitsTree program builds a phylogenetic network. In general, finding an optimal circular ordering is an NP-hard problem, and N-Net is a heuristic for it based on the neighbor-joining algorithm. Results: In this paper, we present a heuristic algorithm, called MC-Net, that searches for an optimal circular ordering using the Monte Carlo method. To show that MC-Net performs better than N-Net, we apply both algorithms to different data sets, draw the phylogenetic networks corresponding to their outputs using SplitsTree, and compare the results. Conclusions: We find that the circular ordering produced by MC-Net is closer to optimal than that of N-Net, and that the corresponding networks built by SplitsTree are simpler.
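The core idea, searching for a low-cost circular ordering by Monte Carlo moves, can be sketched as a toy simulated-annealing loop; the neighbour-distance objective and move set below are illustrative stand-ins for the authors' actual MC-Net scoring.

```python
import math
import random

def mc_circular_ordering(dist, n_steps=20000, t0=1.0, cooling=0.9995, seed=7):
    """Monte Carlo (simulated-annealing) search for a circular ordering of
    taxa minimizing the total distance between neighbours on the circle.
    `dist` is a symmetric n x n matrix; returns (best ordering, its cost)."""
    rng = random.Random(seed)
    n = len(dist)

    def cost(o):
        return sum(dist[o[i]][o[(i + 1) % n]] for i in range(n))

    order = list(range(n))
    c = cost(order)
    best_order, best_c = order[:], c
    t = t0
    for _ in range(n_steps):
        i, j = rng.sample(range(n), 2)            # propose: swap two taxa
        order[i], order[j] = order[j], order[i]
        c_new = cost(order)
        # Metropolis acceptance: always accept improvements, sometimes worse moves
        if c_new <= c or rng.random() < math.exp((c - c_new) / t):
            c = c_new
            if c < best_c:
                best_order, best_c = order[:], c
        else:
            order[i], order[j] = order[j], order[i]   # revert the swap
        t *= cooling
    return best_order, best_c
```

The slowly decreasing temperature lets the search escape local minima early on and settle into a near-optimal ordering as t shrinks.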
Swart, Marcel; Bickelhaupt, F Matthias
2006-03-01
We have carried out an extensive exploration of the gas-phase basicity of archetypal anionic bases across the periodic system using the generalized gradient approximation of density functional theory (DFT) at BP86/QZ4P//BP86/TZ2P. First, we validate DFT as a reliable tool for computing proton affinities and related thermochemical quantities: BP86/QZ4P//BP86/TZ2P is shown to yield a mean absolute deviation of 1.6 kcal/mol for the proton affinity at 0 K with respect to high-level ab initio benchmark data. The main purpose of this work is to provide the proton affinities (and corresponding entropies) at 298 K of the anionic conjugate bases of all main-group-element hydrides of groups 14-17 and periods 2-6. We have also studied the effect of stepwise methylation of the protophilic center of the second- and third-period bases.
Beam Dynamics Based Design of Solenoid Channel for TAC Proton Linac
Kisoglu, H F
2014-01-01
Today a linear particle accelerator (linac), in which electric and magnetic fields play a vital role, is the driver of popular energy-generation systems such as the Accelerator Driven System (ADS). A multipurpose proton linac with an energy of ~2 GeV, intended primarily for ADS applications, is planned within the Turkish Accelerator Center (TAC) project, a collaboration of more than 10 Turkish universities. A Low Energy Beam Transport (LEBT) channel with two solenoids is a subcomponent of this linac: it transports the proton beam extracted from an ion source and matches it to the Radio Frequency Quadrupole (RFQ), an important part of the linac. The LEBT channel would consist of two focusing solenoids and diagnostic elements such as a Faraday cup, BC transformers, etc. This paper presents a beam-dynamics design and optimization study of the LEBT channel for the TAC proton linac, carried out with the beam dynamics simulation code PATH MANAGER, and compares the simulation results with theoretical expectations.
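A quick sanity check on such a channel is the thin-lens focal length of a solenoid for a low-energy proton beam. The 50 keV injection energy, 0.3 T field, and 20 cm effective length below are illustrative assumptions, not parameters from the paper.

```python
import math

Q = 1.602e-19        # proton charge, C
M = 1.673e-27        # proton mass, kg

def solenoid_focal_length(energy_ev, b_field_t, length_m):
    """Thin-lens focal length of a solenoid for a non-relativistic proton
    beam: 1/f = (qB / 2p)^2 * L_eff."""
    p = math.sqrt(2.0 * M * energy_ev * Q)        # momentum, kg*m/s
    k = (Q * b_field_t / (2.0 * p)) ** 2          # focusing strength, 1/m^2
    return 1.0 / (k * length_m)

f = solenoid_focal_length(50e3, 0.3, 0.2)         # assumed LEBT-like values
```

A focal length of a few tens of centimetres, as this gives, is the right scale for a metre-class two-solenoid LEBT, which is why solenoids (rather than quadrupoles) are the natural choice at this energy.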
Institute of Scientific and Technical Information of China (English)
张佳琦; 靳家玉; 张隽佶; 邹雷
2012-01-01
A simple method for the synthesis of new bithienylethenes bearing a functional group on the cyclopentene moiety is developed. Three new photochromic compounds (4a, 4b, 4c) have been successfully synthesized by this method, and they exhibit good photochromic properties under alternating irradiation with ultraviolet and visible light. Furthermore, the fluorescence of compound 4a, which bears a quinoline unit on the cyclopentene, can be modulated via optical and proton dual inputs. Upon excitation at 320 nm, 4a emits strong fluorescence at 404 nm. After irradiation with 254 nm light, the emission intensity is reduced due to fluorescence resonance energy transfer (FRET) from the quinoline unit to the bithienylethene unit. Moreover, on addition of H+, the fluorescence is quenched completely due to protonation of the quinoline. Both the FRET and the protonation process are reversible, which indicates a potential application in molecular switches and logic gates.
Zhao, Shu-Na; Song, Xue-Zhi; Zhu, Min; Meng, Xing; Wu, Lan-Lan; Song, Shu-Yan; Wang, Cheng; Zhang, Hong-Jie
2015-01-21
Three new coordination polymers (CPs)/metal-organic frameworks (MOFs) with different structures have been synthesized using 4,8-disulfonyl-2,6-naphthalenedicarboxylic acid (H4L) and the metal ions Cu(2+), Ca(2+) and Cd(2+). The Cu compound features a one-dimensional chain structure that further extends into a 2D layer network through H-bond interactions. The Ca and Cd compounds form 3D frameworks with (4,4)-connected PtS-type and (3,6)-connected bct-type topologies, respectively. All of these CPs/MOFs exhibit proton conduction, the Cu compound especially so, with a proton conductivity of 3.46 × 10(-3) S cm(-1) at 368 K and 95% relative humidity (RH). Additionally, the activation energy (Ea) has been investigated to better understand the proton-conduction mechanism.
Yuan, Jiankui; Zheng, Yiran; Wessels, Barry; Lo, Simon S; Ellis, Rodney; Machtay, Mitchell; Yao, Min
2016-12-01
A virtual source model for Monte Carlo simulations of helical TomoTherapy has been developed previously by the authors. The purpose of this work is to perform experiments in an anthropomorphic (RANDO) phantom with the same order of complexity as clinical treatments, to validate the virtual source model for use in quality-assurance secondary checks of TomoTherapy patient planning doses. Helical TomoTherapy involves a complex delivery pattern with irregular beam apertures and couch movement during irradiation. Monte Carlo simulation, as the most accurate dose algorithm, is desirable in radiation dosimetry. Current Monte Carlo simulations for helical TomoTherapy adopt the full Monte Carlo model, which includes detailed modeling of each machine component, and thus large phase space files are required at different scoring planes. As an alternative, we previously developed a virtual source model that does not require these large phase space files for patient dose calculation. In this work, we apply the simulation system to recompute patient doses, which were generated by the treatment planning system, in an anthropomorphic phantom to mimic real patient treatments. We performed thermoluminescent dosimeter (TLD) point dose and film measurements for comparison with the Monte Carlo results. The TLD measurements show that the relative difference between Monte Carlo and the treatment planning system is within 3%, with the largest difference less than 5%, for both test plans. The film measurements demonstrated 85.7% and 98.4% passing rates using the 3 mm/3% acceptance criterion for the head-and-neck and lung cases, respectively. An over 95% passing rate is achieved if a 4 mm/4% criterion is applied. For the dose-volume histograms, very good agreement is obtained between the Monte Carlo and treatment planning system methods for both cases. The experimental results demonstrate that the virtual source model Monte Carlo system can be a viable option for the
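The film comparison above relies on gamma analysis. A minimal 1-D sketch of a global 3%/3 mm gamma passing-rate computation follows; clinical implementations work on 2-D or 3-D dose grids with interpolation, so this is a simplification for illustration.

```python
def gamma_pass_rate(dose_eval, dose_ref, spacing_mm, dd=0.03, dta_mm=3.0,
                    threshold=0.10):
    """1-D global gamma analysis. For each reference point above the low-dose
    threshold, find the minimum gamma over all evaluated points, combining
    dose difference (fraction of max reference dose) and distance-to-agreement.
    A point passes if its minimum gamma is <= 1."""
    d_max = max(dose_ref)
    passed = evaluated = 0
    for i, dr in enumerate(dose_ref):
        if dr < threshold * d_max:
            continue                                  # skip low-dose region
        evaluated += 1
        g_min = float("inf")
        for j, de in enumerate(dose_eval):
            dist = abs(i - j) * spacing_mm
            ddiff = de - dr
            g = ((dist / dta_mm) ** 2 + (ddiff / (dd * d_max)) ** 2) ** 0.5
            g_min = min(g_min, g)
        if g_min <= 1.0:
            passed += 1
    return passed / evaluated
```

An identical evaluated profile passes everywhere (rate 1.0), while a profile with a uniform 20% deficit fails at the high-dose points, mirroring how the 3%/3% and 4%/4% criteria give different passing rates in the abstract.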
Nuclear interaction cross sections for proton radiotherapy
Chadwick, M B; Arendse, G J; Cowley, A A; Richter, W A; Lawrie, J J; Newman, R T; Pilcher, J V; Smit, F D; Steyn, G F; Koen, J W; Stander, J A
1999-01-01
Model calculations of proton-induced nuclear reaction cross sections are described for biologically-important targets. Measurements made at the National Accelerator Centre are presented for double-differential proton, deuteron, triton, helium-3 and alpha particle spectra, for 150 and 200 MeV protons incident on C, N, and O. These data are needed for Monte Carlo simulations of radiation transport and absorbed dose in proton therapy. Data relevant to the use of positron emission tomography to locate the Bragg peak are also described.
Energy Technology Data Exchange (ETDEWEB)
Visvikis, D. [INSERM U650, LaTIM, University Hospital Medical School, F-29609 Brest (France)]. E-mail: Visvikis.Dimitris@univ-brest.fr; Lefevre, T. [INSERM U650, LaTIM, University Hospital Medical School, F-29609 Brest (France); Lamare, F. [INSERM U650, LaTIM, University Hospital Medical School, F-29609 Brest (France); Kontaxakis, G. [ETSI Telecomunicacion Universidad Politecnica de Madrid, Ciudad Universitaria, s/n 28040, Madrid (Spain); Santos, A. [ETSI Telecomunicacion Universidad Politecnica de Madrid, Ciudad Universitaria, s/n 28040, Madrid (Spain); Darambara, D. [Department of Physics, School of Engineering and Physical Sciences, University of Surrey, Guildford (United Kingdom)
2006-12-20
The majority of present positron emission tomography (PET) animal systems are based on the coupling of high-density scintillators and light detectors. A disadvantage of these detector configurations is the compromise between image resolution, sensitivity and energy resolution. In addition, current combined imaging devices are based on simply placing different apparatus back-to-back in axial alignment, without any significant level of software or hardware integration. The use of semiconductor CdZnTe (CZT) detectors is a promising alternative to scintillators for gamma-ray imaging systems. At the same time, CZT detectors have the properties necessary for the construction of a truly integrated imaging device (PET/SPECT/CT). The aim of this study was to assess the performance of different small-animal PET scanner architectures based on pixellated CZT detectors and to compare their performance with that of state-of-the-art PET animal scanners. Different scanner architectures were modelled using GATE (Geant4 Application for Tomographic Emission). Particular scanner design characteristics included an overall cylindrical scanner format of 8 and 24 cm in axial and transaxial field of view, respectively, and a temporal coincidence window of 8 ns. Different individual detector modules were investigated, considering pixel pitches down to 0.625 mm and detector thicknesses from 1 to 5 mm. Modified NEMA NU2-2001 protocols were used in order to simulate performance under mouse, rat and monkey imaging conditions. These protocols allowed us to directly compare the performance of the proposed geometries with the latest generation of current small-animal systems. The results demonstrate the potential for a higher NECR with CZT-based scanners in comparison to scintillator-based animal systems.
Detection of Explosives by Using a Neutron Source Based on a Proton Linac
Dolya, S N
2016-01-01
The paper considers the possibility of detecting explosives using radiative capture of neutrons by nitrogen nuclei. A proton linac is proposed as the neutron source, with the following parameters: proton energy 5 MeV, beam pulse current 1.7 mA, current pulse duration 200 μs, repetition rate 50 Hz. Neutrons are produced in the Li(p,n)Be reaction. It is shown that this neutron source will have an intensity of 10^12 neutrons per second, which will allow one to detect explosives of the size of a tennis ball.
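As a sanity check, the quoted intensity can be reproduced from the beam parameters. The per-proton neutron yield below is an assumed order-of-magnitude figure for a thick lithium target at this energy, not a value from the paper.

```python
# Back-of-the-envelope check of the quoted 10^12 n/s source intensity.
E_CHARGE = 1.602e-19          # C per proton
pulse_current = 1.7e-3        # A, during the pulse
pulse_length = 200e-6         # s
rep_rate = 50.0               # Hz

# Duty-cycled (time-averaged) beam current and proton rate
avg_current = pulse_current * pulse_length * rep_rate
protons_per_s = avg_current / E_CHARGE

assumed_yield = 1.0e-2        # neutrons per proton (assumption, order of magnitude)
neutrons_per_s = protons_per_s * assumed_yield
print(f"{neutrons_per_s:.1e} n/s")
```

The duty cycle here is 1% (200 μs at 50 Hz), giving an average current of 17 μA, i.e. about 10^14 protons per second, so a per-proton yield of order 10^-2 is what the quoted 10^12 n/s implies.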
Shafiei, M.; Gharari, S.; Pande, S.; Bhulai, S.
2014-01-01
Posterior sampling methods are increasingly being used to describe parameter and model predictive uncertainty in hydrologic modelling. This paper proposes an alternative to random walk chains (such as DREAM-zs). We propose a sampler based on independence chains with an embedded feature of standardiz
Cully, William P.L.; Cotton, Simon L.; Scanlon, William G.
2012-01-01
The range of potential applications for indoor and campus based personnel localisation has led researchers to create a wide spectrum of different algorithmic approaches and systems. However, the majority of the proposed systems overlook the unique radio environment presented by the human body leadin
Energy Technology Data Exchange (ETDEWEB)
Chi, Y; Li, Y; Tian, Z; Gu, X; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States)
2015-06-15
Purpose: Pencil-beam or superposition-convolution type dose calculation algorithms are routinely used in inverse plan optimization for intensity modulated radiation therapy (IMRT). However, due to their limited accuracy in some challenging cases, e.g. lung, the resulting dose may lose its optimality after being recomputed using an accurate algorithm, e.g. Monte Carlo (MC). The objective of this study is to evaluate the feasibility and advantages of a new method to include MC in the treatment planning process. Methods: We developed a scheme to iteratively perform MC-based beamlet dose calculations and plan optimization. In the MC stage, a GPU-based dose engine was used, and the number of particles sampled from a beamlet was proportional to its optimized fluence from the previous step. We tested this scheme in four lung cancer IMRT cases. For each case, the original plan dose, the plan dose re-computed by MC, and the dose optimized by our scheme were obtained. Clinically relevant dosimetric quantities in these three plans were compared. Results: Although the original plan achieved satisfactory PTV dose coverage, after re-computing doses using the MC method it was found that the PTV D95% was reduced by 4.60%-6.67%. After re-optimizing these cases with our scheme, the PTV coverage was improved to the same level as in the original plan, while the critical OAR coverages were maintained at clinically acceptable levels. Regarding the computation time, it took on average 144 s per case using only one GPU card, including both the MC-based beamlet dose calculation and the treatment plan optimization. Conclusion: The achieved dosimetric gains and high computational efficiency indicate the feasibility and advantages of the proposed MC-based IMRT optimization method. Comprehensive validations in more patient cases are in progress.
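The inner optimization step of such a scheme can be sketched as a projected-gradient least-squares fit of beamlet fluences to a prescription. The voxel-by-beamlet dose matrix stands in for the MC-computed beamlet doses; in the abstract's scheme this matrix would be recomputed by the GPU MC engine between optimization rounds, with particle counts proportional to the current fluence. All names and values here are illustrative.

```python
def optimize_fluence(dose_matrix, prescription, n_iter=500, lr=0.1):
    """Projected-gradient fluence optimization: minimize ||D f - p||^2 over
    non-negative beamlet fluences f, where D[v][b] is the dose to voxel v
    per unit fluence of beamlet b and p is the prescription per voxel."""
    n_vox, n_beam = len(dose_matrix), len(dose_matrix[0])
    f = [1.0] * n_beam
    for _ in range(n_iter):
        # residual r = D f - p
        r = [sum(dose_matrix[v][b] * f[b] for b in range(n_beam)) - prescription[v]
             for v in range(n_vox)]
        for b in range(n_beam):
            grad = sum(dose_matrix[v][b] * r[v] for v in range(n_vox))
            f[b] = max(0.0, f[b] - lr * grad)     # gradient step + projection
    return f
```

Because statistical noise in the MC beamlet doses scales with the inverse square root of the particle count, weighting particle counts by the current fluence (as the abstract describes) spends simulation effort where it most affects the optimized plan.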
Scalability tests of R-GMA based Grid job monitoring system for CMS Monte Carlo data production
Bonacorsi, D; Field, L; Fisher, S; Grandi, C; Hobson, P R; Kyberd, P; MacEvoy, B; Nebrensky, J J; Tallini, H; Traylen, S
2004-01-01
High Energy Physics experiments such as CMS (Compact Muon Solenoid) at the Large Hadron Collider have unprecedented, large-scale data processing computing requirements, with data accumulating at around 1 Gbyte/s. The Grid distributed computing paradigm has been chosen as the solution to provide the requisite computing power. The demanding nature of CMS software and computing requirements, such as the production of large quantities of Monte Carlo simulated data, makes them an ideal test case for the Grid and a major driver for the development of Grid technologies. One important challenge when using the Grid for large-scale data analysis is the ability to monitor the large numbers of jobs that are being executed simultaneously at multiple remote sites. R-GMA is a monitoring and information management service for distributed resources based on the Grid Monitoring Architecture of the Global Grid Forum. In this paper we report on the first measurements of R-GMA as part of a monitoring architecture to be used for b...
Investigation of the CRT performance of a PET scanner based on liquid xenon: A Monte Carlo study
Gomez-Cadenas, J J; Ferrario, P; Monrabal, F; Rodríguez, J; Toledo, J F
2016-01-01
The measurement of the time of flight of the two 511 keV gammas recorded in coincidence in a PET scanner provides an effective way of reducing the random background and therefore increases the scanner sensitivity, provided that the coincidence resolving time (CRT) of the gammas is sufficiently good. Existing commercial systems based on LYSO crystals, such as the GEMINI of Philips, reach CRT values of ~600 ps (FWHM). In this paper we present a Monte Carlo investigation of the CRT performance of a PET scanner exploiting the scintillating properties of liquid xenon. We find that an excellent CRT of 60-70 ps (depending on the PDE of the sensor) can be obtained if the scanner is instrumented with silicon photomultipliers (SiPMs) sensitive to the ultraviolet light emitted by xenon. Alternatively, a CRT of 120 ps can be obtained by instrumenting the scanner with (much cheaper) blue-sensitive SiPMs coated with a suitable wavelength shifter. These results show the excellent time of flight capabilities of a PET device b...
Joshi, Kaushik; Chaudhuri, Santanu
2016-10-01
The ability to accelerate the morphological evolution of nanoscale precipitates is a fundamental challenge for atomistic simulations. Kinetic Monte Carlo (KMC) is an effective approach for accelerating the evolution of nanoscale systems that are dominated by so-called rare events. The quality and accuracy of the energy landscape used in KMC calculations can be significantly improved using DFT-informed interatomic potentials. Using a newly developed computational framework that uses the molecular simulator LAMMPS as a library function inside the KMC solver SPPARKS, we investigated the formation and growth of Guinier–Preston (GP) zones in dilute Al–Cu alloys at different temperatures and copper concentrations. The KMC simulations with the angular-dependent potential (ADP) predict the formation of coherent disc-shaped monolayers of copper atoms (GPI zones) at an early stage. Such monolayers are then gradually transformed into the energetically favored GPII phase, which has two aluminum layers sandwiched between copper layers. We analyzed the growth kinetics of the KMC trajectory using Johnson–Mehl–Avrami (JMA) theory and obtained a phase transformation index close to 1.0. In the presence of grain boundaries, the KMC calculations predict segregation of copper atoms near the grain boundaries instead of formation of GP zones. The computational framework presented in this work is based on open-source potentials and an open-source MD simulator, and can predict morphological changes during the evolution of the alloys in the bulk and around grain boundaries.
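The rare-event acceleration at the heart of KMC is the rejection-free event loop: pick an event with probability proportional to its rate, apply it, and advance the clock by an exponentially distributed waiting time. A minimal generic sketch follows (not the SPPARKS/LAMMPS interface), with a toy A-to-B conversion process as a usage example.

```python
import math
import random

def kmc_run(rates_fn, state, n_events, seed=3):
    """Rejection-free (residence-time) kinetic Monte Carlo loop.
    `rates_fn(state)` returns a list of (rate, apply_fn) pairs for the
    events currently possible; returns (elapsed time, final state)."""
    rng = random.Random(seed)
    t = 0.0
    for _ in range(n_events):
        events = rates_fn(state)
        total = sum(r for r, _ in events)
        if total == 0.0:
            break                                  # no events left
        # select an event with probability rate/total
        x = rng.random() * total
        for r, apply_fn in events:
            x -= r
            if x <= 0.0:
                apply_fn(state)
                break
        # clock advances by an exponential waiting time with mean 1/total
        t += -math.log(rng.random()) / total
    return t, state

# Toy usage: first-order conversion A -> B with rate constant 0.5 per atom
def rates(s):
    def hop(s2):
        s2["A"] -= 1
        s2["B"] += 1
    return [(0.5 * s["A"], hop)]

t_final, s_final = kmc_run(rates, {"A": 100, "B": 0}, 1000)
```

Because the clock jumps directly from event to event, no simulation time is wasted on the long quiescent intervals between rare hops, which is exactly why KMC can reach precipitate-growth timescales inaccessible to plain molecular dynamics.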
Energy Technology Data Exchange (ETDEWEB)
Brunetti, Antonio; Golosio, Bruno [Universita degli Studi di Sassari, Dipartimento di Scienze Politiche, Scienze della Comunicazione e Ingegneria dell' Informazione, Sassari (Italy); Melis, Maria Grazia [Universita degli Studi di Sassari, Dipartimento di Storia, Scienze dell' Uomo e della Formazione, Sassari (Italy); Mura, Stefania [Universita degli Studi di Sassari, Dipartimento di Agraria e Nucleo di Ricerca sulla Desertificazione, Sassari (Italy)
2014-11-08
X-ray fluorescence (XRF) is a well-known nondestructive technique. It is also applied to multilayer characterization, owing to its ability to estimate both the composition and the thickness of the layers. Several kinds of cultural heritage samples can be treated as complex multilayers, such as paintings, decorated objects, or some types of metallic samples. Furthermore, they often have rough surfaces, which makes a precise determination of the structure and composition harder. The standard quantitative XRF approach does not take this aspect into account. In this paper, we propose a novel approach based on the combined use of X-ray measurements performed with a polychromatic beam and Monte Carlo simulations. All the information contained in an X-ray spectrum is used. This approach yields a very good estimate of the sample contents both in terms of chemical elements and material thickness, and in this sense extends the capabilities of XRF measurements. Some examples are examined and discussed. (orig.)
Wierling, Christoph; Kühn, Alexander; Hache, Hendrik; Daskalaki, Andriani; Maschke-Dutz, Elisabeth; Peycheva, Svetlana; Li, Jian; Herwig, Ralf; Lehrach, Hans
2012-08-15
Cancer is known to be a complex disease and its therapy is difficult. Much information is available on molecules and pathways involved in cancer onset and progression and this data provides a valuable resource for the development of predictive computer models that can help to identify new potential drug targets or to improve therapies. Modeling cancer treatment has to take into account many cellular pathways usually leading to the construction of large mathematical models. The development of such models is complicated by the fact that relevant parameters are either completely unknown, or can at best be measured under highly artificial conditions. Here we propose an approach for constructing predictive models of such complex biological networks in the absence of accurate knowledge on parameter values, and apply this strategy to predict the effects of perturbations induced by anti-cancer drug target inhibitions on an epidermal growth factor (EGF) signaling network. The strategy is based on a Monte Carlo approach, in which the kinetic parameters are repeatedly sampled from specific probability distributions and used for multiple parallel simulations. Simulation results from different forms of the model (e.g., a model that expresses a certain mutation or mutation pattern or the treatment by a certain drug or drug combination) can be compared with the unperturbed control model and used for the prediction of the perturbation effects. This framework opens the way to experiment with complex biological networks in the computer, likely to save costs in drug development and to improve patient therapy.
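The Monte Carlo strategy described here, sampling unknown kinetic parameters from priors, simulating control and perturbed model variants in parallel, and comparing the outcomes, can be sketched on a toy two-parameter cascade. This is an illustration only, not the EGF network of the paper; the parameter ranges and the drug-inhibition factor are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(k_act, k_deg, inhibition=1.0, t_end=50.0, dt=0.01):
    """Toy one-species kinetics: constant activation, first-order
    degradation; a drug scales activation down by `inhibition`."""
    x = 0.0
    for _ in range(int(t_end / dt)):
        x += dt * (inhibition * k_act - k_deg * x)  # explicit Euler
    return x  # near-steady-state activated level

# Repeatedly sample the unknown kinetic parameters from log-uniform priors.
n_samples = 200
k_act = 10 ** rng.uniform(-1, 1, n_samples)
k_deg = 10 ** rng.uniform(-1, 1, n_samples)

control = np.array([simulate(a, d) for a, d in zip(k_act, k_deg)])
treated = np.array([simulate(a, d, inhibition=0.2)
                    for a, d in zip(k_act, k_deg)])

# Predicted perturbation effect, robust across parameter uncertainty:
ratio = treated / control
print(f"median response ratio = {np.median(ratio):.2f}")  # → 0.20
```

The point of the ensemble is that a conclusion holding across all sampled parameter sets (here, the response ratio) can be trusted even though no single parameter value is known accurately.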
Study on the Uncertainty of the Available Time Under Ship Fire Based on Monte Carlo Sampling Method
Institute of Scientific and Technical Information of China (English)
WANG Jin-hui; CHU Guan-quan; LI Kai-yuan
2013-01-01
Available safe egress time under ship fire (SFAT) is critical to ship fire safety assessment, design, and emergency rescue. Although SFAT can be determined using fire models such as the two-zone fire model CFAST or the field model FDS, none of these models can address the uncertainties in the input parameters. To solve this problem, the current study presents a framework of uncertainty analysis for SFAT. First, a deterministic model estimating SFAT is built. The uncertain input parameters are regarded as random variables with given probability distribution functions. Subsequently, the deterministic SFAT model is coupled with a Monte Carlo sampling method to investigate the uncertainty of SFAT. Spearman's rank-order correlation coefficient (SRCC) is used to examine the sensitivity of SFAT to each uncertain input parameter. To illustrate the proposed approach in detail, a case study is performed. Based on the proposed approach, the probability density function and cumulative density function of SFAT are obtained. Furthermore, a sensitivity analysis with regard to SFAT is conducted. The results show a strong negative correlation between SFAT and the fire growth coefficient, whereas the effect of the other parameters is so weak that they can be neglected.
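The coupling of a deterministic model with Monte Carlo sampling and an SRCC sensitivity check can be sketched as follows, using a t-squared fire-growth model as a stand-in for the zone-model SFAT calculation. All distributions and values are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000

# Uncertain inputs as random variables (hypothetical distributions):
alpha = rng.lognormal(mean=np.log(0.047), sigma=0.4, size=n)  # growth coeff., kW/s^2
q_crit = rng.normal(1000.0, 150.0, size=n)                    # critical HRR, kW

# Deterministic stand-in model: a t^2 fire reaches the critical heat
# release rate at t = sqrt(Q_crit / alpha).
t_avail = np.sqrt(np.clip(q_crit, 1.0, None) / alpha)

def spearman(x, y):
    # Rank-transform both variables, then take Pearson correlation.
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

print(f"mean SFAT = {t_avail.mean():.0f} s")
print(f"SRCC(alpha, SFAT) = {spearman(alpha, t_avail):+.2f}")
print(f"SRCC(Qcrit, SFAT) = {spearman(q_crit, t_avail):+.2f}")
```

Even in this toy model the fire-growth coefficient shows a strong negative rank correlation with the available time, mirroring the qualitative finding of the abstract; `np.histogram` over `t_avail` would give the PDF/CDF estimates.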
Institute of Scientific and Technical Information of China (English)
张颖
2011-01-01
Traditionally, the safety factor, a single value calculated by a deterministic method, has been used as the evaluation index of slope stability. However, the variability of the design parameters is not taken into account, so the safety factor alone cannot express the degree of slope safety. For this reason, this paper introduces the concept of reliability and applies the Monte Carlo and Rosenblueth methods, which are based on probability theory and mathematical statistics, to slope reliability analysis, thereby effectively compensating for the shortcomings of the traditional evaluation methodology.
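The Monte Carlo side of such a reliability analysis can be illustrated on the classical infinite-slope safety factor: sample the strength parameters, evaluate the safety factor for each draw, and estimate the failure probability and reliability index. All soil values below are hypothetical, and the Rosenblueth point-estimate method is not shown:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Infinite-slope model with random strength parameters (illustrative):
gamma, h = 18.0, 5.0                         # unit weight kN/m^3, depth m
beta = np.radians(30.0)                      # slope angle
c = rng.normal(10.0, 2.0, n)                 # cohesion, kPa
phi = np.radians(rng.normal(28.0, 3.0, n))   # friction angle

# FS = (c + gamma*h*cos^2(beta)*tan(phi)) / (gamma*h*sin(beta)*cos(beta))
fs = (c + gamma * h * np.cos(beta)**2 * np.tan(phi)) / (
      gamma * h * np.sin(beta) * np.cos(beta))

pf = np.mean(fs < 1.0)                   # probability of failure
beta_rel = (fs.mean() - 1.0) / fs.std()  # first-order reliability index
print(f"mean FS = {fs.mean():.2f}, Pf = {pf:.4f}, beta = {beta_rel:.2f}")
```

The distribution of `fs`, not just its mean, carries the safety information: two slopes with the same mean safety factor can have very different failure probabilities, which is exactly the shortcoming of the deterministic index that the abstract points out.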
Directory of Open Access Journals (Sweden)
Sandeep Chakraborty
The pathway of proton abstraction (PA), a key aspect of most catalytic reactions, is often controversial and highly debated. Ultrahigh-resolution diffraction studies, molecular dynamics, quantum mechanics, and molecular mechanics simulations are often adopted to gain insight into PA mechanisms in enzymes. These methods require expertise and effort to set up and can be computationally intensive. We present a push-button methodology--Proton abstraction Simulation (PRISM)--to enumerate the possible pathways of PA in a protein with known 3D structure, based on the spatial and electrostatic properties of residues in the proximity of a given nucleophilic residue. Proton movements are evaluated in the vicinity of this nucleophilic residue based on distances, potential differences, spatial channels, and the characteristics of the individual residues (polarity, acidic, basic, etc.). Modulating these parameters eliminates their empirical nature and might also reveal pathways that originate from conformational changes. We have validated our method using serine proteases and concurred with the dichotomy in PA in Class A β-lactamases, both of which are hydrolases. The PA mechanism in a transferase has also been corroborated. The source code is made available at www.sanchak.com/prism.
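The core idea of enumerating proton-hop pathways from spatial proximity around a nucleophile can be sketched as a graph search over residues within hydrogen-bonding distance. The coordinates and residue names below are hypothetical, and the electrostatic screening that PRISM also performs is omitted:

```python
import math
from collections import deque

# Toy residue positions in Å (hypothetical, not from any PDB entry):
residues = {
    "Ser70":  (0.0, 0.0, 0.0),
    "Lys73":  (3.1, 0.5, 0.2),
    "Glu166": (5.9, 1.0, 0.4),
    "Wat1":   (3.0, 3.2, 0.1),
}
CUTOFF = 3.5  # Å, a typical hydrogen-bond distance

def dist(a, b):
    return math.dist(residues[a], residues[b])

# Proximity graph: an edge means a proton could plausibly hop.
graph = {r: [s for s in residues if s != r and dist(r, s) <= CUTOFF]
         for r in residues}

def proton_paths(start, goal):
    """Enumerate simple paths (BFS) from the nucleophile to the base."""
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            paths.append(path)
            continue
        for nxt in graph[path[-1]]:
            if nxt not in path:
                queue.append(path + [nxt])
    return paths

print(proton_paths("Ser70", "Glu166"))  # → [['Ser70', 'Lys73', 'Glu166']]
```

Loosening `CUTOFF` or adding residue-type filters (acidic, basic, polar) changes which pathways survive, which is the kind of parameter modulation the abstract describes.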
Directory of Open Access Journals (Sweden)
S. Maiti
2011-03-01
The Koyna region has been well known for triggered seismic activity since the hazardous earthquake of M=6.3 occurred near the Koyna reservoir on 10 December 1967. Understanding the shallow resistivity distribution in such a seismically critical area is vital for mapping faults, fractures, and lineaments. However, deducing the true resistivity distribution from apparent resistivity data is difficult because of the intrinsic non-linearity in the data. Here we present a new technique based on Bayesian neural network (BNN) theory using a Hybrid Monte Carlo (HMC)/Markov Chain Monte Carlo (MCMC) simulation scheme. The new method is applied to invert one- and two-dimensional direct current (DC) vertical electrical sounding (VES) data acquired around the Koyna region in India. Before being applied to actual resistivity data, the new method was tested on synthetic signals. In this approach, the objective/cost function is optimized with an HMC/MCMC sampling-based algorithm, and each trajectory is updated by approximating the Hamiltonian differential equations through a leapfrog discretization scheme. The stability of the new inversion technique was tested in the presence of correlated red noise, and the uncertainty of the result was estimated using the BNN code. The estimated true resistivity distribution was compared with the results of conventional resistivity inversion based on singular value decomposition (SVD). The results of the HMC-based Bayesian neural network are in good agreement with the existing model results; in some cases, they also provide more detailed and precise results, which appear to be justified by local geological and structural details. The new BNN approach based on HMC is faster and proved to be a promising inversion scheme for interpreting complex and non-linear resistivity problems. The HMC-based BNN results
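The leapfrog discretization of the Hamiltonian equations that drives the HMC sampler can be sketched for a one-dimensional target; a standard normal stands in here for the posterior over resistivity models, and the step size and trajectory length are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(11)

def grad_U(q):
    # Target: standard normal, so U(q) = q^2/2 and dU/dq = q.
    return q

def hmc_step(q, eps=0.1, n_leap=20):
    """One HMC step: draw a momentum, integrate Hamilton's equations
    with the leapfrog scheme, then Metropolis accept/reject."""
    p = rng.normal()
    q_new, p_new = q, p
    p_new -= 0.5 * eps * grad_U(q_new)   # initial momentum half-step
    for _ in range(n_leap):
        q_new += eps * p_new             # full position step
        p_new -= eps * grad_U(q_new)     # full momentum step
    p_new += 0.5 * eps * grad_U(q_new)   # undo the extra half-step
    dH = (0.5 * p_new**2 + 0.5 * q_new**2) - (0.5 * p**2 + 0.5 * q**2)
    return q_new if np.log(rng.random()) < -dH else q

q, chain = 0.0, []
for _ in range(5000):
    q = hmc_step(q)
    chain.append(q)
samples = np.asarray(chain[500:])        # discard burn-in
print(f"mean = {samples.mean():.2f}, std = {samples.std():.2f}")
```

Because the leapfrog integrator is symplectic, the Hamiltonian error `dH` stays small and most proposals are accepted even for long trajectories, which is why HMC explores high-dimensional, correlated posteriors (such as network weights) far faster than random-walk MCMC.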