Monte Carlo simulations on SIMD computer architectures
International Nuclear Information System (INIS)
Burmester, C.P.; Gronsky, R.; Wille, L.T.
1992-01-01
In this paper, algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development that optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures.
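The lattice-partitioning idea can be sketched with a checkerboard decomposition: the two sublattices are updated alternately, so every site within a sublattice is independent of the others and the whole sublattice can change in lockstep, the same data-parallel structure a SIMD processor array exploits. The numpy sketch below is illustrative only; it is not the MasPar code, and the lattice size, coupling and temperature are arbitrary choices.

```python
import numpy as np

def checkerboard_sweep(spins, beta, rng):
    """One Metropolis sweep of a 2D nearest-neighbour Ising lattice,
    updating the two checkerboard sublattices in turn so each sublattice
    update is fully data-parallel."""
    for parity in (0, 1):
        # Sum of the four nearest neighbours with periodic boundaries.
        nn = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
              np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        dE = 2.0 * spins * nn                       # energy cost of flipping
        accept = rng.random(spins.shape) < np.exp(-beta * np.clip(dE, 0, None))
        mask = (np.indices(spins.shape).sum(axis=0) % 2 == parity) & accept
        spins = np.where(mask, -spins, spins)
    return spins

def energy(s):
    """Total nearest-neighbour coupling energy (each bond counted once)."""
    return -np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1)))

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(32, 32))
e0 = energy(spins)
for _ in range(50):
    spins = checkerboard_sweep(spins, beta=1.0, rng=rng)
e1 = energy(spins)                                  # well below e0 at low T
```

Because all sites of one colour are updated by a single vectorized expression, the same structure maps onto a processor array with one site (or block of sites) per processing element.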
Radiotherapy Monte Carlo simulation using cloud computing technology.
Poole, C M; Cornelius, I; Trapp, J V; Langton, C M
2012-12-01
Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware.
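The reported cost behaviour (cost is optimal when n divides the total simulation time in hours) can be reproduced with a toy billing model. The per-machine-hour rounding and the rate below are assumptions for illustration, not figures from the paper.

```python
import math

def cloud_cost(total_cpu_hours, n_machines, rate_per_hour=1.0):
    """Illustrative cost model for an embarrassingly parallel Monte Carlo
    run: wall-clock time falls as 1/n, but each machine is billed in whole
    hours (rounded up), so cost is minimal when n divides the total
    CPU-hours."""
    wall_hours = total_cpu_hours / n_machines
    billed_hours = math.ceil(wall_hours)        # per-machine billing unit
    return n_machines * billed_hours * rate_per_hour

# A 24 CPU-hour simulation: n = 6 divides 24 exactly, n = 7 does not,
# so the n = 7 run pays for idle fractions of the final hour.
costs = {n: cloud_cost(24, n) for n in (1, 6, 7, 24)}
```

With these assumptions, any n that divides 24 costs the same 24 machine-hours, while n = 7 wastes part of the last billed hour on every machine.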
Radiotherapy Monte Carlo simulation using cloud computing technology
International Nuclear Information System (INIS)
Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.
2012-01-01
Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware.
A computer code package for electron transport Monte Carlo simulation
International Nuclear Information System (INIS)
Popescu, Lucretiu M.
1999-01-01
A computer code package was developed for solving various electron transport problems by Monte Carlo simulation. It is based on the condensed-history Monte Carlo algorithm. In order to obtain reliable results over wide ranges of electron energies and target atomic numbers, specific electron transport techniques were implemented, such as Moliere multiple-scattering angular distributions, the Blunck-Leisegang multiple-scattering energy distribution, and sampling of individual electron-electron and bremsstrahlung interactions. Path-length and lateral displacement correction algorithms, and a module for computing collision, radiative and total restricted stopping powers and electron ranges, are also included. Comparisons of simulation results with experimental measurements are finally presented. (author)
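The core move of a condensed-history step is to replace many individual small-angle Coulomb scatters by one sampled net deflection. The sketch below uses a Gaussian with the Highland width as a simple stand-in for the Moliere distribution that the package itself samples; it is not the paper's algorithm, and beta*c*p is approximated by the electron energy (reasonable at relativistic energies).

```python
import math
import random
import statistics

def condensed_history_step(energy_mev, step_cm, x0_cm, rng):
    """Sample the net angular deflection (radians) accumulated over one
    condensed-history step of length step_cm in a material with radiation
    length x0_cm, using the Highland width as a Gaussian stand-in for the
    Moliere multiple-scattering distribution."""
    t = step_cm / x0_cm                          # step in radiation lengths
    theta0 = (13.6 / energy_mev) * math.sqrt(t) * (1.0 + 0.038 * math.log(t))
    return rng.gauss(0.0, theta0)

rng = random.Random(42)
angles = [condensed_history_step(10.0, step_cm=0.1, x0_cm=1.0, rng=rng)
          for _ in range(20000)]
spread = statistics.pstdev(angles)               # close to theta0, about 0.39 rad
```

A real condensed-history code also samples an energy-loss straggling distribution (Blunck-Leisegang in this package) and applies path-length and lateral displacement corrections over the same step.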
Computed radiography simulation using the Monte Carlo code MCNPX
International Nuclear Information System (INIS)
Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.
2009-01-01
Simulating x-ray images has been of great interest in recent years, as it makes it possible to analyze how x-ray images are affected by relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector, as well as the characteristic noise of a 16-bit computed radiography system, were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it yields results comparable with experimental data. (author)
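The post-processing step the abstract describes, mapping a simulated dose map through the plate's sensitivity curve and adding the system's characteristic noise before 16-bit quantization, can be sketched as below. The logarithmic curve shape and the noise level are hypothetical stand-ins, not the measured BaFBr response used in the paper.

```python
import numpy as np

def to_cr_image(dose, noise_sigma=50.0, seed=1):
    """Map a simulated detector dose map through an (assumed logarithmic)
    image-plate sensitivity curve, add Gaussian system noise, and quantize
    to the 16-bit dynamic range of a CR system."""
    rng = np.random.default_rng(seed)
    signal = np.log1p(dose)                       # assumed sensitivity curve
    signal = signal / signal.max() * 65535.0      # stretch to 16-bit range
    noisy = signal + rng.normal(0.0, noise_sigma, dose.shape)
    return np.clip(noisy, 0, 65535).astype(np.uint16)

dose = np.linspace(0.0, 10.0, 256).reshape(16, 16)   # toy dose map
img = to_cr_image(dose)
```

In the paper's workflow the dose map itself would come from an MCNPX tally; here it is a synthetic gradient so the transform can be run standalone.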
Computed radiography simulation using the Monte Carlo code MCNPX
Energy Technology Data Exchange (ETDEWEB)
Correa, S.C.A. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Centro Universitario Estadual da Zona Oeste (CCMAT)/UEZO, Av. Manuel Caldeira de Alvarenga, 1203, Campo Grande, 23070-200, Rio de Janeiro, RJ (Brazil); Souza, E.M. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Silva, A.X., E-mail: ademir@con.ufrj.b [PEN/COPPE-DNC/Poli CT, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Cassiano, D.H. [Instituto de Radioprotecao e Dosimetria/CNEN Av. Salvador Allende, s/n, Recreio, 22780-160, Rio de Janeiro, RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil)
2010-09-15
Simulating X-ray images has been of great interest in recent years, as it makes it possible to analyze how X-ray images are affected by relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector, as well as the characteristic noise of a 16-bit computed radiography system, were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it yields results comparable with experimental data.
Monte Carlo simulation with the Gate software using grid computing
International Nuclear Information System (INIS)
Reuillon, R.; Hill, D.R.C.; Gouinaud, C.; El Bitar, Z.; Breton, V.; Buvat, I.
2009-03-01
Monte Carlo simulations are widely used in emission tomography, for protocol optimization, design of processing or data analysis methods, tomographic reconstruction, or tomograph design optimization. Monte Carlo simulations needing many replicates to obtain good statistical results can be easily executed in parallel using the 'Multiple Replications In Parallel' approach. However, several precautions have to be taken in generating the parallel streams of pseudo-random numbers. In this paper, we present the distribution of Monte Carlo simulations performed with the GATE software using local clusters and grid computing. We obtained very convincing results with this large medical application thanks to the EGEE Grid (Enabling Grids for E-sciencE), achieving in one week computations that could have taken more than 3 years of processing on a single computer. This work was achieved thanks to a generic object-oriented toolbox called DistMe, which we designed to automate this kind of parallelization for Monte Carlo simulations. This toolbox, written in Java, is freely available on SourceForge and helped ensure a rigorous distribution of pseudo-random number streams. It is based on the use of a documented XML format for random number generator statuses. (authors)
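The precaution the abstract stresses, giving each parallel replication a statistically independent pseudo-random stream, is exactly what tools like DistMe automate. A minimal sketch of the 'Multiple Replications In Parallel' pattern using numpy's spawnable seed sequences (an assumed stand-in for the paper's generator-status machinery):

```python
import numpy as np

def replicate_in_parallel(n_replicates, n_samples):
    """Run n_replicates independent Monte Carlo replications, each with its
    own non-overlapping pseudo-random stream obtained via
    SeedSequence.spawn. Each loop body could run on a separate node."""
    streams = np.random.SeedSequence(42).spawn(n_replicates)
    results = []
    for stream in streams:
        rng = np.random.default_rng(stream)
        # Toy 'simulation': Monte Carlo estimate of pi from random points.
        xy = rng.random((n_samples, 2))
        results.append(4.0 * np.mean((xy ** 2).sum(axis=1) < 1.0))
    return results

estimates = replicate_in_parallel(8, 100_000)
```

Because the streams are guaranteed not to overlap, the replications can be aggregated as if they were one long serial run; reusing the same seed on every node would instead produce eight copies of the same result.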
Improving computational efficiency of Monte Carlo simulations with variance reduction
International Nuclear Information System (INIS)
Turner, A.; Davis, A.
2013-01-01
CCFE performs Monte Carlo transport simulations on large and complex tokamak models such as ITER. Such simulations are challenging since streaming and deep-penetration effects are equally important. In order to make such simulations tractable, both variance reduction (VR) techniques and parallel computing are used. It has been found that the application of VR techniques in such models significantly reduces the efficiency of parallel computation due to 'long histories'. VR in MCNP can be accomplished using energy-dependent weight windows (WW). The weight window represents an 'average behaviour' of particles; a large deviation in the arriving weight of a particle gives rise to an extreme amount of splitting and hence a long history. When running on parallel clusters, a long history can have a detrimental effect on parallel efficiency: while one process is computing the long history, the other CPUs complete their batch of histories and wait idle. Furthermore, some long histories have been found to be effectively intractable. To combat this effect, CCFE has developed an adaptation of MCNP which dynamically adjusts the WW where a large weight deviation is encountered. The method effectively 'de-optimises' the WW, reducing the VR performance, but this is offset by a significant increase in parallel efficiency. Testing with a simple geometry has shown that the method does not bias the result. This 'long history method' has enabled CCFE to significantly improve the performance of MCNP calculations for ITER on parallel clusters, and will be beneficial for any geometry combining streaming and deep-penetration effects. (authors)
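The mechanism behind 'long histories' and the cap that tames them can be sketched with a toy weight-window routine. This is illustrative only, not the MCNP implementation: the window bounds, the splitting rule and the cap value are arbitrary, and capping the number of daughters is one simple way to 'de-optimise' the window as the abstract describes.

```python
import random

def apply_weight_window(weight, w_low, w_high, max_split=10):
    """Toy weight-window check. A particle arriving far above the window is
    split into daughters, but the daughter count is capped so no single
    history can explode; a particle below the window plays Russian
    roulette. Returns the list of daughter weights (empty = killed)."""
    if weight > w_high:
        n = min(int(weight / w_high) + 1, max_split)   # capped splitting
        return [weight / n] * n                        # weight is conserved
    if weight < w_low:
        # Russian roulette: survive with probability weight / w_low.
        return [w_low] if random.random() < weight / w_low else []
    return [weight]

# A particle arriving with 50x the window ceiling would normally split into
# ~51 daughters; the cap limits it to 10 heavier daughters instead.
daughters = apply_weight_window(100.0, w_low=1.0, w_high=2.0)
```

Uncapped, each daughter may itself split further downstream, which is how one unlucky history can keep a single CPU busy while the rest of the cluster sits idle.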
Monte Carlo computer simulation of sedimentation of charged hard spherocylinders
International Nuclear Information System (INIS)
Viveros-Méndez, P. X.; Aranda-Espinoza, S.; Gil-Villegas, Alejandro
2014-01-01
In this article we present an NVT Monte Carlo computer simulation study of sedimentation of an electroneutral mixture of oppositely charged hard spherocylinders (CHSC) with aspect ratio L/σ = 5, where L and σ are the length and diameter of the cylinder and hemispherical caps, respectively, for each particle. This system is an extension of the restricted primitive model for spherical particles, where L/σ = 0, and it is assumed that the ions are immersed in a structureless solvent, i.e., a continuum with dielectric constant D. The system consisted of N = 2000 particles, and the Wolf method was implemented to handle the Coulombic interactions of the inhomogeneous system. Results are presented for different values of the strength ratio between the gravitational and electrostatic interactions, Γ = (mgσ)/(e²/Dσ), where m is the mass per particle, e is the electron's charge and g is the gravitational acceleration. A semi-infinite simulation cell was used with dimensions L_x ≈ L_y and L_z = 5L_x, where L_x, L_y, and L_z are the box dimensions in Cartesian coordinates, and the gravitational force acts along the z-direction. Sedimentation effects were studied by looking at every layer formed by the CHSC along the gravitational field. By increasing Γ, particles tend to become more closely packed in each layer and to arrange in local domains with orientational ordering along two perpendicular axes, a feature not observed in the uncharged system with the same hard-body geometry. This type of arrangement, known as a tetratic phase, has been observed in two-dimensional systems of hard rectangles and rounded hard squares. In this way, the coupling of gravitational and electric interactions in the CHSC system induces the arrangement of particles in layers, with the formation of quasi-two-dimensional tetratic phases near the surface.
Monte Carlo computer simulation of sedimentation of charged hard spherocylinders
Energy Technology Data Exchange (ETDEWEB)
Viveros-Méndez, P. X., E-mail: xviveros@fisica.uaz.edu.mx; Aranda-Espinoza, S. [Unidad Académica de Física, Universidad Autónoma de Zacatecas, Calzada Solidaridad esq. Paseo, La Bufa s/n, 98060 Zacatecas, Zacatecas, México (Mexico); Gil-Villegas, Alejandro [Departamento de Ingeniería Física, División de Ciencias e Ingenierías, Campus León, Universidad de Guanajuato, Loma del Bosque 103, Lomas del Campestre, 37150 León, Guanajuato, México (Mexico)
2014-07-28
In this article we present an NVT Monte Carlo computer simulation study of sedimentation of an electroneutral mixture of oppositely charged hard spherocylinders (CHSC) with aspect ratio L/σ = 5, where L and σ are the length and diameter of the cylinder and hemispherical caps, respectively, for each particle. This system is an extension of the restricted primitive model for spherical particles, where L/σ = 0, and it is assumed that the ions are immersed in a structureless solvent, i.e., a continuum with dielectric constant D. The system consisted of N = 2000 particles, and the Wolf method was implemented to handle the Coulombic interactions of the inhomogeneous system. Results are presented for different values of the strength ratio between the gravitational and electrostatic interactions, Γ = (mgσ)/(e²/Dσ), where m is the mass per particle, e is the electron's charge and g is the gravitational acceleration. A semi-infinite simulation cell was used with dimensions L_x ≈ L_y and L_z = 5L_x, where L_x, L_y, and L_z are the box dimensions in Cartesian coordinates, and the gravitational force acts along the z-direction. Sedimentation effects were studied by looking at every layer formed by the CHSC along the gravitational field. By increasing Γ, particles tend to become more closely packed in each layer and to arrange in local domains with orientational ordering along two perpendicular axes, a feature not observed in the uncharged system with the same hard-body geometry. This type of arrangement, known as a tetratic phase, has been observed in two-dimensional systems of hard rectangles and rounded hard squares. In this way, the coupling of gravitational and electric interactions in the CHSC system induces the arrangement of particles in layers, with the formation of quasi-two-dimensional tetratic phases near the surface.
Parallel Monte Carlo simulations on an ARC-enabled computing grid
International Nuclear Information System (INIS)
Nilsen, Jon K; Samset, Bjørn H
2011-01-01
Grid computing opens new possibilities for running heavy Monte Carlo simulations of physical systems in parallel. The presentation gives an overview of GaMPI, a system for running an MPI-based random walker simulation on grid resources. Integrating the ARC middleware and the new storage system Chelonia with the Ganga grid job submission and control system, we show that MPI jobs can be run on a world-wide computing grid with good performance and promising scaling properties. Results for relatively communication-heavy Monte Carlo simulations run on multiple heterogeneous, ARC-enabled computing clusters in several countries are presented.
Adding computationally efficient realism to Monte Carlo turbulence simulation
Campbell, C. W.
1985-01-01
Frequently in aerospace vehicle flight simulation, random turbulence is generated using the assumption that the craft is small compared to the length scales of turbulence. The turbulence is presumed to vary only along the flight path of the vehicle, but not across the vehicle span. The addition of the realism of three-dimensionality is a worthy goal, but any such attempt will not gain acceptance in the simulator community unless it is computationally efficient. A concept for adding three-dimensional realism with a minimum of computational complexity is presented. The concept involves the use of close rational approximations to irrational spectra and cross-spectra, so that systems of stable, explicit difference equations can be used to generate the turbulence.
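The 'stable explicit difference equation' idea can be sketched in one dimension: a rational (here first-order) approximation to the gust spectrum turns into a cheap recursion driven by white noise, whose output has the exponential autocorrelation of a Dryden-style longitudinal gust. All parameter values below are illustrative, not from the paper.

```python
import math
import random

def gust_series(n_steps, dt, length_scale, airspeed, sigma, seed=7):
    """Generate a gust velocity time series along the flight path with a
    first-order stable explicit difference equation. The recursion
    u[k+1] = a*u[k] + b*w[k] (w = unit white noise) realizes a rational
    approximation to the gust spectrum with variance sigma^2."""
    rng = random.Random(seed)
    tau = length_scale / airspeed                # correlation time along path
    a = math.exp(-dt / tau)                      # stable: |a| < 1
    b = sigma * math.sqrt(1.0 - a * a)           # keeps output variance sigma^2
    u, series = 0.0, []
    for _ in range(n_steps):
        u = a * u + b * rng.gauss(0.0, 1.0)      # the difference equation
        series.append(u)
    return series

gusts = gust_series(20000, dt=0.01, length_scale=50.0, airspeed=50.0, sigma=1.5)
mean = sum(gusts) / len(gusts)
var = sum((g - mean) ** 2 for g in gusts) / len(gusts)
```

Extending this to three-dimensional realism, as the paper proposes, means coupling several such recursions so their cross-spectra approximate the turbulence cross-correlations across the vehicle span.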
Microcanonical Monte Carlo approach for computing melting curves by atomistic simulations
Davis, Sergio; Gutiérrez, Gonzalo
2017-01-01
We report microcanonical Monte Carlo simulations of melting and superheating of a generic Lennard-Jones system starting from the crystalline phase. The isochoric curve, the melting temperature $T_m$, and the critical superheating temperature $T_{LS}$ obtained are in close agreement (well within the microcanonical temperature fluctuations) with standard molecular dynamics one-phase and two-phase methods. These results validate the use of microcanonical Monte Carlo to compute melting points, a ...
CloudMC: a cloud computing application for Monte Carlo simulation
International Nuclear Information System (INIS)
Miras, H; Jiménez, R; Miras, C; Gomà, C
2013-01-01
This work presents CloudMC, a cloud computing application, developed in Windows Azure®, the platform of the Microsoft® cloud, for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with PENELOPE were performed for different instance (virtual machine) sizes and different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU time on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice. (note)
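The Amdahl's-law scaling the abstract reports is easy to reproduce numerically. The serial fraction below is fitted here purely to illustrate how a speedup near 37× on 64 instances arises; it is not a value given in the paper.

```python
def amdahl_speedup(n, serial_fraction):
    """Amdahl's-law speedup on n instances when a fraction s of the work
    is non-parallelizable: speedup = 1 / (s + (1 - s) / n)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n)

# A non-parallelizable fraction of roughly 1% bends the ideal 64x down to
# about 37x, consistent with the reported CloudMC measurement.
speedup_64 = amdahl_speedup(64, 0.0115)
```

The formula also explains the reported deviation: if the serial fraction itself grows with the number of instances (more coordination overhead), the measured curve falls slightly below the fixed-fraction Amdahl prediction.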
CloudMC: a cloud computing application for Monte Carlo simulation.
Miras, H; Jiménez, R; Miras, C; Gomà, C
2013-04-21
This work presents CloudMC, a cloud computing application, developed in Windows Azure®, the platform of the Microsoft® cloud, for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with PENELOPE were performed for different instance (virtual machine) sizes and different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU time on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
A Monte Carlo simulation of scattering reduction in spectral x-ray computed tomography
DEFF Research Database (Denmark)
Busi, Matteo; Olsen, Ulrik Lund; Bergbäck Knudsen, Erik
2017-01-01
In X-ray computed tomography (CT), scattered radiation plays an important role in the accurate reconstruction of the inspected object, leading to a loss of contrast between the different materials in the reconstruction volume and cupping artifacts in the images. We present a Monte Carlo simulation...
The adaptation method in the Monte Carlo simulation for computed tomography
Energy Technology Data Exchange (ETDEWEB)
Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)
2015-06-15
The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York, NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated, and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method is highly effective for simulations that require a large number of iterations. Assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.
International Nuclear Information System (INIS)
Nomura, Yasushi; Tamaki, Hitoshi; Kanai, Shigeru
2000-04-01
In a plant system consisting of complex equipment and components, as in a reprocessing facility, there may be a grace time between an initiating event and the resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, and to obtain an accident occurrence frequency without difficulty. Firstly, basic methods of the component Monte Carlo simulation for obtaining an accident occurrence frequency are introduced, and then basic performance characteristics, such as precision, convergence, and parallelization of the calculation, are shown through calculation of a prototype accident sequence model. As an example to illustrate applicability to a real-scale plant model, a red-oil explosion in a model of a German reprocessing plant is simulated, showing that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented as a further performance demonstration, and a new input-data format adapted to component Monte Carlo simulation is proposed. The present paper describes the calculational method, performance, applicability to a real-scale model, and the new proposal for the TITAN code. In the Appendixes, a conventional analytical method is shown that avoids the complex and laborious calculation needed to obtain a strict solution of the accident occurrence frequency, for comparison with the Monte Carlo method. The user's manual and the program list/structure are also contained in the Appendixes to facilitate use of the TITAN computer program. (author)
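The grace-time logic can be illustrated with a toy component Monte Carlo loop in the spirit of TITAN (this is not the TITAN algorithm): an accident is tallied only when an initiating event occurs AND the remedial action fails to complete within the grace time. Both probabilities below are made-up illustration values.

```python
import random

def accident_frequency(n_histories, p_initiator, p_remedy_in_grace, seed=3):
    """Estimate the accident occurrence frequency by sampling histories:
    each history draws whether an initiating event occurs and, if so,
    whether operators complete the remedial action within the grace time."""
    rng = random.Random(seed)
    accidents = 0
    for _ in range(n_histories):
        initiated = rng.random() < p_initiator
        remedied = rng.random() < p_remedy_in_grace
        if initiated and not remedied:
            accidents += 1
    return accidents / n_histories

freq = accident_frequency(200_000, p_initiator=0.01, p_remedy_in_grace=0.9)
# Expected frequency: p_initiator * (1 - p_remedy_in_grace) = 1e-3.
```

In this two-event toy the analytical answer is trivial; the point of a component-level Monte Carlo code is that the same sampling loop still works when the grace time, repair times and component dependencies make an analytical solution laborious or intractable.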
Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian
2018-01-01
We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.
Liu, Yangchuan; Tang, Yuguo; Gao, Xin
2017-12-01
The GATE Monte Carlo simulation platform has good application prospects for treatment planning and quality assurance. However, accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, runtime reductions of 41× and 32× were achieved compared to the single-worker-node case and the single-threaded case, respectively. A test of Hadoop's fault tolerance showed that simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
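The split-and-aggregate workflow (sub-macros in Map tasks, tally aggregation in Reduce) can be sketched in plain Python. The per-photon 'energy deposition' below is a toy stand-in; real Map tasks launch GATE on each sub-macro via Hadoop Streaming.

```python
import random

def run_sub_macro(n_photons, seed):
    """Stand-in for one Map task: 'simulate' a self-contained sub-macro and
    return its partial dose tally. Each photon deposits an exponentially
    distributed toy energy; a real task would run GATE instead."""
    rng = random.Random(seed)
    return sum(rng.expovariate(1.0) for _ in range(n_photons))

def simulate_mapreduce(total_photons, n_tasks):
    """Split the macro into self-contained sub-macros (Map) and aggregate
    the partial tallies (Reduce)."""
    per_task = total_photons // n_tasks
    partials = [run_sub_macro(per_task, seed=task) for task in range(n_tasks)]
    return sum(partials)                          # the Reduce step

dose = simulate_mapreduce(100_000, n_tasks=10)
```

Because each sub-macro carries its own seed and is independent of the others, a failed Map task can simply be re-executed on another node, which is the fault-tolerance property the paper tests.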
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure
International Nuclear Information System (INIS)
Wang, Henry; Ma Yunzhi; Pratx, Guillem; Xing Lei
2011-01-01
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by the single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)
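The quoted 47× figure follows directly from the reported runtimes, and dividing by the node count gives the parallel efficiency, a quick way to see how much of the cluster is doing useful work:

```python
def speedup_and_efficiency(t_serial_min, t_parallel_min, n_nodes):
    """Speedup is the serial-to-parallel runtime ratio; parallel efficiency
    divides that speedup by the number of nodes (1.0 = ideal scaling)."""
    s = t_serial_min / t_parallel_min
    return s, s / n_nodes

# Figures quoted above: 2.58 h on one local machine vs 3.3 min on 100 nodes.
speedup, efficiency = speedup_and_efficiency(2.58 * 60, 3.3, 100)
```

An efficiency near 0.47 rather than 1.0 is consistent with the fixed startup and distribution overhead the paper describes, which matters less as the simulation grows.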
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure
Energy Technology Data Exchange (ETDEWEB)
Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by the single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by the single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
Energy Technology Data Exchange (ETDEWEB)
Nomura, Yasushi [Department of Fuel Cycle Safety Research, Nuclear Safety Research Center, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Tamaki, Hitoshi [Department of Safety Research Technical Support, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Kanai, Shigeru [Fuji Research Institute Corporation, Tokyo (Japan)
2000-04-01
In a plant system consisting of complex equipment and components, such as a reprocessing facility, there may be a grace time between an initiating event and the resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such complex reliability models, including the grace time, without difficulty and to obtain an accident occurrence frequency. Firstly, the basic methods of component Monte Carlo simulation for obtaining an accident occurrence frequency are introduced, and then basic performance characteristics, such as precision, convergence, and parallelization of calculation, are shown through calculation of a prototype accident sequence model. As an example illustrating applicability to a real-scale plant model, a red-oil explosion in a model of a German reprocessing plant is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show further performance, and a new input-data format adapted to component Monte Carlo simulation is proposed. The present paper describes the calculational method, performance, applicability to a real-scale plant, and the new proposal for the TITAN code. In the Appendixes, a conventional analytical method is shown that avoids the complex and laborious calculation needed to obtain a strict solution of the accident occurrence frequency, in comparison with the Monte Carlo method. The user's manual and the list/structure of the program are also contained in the Appendixes to facilitate use of the TITAN computer program. (author)
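The grace-time mechanism described above can be illustrated with a toy component Monte Carlo model. The two-event layout, rates, and grace time below are invented for illustration and are not taken from the TITAN code: an initiating event arrives as a Poisson process, and an accident is counted only when the remedial action fails to complete within the grace time.

```python
import random

def accident_frequency(n_trials, init_rate, grace_time, mean_response, seed=1):
    """Toy component Monte Carlo: per trial year, sample whether an
    initiating event occurs and whether the remedial action (exponential
    response time) beats the grace time. Returns accidents per year."""
    rng = random.Random(seed)
    accidents = 0
    for _ in range(n_trials):
        t = rng.expovariate(init_rate)            # time of initiating event
        if t > 1.0:                               # no initiator this year
            continue
        response = rng.expovariate(1.0 / mean_response)
        if response > grace_time:                 # crew too slow -> accident
            accidents += 1
    return accidents / n_trials

freq = accident_frequency(200_000, init_rate=0.1,
                          grace_time=8.0, mean_response=4.0)
# analytic check for this toy model:
#   P(initiator in a year) * P(response > grace)
#   = (1 - exp(-0.1)) * exp(-8/4) ≈ 0.0952 * 0.1353 ≈ 0.0129
```

The point of the sampling approach, as in TITAN, is that the same loop keeps working when the analytic product above becomes intractable (multiple components, repair cycles, dependent failures).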
International Nuclear Information System (INIS)
Popescu, Lucretiu M.
2000-01-01
A computer code package (PTSIM) for particle transport Monte Carlo simulation was developed using object-oriented techniques of design and programming. A flexible system for simulation of coupled photon and electron transport, facilitating the development of efficient simulation applications, was obtained. For photons, Compton and photo-electric effects, pair production and Rayleigh interactions are simulated, while for electrons a class II condensed history scheme was considered, in which catastrophic interactions (Moeller electron-electron interaction, bremsstrahlung, etc.) are treated in detail and all other interactions with reduced individual effect on the electron history are grouped together using the continuous slowing down approximation and energy straggling theories. Electron angular straggling is simulated using Moliere theory or a mixed model in which scatters at large angles are treated as distinct events. Comparisons with experimental benchmarks for electron transmission, bremsstrahlung emission energy and angular spectra, and dose calculations are presented
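The class II (mixed) condensed-history idea can be sketched as follows. The cutoff, step length, stopping power, and collision probability below are invented toy values, not PTSIM's physics; the sketch only shows the split between a grouped continuous loss and individually sampled catastrophic events.

```python
import random

W_CUT = 0.05   # MeV: losses above this are 'catastrophic' (sampled in detail)

def transport_electron(e0, step=0.1, seed=7):
    """Toy class II condensed-history loop: soft losses are grouped into a
    continuous slowing-down term per step; hard (catastrophic) collisions
    above the cutoff W_CUT are sampled as discrete events."""
    rng = random.Random(seed)
    energy, hard_events = e0, 0
    while energy > 0.01:                     # follow down to a cutoff energy
        # soft part: continuous slowing down, toy stopping power 0.2 MeV/cm
        energy -= 0.2 * step
        # hard part: fixed toy probability of a catastrophic collision
        if rng.random() < 0.1:
            hard_events += 1
            energy -= W_CUT * (1.0 + rng.random())   # toy hard-loss sample
    return max(energy, 0.0), hard_events
```

The efficiency gain of the scheme comes from the soft branch: millions of small-angle, small-loss interactions collapse into one deterministic subtraction per step, while only the rare hard branch costs a full sampling.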
Monte Carlo codes and Monte Carlo simulator program
International Nuclear Information System (INIS)
Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.
1990-03-01
Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)
Computational physics an introduction to Monte Carlo simulations of matrix field theory
Ydri, Badis
2017-01-01
This book is divided into two parts. In the first part we give an elementary introduction to computational physics consisting of 21 simulations which originated from a formal course of lectures and laboratory simulations delivered since 2010 to physics students at Annaba University. The second part is much more advanced and deals with the problem of how to set up working Monte Carlo simulations of matrix field theories which involve finite dimensional matrix regularizations of noncommutative and fuzzy field theories, fuzzy spaces and matrix geometry. The study of matrix field theory in its own right has also become very important to the proper understanding of all noncommutative, fuzzy and matrix phenomena. The second part, which consists of 9 simulations, was delivered informally to doctoral students who are working on various problems in matrix field theory. Sample codes as well as sample key solutions are also provided for convenience and completeness. An appendix containing an executive Arabic summary of t...
Dosimetry in radiotherapy and brachytherapy by Monte-Carlo GATE simulation on computing grid
International Nuclear Information System (INIS)
Thiam, Ch.O.
2007-10-01
Accurate radiotherapy treatment requires the delivery of a precise dose to the tumour volume and a good knowledge of the dose deposited in the neighbouring zones. Computation of the treatments is usually carried out by a Treatment Planning System (T.P.S.), which needs to be precise and fast. The G.A.T.E. platform for Monte-Carlo simulation based on G.E.A.N.T.4 is an emerging tool for nuclear medicine applications that provides functionalities for fast and reliable dosimetric calculations. In this thesis, we studied in parallel the validation of the G.A.T.E. platform for the modelling of low-energy electron and photon sources and the optimized use of grid infrastructures to reduce simulation computing time. G.A.T.E. was validated for the dose calculation of point kernels for mono-energetic electrons and compared with the results of other Monte-Carlo studies. A detailed study was made of the energy deposit during electron transport in G.E.A.N.T.4. In order to validate G.A.T.E. for very low energy photons (<35 keV), three models of radioactive sources used in brachytherapy and containing iodine-125 (2301 of Best Medical International; Symmetra of UroMed/Bebig and 6711 of Amersham) were simulated. Our results were analyzed according to the recommendations of Task Group No. 43 of the American Association of Physicists in Medicine (A.A.P.M.). They show good agreement between G.A.T.E., the reference studies and the A.A.P.M. recommended values. The use of Monte-Carlo simulations for a better definition of the dose deposited in the tumour volumes requires long computing times. In order to reduce them, we exploited the E.G.E.E. grid infrastructure, where simulations are distributed using innovative technologies taking into account the grid status. The time necessary for computing a radiotherapy planning simulation using electrons was reduced by a factor of 30. A Web platform based on the G.E.N.I.U.S. portal was developed to make easily available all the methods to submit and manage G
Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes
Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R
2001-01-01
This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...
Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study
Energy Technology Data Exchange (ETDEWEB)
Lai, Chao-Jen, E-mail: cjlai3711@gmail.com; Zhong, Yuncheng; Yi, Ying; Wang, Tianpeng; Shaw, Chris C. [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030-4009 (United States)
2015-06-15
Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region's visibility. In this study, the authors performed Monte Carlo simulations to estimate average breast dose and average glandular dose (AGD) for the VOI breast CT technique. Methods: Electron-Gamma-Shower system code-based Monte Carlo codes were used to simulate breast CT. The Monte Carlo codes were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full cone x-ray source was then collimated to simulate half cone beam x-rays to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. Hemi-ellipsoidal homogeneous phantoms 13 cm in diameter and 10 cm long were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fractions (VGFs) were used to investigate the influence on breast dose. The simulated half cone beam x-rays were then collimated to a narrow x-ray beam with an area of 2.5 × 2.5 cm² field of view at the isocenter plane to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique. Results: The ratios of air kerma ratios and dose measurement results from the Monte Carlo simulation to those from the physical
Region-oriented CT image representation for reducing computing time of Monte Carlo simulations
International Nuclear Information System (INIS)
Sarrut, David; Guigues, Laurent
2008-01-01
Purpose. We propose a new method for efficient particle transport in voxelized geometry for Monte Carlo simulations. We describe its use for calculating dose distributions in CT images for radiation therapy. Material and methods. The proposed approach, based on an implicit volume representation named segmented volume, coupled with an adapted segmentation procedure and a distance map, allows us to minimize the number of boundary crossings, which slow down the simulation. The method was implemented with the GEANT4 toolkit and compared to four other methods: one box per voxel, parameterized volumes, octree-based volumes, and nested parameterized volumes. For each representation, we compared dose distribution, time, and memory consumption. Results. The proposed method allows us to decrease computational time by up to a factor of 15, while keeping memory consumption low, and without any modification of the transportation engine. The speed-up is related to the geometry complexity and the number of different materials used. We obtained an optimal number of steps with removal of all unnecessary steps between adjacent voxels sharing a similar material. However, the cost of each step is increased. When the number of steps cannot be decreased enough, due, for example, to a large number of material boundaries, such a method is not considered suitable. Conclusion. This feasibility study shows that optimizing the representation of an image in memory potentially increases computing efficiency. We used the GEANT4 toolkit, but other Monte Carlo simulation codes could potentially be used. The method introduces a tradeoff between speed and geometry accuracy, allowing computational time gain. However, simulations with GEANT4 remain slow and further work is needed to speed up the procedure while preserving the desired accuracy
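The core idea, removing unnecessary steps between adjacent voxels that share a material, can be sketched as a run-length merge along one row of a labelled image. This is a simplified stand-in for the segmented-volume representation; the actual GEANT4 implementation also relies on a segmentation procedure and a distance map.

```python
def merge_runs(materials):
    """Collapse consecutive voxels sharing a material into single segments,
    so a particle tracked along this row makes one step per segment
    instead of one step per voxel. Returns (material, run_length) pairs."""
    segments = []
    for m in materials:
        if segments and segments[-1][0] == m:
            segments[-1][1] += 1          # same material: extend the segment
        else:
            segments.append([m, 1])       # material boundary: new segment
    return [tuple(s) for s in segments]

row = ["air", "air", "lung", "lung", "lung", "bone", "lung", "lung"]
print(merge_runs(row))
# → [('air', 2), ('lung', 3), ('bone', 1), ('lung', 2)]
# 8 voxel steps reduced to 4 boundary crossings
```

As the abstract notes, the gain depends on the geometry: a row alternating materials at every voxel would produce as many segments as voxels, which is the regime where the method stops paying off.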
Energy Technology Data Exchange (ETDEWEB)
Thiam, Ch O
2007-10-15
Pediatric personalized CT-dosimetry Monte Carlo simulations, using computational phantoms
International Nuclear Information System (INIS)
Papadimitroulas, P; Kagadis, G C; Ploussi, A; Kordolaimi, S; Papamichail, D; Karavasilis, E; Syrgiamiotis, V; Loudos, G
2015-01-01
For the last 40 years, Monte Carlo (MC) simulations have served as a “gold standard” tool for a wide range of applications in the field of medical physics and tend to be essential in daily clinical practice. Regarding diagnostic imaging applications, such as computed tomography (CT), the assessment of deposited energy is of high interest, so as to better analyze the risks and the benefits of the procedure. In the last few years a big effort has been made towards personalized dosimetry, especially in pediatric applications. In the present study the GATE toolkit was used and computational pediatric phantoms were modeled for the assessment of CT examination dosimetry. The pediatric models used come from the XCAT and IT'IS series. The X-ray spectrum of a Brightspeed CT scanner was simulated and validated with experimental data. Specifically, a DCT-10 ionization chamber was irradiated twice using 120 kVp with 100 mAs and 200 mAs, for 1 s in 1 central axial slice (thickness = 10 mm). The absorbed dose was measured in air, resulting in differences lower than 4% between the experimental and simulated data. The simulations were acquired using ∼10¹⁰ primaries in order to achieve low statistical uncertainties. Dose maps were also saved for quantification of the absorbed dose in several critical organs of children during CT acquisition. (paper)
Romano, Paul Kollath
Monte Carlo particle transport methods are being considered as a viable option for high-fidelity simulation of nuclear reactors. While Monte Carlo methods offer several potential advantages over deterministic methods, there are a number of algorithmic shortcomings that would prevent their immediate adoption for full-core analyses. In this thesis, algorithms are proposed both to ameliorate the degradation in parallel efficiency typically observed for large numbers of processors and to offer a means of decomposing large tally data that will be needed for reactor analysis. A nearest-neighbor fission bank algorithm was proposed and subsequently implemented in the OpenMC Monte Carlo code. A theoretical analysis of the communication pattern shows that the expected cost is O(√N), whereas traditional fission bank algorithms are O(N) at best. The algorithm was tested on two supercomputers, the Intrepid Blue Gene/P and the Titan Cray XK7, and demonstrated nearly linear parallel scaling up to 163,840 processor cores on a full-core benchmark problem. An algorithm for reducing network communication arising from tally reduction was analyzed and implemented in OpenMC. The proposed algorithm groups only the particle histories on a single processor into batches for tally purposes; in doing so, it prevents all network communication for tallies until the very end of the simulation. The algorithm was tested, again on a full-core benchmark, and shown to reduce network communication substantially. A model was developed to predict the impact of load imbalances on the performance of domain decomposed simulations. The analysis demonstrated that load imbalances in domain decomposed simulations arise from two distinct phenomena: non-uniform particle densities and non-uniform spatial leakage. The dominant performance penalty for domain decomposition was shown to come from these physical effects rather than insufficient network bandwidth or high latency. The model predictions were verified with
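The nearest-neighbor fission bank idea can be sketched with the prefix-sum bookkeeping it relies on. This is a schematic of the communication pattern, not OpenMC's implementation: each rank computes, from an exclusive prefix sum of per-rank site counts, which slice of the globally ordered bank it should own, and when the imbalance is small all surplus sites move only between adjacent ranks.

```python
def neighbor_exchanges(counts):
    """Given per-rank fission-site counts, compute each rank's target slice
    of the ideally balanced bank and how many sites it must send to its
    left/right neighbor. Assumes the imbalance is small enough that all
    traffic is nearest-neighbor (the expected O(sqrt(N)) regime)."""
    n_ranks, total = len(counts), sum(counts)
    # exclusive prefix sum: global index of each rank's first site
    starts, s = [], 0
    for c in counts:
        starts.append(s)
        s += c
    base, extra = divmod(total, n_ranks)
    # boundaries of the ideally balanced slices
    targets = [i * base + min(i, extra) for i in range(n_ranks)]
    sends = []
    for i in range(n_ranks):
        send_left = max(0, targets[i] - starts[i])   # front surplus -> left
        end = starts[i] + counts[i]
        t_end = targets[i + 1] if i + 1 < n_ranks else total
        send_right = max(0, end - t_end)             # back surplus -> right
        sends.append((send_left, send_right))
    return sends

print(neighbor_exchanges([10, 12, 9, 11]))
# → [(0, 0), (1, 0), (0, 0), (1, 0)]
```

Because the per-rank counts fluctuate only by O(√N) around their mean, the surpluses computed above stay small, which is what replaces the all-to-all exchange of a traditional fission bank with cheap neighbor traffic.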
Spirou, Spiridon V; Papadimitroulas, Panagiotis; Liakou, Paraskevi; Georgoulias, Panagiotis; Loudos, George
2015-09-01
To present and evaluate a new methodology to investigate the effect of attenuation correction (AC) in single-photon emission computed tomography (SPECT) using textural features analysis, Monte Carlo techniques, and a computational anthropomorphic model. The GATE Monte Carlo toolkit was used to simulate SPECT experiments using the XCAT computational anthropomorphic model, filled with a realistic biodistribution of (99m)Tc-N-DBODC. The simulated gamma camera was the Siemens ECAM Dual-Head, equipped with a parallel-hole lead collimator, with an image resolution of 3.54 × 3.54 mm². Thirty-six equispaced camera positions, spanning a full 360° arc, were simulated. Projections were calculated after applying a ±20% energy window or after eliminating all scattered photons. The activity of the radioisotope was reconstructed using the MLEM algorithm. Photon attenuation was accounted for by calculating the radiological pathlength in a perpendicular line from the center of each voxel to the gamma camera. Twenty-two textural features were calculated on each slice, with and without AC, using 16 and 64 gray levels. A mask was used to identify only those pixels that belonged to each organ. Twelve of the 22 features showed almost no dependence on AC, irrespective of the organ involved. In both the heart and the liver, the mean and SD were the features most affected by AC. In the liver, six features were affected by AC only on some slices. Depending on the slice, skewness decreased by 22-34% with AC, kurtosis by 35-50%, long-run emphasis mean by 71-91%, and long-run emphasis range by 62-95%. In contrast, gray-level non-uniformity mean increased by 78-218% compared with the value without AC and run percentage mean by 51-159%. These results were not affected by the number of gray levels (16 vs. 64) or the data used for reconstruction: with the energy window or without scattered photons. The mean and SD were the main features affected by AC. In the heart, no other feature was
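Three of the first-order features tracked in the study (mean, SD, skewness) can be computed over a masked organ region like this. This is a generic sketch of masked first-order statistics; the study's full set of 22 features also includes run-length statistics not shown here.

```python
import statistics

def masked_features(slice_px, mask):
    """First-order textural features over the pixels where mask is 1,
    mimicking the per-organ masking described in the study."""
    vals = [p for p, m in zip(slice_px, mask) if m]
    mean = statistics.fmean(vals)
    sd = statistics.pstdev(vals)
    # Fisher skewness of the masked region (0 for a symmetric distribution)
    skew = (sum((v - mean) ** 3 for v in vals) / len(vals)) / sd**3 if sd else 0.0
    return mean, sd, skew

pixels = [10, 12, 11, 40, 42, 41, 43, 9]    # one flattened toy slice
organ  = [ 0,  0,  0,  1,  1,  1,  1, 0]    # 1 = pixel belongs to the organ
mean, sd, skew = masked_features(pixels, organ)
```

Computing the same features on attenuation-corrected and uncorrected reconstructions of the same slice, as the study does, then quantifies how much each feature shifts with AC.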
CTmod—A toolkit for Monte Carlo simulation of projections including scatter in computed tomography
Czech Academy of Sciences Publication Activity Database
Malušek, Alexandr; Sandborg, M.; Alm Carlsson, G.
2008-01-01
Vol. 90, No. 2 (2008), pp. 167-178, ISSN 0169-2607. Institutional research plan: CEZ:AV0Z10480505. Keywords: Monte Carlo; computed tomography; cone beam; scatter. Subject RIV: JC - Computer Hardware; Software. Impact factor: 1.220, year: 2008. http://dx.doi.org/10.1016/j.cmpb.2007.12.005
Propagation of uncertainty by Monte Carlo simulations in case of basic geodetic computations
Directory of Open Access Journals (Sweden)
Wyszkowska Patrycja
2017-12-01
The determination of the accuracy of functions of measured or adjusted values may be a problem in geodetic computations. The general law of covariance propagation or, in the case of uncorrelated observations, the propagation of variance (the Gaussian formula) is commonly used for that purpose. That approach is theoretically justified for linear functions. In the case of non-linear functions, the first-order Taylor series expansion is usually used, but that solution is affected by the expansion error. The aim of the study is to determine the applicability of the general variance propagation law in the case of the non-linear functions used in basic geodetic computations. The paper presents errors which result from neglecting the higher-order terms and determines the range of such simplification. The basis of that analysis is the comparison of the results obtained by the law of propagation of variance and by the probabilistic approach, namely Monte Carlo simulations. Both methods are used to determine the accuracy of the following geodetic computations: the Cartesian coordinates of an unknown point in the three-point resection problem, azimuths and distances from Cartesian coordinates, and height differences in trigonometric and geometric levelling. These simulations and the analysis of the results confirm the possibility of applying the general law of variance propagation in basic geodetic computations even if the functions are non-linear. The only condition is the accuracy of the observations, which cannot be too low. Generally, this is not a problem with present geodetic instruments.
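The comparison made in the paper can be reproduced in miniature for one non-linear function: the planar distance s = √(Δx² + Δy²). This is a generic example with invented coordinates and noise level, not one of the paper's exact test cases; it contrasts first-order (Gaussian) propagation with the probabilistic Monte Carlo estimate.

```python
import math
import random
import statistics

def distance(x1, y1, x2, y2):
    return math.hypot(x2 - x1, y2 - y1)

def propagated_sd(x1, y1, x2, y2, sd):
    """First-order (Gaussian) propagation of equal, uncorrelated coordinate
    standard deviations through the distance function: root of the sum of
    squared partial derivatives times the variance."""
    s = distance(x1, y1, x2, y2)
    dx, dy = (x2 - x1) / s, (y2 - y1) / s
    # partials w.r.t. (x1, y1, x2, y2) are (-dx, -dy, dx, dy)
    return sd * math.sqrt(2 * dx**2 + 2 * dy**2)   # = sd * sqrt(2)

def monte_carlo_sd(x1, y1, x2, y2, sd, n=200_000, seed=42):
    """Probabilistic counterpart: perturb every coordinate and observe the
    empirical scatter of the distance."""
    rng = random.Random(seed)
    g = rng.gauss
    samples = [distance(x1 + g(0, sd), y1 + g(0, sd),
                        x2 + g(0, sd), y2 + g(0, sd)) for _ in range(n)]
    return statistics.stdev(samples)

sd_prop = propagated_sd(0, 0, 300, 400, sd=0.01)   # 500 m line, 1 cm noise
sd_mc = monte_carlo_sd(0, 0, 300, 400, sd=0.01)
# for observations this accurate the two estimates agree closely,
# illustrating the paper's conclusion for well-measured quantities
```

Shrinking the baseline or inflating `sd` makes the function's curvature matter, and the two estimates drift apart, which is exactly the regime of the expansion error the paper quantifies.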
Prediction of beam hardening artefacts in computed tomography using Monte Carlo simulations
DEFF Research Database (Denmark)
Thomsen, M.; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær
2015-01-01
We show how radiological images of both single and multi material samples can be simulated using the Monte Carlo simulation tool McXtrace and how these images can be used to make a three dimensional reconstruction. Good numerical agreement between the X-ray attenuation coefficient in experimental...
Energy Technology Data Exchange (ETDEWEB)
Chow, J [Princess Margaret Cancer Center, Toronto, ON (Canada)
2015-06-15
Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing time in a 4D treatment plan, which requires Monte Carlo dose calculations in all CT image sets of the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimal number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.
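The diminishing-return behaviour reported above can be illustrated with a simple cost model. The constants below are invented stand-ins, not DOSCTP/FFD4D timings: total wall time is a fixed serial part (setup plus dose reconstruction) plus a Monte Carlo part that divides across nodes.

```python
def plan_time(n_nodes, t_mc=120.0, t_fixed=10.0):
    """Toy 4D-plan wall time in minutes: a perfectly parallel MC part
    plus a serial part (setup + FFD4D-style dose reconstruction)."""
    return t_fixed + t_mc / n_nodes

def smallest_useful_cluster(threshold=1.0, max_nodes=50):
    """First node count at which adding one more node saves less than
    `threshold` minutes -- the knee of the diminishing-return curve."""
    for n in range(1, max_nodes):
        if plan_time(n) - plan_time(n + 1) < threshold:
            return n
    return max_nodes

print(smallest_useful_cluster())  # → 11 with the toy constants
```

The marginal saving of the (n+1)-th node is t_mc / (n(n+1)), which falls off quadratically; this is why the study finds an optimal cluster size in the 5-15 node range rather than "the more, the better".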
PENELOPE, an algorithm and computer code for Monte Carlo simulation of electron-photon showers
Energy Technology Data Exchange (ETDEWEB)
Salvat, F.; Fernandez-Varea, J.M.; Baro, J.; Sempau, J.
1996-10-01
The FORTRAN 77 subroutine package PENELOPE performs Monte Carlo simulation of electron-photon showers in arbitrary materials for a wide energy range, from ∼1 keV to several hundred MeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A simple geometry package permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the simulation package, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm.
PENELOPE, an algorithm and computer code for Monte Carlo simulation of electron-photon showers
Energy Technology Data Exchange (ETDEWEB)
Salvat, F; Fernandez-Varea, J M; Baro, J; Sempau, J
1996-07-01
The FORTRAN 77 subroutine package PENELOPE performs Monte Carlo simulation of electron-photon showers in arbitrary materials for a wide energy range, from 1 keV to several hundred MeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A simple geometry package permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the simulation package, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm. (Author) 108 refs.
Energy Technology Data Exchange (ETDEWEB)
Gomes B, W. O., E-mail: wilsonottobatista@gmail.com [Instituto Federal da Bahia, Rua Emidio dos Santos s/n, Barbalho 40301-015, Salvador de Bahia (Brazil)
2016-10-15
This study aimed to develop an irradiation geometry applicable to the PCXMC software and the consequent calculation of effective dose in cone beam computed tomography (CBCT) applications. We evaluated different CBCT systems for dental applications: the Carestream CS 9000 3D tomograph, the i-CAT, and the GENDEX GXCB-500. Initially we characterized each protocol by measuring the entrance surface kerma and the air kerma-area product, P{sub KA}, with RADCAL solid state detectors and a PTW transmission chamber. Then we introduced the technical parameters of each preset protocol and the geometric conditions into the PCXMC software to obtain the values of effective dose. The calculated effective dose is within the range of 9.0 to 15.7 μSv for the CS 9000 3D tomograph, within the range of 44.5 to 89 μSv for the GXCB-500 equipment, and in the range of 62 to 111 μSv for the Classic i-CAT equipment. These values were compared with dosimetry results obtained using TLDs implanted in an anthropomorphic phantom and are considered consistent. The effective dose results are very sensitive to the irradiation geometry (beam position in the mathematical phantom). This translates into a fragility in the use of the software, but it is very useful for obtaining quick answers regarding protocol optimization. We conclude that the PCXMC Monte Carlo simulation software is useful for assessing protocols for CBCT examinations in dental applications. (Author)
Monte Carlo simulation of electrothermal atomization on a desktop personal computer
Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.
1996-07-01
Monte Carlo simulations have been applied to electrothermal atomization (ETA) using a tubular atomizer (e.g. a graphite furnace) because of the complexity of the geometry, heating, molecular interactions, etc. The intense computational time needed to accurately model ETA often limited its effective implementation to supercomputers. However, with the advent of more powerful desktop processors, this is no longer the case. A C-based program has been developed that can be used under Windows™ or DOS. With this program, basic parameters such as furnace dimensions, sample placement, and furnace heating, as well as kinetic parameters such as activation energies for desorption and adsorption, can be varied to show the dependence of the absorbance profile on these parameters. Even data such as the time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to allow comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files and text files.
Zhang, Guannan; Del-Castillo-Negrete, Diego
2017-10-01
Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the PDFs of RE. Despite the simplifications involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time consuming, especially in the computation of asymptotic-type observables including the runaway probability, the slowing-down and runaway mean times, and the energy limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than brute-force MC methods, significantly reducing the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as a direct MC code, which paves the way for conducting large-scale RE simulations. This work is supported by DOE FES and ASCR under the Contract Numbers ERKJ320 and ERAT377.
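The runaway probability targeted by this backward approach is, via the Feynman-Kac formula, a barrier-hitting probability of the underlying stochastic differential equation. The toy sketch below is *not* the authors' BSDE scheme: it is a plain forward Monte Carlo estimate of the same kind of quantity, with made-up drift, diffusion, and barrier parameters, included only to illustrate what is being computed.

```python
import math
import random

def runaway_probability(x0, drift, sigma, barrier, t_max,
                        dt=0.01, n_paths=2000, seed=1):
    """Estimate P(particle crosses `barrier` before t_max) for the
    Euler-discretized diffusion dX = drift*dt + sigma*dW, X(0) = x0.
    Illustrative stand-in for the runaway probability in the abstract."""
    rng = random.Random(seed)
    steps = int(t_max / dt)
    hits = 0
    for _ in range(n_paths):
        x = x0
        for _ in range(steps):
            x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            if x >= barrier:  # path entered the "runaway" region
                hits += 1
                break
    return hits / n_paths
```

The brute-force character is visible here: every path must be simulated forward whether or not it ends up contributing, which is exactly the inefficiency the backward formulation avoids.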
Directory of Open Access Journals (Sweden)
Yun Hsing Cheung
2012-12-01
The three main Value at Risk (VaR) methodologies are historical, parametric and Monte Carlo Simulation. Cheung & Powell (2012), using a step-by-step teaching study, showed how a nonparametric historical VaR model could be constructed using Excel, thus benefitting teachers and researchers by providing them with a readily useable teaching study and an inexpensive and flexible VaR modelling option. This article extends that work by demonstrating how parametric and Monte Carlo Simulation VaR models can also be constructed in Excel, thus providing a total Excel modelling package encompassing all three VaR methods.
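For readers without the spreadsheet at hand, the two VaR methods the article adds can be sketched in a few lines of Python. This is an illustrative translation of the general approach, not the authors' Excel workbook; the function names and the normal-returns assumption are ours.

```python
import random
import statistics

def parametric_var(returns, confidence=0.99):
    """Parametric (variance-covariance) VaR: fit a normal distribution
    to the returns and report the loss quantile as a positive number."""
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    z = statistics.NormalDist().inv_cdf(1 - confidence)  # e.g. -2.326 at 99%
    return -(mu + z * sigma)

def monte_carlo_var(returns, confidence=0.99, n_sims=10000, seed=7):
    """Monte Carlo VaR: simulate returns from the fitted normal model
    and take the empirical loss quantile."""
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    rng = random.Random(seed)
    sims = sorted(rng.gauss(mu, sigma) for _ in range(n_sims))
    idx = int((1 - confidence) * n_sims)
    return -sims[idx]
```

Because both estimators fit the same normal model, the Monte Carlo figure converges to the parametric one as the number of simulations grows, which makes a useful cross-check when building either model.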
International Nuclear Information System (INIS)
Li Di; Wang Geng; Chen Yang; Li Lin; Shrivastav, Gaurav; Oak, Stimit; Tasch, Al; Banerjee, Sanjay; Obradovic, Borna
2001-01-01
A physically-based three-dimensional Monte Carlo simulator has been developed within UT-MARLOWE, which is capable of simulating ion implantation into multi-material systems and arbitrary topography. Introducing the third dimension can result in a severe CPU time penalty. In order to minimize this penalty, a three-dimensional trajectory replication algorithm has been developed, implemented and verified; more than two orders of magnitude savings in CPU time have been observed. An unbalanced octree structure was used to decompose three-dimensional structures. It effectively simplifies the structure, offers a good balance between modeling accuracy and computational efficiency, and allows arbitrary precision in mapping the octree onto the desired structure. Using the well-established and validated physical models in UT-MARLOWE 5.0, this simulator has been extensively verified by comparing integrated one-dimensional simulation results with secondary ion mass spectrometry (SIMS). Two bounding cases were selected for simulating ion implantation into poly-silicon with this simulator: implantation into a random, amorphous network (the typical case), and implantation under the worst-case channeling condition, into (110)-oriented wafers
Monte Carlo simulation for IRRMA
International Nuclear Information System (INIS)
Gardner, R.P.; Liu Lianyan
2000-01-01
Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications - IRRMA. The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low cost, large memory, and fast personal computers; and (3) the general availability of general purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA including the general purpose (GP) and specific purpose (SP) Monte Carlo codes and future needs - primarily from the experience of the authors
Icarus: A 2-D Direct Simulation Monte Carlo (DSMC) Code for Multi-Processor Computers
International Nuclear Information System (INIS)
BARTEL, TIMOTHY J.; PLIMPTON, STEVEN J.; GALLIS, MICHAIL A.
2001-01-01
Icarus is a 2D Direct Simulation Monte Carlo (DSMC) code which has been optimized for the parallel computing environment. The code is based on the DSMC method of Bird [11.1] and models flowfields from free-molecular to continuum in either Cartesian (x, y) or axisymmetric (z, r) coordinates. Computational particles, representing a given number of molecules or atoms, are tracked as they have collisions with other particles or surfaces. Multiple species, internal energy modes (rotation and vibration), chemistry, and ion transport are modeled. A new trace species methodology for collisions and chemistry is used to obtain statistics for small species concentrations. Gas phase chemistry is modeled using steric factors derived from Arrhenius reaction rates or in a manner similar to continuum modeling. Surface chemistry is modeled with surface reaction probabilities; an optional site density, energy dependent, coverage model is included. Electrons are modeled either with a local charge neutrality assumption or as discrete simulation particles. Ion chemistry is modeled with electron impact chemistry rates and charge exchange reactions. Coulomb collision cross-sections are used instead of Variable Hard Sphere values for ion-ion interactions. The electrostatic fields can be externally input, computed from a Langmuir-Tonks model, or obtained from a Green's function (boundary element) based Poisson solver. Icarus has been used for subsonic to hypersonic, chemically reacting, and plasma flows. The Icarus software package includes the grid generation, parallel processor decomposition, post-processing, and restart software. The commercial graphics package Tecplot is used for graphics display. All of the software packages are written in standard Fortran.
Computer simulation of HTGR fuel microspheres using a Monte-Carlo statistical approach
International Nuclear Information System (INIS)
Hedrick, C.E.
1976-01-01
The concept and computational aspects of a Monte-Carlo statistical approach in relating structure of HTGR fuel microspheres to the uranium content of fuel samples have been verified. Results of the preliminary validation tests and the benefits to be derived from the program are summarized
Computer simulation of stochastic processes through model-sampling (Monte Carlo) techniques.
Sheppard, C W.
1969-03-01
A simple Monte Carlo simulation program is outlined which can be used for the investigation of random-walk problems, for example in diffusion, or the movement of tracers in the blood circulation. The results given by the simulation are compared with those predicted by well-established theory, and it is shown how the model can be expanded to deal with drift, and with reflexion from or adsorption at a boundary.
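A minimal version of such a random-walk model fits in a few lines. The sketch below is our own illustration (the parameter names are hypothetical, not from the 1969 program) and includes the behaviours mentioned: drift via a biased step probability, and either reflection from or absorption at a boundary.

```python
import random

def random_walk(n_steps, p_up=0.5, boundary=10, mode="reflect", seed=42):
    """1-D lattice walk: step +1 with probability p_up (p_up != 0.5
    introduces drift); on reaching +/-boundary the walker is either
    reflected inward or absorbed (walk terminates), as for a tracer
    meeting a vessel wall."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n_steps):
        x += 1 if rng.random() < p_up else -1
        if abs(x) >= boundary:
            if mode == "absorb":
                path.append(x)
                return path          # walker captured at the wall
            x = boundary - 1 if x > 0 else 1 - boundary  # reflect inward
        path.append(x)
    return path
```

Comparing ensemble statistics of such paths (mean square displacement, first-passage times) against diffusion theory is exactly the kind of validation the abstract describes.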
Availability of fusion plants employing a Monte Carlo simulation computer code
International Nuclear Information System (INIS)
Musicki, Z.
1984-01-01
The fusion facilities being built or designed will have availability problems due to their complexity and their employment of not yet fully developed technologies. Low availability of test facilities will have an adverse impact on the learning time and will therefore push back the commercialization date of fusion. Low availability of commercial electric power plants will increase the cost of electricity and make fusion a less attractive power source. Thus, the time to study the availability problems of fusion plants and suggest improvements is now, before costly mistakes are committed. This study is an initial effort in the area: it attempts to develop methods for calculating a system's performance, specifically its availability; to start collecting the necessary data and identify areas where data are lacking; and to point out the subsystems where resources need to be applied in order to bring about acceptable system performance. The method used to study availability is a simulation computer code based on the Monte Carlo process and developed by the author. The fusion systems analyzed were TASKA (a tandem mirror test facility design) and MARS (a tandem mirror power plant design). The model and available data were employed to find that the most critical subsystems needing further work are the neutral beams, RF heating subsystems, direct convertor, and certain magnets
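The core of such an availability code is an alternating-renewal simulation: draw an up time, then a repair time, and repeat over the mission. The sketch below is a generic illustration of that idea, not the author's code; the exponential up/down distributions and the parameter names are our assumptions.

```python
import random

def simulate_availability(mtbf, mttr, mission_time, n_runs=500, seed=3):
    """Monte Carlo availability estimate: alternate exponentially
    distributed up intervals (mean mtbf) and repair intervals (mean
    mttr), returning the fraction of mission time the system is up."""
    rng = random.Random(seed)
    total_up = 0.0
    for _ in range(n_runs):
        t = 0.0
        while t < mission_time:
            up = rng.expovariate(1.0 / mtbf)
            total_up += min(up, mission_time - t)  # clip at end of mission
            t += up
            if t >= mission_time:
                break
            t += rng.expovariate(1.0 / mttr)       # system down for repair
    return total_up / (n_runs * mission_time)
```

For long missions the estimate approaches the steady-state value MTBF/(MTBF+MTTR); a plant model would extend this with many components, redundancy, and repair-crew constraints.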
PhyloSim - Monte Carlo simulation of sequence evolution in the R statistical computing environment
Directory of Open Access Journals (Sweden)
Massingham Tim
2011-04-01
Background: The Monte Carlo simulation of sequence evolution is routinely used to assess the performance of phylogenetic inference methods and sequence alignment algorithms. Progress in the field of molecular evolution fuels the need for more realistic and hence more complex simulations, adapted to particular situations, yet current software makes unreasonable assumptions such as homogeneous substitution dynamics or a uniform distribution of indels across the simulated sequences. This calls for an extensible simulation framework written in a high-level functional language, offering new functionality and making it easy to incorporate further complexity. Results: PhyloSim is an extensible framework for the Monte Carlo simulation of sequence evolution, written in R, using the Gillespie algorithm to integrate the actions of many concurrent processes such as substitutions, insertions and deletions. Uniquely among sequence simulation tools, PhyloSim can simulate arbitrarily complex patterns of rate variation and multiple indel processes, and allows for the incorporation of selective constraints on indel events. User-defined complex patterns of mutation and selection can be easily integrated into simulations, allowing PhyloSim to be adapted to specific needs. Conclusions: Close integration with R and the wide range of features implemented offer unmatched flexibility, making it possible to simulate sequence evolution under a wide range of realistic settings. We believe that PhyloSim will be useful to future studies involving simulated alignments.
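The Gillespie algorithm at PhyloSim's core draws an exponential waiting time from the total event rate, then picks which event fired. A stripped-down sketch of the algorithm (our illustration in Python, not PhyloSim's R code; it assumes a single uniform substitution rate and no indels):

```python
import random

def gillespie_substitutions(seq, rate, t_max, seed=11):
    """Minimal Gillespie simulation of a Jukes-Cantor-like substitution
    process: each site mutates at `rate`, so the total event rate is
    rate * len(seq); waiting times between events are exponential."""
    rng = random.Random(seed)
    seq = list(seq)
    alphabet = "ACGT"
    t, events = 0.0, 0
    while True:
        total_rate = rate * len(seq)
        t += rng.expovariate(total_rate)   # time to the next event
        if t > t_max:
            return "".join(seq), events
        i = rng.randrange(len(seq))        # pick the site that mutates
        seq[i] = rng.choice([b for b in alphabet if b != seq[i]])
        events += 1
```

PhyloSim generalizes exactly this loop: the total rate becomes a sum over many concurrent processes (site-specific rates, insertions, deletions), and the event choice is weighted by each process's contribution.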
Kim, Sangroh; Yoshizumi, Terry; Toncheva, Greta; Yoo, Sua; Yin, Fang-Fang; Frush, Donald
2010-05-01
To address the lack of an accurate dose estimation method in cone beam computed tomography (CBCT), we performed point dose metal oxide semiconductor field-effect transistor (MOSFET) measurements and Monte Carlo (MC) simulations. A Varian On-Board Imager (OBI) was employed to measure point doses in polymethyl methacrylate (PMMA) CT phantoms with MOSFETs for standard and low dose modes. A MC model of the OBI x-ray tube was developed using the BEAMnrc/EGSnrc MC system and validated by the half value layer, x-ray spectrum, and lateral and depth dose profiles. We compared the weighted computed tomography dose index (CTDIw) between MOSFET measurements and MC simulations. The CTDIw was found to be 8.39 cGy for the head scan and 4.58 cGy for the body scan from the MOSFET measurements in standard dose mode, and 1.89 cGy for the head and 1.11 cGy for the body in low dose mode. The CTDIw from MC agreed with the MOSFET measurements to within 5%. In conclusion, a MC model for the Varian CBCT has been established, and this approach may be easily extended from the CBCT geometry to multi-detector CT geometry.
Reactor physics simulations with coupled Monte Carlo calculation and computational fluid dynamics
International Nuclear Information System (INIS)
Seker, V.; Thomas, J.W.; Downar, T.J.
2007-01-01
A computational code system based on coupling the Monte Carlo code MCNP5 and the Computational Fluid Dynamics (CFD) code STAR-CD was developed as an audit tool for lower order nuclear reactor calculations. This paper presents the methodology of the developed computer program 'McSTAR'. McSTAR is written in the FORTRAN90 programming language and couples MCNP5 and the commercial CFD code STAR-CD. MCNP uses a continuous energy cross section library produced by the NJOY code system from the raw ENDF/B data. A major part of the work was to develop and implement methods to update the cross section library with the temperature distribution calculated by STAR-CD for every region. Three different methods were investigated and implemented in McSTAR. The user subroutines in STAR-CD are modified to read the power density data and assign them to the appropriate variables in the program, and to write an output data file containing the temperature, density and indexing information to perform the mapping between MCNP and STAR-CD cells. Preliminary testing of the code was performed using a 3×3 PWR pin-cell problem. The preliminary results are compared with those obtained from a STAR-CD coupled calculation with the deterministic transport code DeCART. Good agreement in the k-eff and the power profile was observed. Increased computational capabilities and improvements in computational methods have accelerated interest in high fidelity modeling of nuclear reactor cores during the last several years. High fidelity has been achieved by utilizing full core neutron transport solutions for the neutronics calculation and computational fluid dynamics solutions for the thermal-hydraulics calculation. Previous researchers have reported the coupling of 3D deterministic neutron transport methods to CFD and their application to practical reactor analysis problems. One of the principal motivations of the work here was to utilize Monte Carlo methods to validate the coupled deterministic neutron transport
Mean field simulation for Monte Carlo integration
Del Moral, Pierre
2013-01-01
In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; and adaptive and interacting Markov chain Monte Carlo methods.
International Nuclear Information System (INIS)
Lee, Choonsik; Kim, Kwang Pyo; Long, Daniel J.; Bolch, Wesley E.
2012-01-01
Purpose: To establish an organ dose database for pediatric and adolescent reference individuals undergoing computed tomography (CT) examinations by using Monte Carlo simulation. The data will permit rapid estimates of organ and effective doses for patients of different age, gender, examination type, and CT scanner model. Methods: The previously published Monte Carlo simulation model of a Siemens Sensation 16 CT scanner was employed as the base CT scanner model. A set of absorbed doses for 33 organs/tissues, normalized to the product of 100 mAs and CTDI_vol (mGy per 100 mAs·mGy), was established by coupling the CT scanner model with age-dependent reference pediatric hybrid phantoms. A series of single axial scans from the top of the head to the feet of the phantoms was performed at a slice thickness of 10 mm, and at tube potentials of 80, 100, and 120 kVp. Using the established CTDI_vol- and 100 mAs-normalized dose matrix, organ doses for different pediatric phantoms undergoing head, chest, abdomen-pelvis, and chest-abdomen-pelvis (CAP) scans with the Siemens Sensation 16 scanner were estimated and analyzed. The results were then compared with the values obtained from three independent published methods: CT-Expo software, organ dose for abdominal CT scans derived empirically from patient abdominal circumference, and effective dose per dose-length product (DLP). Results: Organ and effective doses were calculated and normalized to 100 mAs and CTDI_vol for different CT examinations. At the same technical settings, doses to organs entirely included in the CT beam coverage were higher by 40 to 80% for newborn phantoms compared to those of 15-year phantoms. An increase of tube potential from 80 to 120 kVp resulted in a 2.5-2.9-fold greater brain dose for head scans. The results from this study were compared with three different published studies and/or techniques. First, organ doses were compared to those given by CT-Expo, which revealed dose differences up to
International Nuclear Information System (INIS)
Ford, R.L.; Nelson, W.R.
1978-06-01
A code to simulate almost any conceivable electron-photon transport problem is described. The report begins with a lengthy historical introduction and a description of the shower generation process. Then the detailed physics of the shower processes and the methods used to simulate them are presented. Ideas of sampling theory, transport techniques, particle interactions in general, and programming details are discussed. Next, EGS calculations are compared with various experiments and other Monte Carlo results. The remainder of the report consists of user manuals for the EGS, PEGS, and TESTSR codes; options, input specifications, and typical output are included. 38 figures, 12 tables
International Nuclear Information System (INIS)
Yokohama, Noriya
2013-01-01
This report describes the design of an architecture, and performance measurements, for a parallel computing environment running Monte Carlo simulations for particle therapy on a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed approximately 28 times the speed of a single-threaded architecture, combined with improved stability. A study of methods for optimizing system operations also indicated lower cost. (author)
International Nuclear Information System (INIS)
Hubert-Tremblay, Vincent; Archambault, Louis; Tubic, Dragan; Roy, Rene; Beaulieu, Luc
2006-01-01
The purpose of the present study is to introduce a compression algorithm for the CT (computed tomography) data used in Monte Carlo simulations. Performing simulations on CT data implies large computational costs as well as large memory requirements, since the number of voxels in such data typically reaches into the hundreds of millions. CT data, however, contain homogeneous regions which could be regrouped to form larger voxels without affecting the simulation's accuracy. Based on this property we propose a compression algorithm based on octrees: in homogeneous regions the algorithm replaces groups of voxels with a smaller number of larger voxels. This reduces the number of voxels while keeping the critical high-density-gradient areas. Results obtained using the present algorithm on both phantom and clinical data show that compression rates up to 75% are possible without losing the dosimetric accuracy of the simulation
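The octree idea is easy to state in code: if a cubic region of the volume is homogeneous, keep it as one large voxel; otherwise split it into eight octants and recurse. The sketch below is a generic illustration, not the authors' implementation; it uses exact equality where a CT application would merge voxels whose densities agree within a tolerance, and it simply counts the leaves of the compressed representation.

```python
def compress_octree(grid, x0=0, y0=0, z0=0, size=None):
    """Count leaves in the octree compression of a cubic voxel grid
    (side length must be a power of two): a homogeneous cube collapses
    to a single leaf, otherwise the cube is split into eight octants."""
    if size is None:
        size = len(grid)
    first = grid[x0][y0][z0]
    if all(grid[x][y][z] == first
           for x in range(x0, x0 + size)
           for y in range(y0, y0 + size)
           for z in range(z0, z0 + size)):
        return 1  # homogeneous region becomes one large voxel
    half = size // 2
    return sum(compress_octree(grid, x0 + dx, y0 + dy, z0 + dz, half)
               for dx in (0, half) for dy in (0, half) for dz in (0, half))
```

On an 8×8×8 grid that is uniform except for one voxel, this yields 22 leaves instead of 512 voxels, about a 96% reduction; subdivision concentrates around the inhomogeneity, which is how the algorithm preserves high-gradient areas.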
International Nuclear Information System (INIS)
Long, Daniel J.; Lee, Choonsik; Tien, Christopher; Fisher, Ryan; Hoerner, Matthew R.; Hintenlang, David; Bolch, Wesley E.
2013-01-01
Purpose: To validate the accuracy of a Monte Carlo source model of the Siemens SOMATOM Sensation 16 CT scanner using organ doses measured in physical anthropomorphic phantoms. Methods: The x-ray output of the Siemens SOMATOM Sensation 16 multidetector CT scanner was simulated within the Monte Carlo radiation transport code MCNPX version 2.6. The resulting source model was able to perform various simulated axial and helical computed tomographic (CT) scans of varying scan parameters, including beam energy, filtration, pitch, and beam collimation. Two custom-built anthropomorphic phantoms were used to take dose measurements on the CT scanner: an adult male and a 9-month-old. The adult male is a physical replica of the University of Florida reference adult male hybrid computational phantom, while the 9-month-old is a replica of the University of Florida Series B 9-month-old voxel computational phantom. Each phantom underwent a series of axial and helical CT scans, during which organ doses were measured using fiber-optic coupled plastic scintillator dosimeters developed at the University of Florida. The physical setup was reproduced and simulated in MCNPX using the CT source model and the computational phantoms upon which the anthropomorphic phantoms were constructed. Average organ doses were then calculated based upon these MCNPX results. Results: For all CT scans, good agreement was seen between measured and simulated organ doses. For the adult male, the percent differences were within 16% for axial scans, and within 18% for helical scans. For the 9-month-old, the percent differences were all within 15% for both the axial and helical scans. These results are comparable to previously published validation studies using GE scanners and commercially available anthropomorphic phantoms. Conclusions: Overall results of this study show that the Monte Carlo source model can be used to accurately and reliably calculate organ doses for patients undergoing a variety of axial or helical CT
Monte Carlo simulations of neutron scattering instruments
International Nuclear Information System (INIS)
Aestrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.
2001-01-01
A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind, and some of its basic principles will be discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)
Reactor physics simulations with coupled Monte Carlo calculation and computational fluid dynamics
International Nuclear Information System (INIS)
Seker, V.; Thomas, J. W.; Downar, T. J.
2007-01-01
The interest in high fidelity modeling of nuclear reactor cores has increased over the last few years and has become computationally more feasible because of the dramatic improvements in processor speed and the availability of low cost parallel platforms. In the research here, high fidelity, multi-physics analyses were performed by solving the neutron transport equation using Monte Carlo methods and by solving the thermal-hydraulics equations using computational fluid dynamics. A computational tool based on coupling the Monte Carlo code MCNP5 and the Computational Fluid Dynamics (CFD) code STAR-CD was developed as an audit tool for lower order nuclear reactor calculations. This paper presents the methodology of the developed computer program 'McSTAR' along with the verification and validation efforts. McSTAR is written in the PERL programming language and couples MCNP5 and the commercial CFD code STAR-CD. MCNP uses a continuous energy cross section library produced by the NJOY code system from the raw ENDF/B data. A major part of the work was to develop and implement methods to update the cross section library with the temperature distribution calculated by STAR-CD for every region. Three different methods were investigated and two of them are implemented in McSTAR. The user subroutines in STAR-CD are modified to read the power density data and assign them to the appropriate variables in the program, and to write an output data file containing the temperature, density and indexing information to perform the mapping between MCNP and STAR-CD cells. The necessary input file manipulation, data file generation, normalization and multi-processor calculation settings are all done through the program flow in McSTAR. Initial testing of the code was performed using a single pin cell and a 3×3 PWR pin-cell problem. The preliminary results of the single pin-cell problem are compared with those obtained from a STAR-CD coupled calculation with the deterministic transport code DeCART
Atomic-level computer simulation
International Nuclear Information System (INIS)
Adams, J.B.; Rockett, Angus; Kieffer, John; Xu Wei; Nomura, Miki; Kilian, K.A.; Richards, D.F.; Ramprasad, R.
1994-01-01
This paper provides a broad overview of the methods of atomic-level computer simulation. It discusses methods of modelling atomic bonding, and computer simulation methods such as energy minimization, molecular dynamics, Monte Carlo, and lattice Monte Carlo. ((orig.))
International Nuclear Information System (INIS)
Courageot, Estelle
2010-01-01
After a description of the context of radiological accidents (definition, history, context, exposure types, the clinical symptoms associated with irradiation and contamination, medical treatment, and lessons learned) and a presentation of dose assessment in the case of external exposure (clinical, biological and physical dosimetry), this research thesis describes the principles of the numerical reconstruction of a radiological accident, presents some computation codes (the MCNPX Monte Carlo code) and the SESAME tool, and reports an application to an actual case (an accident which occurred in Ecuador in April 2009). The next part reports the developments performed to modify the posture of voxelized phantoms, and the experimental and numerical validations. The last part reports a feasibility study for the reconstruction of radiological accidents occurring in external radiotherapy. This work is based on a Monte Carlo simulation of a linear accelerator, with the aim of identifying the most relevant parameters to be implemented in SESAME in the case of external radiotherapy
International Nuclear Information System (INIS)
Pan, Yuxi; Qiu, Rui; Ge, Chaoyong; Xie, Wenzhang; Li, Junli; Gao, Linfeng; Zheng, Junzheng
2014-01-01
With the rapidly growing number of CT examinations, the associated radiation risk has attracted more and more attention. The average dose in each organ during CT scans can only be obtained by using Monte Carlo simulation with computational phantoms. Since children tend to have higher radiation sensitivity than adults, the radiation dose of pediatric CT examinations requires special attention and needs to be assessed accurately. So far, studies on organ doses from CT exposures for pediatric patients are still limited. In this work, a 1-year-old computational phantom was constructed. The body contour was obtained from the CT images of a 1-year-old physical phantom and the internal organs were deformed from an existing Chinese reference adult phantom. To ensure that the organ locations in the 1-year-old computational phantom were consistent with those of the physical phantom, the organ locations were manually adjusted one by one, and the organ masses were adjusted to the corresponding Chinese reference values. Moreover, a CT scanner model was developed using the Monte Carlo technique and the 1-year-old computational phantom was applied to estimate organ doses derived from simulated CT exposures. As a result, a database including doses to 36 organs and tissues from 47 single axial scans was built. It has been verified by calculation that doses of axial scans are close to those of helical scans; therefore, this database can be applied to helical scans as well. Organ doses were calculated using the database and compared with those obtained from measurements made in the physical phantom for helical scans. The differences between simulation and measurement were less than 25% for all organs. The result shows that the 1-year-old phantom developed in this work can be used to calculate organ doses in CT exposures, and the dose database provides a method for the estimation of 1-year-old patient doses in a variety of CT examinations. (paper)
1983-09-01
It is impossible to exactly duplicate a continuous function on a digital computer, and thus the machine representation of the GMA is only a close approximation of the continuous error process. The manner in which the GMA process is digitally replicated therefore has an effect on the results of the simulation. The parameterization of
Energy Technology Data Exchange (ETDEWEB)
Wang, Z [Reading Hospital, West Reading, PA (United States); Gao, M [ProCure Treatment Centers, Warrenville, IL (United States)
2014-06-01
Purpose: Monte Carlo simulation plays an important role for the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to the few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4 based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2-100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm², 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirements, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and the worker nodes were created as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy to maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.
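The cost arithmetic here generalizes to a simple model: for an embarrassingly parallel MC job, wall-clock time falls as 1/n while cost is billed per node-hour. The helper below is our own back-of-envelope sketch (not from the abstract's workflow), assuming uniform work division and hourly billing granularity.

```python
import math

def cluster_cost(total_cpu_hours, n_nodes, price_per_node_hour):
    """Return (wall-clock hours, total cost) for an embarrassingly
    parallel job split evenly across n_nodes. Wall time scales as 1/n;
    each node's partial final hour is billed as a whole hour, which is
    why cost is lowest when the runtime divides evenly into hours."""
    wall_hours = total_cpu_hours / n_nodes
    billed_hours = math.ceil(wall_hours)  # hourly billing granularity
    return wall_hours, n_nodes * billed_hours * price_per_node_hour
```

With the abstract's figures (a 40-node cluster at $0.63 per cluster-hour, i.e. roughly $0.016 per node-hour, and a job finishing within an hour), the model reproduces the quoted cost; the rounding term also shows why idle partial hours on many nodes inflate spend.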
Monte Carlo simulation of experiments
International Nuclear Information System (INIS)
Opat, G.I.
1977-07-01
An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special-purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the program SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ⁺ → e⁺ν_e ν̄ γ and π⁺ → e⁺ν_e γ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)
DEFF Research Database (Denmark)
Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto
2013-01-01
Abstract Purpose. Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability to predict the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from...
General purpose code for Monte Carlo simulations
International Nuclear Information System (INIS)
Wilcke, W.W.
1983-01-01
A general-purpose computer code called MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility, the code is organized like a general-purpose computer, operating on a vector describing the time-dependent state of the system under simulation. The instruction set of the computer is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations.
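A minimal sketch of the MONTHY design idea, a user-defined instruction set operating on a state vector, with conditional and iterative execution. All names here are illustrative and not taken from the actual code:

```python
import random

def run_program(state, program, steps):
    """MONTHY-style scheme (names illustrative): the simulated system
    is a state vector, and the user supplies the 'instruction set' as
    (condition, operation) pairs executed conditionally each sweep."""
    for _ in range(steps):
        for condition, operation in program:
            if condition(state):
                operation(state)
    return state

def step_walker(s):      # user-defined instruction: random +/-1 move
    s["x"] += random.choice((-1, 1))

def reflect_at_zero(s):  # conditionally executed instruction: reflecting barrier
    s["x"] = 0

state = run_program({"x": 5},
                    [(lambda s: True, step_walker),
                     (lambda s: s["x"] < 0, reflect_at_zero)],
                    100)
```

The flexibility claimed above comes from the fact that only `run_program` is fixed; the physics lives entirely in the user-supplied instruction list.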
Atomistic Monte Carlo simulation of lipid membranes
DEFF Research Database (Denmark)
Wüstner, Daniel; Sklenar, Heinz
2014-01-01
Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain the challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol....
International Nuclear Information System (INIS)
Constantin, Magdalena; Constantin, Dragos E; Keall, Paul J; Narula, Anisha; Svatos, Michelle; Perl, Joseph
2010-01-01
Most of the treatment head components of medical linear accelerators used in radiation therapy have complex geometrical shapes. They are typically designed using computer-aided design (CAD) applications. In Monte Carlo simulations of radiotherapy beam transport through the treatment head components, the relevant beam-generating and beam-modifying devices are inserted in the simulation toolkit using geometrical approximations of these components. Depending on their complexity, such approximations may introduce errors that can be propagated throughout the simulation. This drawback can be minimized by exporting a more precise geometry of the linac components from CAD and importing it into the Monte Carlo simulation environment. We present a technique that links three-dimensional CAD drawings of the treatment head components to Geant4 Monte Carlo simulations of dose deposition. (note)
Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A
2012-09-01
Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by the laser irradiation. The reference systems are nonideal crystals containing point defect excesses, an order of magnitude larger than the equilibrium density, due to a preirradiation ion implantation process. The thermal and, eventual, melting problem is solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field were then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest a significant annihilation of the implantation damage which can be well controlled by the laser fluence.
Monte Carlo simulation of Ising models by multispin coding on a vector computer
Wansleben, Stephan; Zabolitzky, John G.; Kalle, Claus
1984-11-01
Rebbi's efficient multispin coding algorithm for Ising models is combined with the use of the vector computer CDC Cyber 205. A speed of 21.2 million updates per second is reached. This is comparable to that obtained by special-purpose computers.
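The core idea of multispin coding, one spin per bit of a machine word so that neighbour comparisons for a whole word of spins cost only a few bitwise operations, can be illustrated on a 1D Ising ring. This is a simplified sketch of the storage scheme, not Rebbi's full update algorithm:

```python
def unsatisfied_bonds(word, n=64):
    """Multispin-coding sketch: one Ising spin per bit of an n-bit
    word (0 = down, 1 = up) on a 1D periodic ring. A single XOR with
    the cyclically shifted word compares every spin with its left
    neighbour at once; the popcount gives the unsatisfied bonds."""
    mask = (1 << n) - 1
    rotated = ((word << 1) | (word >> (n - 1))) & mask  # cyclic shift
    return bin(word ^ rotated).count("1")               # differing pairs

aligned = (1 << 64) - 1            # all spins up: no boundaries
checker = int("01" * 32, 2)        # alternating spins: all bonds frustrated
```

A full Metropolis update adds bitwise logic for the Boltzmann acceptance step, but the energy bookkeeping above is the part that vectorizes so well on machines like the Cyber 205.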
Broecker, Peter; Trebst, Simon
2016-12-01
In the absence of a fermion sign problem, auxiliary-field (or determinantal) quantum Monte Carlo (DQMC) approaches have long been the numerical method of choice for unbiased, large-scale simulations of interacting many-fermion systems. More recently, the conceptual scope of this approach has been expanded by introducing ingenious schemes to compute entanglement entropies within its framework. On a practical level, these approaches, however, suffer from a variety of numerical instabilities that have largely impeded their applicability. Here we report on a number of algorithmic advances to overcome many of these numerical instabilities and significantly improve the calculation of entanglement measures in the zero-temperature projective DQMC approach, ultimately allowing us to reach similar system sizes as for the computation of conventional observables. We demonstrate the applicability of this improved DQMC approach by providing an entanglement perspective on the quantum phase transition from a magnetically ordered Mott insulator to a band insulator in the bilayer square lattice Hubbard model at half filling.
Simulation and the Monte Carlo method
Rubinstein, Reuven Y
2016-01-01
Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...
Monte Carlo simulation of Markov unreliability models
International Nuclear Information System (INIS)
Lewis, E.E.; Boehm, F.
1984-01-01
A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process, allowing dependences between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase the computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
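A minimal analog (non-variance-reduced) version of such a Markov unreliability walk can be sketched for a two-component parallel system with exponential failure and repair times. The forced-transition and failure-biasing refinements described above modify the sampling densities; this sketch omits them:

```python
import random

def analog_unreliability(lam, mu, T, trials, rng):
    """Analog Monte Carlo estimate of the probability that both of two
    parallel components are down simultaneously before mission time T.
    lam/mu are per-component failure/repair rates. A variance-reduced
    version would force transitions before T and bias toward failures."""
    failures = 0
    for _ in range(trials):
        t, up = 0.0, [True, True]
        while t < T:
            rates = [lam if u else mu for u in up]  # next-event rates
            total = sum(rates)
            t += rng.expovariate(total)             # holding time
            if t >= T:
                break                               # survived the mission
            i = 0 if rng.random() < rates[0] / total else 1
            up[i] = not up[i]                       # component i transitions
            if not any(up):
                failures += 1                       # system failure
                break
    return failures / trials

estimate = analog_unreliability(0.5, 2.0, 1.0, 1000, random.Random(0))
```

For highly reliable systems almost every analog walk survives, which is exactly why the forced-transition and failure-biasing schemes above yield the quoted orders-of-magnitude efficiency gains.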
Monte Carlo computation in the applied research of nuclear technology
International Nuclear Information System (INIS)
Xu Shuyan; Liu Baojie; Li Qin
2007-01-01
This article briefly introduces Monte Carlo methods and their properties. It describes Monte Carlo methods with emphasis on their applications to several domains of nuclear technology. Monte Carlo simulation methods and several commonly used computer software packages that implement them are also introduced. The proposed methods are demonstrated by a real example. (authors)
Computational details of the Monte Carlo simulation of proton and electron tracks
International Nuclear Information System (INIS)
Zaider, M.; Brenner, D.J.
1983-01-01
The code PROTON simulates the elastic and nonelastic interactions of protons and electrons in water vapor. In this paper, the treatment of elastic angular scattering of electrons as utilized in PROTON is described and compared with alternate formalisms. The sensitivity of the calculation to different treatments of this process is examined in terms of proximity functions of energy deposition. 5 figures
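For illustration, elastic angular deflections in electron transport are often sampled by inverse transform from a screened Rutherford cross section. The following sketch shows that generic scheme; it is an assumption for illustration, not necessarily the formalism used in PROTON:

```python
import random

def sample_cos_theta(eta, rng):
    """Inverse-transform sample of the polar scattering angle cosine
    from a screened Rutherford cross section with screening parameter
    eta: cos(theta) = 1 - 2*eta*xi / (1 + eta - xi), xi ~ U(0,1).
    xi = 0 gives forward scattering (cos = 1); xi -> 1 gives cos -> -1."""
    xi = rng.random()
    return 1.0 - 2.0 * eta * xi / (1.0 + eta - xi)

rng = random.Random(42)
samples = [sample_cos_theta(0.01, rng) for _ in range(10000)]
# small eta => strongly forward-peaked angular distribution
```

Different treatments of this process (different screening parameters or cross-section models) shift how forward-peaked the distribution is, which is precisely the sensitivity the abstract examines via proximity functions.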
International Nuclear Information System (INIS)
Lee, Hyun Cheol; Yoo, Do Hyeon; Testa, Mauro; Shin, Wook-Geun; Choi, Hyun Joon; Ha, Wi-Ho; Yoo, Jaeryong; Yoon, Seokwon; Min, Chul Hee
2016-01-01
The aim of this study is to evaluate the potential hazard of naturally occurring radioactive material (NORM)-added consumer products. Using the Monte Carlo method, the radioactive products were simulated with the ICRP reference phantom and the organ doses were calculated for the usage scenario. Finally, the annual effective doses for 44 products were all evaluated as lower than the public dose limit of 1 mSv y⁻¹. It was demonstrated that NORM-added consumer products can be quantitatively assessed for safety regulation. - Highlights: • Consumer products expected to contain NORM should be regulated. • 44 products were collected and their gamma activities were measured with an HPGe detector. • Through Monte Carlo simulation, organ equivalent doses and effective doses in a human phantom were calculated. • All annual effective doses for the products were evaluated as lower than the dose limit for the public.
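The effective-dose bookkeeping behind assessments like this one is a weighted sum over organ equivalent doses, E = Σ_T w_T H_T. A sketch using an illustrative subset of the ICRP 103 tissue weighting factors (note a partial organ list under-estimates E; the full ICRP set sums to 1):

```python
# Illustrative subset of ICRP 103 tissue weighting factors w_T
W_T = {"lung": 0.12, "stomach": 0.12, "colon": 0.12,
       "red_bone_marrow": 0.12, "breast": 0.12, "gonads": 0.08,
       "thyroid": 0.04, "liver": 0.04, "bladder": 0.04}

def effective_dose(organ_equivalent_doses_mSv):
    """E = sum over tissues T of w_T * H_T, with H_T the organ
    equivalent dose (mSv) from the Monte Carlo tally."""
    return sum(W_T[organ] * h
               for organ, h in organ_equivalent_doses_mSv.items())

# Hypothetical per-organ doses for one usage scenario
E = effective_dose({"lung": 0.5, "thyroid": 2.0, "stomach": 0.3})
# E = 0.176 mSv here, below the 1 mSv/y public dose limit
```

In the study above, this weighting is applied to the phantom organ doses for each product's usage scenario before comparing against the 1 mSv y⁻¹ limit.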
Adaptive Multilevel Monte Carlo Simulation
Hoel, H
2011-08-23
This work generalizes the multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Michael Giles. Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level forward Euler Monte Carlo method. The present work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale Methods in Science and Engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL), from O(TOL⁻³) for a single-level version of the adaptive algorithm to O((TOL⁻¹ log(TOL))²).
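The uniform-timestep multilevel estimator that this work generalizes can be sketched for a geometric Brownian motion, with coupled fine/coarse Euler paths sharing their Brownian increments (parameter values are illustrative; the adaptive variant replaces the fixed grids with path-dependent steps):

```python
import math, random

def level_sample(l, T, x0, mu, sigma, rng):
    """One coupled MLMC sample P_l - P_{l-1} for dX = mu*X dt + sigma*X dW,
    with 2**l uniform Euler steps; fine and coarse paths share the same
    Brownian increments, which is what keeps the level variance small."""
    n_f = 2 ** l
    dt_f = T / n_f
    dw = [math.sqrt(dt_f) * rng.gauss(0.0, 1.0) for _ in range(n_f)]
    x_f = x0
    for w in dw:
        x_f += mu * x_f * dt_f + sigma * x_f * w
    if l == 0:
        return x_f
    x_c, dt_c = x0, 2.0 * dt_f
    for i in range(0, n_f, 2):                 # coarse path: paired increments
        x_c += mu * x_c * dt_c + sigma * x_c * (dw[i] + dw[i + 1])
    return x_f - x_c

def mlmc_estimate(max_level, n_samples, rng):
    """Telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]."""
    return sum(
        sum(level_sample(l, 1.0, 1.0, 0.05, 0.2, rng)
            for _ in range(n_samples)) / n_samples
        for l in range(max_level + 1))
```

A production implementation would choose the per-level sample counts from estimated level variances to reach the O((TOL⁻¹ log TOL)²) cost quoted above; the fixed counts here keep the sketch short.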
Computation cluster for Monte Carlo calculations
International Nuclear Information System (INIS)
Petriska, M.; Vitazek, K.; Farkas, G.; Stacho, M.; Michalek, S.
2010-01-01
Two computation clusters based on the Rocks Clusters 5.1 Linux distribution, with Intel Core Duo and Intel Core Quad based computers, were built at the Department of Nuclear Physics and Technology. The clusters were used for Monte Carlo calculations, specifically MCNP calculations applied to nuclear reactor core simulations. Optimization for computation speed was carried out on both a hardware and a software basis. Hardware cluster parameters, such as memory size, network speed, CPU speed, number of processors per computation, and number of processors in one computer, were tested for shortening the calculation time. For software optimization, different Fortran compilers, MPI implementations, and CPU multi-core libraries were tested. Finally, the computer cluster was used to find the weighting functions of the neutron ex-core detectors of a VVER-440. (authors)
International Nuclear Information System (INIS)
Barbosa, Antonio Konrado de Santana; Vieira, Jose Wilson; Costa, Kleber Souza Silva; Lima, Fernando Roberto de Andrade
2011-01-01
Radiotherapy simulation procedures using Monte Carlo methods have proven increasingly important to the improvement of cancer-fighting strategies. Within this context, brachytherapy is one of the methods most used to ensure better quality of life when compared to other therapeutic modalities. These procedures are planned with the use of sectional exams with the patient in a lying position. However, it is known that alteration of body posture after the procedure influences the localization of many organs. This study aimed to identify and measure the influence of such alterations in MC brachytherapy simulations. To do so, prostate brachytherapy using the Iodine-125 radionuclide was chosen as a model. Simulations were carried out with 10⁸ events using the EGSnrc code associated with the MASH phantom in orthostatic and supine positions. Significant alterations were found, especially regarding the bladder, small intestine, and testicles. (author)
Lee, Hyun Cheol; Yoo, Do Hyeon; Testa, Mauro; Shin, Wook-Geun; Choi, Hyun Joon; Ha, Wi-Ho; Yoo, Jaeryong; Yoon, Seokwon; Min, Chul Hee
2016-04-01
The aim of this study is to evaluate the potential hazard of naturally occurring radioactive material (NORM)-added consumer products. Using the Monte Carlo method, the radioactive products were simulated with the ICRP reference phantom and the organ doses were calculated for the usage scenario. Finally, the annual effective doses for 44 products were all evaluated as lower than the public dose limit of 1 mSv y⁻¹. It was demonstrated that NORM-added consumer products can be quantitatively assessed for safety regulation. Copyright © 2016 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Barbosa, Antonio Konrado de Santana; Vieira, Jose Wilson; Costa, Kleber Souza Silva; Lima, Fernando Roberto de Andrade
2011-01-01
Radiotherapy computational simulation procedures using Monte Carlo (MC) methods have proven increasingly important to the improvement of cancer-fighting strategies. One of the biases in this practice is the discretization of the radioactive source in brachytherapy simulations, which often does not match the real situation. This study aimed to identify and measure the influence of radioactive source discretization in brachytherapy MC simulations when compared to simulations without discretization, using prostate brachytherapy with the Iodine-125 radionuclide as a model. Simulations were carried out with 10⁸ events for both types of sources, using the EGSnrc code associated with the MASH phantom in orthostatic and supine positions with some anatomic adaptations. Significant alterations were found, especially regarding the bladder, the rectum, and the prostate itself. It can be concluded that sources need to be discretized in brachytherapy simulations to ensure their representativeness. (author)
Treur, M.; Postma, M.
2014-01-01
Objectives: Patient-level simulation models provide increased flexibility to overcome the limitations of cohort-based approaches in health-economic analysis. However, the computational requirements of reaching convergence are a notorious barrier. The objective was to assess the impact of using
Atomistic Monte Carlo simulation of lipid membranes
DEFF Research Database (Denmark)
Wüstner, Daniel; Sklenar, Heinz
2014-01-01
Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain the challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches...
Mosaic crystal algorithm for Monte Carlo simulations
Seeger, P A
2002-01-01
An algorithm is presented for calculating reflectivity, absorption, and scattering of mosaic crystals in Monte Carlo simulations of neutron instruments. The algorithm uses multi-step transport through the crystal with an exact solution of the Darwin equations at each step. It relies on the kinematical model for Bragg reflection (with parameters adjusted to reproduce experimental data). For computation of thermal effects (the Debye-Waller factor and coherent inelastic scattering), an expansion of the Debye integral as a rapidly converging series of exponential terms is also presented. Any crystal geometry and plane orientation may be treated. The algorithm has been incorporated into the neutron instrument simulation package NISP. (orig.)
Markov chains analytic and Monte Carlo computations
Graham, Carl
2014-01-01
Markov Chains: Analytic and Monte Carlo Computations introduces the main notions related to Markov chains and provides explanations on how to characterize, simulate, and recognize them. Starting with basic notions, this book leads progressively to advanced and recent topics in the field, allowing the reader to master the main aspects of the classical theory. This book also features: numerous exercises with solutions as well as extended case studies; a detailed and rigorous presentation of Markov chains with discrete time and state space; an appendix presenting probabilistic notions that are nec
Monte Carlo simulation of grain growth
Directory of Open Access Journals (Sweden)
Paulo Blikstein
1999-07-01
Understanding and predicting grain growth is important in metallurgy. Monte Carlo methods have been used in computer simulations in many different fields of knowledge. Grain growth simulation using this method is especially attractive as the statistical behavior of the atoms is properly reproduced; microstructural evolution depends only on the real topology of the grains and not on any kind of geometric simplification. Computer simulation has the advantage of allowing the user to visualize the procedures graphically, even dynamically and in three dimensions. Single-phase alloy grain growth simulation was carried out by calculating the free energy of each atom in the lattice (with its present crystallographic orientation) and comparing this value to one calculated with a different, random orientation. When the resulting free energy is lower than or equal to the initial value, the new orientation replaces the former. The measure of time is the Monte Carlo Step (MCS), which involves a series of trials throughout the lattice. A very close relationship between experimental and theoretical values of the grain growth exponent (n) was observed.
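The acceptance rule described above (accept a new random orientation whenever the free energy does not increase, one MCS being a sweep of trials over the lattice) can be sketched as a zero-temperature Potts-model update; lattice size and number of orientations are illustrative:

```python
import random

def mc_step(lattice, q, rng):
    """One Monte Carlo Step (MCS) of Potts-style grain growth on a
    periodic square lattice: each trial picks a site, proposes a random
    new orientation out of q, and accepts it whenever the count of
    unlike neighbours (boundary energy) does not increase."""
    n = len(lattice)
    for _ in range(n * n):                       # one sweep = one MCS
        i, j = rng.randrange(n), rng.randrange(n)
        nbrs = [lattice[(i - 1) % n][j], lattice[(i + 1) % n][j],
                lattice[i][(j - 1) % n], lattice[i][(j + 1) % n]]
        old = sum(s != lattice[i][j] for s in nbrs)   # unlike neighbours now
        trial = rng.randrange(q)
        new = sum(s != trial for s in nbrs)           # unlike after flip
        if new <= old:                                # accept if dE <= 0
            lattice[i][j] = trial

rng = random.Random(5)
lattice = [[rng.randrange(8) for _ in range(16)] for _ in range(16)]
for _ in range(5):
    mc_step(lattice, 8, rng)    # grains coarsen as boundaries retreat
```

Because every accepted flip leaves the local boundary count no larger, the total grain-boundary energy is non-increasing over MCS, which is what drives the coarsening whose exponent n the study measures.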
Energy Technology Data Exchange (ETDEWEB)
Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.; McNitt-Gray, Michael F. [Departments of Biomedical Physics and Radiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States); DeMarco, John J. [Department of Radiation Oncology, University of California, Los Angeles, Los Angeles, California 90095 (United States)
2014-11-01
Purpose: Monte Carlo (MC) simulation methods have been widely used in patient dosimetry in computed tomography (CT), including estimating patient organ doses. However, most simulation methods have undergone only a limited set of validations, often using homogeneous phantoms with simple geometries. As clinical scanning has become more complex and the use of tube current modulation (TCM) has become pervasive in the clinic, MC simulations should include these techniques in their methodologies, and therefore should also be validated using a variety of phantoms with different shapes and material compositions that produce a variety of differently modulated tube current profiles. The purpose of this work is to perform the measurements and simulations needed to validate a Monte Carlo model under a variety of test conditions where fixed tube current (FTC) and TCM were used. Methods: A previously developed MC model for estimating dose from CT scans that models TCM, built on the MCNPX platform, was used for CT dose quantification. In order to validate the suitability of this model to accurately simulate patient dose from FTC and TCM CT scans, measurements and simulations were compared over a wide range of conditions. Phantoms used for testing ranged from simple geometries with homogeneous composition (16 and 32 cm computed tomography dose index phantoms) to more complex phantoms, including a rectangular homogeneous water-equivalent phantom, an elliptical phantom with three sections (each section homogeneous, but of a different material), and a heterogeneous, complex-geometry anthropomorphic phantom. Each phantom requires varying levels of x-, y- and z-modulation. Each phantom was scanned on a multidetector-row CT (Sensation 64) scanner under the conditions of both FTC and TCM. Dose measurements were made at various surface and depth positions within each phantom. Simulations using each phantom were performed for FTC, detailed x-y-z TCM, and z-axis-only TCM to obtain
Monte Carlo simulation in nuclear medicine
International Nuclear Information System (INIS)
Morel, Ch.
2007-01-01
The Monte Carlo method allows random processes to be simulated using series of pseudo-random numbers. It has become an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use, and analyse their data. Presently, the sophistication of simulation tools allows the introduction of Monte Carlo predictions into data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will make it possible to tackle imaging and dosimetry issues simultaneously, and Monte Carlo simulations may soon become part of the nuclear medicine diagnostic process. This paper describes some basics of the Monte Carlo method and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)
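One of the sampling methods alluded to here, inverse-transform sampling, is easily illustrated with a photon free-path draw; the attenuation coefficient below is an approximate value for water at 511 keV, used purely as an example:

```python
import math, random

def sample_free_path(mu_linear, rng):
    """Inverse-transform sampling of a photon free path in a medium
    with linear attenuation coefficient mu (cm^-1): the path pdf is
    p(x) = mu*exp(-mu*x), so x = -ln(1 - u)/mu for u ~ U(0,1)."""
    return -math.log(1.0 - rng.random()) / mu_linear

rng = random.Random(7)
mu = 0.096   # ~ water at 511 keV (approximate illustrative value), cm^-1
paths = [sample_free_path(mu, rng) for _ in range(50000)]
mean_free_path = sum(paths) / len(paths)   # expect ~ 1/mu, i.e. ~10 cm
```

This one-liner, repeated per interaction with direction and interaction-type draws, is the kernel of the emission-tomography simulators the paper surveys.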
Monte Carlo simulation code modernization
CERN. Geneva
2015-01-01
The continual development of sophisticated transport simulation algorithms allows increasingly accurate description of the effect of the passage of particles through matter. This modelling capability finds applications in a large spectrum of fields, from medicine to astrophysics, and of course HEP. These new capabilities, however, come at the cost of greater computational intensity of the new models, which increases the demands on computing resources. This is particularly true for HEP, where the demand for more simulation is driven by the need for both more accuracy and more precision, i.e. better models and more events. Usually HEP has relied on "Moore's law" evolution, but for almost ten years the increase in clock speed has withered, and computing capacity now comes in the form of hardware architectures of many-core or accelerated processors. To harness these opportunities we need to adapt our code to concurrent programming models, taking advantage of both SIMD and SIMT architectures. Th...
Energy Technology Data Exchange (ETDEWEB)
May, Matthias S.; Kuettner, Axel; Lell, Michael M.; Wuest, Wolfgang; Scharf, Michael; Uder, Michael [University of Erlangen, Department of Radiology, Erlangen (Germany); Deak, Paul; Kalender, Willi A. [University of Erlangen, Department of Medical Physics, Erlangen (Germany); Keller, Andrea K.; Haeberle, Lothar [University of Erlangen, Department of Medical Informatics, Biometry and Epidemiology, Erlangen (Germany); Achenbach, Stephan; Seltmann, Martin [University of Erlangen, Department of Cardiology, Erlangen (Germany)
2012-03-15
To evaluate radiation dose levels in patients undergoing spiral coronary computed tomography angiography (CTA) on a dual-source system in clinical routine. Coronary CTA was performed for 56 patients with electrocardiogram-triggered tube current modulation (TCM) and heart-rate (HR) dependent pitch adaptation. Individual Monte Carlo (MC) simulations were performed for dose assessment. Retrospective simulations with constant tube current (CTC) served as reference. Lung tissue was segmented and used for organ and effective dose (ED) calculation. The estimated mean relative ED was 7.1 ± 2.1 mSv/100 mAs for TCM and 12.5 ± 5.3 mSv/100 mAs for CTC (P < 0.001). Relative dose reduction was highest at low HR (≤60 bpm; 49 ± 5%) compared with intermediate (60-70 bpm; 33 ± 12%) and high HR (>70 bpm; 29 ± 12%). However, the lowest ED is achieved at high HR (5.2 ± 1.5 mSv/100 mAs), compared with intermediate (6.7 ± 1.6 mSv/100 mAs) and low (8.3 ± 2.1 mSv/100 mAs) HR when automated pitch adaptation is applied. Radiation dose savings of up to 52% are achievable by TCM at low and regular HR. However, the lowest ED is attained at high HR by pitch adaptation, despite the inferior radiation dose reduction by TCM. • Monte Carlo simulations allow for individual radiation dose calculations. (orig.)
International Nuclear Information System (INIS)
Setiani, Tia Dwi; Suprijadi; Haryanto, Freddy
2016-01-01
Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images, and a comparison of the image quality resulting from simulation on the GPU and CPU, are evaluated in this paper. The simulations were run on a CPU in serial condition and on two GPUs with 384 cores and 2304 cores. In simulation using the GPU, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that simulation times on the GPU were significantly accelerated compared to the CPU. Simulations on the 2304-core GPU were performed about 64-114 times faster than on the CPU, while simulations on the 384-core GPU were performed about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained starting from 10⁸ histories and for energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.
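The one-photon-per-core mapping works because photon histories are statistically independent, so a simulation splits into chunks whose tallies merge by summation. A CPU-side sketch of that decomposition (toy transmission tally with illustrative parameters, not MC-GPU itself):

```python
import math, random

def simulate_photons(n, seed):
    """Toy worker kernel: each history is independent, which is why
    photon transport maps naturally onto GPU threads. Tally: photons
    whose sampled free path exceeds 5 cm of material with
    mu = 0.2 cm^-1 (i.e. transmitted without interacting)."""
    rng = random.Random(seed)
    return sum(-math.log(1.0 - rng.random()) / 0.2 > 5.0
               for _ in range(n))

# split 10**6 histories into chunks; independent tallies merge by summation
chunks = [simulate_photons(250_000, seed) for seed in range(4)]
fraction = sum(chunks) / 1_000_000   # expect ~ exp(-1) ≈ 0.368
```

On a GPU the "chunks" are thousands of threads rather than four calls, but the merge step and the independence argument are identical, which is why the GPU and CPU images agree statistically.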
Energy Technology Data Exchange (ETDEWEB)
Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Suprijadi [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Haryanto, Freddy [Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia)
2016-03-11
Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial, and on two GPUs with 384 cores and 2304 cores. In the GPU simulation, each core tracks one photon, so a large number of photons are calculated simultaneously. Results show that simulations on the GPU were significantly faster than on the CPU. The simulations on the 2304-core GPU ran about 64–114 times faster than on the CPU, while the simulations on the 384-core GPU ran about 20–31 times faster than on a single CPU core. Another result shows that the optimum image quality was obtained for numbers of histories from 10^8 upward and energies from 60 keV to 90 keV. Analyzed with a statistical approach, the quality of the GPU and CPU images is essentially the same.
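The one-photon-per-core scheme described above is embarrassingly parallel. A minimal CPU sketch of the idea, with seeded worker batches standing in for GPU thread blocks and an assumed slab-attenuation toy problem (not MC-GPU's physics), might look like:

```python
import math
import random

def photon_batch(n_photons, mu, thickness, seed):
    """Track a batch of photons through a slab with attenuation
    coefficient mu (1/cm) and the given thickness (cm); return how
    many are transmitted without interaction (free path > thickness)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        # Sample an exponential free path: s = -ln(u) / mu, u in (0, 1]
        s = -math.log(1.0 - rng.random()) / mu
        if s > thickness:
            transmitted += 1
    return transmitted

def transmission_fraction(n_total, mu, thickness, n_workers=8):
    # On a GPU each thread would track one photon; here each "worker"
    # stands in for a block of threads with its own RNG stream.
    per_worker = n_total // n_workers
    hits = sum(photon_batch(per_worker, mu, thickness, seed=s)
               for s in range(n_workers))
    return hits / (per_worker * n_workers)

frac = transmission_fraction(200_000, mu=0.5, thickness=2.0)
# Analytic transmission for this toy problem is exp(-mu * t) = exp(-1)
```

Because the photon histories are independent, the speedup is limited only by how many histories can be tracked concurrently, which is what the 384-core and 2304-core results above reflect.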
International Nuclear Information System (INIS)
Thing, Rune S.; Bernchou, Uffe; Brink, Carsten; Mainegra-Hing, Ernesto
2013-01-01
Purpose: Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability to predict the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from being fully implemented in a clinical setting. This study investigates the combination of fast MC simulations to predict scatter distributions with a ray tracing algorithm to allow calibration between simulated and clinical CBCT images. Materials and methods: An EGSnrc-based user code (egs_cbct) was used to perform MC simulations of an Elekta XVI CBCT imaging system. A 60 keV x-ray source was used, and air kerma was scored at the detector plane. Several variance reduction techniques (VRTs) were used to increase the scatter calculation efficiency. Three patient phantoms based on CT scans were simulated, namely a brain, a thorax and a pelvis scan. A ray tracing algorithm was used to calculate the detector signal due to primary photons. A total of 288 projections were simulated, one for each thread on the computer cluster used for the investigation. Results: Scatter distributions for the brain, thorax and pelvis scans were simulated to within 2% statistical uncertainty in two hours per scan. Within the same time, the ray tracing algorithm provided the primary signal for each of the projections. Thus, all the data needed for MC-based scatter correction in clinical CBCT imaging were obtained within two hours per patient, using a full simulation of the clinical CBCT geometry. Conclusions: This study shows that the use of MC-based scatter corrections in CBCT imaging has great potential to improve CBCT image quality. By use of powerful VRTs to predict scatter distributions and a ray tracing algorithm to calculate the primary signal, it is possible to obtain the necessary data for patient-specific MC scatter correction within two hours per patient.
International Nuclear Information System (INIS)
Xu, Zuwei; Zhao, Haibo; Zheng, Chuguang
2015-01-01
This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of the particle size distribution with low statistical noise over the full size range and to reduce, as far as possible, the number of time loops. Three coagulation rules are highlighted, and it is found that constructing an appropriate coagulation rule provides a route to a compromise between the accuracy and cost of PBMC methods. Further, in order to avoid a double loop over all simulation particles when considering two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates used for acceptance-rejection processes with a single loop over all particles; meanwhile, the mean time step of a coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly, becoming proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by the many cores of a GPU, which can execute massively threaded data-parallel tasks to obtain a remarkable speedup ratio (compared with CPU computation, the speedup of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches of PBMC are
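The acceptance-rejection step with a majorant kernel can be illustrated with a toy sketch. The additive kernel K(x, y) = x + y and its majorant 2·max(mass) are illustrative choices here, not the paper's differentially-weighted kernels:

```python
import random

def coagulation_step(masses, rng):
    """One acceptance-rejection coagulation event for the additive
    kernel K(x, y) = x + y. The majorant K_max = 2 * max(masses)
    bounds every pair rate, so it is found with a single loop
    (max) instead of a double loop over all pairs."""
    k_max = 2.0 * max(masses)          # majorant: x + y <= 2 * max mass
    while True:
        i, j = rng.sample(range(len(masses)), 2)
        k = masses[i] + masses[j]
        if rng.random() < k / k_max:   # accept with prob K(i, j) / K_max
            masses[i] += masses[j]     # coagulate particle j into i
            masses.pop(j)
            return

# Toy usage: 100 monomers, 40 coagulation events
rng = random.Random(1)
masses = [1.0] * 100
for _ in range(40):
    coagulation_step(masses, rng)
```

Each accepted event removes one particle while conserving total mass, which is the invariant a PBMC coagulation step must preserve.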
Hybrid Monte Carlo methods in computational finance
Leitao Rodriguez, A.
2017-01-01
Monte Carlo methods are highly appreciated and intensively employed in computational finance in the context of financial derivatives valuation or risk management. The method offers valuable advantages like flexibility, easy interpretation and straightforward implementation. Furthermore, the
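For orientation, the plain Monte Carlo baseline that such hybrid methods build on can be sketched as a European call pricer under geometric Brownian motion; this is the standard textbook setup, not the hybrid scheme of the thesis itself:

```python
import math
import random

def mc_european_call(s0, strike, rate, sigma, maturity, n_paths, seed=0):
    """Price a European call by sampling terminal GBM values:
    S_T = S0 * exp((r - sigma^2 / 2) * T + sigma * sqrt(T) * Z),
    with Z ~ N(0, 1), then discounting the mean payoff."""
    rng = random.Random(seed)
    drift = (rate - 0.5 * sigma ** 2) * maturity
    vol = sigma * math.sqrt(maturity)
    discount = math.exp(-rate * maturity)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp(drift + vol * z)
        payoff_sum += max(s_t - strike, 0.0)
    return discount * payoff_sum / n_paths

price = mc_european_call(100.0, 100.0, 0.05, 0.2, 1.0, 200_000)
```

The flexibility mentioned in the abstract comes from the fact that only the payoff line needs to change to value a different derivative.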
Energy Technology Data Exchange (ETDEWEB)
Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)
2014-06-15
Purpose: Our purpose is to evaluate the administered absorbed dose in pediatric nuclear imaging studies. Monte Carlo simulations incorporating pediatric computational models can serve as a reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5–14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms into GATE simulations. The resolution of the phantoms was set to 2 mm³. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals used in SPECT and PET applications are tested, following the EANM pediatric dosage protocol. The biodistributions of the isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5-year-old girl from whole-body exposure to 99mTc-SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, for which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept below 5%. The S-factors for each target organ are calculated in Gy/(MBq·s), with the highest dose absorbed in the kidneys and pancreas (9.29×10^10 and 0.15×10^10, respectively). Conclusion: An approach for accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and
International Nuclear Information System (INIS)
Papadimitroulas, P; Kagadis, GC; Loudos, G
2014-01-01
Purpose: Our purpose is to evaluate the administered absorbed dose in pediatric nuclear imaging studies. Monte Carlo simulations incorporating pediatric computational models can serve as a reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5–14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms into GATE simulations. The resolution of the phantoms was set to 2 mm³. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals used in SPECT and PET applications are tested, following the EANM pediatric dosage protocol. The biodistributions of the isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5-year-old girl from whole-body exposure to 99mTc-SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, for which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept below 5%. The S-factors for each target organ are calculated in Gy/(MBq·s), with the highest dose absorbed in the kidneys and pancreas (9.29×10^10 and 0.15×10^10, respectively). Conclusion: An approach for accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and evaluating the
Chow, James C L; Leung, Michael K K; Islam, Mohammad K; Norrlinger, Bernhard D; Jaffray, David A
2008-01-01
The aim of this study is to evaluate the impact on patient dose of kilovoltage cone beam computed tomography (kV-CBCT) in prostate intensity-modulated radiation therapy (IMRT). The dose distributions for the five prostate IMRT plans were calculated using the Pinnacle³ treatment planning system. To calculate the patient dose from CBCT, phase-space beams of a CBCT head based on the ELEKTA x-ray volume imaging system were generated using the Monte Carlo BEAMnrc code for 100, 120, 130, and 140 kVp energies. An in-house graphical user interface called DOSCTP (DOSXYZnrc-based), developed using MATLAB, was used to calculate the dose distributions due to a 360° photon arc from the CBCT beam with the same patient CT image sets as used in Pinnacle³. The two calculated dose distributions were added together by setting the CBCT doses equal to 1%, 1.5%, 2%, and 2.5% of the prescription dose of the prostate IMRT. The prostate plan and the summed dose distributions were then processed in the CERR platform to determine the dose-volume histograms (DVHs) of the regions of interest. Moreover, dose profiles along the x- and y-axes crossing the isocenter, with and without addition of the CBCT dose, were determined. It was found that the added doses due to CBCT are most significant at the femoral heads. Higher doses were found in the bones for a relatively low-energy CBCT beam such as 100 kVp. Apart from the bones, the CBCT dose was observed to be most concentrated on the anterior and posterior sides of the patient anatomy. Analysis of the DVHs for the prostate and other critical tissues showed that they vary only slightly with the added CBCT dose at different beam energies. On the other hand, the changes in the DVHs for the femoral heads due to the CBCT dose and beam energy were more significant than those of the rectal and bladder walls. By analyzing the vertical and horizontal dose profiles crossing the femoral heads and isocenter, with and without the CBCT dose equal to 2% of the
International Nuclear Information System (INIS)
Chow, James C. L.; Leung, Michael K. K.; Islam, Mohammad K.; Norrlinger, Bernhard D.; Jaffray, David A.
2008-01-01
The aim of this study is to evaluate the impact on patient dose of kilovoltage cone beam computed tomography (kV-CBCT) in prostate intensity-modulated radiation therapy (IMRT). The dose distributions for the five prostate IMRT plans were calculated using the Pinnacle³ treatment planning system. To calculate the patient dose from CBCT, phase-space beams of a CBCT head based on the ELEKTA x-ray volume imaging system were generated using the Monte Carlo BEAMnrc code for 100, 120, 130, and 140 kVp energies. An in-house graphical user interface called DOSCTP (DOSXYZnrc-based), developed using MATLAB, was used to calculate the dose distributions due to a 360° photon arc from the CBCT beam with the same patient CT image sets as used in Pinnacle³. The two calculated dose distributions were added together by setting the CBCT doses equal to 1%, 1.5%, 2%, and 2.5% of the prescription dose of the prostate IMRT. The prostate plan and the summed dose distributions were then processed in the CERR platform to determine the dose-volume histograms (DVHs) of the regions of interest. Moreover, dose profiles along the x- and y-axes crossing the isocenter, with and without addition of the CBCT dose, were determined. It was found that the added doses due to CBCT are most significant at the femoral heads. Higher doses were found in the bones for a relatively low-energy CBCT beam such as 100 kVp. Apart from the bones, the CBCT dose was observed to be most concentrated on the anterior and posterior sides of the patient anatomy. Analysis of the DVHs for the prostate and other critical tissues showed that they vary only slightly with the added CBCT dose at different beam energies. On the other hand, the changes in the DVHs for the femoral heads due to the CBCT dose and beam energy were more significant than those of the rectal and bladder walls. By analyzing the vertical and horizontal dose profiles crossing the femoral heads and isocenter, with and without the CBCT dose equal to 2% of the
Energy Technology Data Exchange (ETDEWEB)
Sharma, D; Badano, A [Division of Imaging, Diagnostics and Software Reliability, OSEL/CDRH, Food & Drug Administration, MD (United States); Sempau, J [Technical University of Catalonia, Barcelona (Spain)
2016-06-15
Purpose: Variance reduction techniques (VRTs) are employed in Monte Carlo simulations to obtain estimates with reduced statistical uncertainty for a given simulation time. In this work, we study the bias and efficiency of a VRT for estimating the response of imaging detectors. Methods: We implemented Directed Sampling (DS), preferentially directing a fraction of emitted optical photons directly towards the detector by altering the isotropic model. The weight of each optical photon is appropriately modified to keep the simulation estimates unbiased. We use a Monte Carlo tool called fastDETECT2 (part of the hybridMANTIS open-source package) for optical transport, modified for VRT. The weight of each photon is calculated as the ratio of the original probability (no VRT) to the new probability for a particular direction. For our analysis of bias and efficiency, we use pulse height spectra, point response functions, and Swank factors. We obtain results for a variety of cases including analog (no VRT, isotropic distribution) and DS with fractions of 0.2 and 0.8 of the optical photons directed towards the sensor plane. We used 10,000 25-keV primaries. Results: The Swank factor for all cases in our simplified model converged quickly (within the first 100 primaries) to a stable value of 0.9. The root mean square error per pixel of the point response function between the analog and DS VRT cases was approximately 5 × 10⁻⁴. Conclusion: Our preliminary results suggest that DS VRT does not affect the estimate of the mean for the Swank factor. Our findings indicate that it may be possible to design VRTs for imaging detector simulations to increase computational efficiency without introducing bias.
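The weight construction described above (ratio of the original probability to the altered sampling probability) can be demonstrated on a one-dimensional toy detector; the geometry here is an illustrative simplification, not fastDETECT2's optical transport model:

```python
import random

def detection_prob_ds(f, q, n, seed=0):
    """Estimate the probability f that an 'isotropically' emitted
    photon hits the detector, modeled as the interval [0, f) of a unit
    direction coordinate. Directed Sampling: with probability q the
    photon is aimed straight at the detector; the statistical weight
    p_orig / p_new keeps the estimator unbiased."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        if rng.random() < q:
            x = f * rng.random()      # directed toward the detector
        else:
            x = rng.random()          # original isotropic model
        if x < f:
            # Mixture density inside the detector; p_orig = 1 everywhere
            p_new = q / f + (1.0 - q)
            total += 1.0 / p_new      # weight = p_orig / p_new
        # misses score zero
    return total / n

estimate = detection_prob_ds(f=0.05, q=0.8, n=100_000)
```

Most samples now hit the detector but each carries a small weight, so the mean is unchanged while the variance drops, which is exactly the unbiasedness property the study verifies.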
Monte Carlo simulation for soot dynamics
Zhou, Kun
2012-01-01
A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas-phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.
A general purpose code for Monte Carlo simulations
International Nuclear Information System (INIS)
Wilcke, W.W.; Rochester Univ., NY
1984-01-01
A general-purpose computer code MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility the code is organized like a general purpose computer, operating on a vector describing the time dependent state of the system under simulation. The instruction set of the 'computer' is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations. (orig.)
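The "user-defined instruction set acting on a state vector" organization might be sketched as follows; the walker example and operation names are hypothetical illustrations, not MONTHY's actual interface:

```python
import random

def run_simulation(state, program, steps, rng):
    """A MONTHY-style loop: the 'instruction set' is a list of
    user-supplied operations, each mapping (state, rng) -> state.
    Conditional execution is simply an operation that inspects the
    state before acting."""
    for _ in range(steps):
        for op in program:
            state = op(state, rng)
    return state

# Hypothetical instruction set: a random walker with a reflecting wall
def step(state, rng):
    state["x"] += rng.choice([-1, 1])
    return state

def reflect(state, rng):          # conditional operation
    if state["x"] < 0:
        state["x"] = 0
    return state

rng = random.Random(42)
final = run_simulation({"x": 0}, [step, reflect], steps=1000, rng=rng)
```

Adapting the "computer" to a new problem means supplying a different list of operations, which is the flexibility the abstract emphasizes.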
Cros, Maria; Joemai, Raoul M. S.; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal
2017-08-01
This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on the Monte Carlo (MC) simulation code, which was developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms with results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up-tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses from acquisitions using other available beam widths in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed the validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. The software provides a suitable tool for dose assessment in standard adult patients undergoing CT
Parallel Monte Carlo simulation of aerosol dynamics
Zhou, K.
2014-01-01
A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (the Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts of the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI). The parallel computing efficiency is investigated through numerical examples. Nearly 60% parallel efficiency is achieved for the largest test case, with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified by simulating various test cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only a small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and so forth, that is, the low-order moments of the Particle Size Distribution (PSD) function. Accurately predicting the high-order moments of the PSD requires a dramatic increase in the number of MC particles.
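The operator-splitting structure (a deterministic growth sub-step followed by a stochastic coagulation sub-step) can be sketched in miniature; the rates and the at-most-one-event coagulation rule are illustrative assumptions, not the paper's algorithm:

```python
import random

def advance(masses, dt, growth_rate, coag_rate, rng):
    """One operator-splitting step for toy aerosol dynamics:
    deterministic surface growth (every particle gains mass at
    growth_rate), then a stochastic Marcus-Lushnikov-style coagulation
    sub-step in which at most one random pair merges per step."""
    # Deterministic sub-step: surface growth
    masses = [m + growth_rate * dt for m in masses]
    # Stochastic sub-step: merge one random pair with a rate
    # proportional to the particle count (illustrative rate law)
    if len(masses) >= 2 and rng.random() < coag_rate * dt * len(masses):
        i, j = rng.sample(range(len(masses)), 2)
        masses[i] += masses[j]
        masses.pop(j)
    return masses

rng = random.Random(3)
masses = [1.0] * 200
for _ in range(100):
    masses = advance(masses, dt=0.01, growth_rate=0.5, coag_rate=1.0, rng=rng)
```

In the MPI version, each rank would carry its own particle ensemble and only the moment statistics need to be reduced across ranks.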
Energy Technology Data Exchange (ETDEWEB)
Guerra, J.G., E-mail: jglezg2002@gmail.es [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Rubiano, J.G. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Winter, G. [Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Bolivar, J.P. [Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva (Spain)
2017-06-21
In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm searches for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been
International Nuclear Information System (INIS)
Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P.; Bolivar, J.P.
2017-01-01
In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm searches for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been
Baptista, M.; Di Maria, S.; Vieira, S.; Vaz, P.
2017-11-01
Cone-Beam Computed Tomography (CBCT) enables high-resolution volumetric scanning of the bone and soft tissue anatomy under investigation at the treatment accelerator. This technique is extensively used in Image Guided Radiation Therapy (IGRT) for pre-treatment verification of patient position and target volume localization. When employed daily and several times per patient, CBCT imaging may lead to high cumulative imaging doses to the healthy tissues surrounding the exposed organs. This work aims at (1) evaluating the dose distribution during a CBCT scan and (2) calculating the organ doses involved in this image guiding procedure for clinically available scanning protocols. Both Monte Carlo (MC) simulations and measurements were performed. To model and simulate the kV imaging system mounted on a linear accelerator (Edge™, Varian Medical Systems), the state-of-the-art MC radiation transport program MCNPX 2.7.0 was used. In order to validate the simulation results, measurements of the Computed Tomography Dose Index (CTDI) were performed using standard PMMA head and body phantoms of 150 mm length and a standard 100 mm pencil ionization chamber (IC). Measurements for head and pelvis scanning protocols usually adopted in the clinical environment were acquired, using two acquisition modes (full fan and half fan). To calculate the organ doses, the implemented MC model of the CBCT scanner together with a male voxel phantom ("Golem") was used. The good agreement between the MCNPX simulations and the CTDIw measurements (differences up to 17%) presented in this work reveals that the CBCT MC model was successfully validated, taking into account the several uncertainties. The adequacy of the computational model to map dose distributions during a CBCT scan is discussed in order to identify ways to reduce the total CBCT imaging dose. The organ dose assessment highlights the need to evaluate the therapeutic and the CBCT imaging doses, in a more balanced approach, and the
Monte Carlo Simulation in Statistical Physics An Introduction
Binder, Kurt
2010-01-01
Monte Carlo Simulation in Statistical Physics deals with the computer simulation of many-body systems in condensed-matter physics and related fields of physics, chemistry and beyond (traffic flows, stock market fluctuations, etc.). Using random numbers generated by a computer, probability distributions are calculated, allowing the estimation of the thermodynamic properties of various systems. This book describes the theoretical background to several variants of these Monte Carlo methods and gives a systematic presentation from which newcomers can learn to perform such simulations and to analyze their results. The fifth edition covers classical as well as quantum Monte Carlo methods. Furthermore, a new chapter on the sampling of free-energy landscapes has been added. To help students in their work a special web server has been installed to host programs and discussion groups (http://wwwcp.tphys.uni-heidelberg.de). Prof. Binder was awarded the Berni J. Alder CECAM Award for Computational Physics 2001 as well ...
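The kind of simulation the book introduces can be illustrated by the classic single-spin-flip Metropolis algorithm for the 2D Ising model, a standard textbook example:

```python
import math
import random

def metropolis_ising(L, beta, sweeps, seed=0):
    """Metropolis single-spin-flip Monte Carlo for the 2D Ising model
    on an L x L lattice with periodic boundaries; returns the mean
    |magnetization| per spin over the sweeps."""
    rng = random.Random(seed)
    spin = [[1] * L for _ in range(L)]      # cold (fully ordered) start
    mag_acc = 0.0
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            # Sum of the four nearest neighbours (periodic boundaries)
            nb = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
                  + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
            d_e = 2.0 * spin[i][j] * nb     # energy change of flipping
            # Metropolis acceptance: always if d_e <= 0, else exp(-beta dE)
            if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
                spin[i][j] = -spin[i][j]
        mag_acc += abs(sum(map(sum, spin))) / (L * L)
    return mag_acc / sweeps

m_ordered = metropolis_ising(8, beta=1.0, sweeps=20)   # cold phase
m_hot = metropolis_ising(8, beta=0.1, sweeps=20)       # hot phase
```

Averaging observables such as the magnetization over the generated configurations is exactly the "probability distributions are calculated" step the abstract refers to.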
International Nuclear Information System (INIS)
Kongsoe, H.E.; Lauridsen, K.
1993-09-01
SIMON is a program for reliability calculation and statistical analysis. The program is of the Monte Carlo type; it is designed with high flexibility and has a large potential for application to complex problems, such as reliability analyses of very large systems and of systems where complex modelling or knowledge of special details is required. Examples of application of the program, including input and output, for reliability and statistical analysis are presented. (au) (3 tabs., 3 ills., 5 refs.)
Multilevel Monte Carlo in Approximate Bayesian Computation
Jasra, Ajay
2017-02-13
In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed, and it is shown under some assumptions that, for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
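For orientation, the i.i.d. rejection-ABC baseline that the MLMC approach is compared against looks like this; the Gaussian toy model and uniform prior are illustrative choices, not the paper's examples:

```python
import random

def abc_rejection(observed_mean, n_data, epsilon, n_samples, seed=0):
    """Plain rejection ABC (the i.i.d. baseline): draw theta from the
    prior, simulate a dataset, and keep theta whenever the summary
    statistic (sample mean) lands within epsilon of the observed one."""
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_samples:
        theta = rng.uniform(-5.0, 5.0)                  # uniform prior
        sim = sum(rng.gauss(theta, 1.0) for _ in range(n_data)) / n_data
        if abs(sim - observed_mean) < epsilon:          # ABC acceptance
            accepted.append(theta)
    return accepted

post = abc_rejection(observed_mean=1.0, n_data=50, epsilon=0.2,
                     n_samples=200)
```

Shrinking epsilon makes the approximation more accurate but the acceptance rate collapses, which is the cost trade-off that the MLMC construction is designed to improve.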
Monte Carlo simulation of neutron scattering instruments
International Nuclear Information System (INIS)
Seeger, P.A.; Daemen, L.L.; Hjelm, R.P. Jr.
1998-01-01
A code package consisting of the Monte Carlo library MCLIB, the executing code MC_RUN, the web application MC_Web, and various ancillary codes is proposed as an open standard for simulation of neutron scattering instruments. The architecture of the package includes structures to define surfaces, regions, and optical elements contained in regions. A particle is defined by its vector position and velocity, its time of flight, its mass and charge, and a polarization vector. The MC_RUN code handles neutron transport and bookkeeping, while the action on the neutron within any region is computed using algorithms that may be deterministic, probabilistic, or a combination. Complete versatility is possible because the existing library may be supplemented by any procedures a user is able to code. Some examples are shown.
Guerra, J. G.; Rubiano, J. G.; Winter, G.; Guerra, A. G.; Alonso, H.; Arnedo, M. A.; Tejera, A.; Martel, P.; Bolivar, J. P.
2017-06-01
In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm, together with a Monte Carlo simulation code. The evolutionary algorithm is used for searching the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window, and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials, and experimentally validated using CRMs.
Rapid Monte Carlo Simulation of Gravitational Wave Galaxies
Breivik, Katelyn; Larson, Shane L.
2015-01-01
With the detection of gravitational waves on the horizon, astrophysical catalogs produced by gravitational wave observatories can be used to characterize the populations of sources and validate different galactic population models. Efforts to simulate gravitational wave catalogs and source populations generally focus on population synthesis models that require extensive time and computational power to produce a single simulated galaxy. Monte Carlo simulations of gravitational wave source populations can also be used to generate observation catalogs from the gravitational wave source population. Monte Carlo simulations have the advantages of flexibility and speed, enabling rapid galactic realizations as a function of galactic binary parameters with less time and fewer computational resources required. We present a Monte Carlo method for rapid galactic simulations of gravitational wave binary populations.
Monte Carlo simulation in statistical physics an introduction
Binder, Kurt
1992-01-01
The Monte Carlo method is a computer simulation method which uses random numbers to simulate statistical fluctuations. The method is used to model complex systems with many degrees of freedom. Probability distributions for these systems are generated numerically, and the method then yields numerically exact information on the models. Such simulations may be used to see how well a model system approximates a real one, or to see how valid the assumptions are in an analytical theory. A short and systematic theoretical introduction to the method forms the first part of this book. The second part is a practical guide with plenty of examples and exercises for the student. Problems treated by simple sampling (random and self-avoiding walks, percolation clusters, etc.) are included, along with such topics as finite-size effects and guidelines for the analysis of Monte Carlo simulations. The two parts together provide an excellent introduction to the theory and practice of Monte Carlo simulations.
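Simple sampling of the kind the book opens with can be sketched in a few lines: generate many independent random walks on the square lattice and average the squared end-to-end distance, which for the ideal (non-self-avoiding) walk equals the number of steps exactly. The walk length and sample count below are arbitrary choices for illustration:

```python
import random

random.seed(1)

def walk_r2(n_steps):
    """Squared end-to-end distance of one simple random walk on the
    square lattice (each step chosen uniformly among four directions)."""
    x = y = 0
    for _ in range(n_steps):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x += dx
        y += dy
    return x * x + y * y

n, samples = 50, 20000
mean_r2 = sum(walk_r2(n) for _ in range(samples)) / samples
# for the ideal random walk, <R^2> = n exactly
```

The statistical error of such an estimate shrinks as one over the square root of the sample count, which is the guideline for analysis the book discusses.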
Monte Carlo strategies in scientific computing
Liu, Jun S
2008-01-01
This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for master's or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at the Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...
Monte Carlo methods of PageRank computation
Litvak, Nelli
2004-01-01
We describe and analyze an on-line Monte Carlo method of PageRank computation. The PageRank is estimated based on the results of a large number of short independent simulation runs initiated from each page that contains outgoing hyperlinks. The method does not require any storage of the hyperlink
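The idea of estimating PageRank from end-points of short runs can be sketched as follows. The toy graph, damping factor and run count are assumptions for illustration, and dangling pages (no out-links) are not handled; the key fact used is that a walk started uniformly, terminating with probability 1-c at each step, ends at page j with probability equal to j's PageRank:

```python
import random

random.seed(0)

# Toy web graph as an adjacency list of outgoing hyperlinks; every page
# here has at least one out-link, so dangling nodes need no special care.
graph = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
c = 0.85              # damping factor: probability of following a link
runs_per_page = 5000  # number of short runs initiated from each page

counts = {p: 0 for p in graph}
for start in graph:
    for _ in range(runs_per_page):
        page = start
        while random.random() < c:        # walk terminates w.p. 1-c per step
            page = random.choice(graph[page])
        counts[page] += 1                 # record where the run ended

# end-point frequencies over uniformly started runs estimate PageRank
total = runs_per_page * len(graph)
pagerank = {p: counts[p] / total for p in graph}
```

No transition matrix is stored: each run touches only the out-link lists of the pages it visits, which is the storage advantage the abstract refers to.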
Advanced computers and Monte Carlo
International Nuclear Information System (INIS)
Jordan, T.L.
1979-01-01
High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as the Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described, and some of its properties and resource requirements are identified for comparison with the corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programmer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables
Atomistic Monte Carlo Simulation of Lipid Membranes
Directory of Open Access Journals (Sweden)
Daniel Wüstner
2014-01-01
Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches. We use our recently devised chain breakage/closure (CBC) local move set in the bond-/torsion angle space with the constant-bond-length approximation (CBLA) for the phospholipid dipalmitoylphosphatidylcholine (DPPC). We demonstrate rapid conformational equilibration for a single DPPC molecule, as assessed by calculation of molecular energies and entropies. We also show the transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol.
Uncertainty analysis in Monte Carlo criticality computations
International Nuclear Information System (INIS)
Qi Ao
2011-01-01
Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands computing resources. ► The analytical method is limited to small perturbations on material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recent growing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
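The sampling-based approach can be sketched with a stand-in for the criticality code. The linear k_eff response, the input distribution, and the noise level below are invented for illustration; in practice each sample would be a full Monte Carlo transport run:

```python
import random
import statistics

random.seed(6)

def mock_keff(enrichment):
    """Stand-in for a Monte Carlo criticality run: a hypothetical smooth
    k_eff response to fuel enrichment (wt%), plus a small noise term
    mimicking the statistical uncertainty of each run."""
    return 0.80 + 0.05 * enrichment + random.gauss(0.0, 0.001)

# Sampling-based propagation: draw the uncertain input parameter from its
# distribution, rerun the "code" each time, and read the spread of k_eff.
samples = [mock_keff(random.gauss(3.0, 0.1)) for _ in range(1000)]
k_mean = statistics.mean(samples)
k_sd = statistics.stdev(samples)
```

The spread of the sampled k_eff values combines the propagated input uncertainty with the per-run statistical noise, which is why the method demands computing resources but places few restrictions on the size of the perturbations.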
Monte Carlo simulation applied to alpha spectrometry
International Nuclear Information System (INIS)
Baccouche, S.; Gharbi, F.; Trabelsi, A.
2007-01-01
Alpha particle spectrometry is a widely used analytical method, in particular when dealing with pure alpha-emitting radionuclides. Monte Carlo simulation is an adequate tool for investigating the influence of various phenomena on this analytical method. We performed an investigation of those phenomena using the simulation code GEANT of CERN. The results concerning the geometrical detection efficiency in different measurement geometries agree with analytical calculations. This work confirms that Monte Carlo simulation of the solid angle of detection is a very useful tool for determining the detection efficiency with very good accuracy.
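The geometrical-efficiency calculation the abstract checks against analytical results can be sketched for the simplest case: an on-axis point source facing a circular detector, with the source-detector distance and detector radius chosen arbitrarily for illustration:

```python
import math
import random

random.seed(7)

def mc_geometric_efficiency(h, r_det, n=200000):
    """Fraction of isotropically emitted alpha particles from an on-axis
    point source that reach a circular detector of radius r_det placed
    at distance h (the azimuth is irrelevant by symmetry)."""
    hits = 0
    for _ in range(n):
        cos_t = random.uniform(-1.0, 1.0)     # isotropic: cos(theta) uniform
        if cos_t <= 0.0:
            continue                          # emitted away from the detector
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        rho = h * sin_t / cos_t               # radius where the ray crosses
        if rho <= r_det:                      # the detector plane
            hits += 1
    return hits / n

eff = mc_geometric_efficiency(h=10.0, r_det=5.0)
exact = 0.5 * (1.0 - 10.0 / math.sqrt(10.0**2 + 5.0**2))  # solid-angle formula
```

For this on-axis case the closed-form solid-angle result is available, so the Monte Carlo estimate can be checked directly; off-axis and extended-source geometries are where the simulation approach earns its keep.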
Dynamic bounds coupled with Monte Carlo simulations
Rajabali Nejad, Mohammadreza; Meester, L.E.; van Gelder, P.H.A.J.M.; Vrijling, J.K.
2011-01-01
For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper
Monte Carlo simulation of the microcanonical ensemble
International Nuclear Information System (INIS)
Creutz, M.
1984-01-01
We consider simulating statistical systems with a random walk on a constant energy surface. This combines features of deterministic molecular dynamics techniques and conventional Monte Carlo simulations. For discrete systems the method can be programmed to run an order of magnitude faster than other approaches. It does not require high quality random numbers and may also be useful for nonequilibrium studies. 10 references
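A constant-energy random walk of this kind can be sketched with a "demon" degree of freedom for the 2D Ising model (a cold start and an arbitrary initial demon energy are assumed here; this is a toy illustration of the microcanonical idea, not the paper's exact algorithm):

```python
import random

random.seed(3)

L = 32
spins = [[1] * L for _ in range(L)]  # 2D Ising lattice, all-up ground state
demon = 40                           # demon's energy reservoir (J = 1 units)

def delta_e(s, i, j):
    """Energy change from flipping spin (i, j), periodic boundaries."""
    nn = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
          + s[i][(j + 1) % L] + s[i][(j - 1) % L])
    return 2 * s[i][j] * nn

for sweep in range(200):
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        dE = delta_e(spins, i, j)
        if demon >= dE:          # demon pays for (or pockets) the change,
            spins[i][j] *= -1    # so lattice + demon energy is conserved
            demon -= dE
```

Only integer comparisons and additions are needed, no exponentials and no high-quality random numbers for the acceptance step, which is the source of the speed advantage mentioned in the abstract; the demon's energy histogram also provides a temperature estimate.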
Monte Carlo simulation of AB-copolymers with saturating bonds
DEFF Research Database (Denmark)
Chertovich, A.C.; Ivanov, V.A.; Khokhlov, A.R.
2003-01-01
Structural transitions in a single AB-copolymer chain where saturating bonds can be formed between A- and B-units are studied by means of Monte Carlo computer simulations using the bond fluctuation model. Three transitions are found, coil-globule, coil-hairpin and globule-hairpin, depending...
Rare event simulation using Monte Carlo methods
Rubino, Gerardo
2009-01-01
In a probabilistic model, a rare event is an event with a very small probability of occurrence. The forecasting of rare events is a formidable task but is important in many areas: for instance, a catastrophic failure in a transport system or in a nuclear power plant, or the failure of an information processing system in a bank, or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, based on the simulation of corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented along with an exposition of how to apply these tools to a variety of fields ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...
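Importance sampling, the first of the two tools named, can be sketched on a textbook rare event: estimating p = P(X > 4) for a standard normal X (p is about 3.2e-5, so plain sampling almost never sees a hit). The shift to N(4, 1) and the sample sizes are illustrative choices:

```python
import math
import random

random.seed(11)

def rare_prob_naive(n=100000):
    """Plain MC estimate of p = P(X > 4), X ~ N(0,1): hits are so rare
    that a run of this size sees only a handful of them, or none."""
    return sum(random.gauss(0.0, 1.0) > 4.0 for _ in range(n)) / n

def rare_prob_importance(n=100000):
    """Importance sampling: draw from the shifted density N(4,1), where
    the rare event is common, and reweight each hit by the likelihood
    ratio phi(x)/phi(x - 4) = exp(8 - 4x)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(4.0, 1.0)
        if x > 4.0:
            total += math.exp(8.0 - 4.0 * x)
    return total / n

p_naive = rare_prob_naive()       # crude estimate, huge relative error
p_is = rare_prob_importance()     # close to the exact value 3.167e-5
```

With the same budget, the importance-sampling estimator's relative error here is below one percent, while the naive estimator is essentially useless: that variance gap is the whole point of the technique.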
Monte-Carlo simulation of electromagnetic showers
International Nuclear Information System (INIS)
Amatuni, Ts.A.
1984-01-01
The universal ELSS-1 program for Monte Carlo simulation of high-energy electromagnetic showers in homogeneous absorbers of arbitrary geometry has been written. The major processes and effects of electron and photon interaction with matter, particularly the Landau-Pomeranchuk-Migdal effect, are taken into account in the simulation procedures. The simulation results are compared with experimental data. Some characteristics of shower detectors and of electromagnetic showers for energies up to 1 TeV are calculated
Monte Carlo simulation of neutron counters for safeguards applications
International Nuclear Information System (INIS)
Looman, Marc; Peerani, Paolo; Tagziria, Hamid
2009-01-01
MCNP-PTA is a new Monte Carlo code for the simulation of neutron counters for nuclear safeguards applications developed at the Joint Research Centre (JRC) in Ispra (Italy). After some preliminary considerations outlining the general aspects involved in the computational modelling of neutron counters, this paper describes the specific details and approximations which make up the basis of the model implemented in the code. One of the major improvements allowed by the use of Monte Carlo simulation is a considerable reduction in both the experimental work and in the reference materials required for the calibration of the instruments. This new approach to the calibration of counters using Monte Carlo simulation techniques is also discussed.
Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael
2013-05-01
In dynamic renal scintigraphy, the main interest is the radiopharmaceutical redistribution as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based images for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked, regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images including effects of photon attenuation, scattering, limited spatial resolution and noise, are simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.
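The conservation property the abstract emphasizes — tracer leaving one compartment must enter another — can be sketched with a minimal compartment chain. The rate constants below are hypothetical, not the fitted MAG3 parameters of the paper, and a simple explicit Euler integration stands in for the full PK model:

```python
# Minimal sketch of the compartment idea: plasma -> kidneys -> bladder,
# with hypothetical first-order transfer rates, integrated by Euler steps.
k_pk, k_kb = 0.10, 0.05                  # transfer rates (1/min), illustrative
n_steps = 6000
dt = 60.0 / n_steps                      # 0.01 min steps over 60 min
plasma, kidney, bladder = 1.0, 0.0, 0.0  # fractions of injected activity

for _ in range(n_steps):
    d_pk = k_pk * plasma * dt    # plasma -> kidney transfer this step
    d_kb = k_kb * kidney * dt    # kidney -> bladder transfer this step
    plasma -= d_pk
    kidney += d_pk - d_kb
    bladder += d_kb
# every unit leaving one compartment enters another, so the total amount
# of tracer is preserved between time points, as in the PK model
```

Sampling the three variables on a time grid yields the per-structure time-activity curves that would be assigned to the phantom organs; changing a rate constant mimics, for example, a prolonged renal transit time.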
Exploring Various Monte Carlo Simulations for Geoscience Applications
Blais, R.
2010-12-01
Computer simulations are increasingly important in geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN), or chaotic random number (CRN) generators. Equidistributed quasi-random numbers (QRNs) can also be used in Monte Carlo simulations. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as Importance Sampling and Stratified Sampling can be implemented to significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on examples of geodetic applications of gravimetric terrain corrections and gravity inversion, conclusions and recommendations concerning their performance and general applicability are included.
Exploring pseudo- and chaotic random Monte Carlo simulations
Blais, J. A. Rod; Zhang, Zhan
2011-07-01
Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as importance sampling and stratified sampling can be applied in most Monte Carlo simulations and significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on some practical examples of geodetic direct and inverse problems, conclusions and recommendations concerning their performance and general applicability are included.
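The variance-reduction comparison described in both of the preceding abstracts can be sketched on a one-dimensional definite integral; the integrand, sample size and stratum count are arbitrary illustrative choices:

```python
import math
import random

random.seed(5)

def plain_mc(f, n):
    """Crude MC estimate of the integral of f over (0, 1)."""
    return sum(f(random.random()) for _ in range(n)) / n

def stratified_mc(f, n, strata=100):
    """Stratified sampling: split (0, 1) into equal subintervals and draw
    n/strata points in each, so only the within-stratum variance remains."""
    per = n // strata
    total = 0.0
    for k in range(strata):
        a = k / strata
        total += sum(f(a + random.random() / strata) for _ in range(per))
    return total / (per * strata)

est_plain = plain_mc(math.exp, 10000)
est_strat = stratified_mc(math.exp, 10000)  # exact value is e - 1
```

For a smooth integrand such as this one, the stratified estimator's error is smaller by roughly the number of strata, which is the "different orders of magnitude" effect in the error variances that the abstracts report.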
International Nuclear Information System (INIS)
Ismail, M.; Liljequist, D.
1986-10-01
In the present model, the treatment of elastic scattering is based on the similarity of multiple scattering processes with equal transport mean free path λ_tr. Elastic scattering events are separated by an artificially enlarged mean free path. In such events, scattering is optionally performed either by means of a single, energy-dependent scattering angle, or by means of a scattering angle distribution of the same form as the screened Rutherford cross section, but with an artificial screening factor. The physically correct λ_tr value is obtained by appropriate choice of the scattering angle or screening factor, respectively. We find good agreement with experimental transmission and with energy loss distributions. The Rutherford-like model gives good agreement with experimental angular distributions even for the penetration of very thin layers. The treatment of electron energy loss is based on the partial CSDA method: energy losses W > W_MINSE are treated as discrete electron-electron or positron-electron scattering events. Similarly, bremsstrahlung photon energies W > W_MINR are treated as discrete events. The sensitivity of the model to the parameters W_MINSE and W_MINR is studied. W_MINR can, in practice, be made negligibly small, and W_MINSE can, without excessive computer time, be made small enough to give results in good agreement with experiment and with computations based on the Landau theory of straggling. Using this model, we study some of the characteristic features of relativistic electron transmission, energy loss distributions, straggling, angular distributions and trajectories. (authors)
Adaptive Multilevel Monte Carlo Simulation
Hoel, H; von Schwerin, E; Szepessy, A; Tempone, Raul
2011-01-01
. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005.). This form of the adaptive algorithm generates
Monte Carlo Simulation for Particle Detectors
Pia, Maria Grazia
2012-01-01
Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...
Dynamic bounds coupled with Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Rajabalinejad, M., E-mail: M.Rajabalinejad@tudelft.n [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands); Meester, L.E. [Delft Institute of Applied Mathematics, Delft University of Technology, Delft (Netherlands); Gelder, P.H.A.J.M. van; Vrijling, J.K. [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands)
2011-02-15
For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce the simulation cost of the MC method, variance reduction methods are applied. This paper describes a method to reduce the simulation cost even further, while retaining the accuracy of Monte Carlo, by taking into account widely present monotonicity. For models exhibiting monotonic (decreasing or increasing) behavior, dynamic bounds (DB) are defined, which in a coupled Monte Carlo simulation are updated dynamically, resulting in a failure probability estimate as well as strict (non-probabilistic) upper and lower bounds. Accurate results are obtained at a much lower cost than an equivalent ordinary Monte Carlo simulation. In a two-dimensional and a four-dimensional numerical example, the cost reduction factors are 130 and 9, respectively, where the relative error is smaller than 5%. At higher accuracy levels, this factor increases, though this effect is expected to be smaller with increasing dimension. To show the application of the DB method to real-world problems, it is applied to a complex finite element model of a flood wall in New Orleans.
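The monotonicity argument can be sketched with a hypothetical monotone limit state. This toy only reuses earlier classifications to skip model evaluations; the actual DB method additionally maintains the strict upper and lower bounds described above, which are omitted here:

```python
import random

random.seed(2)

def g(x):
    """Hypothetical monotone limit state (safe when g >= 0). It decreases
    in every coordinate, so the failure region is an 'upper set'."""
    return 2.5 - x[0] - x[1]

def dominates(a, b):
    return all(ai >= bi for ai, bi in zip(a, b))

fails, safes = [], []        # evaluated points, kept as dynamic bounds
n_eval = n_fail = 0
N = 10000
for _ in range(N):
    x = (2.0 * random.random(), 2.0 * random.random())
    if any(dominates(x, f) for f in fails):
        n_fail += 1          # above a known failed point: fails for free
    elif any(dominates(s, x) for s in safes):
        pass                 # below a known safe point: safe for free
    else:
        n_eval += 1          # only now pay for a model evaluation
        if g(x) < 0:
            n_fail += 1
            fails.append(x)
        else:
            safes.append(x)

p_fail = n_fail / N          # exact value for this toy is 1.125/4 = 0.28125
```

When each call to g is an expensive finite element run, as in the flood wall example, the fraction of samples classified "for free" translates directly into the reported cost reduction.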
Suppression of the initial transient in Monte Carlo criticality simulations
International Nuclear Information System (INIS)
Richet, Y.
2006-12-01
Criticality Monte Carlo calculations aim at estimating the effective multiplication factor (k-effective) for a fissile system through iterations simulating neutron propagation (forming a Markov chain). Arbitrary initialization of the neutron population can deeply bias the k-effective estimate, defined as the mean of the k-effective values computed at each iteration. A simplified model of this cycle k-effective sequence is built, based on characteristics of industrial criticality Monte Carlo calculations. Statistical tests, inspired by Brownian bridge properties, are designed to detect stationarity of the cycle k-effective sequence. The detected initial transient is then suppressed in order to improve the estimation of the system k-effective. The different versions of this methodology are detailed and compared, first on a set of numerical tests fitted to criticality Monte Carlo calculations, and second on real criticality calculations. Finally, the best methodologies observed in these tests are selected and used to improve industrial Monte Carlo criticality calculations. (author)
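The effect being corrected can be sketched on a synthetic cycle sequence. A fixed cutoff stands in for the thesis's Brownian-bridge stationarity tests, and the transient shape, noise level and cutoff are all invented for illustration:

```python
import math
import random
import statistics

random.seed(8)

# Synthetic cycle-k_eff sequence: an exponential initial transient from a
# poorly guessed starting source, decaying toward a true value of 1.000,
# plus cycle-to-cycle statistical noise (all parameters illustrative).
cycles = [1.000 + 0.05 * math.exp(-i / 20.0) + random.gauss(0.0, 0.002)
          for i in range(500)]

naive = statistics.mean(cycles)          # biased upward by the transient
trimmed = statistics.mean(cycles[100:])  # transient cycles discarded
```

The naive mean inherits a bias of the order of the transient's integrated excess divided by the cycle count, while the trimmed mean is bias-free to within statistical noise; automating the choice of the cutoff is exactly what the stationarity tests are for.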
Systematic uncertainties on Monte Carlo simulation of lead based ADS
International Nuclear Information System (INIS)
Embid, M.; Fernandez, R.; Garcia-Sanz, J.M.; Gonzalez, E.
1999-01-01
Computer simulations of the neutronic behaviour of ADS systems foreseen for actinide and fission product transmutation are affected by many sources of systematic uncertainty, both from the nuclear data and from the methodology selected when applying the codes. Several actual ADS Monte Carlo simulations are presented, comparing different options both for the data and for the methodology, evaluating the relevance of the different uncertainties. (author)
Monte Carlo simulation of hybrid systems: An example
International Nuclear Information System (INIS)
Bacha, F.; D'Alencon, H.; Grivelet, J.; Jullien, E.; Jejcic, A.; Maillard, J.; Silva, J.; Zukanovich, R.; Vergnes, J.
1997-01-01
Simulation of hybrid systems needs tracking of particles from the GeV range (incident proton beam) down to a fraction of an eV (thermal neutrons). We show how a GEANT-based Monte Carlo program can achieve this, with realistic computer time and accompanying tools. An example of a dedicated original actinide burner is simulated with this chain. 8 refs., 5 figs
Monte Carlo simulations in theoretical physics
International Nuclear Information System (INIS)
Billoire, A.
1991-01-01
After a presentation of the principle of the Monte Carlo method, the method is applied, first to the calculation of critical exponents in the three-dimensional Ising model, and second to discrete quantum chromodynamics, with calculation times given as a function of computer power. 28 refs., 4 tabs
Coded aperture optimization using Monte Carlo simulations
International Nuclear Information System (INIS)
Martineau, A.; Rocchisani, J.M.; Moretti, J.L.
2010-01-01
Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with conventional correlation method. The results indicate that the artifacts are reduced and three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
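The MLEM update at the heart of the reconstruction can be sketched on a toy system. The 3x3 projection matrix and activities below are hypothetical and noiseless data is used for simplicity; in the paper the matrix elements come from GATE Monte Carlo simulations of the coded mask and camera:

```python
import numpy as np

# Hypothetical toy system with 3 projection bins and 3 voxels. A[i, j] is
# the probability that a decay in voxel j is detected in bin i; in the
# paper this projection matrix is computed by Monte Carlo simulation.
A = np.array([[0.6, 0.3, 0.1],
              [0.1, 0.3, 0.6],
              [0.2, 0.2, 0.2]])

x_true = np.array([100.0, 20.0, 50.0])  # activity per voxel
y = A @ x_true                          # noiseless coded projections

x = np.ones(3)            # uniform non-negative initial estimate
sens = A.sum(axis=0)      # sensitivity term, A^T 1
for _ in range(200):
    # MLEM update: x <- (x / A^T 1) * A^T (y / (A x))
    x = x / sens * (A.T @ (y / (A @ x)))
```

Each iteration preserves total counts and non-negativity automatically, which is one reason MLEM handles the ill-posedness of coded-aperture deconvolution better than direct correlation.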
Hybrid Multilevel Monte Carlo Simulation of Stochastic Reaction Networks
Moraes, Alvaro
2015-01-01
even more, we want to achieve this objective with near-optimal computational work. We first introduce a hybrid path-simulation scheme based on the well-known stochastic simulation algorithm (SSA) [3] and the tau-leap method [2]. Then, we introduce a multilevel Monte Carlo strategy that allows us to achieve a computational complexity of order O(TOL^-2); this is the same computational complexity as an exact method, but with a smaller constant. We provide numerical examples to show our results.
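The exact SSA building block of the hybrid scheme can be sketched for the simplest possible network, a single decay reaction; the rate constant, initial copy number and horizon are illustrative choices, and the tau-leap and multilevel layers of the thesis are not shown:

```python
import math
import random

random.seed(9)

def ssa_decay(x0=1000, c=0.1, t_end=10.0):
    """Gillespie SSA for the single decay reaction A -> 0, whose
    propensity is c * X(t); returns the copy number at time t_end."""
    t, x = 0.0, x0
    while x > 0:
        a = c * x                                    # total propensity
        tau = -math.log(1.0 - random.random()) / a   # exponential waiting time
        if t + tau > t_end:
            break
        t += tau
        x -= 1                                       # one reaction firing
    return x

mean_x = sum(ssa_decay() for _ in range(500)) / 500
# the mean copy number should approach x0 * exp(-c * t_end) = 1000/e
```

The cost of SSA grows with the number of reaction firings, which is precisely what motivates switching to tau-leap steps in fast regimes and correcting the bias with the multilevel estimator.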
Topological zero modes in Monte Carlo simulations
International Nuclear Information System (INIS)
Dilger, H.
1994-08-01
We present an improvement of global Metropolis updating steps, the instanton hits, used in a hybrid Monte Carlo simulation of the two-flavor Schwinger model with staggered fermions. These hits are designed to change the topological sector of the gauge field. In order to match these hits to an unquenched simulation with pseudofermions, the approximate zero mode structure of the lattice Dirac operator has to be considered explicitly. (orig.)
Autocorrelations in hybrid Monte Carlo simulations
International Nuclear Information System (INIS)
Schaefer, Stefan; Virotta, Francesco
2010-11-01
Simulations of QCD suffer from severe critical slowing down towards the continuum limit. This problem is known to be prominent in the topological charge; however, all observables are affected to various degrees by these slow modes in the Monte Carlo evolution. We investigate the slowing down in high statistics simulations and propose a new error analysis method, which gives a realistic estimate of the contribution of the slow modes to the errors. (orig.)
Monte Carlo simulations for heavy ion dosimetry
Energy Technology Data Exchange (ETDEWEB)
Geithner, O.
2006-07-26
Water-to-air stopping power ratio (s_w,air) calculations for the ionization chamber dosimetry of clinically relevant ion beams with initial energies from 50 to 450 MeV/u have been performed using the Monte Carlo technique. To simulate the transport of a particle in water, the computer code SHIELD-HIT v2 was used, which is a substantially modified version of its predecessor SHIELD-HIT v1. The code was partially rewritten, replacing formerly used single precision variables with double precision variables. The lowest particle transport specific energy was decreased from 1 MeV/u down to 10 keV/u by modifying the Bethe-Bloch formula, thus widening the code's range for medical dosimetry applications. Optional MSTAR and ICRU-73 stopping power data were included. The fragmentation model was verified using all available experimental data and some parameters were adjusted. The present code version shows excellent agreement with experimental data. In addition to the calculations of stopping power ratios, the influence of fragments and I-values on s_w,air for carbon ion beams was investigated. The value of s_w,air deviates by as much as 2.3% at the Bragg peak from the constant value of 1.130 recommended by TRS-398, for an energy of 50 MeV/u. (orig.)
Simplified monte carlo simulation for Beijing spectrometer
International Nuclear Information System (INIS)
Wang Taijie; Wang Shuqin; Yan Wuguang; Huang Yinzhi; Huang Deqiang; Lang Pengfei
1986-01-01
The Monte Carlo method based on the functionalization of detector performance and the transformation of kinematical variables into ''measured'' values by means of smearing has been used to program the Monte Carlo simulation of the performance of the Beijing Spectrometer (BES) in FORTRAN; the program is named BESMC. It can be used to investigate the multiplicity, the particle types, and the four-momentum distributions of the final states of electron-positron collisions, as well as the response of the BES to these final states. Thus, it provides a means to examine whether the overall design of the BES is reasonable and to decide the physics topics of the BES
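The smearing idea — replace full detector tracking with a parametrized "measurement" of each generated quantity — can be sketched as follows; the track momentum and the 2% Gaussian resolution are hypothetical stand-ins for the detector response functions a code like BESMC would encode:

```python
import random

random.seed(4)

def smear(p_true, resolution=0.02):
    """'Measure' a momentum vector by Gaussian smearing each component,
    with sigma equal to resolution times the component magnitude."""
    return [pi + random.gauss(0.0, resolution * abs(pi)) for pi in p_true]

# hypothetical final-state track momentum from an event generator (GeV/c)
track = [1.2, -0.4, 0.8]
measured = [smear(track) for _ in range(10000)]
mean_px = sum(m[0] for m in measured) / 10000
```

Because no particle transport is simulated, thousands of events can be "reconstructed" per second, which is what makes this style of fast simulation useful for overall design studies.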
Computer system for Monte Carlo experimentation
International Nuclear Information System (INIS)
Grier, D.A.
1986-01-01
A new computer system for Monte Carlo experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo experiment; it also encourages the proper design of Monte Carlo experiments and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as Si, Isp, or Minitab, or with a user-supplied program. Both the experimental results and the experimental design may be loaded directly into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or Latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language.
Energy Technology Data Exchange (ETDEWEB)
Ford, R.L.; Nelson, W.R.
1978-06-01
A code to simulate almost any conceivable electron-photon transport problem is described. The report begins with a lengthy historical introduction and a description of the shower generation process. Then the detailed physics of the shower processes and the methods used to simulate them are presented. Ideas of sampling theory, transport techniques, particle interactions in general, and programming details are discussed. Next, EGS calculations are compared with various experiments and other Monte Carlo results. The remainder of the report consists of user manuals for the EGS, PEGS, and TESTSR codes; options, input specifications, and typical output are included. 38 figures, 12 tables. (RWR)
Methods for Monte Carlo simulations of biomacromolecules.
Vitalis, Andreas; Pappu, Rohit V
2009-01-01
The state-of-the-art for Monte Carlo (MC) simulations of biomacromolecules is reviewed. Available methodologies for sampling conformational equilibria and associations of biomacromolecules in the canonical ensemble, given a continuum description of the solvent environment, are reviewed. Detailed sections are provided dealing with the choice of degrees of freedom, the efficiencies of MC algorithms and algorithmic peculiarities, as well as the optimization of simple movesets. The issue of introducing correlations into elementary MC moves, and the applicability of such methods to simulations of biomacromolecules, is discussed. A brief discussion of multicanonical methods and an overview of recent simulation work highlighting the potential of MC methods are also provided. It is argued that MC simulations, while underutilized by the biomacromolecular simulation community, hold promise for simulations of complex systems and phenomena that span multiple length scales, especially when used in conjunction with implicit solvation models or other coarse-graining strategies.
Reliability analysis of neutron transport simulation using Monte Carlo method
International Nuclear Information System (INIS)
Souza, Bismarck A. de; Borges, Jose C.
1995-01-01
This work presents a statistical and reliability analysis of data obtained by computer simulation of the neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been accomplished. The influence of the physical dimensions of the materials and of the sample size on the reliability of the results was investigated. The objective was to optimize the sample size, in order to obtain reliable results while minimizing computation time. (author). 5 refs, 8 figs
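The sample-size trade-off studied here rests on the usual Monte Carlo scaling: the standard error of an estimate shrinks as 1/√n. A toy demonstration (estimating the mean of a uniform deviate, not the neutron transport simulation itself):

```python
import random
import math

def mc_mean(n, rng):
    """Estimate the mean of U(0,1) from n samples; return (estimate, standard error)."""
    xs = [rng.random() for _ in range(n)]
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / (n - 1)   # unbiased sample variance
    return m, math.sqrt(var / n)                    # standard error of the mean

rng = random.Random(1)
_, se_small = mc_mean(1_000, rng)
_, se_large = mc_mean(100_000, rng)
ratio = se_small / se_large   # ~10: a 100-fold larger sample, ~10-fold smaller error
```

This square-root law is why doubling the reliability of a result quadruples the required computation time.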
Monte Carlo simulations in skin radiotherapy
International Nuclear Information System (INIS)
Sarvari, A.; Jeraj, R.; Kron, T.
2000-01-01
The primary goal of this work was to develop a procedure for calculating the appropriate filter shape for a brachytherapy applicator used in skin radiotherapy. In the applicator a radioactive source is positioned close to the skin. Without a filter, the resultant dose distribution would be highly nonuniform; high uniformity, however, is usually required. This can be achieved using an appropriately shaped filter, which flattens the dose profile. Because of the complexity of the transport and geometry, Monte Carlo simulations had to be used. An 192 Ir high dose rate photon source was used. All necessary transport parameters were simulated with the MCNP4B Monte Carlo code. A highly efficient iterative procedure was developed, which enabled calculation of the optimal filter shape in only a few iterations. The initially nonuniform dose distributions became uniform to within a percent when applying the filter calculated by this procedure. (author)
Monte Carlo simulation of Touschek effect
Directory of Open Access Journals (Sweden)
Aimin Xiao
2010-07-01
We present a Monte Carlo method implementation in the code elegant for simulating Touschek scattering effects in a linac beam. The local scattering rate and the distribution of scattered electrons can be obtained from the code either for a Gaussian-distributed beam or for a general beam whose distribution function is given. In addition, scattered electrons can be tracked through the beam line and the local beam-loss rate and beam halo information recorded.
Monte Carlo simulations on a 9-node PC cluster
International Nuclear Information System (INIS)
Gouriou, J.
2001-01-01
Monte Carlo simulation methods are frequently used in the fields of medical physics, dosimetry and metrology of ionising radiation. Nevertheless, the main drawback of this technique is that it is computationally slow, because the statistical uncertainty of the result improves only as the square root of the computational time. We present a method which reduces the effective running time by a factor of 10 to 20. In practice, the aim was to reduce the calculation time of the LNHB metrological applications from several weeks to a few days. This approach includes the use of a PC cluster under the Linux operating system and the PVM parallel library (version 3.4). The Monte Carlo codes EGS4, MCNP and PENELOPE have been implemented on this platform, and the last two have been adapted to run under the PVM environment. The maximum observed speedup ranges from a factor of 13 to 18, depending on the code and the problem to be simulated. (orig.)
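The parallelization pattern described (independent runs on cluster nodes, merged afterwards) can be sketched without PVM; here the "nodes" are just sequential calls with distinct seeds, and the toy workload is a π estimate rather than a transport code:

```python
import random

def worker(seed, n):
    """One independent Monte Carlo run (on a cluster, one node per seed):
    count points of the unit square that fall inside the quarter circle."""
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

# Emulate 9 nodes, each with its own seed, then merge the partial tallies.
n_per_node = 20_000
tallies = [worker(seed, n_per_node) for seed in range(9)]
pi_estimate = 4.0 * sum(tallies) / (9 * n_per_node)
```

Because the runs are statistically independent, merging tallies is exact, and the wall-clock speedup is limited mainly by scheduling and communication overhead, which is the gap between the ideal factor of 9 and the observed 13-to-18 range reported for the full 9-node (multi-processor) cluster.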
On the inclusion of macroscopic theory in Monte Carlo simulation using game theory
International Nuclear Information System (INIS)
Tatarkiewicz, J.
1980-01-01
This paper presents the inclusion of macroscopic damage theory into Monte Carlo particle-range simulation using game theory. A new computer code called RADDI was developed on the basis of this inclusion. Results of Monte Carlo damage simulation after 6.3 MeV proton bombardment of silicon are compared with experimental data of Bulgakov et al. (orig.)
Applications of Monte Carlo simulations of gamma-ray spectra
International Nuclear Information System (INIS)
Clark, D.D.
1995-01-01
A short, convenient computer program based on the Monte Carlo method that was developed to generate simulated gamma-ray spectra has been found to have useful applications in research and teaching. In research, we use it to predict spectra in neutron activation analysis (NAA), particularly in prompt gamma-ray NAA (PGNAA). In teaching, it is used to illustrate the dependence of detector response functions on the nature of gamma-ray interactions, the incident gamma-ray energy, and detector geometry
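A toy version of such a spectrum generator, with an invented photopeak fraction, Compton model and resolution rather than real detector response functions:

```python
import random

def simulate_spectrum(e_gamma, n_events, photo_fraction, fwhm, rng):
    """Toy detector response: each photon either deposits its full energy
    (photopeak) or a uniform Compton-like partial energy, and the deposit
    is then smeared with a Gaussian of the given FWHM."""
    sigma = fwhm / 2.355   # FWHM -> Gaussian sigma
    deposits = []
    for _ in range(n_events):
        if rng.random() < photo_fraction:
            e = e_gamma                          # full-energy event
        else:
            e = rng.uniform(0.0, 0.8 * e_gamma)  # crude Compton continuum
        deposits.append(max(0.0, rng.gauss(e, sigma)))
    return deposits

rng = random.Random(7)
spec = simulate_spectrum(661.7, 50_000, 0.3, 30.0, rng)   # keV; a Cs-137-like line
in_peak = sum(abs(e - 661.7) < 45 for e in spec) / len(spec)
```

Varying the photopeak fraction and FWHM mimics the dependence of the response function on detector geometry and gamma-ray energy that the abstract describes.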
Energy Technology Data Exchange (ETDEWEB)
Barbosa, Antonio Konrado de Santana; Vieira, Jose Wilson, E-mail: konrado.radiologia@gmail.co, E-mail: jose.wilson59@uol.com.b [Instituto Federal de Educacao, Ciencia e Tecnologia (IFPE), Recife, PE (Brazil); Costa, Kleber Souza Silva [Faculdade Integrada de Pernambuco (FACIPE), Recife, PE (Brazil); Lima, Fernando Roberto de Andrade, E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil)
2011-07-01
Radiotherapy simulation procedures using Monte Carlo methods have proven increasingly important to the improvement of cancer fighting strategies. Within this context, brachytherapy is one of the most used methods, ensuring better quality of life when compared to other therapeutic modalities. These procedures are planned with the use of sectional exams with the patient in the lying position. However, it is known that alteration of body posture after the procedure influences the localization of many organs. This study aimed to identify and measure the influence of such alterations in MC brachytherapy simulations. To do so, prostate brachytherapy with the Iodine-125 radionuclide was chosen as the model. Simulations were carried out with 10{sup 8} events using the EGSnrc code associated with the MASH phantom in orthostatic and supine positions. Significant alterations were found, especially regarding the bladder, small intestine and testicles. (author)
MBR Monte Carlo Simulation in PYTHIA8
Ciesielski, R.
We present the MBR (Minimum Bias Rockefeller) Monte Carlo simulation of (anti)proton-proton interactions and its implementation in the PYTHIA8 event generator. We discuss the total, elastic, and total-inelastic cross sections, and three contributions from diffraction dissociation processes that contribute to the latter: single diffraction, double diffraction, and central diffraction or double-Pomeron exchange. The event generation follows a renormalized-Regge-theory model, successfully tested using CDF data. Based on the MBR-enhanced PYTHIA8 simulation, we present cross-section predictions for the LHC and beyond, up to collision energies of 50 TeV.
Monte Carlo simulation for the transport beamline
Energy Technology Data Exchange (ETDEWEB)
Romano, F.; Cuttone, G.; Jia, S. B.; Varisano, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania (Italy); Attili, A.; Marchetto, F.; Russo, G. [INFN, Sezione di Torino, Via P.Giuria, 1 10125 Torino (Italy); Cirrone, G. A. P.; Schillaci, F.; Scuderi, V. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Institute of Physics Czech Academy of Science, ELI-Beamlines project, Na Slovance 2, Prague (Czech Republic); Carpinelli, M. [INFN Sezione di Cagliari, c/o Dipartimento di Fisica, Università di Cagliari, Cagliari (Italy); Tramontana, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Università di Catania, Dipartimento di Fisica e Astronomia, Via S. Sofia 64, Catania (Italy)
2013-07-26
In the framework of the ELIMED project, Monte Carlo (MC) simulations are widely used to study the physical transport of charged particles generated by laser-target interactions and to preliminarily evaluate fluence and dose distributions. An energy selection system and the experimental setup for the TARANIS laser facility in Belfast (UK) have already been simulated with the GEANT4 (GEometry ANd Tracking) MC toolkit. Preliminary results are reported here. Future developments are planned to implement MC based 3D treatment planning in order to optimize the number of shots and the dose delivery.
Lattice gauge theories and Monte Carlo simulations
International Nuclear Information System (INIS)
Rebbi, C.
1981-11-01
After some preliminary considerations, the discussion of quantum gauge theories on a Euclidean lattice takes up the definition of Euclidean quantum theory and the treatment of the continuum limit; an analogy is made with statistical mechanics. Perturbative methods can produce useful results for strong or weak coupling. In attempts to investigate the properties of the systems at intermediate coupling, numerical methods known as Monte Carlo simulations have proved valuable. The bulk of this paper illustrates the basic ideas underlying the Monte Carlo numerical techniques and the major results achieved with them, according to the following program: Monte Carlo simulations (general theory, practical considerations); the phase structure of Abelian and non-Abelian models; the observables (the coefficient of the linear term in the potential between two static sources at large separation, the mass of the lowest excited state with the quantum numbers of the vacuum (the so-called glueball), the potential between two static sources at very small distance, and the critical temperature at which sources become deconfined); gauge fields coupled to bosonic matter (Higgs) fields; and systems with fermions.
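The Monte Carlo technique the review builds on is Metropolis sampling of a lattice model. A minimal sketch for the 2D Ising model (a statistical-mechanics stand-in for a lattice gauge system; lattice size, couplings and sweep counts are arbitrary choices here):

```python
import random
import math

def metropolis_ising(L, beta, sweeps, rng):
    """Metropolis Monte Carlo for a 2D Ising model with periodic boundaries;
    returns the mean absolute magnetization per spin over the second half
    of the run (the first half is discarded as equilibration)."""
    s = [[1] * L for _ in range(L)]   # start fully ordered
    mags = []
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nb = s[(i+1) % L][j] + s[(i-1) % L][j] + s[i][(j+1) % L] + s[i][(j-1) % L]
            dE = 2.0 * s[i][j] * nb   # energy change of flipping spin (i, j)
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                s[i][j] = -s[i][j]    # accept the flip
        mags.append(abs(sum(map(sum, s))) / (L * L))
    return sum(mags[sweeps // 2:]) / (sweeps - sweeps // 2)

rng = random.Random(3)
m_ordered = metropolis_ising(8, 1.0, 200, rng)     # well below T_c: ordered
m_disordered = metropolis_ising(8, 0.1, 200, rng)  # far above T_c: disordered
```

The same accept/reject logic, with link variables and plaquette actions in place of spins and bonds, underlies the lattice gauge simulations the paper surveys.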
Genetic algorithms and Monte Carlo simulation for optimal plant design
International Nuclear Information System (INIS)
Cantoni, M.; Marseguerra, M.; Zio, E.
2000-01-01
We present an approach to optimal plant design (choice of system layout and components) under conflicting safety and economic constraints, based upon the coupling of a Monte Carlo evaluation of plant operation with a genetic algorithm maximization procedure. The Monte Carlo simulation model provides a flexible tool, which enables one to describe relevant aspects of plant design and operation, such as standby modes and deteriorating repairs, not easily captured by analytical models. The effects of deteriorating repairs are described by means of a modified Brown-Proschan model of imperfect repair which accounts for the possibility of an increased proneness to failure of a component after a repair. The transitions of a component from standby to active, and vice versa, are simulated using a multiplicative correlation model. The genetic algorithm procedure is tasked with optimizing a profit function which accounts for the plant safety and economic performance and which is evaluated, for each possible design, by the above Monte Carlo simulation. In order to avoid an overwhelming use of computer time, for each potential solution proposed by the genetic algorithm we perform only a few hundred Monte Carlo histories and then exploit the fact that, during the evolution of the genetic algorithm population, the fit chromosomes appear repeatedly many times, so that the results for the solutions of interest (i.e. the best ones) attain statistical significance.
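The GA/MC coupling can be caricatured as follows: a noisy Monte Carlo estimate stands in for the plant simulation, and a bare-bones genetic loop exploits the repeated reappearance of fit chromosomes. Everything here (fitness shape, noise level, population sizes) is invented for illustration:

```python
import random

def mc_fitness(x, n_histories, rng):
    """Noisy Monte Carlo estimate of a 'profit' function (true optimum at x = 3)."""
    true_value = -(x - 3.0) ** 2
    return sum(true_value + rng.gauss(0, 0.5) for _ in range(n_histories)) / n_histories

def genetic_search(generations, pop_size, rng):
    """Bare-bones GA: rank by a cheap noisy MC estimate, keep the top half, mutate."""
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Only a few MC histories per candidate; fit chromosomes reappear
        # across generations, so their scores gain statistical significance.
        ranked = sorted(pop, key=lambda x: mc_fitness(x, 50, rng), reverse=True)
        parents = ranked[: pop_size // 2]
        pop = parents + [p + rng.gauss(0, 0.3) for p in parents]   # mutation
    return max(pop, key=lambda x: mc_fitness(x, 500, rng))

best = genetic_search(30, 20, random.Random(5))
```

Spending few histories per evaluation keeps the GA cheap; the noise averages out because good candidates are re-evaluated every generation.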
A general transform for variance reduction in Monte Carlo simulations
International Nuclear Information System (INIS)
Becker, T.L.; Larsen, E.W.
2011-01-01
This paper describes a general transform to reduce the variance of the Monte Carlo estimate of some desired solution, such as flux or biological dose. This transform implicitly includes many standard variance reduction techniques, including source biasing, collision biasing, the exponential transform for path-length stretching, and weight windows. Rather than optimizing each of these techniques separately or choosing semi-empirical biasing parameters based on the experience of a seasoned Monte Carlo practitioner, this General Transform unites all these variance reduction techniques to achieve one objective: a distribution of Monte Carlo particles that attempts to optimize the desired solution. Specifically, this transform allows Monte Carlo particles to be distributed according to the user's specification by using information obtained from a computationally inexpensive deterministic simulation of the problem. For this reason, we consider the General Transform to be a hybrid Monte Carlo/deterministic method. The numerical results confirm that the General Transform distributes particles according to the user-specified distribution and that it generally provides reasonable results for shielding applications. (author)
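The idea of distributing particles according to a chosen density is classical importance sampling; in the ideal case where the sampling density matches the integrand, the variance vanishes entirely. A one-dimensional sketch of that limiting case (not the authors' transform):

```python
import random
import math

exact = (1 - math.exp(-10)) / 10   # the target integral, I = integral of e^(-10x) on [0,1]

def plain_mc(n, rng):
    """Naive estimate: sample x uniformly and average the integrand."""
    return sum(math.exp(-10 * rng.random()) for _ in range(n)) / n

def importance_mc(n, rng):
    """Sample x from p(x) = 10 e^(-10x) / (1 - e^(-10)), which matches the
    integrand's shape; the weight f(x)/p(x) is then constant, so the
    variance of the estimate drops to zero (the ideal 'transform')."""
    norm = 1 - math.exp(-10)
    total = 0.0
    for _ in range(n):
        x = -math.log(1 - rng.random() * norm) / 10   # inverse-CDF draw from p
        total += math.exp(-10 * x) / (10 * math.exp(-10 * x) / norm)
    return total / n

rng = random.Random(2)
est_plain = plain_mc(100_000, rng)
est_importance = importance_mc(100, rng)
```

In practice the ideal density is unknown; the abstract's point is that a cheap deterministic solution of the same problem supplies a good approximation to it.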
Monte Carlo simulation of a CZT detector
International Nuclear Information System (INIS)
Chun, Sung Dae; Park, Se Hwan; Ha, Jang Ho; Kim, Han Soo; Cho, Yoon Ho; Kang, Sang Mook; Kim, Yong Kyun; Hong, Duk Geun
2008-01-01
The CZT detector is one of the most promising radiation detectors for hard X-ray and γ-ray measurement. The energy spectrum of a CZT detector has to be simulated to optimize the detector design. A CZT detector was fabricated with dimensions of 5×5×2 mm³. A Peltier cooler with a size of 40×40 mm² was installed below the fabricated CZT detector to reduce its operating temperature. Energy spectra were measured with the 59.5 keV γ-ray from ²⁴¹Am. A Monte Carlo code was developed to simulate the CZT energy spectrum measured with the planar-type CZT detector, and the result was compared with the measured one. The simulation was extended to a CZT detector with strip electrodes. (author)
Investigating the impossible: Monte Carlo simulations
International Nuclear Information System (INIS)
Kramer, Gary H.; Crowley, Paul; Burns, Linda C.
2000-01-01
Designing and testing new equipment can be an expensive and time-consuming process, or the desired performance characteristics may preclude its construction due to technological shortcomings. Cost may also prevent equipment from being purchased to test other scenarios. An alternative is to use Monte Carlo simulations to make the investigations. This presentation exemplifies how Monte Carlo code calculations can be used to fill the gap. An example is given for the investigation of two sizes of germanium detector (70 mm and 80 mm diameter) at four different crystal thicknesses (15, 20, 25, and 30 mm), with predictions of how the size affects the counting efficiency and the Minimum Detectable Activity (MDA). The Monte Carlo simulations have shown that detector efficiencies can be adequately modelled using photon transport if the data are used to investigate trends. The investigation of the effect of detector thickness on the counting efficiency has shown that the thickness of a fixed-diameter detector of either 70 mm or 80 mm is unimportant up to 60 keV. At higher photon energies, the counting efficiency begins to decrease as the thickness decreases, as expected. The simulations predict that the MDAs of the 70 mm and 80 mm diameter detectors do not differ by more than a factor of 1.15 at 17 keV or 1.2 at 60 keV when comparing detectors of equivalent thicknesses. The MDA is slightly increased at 17 keV, and rises by about 52% at 660 keV, when the thickness is decreased from 30 mm to 15 mm. One could conclude from this information that the extra cost associated with the larger-area Ge detectors may not be justified for the slight improvement predicted in the MDA. (author)
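MDA predictions of this kind typically rest on Currie's detection-limit formula; the sketch below uses it with invented background, efficiency and timing numbers to show why a lower counting efficiency (e.g. a thinner crystal at high energy) raises the MDA proportionally:

```python
import math

def mda(background_counts, efficiency, emission_prob, live_time):
    """Currie's approximation for the Minimum Detectable Activity [Bq]:
    detection limit L_D = 2.71 + 4.65 * sqrt(B) counts, converted to
    activity by the counting efficiency, emission probability and time."""
    detection_limit = 2.71 + 4.65 * math.sqrt(background_counts)
    return detection_limit / (efficiency * emission_prob * live_time)

# Hypothetical numbers: 400 background counts, 0.85 emission probability,
# a 1-hour count, and two efficiencies differing by a factor of two.
mda_thick = mda(400, 0.05, 0.85, 3600)
mda_thin = mda(400, 0.025, 0.85, 3600)
```

With the background held fixed, the MDA scales inversely with efficiency, which is why the simulated efficiency trends translate directly into MDA predictions.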
Monte Carlo Simulation of an American Option
Directory of Open Access Journals (Sweden)
Gikiri Thuo
2007-04-01
We implement gradient estimation techniques for sensitivity analysis of option pricing which can be efficiently employed in Monte Carlo simulation. Using these techniques we can simultaneously obtain an estimate of the option value together with the estimates of sensitivities of the option value to various parameters of the model. After deriving the gradient estimates we incorporate them in an iterative stochastic approximation algorithm for pricing an option with early exercise features. We illustrate the procedure using an example of an American call option with a single dividend that is analytically tractable. In particular we incorporate estimates for the gradient with respect to the early exercise threshold level.
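A stripped-down illustration of Monte Carlo option pricing with a pathwise gradient estimate, the kind of sensitivity that feeds such a stochastic-approximation scheme. For simplicity this prices a European (not American) call under geometric Brownian motion, with invented market parameters:

```python
import random
import math

def mc_call(s0, k, r, sigma, t, n_paths, rng):
    """Monte Carlo price and pathwise-delta estimate of a European call.
    The pathwise derivative of the discounted payoff with respect to s0
    is disc * S_T / s0 on paths that finish in the money, 0 otherwise."""
    disc = math.exp(-r * t)
    price = delta = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0, 1)
        st = s0 * math.exp((r - 0.5 * sigma ** 2) * t + sigma * math.sqrt(t) * z)
        if st > k:
            price += disc * (st - k)
            delta += disc * st / s0
    return price / n_paths, delta / n_paths

# At-the-money call: s0 = k = 100, r = 5%, sigma = 20%, one year.
price, delta = mc_call(100, 100, 0.05, 0.2, 1.0, 200_000, random.Random(11))
```

For these parameters the Black-Scholes values are roughly 10.45 and 0.637, so the Monte Carlo estimates can be sanity-checked; extending the same gradient machinery to the early-exercise threshold is the paper's contribution.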
Monte Carlo in radiotherapy: experience in a distributed computational environment
Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.
2007-06-01
New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapy treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results on using Monte Carlo (MC) codes in dose calculation for treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS), along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research), has been used to perform a complete MC simulation to compute the dose distribution on phantoms irradiated with a radiotherapy accelerator. Using the BEAMnrc and GEANT4 MC based codes, we calculated dose distributions on a plain water phantom and an air/water phantom. Experimental and calculated dose values agreed to within ±2% (for depths between 5 mm and 130 mm), both in PDD (Percentage Depth Dose) and in transversal sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.
Abuhaimed, Abdullah; J Martin, Colin; Sankaralingam, Marimuthu; J Gentle, David; McJury, Mark
2014-11-07
The IEC has introduced a practical approach to overcome shortcomings of the CTDI100 for measurements on the wide beams employed for cone beam CT (CBCT) scans. This study evaluated the efficiency of this approach (CTDIIEC) for different arrangements using Monte Carlo simulation techniques, and compared CTDIIEC to the efficiency of CTDI100 for CBCT. The Monte Carlo EGSnrc/BEAMnrc and EGSnrc/DOSXYZnrc codes were used to simulate the kV imaging system mounted on a Varian TrueBeam linear accelerator. The Monte Carlo model was benchmarked against experimental measurements and good agreement was shown. Standard PMMA head and body phantoms with lengths of 150, 600, and 900 mm were simulated. Beam widths studied ranged from 20 to 300 mm, and four scanning protocols using two acquisition modes were utilized. The efficiency values were calculated at the centre (εc) and periphery (εp) of the phantoms and for the weighted CTDI (εw). The efficiency values for CTDI100 were approximately constant for beam widths of 20-40 mm, where εc(CTDI100), εp(CTDI100), and εw(CTDI100) were 74.7 ± 0.6%, 84.6 ± 0.3%, and 80.9 ± 0.4% for the head phantom and 59.7 ± 0.3%, 82.1 ± 0.3%, and 74.9 ± 0.3% for the body phantom, respectively. When the beam width increased beyond 40 mm, ε(CTDI100) values fell steadily, reaching ~30% at a beam width of 300 mm. In contrast, the efficiency of the CTDIIEC was approximately constant over all beam widths, demonstrating its suitability for the assessment of CBCT. εc(CTDIIEC), εp(CTDIIEC), and εw(CTDIIEC) were 76.1 ± 0.9%, 85.9 ± 1.0%, and 82.2 ± 0.9% for the head phantom and 60.6 ± 0.7%, 82.8 ± 0.8%, and 75.8 ± 0.7% for the body phantom, respectively, within 2% of the ε(CTDI100) values for narrower beam widths. CTDI100,w and CTDIIEC,w underestimate CTDI∞,w by ~55% and ~18% for the head phantom and by ~56% and ~24% for the body phantom, respectively, using a clinical beam width of 198 mm.
Monte Carlo simulations of lattice gauge theories
International Nuclear Information System (INIS)
Forcrand, P. de; Minnesota Univ., Minneapolis, MN
1989-01-01
Lattice gauge simulations are presented in layman's terms. The need for large computer resources is justified. The main aspects of implementations on vector and parallel machines are explained. An overview of state of the art simulations and dedicated hardware projects is presented. 8 refs.; 1 figure; 1 table
11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing
Nuyens, Dirk
2016-01-01
This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.
Monte Carlo simulations of low background detectors
International Nuclear Information System (INIS)
Miley, H.S.; Brodzinski, R.L.; Hensley, W.K.; Reeves, J.H.
1995-01-01
An implementation of the Electron Gamma Shower 4 code (EGS4) has been developed to allow convenient simulation of typical gamma ray measurement systems. Coincidence gamma rays, beta spectra, and angular correlations have been added to adequately simulate a complete nuclear decay and provide corrections to experimentally determined detector efficiencies. This code has been used to strip certain low-background spectra for the purpose of extremely low-level assay. Monte Carlo calculations of this sort can be extremely successful since low background detectors are usually free of significant contributions from poorly localized radiation sources, such as cosmic muons, secondary cosmic neutrons, and radioactive construction or shielding materials. Previously, validation of this code has been obtained from a series of comparisons between measurements and blind calculations. An example of the application of this code to an exceedingly low background spectrum stripping will be presented. (author) 5 refs.; 3 figs.; 1 tab
Monte Carlo simulation of the ARGO
International Nuclear Information System (INIS)
Depaola, G.O.
1997-01-01
We use the GEANT Monte Carlo code to design an outline of the geometry and simulate the performance of the Argentine gamma-ray observer (ARGO), a telescope based on silicon strip detector technology. The γ-ray direction is determined by geometrical means and the angular resolution is calculated for small variations of the basic design. The results show that the angular resolution varies from a few degrees at low energies (∼50 MeV) to approximately 0.2° at high energies (>500 MeV). We also made simulations using the energy spectra of the quasars PKS0208-512 and PKS0528+134 as incoming γ-rays. Moreover, a method based on multiple scattering theory is also used to determine the incoming energy. We show that this method is applicable to energy spectra. (orig.)
Exploring Monte Carlo Simulation Strategies for Geoscience Applications
Blais, J.; Grebenitcharsky, R.; Zhang, Z.
2008-12-01
Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer generated random numbers, uniformly distributed on [0, 1], can be very different depending on the selection of pseudo-random number (PRN), quasi-random number (QRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the expected error variances are generally of different orders for the same number of random numbers. A comparative analysis of these three strategies has been carried out for geodetic and related applications in planar and spherical contexts. Based on these computational experiments, conclusions and recommendations concerning their performance and error variances are included.
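The PRN/QRN contrast can be made concrete with a base-2 van der Corput sequence against Python's Mersenne Twister on a smooth one-dimensional integral (a much simpler setting than the geodetic applications discussed, but the error-order difference is the same phenomenon):

```python
import random

def van_der_corput(n, base=2):
    """First n terms of the base-b van der Corput quasi-random sequence
    (digit reversal of the integer index about the radix point)."""
    seq = []
    for i in range(1, n + 1):
        x, denom, k = 0.0, 1.0, i
        while k:
            denom *= base
            k, rem = divmod(k, base)
            x += rem / denom
        seq.append(x)
    return seq

def integrate(points, f):
    """Equal-weight quadrature: the sample mean of f over the point set."""
    return sum(f(x) for x in points) / len(points)

f = lambda x: x * x          # test integrand; exact integral on [0,1] is 1/3
n = 4096
rng = random.Random(9)
err_prn = abs(integrate([rng.random() for _ in range(n)], f) - 1 / 3)
err_qrn = abs(integrate(van_der_corput(n), f) - 1 / 3)
```

For pseudo-random points the error decays like n^(-1/2), while the low-discrepancy sequence achieves roughly (log n)/n on smooth integrands, which is the "different orders of expected error variance" noted above.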
Understanding quantum tunneling using diffusion Monte Carlo simulations
Inack, E. M.; Giudici, G.; Parolini, T.; Santoro, G.; Pilati, S.
2018-03-01
In simple ferromagnetic quantum Ising models characterized by an effective double-well energy landscape, the characteristic tunneling time of path-integral Monte Carlo (PIMC) simulations has been shown to scale as the incoherent quantum-tunneling time, i.e., as 1/Δ², where Δ is the tunneling gap. Since incoherent quantum tunneling is employed by quantum annealers (QAs) to solve optimization problems, this result suggests that there is no quantum advantage in using QAs with respect to quantum Monte Carlo (QMC) simulations. A counterexample is the recently introduced shamrock model (Andriyash and Amin, arXiv:1703.09277), where topological obstructions cause an exponential slowdown of the PIMC tunneling dynamics with respect to incoherent quantum tunneling, leaving open the possibility for potential quantum speedup, even for stoquastic models. In this work we investigate the tunneling time of projective QMC simulations based on the diffusion Monte Carlo (DMC) algorithm without guiding functions, showing that it scales as 1/Δ, i.e., even more favorably than the incoherent quantum-tunneling time, both in a simple ferromagnetic system and in the more challenging shamrock model. However, a careful comparison between the DMC ground-state energies and the exact solution available for the transverse-field Ising chain indicates an exponential scaling of the computational cost required to keep a fixed relative error as the system size increases.
Efficient Monte Carlo Simulations of Gas Molecules Inside Porous Materials.
Kim, Jihan; Smit, Berend
2012-07-10
Monte Carlo (MC) simulations are commonly used to obtain adsorption properties of gas molecules inside porous materials. In this work, we discuss various optimization strategies that lead to faster MC simulations, with CO2 gas molecules inside host zeolite structures used as a test system. The reciprocal space contribution of the gas-gas Ewald summation and both the direct and the reciprocal gas-host potential energy interactions are stored inside energy grids to reduce the wall time in the MC simulations. Additional speedup can be obtained by selectively calling the routine that computes the gas-gas Ewald summation, which does not impact the accuracy of the zeolite's adsorption characteristics. We utilize a two-level density-biased sampling technique in the grand canonical Monte Carlo (GCMC) algorithm to restrict CO2 insertion moves to low-energy regions within the zeolite materials to accelerate convergence. Finally, we make use of graphics processing unit (GPU) hardware to conduct multiple MC simulations in parallel by judiciously mapping the GPU threads to the available workload. As a result, we can obtain a CO2 adsorption isotherm curve with 14 pressure values (up to 10 atm) for a zeolite structure within a minute of total compute wall time.
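The energy-grid idea (tabulate the expensive interaction once, then interpolate during the MC moves) can be sketched in one dimension; here a Lennard-Jones curve stands in for the precomputed gas-host and Ewald terms:

```python
def build_energy_grid(potential, x_min, x_max, n):
    """Pretabulate an expensive potential on a uniform grid so that MC moves
    can replace the full evaluation with a cheap interpolation lookup."""
    dx = (x_max - x_min) / (n - 1)
    values = [potential(x_min + i * dx) for i in range(n)]
    return values, x_min, dx

def grid_energy(grid, x):
    """Linear interpolation between the two bracketing grid values."""
    values, x_min, dx = grid
    f = (x - x_min) / dx
    i = min(int(f), len(values) - 2)
    w = f - i
    return (1 - w) * values[i] + w * values[i + 1]

# Toy 'host' potential; a real GCMC grid would be 3D and store the
# tabulated gas-host and reciprocal-space Ewald contributions.
lj = lambda r: 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)
grid = build_energy_grid(lj, 0.8, 3.0, 2001)
```

The one-time tabulation cost is amortized over the millions of insertion and displacement trials of a GCMC run, which is where the reported wall-time savings come from.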
Odd-flavor Simulations by the Hybrid Monte Carlo
Takaishi, Tetsuya; Takaishi, Tetsuya; De Forcrand, Philippe
2001-01-01
The standard hybrid Monte Carlo algorithm is known to simulate only QCD with an even number of flavors. Simulations of odd-flavor QCD, however, can also be performed in the framework of the hybrid Monte Carlo algorithm, where the inverse of the fermion matrix is approximated by a polynomial. In this exploratory study we perform three-flavor QCD simulations. We compare the hybrid Monte Carlo algorithm with the R-algorithm, which also simulates odd-flavor systems but has step-size errors. We find that results from our hybrid Monte Carlo algorithm are in agreement with those from the R-algorithm obtained at very small step-size.
Temporal acceleration of spatially distributed kinetic Monte Carlo simulations
International Nuclear Information System (INIS)
Chatterjee, Abhijit; Vlachos, Dionisios G.
2006-01-01
The computational intensity of kinetic Monte Carlo (KMC) simulation is a major impediment to simulating large length and time scales. In recent work, an approximate method for KMC simulation of spatially uniform systems, termed the binomial τ-leap method, was introduced [A. Chatterjee, D.G. Vlachos, M.A. Katsoulakis, Binomial distribution based τ-leap accelerated stochastic simulation, J. Chem. Phys. 122 (2005) 024112], where bundles of molecular events instead of individual processes are executed over coarse-grained time increments. This temporal coarse-graining can lead to significant computational savings, but its generalization to spatially distributed lattice KMC simulation had not yet been realized. Here we extend the binomial τ-leap method to lattice KMC simulations by combining it with spatially adaptive coarse-graining. Absolute stability and computational speed-up analyses for spatial systems, along with simulations, provide insights into the conditions under which accuracy and substantial acceleration of the new spatio-temporal coarse-graining method are ensured. Model systems demonstrate that the r-time increment criterion of Chatterjee et al. obeys the absolute stability limit for values of r up to near 1.
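The bundling idea behind the binomial τ-leap method can be illustrated on a single first-order process. The sketch below is a hypothetical simplification (one species, no spatial coarse-graining, all names invented here): each coarse time increment fires k ~ Binomial(N, p) events at once, so the population can never go negative, unlike a Poisson leap.

```python
import math
import random

def binomial_tau_leap(n0, rate, tau, n_leaps, seed=0):
    """Coarse-grained simulation of a first-order decay N -> N - 1.

    Each leap of length tau fires k ~ Binomial(N, p) events in one shot,
    with p = 1 - exp(-rate * tau), so k never exceeds the current
    population N, the property that distinguishes the binomial from
    the Poisson tau-leap.
    """
    random.seed(seed)
    n = n0
    p = 1.0 - math.exp(-rate * tau)
    for _ in range(n_leaps):
        if n == 0:
            break
        k = sum(1 for _ in range(n) if random.random() < p)  # Binomial(n, p) draw
        n -= k                                               # execute the bundle
    return n
```

With n0 = 10000, rate = 1 and ten leaps of tau = 0.1, the surviving population clusters around 10000·exp(-1) ≈ 3679, matching the exact one-event-at-a-time KMC in distribution while executing only ten coarse steps.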
Monte Carlo simulation for radiographic applications
International Nuclear Information System (INIS)
Tillack, G.R.; Bellon, C.
2003-01-01
Standard radiography simulators are based on the attenuation law complemented by build-up factors (BUF) to describe the interaction of radiation with material. The assumption of BUF implies that scattered radiation only reduces the contrast in radiographic images. This simplification holds for a wide range of applications, such as weld inspection, as known from practical experience. But only a detailed description of the different underlying interaction mechanisms can explain effects like mottling and others that every radiographer has experienced in practice. Monte Carlo models can handle the primary and secondary interaction mechanisms contributing to the image formation process, such as photon interactions (absorption, incoherent and coherent scattering including electron-binding effects, pair production) and electron interactions (electron tracing including X-ray fluorescence and bremsstrahlung production). This opens up possibilities such as the separation of influencing factors and an understanding of the functioning of the intensifying screens used in film radiography. The paper discusses, in terms of selected examples, the opportunities in applying the Monte Carlo method to investigate special features in radiography. (orig.)
Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds
Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.
2012-11-01
A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.
Radiation Modeling with Direct Simulation Monte Carlo
Carlson, Ann B.; Hassan, H. A.
1991-01-01
Improvements in the modeling of radiation in low density shock waves with direct simulation Monte Carlo (DSMC) are the subject of this study. A new scheme to determine the relaxation collision numbers for excitation of electronic states is proposed. This scheme attempts to move the DSMC programs toward a more detailed modeling of the physics and more reliance on available rate data. The new method is compared with the current modeling technique and both techniques are compared with available experimental data. The differences in the results are evaluated. The test case is based on experimental measurements from the AVCO-Everett Research Laboratory electric arc-driven shock tube of a normal shock wave in air at 10 km/s and 0.1 Torr. The new method agrees with the available data as well as the results from the earlier scheme and is more easily extrapolated to different flow conditions.
'Odontologic dosimetric card' experiments and simulations using Monte Carlo methods
International Nuclear Information System (INIS)
Menezes, C.J.M.; Lima, R. de A.; Peixoto, J.E.; Vieira, J.W.
2008-01-01
The techniques for data processing, combined with the development of fast and more powerful computers, make the Monte Carlo methods one of the most widely used tools in radiation transport simulation. For applications in diagnostic radiology, this method generally uses anthropomorphic phantoms to evaluate the absorbed dose to patients during exposure. In this paper, Monte Carlo techniques were used to simulate a testing device designed for intra-oral X-ray equipment performance evaluation, called the Odontologic Dosimetric Card (CDO, for 'Cartao Dosimetrico Odontologico' in Portuguese), for different thermoluminescent detectors. Two computational exposure models were used, RXD/EGS4 and CDO/EGS4. In the first model, the simulation results are compared with experimental data obtained under similar conditions. The second model presents the same characteristics as the testing device studied (CDO). For the irradiations, the X-ray spectra were generated with the IPEM Report No. 78 spectrum processor. The attenuated spectrum was obtained for IEC 61267 qualities and various additional filters for a Pantak 320 industrial X-ray unit. The results obtained for the study of the copper filters used in the determination of the kVp were compared with experimental data, validating the model proposed for the characterization of the CDO. The results show that the CDO can be utilized in quality assurance programs in order to guarantee that the equipment fulfills the requirements of Norm SVS No. 453/98 MS (Brazil), 'Directives of Radiation Protection in Medical and Dental Radiodiagnostic'. We conclude that EGS4 is a suitable Monte Carlo code to simulate thermoluminescent dosimeters and the experimental procedures employed in the routine of the quality control laboratory in diagnostic radiology. (author)
Energy Technology Data Exchange (ETDEWEB)
Barbosa, Antonio Konrado de Santana; Vieira, Jose Wilson [Instituto Federal de Educacao, Ciencia e Tecnologia (IFPE), Recife, PE (Brazil); Costa, Kleber Souza Silva [Faculdade Integrada de Pernambuco (FACIPE), Recife, PE (Brazil); Lima, Fernando Roberto de Andrade, E-mail: falima@cnen.gov.b [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil)
2011-07-01
Radiotherapy computational simulation procedures using Monte Carlo (MC) methods have proven to be increasingly important to the improvement of cancer fighting strategies. One of the biases in this practice is the discretization of the radioactive source in brachytherapy simulations, which often does not match the real situation. This study aimed to identify and measure the influence of radioactive source discretization in brachytherapy MC simulations, compared to simulations without discretization, using prostate brachytherapy with the iodine-125 radionuclide as a model. Simulations were carried out with 10^8 events for both types of sources, using the EGSnrc code associated with the MASH phantom in orthostatic and supine positions with some anatomic adaptations. Significant alterations were found, especially regarding the bladder, the rectum and the prostate itself. It can be concluded that sources need to be discretized in brachytherapy simulations to ensure their representativeness. (author)
Simulation of quantum systems by the tomography Monte Carlo method
International Nuclear Information System (INIS)
Bogdanov, Yu I
2007-01-01
A new method of statistical simulation of quantum systems is presented, which is based on the generation of data by the Monte Carlo method and their purposeful tomography with energy minimisation. The numerical solution of the problem is based on the optimisation of a target functional providing a compromise between maximisation of the statistical likelihood function and energy minimisation. The method does not involve complicated and ill-posed multidimensional computational procedures and can be used to calculate the wave functions and energies of the ground and excited stationary states of complex quantum systems. The applications of the method are illustrated. (Fifth seminar in memory of D.N. Klyshko)
Advanced Computational Methods for Monte Carlo Calculations
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2018-01-12
This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.
Monte Carlo simulations and dosimetric studies of an irradiation facility
Energy Technology Data Exchange (ETDEWEB)
Belchior, A. [Instituto Tecnologico e Nuclear, Estrada nacional no. 10, Apartado 21, 2686-953 Sacavem (Portugal)], E-mail: anabelchior@itn.pt; Botelho, M.L; Vaz, P. [Instituto Tecnologico e Nuclear, Estrada nacional no. 10, Apartado 21, 2686-953 Sacavem (Portugal)
2007-09-21
There is an increasing utilization of ionizing radiation for industrial applications, and radiation technology offers a variety of advantages in areas such as sterilization and food preservation. For these applications, dosimetric tests are of crucial importance in order to assess the dose distribution throughout the sample being irradiated. The use of Monte Carlo methods and computational tools in support of the assessment of dose distributions in irradiation facilities can prove economically effective, representing savings in the utilization of dosemeters, among other benefits. One of the purposes of this study is the development of a Monte Carlo simulation, using a state-of-the-art computational tool, MCNPX, in order to determine the dose distribution inside a cobalt-60 irradiation facility. This irradiation facility is currently in operation at the ITN campus and will feature an automation and robotics component allowing its remote utilization by an external user, under the REEQ/996/BIO/2005 project. The detailed geometrical description of the irradiation facility has been implemented in MCNPX, which features an accurate and full simulation of the electron-photon processes involved. The validation of the simulation results was performed by chemical dosimetry methods, namely a Fricke solution. The Fricke dosimeter is a standard dosimeter and is widely used in radiation processing for calibration purposes.
Treatment planning in radiosurgery: parallel Monte Carlo simulation software
Energy Technology Data Exchange (ETDEWEB)
Scielzo, G [Galliera Hospitals, Genova (Italy). Dept. of Hospital Physics; Grillo Ruggieri, F [Galliera Hospitals, Genova (Italy). Dept. for Radiation Therapy; Modesti, M; Felici, R [Electronic Data System, Rome (Italy); Surridge, M [University of Southampton (United Kingdom). Parallel Application Centre]
1995-12-01
The main objective of this research was to evaluate the possibility of direct Monte Carlo simulation for accurate dosimetry with short computation time. We made use of: a graphics workstation; a linear accelerator; water, PMMA and anthropomorphic phantoms, for validation purposes; ionometric, film and thermoluminescent techniques, for dosimetry; and a treatment planning system, for comparison. Benchmarking results suggest that short computing times can be obtained with the parallel version of EGS4 that was developed. Parallelism was obtained by assigning the simulated incident photons to separate processors, which required the development of a parallel random number generator. Validation consisted of phantom irradiation and comparison of predicted and measured values, with good agreement in PDD and dose profiles. Experiments on anthropomorphic phantoms (with inhomogeneities) were carried out, and these values are being compared with results obtained with the conventional treatment planning system.
The Development and Comparison of Molecular Dynamics Simulation and Monte Carlo Simulation
Chen, Jundong
2018-03-01
Molecular dynamics is an integrated technique that combines physics, mathematics and chemistry. The molecular dynamics method is a computer simulation method and a powerful tool for studying condensed matter systems. The technique not only yields the trajectories of the atoms but also reveals the microscopic details of atomic motion. By studying the numerical integration algorithms used in molecular dynamics simulation, we can analyze the microstructure, the motion of particles and their macroscopic relationship with the material, and more conveniently study the relationship between the interactions and the macroscopic properties. Monte Carlo simulation, similar to molecular dynamics, is a tool for studying the nature of molecules and particles at the microscopic level. In this paper, the theoretical background of computer numerical simulation is introduced, and the specific methods of numerical integration are summarized, including the Verlet method, the leap-frog method and the velocity Verlet method. At the same time, the method and principle of Monte Carlo simulation are introduced. Finally, the similarities and differences between Monte Carlo simulation and molecular dynamics simulation are discussed.
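The velocity Verlet scheme mentioned above can be written down compactly. A minimal Python sketch follows, with a 1-D harmonic oscillator used as an assumed test problem (function and variable names are illustrative):

```python
def velocity_verlet(x, v, force, dt, n_steps, mass=1.0):
    """Velocity Verlet: positions and velocities advance on the same time level."""
    a = force(x) / mass
    traj = [(x, v)]
    for _ in range(n_steps):
        x += v * dt + 0.5 * a * dt * dt      # position update
        a_new = force(x) / mass              # force at the new position
        v += 0.5 * (a + a_new) * dt          # velocity update, averaged acceleration
        a = a_new
        traj.append((x, v))
    return traj
```

For the harmonic oscillator F = -kx with k = m = 1, the total energy v^2/2 + x^2/2 stays bounded close to its initial value over many periods, the hallmark of this symplectic integrator.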
International Nuclear Information System (INIS)
Tabary, J.; Gliere, A.
2001-01-01
A Monte Carlo radiation transport simulation program, EGS Nova, and a computer aided design software, BRL-CAD, have been coupled within the framework of Sindbad, a nondestructive evaluation (NDE) simulation system. In its current status, the program is very valuable in an NDE laboratory context, as it helps simulate the images due to the uncollided and scattered photon fluxes in a single NDE software environment, without having to switch to a separate Monte Carlo code and its parameter set. Numerical validations show a good agreement with EGS4 computed and published data. As the program's major drawback is the execution time, computational efficiency improvements are foreseen. (orig.)
Diffusion Monte Carlo approach versus adiabatic computation for local Hamiltonians
Bringewatt, Jacob; Dorland, William; Jordan, Stephen P.; Mink, Alan
2018-02-01
Most research regarding quantum adiabatic optimization has focused on stoquastic Hamiltonians, whose ground states can be expressed with only real non-negative amplitudes and thus for whom destructive interference is not manifest. This raises the question of whether classical Monte Carlo algorithms can efficiently simulate quantum adiabatic optimization with stoquastic Hamiltonians. Recent results have given counterexamples in which path-integral and diffusion Monte Carlo fail to do so. However, most adiabatic optimization algorithms, such as for solving MAX-k-SAT problems, use k-local Hamiltonians, whereas our previous counterexample for diffusion Monte Carlo involved n-body interactions. Here we present a 6-local counterexample which demonstrates that even for these local Hamiltonians there are cases where diffusion Monte Carlo cannot efficiently simulate quantum adiabatic optimization. Furthermore, we perform empirical testing of diffusion Monte Carlo on a standard well-studied class of permutation-symmetric tunneling problems and similarly find large advantages for quantum optimization over diffusion Monte Carlo.
International Nuclear Information System (INIS)
Ahmed Ghoneim, Adel Aly; Ghoneim, Adel A.; Al-Zanki, Jasem M.; El-Essawy, Ashraf H.
2009-01-01
Atomic reorganization starts with the filling of the initial inner-shell vacancy by a radiative transition (x-ray) or by a non-radiative transition (Auger and Coster-Kronig processes). New vacancies created during this atomic reorganization may in turn be filled by further radiative and non-radiative transitions until all vacancies reach the outermost occupied shells. The production of an inner-shell vacancy in an atom and the de-excitation decays through radiative and non-radiative transitions may result in a change of the atomic potential; this change leads to the emission of an additional electron into the continuum (electron shake-off process). In the present work, the ion charge state distributions (CSD) and the mean atomic charge of ions produced by inner-shell vacancy de-excitation decay are calculated for neutral Ne, Ar and Kr atoms. The calculations are carried out using a Monte Carlo (MC) technique to simulate the cascade development after primary vacancy production. The radiative and non-radiative transitions for each vacancy are calculated in the simulation. In addition, the changes of transition energies and transition rates due to the multiple vacancies produced in the atomic configurations through the cascade development are considered. It is found that considering the electron shake-off process and the closing of non-allowed non-radiative channels improves the results for both the charge state distributions (CSD) and the average charge state. To check the validity of the present calculations, the results obtained are compared with available theoretical and experimental data, and are found to agree well with them. (author)
Monte Carlo Simulation Tool Installation and Operation Guide
Energy Technology Data Exchange (ETDEWEB)
Aguayo Navarrete, Estanislao; Ankney, Austin S.; Berguson, Timothy J.; Kouzes, Richard T.; Orrell, John L.; Troy, Meredith D.; Wiseman, Clinton G.
2013-09-02
This document provides information on software and procedures for Monte Carlo simulations based on the Geant4 toolkit, the ROOT data analysis software and the CRY cosmic ray library. These tools have been chosen for their application to shield design and activation studies as part of the simulation task for the Majorana Collaboration. This document includes instructions for installation, operation and modification of the simulation code in a high cyber-security computing environment, such as the Pacific Northwest National Laboratory network. It is intended as a living document and will be periodically updated. It is a starting point for information collection by an experimenter, and is not the definitive source. Users should consult with one of the authors for guidance on how to find the most current information for their needs.
A Monte Carlo Simulation Framework for Testing Cosmological Models
Directory of Open Access Journals (Sweden)
Heymann Y.
2014-10-01
We tested alternative cosmologies using Monte Carlo simulations based on the sampling method of the zCosmos galactic survey. The survey encompasses a collection of observable galaxies with respective redshifts that have been obtained for a given spectroscopic area of the sky. Using a cosmological model, we can convert the redshifts into light-travel times and, by slicing the survey into small redshift buckets, compute a curve of galactic density over time. Because foreground galaxies obstruct the images of more distant galaxies, we simulated the theoretical galactic density curve using an average galactic radius. By comparing the galactic density curves of the simulations with that of the survey, we could assess the cosmologies. We applied the test to the expanding-universe cosmology of de Sitter and to a dichotomous cosmology.
Personal Supercomputing for Monte Carlo Simulation Using a GPU
Energy Technology Data Exchange (ETDEWEB)
Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2008-05-15
Since the usability, accessibility, and maintainability of a personal computer (PC) are very good, a PC is a useful simulation tool for researchers. With the improved performance of a PC's CPU, it has enough calculation power to simulate a small-scale system. However, if a system is large or involves long time scales, a cluster computer or supercomputer is needed. Recently, great changes have occurred in the PC computing environment. A graphics processing unit (GPU) on a graphics card, once used only to compute display data, has calculation capability superior to a PC's CPU; its performance matches that of a supercomputer from 2000. Despite this great calculation potential, it is not easy to program a simulation code for a GPU, because of the difficult programming techniques required to convert a calculation into a 3D rendering task using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) for programming its graphics cards, called the Compute Unified Device Architecture (CUDA). It makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPUs and CUDA, and carries out a performance benchmark for Monte Carlo simulation.
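The reason Monte Carlo maps so well onto a GPU is that each sample is independent. The following pure-Python sketch mimics the thread layout on a CPU, with one independent, separately seeded stream per "thread" and a final reduction, using π estimation as a stand-in workload; it is an analogy only, not CUDA code, and all names are invented here.

```python
import random

def mc_pi_stream(n_samples, seed):
    """One independent Monte Carlo stream, analogous to the work of one GPU thread."""
    rng = random.Random(seed)          # per-thread RNG: streams must not share state
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y < 1.0:        # point falls inside the quarter circle
            hits += 1
    return hits

def mc_pi_parallel(n_streams=64, samples_per_stream=10_000):
    """Combine the streams; on a GPU this reduction would run across thread blocks."""
    total = sum(mc_pi_stream(samples_per_stream, seed) for seed in range(n_streams))
    return 4.0 * total / (n_streams * samples_per_stream)
```

Because the streams never communicate until the final reduction, the same structure parallelizes trivially over GPU threads, which is what makes Monte Carlo a natural fit for the hardware the abstract benchmarks.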
Atmosphere Re-Entry Simulation Using Direct Simulation Monte Carlo (DSMC) Method
Directory of Open Access Journals (Sweden)
Francesco Pellicani
2016-05-01
Aerothermodynamic investigations of hypersonic re-entry vehicles provide fundamental information to other important disciplines, like materials and structures, assisting the development of efficient, low-weight thermal protection systems (TPS). For the transitional flow regime, where thermal and chemical equilibrium is almost absent, a numerical method for such studies has been introduced: the direct simulation Monte Carlo (DSMC) technique. The acceptance and applicability of the DSMC method have increased significantly in the 50 years since its invention, thanks to increases in computer speed and to parallel computing. Nevertheless, further verification and validation efforts are needed for its wider acceptance. In this study, the Monte Carlo simulators OpenFOAM and SPARTA have been studied and benchmarked against numerical and theoretical data for inert and chemically reactive flows, and the same will be done against experimental data in the near future. The results show the validity of the data obtained with the DSMC. The best settings of the fundamental parameters used by each DSMC code are presented and compared with the guidelines deriving from the theory behind the Monte Carlo method. In particular, the number of particles per cell was found to be the most relevant parameter for achieving valid and optimized results. It is shown that a simulation with a mean value of one particle per cell gives sufficiently good results with very low computational resources. This achievement suggests reconsidering the correct investigation method in the transitional regime, where both the direct simulation Monte Carlo (DSMC) and computational fluid dynamics (CFD) can work, but with different computational effort.
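The particles-per-cell statistic highlighted above is simple to compute. A hypothetical 1-D sketch (uniform cells; names assumed) shows the bookkeeping a DSMC code performs before selecting collision pairs within each cell:

```python
def particles_per_cell(positions, n_cells, box):
    """Bin 1-D particle positions into uniform cells and return (mean occupancy, counts).

    In DSMC the mean number of simulator particles per cell controls the
    statistical quality of the collision sampling within each cell.
    """
    h = box / n_cells
    counts = [0] * n_cells
    for x in positions:
        i = min(int(x / h), n_cells - 1)   # clamp a particle sitting on the boundary
        counts[i] += 1
    return sum(counts) / n_cells, counts
```

Real DSMC codes do the same binning in 3-D and then sample candidate collision pairs cell by cell, which is why the mean occupancy is the tuning knob the abstract identifies.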
Fast Monte Carlo for ion beam analysis simulations
International Nuclear Information System (INIS)
Schiettekatte, Francois
2008-01-01
A Monte Carlo program for the simulation of ion beam analysis data is presented. It combines mainly four features: (i) ion slowdown is computed separately from the main scattering/recoil event, which is directed towards the detector. (ii) A virtual detector, that is, a detector larger than the actual one, can be used, followed by trajectory correction. (iii) For each collision during ion slowdown, scattering angle components are extracted from tables. (iv) Tables of scattering angle components, stopping power and energy straggling are indexed using the binary representation of floating-point numbers, which allows logarithmic distribution of these tables without the computation of logarithms to access them. Tables are sufficiently fine-grained that interpolation is not necessary. Ion slowdown computation thus avoids trigonometric, inverse and transcendental function calls and, as much as possible, divisions. All these improvements make possible the computation of 10^7 collisions/s on current PCs. Results for transmitted ions of several masses in various substrates compare well to those obtained using SRIM-2006 in terms of both angular and energy distributions, as long as a sufficiently large number of collisions is considered for each ion. Examples of simulated spectra show good agreement with experimental data, although a large detector rather than the virtual detector has to be used to properly simulate background signals that are due to plural collisions. The program, written in standard C, is open-source and distributed under the terms of the GNU General Public License.
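The float-bit indexing trick in item (iv) has a portable Python analogue: math.frexp exposes a float's exponent and mantissa directly, so a logarithmically spaced table can be indexed without ever calling a logarithm. This sketch illustrates the idea only (positive energies assumed, names invented here); the program itself works in C on the raw bit pattern.

```python
import math

def log_bin_index(energy, bins_per_octave=4):
    """Index into a logarithmically spaced table without computing a logarithm.

    frexp decomposes energy = mantissa * 2**exponent with mantissa in [0.5, 1):
    the exponent selects the octave and the mantissa picks the sub-bin, the
    same effect as indexing the table with the binary representation of the
    number, as the program above does.
    """
    mantissa, exponent = math.frexp(energy)
    sub = int((mantissa - 0.5) * 2 * bins_per_octave)  # sub-bin within the octave
    return exponent * bins_per_octave + sub
```

Doubling the energy always advances the index by exactly bins_per_octave, which is the defining property of a log-spaced table, yet only a frexp call (a bit-field extraction) is needed per lookup.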
International Nuclear Information System (INIS)
Silva, Hugo R.; Silva, Ademir X.; Rebello, Wilson F.; Silva, Maria G.
2011-01-01
This paper presents the results obtained by Monte Carlo simulation of the effect of a shielding against neutrons, called external shielding, to be placed on the heads of linear accelerators used in radiotherapy. For this, the Monte Carlo N-Particle radiation transport code MCNPX was used, in which a computational model of the head of the Varian 2300 C/D linear accelerator was developed. The equipment was simulated inside a bunker, operating at energies of 10, 15 and 18 MV, considering the rotation of the gantry at eight different angles (0 deg, 45 deg, 90 deg, 135 deg, 180 deg, 225 deg, 270 deg and 315 deg); in all cases, the equipment was modeled both without and with the shielding attached to the bottom of the accelerator head. For each of these settings, the ambient dose equivalent due to neutrons, H*(10)n, was calculated at points situated in the region of the patient (of interest for evaluating undesirable neutron doses to the patient) and in the maze of the radiotherapy room (of interest for shielding the access door to the bunker). For all operating energies and all gantry angles, a significant reduction in the values of H*(10)n was observed when the equipment operated with the external shielding, both in the region of the patient and in the region of the maze. (author)
Monte Carlo and analytic simulations in nanoparticle-enhanced radiation therapy
Directory of Open Access Journals (Sweden)
Paro AD
2016-09-01
Analytical and Monte Carlo simulations have been used to predict dose enhancement factors in nanoparticle-enhanced X-ray radiation therapy. Both simulations predict an increase in dose enhancement in the presence of nanoparticles, but the two methods predict different levels of enhancement over the studied energy, nanoparticle materials, and concentration regime for several reasons. The Monte Carlo simulation calculates energy deposited by electrons and photons, while the analytical one only calculates energy deposited by source photons and photoelectrons; the Monte Carlo simulation accounts for electron–hole recombination, while the analytical one does not; and the Monte Carlo simulation randomly samples photon or electron path and accounts for particle interactions, while the analytical simulation assumes a linear trajectory. This study demonstrates that the Monte Carlo simulation will be a better choice to evaluate dose enhancement with nanoparticles in radiation therapy. Keywords: nanoparticle, dose enhancement, Monte Carlo simulation, analytical simulation, radiation therapy, tumor cell, X-ray
Direct Simulation Monte Carlo (DSMC) on the Connection Machine
International Nuclear Information System (INIS)
Wong, B.C.; Long, L.N.
1992-01-01
The massively parallel computer Connection Machine is utilized to map an improved version of the direct simulation Monte Carlo (DSMC) method for solving flows with the Boltzmann equation. Kinetic theory is required for analyzing hypersonic aerospace applications, and the features and capabilities of the DSMC particle-simulation technique are discussed. The DSMC is shown to be inherently massively parallel and data parallel, and the algorithm is based on molecule movements, cross-referencing their locations, locating collisions within cells, and sampling macroscopic quantities in each cell. The serial DSMC code is compared to the present parallel DSMC code, and timing results show that the speedup of the parallel version is approximately linear. The correct physics can be resolved from the results of the complete DSMC method implemented on the Connection Machine using the data-parallel approach. 41 refs
Modern analysis of ion channeling data by Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Nowicki, Lech [Andrzej Sołtan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland)]. E-mail: lech.nowicki@fuw.edu.pl; Turos, Andrzej [Institute of Electronic Materials Technology, Wolczynska 133, 01-919 Warsaw (Poland); Ratajczak, Renata [Andrzej Sołtan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Stonert, Anna [Andrzej Sołtan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Garrido, Frederico [Centre de Spectrometrie Nucleaire et Spectrometrie de Masse, CNRS-IN2P3-Universite Paris-Sud, 91405 Orsay (France)]
2005-10-15
The basic scheme of Monte Carlo simulation of ion channeling spectra is reformulated in terms of statistical sampling. The McChasy simulation code is described and two example applications of the code are presented: calculation of the projectile flux in a uranium dioxide crystal, and defect analysis for an ion-implanted InGaAsP/InP superlattice. Virtues and pitfalls of defect analysis using Monte Carlo simulations are discussed.
Scientific computer simulation review
International Nuclear Information System (INIS)
Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.
2015-01-01
Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
Closed-shell variational quantum Monte Carlo simulation for the ...
African Journals Online (AJOL)
Closed-shell variational quantum Monte Carlo simulation for the electric dipole moment calculation of hydrazine molecule using casino-code. ... Nigeria Journal of Pure and Applied Physics ... The variational quantum Monte Carlo (VQMC) technique used in this work employed the restricted Hartree-Fock (RHF) scheme.
Forest canopy BRDF simulation using Monte Carlo method
Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.
2006-01-01
The Monte Carlo method is a statistical sampling technique that has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopies in the field of visible remote sensing. The random process between photons and the forest canopy was designed using the Monte Carlo method.
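The photon–canopy random process can be illustrated with a minimal one-dimensional radiative-transfer sketch. The optical depth, single-scattering albedo, and isotropic phase function below are invented for illustration; the paper's canopy model is far more detailed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented canopy parameters for illustration only.
n_photons = 5000
tau_canopy = 1.0        # optical depth of the canopy layer
single_albedo = 0.6     # probability a photon is scattered (vs absorbed)

exit_mu = []            # direction cosines of photons escaping upward
for _ in range(n_photons):
    tau, mu = 0.0, -1.0  # enter at the top, travelling straight down
    while True:
        tau -= mu * -np.log(1.0 - rng.random())  # free path in optical depth
        if tau < 0.0:                     # escaped upward: a BRDF sample
            exit_mu.append(mu)
            break
        if tau > tau_canopy:              # transmitted through the canopy
            break
        if rng.random() > single_albedo:  # absorbed by foliage
            break
        mu = rng.uniform(-1.0, 1.0)       # isotropic scatter (simplification)

# The angular histogram of escaping photons approximates the BRDF shape.
hist, edges = np.histogram(exit_mu, bins=10, range=(0.0, 1.0))
```

Binning escaping photons by both zenith and azimuth (not done here) would give the full bidirectional distribution.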
Crop canopy BRDF simulation and analysis using Monte Carlo method
Huang, J.; Wu, B.; Tian, Y.; Zeng, Y.
2006-01-01
The authors design the random process between photons and the crop canopy. A Monte Carlo model has been developed to simulate the Bidirectional Reflectance Distribution Function (BRDF) of a crop canopy. Comparing the Monte Carlo model to the MCRM model, this paper analyzes the variations of different LAD and
Monte Carlo simulations for plasma physics
International Nuclear Information System (INIS)
Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X.
2000-07-01
Plasma behaviour is very complicated and generally difficult to analyse. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral beam injection heating (NBI heating), electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique, and good agreement can be obtained with experimental results. Recently, the Monte Carlo method has been developed to study fast-particle transport associated with heating and the generation of the radial electric field. Further, it is applied to investigating neoclassical transport in plasmas with steep density and temperature gradients, which is beyond conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)
Monte Carlo simulation of the turbulent transport of airborne contaminants
International Nuclear Information System (INIS)
Watson, C.W.; Barr, S.
1975-09-01
A generalized, three-dimensional Monte Carlo model and computer code (SPOOR) are described for simulating atmospheric transport and dispersal of small pollutant clouds. A cloud is represented by a large number of particles that we track by statistically sampling simulated wind and turbulence fields. These fields are based on generalized wind data for large-scale flow and turbulent energy spectra for the micro- and mesoscales. The large-scale field can be input from a climatological data base, or by means of real-time analyses, or from a separate, subjectively defined data base. We introduce the micro- and mesoscale wind fluctuations through a power spectral density, to include effects from a broad spectrum of turbulent-energy scales. The role of turbulence is simulated in both meander and dispersal. Complex flow fields and time-dependent diffusion rates are accounted for naturally, and shear effects are simulated automatically in the ensemble of particle trajectories. An important adjunct has been the development of computer-graphics displays. These include two- and three-dimensional (perspective) snapshots and color motion pictures of particle ensembles, plus running displays of differential and integral cloud characteristics. The model's versatility makes it a valuable atmospheric research tool that we can adapt easily into broader, multicomponent systems-analysis codes. Removal, transformation, dry or wet deposition, and resuspension of contaminant particles can be readily included
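The particle-ensemble picture described above can be reduced to a toy two-dimensional dispersal model: one mean wind plus Gaussian turbulent kicks, instead of SPOOR's climatological wind fields and turbulent-energy spectra. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative stand-in: a single mean wind plus white-noise turbulence,
# instead of SPOOR's data-based fields and power-spectral-density sampling.
n_particles, n_steps, dt = 1000, 100, 1.0
u_mean = np.array([5.0, 0.0])   # large-scale wind (m/s)
sigma_turb = 1.0                # turbulent velocity fluctuation scale (m/s)

pos = np.zeros((n_particles, 2))
for _ in range(n_steps):
    kicks = rng.normal(0.0, sigma_turb, (n_particles, 2))
    pos += (u_mean + kicks) * dt     # meander + dispersal in one step

center = pos.mean(axis=0)   # cloud centroid, advected by the mean wind
spread = pos.std(axis=0)    # cloud dispersal, growing like sqrt(t) here
```

With uncorrelated kicks the spread grows diffusively; sampling correlated fluctuations from a turbulent-energy spectrum, as the report describes, changes that growth law and introduces meander.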
Monte Carlo simulation of zinc protoporphyrin fluorescence in the retina
Chen, Xiaoyan; Lane, Stephen
2010-02-01
We have used Monte Carlo simulation of autofluorescence in the retina to determine that noninvasive detection of nutritional iron deficiency is possible. Nutritional iron deficiency (which leads to iron deficiency anemia) affects more than 2 billion people worldwide, and there is an urgent need for a simple, noninvasive diagnostic test. Zinc protoporphyrin (ZPP) is a fluorescent compound that accumulates in red blood cells and is used as a biomarker for nutritional iron deficiency. We developed a computational model of the eye, using parameters identified either by literature search or by direct experimental measurement, to test the possibility of detecting ZPP noninvasively in the retina. By incorporating fluorescence into Steven Jacques' original code for multi-layered tissue, we performed Monte Carlo simulation of fluorescence in the retina and determined that if the beam is not focused on a blood vessel in the neural retina layer, or if only part of the light hits the vessel, ZPP fluorescence will be 10-200 times higher than the background lipofuscin fluorescence coming from the retinal pigment epithelium (RPE) layer directly below. In addition, we found that if the light can be focused entirely onto a blood vessel in the neural retina layer, the fluorescence signal comes only from ZPP; in this case the layers below do not contribute to the signal. Therefore, the prospect of building a device to detect ZPP fluorescence in the retina looks very promising.
Study of Gamma spectra by Monte Carlo simulation
International Nuclear Information System (INIS)
Cantaragiu, A.; Gheorghies, A.; Borcia, C.
2008-01-01
The purpose of this paper is to obtain gamma-ray spectra from a scintillation detector by applying the Monte Carlo statistical simulation method using the EGS4 program. The Monte Carlo algorithm implies that the physical system is described by probability density functions, from which random numbers are generated, and the result is taken as an average of the observed values. The EGS4 program allows the simulation of the following physical processes: the photoelectric effect, the Compton effect, electron-positron pair production, and Rayleigh scattering. The gamma rays recorded by the detector are converted into electrical pulses, and the gamma-ray spectra are acquired and processed by means of the Nomad Plus portable spectrometer connected to a computer. As gamma-ray sources, 137Cs and 60Co are used, whose spectra are recorded and used to study the interaction of gamma radiation with the scintillation detector. The parameters varied during acquisition of the gamma-ray spectra are the source-detector distance and the measuring time. Due to the statistical processes in the detector, each peak resembles a Gaussian distribution. The gamma-quantum energy is identified from the peaks of the experimental spectra, gathering information about the position, width, and area of each peak. By means of the EGS4 program, a simulation is run using these parameters and an 'ideal' spectrum is obtained, i.e. a spectrum not influenced by the statistical processes that take place inside the detector. The convolution of this spectrum with a normalised Gaussian function is then performed. There is close agreement between the experimental results and those simulated with the EGS4 program because the interactions occurring during the simulation have a statistical behaviour close to the real one. (authors)
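The final step described above — convolving the 'ideal' simulated spectrum with a normalised Gaussian to mimic detector statistics — can be sketched as follows. The peak energies are the textbook 662 keV ¹³⁷Cs line and one of the ⁶⁰Co lines (1173 keV); the counts and resolution are invented.

```python
import numpy as np

# Toy "ideal" spectrum: delta-like photopeaks at the 137Cs line (662 keV)
# and one of the 60Co lines (1173 keV); counts and resolution are invented.
energy = np.arange(0, 1500)                 # 1 keV bins
ideal = np.zeros(energy.size)
ideal[662], ideal[1173] = 1000.0, 800.0

def broaden(spectrum, fwhm_kev):
    """Convolve with a normalised Gaussian to mimic detector statistics."""
    sigma = fwhm_kev / 2.355                # FWHM = 2.355 sigma for a Gaussian
    k = np.arange(-int(4 * sigma), int(4 * sigma) + 1)
    kernel = np.exp(-0.5 * (k / sigma) ** 2)
    kernel /= kernel.sum()                  # normalisation preserves total counts
    return np.convolve(spectrum, kernel, mode="same")

measured_like = broaden(ideal, fwhm_kev=40.0)
```

Because the kernel is normalised, the total count is preserved and each sharp peak becomes the Gaussian-shaped photopeak seen in the measured spectra.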
Direct Measurement of Power Dissipated by Monte Carlo Simulations on CPU and FPGA Platforms
DEFF Research Database (Denmark)
Albicocco, Pietro; Papini, Davide; Nannarelli, Alberto
In this technical report, we describe how power dissipation measurements on different computing platforms (a desktop computer and an FPGA board) are performed by using a Hall effect-based current sensor. The chosen application is a Monte Carlo simulation for European option pricing which is a popu...
Cost effective distributed computing for Monte Carlo radiation dosimetry
International Nuclear Information System (INIS)
Wise, K.N.; Webb, D.V.
2000-01-01
Full text: An inexpensive computing facility has been established for performing repetitive Monte Carlo simulations with the BEAM and EGS4/EGSnrc codes of linear accelerator beams, for calculating effective dose from diagnostic imaging procedures, and of the ion chambers and phantoms used for the Australian high-energy absorbed dose standards. The facility currently consists of three dual-processor 450 MHz PCs linked by a high-speed LAN. The three PCs can be accessed either locally, from a single keyboard/monitor/mouse combination using a SwitchView controller, or remotely via a computer network from PCs with suitable communications software (e.g. Telnet, Kermit, etc). All three PCs are identically configured with the Red Hat Linux 6.0 operating system. A Fortran compiler and the BEAM and EGS4/EGSnrc codes are available on all three PCs. The preparation of sequences of jobs utilising the Monte Carlo codes is simplified using load-distributing software (enFuzion 6.0, marketed by TurboLinux Inc, formerly Cluster from Active Tools), which efficiently distributes the computing load amongst all six processors. We describe three applications of the system: (a) energy spectra from radiotherapy sources, (b) mean mass-energy absorption coefficients and stopping powers for absolute absorbed dose standards, and (c) dosimetry for diagnostic procedures; (a) and (b) are based on the transport codes BEAM and FLURZnrc, while (c) is a Fortran/EGS code developed at ARPANSA. Efficiency gains ranged from 3 for (c) to close to the theoretical maximum of 6 for (a) and (b), with the gain depending on the amount of 'bookkeeping' needed to begin each task and the time taken to complete a single task. We have found the use of a load-balancing batch processing system with many PCs to be an economical way of achieving greater productivity for Monte Carlo calculations, or for any computer-intensive task requiring many runs with different parameters. Copyright (2000) Australasian College of Physical Scientists and
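Because independent Monte Carlo runs with different parameters need no communication beyond collecting results, job-farming of this kind is trivially parallel. A minimal stand-in for the enFuzion-style setup, using Python's standard multiprocessing pool (the π-estimation task, seeds, and counts are illustrative, not the facility's actual workload):

```python
import random
from multiprocessing import Pool

def mc_pi(args):
    """One independent Monte Carlo job: estimate pi from n dart throws."""
    seed, n = args
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    return 4.0 * hits / n

if __name__ == "__main__":
    # Farm out independent runs with different seeds, one per worker;
    # aggregation is a plain average because the runs are i.i.d.
    jobs = [(seed, 50_000) for seed in range(4)]
    with Pool(processes=2) as pool:
        estimates = pool.map(mc_pi, jobs)
    print(sum(estimates) / len(estimates))
```

The efficiency caveat in the abstract shows up here too: very short tasks are dominated by per-job startup 'bookkeeping', so the speedup approaches the processor count only when each task runs long enough.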
Hybrid Multilevel Monte Carlo Simulation of Stochastic Reaction Networks
Moraes, Alvaro
2015-01-07
Stochastic reaction networks (SRNs) are a class of continuous-time Markov chains intended to describe, from the kinetic point of view, the time evolution of chemical systems in which molecules of different chemical species undergo a finite set of reaction channels. This talk is based on articles [4, 5, 6], where we are interested in the following problem: given an SRN, X, defined through its set of reaction channels and its initial state, x0, estimate E(g(X(T))); that is, the expected value of a scalar observable, g, of the process, X, at a fixed time, T. This problem leads us to define a series of Monte Carlo estimators, M, that with high probability produce values close to the quantity of interest, E(g(X(T))). More specifically, given a user-selected tolerance, TOL, and a small confidence level, η, find an estimator, M, based on approximate sampled paths of X, such that P(|E(g(X(T))) − M| ≤ TOL) ≥ 1 − η; moreover, we want to achieve this objective with near-optimal computational work. We first introduce a hybrid path-simulation scheme based on the well-known stochastic simulation algorithm (SSA) [3] and the tau-leap method [2]. Then, we introduce a multilevel Monte Carlo strategy that allows us to achieve a computational complexity of order O(TOL⁻²); this is the same computational complexity as an exact method, but with a smaller constant. We provide numerical examples to show our results.
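The exact SSA building block [3] of the hybrid scheme is easy to sketch for a single decay channel. The reaction, rates, and plain-Monte-Carlo estimator below are illustrative; the talk's hybrid SSA/tau-leap path simulation and multilevel machinery are much more involved.

```python
import math
import random

def ssa_decay(x0, c, T, rng):
    """Exact SSA (Gillespie) path for the single channel A -> 0 (rate c*X),
    returning the state X(T)."""
    t, x = 0.0, x0
    while x > 0:
        t += rng.expovariate(c * x)  # exponential waiting time, rate = propensity
        if t > T:
            break
        x -= 1                       # fire the reaction
    return x

rng = random.Random(0)
# Crude single-level estimator M of E[g(X(T))] with g the identity;
# for pure decay the exact value is x0 * exp(-c*T).
samples = [ssa_decay(x0=100, c=1.0, T=1.0, rng=rng) for _ in range(2000)]
M = sum(samples) / len(samples)
```

A multilevel estimator replaces this single average by a telescoping sum over coupled coarse/fine tau-leap levels, which is what brings the work down to O(TOL⁻²) with a smaller constant.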
Simulation of quantum computers
De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB
2001-01-01
We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
Simulation of quantum computers
Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.
2000-01-01
We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software
The Monte Carlo simulation of the Ladon photon beam facility
International Nuclear Information System (INIS)
Strangio, C.
1976-01-01
The backward Compton scattering of laser light against high-energy electrons has been simulated with a Monte Carlo method. The main features of the produced photon beam are reported, together with a careful description of the numerical calculation.
Energy Technology Data Exchange (ETDEWEB)
Fong, G; Kapadia, A [Carl E Ravin Advanced Imaging Laboratories, Durham, North Carolina (United States)
2016-06-15
Purpose: To optimize collimation and shielding for a deuterium-deuterium (DD) neutron generator for an inexpensive and compact clinical neutron imaging system. The envisioned application is cancer diagnosis through Neutron Stimulated Emission Computed Tomography (NSECT). Methods: Collimator designs were tested with an isotropic 2.5 MeV neutron source through GEANT4 simulations. The collimator is a 52×52×52 cm³ polyethylene block coupled with a 1 cm lead sheet in sequence. A composite opening was modeled into the collimator to permit passage of neutrons. The opening varied in shape (cylindrical vs. tapered), size (1–5 cm source-side and target-side openings) and aperture placements (13–39 cm from source-side). Spatial and energy distribution of neutrons and gammas were tracked from each collimator design. Parameters analyzed were primary beam width (FWHM), divergence, and efficiency (percent transmission) for different configurations of the collimator. Select resultant outputs were then used for simulated NSECT imaging of a virtual breast phantom containing a 2.5 cm diameter tumor to assess the effect of the collimator on spatial resolution, noise, and scan time. Finally, a composite shielding enclosure made of polyethylene and lead was designed and evaluated to block 99.99% of neutron and gamma radiation generated in the system. Results: Analysis of the primary beam indicated the beam-width is linear in the aperture size. Increasing the source-side opening allowed at least 20% more neutron throughput for all designs relative to the cylindrical openings. Maximum throughput for all designs was 364% relative to cylindrical openings. Conclusion: The work indicates potential for collimating and shielding a DD neutron generator for use in a clinical NSECT system. The proposed collimator designs produced a well-defined collimated neutron beam that can be used to image samples of interest with millimeter resolution. Balance in output efficiency, noise reduction, and scan time
Energy Technology Data Exchange (ETDEWEB)
Richet, Y
2006-12-15
Criticality Monte Carlo calculations aim at estimating the effective multiplication factor (k-effective) of a fissile system through iterations simulating neutron propagation (forming a Markov chain). Arbitrary initialization of the neutron population can severely bias the k-effective estimate, defined as the mean of the k-effective values computed at each iteration. A simplified model of this cycle k-effective sequence is built, based on the characteristics of industrial criticality Monte Carlo calculations. Statistical tests, inspired by Brownian bridge properties, are designed to detect stationarity of the cycle k-effective sequence. The detected initial transient is then suppressed in order to improve the estimation of the system k-effective. The different versions of this methodology are detailed and compared, first on a set of numerical tests fitted to criticality Monte Carlo calculations, and second on real criticality calculations. Finally, the best methodologies observed in these tests are selected and allow industrial Monte Carlo criticality calculations to be improved. (author)
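The idea of truncating the initial transient can be illustrated with a deliberately crude stationarity check on a synthetic cycle-k-effective sequence. The two-sample mean comparison below is much weaker than the Brownian-bridge tests of the thesis, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic cycle-k-effective sequence: an exponential transient (from an
# arbitrary initial neutron source) decaying onto a stationary level, plus
# cycle-to-cycle noise. Values are illustrative only.
n = 400
cycles = np.arange(n)
k_cycle = 1.0 + 0.05 * np.exp(-cycles / 30.0) + rng.normal(0.0, 0.003, n)

def first_stationary_cycle(seq, max_drift_sigmas=2.0):
    """Smallest truncation point after which the two halves of the remaining
    sequence have statistically indistinguishable means (a far cruder
    criterion than a Brownian-bridge test)."""
    for start in range(0, len(seq) - 50):
        tail = seq[start:]
        a, b = np.array_split(tail, 2)
        se = np.hypot(a.std(ddof=1) / np.sqrt(len(a)),
                      b.std(ddof=1) / np.sqrt(len(b)))
        if abs(a.mean() - b.mean()) < max_drift_sigmas * se:
            return start
    return len(seq)

skip = first_stationary_cycle(k_cycle)
k_eff = k_cycle[skip:].mean()   # improved estimate: transient cycles discarded
```

Averaging the full sequence would pull the estimate above the true stationary value of 1.0; discarding the detected transient removes most of that bias.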
Parallel reservoir simulator computations
International Nuclear Information System (INIS)
Hemanth-Kumar, K.; Young, L.C.
1995-01-01
The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode and relative to scalar calculations it achieves speedups of 65 and 81 for black oil and EOS simulations, respectively on the CRAY C-90
Topics in computer simulations of statistical systems
International Nuclear Information System (INIS)
Salvador, R.S.
1987-01-01
Several computer simulations studying a variety of topics in statistical mechanics and lattice gauge theories are performed. The first study describes a Monte Carlo simulation performed on Ising systems defined on Sierpinski carpets of dimension between one and four. The critical coupling and the exponent γ are measured as a function of dimension. The Ising gauge theory in d = 4 − ε, for ε → 0⁺, is then studied by performing a Monte Carlo simulation for the theory defined on fractals. A high-statistics Monte Carlo simulation of the three-dimensional Ising model is presented for lattices of sizes 8³ to 44³. All the data obtained agree completely, within statistical errors, with the forms predicted by finite-size scaling. Finally, a method to estimate numerically the partition function of statistical systems is developed.
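The single-spin-flip Metropolis update underlying such Ising simulations is compact enough to sketch. This runs on a plain 2-D square lattice rather than a Sierpinski carpet, and the lattice size and temperature are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def metropolis_sweep(spins, beta, rng):
    """One Metropolis sweep of the 2-D Ising model with periodic boundaries."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, 2)
        # Energy change of flipping spin (i, j): dE = 2 J s_ij * sum(neighbours), J = 1.
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] = -spins[i, j]

def energy(spins):
    """Total energy for J = 1; each nearest-neighbour bond counted once."""
    return -(spins * np.roll(spins, 1, 0)).sum() - (spins * np.roll(spins, 1, 1)).sum()

L, beta = 16, 0.6     # beta above the critical value ~0.4407: ordered phase
spins = rng.choice([-1, 1], size=(L, L))
for _ in range(200):
    metropolis_sweep(spins, beta, rng)
```

On a fractal lattice the same acceptance rule applies; only the neighbour lookup changes, which is why the carpet's effective dimension can be probed with otherwise identical code.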
Monte Carlo simulation of neutron scattering instruments
International Nuclear Information System (INIS)
Seeger, P.A.
1995-01-01
A library of Monte Carlo subroutines has been developed for the purpose of design of neutron scattering instruments. Using small-angle scattering as an example, the philosophy and structure of the library are described and the programs are used to compare instruments at continuous wave (CW) and long-pulse spallation source (LPSS) neutron facilities. The Monte Carlo results give a count-rate gain of a factor between 2 and 4 using time-of-flight analysis. This is comparable to scaling arguments based on the ratio of wavelength bandwidth to resolution width
Simulation of transport equations with Monte Carlo
International Nuclear Information System (INIS)
Matthes, W.
1975-09-01
The main purpose of the report is to explain the relation between the transport equation and the Monte Carlo game used for its solution. The introduction of artificial particles carrying a weight provides high flexibility in constructing many different games for the solution of the same equation. This flexibility opens a way to construct a Monte Carlo game for the solution of the adjoint transport equation. Emphasis is placed mainly on giving a clear understanding of what to do, rather than on the details of how to carry out a specific game.
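The weighted-particle idea can be illustrated on a one-dimensional integral: sampling "paths" from a different density and attaching the weight f/q to each artificial particle leaves the estimator unbiased, which is exactly the freedom that lets many games solve the same equation. The integrand and sampling density here are invented for illustration.

```python
import math
import random

rng = random.Random(5)

def weighted_estimate(n):
    """Estimate I = integral of x*exp(-x) over (0, inf), which equals 1,
    with weighted particles: sample x from the heavier-tailed density
    q(x) = (1/2) exp(-x/2) and attach the weight w = f(x)/q(x) = 2x exp(-x/2)."""
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(0.5)               # draw a "path" from q
        total += 2.0 * x * math.exp(-0.5 * x)  # accumulate particle weights
    return total / n
```

Any q that is positive wherever f is gives an unbiased game; the choice of q only changes the variance, which is the handle used to build the adjoint and other alternative games.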
International Nuclear Information System (INIS)
Oliveira, Monica G. Nunes; Braz, Delson; Silva, Regina Cely B. da S.
2005-01-01
Computer simulation has been widely used in physics research in recent decades, owing both to the availability of codes and to the growth of computing power. The EGS4 Monte Carlo code is a simulation program used in the area of radiation transport. Phantoms (tissue surrogates) are objects used to study dosimetric quantities and for quality testing of images; they have scattering and absorption characteristics similar to the tissues that make up the body. The aim of this work is to reproduce the effects of radiation interactions in real healthy and diseased breast tissue and in phantoms using the EGS4 Monte Carlo simulation code.
Treatment planning for a small animal using Monte Carlo simulation
International Nuclear Information System (INIS)
Chow, James C. L.; Leung, Michael K. K.
2007-01-01
The development of a small animal model for radiotherapy research requires a complete setup of customized imaging equipment, irradiators, and planning software that matches the sizes of the subjects. The purpose of this study is to develop and demonstrate the use of a flexible in-house research environment for treatment planning on small animals. The software package, called DOSCTP, provides a user-friendly platform for DICOM computed tomography-based Monte Carlo dose calculation using the EGSnrcMP-based DOSXYZnrc code. Validation of the treatment planning was performed by comparing the dose distributions for simple photon beam geometries calculated through the Pinnacle3 treatment planning system and measurements. A treatment plan for a mouse based on a CT image set and a 360-deg photon arc is demonstrated. It is shown that it is possible to create 3D conformal treatment plans for small animals, with consideration of inhomogeneities, using small photon beam field sizes in the diameter range of 0.5-5 cm, with the conformal dose covering the target volume while sparing the surrounding critical tissue. It is also found that Monte Carlo simulation is suitable for carrying out treatment planning dose calculations for small-animal anatomy, with voxel sizes about one order of magnitude smaller than those used for humans.
Energy Technology Data Exchange (ETDEWEB)
Moskvin, Vadim [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)]. E-mail: vmoskvin@iupui.edu; DesRosiers, Colleen; Papiez, Lech; Timmerman, Robert; Randall, Marcus; DesRosiers, Paul [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)
2002-06-21
The Monte Carlo code PENELOPE has been used to simulate the photon flux from the Leksell Gamma Knife, a precision method for treating intracranial lesions. Radiation from a single ⁶⁰Co assembly traversing the collimator system was simulated, and phase-space distributions at the output surface of the helmet for photons and electrons were calculated. The characteristics describing the emitted final beam were used to build a two-stage Monte Carlo simulation of irradiation of a target. A dose field inside a standard spherical polystyrene phantom, usually used for Gamma Knife dosimetry, has been computed and compared with experimental results, with calculations performed by other authors using the EGS4 Monte Carlo code, and with data provided by the treatment planning system Gamma Plan. Good agreement was found between these data and the results of simulations in homogeneous media. Owing to this established accuracy, PENELOPE is suitable for simulating problems relevant to stereotactic radiosurgery. (author)
A study on the shielding element using Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Kim, Ki Jeong [Dept. of Radiology, Konkuk University Medical Center, Seoul (Korea, Republic of); Shim, Jae Goo [Dept. of Radiologic Technology, Daegu Health College, Daegu (Korea, Republic of)
2017-06-15
In this research, we used Monte Carlo simulation to evaluate the elemental shielding ability of candidate materials for a medical radiation shielding sheet that could replace existing lead. In selecting the elements, we considered mainly elements with large atomic numbers, which are known to have high shielding performance; since various composite materials have recently improved shielding performance, we also took weight reduction, processability, activity, etc. into consideration, and 21 elements were selected. The shielding performance of each element was simulated using the Monte Carlo method. As a result, tungsten and gold showed the highest shielding ratios, at 98.82% and 98.44%, respectively.
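A shielding-ratio figure of merit of this kind can be illustrated with a toy narrow-beam Monte Carlo: sample each photon's free path from the exponential attenuation law and count the fraction stopped inside the slab. The attenuation coefficient and thickness below are placeholders, not the paper's data for tungsten or gold.

```python
import math
import random

rng = random.Random(6)

def shielding_ratio(mu_per_cm, thickness_cm, n=100_000):
    """Percent of narrow-beam photons attenuated in a slab: each photon's
    free path is drawn from the exponential law with coefficient mu."""
    stopped = sum(rng.expovariate(mu_per_cm) < thickness_cm for _ in range(n))
    return 100.0 * stopped / n

ratio = shielding_ratio(mu_per_cm=3.0, thickness_cm=1.0)
```

For this narrow-beam toy the Monte Carlo result converges to the analytic value 100·(1 − e^(−μt)) ≈ 95.02%; full simulations like the paper's also track scattered and secondary particles, which the analytic formula ignores.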
Monte Carlo simulated dynamical magnetization of single-chain magnets
Energy Technology Data Exchange (ETDEWEB)
Li, Jun; Liu, Bang-Gui, E-mail: bgliu@iphy.ac.cn
2015-03-15
Here, a dynamical Monte Carlo (DMC) method is used to study the temperature-dependent dynamical magnetization of the well-known Mn₂Ni system as a typical example of single-chain magnets with strong magnetic anisotropy. Simulated magnetization curves are in good agreement with experimental results at typical temperatures and sweeping rates, and simulated coercive fields as functions of temperature are also consistent with experimental curves. Further analysis indicates that the magnetization reversal is determined by both thermally activated effects and quantum spin tunneling. These results can help explore basic properties and applications of such important magnetic systems. - Highlights: • Monte Carlo simulated magnetization curves are in good agreement with experimental results. • Simulated coercive fields as functions of temperature are consistent with experimental results. • The magnetization reversal is understood in terms of the Monte Carlo simulations.
From Monte Carlo to Quantum Computation
Heinrich, Stefan
2001-01-01
Quantum computing has so far been mainly concerned with discrete problems. Recently, E. Novak and the author studied quantum algorithms for high-dimensional integration and dealt with the question of which advantages quantum computing can bring over classical deterministic or randomized methods for this type of problem. In this paper we give a short introduction to the basic ideas of quantum computing and survey recent results on high-dimensional integration. We discuss connections to the Monte Carl...
Monte Carlo simulation of gas Cerenkov detectors
International Nuclear Information System (INIS)
Mack, J.M.; Jain, M.; Jordan, T.M.
1984-01-01
Theoretical study of selected gamma-ray and electron diagnostics necessitates coupling Cerenkov radiation to electron/photon cascades. A Cerenkov production model and its incorporation into a general-geometry Monte Carlo coupled electron/photon transport code are discussed. A special optical photon ray-trace is implemented using bulk optical properties assigned to each Monte Carlo zone. Good agreement exists between experimental and calculated Cerenkov data in the case of a carbon dioxide gas Cerenkov detector experiment. Cerenkov production and threshold data are presented for a typical carbon dioxide gas detector that converts a 16.7 MeV photon source to Cerenkov light, which is collected by optics and detected by a photomultiplier
Optimizing the HLT Buffer Strategy with Monte Carlo Simulations
AUTHOR|(CDS)2266763
2017-01-01
This project aims to optimize the strategy for utilizing the disk buffer of the High Level Trigger (HLT) of the LHCb experiment with the help of Monte Carlo simulations. A method is developed which simulates the Event Filter Farm (EFF) -- the computing cluster for the High Level Trigger -- as a collection of nodes with different performance properties. In this way, the behavior of the computing farm can be analyzed at a deeper level than before. It is demonstrated that the current operating strategy might be improved when data taking is reaching a mid-year scheduled stop or the year-end technical stop. The processing time of the buffered data can be lowered by distributing the detector data according to the processing power of the nodes instead of the relative disk size, as long as the occupancy level of the buffer is low enough. Moreover, this ensures that data taken and stored on the buffer at the same time is processed by different nodes nearly simultaneously, which reduces load on the infrastructure.
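The claimed improvement — distributing buffered data in proportion to processing power rather than disk size — can be checked with a back-of-the-envelope model. The node counts and speeds below are invented, not the real EFF mix.

```python
# Hypothetical node mix; the numbers are illustrative, not the real EFF.
power = [1.0, 1.0, 2.0, 4.0]   # relative processing speed per node
disk = [1.0, 1.0, 1.0, 1.0]    # relative disk-buffer size per node
data_total = 800.0             # amount of buffered detector data

def completion_time(shares):
    """All nodes start together; the farm finishes when the slowest node does."""
    total = sum(shares)
    return max(data_total * s / total / p for s, p in zip(shares, power))

t_by_disk = completion_time(disk)    # current strategy: share ~ disk size
t_by_power = completion_time(power)  # proposed: share ~ processing power
```

In this toy mix the power-proportional split halves the drain time (100 vs 200 time units), because every node finishes simultaneously instead of the farm waiting on its slowest members.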
Image based Monte Carlo modeling for computational phantom
International Nuclear Information System (INIS)
Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.
2013-01-01
Full text of the publication follows. Evaluating the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps avoid unnecessary radiation and reduce harm to the human body. Accurate dose evaluation requires a realistic computational phantom. However, manual description and verification of models for Monte Carlo (MC) simulation are tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can automatically convert CT/segmented sectioned images into computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested on several medical and sectioned image sets and has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created with MCAM 6.0 from sectioned images of a Chinese visible-human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose were calculated for Rad-HUMAN. Rad-HUMAN can be used to predict and evaluate dose distributions in a Treatment Planning System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)
Zaidi, H
1999-01-01
The many applications of Monte Carlo modelling in nuclear medicine imaging make it desirable to increase the accuracy and computational speed of Monte Carlo codes. The accuracy of Monte Carlo simulations strongly depends on the accuracy of the probability functions and thus on the cross-section libraries used for photon transport calculations. A comparison between different photon cross-section libraries and parametrizations implemented in Monte Carlo simulation packages developed for positron emission tomography and the most recent Evaluated Photon Data Library (EPDL97), developed by the Lawrence Livermore National Laboratory, was performed for several human tissues and common detector materials at energies from 1 keV to 1 MeV. The different photon cross-section libraries and parametrizations show quite large variations compared to the EPDL97 coefficients. The latter library is more accurate and was carefully designed in the form of look-up tables providing efficient data storage, access, and management. Toge...
Direct Measurement of Power Dissipated by Monte Carlo Simulations on CPU and FPGA Platforms
Albicocco, Pietro; Papini, Davide; Nannarelli, Alberto
2012-01-01
In this technical report, we describe how power dissipation measurements on different computing platforms (a desktop computer and an FPGA board) are performed using a Hall effect-based current sensor. The chosen application is a Monte Carlo simulation for European option pricing, a popular algorithm in financial computations. The Hall effect probe measurements complement the measurements performed on the core of the FPGA by a built-in Xilinx power monitoring system.
LCG MCDB - a Knowledgebase of Monte Carlo Simulated Events
Belov, S; Galkin, E; Gusev, A; Pokorski, Witold; Sherstnev, A V
2008-01-01
In this paper we report on the LCG Monte Carlo Data Base (MCDB) and the software developed to operate it. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC collaborations by experts. In many cases, modern Monte Carlo simulation of physical processes requires expert knowledge of Monte Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase intended mainly to accumulate simulated events of this type. The main motivation behind LCG MCDB is to make sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.
Stabilization effect of fission source in coupled Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Olsen, Borge; Dufek, Jan [Div. of Nuclear Reactor Technology, KTH Royal Institute of Technology, AlbaNova University Center, Stockholm (Sweden)
2017-08-15
A fission source can act as a stabilization element in coupled Monte Carlo simulations. We have observed this while studying numerical instabilities in nonlinear steady-state simulations performed by a Monte Carlo criticality solver that is coupled to a xenon feedback solver via fixed-point iteration. While fixed-point iteration is known to be numerically unstable for some problems, resulting in large spatial oscillations of the neutron flux distribution, we show that it is possible to stabilize it by reducing the number of Monte Carlo criticality cycles simulated within each iteration step. While global convergence is ensured, development of any possible numerical instability is prevented by not allowing the fission source to converge fully within a single iteration step, which is achieved by setting a small number of criticality cycles per iteration step. Moreover, under these conditions, the fission source may converge even faster than in criticality calculations with no feedback, as we demonstrate in our numerical test simulations.
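The stabilization mechanism has a simple scalar analogue (this is our own illustrative toy, not the authors' coupled solver): limiting how far the fission source converges within one iteration step behaves like under-relaxing a fixed-point map whose full iteration oscillates and diverges.

```python
# Illustrative analogy only: a scalar fixed-point map x -> g(x) with
# feedback slope |g'| > 1 at the fixed point diverges under plain
# fixed-point iteration. Not letting the "source" converge fully within
# one step acts like the damped update x <- (1-a)*x + a*g(x), which
# restores convergence for a small enough step a.

def g(x):
    # toy flux/xenon feedback map: fixed point x* = 1, slope -1.5 there
    return 1.0 - 1.5 * (x - 1.0)

def iterate(x, alpha, n):
    for _ in range(n):
        x = (1.0 - alpha) * x + alpha * g(x)
    return x

print(iterate(1.1, alpha=1.0, n=10))   # full update each step: oscillates away from 1
print(iterate(1.1, alpha=0.5, n=200))  # damped update: settles at x* = 1
```

The damped map has error factor 1 - alpha + alpha * (-1.5) = -0.25 per step for alpha = 0.5, hence geometric convergence, mirroring how fewer criticality cycles per iteration step suppress the spatial oscillations described above.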
Scouting the feasibility of Monte Carlo reactor dynamics simulations
International Nuclear Information System (INIS)
Legrady, David; Hoogenboom, J. Eduard
2008-01-01
In this paper we present an overview of the methodological questions related to Monte Carlo simulation of time-dependent power transients in nuclear reactors. Using a small fictional 3D reactor with isotropic scattering and a single energy group, we have performed direct Monte Carlo transient calculations with simulation of delayed neutrons, both with and without thermal feedback. Using biased delayed-neutron sampling and population control at time-step boundaries, calculation times were kept reasonably low. We have identified initial source determination and the prompt-chain simulations as the key issues requiring the most attention. (authors)
Scouting the feasibility of Monte Carlo reactor dynamics simulations
Energy Technology Data Exchange (ETDEWEB)
Legrady, David [Forschungszentrum Dresden-Rossendorf, Dresden (Germany); Hoogenboom, J. Eduard [Delft University of Technology, Delft (Netherlands)
2008-07-01
In this paper we present an overview of the methodological questions related to Monte Carlo simulation of time-dependent power transients in nuclear reactors. Using a small fictional 3D reactor with isotropic scattering and a single energy group, we have performed direct Monte Carlo transient calculations with simulation of delayed neutrons, both with and without thermal feedback. Using biased delayed-neutron sampling and population control at time-step boundaries, calculation times were kept reasonably low. We have identified initial source determination and the prompt-chain simulations as the key issues requiring the most attention. (authors)
Monte Carlo Simulations of Phosphate Polyhedron Connectivity in Glasses
Energy Technology Data Exchange (ETDEWEB)
ALAM,TODD M.
1999-12-21
Monte Carlo simulations of phosphate tetrahedron connectivity distributions in alkali and alkaline earth phosphate glasses are reported. By utilizing a discrete bond model, the distribution of next-nearest neighbor connectivities between phosphate polyhedron for random, alternating and clustering bonding scenarios was evaluated as a function of the relative bond energy difference. The simulated distributions are compared to experimentally observed connectivities reported for solid-state two-dimensional exchange and double-quantum NMR experiments of phosphate glasses. These Monte Carlo simulations demonstrate that the polyhedron connectivity is best described by a random distribution in lithium phosphate and calcium phosphate glasses.
Monte Carlo simulation of virtual compton scattering at MAMI
International Nuclear Information System (INIS)
D'Hose, N.; Ducret, J.E.; Gousset, TH.; Guichon, P.A.M.; Kerhoas, S.; Lhuillier, D.; Marchand, C.; Marchand, D.; Martino, J.; Mougey, J.; Roche, J.; Vanderhaeghen, M.; Vernin, P.; Bohm, H.; Distler, M.; Edelhoff, R.; Friedrich, J.M.; Geiges, R.; Jennewein, P.; Kahrau, M.; Korn, M.; Kramer, H.; Krygier, K.W.; Kunde, V.; Liesenfeld, A.; Merkel, H.; Merle, K.; Neuhausen, R.; Pospischil, TH.; Rosner, G.; Sauer, P.; Schmieden, H.; Schardt, S.; Tamas, G.; Wagner, A.; Walcher, TH.; Wolf, S.; Hyde-Wright, CH.; Boeglin, W.U.; Van de Wiele, J.
1996-01-01
The Monte Carlo simulation developed specifically for the VCS experiments taking place at MAMI is fully described. The simulation can generate events according to the Bethe-Heitler + Born cross-section behaviour and takes into account resolution-deteriorating effects. It is used to determine solid angles for the various experimental settings. (authors)
Particle-transport simulation with the Monte Carlo method
International Nuclear Information System (INIS)
Carter, L.L.; Cashwell, E.D.
1975-01-01
Attention is focused on the application of the Monte Carlo method to particle transport problems, with emphasis on neutron and photon transport. Topics covered include sampling methods, mathematical prescriptions for simulating particle transport, mechanics of simulating particle transport, neutron transport, and photon transport. A literature survey of 204 references is included. (GMT)
Massive Parallelism of Monte-Carlo Simulation on Low-End Hardware using Graphic Processing Units
Energy Technology Data Exchange (ETDEWEB)
Mburu, Joe Mwangi; Hah, Chang Joo Hah [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)
2014-05-15
Within the past decade, research has been done on exploiting GPU massive parallelism in core simulation with impressive results, but unfortunately little commercial application has followed in the nuclear field, especially in reactor core simulation. The purpose of this paper is to introduce the topic and illustrate the potential of exploiting the massively parallel nature of GPU computing on a simple Monte Carlo simulation with very minimal hardware. For a comparative analysis, a simple two-dimensional Monte Carlo simulation is implemented for both the CPU and the GPU in order to evaluate the performance gain of each computing device. The heterogeneous platform used in this analysis is a slow notebook with only a 1 GHz processor. The end results are quite surprising: the speedups obtained approach a factor of 10. In this work, we have applied heterogeneous GPU-based computing to a potentially arithmetic-intensive calculation. By running a Monte Carlo simulation of one million neutrons on the GPU platform, we have sped up the computation by almost a factor of 10. This shows how easy, cheap and efficient it is to use GPUs to accelerate scientific computing, and the results should encourage further exploration of this avenue, especially in nuclear reactor physics simulation, where both deterministic and stochastic calculations lend themselves well to parallelization.
Massive Parallelism of Monte-Carlo Simulation on Low-End Hardware using Graphic Processing Units
International Nuclear Information System (INIS)
Mburu, Joe Mwangi; Hah, Chang Joo Hah
2014-01-01
Within the past decade, research has been done on exploiting GPU massive parallelism in core simulation with impressive results, but unfortunately little commercial application has followed in the nuclear field, especially in reactor core simulation. The purpose of this paper is to introduce the topic and illustrate the potential of exploiting the massively parallel nature of GPU computing on a simple Monte Carlo simulation with very minimal hardware. For a comparative analysis, a simple two-dimensional Monte Carlo simulation is implemented for both the CPU and the GPU in order to evaluate the performance gain of each computing device. The heterogeneous platform used in this analysis is a slow notebook with only a 1 GHz processor. The end results are quite surprising: the speedups obtained approach a factor of 10. In this work, we have applied heterogeneous GPU-based computing to a potentially arithmetic-intensive calculation. By running a Monte Carlo simulation of one million neutrons on the GPU platform, we have sped up the computation by almost a factor of 10. This shows how easy, cheap and efficient it is to use GPUs to accelerate scientific computing, and the results should encourage further exploration of this avenue, especially in nuclear reactor physics simulation, where both deterministic and stochastic calculations lend themselves well to parallelization.
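The data-parallel pattern these records describe, many independent particle histories advanced in lockstep, can be sketched in vectorized form (NumPy arrays standing in for GPU threads; the slab geometry, cross sections and absorption probability below are invented, not the paper's model):

```python
# Minimal data-parallel sketch: every "neutron" in a 1D slab performs the
# same update simultaneously, which is exactly the lockstep pattern that
# maps well onto GPU hardware. All physics here is schematic.
import numpy as np

def simulate(n_particles, sigma_t=1.0, slab=5.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)            # depth into the slab (mean free paths)
    alive = np.ones(n_particles, bool)
    transmitted = np.zeros(n_particles, bool)
    while alive.any():
        # exponential free flight and isotropic direction cosine, all at once
        step = rng.exponential(1.0 / sigma_t, n_particles)
        mu = rng.uniform(-1.0, 1.0, n_particles)
        x = np.where(alive, x + mu * step, x)
        transmitted |= alive & (x > slab)
        # toy 50% absorption probability at each collision inside the slab
        absorbed = alive & (x >= 0) & (x <= slab) & (rng.random(n_particles) < 0.5)
        alive &= ~transmitted & (x >= 0) & ~absorbed
    return transmitted.mean()

print(simulate(100_000))  # estimated transmission probability
```

On an actual GPU the same kernel would run one thread per particle; the vectorized NumPy form makes the absence of inter-particle dependencies explicit.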
Direct Monte Carlo simulation of nanoscale mixed gas bearings
Directory of Open Access Journals (Sweden)
Kyaw Sett Myo
2015-06-01
Sealed hard drives filled with a helium gas mixture have recently been proposed as an alternative to current hard drives for achieving higher reliability and lower position error. It is therefore important to understand the effects of different helium gas mixtures on the slider bearing characteristics in the head–disk interface. In this article, helium/air and helium/argon gas mixtures are applied as the working fluids and their effects on the bearing characteristics are studied using the direct simulation Monte Carlo (DSMC) method. From the DSMC simulations, physical properties of these gas mixtures such as mean free path and dynamic viscosity are obtained and compared with those from theoretical models; the two are found to be comparable. Using these gas-mixture properties, the bearing pressure distributions are calculated for different helium fractions with conventional molecular gas lubrication (MGL) models. The outcomes reveal that the MGL results agree relatively well with the DSMC simulations, especially for pure air, helium, or argon. For gas mixtures, the bearing pressures predicted by the MGL model are slightly larger than those from the DSMC simulation.
Monte Carlo simulation of VHTR particle fuel with chord length sampling
International Nuclear Information System (INIS)
Ji, W.; Martin, W. R.
2007-01-01
The Very High Temperature Gas-Cooled Reactor (VHTR) poses a problem for neutronic analysis due to the double heterogeneity posed by the particle fuel and either the fuel compacts, in the case of the prismatic block reactor, or the fuel pebbles, in the case of the pebble bed reactor. Direct Monte Carlo simulation has been used in recent years to analyze these VHTR configurations but is computationally challenged when space-dependent phenomena such as depletion or temperature feedback are considered. As an alternative approach, we have considered chord length sampling to reduce the computational burden of the Monte Carlo simulation. We have improved on an existing method called 'limited chord length sampling' and have used it to analyze stochastic media representative of either pebble bed or prismatic VHTR fuel geometries. Based on the assumption that the PDF has an exponential form, a theoretical chord length distribution is derived and shown to be an excellent model for a wide range of packing fractions. This chord length PDF was then used to analyze a stochastic medium constructed using the RSA (Random Sequential Addition) algorithm, and the results were compared to a benchmark Monte Carlo simulation of the actual stochastic geometry. The results are promising and suggest that the theoretical chord length PDF can be used instead of a full Monte Carlo random walk simulation in the stochastic medium, saving orders of magnitude in computational time (and memory) to perform the simulation. (authors)
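The core idea of chord length sampling can be illustrated in a few lines (the kernel radius and packing fraction below are hypothetical, and the mean-chord formula used is the standard result for randomly dispersed spheres, not taken from this paper):

```python
# Chord-length-sampling sketch: instead of storing and searching explicit
# fuel-kernel geometry, the distance travelled in the matrix before entering
# a kernel is sampled directly from an exponential PDF. For randomly packed
# spherical kernels of radius r at packing fraction f, the mean matrix chord
# is commonly modelled as 4*r*(1-f)/(3*f).
import math
import random

def matrix_chord_mean(r, f):
    return 4.0 * r * (1.0 - f) / (3.0 * f)

def sample_matrix_chord(r, f, rng):
    # inverse-CDF sampling of the exponential chord-length PDF
    return -matrix_chord_mean(r, f) * math.log(1.0 - rng.random())

rng = random.Random(1)
r, f = 0.025, 0.3   # cm kernel radius, 30% packing fraction (hypothetical)
samples = [sample_matrix_chord(r, f, rng) for _ in range(200_000)]
print(sum(samples) / len(samples))  # approaches 4r(1-f)/(3f)
```

Each flight in the stochastic medium then alternates between a sampled matrix chord and a sampled kernel chord, with no geometry lookup, which is where the orders-of-magnitude savings come from.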
Energy Technology Data Exchange (ETDEWEB)
Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and ICTEAM Institute, Université catholique de Louvain, Louvain-la-Neuve 1348 (Belgium); Sterpin, Edmond [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and Department of Oncology, Katholieke Universiteit Leuven, O&N I Herestraat 49, 3000 Leuven (Belgium)
2016-04-15
Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suited to MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables accurate MC calculation within a computation time adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
International Nuclear Information System (INIS)
Souris, Kevin; Lee, John Aldo; Sterpin, Edmond
2016-01-01
Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suited to MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables accurate MC calculation within a computation time adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
Souris, Kevin; Lee, John Aldo; Sterpin, Edmond
2016-04-01
Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suited to MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with GATE/GEANT4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables accurate MC calculation within a computation time adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
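The hard/soft split used by class-II condensed-history codes can be sketched as follows (toy stopping power, mean free path, and energy-loss spectrum; none of these numbers come from MCsquare):

```python
# Schematic class-II condensed-history step: energy losses above a
# threshold T are sampled as discrete "hard" ionizations, while all
# sub-threshold ("soft") losses are lumped into a continuous restricted
# stopping power applied along the step. All constants here are toys.
import math
import random

def step(E, s, T=0.1, rng=random.random):
    """Advance a particle of energy E (MeV) by path length s (cm)."""
    # continuous soft loss: toy restricted stopping power of 2 MeV/cm
    E -= 2.0 * s
    # discrete hard ionization: toy mean free path of 1 cm,
    # energy-loss spectrum proportional to 1/eps^2 on [T, eps_max]
    if rng() < 1.0 - math.exp(-s / 1.0):
        eps_max = max(E / 2.0, T)
        u = rng()
        eps = T * eps_max / (eps_max - u * (eps_max - T))  # inverse CDF of 1/eps^2
        E -= eps
    return max(E, 0.0)

random.seed(3)
E = 200.0
for _ in range(5):
    E = step(E, 0.5)
print(E)  # energy after five condensed-history steps
```

The threshold T controls the trade-off the abstract describes: raising it moves more losses into the cheap continuous term at the cost of per-event detail.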
Monte Carlo simulation of continuous-space crystal growth
International Nuclear Information System (INIS)
Dodson, B.W.; Taylor, P.A.
1986-01-01
We describe a method, based on Monte Carlo techniques, of simulating the atomic growth of crystals without the discrete lattice space assumed by conventional Monte Carlo growth simulations. Since no lattice space is assumed, problems involving epitaxial growth, heteroepitaxy, phonon-driven mechanisms, surface reconstruction, and many other phenomena incompatible with the lattice-space approximation can be studied. Also, use of the Monte Carlo method circumvents to some extent the extreme limitations on simulated timescale inherent in crystal-growth techniques based on molecular dynamics. The implementation of the new method is illustrated by studying the growth of strained-layer superlattice (SLS) interfaces in two-dimensional Lennard-Jones atomic systems. Despite the extreme simplicity of such systems, the qualitative features of SLS growth seen here are similar to those observed experimentally in real semiconductor systems.
Fast Monte Carlo-simulator with full collimator and detector response modelling for SPECT
International Nuclear Information System (INIS)
Sohlberg, A.O.; Kajaste, M.T.
2012-01-01
Monte Carlo (MC) simulations have proved to be a valuable tool in studying single photon emission computed tomography (SPECT) reconstruction algorithms. Despite their popularity, the use of MC simulations is still often limited by their large computational demand. This is especially true when full collimator and detector modelling with septal penetration, scatter and X-ray fluorescence must be included. This paper presents a rapid and simple MC simulator which can effectively reduce computation times. The simulator is built on the convolution-based forced-detection principle, which can markedly lower the number of simulated photons. Full collimator and detector response look-up tables are pre-simulated and later used in the actual MC simulations to model the system response. The developed simulator was validated by comparing it against 123I point-source measurements made with a clinical gamma camera system and against 99mTc software-phantom simulations made with the SIMIND MC package. The results showed good agreement between the new simulator, the measurements and the SIMIND package. The new simulator provided near noise-free projection data in approximately 1.5 min per projection with 99mTc, less than one-tenth of SIMIND's time. The developed MC simulator can markedly decrease simulation time without sacrificing image quality. (author)
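The convolution-based forced-detection principle can be illustrated with a minimal sketch (the PSF table below is an invented Gaussian model, not SIMIND's or the authors' pre-simulated tables):

```python
# Minimal forced-detection sketch: instead of ray-tracing every photon
# through the collimator, each emission depth's activity is convolved with
# a pre-tabulated, depth-dependent point-spread function (PSF), so far
# fewer photon histories are needed for a noise-free projection.
import numpy as np

def project(activity, psf_table):
    """activity: 2D array (depth, lateral); psf_table: one 1D PSF per depth."""
    proj = np.zeros(activity.shape[1])
    for d, row in enumerate(activity):
        proj += np.convolve(row, psf_table[d], mode="same")
    return proj

depths, width = 4, 32
activity = np.zeros((depths, width))
activity[2, 16] = 1.0  # a single point source at depth 2
# toy Gaussian PSFs that broaden with distance from the detector
xs = np.arange(-5, 6)
psf_table = [np.exp(-xs**2 / (2 * (0.5 + d) ** 2)) for d in range(depths)]
psf_table = [k / k.sum() for k in psf_table]

proj = project(activity, psf_table)
print(proj)
```

Because the PSFs are normalized, counts are conserved, and the projected point source appears as a blur centred on its lateral position, exactly the system-response behaviour the look-up tables encode.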
Monte Carlo Simulation of Callisto's Exosphere
Vorburger, Audrey; Wurz, Peter; Galli, André; Mousis, Olivier; Barabash, Stas; Lammer, Helmut
2014-05-01
Whereas Callisto's surface was mapped as early as 1980 by the two Voyager missions, Callisto's tenuous atmosphere, also called an exosphere, was not directly observed until the Galileo mission in 1999. The Galileo Near-Infrared Mapping Spectrometer detected a CO2 signal up to 100 km above the surface [Carlson, Science, 1999]. Radio occultation measurements, also conducted by Galileo, led to the detection of an ionosphere with inferred densities much higher than can be explained by the measured CO2 exosphere [Kliore et al., J. Geophys. Res., 2002]. Insight into Callisto's exosphere is expected to be boosted by the Neutral Ion Mass Spectrometer (NIM) of the Particle Environment Package (PEP) on board the planned JUpiter ICy moons Explorer (JUICE) mission, which will conduct the first-ever direct sampling of the exospheres of Europa, Ganymede, and Callisto. To ensure that NIM's mass resolution and mass range will be sufficient to detect most expected species in Callisto's exosphere, we model this exosphere ab initio. Since Callisto is thought to consist of roughly equal parts of icy and rocky components [Showman and Malhotra, Science, 1999], we model particle release from an icy and from a mineral surface separately. For the ice component, we investigate two different compositions, for reducing and oxidising conditions, which find analogy in the initial gas-phase conditions in the solar nebula [Mousis et al., Planet. Space Sci., submitted]. For the non-ice material, the mineral surface, we investigate surfaces with compositions similar to CI chondrites and L/LL-type chondrites, both of which have been suggested to best represent Callisto's non-ice material [Kuskov and Kronrod, Icarus, 2005; Moore et al., Cambridge University Press, 2004]. For all mentioned materials, we compute density profiles for particles released by either surface sublimation or ion-induced sputtering up to an altitude of 100,000 km. Our results show that close
Combinatorial geometry domain decomposition strategies for Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Li, G.; Zhang, B.; Deng, L.; Mo, Z.; Liu, Z.; Shangguan, D.; Ma, Y.; Li, S.; Hu, Z. [Institute of Applied Physics and Computational Mathematics, Beijing, 100094 (China)
2013-07-01
Analysis and modeling of nuclear reactors can overload the memory of a single-core processor when refined modeling is required. One method to solve this problem is domain decomposition. In the current work, domain decomposition algorithms for a combinatorial geometry Monte Carlo transport code are developed on JCOGIN (J Combinatorial Geometry Monte Carlo transport INfrastructure). Tree-based decomposition and asynchronous communication of particle information between domains are described in the paper. The combination of domain decomposition and domain replication (particle parallelism) is demonstrated and compared with that of the MERCURY code. A full-core reactor model is simulated to verify the domain decomposition algorithms using the Monte Carlo particle transport code JMCT (J Monte Carlo Transport Code), which is being developed on the JCOGIN infrastructure. In addition, the influence of the domain decomposition algorithms on tally variances is discussed. (authors)
Combinatorial geometry domain decomposition strategies for Monte Carlo simulations
International Nuclear Information System (INIS)
Li, G.; Zhang, B.; Deng, L.; Mo, Z.; Liu, Z.; Shangguan, D.; Ma, Y.; Li, S.; Hu, Z.
2013-01-01
Analysis and modeling of nuclear reactors can overload the memory of a single-core processor when refined modeling is required. One method to solve this problem is domain decomposition. In the current work, domain decomposition algorithms for a combinatorial geometry Monte Carlo transport code are developed on JCOGIN (J Combinatorial Geometry Monte Carlo transport INfrastructure). Tree-based decomposition and asynchronous communication of particle information between domains are described in the paper. The combination of domain decomposition and domain replication (particle parallelism) is demonstrated and compared with that of the MERCURY code. A full-core reactor model is simulated to verify the domain decomposition algorithms using the Monte Carlo particle transport code JMCT (J Monte Carlo Transport Code), which is being developed on the JCOGIN infrastructure. In addition, the influence of the domain decomposition algorithms on tally variances is discussed. (authors)
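A stripped-down 1D picture of domain decomposition (our own toy, unrelated to JCOGIN/JMCT internals): each domain tracks only the particles inside its own slab, and particles crossing a boundary are banked and handed off to the neighbouring domain, so no single process ever needs the whole geometry in memory.

```python
# Toy 1D domain decomposition: geometry [0, 10) is split into two domains,
# each "rank" random-walks only its own particles, and escapees are banked
# and routed to whichever domain now contains them. Absorption probability
# and step size are invented.
import random

def track_in_domain(x, lo, hi, rng):
    """Random-walk a particle until it is absorbed or leaves [lo, hi)."""
    while lo <= x < hi:
        if rng.random() < 0.1:
            return None              # absorbed inside this domain
        x += rng.choice((-0.5, 0.5))
    return x                          # escaped: position to hand off

rng = random.Random(0)
domains = [(0.0, 5.0), (5.0, 10.0)]
banks = [[2.5] * 1000, []]            # all particles start in domain 0
handoffs = 0
while any(banks):
    for i, (lo, hi) in enumerate(domains):
        outgoing = []
        for x in banks[i]:
            y = track_in_domain(x, lo, hi, rng)
            if y is not None and 0.0 <= y < 10.0:   # crossed an interior boundary
                outgoing.append(y)
                handoffs += 1
        banks[i] = []
        for y in outgoing:            # route banked particles to their new domain
            banks[0 if y < 5.0 else 1].append(y)
print("boundary handoffs:", handoffs)
```

In a real code the hand-off is an (ideally asynchronous) message between ranks; the tally variance question raised above arises because each domain only ever sees part of each particle's history.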
Parallelization of a Monte Carlo particle transport simulation code
Hadjidoukas, P.; Bousis, C.; Emfietzoglou, D.
2010-05-01
We have developed a high-performance version of the Monte Carlo particle transport simulation code MC4. The original application, developed in Visual Basic for Applications (VBA) for Microsoft Excel, was first rewritten in the C programming language to improve code portability. Several pseudo-random number generators have also been integrated and studied. The new MC4 version was then parallelized for shared- and distributed-memory multiprocessor systems using the Message Passing Interface. Two parallel pseudo-random number generator libraries (SPRNG and DCMT) have been seamlessly integrated. The performance speedup of parallel MC4 has been studied on a variety of parallel computing architectures, including an Intel Xeon server with 4 dual-core processors, a Sun cluster consisting of 16 nodes with 2 dual-core AMD Opteron processors each, and a 200-node dual-processor HP cluster. For large problem sizes, limited only by the physical memory of the multiprocessor server, the speedup results are almost linear on all systems. We have validated the parallel implementation against the serial VBA and C implementations using the same random number generator. Our experimental results on the transport and energy loss of electrons in a water medium show that the serial and parallel codes are equivalent in accuracy. These improvements allow the study of higher particle energies with more accurate physical models, and improve statistics, since more particle tracks can be simulated in a short response time.
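The key ingredient of parallel Monte Carlo that libraries such as SPRNG and DCMT provide, independent per-worker random number streams, can be sketched with NumPy's stream-spawning API (standing in here for those libraries; the estimated quantity is a toy):

```python
# Independent per-worker RNG streams: each worker receives its own child of
# one SeedSequence, so the substreams do not overlap and the combined result
# is reproducible regardless of how many workers share the total sample.
import numpy as np

def partial_estimate(seed_seq, n):
    rng = np.random.default_rng(seed_seq)
    # toy kernel: Monte Carlo estimate of a mean free path 1/sigma, sigma = 2
    return rng.exponential(0.5, n).sum(), n

root = np.random.SeedSequence(12345)
children = root.spawn(4)                       # one stream per worker
sums, counts = zip(*(partial_estimate(s, 250_000) for s in children))
est = sum(sums) / sum(counts)
print(est)                                      # close to 0.5 by construction
```

In an MPI code each rank would hold one child sequence and the partial sums would be combined with a reduction; the statistical result is the same as the sequential loop shown here.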
Diagrammatic Monte Carlo simulations of staggered fermions at finite coupling
Vairinhos, Helvio
2016-01-01
Diagrammatic Monte Carlo has been a very fruitful tool for taming, and in some cases even solving, the sign problem in several lattice models. We have recently proposed a diagrammatic model for simulating lattice gauge theories with staggered fermions at arbitrary coupling, which extends earlier successful efforts to simulate lattice QCD at finite baryon density in the strong-coupling regime. Here we present the first numerical simulations of our model, using worm algorithms.
Computer Modeling and Simulation
Energy Technology Data Exchange (ETDEWEB)
Pronskikh, V. S. [Fermilab
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed when the model fails. I argue that in order to disentangle verification and validation, a clear distinction must be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.
International Nuclear Information System (INIS)
Rasmussen, H.
1992-01-01
Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)
Data decomposition of Monte Carlo particle transport simulations via tally servers
International Nuclear Information System (INIS)
Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit; Smith, Kord
2013-01-01
An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations
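The split between tracking processors and tally servers can be caricatured in a few lines (a serial toy with invented bins and scores, not OpenMC's implementation): trackers never hold the full tally array; they buffer (bin, score) events and ship them in messages to a server that owns the memory.

```python
import random
from collections import defaultdict

class TallyServer:
    """Owns the tally memory; trackers only hold small message buffers."""
    def __init__(self):
        self.total = defaultdict(float)
        self.total_sq = defaultdict(float)    # accumulated for variance estimates

    def receive(self, batch):
        for bin_id, score in batch:
            self.total[bin_id] += score
            self.total_sq[bin_id] += score * score

def tracker(rank, n_particles, server, buffer_size=64):
    """Score random (bin, value) events and ship them in buffered messages."""
    rng = random.Random(rank)
    batch = []
    for _ in range(n_particles):
        bin_id = rng.randrange(10)            # cell/energy bin being scored
        batch.append((bin_id, rng.random()))
        if len(batch) >= buffer_size:
            server.receive(batch)             # one message per full buffer
            batch = []
    server.receive(batch)                     # flush the remainder

server = TallyServer()
for rank in range(4):                         # stands in for tracking ranks
    tracker(rank, 1000, server)
print(sum(server.total.values()))             # total score from 4000 histories
```

The buffering is the key performance knob: larger buffers mean fewer, bigger messages to the server, which is what makes the overhead small on real networks.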
Accelerator simulation using computers
International Nuclear Information System (INIS)
Lee, M.; Zambre, Y.; Corbett, W.
1992-01-01
Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a ''multi-track'' simulation and analysis code can be used for these applications
Advanced computers and simulation
International Nuclear Information System (INIS)
Ryne, R.D.
1993-01-01
Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980s, desktop workstations performed fewer than one million floating point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price that is measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPP) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we will discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators
Yoo, Do Hyeon; Shin, Wook-Geun; Lee, Jaekook; Yeom, Yeon Soo; Kim, Chan Hyeong; Chang, Byung-Uck; Min, Chul Hee
2017-11-01
After the Fukushima accident in Japan, the Korean Government implemented the "Act on Protective Action Guidelines Against Radiation in the Natural Environment" to regulate unnecessary radiation exposure to the public. However, despite the law, which came into effect in July 2012, an appropriate method to evaluate the equivalent and effective doses from naturally occurring radioactive material (NORM) in consumer products is not available. The aim of the present study is to develop and validate an effective dose coefficient database enabling the simple and correct evaluation of the effective dose due to the usage of NORM-added consumer products. To construct the database, we used a skin source method with a computational human phantom and Monte Carlo (MC) simulation. For the validation, the effective dose obtained from the database with an interpolation method was compared with that from the original MC method. Our results showed similar equivalent doses across the 26 organs and agreement between the database and the original MC calculations with sufficient accuracy. Copyright © 2017 Elsevier Ltd. All rights reserved.
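The database lookup that replaces a fresh MC run per product can be sketched as linear interpolation in a coefficient table (the energies and coefficient values below are invented placeholders, not the study's data):

```python
# Hypothetical effective-dose coefficients, tabulated at a few source
# energies (MeV -> coefficient in arbitrary units); values are invented.
TABLE = [(0.05, 1.2e-4), (0.1, 2.9e-4), (0.5, 1.6e-3), (1.0, 3.1e-3)]

def coefficient(energy_mev):
    """Linear interpolation in the tabulated coefficients, clamped at the ends."""
    if energy_mev <= TABLE[0][0]:
        return TABLE[0][1]
    for (e0, c0), (e1, c1) in zip(TABLE, TABLE[1:]):
        if e0 <= energy_mev <= e1:
            t = (energy_mev - e0) / (e1 - e0)
            return c0 + t * (c1 - c0)
    return TABLE[-1][1]          # above the table: clamp to the last entry

print(coefficient(0.075))        # midway between the first two table entries
```

Validating such a database then amounts to checking that the interpolated value stays within the statistical uncertainty of a direct MC calculation at intermediate energies.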
Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method
2002-01-01
This report condenses basic theories and advanced applications of neutron/gamma ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and cross section libraries used in continuous energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, design of ITER, experiment analyses of the fast critical assembly, core analyses of JMTR, simulation of a pulsed neutron experiment, core analyses of HTTR, duct streaming calculations, bulk shielding calculations, and neutron/gamma ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes, and parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.
Fast code for Monte Carlo simulations
International Nuclear Information System (INIS)
Oliveira, P.M.C. de; Penna, T.J.P.
1988-01-01
A computer code to generate the dynamic evolution of the Ising model on a square lattice, following the Metropolis algorithm, is presented. The computer time consumption is reduced by a factor of 8 when our code is compared with traditional multiple-spin codes. The memory allocation size is also reduced by a factor of 4. The code is easily generalizable to other lattices and models. (author)
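For contrast with the paper's optimized multiple-spin coding, a minimal (unoptimized) Metropolis update for the square-lattice Ising model might look like:

```python
import math
import random

def metropolis(L=16, beta=0.4, sweeps=100, seed=0):
    """Metropolis sweeps for the 2-D Ising model; returns magnetization/spin."""
    rng = random.Random(seed)
    spin = [[1] * L for _ in range(L)]               # ordered start
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                # sum of the four nearest neighbours (periodic boundaries)
                nn = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
                      + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
                dE = 2 * spin[i][j] * nn             # cost of flipping (i, j)
                if dE <= 0 or rng.random() < math.exp(-beta * dE):
                    spin[i][j] = -spin[i][j]         # accept the flip
    return sum(map(sum, spin)) / L ** 2

print(metropolis(L=8, beta=1.0, sweeps=50))   # ordered phase: close to 1
```

Multiple-spin coding packs many spins into the bits of one machine word and updates them with bitwise operations, which is where the factor-of-8 speedup quoted above comes from.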
Parallel Monte Carlo simulation of aerosol dynamics
Zhou, K.; He, Z.; Xiao, M.; Zhang, Z.
2014-01-01
is simulated with a stochastic method (Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts in the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI
Juste, B; Miro, R; Gallardo, S; Santos, A; Verdu, G
2006-01-01
The present work has simulated the photon and electron transport in a Theratron 780 (MDS Nordion) (60)Co radiotherapy unit, using the Monte Carlo transport code, MCNP (Monte Carlo N-Particle), version 5. In order to become computationally more efficient in view of taking part in the practical field of radiotherapy treatment planning, this work is focused mainly on the analysis of dose results and on the required computing time of different tallies applied in the model to speed up calculations.
Profit Forecast Model Using Monte Carlo Simulation in Excel
Directory of Open Access Journals (Sweden)
Petru BALOGH
2014-01-01
Profit forecast is very important for any company. The purpose of this study is to provide a method to estimate the profit and the probability of obtaining the expected profit. Monte Carlo methods are stochastic techniques, meaning they are based on the use of random numbers and probability statistics to investigate problems. Monte Carlo simulation furnishes the decision-maker with a range of possible outcomes and the probabilities with which they will occur for any choice of action. Our example of Monte Carlo simulation in Excel is a simplified profit forecast model, and each step of the analysis is described in detail. The input data for the case presented (the number of leads per month, the percentage of leads that result in sales, the cost of a single lead, the profit per sale, and the fixed cost) allow obtaining the profit and the associated probabilities of achieving it.
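The same kind of model translates directly from Excel to a few lines of code. All input figures below are invented for illustration, not the article's data:

```python
import random

def simulate_profit(n_trials=10000, seed=42):
    """Monte Carlo profit forecast; returns (mean profit, P(profit > 0))."""
    rng = random.Random(seed)
    profits = []
    for _ in range(n_trials):
        leads = rng.gauss(1200, 100)             # leads per month
        close_rate = rng.uniform(0.02, 0.05)     # share of leads that buy
        profit_per_sale = 220.0
        cost_per_lead = 4.0
        fixed_cost = 1500.0
        profits.append(leads * close_rate * profit_per_sale
                       - leads * cost_per_lead - fixed_cost)
    mean = sum(profits) / n_trials
    p_positive = sum(p > 0 for p in profits) / n_trials
    return mean, p_positive

mean, p = simulate_profit()
print(f"expected profit {mean:.0f}, P(profit > 0) {p:.2f}")
```

Repeating the draw many times turns the point estimate of the spreadsheet into a distribution, from which the probability of hitting any profit target can be read off.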
Computer Simulation of Reading.
Leton, Donald A.
In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…
Monte Carlo-based simulation of dynamic jaws tomotherapy
Energy Technology Data Exchange (ETDEWEB)
Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S. [Department of Molecular Imaging, Radiotherapy and Oncology, Universite Catholique de Louvain, 54 Avenue Hippocrate, 1200 Brussels, Belgium and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States); 21 Century Oncology., 1240 D' onofrio, Madison, Wisconsin 53719 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); Department of Radiotherapy and Oncology, Universite Catholique de Louvain, St-Luc University Hospital, 10 Avenue Hippocrate, 1200 Brussels (Belgium)
2011-09-15
Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the ''cheese'' phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed (''running start stop,'' RSS) and symmetric jaws-variable couch speed (''symmetric running start stop,'' SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For the RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is greater than 30% of the prescription dose (gamma analysis).
Monte Carlo simulation for theoretical calculations of damage and sputtering processes
International Nuclear Information System (INIS)
Yamamura, Yasunori
1984-01-01
The radiation damage accompanying ion irradiation and the various problems caused by it should in principle be determined by solving Boltzmann's equations. In reality, however, those for a semi-infinite system cannot generally be solved. Moreover, the effects of crystals, oblique incidence and so on make the situation more difficult. The analysis of the complicated collision phenomena in solids and of the accompanying problems of radiation damage and sputtering is possible in most cases only by computer simulation. At present, the methods of simulating atomic collision phenomena in solids are roughly classified into the molecular dynamics method and the Monte Carlo method. In molecular dynamics, Newton's equations are numerically integrated time-dependently as they are; this has the large merit that many-body effects and nonlinear effects can be taken into consideration, but much computing time is required. The features and problems of Monte Carlo simulation and nonlinear Monte Carlo simulation are described. The Monte Carlo simulation codes based on the two-body collision approximation, MARLOWE, TRIM and ACAT, were compared through the calculation of the backscattering spectra of light ions. (Kako, I.)
BRAND program complex for neutron-physical experiment simulation by the Monte-Carlo method
International Nuclear Information System (INIS)
Androsenko, A.A.; Androsenko, P.A.
1984-01-01
The possibilities of the BRAND program complex for simulating neutron and γ-radiation transport by the Monte-Carlo method are briefly described. The complex includes the following modules: a geometry module, a source module, a detector module, and modules for simulating the particle's direction of motion after an interaction and its free path. The complex is written in the FORTRAN language and runs on the BESM-6 computer
Monte Carlo simulations of ionization potential depression in dense plasmas
Czech Academy of Sciences Publication Activity Database
Stránský, Michal
2016-01-01
Roč. 23, č. 1 (2016), 1-5, č. článku 012708. ISSN 1070-664X R&D Projects: GA MŠk LG15013 Institutional support: RVO:68378271 Keywords : Monte Carlo methods * aluminium * plasma temperature * computer modeling * ionization Subject RIV: BL - Plasma and Gas Discharge Physics Impact factor: 2.115, year: 2016
Energy Technology Data Exchange (ETDEWEB)
Kim, Ho Chul; Lee, Young Jin [Dept. of Radiological Science, Eulji University, Seongnam (Korea, Republic of); Kim, Hee Joung; Kim, Kyuseok; Lee, Min Hee [Yonsei University, Wonju (Korea, Republic of)
2017-06-15
To avoid imaging artifacts and interpretation mistakes, improving the uniformity of gamma camera systems is very important. Excellent uniformity can be expected from a cadmium zinc telluride (CZT) photon counting detector (PCD) because of the direct conversion of the gamma-ray energy into electrons. In addition, uniformity performance measures such as integral uniformity (IU), differential uniformity (DU), scatter fraction (SF), and contrast-to-noise ratio (CNR) vary according to the energy window setting. In this study, we compared a PCD and a conventional scintillation detector with respect to the energy windows (5%, 10%, 15%, and 20%) using a {sup 99m}Tc gamma source with a Geant4 Application for Tomographic Emission simulation tool. The gamma camera systems used in this work are a CZT PCD and a NaI(Tl) conventional scintillation detector with a 1-mm thickness. According to the results, although the IU and DU results improved with the energy window, the SF and CNR results deteriorated with the energy window. In particular, the uniformity of the PCD was higher than that of the conventional scintillation detector in all cases. In conclusion, our results demonstrated that the uniformity of the CZT PCD was higher than that of the conventional scintillation detector.
Sensitivity analysis for oblique incidence reflectometry using Monte Carlo simulations
DEFF Research Database (Denmark)
Kamran, Faisal; Andersen, Peter E.
2015-01-01
profiles. This article presents a sensitivity analysis of the technique in turbid media. Monte Carlo simulations are used to investigate the technique and its potential to distinguish the small changes between different levels of scattering. We present various regions of the dynamic range of optical...
GEANT Monte Carlo simulations for the GREAT spectrometer
International Nuclear Information System (INIS)
Andreyev, A.N.; Butler, P.A.; Page, R.D.; Appelbe, D.E.; Jones, G.D.; Joss, D.T.; Herzberg, R.-D.; Regan, P.H.; Simpson, J.; Wadsworth, R.
2004-01-01
GEANT Monte Carlo simulations for the recently developed GREAT spectrometer are presented. Some novel applications of the spectrometer for γ-ray, conversion-electron and β-decay spectroscopy are discussed. The conversion-electron spectroscopy of heavy nuclei with strongly converted transitions and the extension of the recoil decay tagging method to β-decaying nuclei are considered in detail
Flexible polymers in a nematic medium : a Monte Carlo simulation
Vliet, J.H. van; Luyten, M.C.; Brinke, G. ten
Monte Carlo simulations of self-avoiding random walks surrounded by aligned rods on a square lattice and a simple cubic lattice were performed to address the topological constraints involved for dilute solutions of flexible polymers in a highly oriented nematic solvent. The nematic constraint
Monte Carlo simulations of adsorption-induced segregation
DEFF Research Database (Denmark)
Christoffersen, Ebbe; Stoltze, Per; Nørskov, Jens Kehlet
2002-01-01
Through the use of Monte Carlo simulations we study the effect of adsorption-induced segregation. From the bulk composition, degree of dispersion and the partial pressure of the gas phase species we calculate the surface composition of bimetallic alloys. We show that both segregation and adsorption...
Monte Carlo simulation models of breeding-population advancement.
J.N. King; G.R. Johnson
1993-01-01
Five generations of population improvement were modeled using Monte Carlo simulations. The model was designed to address questions that are important to the development of an advanced generation breeding population. Specifically we addressed the effects on both gain and effective population size of different mating schemes when creating a recombinant population for...
Nakamura, Mitsuhiro; Ishihara, Yoshitomo; Matsuo, Yukinori; Iizuka, Yusuke; Ueki, Nami; Iramina, Hiraku; Hirashima, Hideaki; Mizowaki, Takashi
2018-03-01
Knowledge of the imaging doses delivered to patients and accurate dosimetry of the radiation to organs from various imaging procedures are becoming increasingly important for clinicians. The purposes of this study were to calculate the imaging doses delivered to the organs of lung cancer patients during real-time tumor tracking (RTTT) and during three-dimensional (3D) and four-dimensional (4D) cone-beam computed tomography (CBCT), using Monte Carlo techniques to simulate the kV X-ray dose distributions delivered by the Vero4DRT. Imaging doses from RTTT, 3D-CBCT and 4D-CBCT were calculated on the planning CT images of nine lung cancer patients who underwent stereotactic body radiotherapy (SBRT) with RTTT. For RTTT, imaging doses from correlation modeling and from monitoring imaging during beam delivery were calculated. For CBCT, doses from 3D-CBCT and 4D-CBCT were simulated. The doses covering 2-cc volumes (D2cc) in correlation modeling were up to 9.3 cGy for soft tissues and 48.4 cGy for bone. The combined values from correlation modeling and monitoring were up to 11.0 cGy for soft tissues and 59.8 cGy for bone. Within RTTT, the imaging doses from correlation modeling were the larger contribution. On a single 4D-CBCT, the skin and bone D2cc values were in the ranges of 7.4-10.5 cGy and 33.5-58.1 cGy, respectively. The D2cc from 4D-CBCT was approximately double that from 3D-CBCT. Clinicians should be aware that the imaging dose increases the cumulative doses to organs.
Directory of Open Access Journals (Sweden)
Vincenza Di Stefano
2009-11-01
The Multicomb variance reduction technique has been introduced into the Direct Monte Carlo Simulation of submicrometric semiconductor devices. The method has been implemented for bulk silicon. The simulations show that the statistical variance of hot electrons is reduced at some computational cost. The method is efficient and easy to implement in existing device simulators.
International Nuclear Information System (INIS)
Lee, Seung-Wan; Choi, Yu-Na; Cho, Hyo-Min; Lee, Young-Jin; Ryu, Hyun-Ju; Kim, Hee-Joung
2011-01-01
Conventional X-ray systems and X-ray computed tomography (CT) systems, which use detectors operated in the integrating mode, are not able to reflect spectral information because the detector output is proportional to the energy fluence integrated over the whole spectrum. Photon-counting detectors have been considered as alternative devices. These detectors can measure the photon energy deposited by each event and improve the image quality. In this study, we investigated the feasibility of K-edge imaging using a photon-counting detector and evaluated the capability of material decomposition in X-ray images. The geometries of X-ray imaging systems equipped with cadmium telluride (CdTe) detectors and phantoms consisting of different materials were designed using Geant4 Application for Tomographic Emission (GATE) version 6.0. To observe the effect of a discontinuity in the attenuation due to the K-edge of a high atomic number material, we chose the energy windows to be one below and one above the K-edge absorption energy of the target material. The contrast-to-noise ratios (CNRs) of the target materials were increased at selective energy levels above the K-edge absorption energy because the attenuation is more dramatically increased at energies above the K-edge absorption energy of the material than at energies below that. The CNRs for the target materials in the K-edge image were proportional to the material concentration. The results of this study show that K-edge imaging can be carried out in conventional X-ray systems and X-ray CT systems using CdTe photon-counting detectors and that the target materials can be separated from background materials by using K-edge imaging. The photon-counting detector has potential to provide improved image quality, and this study will be used as a basis for future studies on photon-counting X-ray imaging.
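The CNR figure of merit used to score the windowed images is straightforward to compute (the pixel values here are synthetic stand-ins for counts in an energy window, not GATE output):

```python
import random
import statistics

def cnr(target_pixels, background_pixels):
    """Contrast-to-noise ratio of a target ROI against a background ROI."""
    contrast = abs(statistics.mean(target_pixels)
                   - statistics.mean(background_pixels))
    return contrast / statistics.stdev(background_pixels)

rng = random.Random(0)
background = [rng.gauss(100, 5) for _ in range(500)]   # counts below the K-edge
target = [rng.gauss(120, 5) for _ in range(500)]       # enhanced above the K-edge
print(cnr(target, background))   # about 4 for these synthetic ROIs
```

In a K-edge study, the same ROI would be evaluated in the energy windows below and above the K-edge absorption energy, and the jump in CNR identifies the target material.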
Monte Carlo simulations to replace film dosimetry in IMRT verification
International Nuclear Information System (INIS)
Goetzfried, Thomas; Trautwein, Marius; Koelbi, Oliver; Bogner, Ludwig; Rickhey, Mark
2011-01-01
Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil beam based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference) with dose deviations up to 10%. Corresponding values were significantly reduced to 0.34 ± 0.09 for MC dose calculation. The total time needed for both verification procedures is comparable; however, the MC-based procedure is far less labor intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to increasingly replace film dosimetry in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended at least in the IMRT introduction phase. (orig.)
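The gamma criterion underlying the gamma maps above can be sketched in one dimension (a simplified, globally normalized variant; clinical tools search a continuous neighbourhood rather than only the sample points):

```python
import math

def gamma_index(ref, evl, spacing_mm, dd=0.03, dta_mm=3.0):
    """1-D gamma index: 3% global dose difference, 3 mm distance-to-agreement."""
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evl):
            dist = (i - j) * spacing_mm
            ddiff = (de - dr) / d_max              # global normalization
            best = min(best, math.sqrt((dist / dta_mm) ** 2
                                       + (ddiff / dd) ** 2))
        gammas.append(best)
    return gammas

ref = [0.1, 0.5, 1.0, 0.5, 0.1]                    # reference dose profile
evl = [0.1, 0.52, 0.99, 0.48, 0.1]                 # evaluated (e.g. measured)
print(max(gamma_index(ref, evl, spacing_mm=1.0)))  # <= 1 means all points pass
```

A point passes when either the dose agrees within 3% or a point with matching dose lies within 3 mm; the quoted mean gamma values summarize this map over the whole film.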
Monte Carlo simulation of tomography techniques using the platform Gate
International Nuclear Information System (INIS)
Barbouchi, Asma
2007-01-01
Simulations play a key role in functional imaging, with applications ranging from scanner design to scatter correction and protocol optimisation. GATE (Geant4 Application for Tomographic Emission) is a platform for Monte Carlo simulation. It is based on Geant4 to generate and track particles, and to model geometry and physics processes. Explicit modelling of time includes detector motion, time of flight, and tracer kinetics. Interfaces to voxellised models and image reconstruction packages improve the integration of GATE into the global modelling cycle. In this work, Monte Carlo simulations are used to understand and optimise the gamma camera's performance. We study the effects of the distance between the source and the collimator, the diameter of the holes, and the thickness of the collimator on the spatial resolution, energy resolution and efficiency of the gamma camera. We also study the reduction of simulation time and implement a model of the left ventricle in GATE. (Author). 7 refs
Monte Carlo simulations of plutonium gamma-ray spectra
International Nuclear Information System (INIS)
Koenig, Z.M.; Carlson, J.B.; Wang, Tzu-Fang; Ruhter, W.D.
1993-01-01
Monte Carlo calculations were investigated as a means of simulating the gamma-ray spectra of Pu. These simulated spectra will be used to develop and evaluate gamma-ray analysis techniques for various nondestructive measurements. Simulated spectra of calculational standards can be used for code intercomparisons, to understand systematic biases, and to estimate minimum detection levels of existing and proposed nondestructive analysis instruments. The capability to simulate gamma-ray spectra from HPGe detectors could significantly reduce the costs of preparing large numbers of real reference materials. MCNP was used for the Monte Carlo transport of the photons. Results from the MCNP calculations were folded in with a detector response function to produce a realistic spectrum. Plutonium spectrum peaks were produced with Lorentzian shapes for the x-rays and with Gaussian distributions. The MGA code determined the Pu isotopes and specific power of this calculated spectrum and compared it to a similar analysis of a measured spectrum
Monte Carlo simulation for dual head gamma camera
International Nuclear Information System (INIS)
Osman, Yousif Bashir Soliman
2015-12-01
Monte Carlo (MC) simulation techniques are widely used in medical physics applications. In nuclear medicine, MC has been used to design new medical imaging devices such as positron emission tomography (PET), gamma cameras, and single photon emission computed tomography (SPECT). It can also be used to study the factors affecting image quality and internal dosimetry. GATE is one of the Monte Carlo codes that have a number of advantages for the simulation of SPECT and PET. Access to the machines used in clinics is limited because of their workload, which makes it hard to evaluate some factors affecting machine performance that must be evaluated routinely. Also because of the difficulties of carrying out scientific research and training students on clinical machines, an MC model can be an optimal solution to the problem. The aim of this study was to use the GATE Monte Carlo code to model the Nucline Spirit (Mediso) dual-head gamma camera hosted in the Radiation and Isotopes Centre of Khartoum, which is equipped with low energy general purpose (LEGP) collimators. The model was used to evaluate spatial resolution and sensitivity, which are important factors affecting image quality, and to demonstrate the validity of GATE by comparing experimental results with simulation results for spatial resolution. The GATE model of the Nucline Spirit dual-head gamma camera was developed by applying manufacturer specifications, and the simulation was then run. In the evaluation of spatial resolution, the FWHM was calculated from the image profile of a line source of the gamma emitter Tc-99m (energy 140 keV) at distances of 5, 10, 15, 20, 22, 27, 32 and 37 cm from the modeled camera head; for these distances the spatial resolution was found to be 5.76, 7.73, 10.7, 13.8, 14.01, 16.91, 19.75 and 21.9 mm, respectively. These results show a linear degradation of spatial resolution with increasing distance between the object (line source) and the collimator. The FWHM calculated at 10 cm was compared with experimental results. The
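The FWHM extraction from a line-source profile can be sketched as follows (a synthetic Gaussian profile stands in for the simulated image profile; positions in mm):

```python
import math

def fwhm(xs, ys):
    """Full width at half maximum via linear interpolation at the crossings."""
    half = max(ys) / 2.0
    crossings = []
    for i in range(len(ys) - 1):
        if (ys[i] - half) * (ys[i + 1] - half) < 0:     # sign change
            t = (half - ys[i]) / (ys[i + 1] - ys[i])    # interpolate crossing
            crossings.append(xs[i] + t * (xs[i + 1] - xs[i]))
    return crossings[-1] - crossings[0]

sigma = 3.0                                   # mm; true FWHM = 2.3548 * sigma
xs = [0.1 * k - 15.0 for k in range(301)]
ys = [math.exp(-x * x / (2 * sigma ** 2)) for x in xs]
print(fwhm(xs, ys))   # about 7.06 mm
```

Applying this to the profile at each source-to-collimator distance yields the resolution-versus-distance curve reported in the abstract.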
SELF-ABSORPTION CORRECTIONS BASED ON MONTE CARLO SIMULATIONS
Directory of Open Access Journals (Sweden)
Kamila Johnová
2016-12-01
Full Text Available The main aim of this article is to demonstrate how Monte Carlo simulations are implemented in our gamma spectrometry laboratory at the Department of Dosimetry and Application of Ionizing Radiation in order to calculate the self-absorption within the samples. A model of a real HPGe detector created for MCNP simulations is presented in this paper. All of the parameters which may influence the self-absorption are first discussed theoretically and later described using the calculated results.
Monte-Carlo simulation of heavy-ion collisions
International Nuclear Information System (INIS)
Schenke, Bjoern; Jeon, Sangyong; Gale, Charles
2011-01-01
We present Monte-Carlo simulations for heavy-ion collisions combining PYTHIA and the McGill-AMY formalism to describe the evolution of hard partons in a soft background, modelled using hydrodynamic simulations. The resulting generator, MARTINI, produces full event configurations in the high-p_T region that take into account thermal QCD and QED effects as well as effects of the evolving medium. In this way it is possible to perform detailed quantitative comparisons with experimental observables.
Euclidean Monte Carlo simulation of nuclear interactions
International Nuclear Information System (INIS)
Montvay, Istvan; Bonn Univ.; Urbach, Carsten
2011-05-01
We present an exploratory study of chiral effective theories of nuclei with methods adopted from lattice quantum chromodynamics (QCD). We show that the simulations in the Euclidean path integral approach are feasible and that we can determine the energy of the two nucleon state. By varying the parameters and the simulated volumes phase shifts can be determined in principle and hopefully tuned to their physical values in the future. The physical cut-off of the theory is realised by blocking of the lattice fields. By keeping this physical cut-off fixed in physical units the lattice cut-off can be changed and in this way the lattice artefacts can be eliminated. (orig.)
Performance of three-photon PET imaging: Monte Carlo simulations
International Nuclear Information System (INIS)
Kacperski, Krzysztof; Spyrou, Nicholas M
2005-01-01
We have recently introduced the idea of making use of three-photon positron annihilations in positron emission tomography. In this paper, the basic characteristics of three-gamma imaging in PET are studied by means of Monte Carlo simulations and analytical computations. Two typical configurations of human and small animal scanners are considered. Three-photon imaging requires high-energy-resolution detectors. Parameters currently attainable by CdZnTe semiconductor detectors, the technology of choice for the future development of radiation imaging, are assumed. Spatial resolution is calculated as a function of detector energy resolution and size, position in the field of view, scanner size and the energies of the three-gamma annihilation photons. Possible ways to improve the spatial resolution obtained for nominal parameters, 1.5 cm and 3.2 mm FWHM for human and small animal scanners, respectively, are indicated. Counting rates of true and random three-photon events for typical human and small animal scanning configurations are assessed. A simple formula for the minimum size of lesions detectable in the three-gamma-based images is derived. Depending on the contrast and total number of registered counts, lesions of a few mm in size for human scanners and sub-mm for small animal scanners can be detected.
Monte Carlo simulation of AB-copolymers with saturating bonds
Chertovich, A V; Khokhlov, A R; Bohr, J
2003-01-01
Structural transitions in a single AB-copolymer chain, where saturating bonds can be formed between A- and B-units, are studied by means of Monte Carlo computer simulations using the bond fluctuation model. Three transitions are found, coil-globule, coil-hairpin and globule-hairpin, depending on the nature of the particular AB-sequence: a statistical random sequence, a diblock sequence and a 'random-complementary' sequence (one half of such an AB-sequence is random with Bernoulli statistics while the other half is complementary to the first one). The properties of random-complementary sequences are closer to those of diblock sequences than to the properties of random sequences. The model (although quite rough) is expected to represent some basic features of real RNA molecules, i.e. the formation of the secondary structure of RNA due to hydrogen bonding of corresponding bases and stacking interactions of the base pairs in helixes. We introduce the notion of RNA-like copolymers and discuss in what sense the sequences studie...
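The 'random-complementary' construction described in this abstract can be sketched in a few lines. The complement rule (A <-> B) and the choice to reverse the complementary half so the chain can fold back hairpin-style are assumptions for illustration, not details taken from the paper:

```python
import random

COMPLEMENT = {"A": "B", "B": "A"}  # assumed complement rule for AB-units

def random_complementary(n, seed=0):
    """First half: Bernoulli-random A/B units. Second half: the
    base-by-base complement of the first, read in reverse so that
    unit i can pair with unit n-1-i when the chain folds in half."""
    rng = random.Random(seed)
    half = [rng.choice("AB") for _ in range(n // 2)]
    return "".join(half) + "".join(COMPLEMENT[u] for u in reversed(half))

seq = random_complementary(20)
print(seq)
```

By construction every unit has a complementary partner mirrored about the chain midpoint, which is what allows the hairpin state observed in the simulations.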
Monte Carlo simulation of a gas-sampled hadron calorimeter
Energy Technology Data Exchange (ETDEWEB)
Chang, C Y; Kunori, S; Rapp, P; Talaga, R; Steinberg, P; Tylka, A J; Wang, Z M
1988-02-15
A prototype of the OPAL barrel hadron calorimeter, which is a gas-sampled calorimeter using plastic streamer tubes, was exposed to pions at energies between 1 and 7 GeV. The response of the detector was simulated using the CERN GEANT3 Monte Carlo program. By using the observed high energy muon signals to deduce details of the streamer formation, the Monte Carlo program was able to reproduce the observed calorimeter response. The behavior of the hadron calorimeter when placed behind a lead glass electromagnetic calorimeter was also investigated.
A Monte Carlo simulation study of associated liquid crystals
Berardi, R.; Fehervari, M.; Zannoni, C.
We have performed a Monte Carlo simulation study of a system of ellipsoidal particles with donor-acceptor sites modelling complementary hydrogen-bonding groups in real molecules. We have considered elongated Gay-Berne particles with terminal interaction sites allowing particles to associate and form dimers. The changes in the phase transitions and in the molecular organization and the interplay between orientational ordering and dimer formation are discussed. Particle flip and dimer moves have been used to increase the convergence rate of the Monte Carlo (MC) Markov chain.
Monte Carlo simulation and experimental verification of radiotherapy electron beams
International Nuclear Information System (INIS)
Griffin, J.; Deloar, H. M.
2007-01-01
Full text: Based on fundamental physics and statistics, the Monte Carlo technique is generally accepted as the most accurate method for modelling radiation therapy treatments. A Monte Carlo simulation system has been installed, and models of linear accelerators in the more commonly used electron beam modes have been built and commissioned. A novel technique for radiation dosimetry is also being investigated. Combining the advantages of both water tank and solid phantom dosimetry, a hollow, thin-walled shell or mask is filled with water and then raised above the natural water surface to produce a volume of water with the desired irregular shape.
Rapid Monte Carlo simulation of detector DQE(f)
Energy Technology Data Exchange (ETDEWEB)
Star-Lack, Josh, E-mail: josh.starlack@varian.com; Sun, Mingshan; Abel, Eric [Varian Medical Systems, Palo Alto, California 94304-1030 (United States); Meyer, Andre; Morf, Daniel [Varian Medical Systems, CH-5405, Baden-Dattwil (Switzerland); Constantin, Dragos; Fahrig, Rebecca [Department of Radiology, Stanford University, Stanford, California 94305 (United States)
2014-03-15
Purpose: Performance optimization of indirect x-ray detectors requires proper characterization of both ionizing (gamma) and optical photon transport in a heterogeneous medium. As the tool of choice for modeling detector physics, Monte Carlo methods have failed to gain traction as a design utility, due mostly to excessive simulation times and a lack of convenient simulation packages. The most important figure-of-merit in assessing detector performance is the detective quantum efficiency (DQE), for which most of the computational burden has traditionally been associated with the determination of the noise power spectrum (NPS) from an ensemble of flood images, each conventionally having 10^7-10^9 detected gamma photons. In this work, the authors show that the idealized conditions inherent in a numerical simulation allow for a dramatic reduction in the number of gamma and optical photons required to accurately predict the NPS. Methods: The authors derived an expression for the mean squared error (MSE) of a simulated NPS when computed using the International Electrotechnical Commission-recommended technique based on taking the 2D Fourier transform of flood images. It is shown that the MSE is inversely proportional to the number of flood images, and is independent of the input fluence provided that the input fluence is above a minimal value that avoids biasing the estimate. The authors then propose to further lower the input fluence so that each event creates a point-spread function rather than a flood field. The authors use this finding as the foundation for a novel algorithm in which the characteristic MTF(f), NPS(f), and DQE(f) curves are simultaneously generated from the results of a single run. The authors also investigate lowering the number of optical photons used in a scintillator simulation to further increase efficiency. Simulation results are compared with measurements performed on a Varian AS1000 portal imager, and with a previously published simulation.
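The flood-image NPS estimate at the heart of this record is easy to sketch: for uncorrelated Poisson flood images the spectrum should come out flat at the pixel variance, and the scatter of the estimate shrinks with the number of floods, consistent with the MSE ∝ 1/N result. This toy uses assumed image sizes and fluence, not the authors' detector model:

```python
import numpy as np

def nps_estimate(images, pixel_pitch=1.0):
    """IEC-style NPS sketch: average |2D DFT|^2 of mean-subtracted
    flood images, normalized per frequency bin."""
    n, ny, nx = images.shape
    ft = np.fft.fft2(images - images.mean(axis=(1, 2), keepdims=True))
    return (np.abs(ft) ** 2).mean(axis=0) * pixel_pitch**2 / (nx * ny)

rng = np.random.default_rng(0)
# uncorrelated Poisson floods -> white NPS at the pixel variance (= mean count)
floods = rng.poisson(lam=100.0, size=(64, 32, 32)).astype(float)
nps = nps_estimate(floods)
print(nps[1:, 1:].mean())   # ~100 away from the DC bin
```

Mean subtraction zeroes the DC bin of each image; every other bin fluctuates around the Poisson variance, and averaging over more floods tightens the estimate.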
Barão, Fernando; Nakagawa, Masayuki; Távora, Luis; Vaz, Pedro
2001-01-01
This book focuses on the state of the art of Monte Carlo methods in radiation physics and particle transport simulation and applications, the latter involving, in particular, the use and development of electron-gamma, neutron-gamma and hadronic codes. Besides the basic theory and the methods employed, special attention is paid to algorithm development for modeling, and to the analysis of experiments and measurements in a variety of fields ranging from particle to medical physics.
Cluster computing software for GATE simulations
International Nuclear Information System (INIS)
Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.
2007-01-01
Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values
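The job-splitting idea in this record — generating fully resolved per-job macros with distinct seeds plus a submit file — can be sketched generically. The file names and the `run_gate` launch command below are illustrative placeholders, not actual GATE or cluster syntax:

```python
# Hypothetical sketch of cluster job splitting: divide `total_events`
# across `n_jobs` macro files, each with a distinct RNG seed, and build
# a submit file listing one launch command per job.
def split_jobs(total_events, n_jobs, base_seed=12345):
    events = [total_events // n_jobs] * n_jobs
    events[-1] += total_events - sum(events)      # remainder goes to the last job
    return [{"macro": f"job_{i:03d}.mac",
             "seed": base_seed + i,
             "events": e} for i, e in enumerate(events)]

jobs = split_jobs(10_000_000, 8)
submit = "\n".join(f"run_gate {j['macro']}  # seed={j['seed']}" for j in jobs)
print(len(jobs), sum(j["events"] for j in jobs))
```

Distinct seeds keep the sub-simulations statistically independent, and the event counts sum exactly to the requested total so the merged output matches a single long run.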
Energy Technology Data Exchange (ETDEWEB)
Wu, Y., E-mail: yican.wu@fds.org.cn [Inst. of Nuclear Energy Safety Technology, Hefei, Anhui (China)
2015-07-01
'Full text:' Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for the integrated simulation of nuclear systems, making use of a hybrid MC-deterministic method and advanced computer technologies. Its main usability features are automatic modeling of geometry and physics, visualization, virtual simulation and a cloud computing service. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculations. SuperMC has been verified against more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of the International Thermonuclear Experimental Reactor (ITER) and the China Lead-based Reactor (CLEAR). The development and applications of SuperMC are introduced in this presentation. (author)
Energy Technology Data Exchange (ETDEWEB)
Matthew Ellis; Derek Gaston; Benoit Forget; Kord Smith
2011-07-01
In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to the unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.
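A Functional Expansion Tally represents a tallied distribution by its coefficients in an orthogonal basis, which can then be evaluated on any mesh. This toy 1D illustration projects an assumed smooth power shape onto Legendre polynomials and reconstructs it at finite-element node positions; it is a sketch of the idea, not the OpenMC-MOOSE implementation:

```python
import numpy as np
from numpy.polynomial import legendre

def fet_coefficients(f, order, n_quad=64):
    """Project f (defined on [-1, 1]) onto Legendre polynomials using
    Gauss-Legendre quadrature: c_n = (2n+1)/2 * integral of f * P_n."""
    x, w = legendre.leggauss(n_quad)
    return np.array([(2 * n + 1) / 2
                     * np.sum(w * f(x) * legendre.legval(x, [0] * n + [1]))
                     for n in range(order + 1)])

power = lambda x: 1.0 + 0.3 * x**2          # assumed pin power shape
coeffs = fet_coefficients(power, order=4)    # what the MC code would tally
mesh = np.linspace(-1.0, 1.0, 7)             # hypothetical FE mesh nodes
reconstructed = legendre.legval(mesh, coeffs)
print(np.max(np.abs(reconstructed - power(mesh))))
```

Because the assumed shape is quadratic, a fourth-order expansion reproduces it exactly; in practice the expansion order trades tally statistics against spatial detail.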
International Nuclear Information System (INIS)
Schelonka, E.P.
1979-01-01
Development and application of a series of simulation codes used for computer security analysis and design are described. Boolean relationships for arrays of barriers within functional modules are used to generate composite effectiveness indices. The general case of multiple layers of protection with any specified barrier survival criteria is given. Generalized reduction algorithms provide numerical security indices in selected subcategories and for the system as a whole. 9 figures, 11 tables
Monte Carlo Simulations of Neutron Oil well Logging Tools
International Nuclear Information System (INIS)
Azcurra, Mario
2002-01-01
Monte Carlo simulations of simple neutron oil well logging tools in typical geological formations are presented. The simulated tools consist of both 14 MeV pulsed and continuous Am-Be neutron sources, with time-gated and continuous gamma ray detectors, respectively. The geological formation consists of pure limestone with 15% absolute porosity over a wide range of oil saturation. The particle transport was performed with the Monte Carlo N-Particle Transport Code System, MCNP-4B. Several gamma ray spectra were obtained at the detector position that allow composition analysis of the formation to be performed. In particular, the C/O ratio was analyzed as an indicator of oil saturation. Further calculations are proposed to simulate actual detector responses in order to contribute to understanding the relation between the detector response and the formation composition
Monte Carlo simulations of neutron oil well logging tools
International Nuclear Information System (INIS)
Azcurra, Mario O.; Zamonsky, Oscar M.
2003-01-01
Monte Carlo simulations of simple neutron oil well logging tools in typical geological formations are presented. The simulated tools consist of both 14 MeV pulsed and continuous Am-Be neutron sources, with time-gated and continuous gamma ray detectors, respectively. The geological formation consists of pure limestone with 15% absolute porosity over a wide range of oil saturation. The particle transport was performed with the Monte Carlo N-Particle Transport Code System, MCNP-4B. Several gamma ray spectra were obtained at the detector position that allow composition analysis of the formation to be performed. In particular, the C/O ratio was analyzed as an indicator of oil saturation. Further calculations are proposed to simulate actual detector responses in order to contribute to understanding the relation between the detector response and the formation composition. (author)
Application of Macro Response Monte Carlo method for electron spectrum simulation
International Nuclear Information System (INIS)
Perles, L.A.; Almeida, A. de
2007-01-01
During the past years, several variance reduction techniques for Monte Carlo electron transport have been developed in order to reduce the computation time of electron transport for absorbed dose distributions. We have implemented the Macro Response Monte Carlo (MRMC) method to evaluate the electron spectrum, which can be used as a phase space input for other simulation programs. This technique uses probability distributions for electron histories previously simulated in spheres (called kugels). These probabilities are used to sample the final state of the primary electron, as well as the creation of secondary electrons and photons. We have compared the MRMC electron spectra simulated in a homogeneous phantom against Geant4 spectra. The results showed agreement better than 6% in the spectral peak energies and that the MRMC code is up to 12 times faster than Geant4 simulations.
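Sampling an electron's final state from precomputed kugel probability distributions is, at its core, inverse-transform sampling from a tabulated histogram. The bin edges and probabilities below are hypothetical stand-ins for one kugel's exit-energy table, not data from the paper:

```python
import bisect
import random

energy_bins = [0.0, 0.5, 1.0, 1.5, 2.0]   # MeV bin edges (assumed table)
probabilities = [0.1, 0.2, 0.4, 0.3]      # per-bin probabilities (assumed)

# build the cumulative distribution over the bins
cdf = []
acc = 0.0
for p in probabilities:
    acc += p
    cdf.append(acc)

def sample_exit_energy(rng):
    """Inverse-transform sampling: pick a bin from the CDF, then a
    uniform energy within that bin."""
    i = bisect.bisect_left(cdf, rng.random())
    lo, hi = energy_bins[i], energy_bins[i + 1]
    return lo + (hi - lo) * rng.random()

rng = random.Random(1)
samples = [sample_exit_energy(rng) for _ in range(10_000)]
print(sum(samples) / len(samples))
```

Because the histogram is precomputed once, each sample costs only a binary search, which is the source of the speedup over tracking every electron step explicitly.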
Parallel computing by Monte Carlo codes MVP/GMVP
International Nuclear Information System (INIS)
Nagaya, Yasunobu; Nakagawa, Masayuki; Mori, Takamasa
2001-01-01
General-purpose Monte Carlo codes MVP/GMVP are well vectorized and thus enable high-speed Monte Carlo calculations. In order to achieve further speedup, we parallelized the codes on different types of parallel computing platforms using the standard parallelization library MPI. The platforms used for the benchmark calculations were a distributed-memory vector-parallel computer (Fujitsu VPP500), a distributed-memory massively parallel computer (Intel Paragon) and distributed-memory scalar-parallel computers (Hitachi SR2201, IBM SP2). In general, linear speedup could be obtained for large-scale problems, but parallelization efficiency decreased as the batch size per processing element (PE) became smaller. It was also found that the statistical uncertainty in assembly powers was less than 0.1% for a PWR full-core calculation with more than 10 million histories, which took about 1.5 hours with massively parallel computing. (author)
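The scaling behaviour described here is usually summarized as speedup and parallel efficiency. A minimal sketch, with made-up wall times standing in for the benchmark measurements:

```python
# Speedup S = T_serial / T_parallel; efficiency E = S / n_PE.
def speedup_and_efficiency(t_serial, t_parallel, n_pe):
    s = t_serial / t_parallel
    return s, s / n_pe

# hypothetical timings: a large batch per PE keeps processors busy,
# a small batch leaves them idle between synchronizations
large_batch = speedup_and_efficiency(t_serial=3600.0, t_parallel=40.0, n_pe=100)
small_batch = speedup_and_efficiency(t_serial=3600.0, t_parallel=90.0, n_pe=100)
print(large_batch, small_batch)
```

The drop in efficiency for the small batch mirrors the abstract's observation that efficiency decreases as the batch size per PE shrinks.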
Monte Carlo simulations of multiple scattering effects in ERD measurements
International Nuclear Information System (INIS)
Doyle, Barney Lee; Arstila, Kai; Nordlund, K.; Knapp, James Arthur
2003-01-01
Multiple scattering effects in ERD measurements are studied by comparing two Monte Carlo simulation codes, representing different approaches to obtain acceptable statistics, to experimental spectra measured from a HfO2 sample with a time-of-flight ERD setup. The results show that both codes can reproduce the absolute detection yields and the energy distributions in an adequate way. The effect of the choice of the interatomic potential on multiple scattering effects is also studied. Finally, the capabilities of the MC simulations in the design of new measurement setups are demonstrated by simulating the recoil energy spectra from a WCxNy sample with a low energy heavy ion beam.
Monte Carlo simulation of a prototype photodetector used in radiotherapy
Kausch, C; Albers, D; Schmidt, R; Schreiber, B
2000-01-01
The imaging performance of prototype electronic portal imaging devices (EPIDs) has been investigated. Monte Carlo simulations have been applied to calculate the modulation transfer function (MTF(f)), the noise power spectrum (NPS(f)) and the detective quantum efficiency (DQE(f)) for different new types of EPID, which consist of a detector combination of metal or polyethylene (PE), a phosphor layer of Gd2O2S and a flat array of photodiodes. The simulated results agree well with measurements. Based on the simulated results, possible optimizations of these devices are discussed.
Stock Price Simulation Using Bootstrap and Monte Carlo
Directory of Open Access Journals (Sweden)
Pažický Martin
2017-06-01
Full Text Available In this paper, an attempt is made to assess and compare bootstrap and Monte Carlo experiments for stock price simulation. Since the future evolution of a stock price is extremely important to investors, we attempt to find the best method to determine the future stock price of the BNP Paribas bank. The aim of the paper is to define the value of European and Asian options on BNP Paribas stock at the maturity date. Four different simulation methods are employed: a bootstrap experiment with a homoscedastic error term, a block bootstrap experiment with a heteroscedastic error term, a Monte Carlo simulation with a heteroscedastic error term, and a Monte Carlo simulation with a homoscedastic error term. In the last method it is necessary to model the volatility using an econometric GARCH model. The main purpose of the paper is to compare these methods and select the most reliable. The difference between the classical European option and the exotic Asian option, based on the experiment results, is a further aim of this paper.
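A minimal sketch of the homoscedastic Monte Carlo variant: simulate terminal stock prices under geometric Brownian motion and average the discounted European call payoff. All parameter values below are illustrative, not BNP Paribas data:

```python
import math
import random

def mc_european_call(s0, strike, rate, sigma, maturity, n_paths, seed=7):
    """Price a European call by Monte Carlo under geometric Brownian
    motion with constant (homoscedastic) volatility."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        # risk-neutral terminal price S_T = S_0 exp((r - sigma^2/2)T + sigma sqrt(T) Z)
        st = s0 * math.exp((rate - 0.5 * sigma**2) * maturity
                           + sigma * math.sqrt(maturity) * z)
        payoff_sum += max(st - strike, 0.0)
    return math.exp(-rate * maturity) * payoff_sum / n_paths

price = mc_european_call(s0=50.0, strike=50.0, rate=0.01,
                         sigma=0.2, maturity=1.0, n_paths=100_000)
print(round(price, 2))
```

For an Asian option, the inner loop would instead average the price along each simulated path before applying the payoff; the heteroscedastic variants replace the constant sigma with a GARCH-driven volatility path.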
Stabilization effect of fission source in coupled Monte Carlo simulations
Directory of Open Access Journals (Sweden)
Börge Olsen
2017-08-01
Full Text Available A fission source can act as a stabilization element in coupled Monte Carlo simulations. We have observed this while studying numerical instabilities in nonlinear steady-state simulations performed by a Monte Carlo criticality solver that is coupled to a xenon feedback solver via fixed-point iteration. While fixed-point iteration is known to be numerically unstable for some problems, resulting in large spatial oscillations of the neutron flux distribution, we show that it is possible to stabilize it by reducing the number of Monte Carlo criticality cycles simulated within each iteration step. While global convergence is ensured, development of any possible numerical instability is prevented by not allowing the fission source to converge fully within a single iteration step, which is achieved by setting a small number of criticality cycles per iteration step. Moreover, under these conditions, the fission source may converge even faster than in criticality calculations with no feedback, as we demonstrate in our numerical test simulations.
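The stabilization mechanism has a simple scalar analogue: a fixed-point iteration that over-corrects (and so oscillates) when the inner solve converges fully, but is damped when each step advances the source only partially. The scalar map and gain value below are illustrative, not a transport calculation:

```python
# Toy analogue: partial updates toward f(x) = -gain * x damp an
# otherwise diverging oscillation. `alpha` stands in for the fraction
# of full fission-source convergence per iteration step (i.e. fewer
# criticality cycles -> smaller alpha).
def iterate(gain, alpha, x0=1.0, steps=40):
    x = x0
    for _ in range(steps):
        x = (1.0 - alpha) * x + alpha * (-gain * x)
    return abs(x)

full = iterate(gain=1.5, alpha=1.0)     # full convergence each step: diverges
damped = iterate(gain=1.5, alpha=0.3)   # partial convergence: decays to zero
print(full, damped)
```

The update factor is 1 - alpha(1 + gain), so the iteration is stable exactly when alpha < 2/(1 + gain), mirroring the paper's observation that fewer criticality cycles per iteration step suppress the instability.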
Improved local lattice Monte Carlo simulation for charged systems
Jiang, Jian; Wang, Zhen-Gang
2018-03-01
Maggs and Rossetto [Phys. Rev. Lett. 88, 196402 (2002)] proposed a local lattice Monte Carlo algorithm for simulating charged systems based on Gauss's law, which scales with the particle number N as O(N). This method includes two degrees of freedom: the configuration of the mobile charged particles and the electric field. In this work, we consider two important issues in the implementation of the method, the acceptance rate of configurational change (particle move) and the ergodicity in the phase space sampled by the electric field. We propose a simple method to improve the acceptance rate of particle moves based on the superposition principle for electric field. Furthermore, we introduce an additional updating step for the field, named "open-circuit update," to ensure that the system is fully ergodic under periodic boundary conditions. We apply this improved local Monte Carlo simulation to an electrolyte solution confined between two low dielectric plates. The results show excellent agreement with previous theoretical work.
Study of TXRF experimental system by Monte Carlo simulation
International Nuclear Information System (INIS)
Costa, Ana Cristina M.; Leitao, Roberta G.; Lopes, Ricardo T.; Anjos, Marcelino J.; Conti, Claudio C.
2011-01-01
The Total-Reflection X-ray Fluorescence (TXRF) technique offers unique possibilities for studying the concentrations of a wide range of trace elements in various types of samples. The TXRF technique is widely used to study trace elements in biological, medical and environmental samples due to its multielemental character as well as the simplicity of the sample preparation and quantification methods used. In general, the TXRF experimental setup is not simple and might require substantial experimental effort. On the other hand, in recent years, portable TXRF experimental systems have been developed, which motivated us to develop our own portable TXRF system. In this work we present a first step toward the optimization of a TXRF experimental setup using Monte Carlo simulation with the MCNP code. The results show that the Monte Carlo simulation method can be used to investigate the design of a TXRF experimental system before its assembly. (author)
Polymers undergoing inhomogeneous adsorption: exact results and Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Iliev, G K [Department of Mathematics, University of Melbourne, Parkville, Victoria (Australia); Orlandini, E [Dipartimento di Fisica, CNISM, Universita di Padova, Via Marzolo 8, 35131 Padova (Italy); Whittington, S G, E-mail: giliev@yorku.ca [Department of Chemistry, University of Toronto, Toronto (Canada)
2011-10-07
We consider several types of inhomogeneous polymer adsorption. In each case, the inhomogeneity is regular and resides in the surface, in the polymer or in both. We consider two different polymer models: a directed walk model that can be solved exactly and a self-avoiding walk model which we investigate using Monte Carlo methods. In each case, we compute the phase diagram. We compare and contrast the phase diagrams and give qualitative arguments about their forms. (paper)
Monte Carlo simulation experiments on box-type radon dosimeter
International Nuclear Information System (INIS)
Jamil, Khalid; Kamran, Muhammad; Illahi, Ahsan; Manzoor, Shahid
2014-01-01
Epidemiological studies show that inhalation of radon gas (222Rn) may be carcinogenic, especially for mine workers, people living in closed, energy-conserving indoor environments, and underground dwellers. It is therefore of paramount importance to measure 222Rn concentrations (Bq/m3) in indoor environments. For this purpose, box-type passive radon dosimeters employing an ion track detector such as CR-39 are widely used. The fraction of the radon alphas emitted in the volume of the box-type dosimeter that results in latent track formation on the CR-39 is the latent track registration efficiency, which is ultimately required to evaluate the radon concentration and consequently the effective dose and the radiological hazards. In this research, Monte Carlo simulation experiments were carried out to study the alpha latent track registration efficiency of a box-type radon dosimeter as a function of the dosimeter's dimensions and the range of alpha particles in air. Two different self-developed Monte Carlo simulation techniques were employed, namely (a) the surface ratio (SURA) method and (b) the ray hitting (RAHI) method. The Monte Carlo simulation experiments revealed that there are two types of efficiencies, the intrinsic efficiency (η_int) and the alpha hit efficiency (η_hit). η_int depends only on the dimensions of the dosimeter, whereas η_hit depends on both the dimensions of the dosimeter and the range of the alpha particles. The total latent track registration efficiency is the product of the intrinsic and hit efficiencies. It was concluded that if the diagonal length of the box-type dosimeter is kept smaller than the range of the alpha particles, a hit efficiency of 100% is achieved; nevertheless, the intrinsic efficiency still plays its role. The Monte Carlo simulation results proved helpful for understanding the intricate track registration mechanisms in the box-type dosimeter. This paper explains how radon
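The ray-hitting idea can be re-created with a simplified geometry: alphas are emitted uniformly in a box with isotropic directions, and a hit is counted when the straight track reaches the detector face (here, the z = 0 plane of the box) within the alpha range. The box dimensions and range value are illustrative, not the paper's:

```python
import math
import random

def hit_efficiency(box, alpha_range_cm, n=200_000, seed=3):
    """Monte Carlo estimate of the fraction of alphas whose straight
    track reaches the z = 0 detector face within their range."""
    lx, ly, lz = box
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # uniform emission point inside the box
        x, y, z = rng.random() * lx, rng.random() * ly, rng.random() * lz
        # isotropic direction: uniform cos(theta) and azimuth
        cos_t = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        dx, dy, dz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
        if dz >= 0.0:
            continue                 # moving away from the detector plane
        t = -z / dz                  # path length to the z = 0 plane
        if t > alpha_range_cm:
            continue                 # alpha stops before reaching the face
        xh, yh = x + t * dx, y + t * dy
        if 0.0 <= xh <= lx and 0.0 <= yh <= ly:
            hits += 1
    return hits / n

eff = hit_efficiency(box=(5.0, 5.0, 5.0), alpha_range_cm=4.1)
print(eff)
```

Shortening the alpha range (or enlarging the box beyond it) lowers the hit efficiency, which is the geometric trend the SURA and RAHI experiments quantify.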
Image reconstruction using Monte Carlo simulation and artificial neural networks
International Nuclear Information System (INIS)
Emert, F.; Missimer, J.; Blass, W.; Rodriguez, A.
1997-01-01
PET data sets are subject to two types of distortions during acquisition: the imperfect response of the scanner and attenuation and scattering in the active distribution. In addition, the reconstruction of voxel images from the line projections composing a data set can introduce artifacts. Monte Carlo simulation provides a means for modeling the distortions and artificial neural networks a method for correcting for them as well as minimizing artifacts. (author) figs., tab., refs
Monte Carlo simulations of interacting particle mixtures in ratchet potentials
International Nuclear Information System (INIS)
Fendrik, A J; Romanelli, L
2012-01-01
There are different models of devices for achieving a separation of mixtures of particles by using the ratchet effect. On the other hand, it has been proposed that one could also control the separation by means of appropriate interactions. Through Monte Carlo simulations, we show that the inclusion of simple interactions leads to a decrease of the ratchet effect and therefore also of the separation of the mixtures.
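The bare, non-interacting ratchet effect that such devices exploit can be illustrated with a toy flashing-ratchet Monte Carlo; this sketch is not the paper's interacting-mixture model. The sawtooth period, asymmetry (peak at 0.9 of the period), and diffusion step are arbitrary choices; the direction of the net drift follows from the asymmetry.

```python
import math
import random

def flashing_ratchet_drift(cycles=2000, sigma=0.3, seed=7):
    """Toy flashing ratchet: free diffusion while the sawtooth potential is
    off, then deterministic relaxation to the nearest downhill minimum
    (minima at integers, peaks at n + 0.9) when it switches back on.
    Returns the mean displacement per on/off cycle."""
    rng = random.Random(seed)
    x = 0.0
    for _ in range(cycles):
        x += rng.gauss(0.0, sigma)       # potential off: diffuse freely
        frac = x - math.floor(x)         # potential on: slide downhill
        x = math.floor(x) if frac < 0.9 else math.floor(x) + 1.0
    return x / cycles
```

With these parameters a particle is far more likely to diffuse past the nearby peak (0.1 away) than the distant one (0.9 away), so the drift is negative and of order -0.37 per cycle.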
Quantum Monte Carlo simulations for high-Tc superconductors
International Nuclear Information System (INIS)
Muramatsu, A.; Dopf, G.; Wagner, J.; Dieterich, P.; Hanke, W.
1992-01-01
Quantum Monte Carlo simulations for a multi-band model of high-Tc superconductors are reviewed with special emphasis on the comparison of different observables with experiments. It is shown that a given parameter set of the three-band Hubbard model leads to a consistent description of normal-state properties as well as pairing correlation functions for the copper-oxide superconductors as a function of doping and temperature. (orig.)
Monte Carlo simulation of PET images for injection dose optimization
Czech Academy of Sciences Publication Activity Database
Boldyš, Jiří; Dvořák, Jiří; Skopalová, M.; Bělohlávek, O.
2013-01-01
Roč. 29, č. 9 (2013), s. 988-999 ISSN 2040-7939 R&D Projects: GA MŠk 1M0572 Institutional support: RVO:67985556 Keywords : positron emission tomography * Monte Carlo simulation * biological system modeling * image quality Subject RIV: FD - Oncology ; Hematology Impact factor: 1.542, year: 2013 http://library.utia.cas.cz/separaty/2013/ZOI/boldys-0397175.pdf
Domain-growth kinetics and aspects of pinning: A Monte Carlo simulation study
DEFF Research Database (Denmark)
Castán, T.; Lindgård, Per-Anker
1991-01-01
By means of Monte Carlo computer simulations we study the domain-growth kinetics after a quench across a first-order line to very low and moderate temperatures in a multidegenerate system with nonconserved order parameter. The model is a continuous spin model relevant for martensitic transformations… The growth exponent n is found to cross over from n = 1/4 at T approximately 0 to n = 1/2 with temperature for models with pinnings of types (a) and (b). For topological pinnings at T approximately 0, n is consistent with n = 1/8, a value conceivable for several levels of hierarchically interrelated domain-wall movement. When…
International Nuclear Information System (INIS)
Marseguerra, M.; Zio, E.
2000-01-01
In this paper we present an optimization approach based on the combination of a Genetic Algorithms maximization procedure with a Monte Carlo simulation. The approach is applied within the context of plant logistic management, as concerns the choice of maintenance and repair strategies. A stochastic model of plant operation is developed from the standpoint of its reliability/availability behavior, i.e. of the failure/repair/maintenance processes of its components. The model is evaluated by Monte Carlo simulation in terms of the economic costs and revenues of operation. The flexibility of the Monte Carlo method allows us to include several practical aspects such as stand-by operation modes, deteriorating repairs, aging, sequences of periodic maintenances, the number of repair teams available for different kinds of repair interventions (mechanical, electronic, hydraulic, etc.), and component priority rankings. A genetic algorithm is then utilized to optimize the component maintenance periods and the number of repair teams. The fitness function object of the optimization is a profit function which inherently accounts for the safety and economic performance of the plant and whose value is computed by the above Monte Carlo simulation model. For an efficient combination of Genetic Algorithms and Monte Carlo simulation, only a few hundred Monte Carlo histories are performed for each potential solution proposed by the genetic algorithm. Statistical significance of the results for the solutions of interest (i.e. the best ones) is then attained by exploiting the fact that during the population evolution the fit chromosomes appear repeatedly many times. The proposed optimization approach is applied to two case studies of increasing complexity.
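The coupling described above, a cheap noisy Monte Carlo fitness evaluated inside a genetic algorithm loop, can be sketched on a toy maintenance model: a single component with an increasing Weibull hazard, a unit planned-maintenance cost, and a tenfold unplanned-repair cost. The model and all numbers are invented for illustration; the paper's plant model is far richer.

```python
import math
import random

rng = random.Random(0)

def mc_cost_rate(period):
    """Monte Carlo estimate of the long-run cost per unit time of periodic
    maintenance with the given period (toy renewal model)."""
    t = cost = 0.0
    while t < 500.0:  # a modest horizon: deliberately noisy, cheap fitness
        # Weibull(k=2, scale=5) time to failure: increasing hazard with age
        life = 5.0 * math.sqrt(-math.log(1.0 - rng.random()))
        if life < period:
            cost += 10.0   # unplanned repair, component renewed
            t += life
        else:
            cost += 1.0    # planned maintenance, component renewed
            t += period
    return cost / t

def ga_minimize(n_pop=20, n_gen=15):
    """Genetic algorithm over the maintenance period with a noisy MC fitness."""
    pop = [rng.uniform(0.5, 10.0) for _ in range(n_pop)]
    for _ in range(n_gen):
        elite = sorted(pop, key=mc_cost_rate)[: n_pop // 2]  # noisy selection
        # refill by mutating elite members, clipped to the search box
        pop = elite + [min(max(0.5, rng.choice(elite) + rng.gauss(0.0, 0.5)), 10.0)
                       for _ in range(n_pop - len(elite))]
    return min(pop, key=mc_cost_rate)
```

Repeated selection of the same good candidates across generations compensates for the noisy fitness, as the abstract notes; for this toy model the optimum period sits near 1.6 time units.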
International Nuclear Information System (INIS)
Androseno, P.; Zholudov, D.; Kompaniyets, A.; Smirnova, O.
2000-01-01
In order to improve both the economics of Nuclear Power Plants (NPPs) and their safety, data and computer codes that perform benchmark calculations while simulating NPP parameters must be utilized. This work is mainly concerned with the application of computer codes using the Monte Carlo method, which provides improved accuracy for the equations to be solved. (authors)
Analysis of Monte Carlo methods for the simulation of photon transport
International Nuclear Information System (INIS)
Carlsson, G.A.; Kusoffsky, L.
1975-01-01
In connection with the transport of low-energy photons (30 - 140 keV) through layers of water of different thicknesses, various aspects of Monte Carlo methods are examined in order to improve their efficiency (to produce statistically more reliable results with shorter computing times) and to bridge the gap between more physical methods and more mathematical ones. The calculations are compared with results of experiments involving the simulation of photon transport, using direct methods and collision density ones. (J.S.)
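A minimal analog photon-slab simulation of the kind examined in such studies might look as follows. The isotropic-rescatter slab model and parameter values are illustrative simplifications, not the authors' setup; with the scattering albedo set to zero the tally reduces to the analytic uncollided transmission exp(-μd), which makes a convenient check.

```python
import math
import random

def transmitted_fraction(mu_t, albedo, thickness, n=20_000, seed=2):
    """Analog Monte Carlo: photons enter a slab at z = 0 moving in +z.
    Free paths are exponential with total coefficient mu_t; at each collision
    the photon survives with probability `albedo` and rescatters isotropically
    (direction cosine uniform in [-1, 1]), otherwise it is absorbed."""
    rng = random.Random(seed)
    out = 0
    for _ in range(n):
        z, cz = 0.0, 1.0
        while True:
            z += cz * (-math.log(1.0 - rng.random())) / mu_t  # free flight
            if z >= thickness:
                out += 1
                break                      # transmitted
            if z < 0.0:
                break                      # reflected back out the front face
            if rng.random() > albedo:
                break                      # absorbed at the collision site
            cz = rng.uniform(-1.0, 1.0)    # isotropic rescatter (slab geometry)
    return out / n
```

With mu_t = 0.2 cm^-1 and a 5 cm slab, the albedo = 0 case reproduces exp(-1) to within statistics, and adding scattering raises the transmitted fraction.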
Monte Carlo simulations in small animal PET imaging
Energy Technology Data Exchange (ETDEWEB)
Branco, Susana [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)], E-mail: susana.silva@fc.ul.pt; Jan, Sebastien [Service Hospitalier Frederic Joliot, CEA/DSV/DRM, Orsay (France); Almeida, Pedro [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)
2007-10-01
This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated to small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited for modeling the microPET FOCUS system and for implementing realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The microPET FOCUS simulation model in GATE has been validated for spatial resolution, counting rate performance, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using {sup 18}F{sup -} and [{sup 18}F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results show that it is possible to simulate small animal PET acquisitions under realistic conditions, and they are expected to be useful for improving quantitative analysis in PET mouse body studies.
Spatial distribution sampling and Monte Carlo simulation of radioactive isotopes
Krainer, Alexander Michael
2015-01-01
This work focuses on the implementation of a program for random sampling of uniformly spatially distributed isotopes for Monte Carlo particle simulations, specifically FLUKA. With FLUKA it is possible to calculate the radionuclide production in high energy fields. The decay of these nuclides, and therefore the resulting radiation field, can however only be simulated in the same geometry. This work provides the tool to simulate the decay of the produced nuclides in other geometries. With that, the radiation field from an irradiated object can be simulated in arbitrary environments. The sampling of isotope mixtures was tested by simulating a 50/50 mixture of $Cs^{137}$ and $Co^{60}$. These isotopes are both well known and therefore provide a first reliable benchmark in that respect. The sampling of uniformly distributed coordinates was tested using the histogram test for various spatial distributions. The advantages and disadvantages of the program compared to standard methods are demonstrated in the real life ca...
A brief history of the introduction of generalized ensembles to Markov chain Monte Carlo simulations
Berg, Bernd A.
2017-03-01
The most efficient weights for Markov chain Monte Carlo calculations of physical observables are not necessarily those of the canonical ensemble. Generalized ensembles, which do not exist in nature but can be simulated on computers, lead often to a much faster convergence. In particular, they have been used for simulations of first order phase transitions and for simulations of complex systems in which conflicting constraints lead to a rugged free energy landscape. Starting off with the Metropolis algorithm and Hastings' extension, I present a minireview which focuses on the explosive use of generalized ensembles in the early 1990s. Illustrations are given, which range from spin models to peptides.
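The Metropolis algorithm from which the minireview starts can be stated in a few lines. Here is a sketch for a 1-D Ising chain with free ends and J = 1 (a toy choice, not any model from the review); its mean energy per bond is exactly -tanh(β), which provides a check.

```python
import math
import random

def ising_chain_energy(n=50, beta=0.5, sweeps=3000, burn=500, seed=3):
    """Metropolis sampling of a 1-D Ising chain with free ends (J = 1).
    Returns the average energy per bond, exactly -tanh(beta) for this model."""
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    total, samples = 0.0, 0
    for sweep in range(sweeps):
        for i in range(n):
            # energy change for flipping spin i (free boundaries)
            nb = (s[i - 1] if i > 0 else 0) + (s[i + 1] if i < n - 1 else 0)
            dE = 2.0 * s[i] * nb
            # Metropolis acceptance: always accept downhill, Boltzmann uphill
            if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
                s[i] = -s[i]
        if sweep >= burn:
            total += -sum(s[i] * s[i + 1] for i in range(n - 1))
            samples += 1
    return total / (samples * (n - 1))
```

Generalized-ensemble methods replace the canonical weight exp(-beta*dE) in the acceptance step with non-canonical weights such as 1/g(E), as in the Wang-Landau sketch further below in this listing.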
International Nuclear Information System (INIS)
Huang Yong; Liang Xingang; Xia Xinlin
2005-01-01
The Monte Carlo method is used to simulate the thermal emission of an absorbing-emitting-scattering slab with gradient index. Three Monte Carlo ray-tracing strategies are considered. The first strategy keeps the real distribution of the refractive index and traces bundles along curved routes. The second strategy discretizes the slab into sub-layers, each having a constant refractive index; the bundle is traced along a straight route in each sub-layer and the reflection at the inner interfaces is taken into account. The third strategy is similar to the second one, but only the total reflection at the inner interfaces is computed. Little difference is observed among the apparent thermal emission results of these three Monte Carlo ray-tracing strategies. The results also show that the apparent hemispherical emissivity varies non-monotonically with increasing optical thickness for a strongly scattering gradient-index slab. Many parameters can greatly influence the apparent thermal emission.
Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering
2009-03-11
A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique was used to record the failure of a component as an incident and to determine its occurrence probability by generating incident samples using random numbers. The dagger sampling technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the 2 methods with simulation times of up to 100,000 years were then compared. The comparative evaluation showed that less computing time was required using the dagger-sampling technique due to its higher convergence speed. When simulation times were 1000 years, the dagger-sampling method required 0.05 seconds to accomplish an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.
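The dagger-sampling idea, one uniform random number driving several negatively correlated Bernoulli trials, can be sketched as follows. The parameters are illustrative and the paper's distribution-system reliability model is not reproduced; the sketch only shows why the technique converges faster than crude sampling.

```python
import random

def dagger_samples(p, n_trials, rng):
    """Dagger sampling: one uniform random number u drives m = int(1/p)
    trials; trial j fails iff u falls in [j*p, (j+1)*p).  Each trial still
    fails with probability p, but at most one failure occurs per block,
    giving negatively correlated samples and a reduced variance."""
    m = int(1.0 / p)
    out = []
    while len(out) < n_trials:
        k = int(rng.random() / p)  # failing sub-interval (may be >= m: no failure)
        out.extend(1 if j == k else 0 for j in range(m))
    return out[:n_trials]
```

Estimating a failure probability of 0.3 from equal numbers of dagger and independent Bernoulli trials, the dagger estimator shows a markedly smaller run-to-run variance.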
Hatch, Harold W.; Jiao, Sally; Mahynski, Nathan A.; Blanco, Marco A.; Shen, Vincent K.
2017-12-01
Virial coefficients are predicted over a large range of both temperatures and model parameter values (i.e., alchemical transformation) from an individual Mayer-sampling Monte Carlo simulation by statistical mechanical extrapolation with minimal increase in computational cost. With this extrapolation method, a Mayer-sampling Monte Carlo simulation of the SPC/E (extended simple point charge) water model quantitatively predicted the second virial coefficient as a continuous function spanning over four orders of magnitude in value and over three orders of magnitude in temperature with less than a 2% deviation. In addition, the same simulation predicted the second virial coefficient if the site charges were scaled by a constant factor, from an increase of 40% down to zero charge. This method is also shown to perform well for the third virial coefficient and the exponential parameter for a Lennard-Jones fluid.
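The single-simulation extrapolation idea can be illustrated on a toy two-level system (energies 0 and 1): the mean energy and its temperature derivative, a fluctuation equal to -Var(E), are tallied at one β and Taylor-expanded to a nearby β without a second simulation. This is a schematic analogue of the extrapolation strategy, not the Mayer-sampling implementation, and every parameter is invented.

```python
import math
import random

def extrapolate_mean_energy(beta0, dbeta, n=100_000, seed=6):
    """Sample a two-level system (E in {0, 1}) at beta0, tally <E> and the
    fluctuation d<E>/dbeta = -Var(E), then first-order extrapolate the mean
    energy to beta0 + dbeta from the single simulation."""
    rng = random.Random(seed)
    p = math.exp(-beta0) / (1.0 + math.exp(-beta0))  # exact Boltzmann weight of E = 1
    es = [1.0 if rng.random() < p else 0.0 for _ in range(n)]
    mean = sum(es) / n
    var = sum(e * e for e in es) / n - mean * mean
    return mean - var * dbeta  # <E>(beta0 + dbeta) to first order
```

For this model <E>(β) = e^-β/(1 + e^-β) is known exactly, so the quality of the extrapolation can be checked directly.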
A multi-transputer system for parallel Monte Carlo simulations of extensive air showers
International Nuclear Information System (INIS)
Gils, H.J.; Heck, D.; Oehlschlaeger, J.; Schatz, G.; Thouw, T.
1989-01-01
A multiprocessor computer system has been brought into operation at the Kernforschungszentrum Karlsruhe. It is dedicated to Monte Carlo simulations of extensive air showers induced by ultra-high energy cosmic rays. The architecture consists of two independently working VMEbus systems, each with a 68020 microprocessor as host computer and twelve T800 transputers for parallel processing. The two systems are linked via Ethernet for data exchange. The T800 transputers are equipped with 4 Mbyte RAM each, sufficient to run rather large codes. The host computers are operated under UNIX 5.3. On the transputers, compilers for PARALLEL FORTRAN, C, and PASCAL are available. The simple modular architecture of this parallel computer reflects the single purpose for which it is intended. The hardware of the multiprocessor computer is described, as well as how the user software is handled and distributed to the 24 working processors. The performance of the parallel computer is demonstrated by well-known benchmarks and by realistic Monte Carlo simulations of air showers. Comparisons with other types of microprocessors and with large universal computers are made. It is demonstrated that a cost reduction by more than a factor of 20 is achieved by this system as compared to universal computers. (orig.)
Numerical integration of the Langevin equation: Monte Carlo simulation
International Nuclear Information System (INIS)
Ermak, D.L.; Buckholz, H.
1980-01-01
Monte Carlo simulation techniques are derived for solving the ordinary Langevin equation of motion for a Brownian particle in the presence of an external force. These methods allow considerable freedom in selecting the size of the time step, which is restricted only by the rate of change in the external force. This approach is extended to the generalized Langevin equation which uses a memory function in the friction force term. General simulation techniques are derived which are independent of the form of the memory function. A special method requiring less storage space is presented for the case of the exponential memory function
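A common position-space variant of such Brownian dynamics schemes advances the particle by a deterministic drift from the external force plus a Gaussian random displacement. The sketch below is illustrative, not the paper's derivation: it uses a harmonic force F = -kx, for which the stationary variance must approach kT/k, giving a built-in check.

```python
import math
import random

def brownian_dynamics(k_spring=1.0, kT=1.0, D=1.0, dt=0.01, steps=200_000, seed=4):
    """Overdamped Langevin step: x += (D/kT) * F(x) * dt + N(0, 2*D*dt).
    For the harmonic force F = -k_spring * x, returns the sampled stationary
    variance of x, which should approach kT / k_spring."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * D * dt)  # standard deviation of the random kick
    x, acc, n = 0.0, 0.0, 0
    for step in range(steps):
        x += (D / kT) * (-k_spring * x) * dt + rng.gauss(0.0, sigma)
        if step > steps // 10:       # discard an equilibration stretch
            acc += x * x
            n += 1
    return acc / n
```

The time step here is limited only by how fast the external force changes, the freedom the abstract emphasizes; for the linear force, dt = 0.01 is very conservative.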
CMS Monte Carlo production in the WLCG computing grid
International Nuclear Information System (INIS)
Hernandez, J M; Kreuzer, P; Hof, C; Khomitch, A; Mohapatra, A; Filippis, N D; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Weirdt, S D; Maes, J; Mulders, P v; Villella, I; Wakefield, S; Guan, W; Fanfani, A; Evans, D; Flossdorf, A
2008-01-01
Monte Carlo production in CMS has received a major boost in performance and scale since the past CHEP06 conference. The production system has been re-engineered in order to incorporate the experience gained in running the previous system and to integrate production with the new CMS event data model, data management system and data processing framework. The system is interfaced to the two major computing Grids used by CMS, the LHC Computing Grid (LCG) and the Open Science Grid (OSG). Operational experience and integration aspects of the new CMS Monte Carlo production system are presented together with an analysis of production statistics. The new system automatically handles job submission, resource monitoring, job queuing, job distribution according to the available resources, data merging, and registration of data into the data bookkeeping, data location, data transfer and placement systems. Compared to the previous production system, automation, reliability and performance have been considerably improved. A more efficient use of computing resources and a better handling of the inherent Grid unreliability have resulted in an increase of production scale by about an order of magnitude, capable of running on the order of ten thousand jobs in parallel and yielding more than two million events per day.
Applications of the Monte Carlo simulation in dosimetry and medical physics problems
International Nuclear Information System (INIS)
Rojas C, E. L.
2010-01-01
At present, the use of computers to solve important problems extends to all areas, whether social, economic, engineering, or basic and applied science. With appropriate handling of computer programs and information, calculations and simulations of real models can be carried out in order to study them and to solve theoretical or applied problems. Processes that contain random variables are amenable to the Monte Carlo method. This is a numerical method that, thanks to improvements in computer processors, can be applied to many more tasks than at the beginning of its practical application (in the early 1950s). In this work, the Monte Carlo method is applied to the simulation of the interaction of radiation with matter, in order to investigate dosimetric aspects of some problems in the medical physics area. An introduction reviewing some historical background and general concepts related to Monte Carlo simulation is also included. (Author)
Global Monte Carlo Simulation with High Order Polynomial Expansions
International Nuclear Information System (INIS)
William R. Martin; James Paul Holloway; Kaushik Banerjee; Jesse Cheatham; Jeremy Conlin
2007-01-01
The functional expansion technique (FET) was recently developed for Monte Carlo simulation. The basic idea of the FET is to expand a Monte Carlo tally in terms of a high order expansion, the coefficients of which can be estimated via the usual random walk process in a conventional Monte Carlo code. If the expansion basis is chosen carefully, the lowest order coefficient is simply the conventional histogram tally, corresponding to a flat mode. This research project studied the applicability of using the FET to estimate the fission source, from which fission sites can be sampled for the next generation. The idea is that individual fission sites contribute to expansion modes that may span the geometry being considered, possibly increasing the communication across a loosely coupled system and thereby improving convergence over the conventional fission bank approach used in most production Monte Carlo codes. The project examined a number of basis functions, including global Legendre polynomials as well as 'local' piecewise polynomials such as finite element hat functions and higher order versions. The global FET showed an improvement in convergence over the conventional fission bank approach. The local FET methods showed some advantages versus global polynomials in handling geometries with discontinuous material properties. The conventional finite element hat functions had the disadvantage that the expansion coefficients could not be estimated directly but had to be obtained by solving a linear system whose matrix elements were estimated. An alternative fission matrix-based response matrix algorithm was formulated. Studies were made of two alternative applications of the FET, one based on the kernel density estimator and one based on Arnoldi's method of minimized iterations. Preliminary results for both methods indicate improvements in fission source convergence. These developments indicate that the FET has promise for speeding up Monte Carlo fission source convergence
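The core FET tally, estimating expansion coefficients during the random walk, reduces to averaging basis functions over the sampled points. Below is a minimal sketch with global Legendre polynomials on [-1, 1]; the sampling density used in the check is a simple triangular toy distribution, not a fission source, and the code is not from the project described.

```python
import math
import random

def legendre_P(n, x):
    """Legendre polynomial P_n(x) via Bonnet's three-term recursion."""
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for k in range(2, n + 1):
        p0, p1 = p1, ((2 * k - 1) * x * p1 - (k - 1) * p0) / k
    return p1

def fet_coefficients(samples, order):
    """FET tally: a_n = (2n+1)/2 * <P_n(x)>, with the expectation estimated
    as a sample average over points produced by the random walk."""
    n_s = len(samples)
    return [(2 * n + 1) / 2.0 * sum(legendre_P(n, x) for x in samples) / n_s
            for n in range(order + 1)]
```

For the triangular density f(x) = (1 + x)/2 (sampled by inversion as x = 2*sqrt(u) - 1), the exact expansion is 0.5*P_0 + 0.5*P_1 with all higher coefficients zero, so the low-order tallies can be checked directly. The zeroth coefficient is the flat-mode histogram tally the abstract mentions.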
A New Approach to Monte Carlo Simulations in Statistical Physics
Landau, David P.
2002-08-01
Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions-due to critical slowing down near 2nd order transitions and to metastability near 1st order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E64, 056101-1 (2001).
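The random walk in energy space of Ref. [2] can be sketched on a toy model with a known density of states: N coins with energy equal to the number of heads, so that g(E) = C(N, E). The flat-histogram rule is the real one, but the model, the flatness threshold, and all control parameters below are illustrative choices.

```python
import math
import random

def wang_landau(n=10, f_final=1e-4, seed=5):
    """Wang-Landau sketch: flip single coins, accept moves with probability
    min(1, g[E_old]/g[E_new]), add ln f to ln g at every visited energy, and
    halve ln f whenever the energy histogram is roughly flat.  Returns ln g
    normalized so that ln g(0) = 0 (exact values: ln C(n, E))."""
    rng = random.Random(seed)
    spins = [0] * n
    E = 0                      # current energy = number of heads
    lng = [0.0] * (n + 1)      # running estimate of ln g(E)
    lnf = 1.0                  # modification factor
    while lnf > f_final:
        hist = [0] * (n + 1)
        for _ in range(20_000):
            i = rng.randrange(n)
            E_new = E + (1 if spins[i] == 0 else -1)
            dln = lng[E] - lng[E_new]
            if dln >= 0.0 or rng.random() < math.exp(dln):
                spins[i] ^= 1
                E = E_new
            lng[E] += lnf      # penalize the visited energy
            hist[E] += 1
        if min(hist) > 0.8 * (sum(hist) / (n + 1)):
            lnf /= 2.0         # histogram flat enough: refine ln f
    base = lng[0]
    return [v - base for v in lng]
```

Once ln g is known, canonical averages at any temperature follow by reweighting with g(E) exp(-βE), which is what makes the method a one-pass route to all thermodynamic properties.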
EGS4, Electron Photon Shower Simulation by Monte-Carlo
International Nuclear Information System (INIS)
1998-01-01
1 - Description of program or function: The EGS code system is one of a chain of three codes designed to solve the electromagnetic shower problem by Monte Carlo simulation. This chain makes possible simulation of almost any electron-photon transport problem conceivable. The structure of the system, with its global features, modular form, and structured programming, is readily adaptable to virtually any interfacing scheme that is desired on the part of the user. EGS4 is a package of subroutines plus block data with a flexible user interface. This allows for greater flexibility without requiring the user to be overly familiar with the internal details of the code. Combining this with the macro facility capabilities of the Mortran3 language, this reduces the likelihood that user edits will introduce bugs into the code. EGS4 uses material cross section and branching ratio data created and fit by the companion code, PEGS4. EGS4 allows for the implementation of importance sampling and other variance reduction techniques such as leading particle biasing, splitting, path length biasing, Russian roulette, etc. 2 - Method of solution: EGS employs the Monte Carlo method of solution. It allows all of the fundamental processes to be included and arbitrary geometries can be treated, also. Other minor processes, such as photoneutron production, can be added as a further generalization. Since showers develop randomly according to the quantum laws of probability, each shower is different. We again are led to the Monte Carlo method. 3 - Restrictions on the complexity of the problem: None noted
A virtual source method for Monte Carlo simulation of Gamma Knife Model C
Energy Technology Data Exchange (ETDEWEB)
Kim, Tae Hoon; Kim, Yong Kyun [Hanyang University, Seoul (Korea, Republic of); Chung, Hyun Tai [Seoul National University College of Medicine, Seoul (Korea, Republic of)
2016-05-15
The Monte Carlo simulation method has been used for dosimetry in radiation treatment. Monte Carlo simulation determines the paths and dosimetry of particles using random numbers. Recently, owing to the fast processing capability of computers, it has become possible to treat a patient more precisely. However, long simulation times are needed to reduce the statistical uncertainty. When generating particles from the cobalt sources in a simulation, many particles are cut off, so accurate simulation is time consuming. For efficiency, we generated a virtual source having the phase-space distribution acquired from a single Gamma Knife channel. We performed the simulation using the virtual sources on the 201 channels and compared measurements with simulations using virtual sources and real sources. A virtual source file was generated to reduce the simulation time of a Gamma Knife Model C. Simulations with a virtual source executed about 50 times faster than the original source code, and there was no statistically significant difference in the simulated results.
A virtual source method for Monte Carlo simulation of Gamma Knife Model C
International Nuclear Information System (INIS)
Kim, Tae Hoon; Kim, Yong Kyun; Chung, Hyun Tai
2016-01-01
The Monte Carlo simulation method has been used for dosimetry in radiation treatment. Monte Carlo simulation determines the paths and dosimetry of particles using random numbers. Recently, owing to the fast processing capability of computers, it has become possible to treat a patient more precisely. However, long simulation times are needed to reduce the statistical uncertainty. When generating particles from the cobalt sources in a simulation, many particles are cut off, so accurate simulation is time consuming. For efficiency, we generated a virtual source having the phase-space distribution acquired from a single Gamma Knife channel. We performed the simulation using the virtual sources on the 201 channels and compared measurements with simulations using virtual sources and real sources. A virtual source file was generated to reduce the simulation time of a Gamma Knife Model C. Simulations with a virtual source executed about 50 times faster than the original source code, and there was no statistically significant difference in the simulated results.
Energy Technology Data Exchange (ETDEWEB)
Radhakrishnan, B., E-mail: radhakrishnb@ornl.gov; Eisenbach, M.; Burress, T.A.
2017-06-15
Highlights: • Developed new scaling technique for dipole–dipole interaction energy. • Developed new scaling technique for exchange interaction energy. • Used scaling laws to extend atomistic simulations to micrometer length scale. • Demonstrated transition from mono-domain to vortex magnetic structure. • Simulated domain wall width and transition length scale agree with experiments. - Abstract: A new scaling approach has been proposed for the spin exchange and the dipole–dipole interaction energy as a function of the system size. The computed scaling laws are used in atomistic Monte Carlo simulations of magnetic moment evolution to predict the transition from a single domain to a vortex structure as the system size increases. The width of a 180° domain wall extracted from the simulated structures is in close agreement with experimentally measured values for an Fe–Si alloy. The transition size from a single domain to a vortex structure is also in close agreement with theoretically predicted and experimentally measured values for Fe.
Energy Technology Data Exchange (ETDEWEB)
Rehman, Fazal-ur- E-mail: fazalr@kfupm.edu.sa; Jamil, K.; Zakaullah, M.; Abu-Jarad, F.; Mujahid, S.A
2003-07-01
There are several methods of measuring radon concentrations, but nuclear track detector cylindrical dosimeters are widely employed. In this investigation, the consequence of the effective volumes of the dosimeters on the registration of alpha tracks in a CR-39 detector was studied. In a series of experiments an optimum radius for a CR-39-based open cylindrical radon dosimeter was found to be about 3 cm. Monte Carlo simulation techniques have been employed to verify the experimental results. In this context, a computer code, Monte Carlo simulation dosimetry (MOCSID), was developed. Monte Carlo simulation experiments gave the optimum radius of the dosimeters as 3.0 cm. The experimental results are in good agreement with those obtained by the Monte Carlo design calculations. In addition, plate-out effects of radon progeny were also studied. It was observed that the contribution of radon progeny ({sup 218}Po and {sup 214}Po) plated out on the wall of the dosimeters increases with increasing dosimeter radius and then decreases to zero at a radius of about 3 cm when a point detector is installed at the center of the dosimeter base. In the code MOCSID, different types of random number generators were employed. The results of this research are very useful for designing an optimum size of radon dosimeters.
International Nuclear Information System (INIS)
Rehman, Fazal-ur-; Jamil, K.; Zakaullah, M.; Abu-Jarad, F.; Mujahid, S.A.
2003-01-01
There are several methods of measuring radon concentrations, but nuclear track detector cylindrical dosimeters are widely employed. In this investigation, the consequence of the effective volumes of the dosimeters on the registration of alpha tracks in a CR-39 detector was studied. In a series of experiments an optimum radius for a CR-39-based open cylindrical radon dosimeter was found to be about 3 cm. Monte Carlo simulation techniques have been employed to verify the experimental results. In this context, a computer code, Monte Carlo simulation dosimetry (MOCSID), was developed. Monte Carlo simulation experiments gave the optimum radius of the dosimeters as 3.0 cm. The experimental results are in good agreement with those obtained by the Monte Carlo design calculations. In addition, plate-out effects of radon progeny were also studied. It was observed that the contribution of radon progeny ( 218 Po and 214 Po) plated out on the wall of the dosimeters increases with increasing dosimeter radius and then decreases to zero at a radius of about 3 cm when a point detector is installed at the center of the dosimeter base. In the code MOCSID, different types of random number generators were employed. The results of this research are very useful for designing an optimum size of radon dosimeters.
ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments
International Nuclear Information System (INIS)
Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob
2015-01-01
Highlights: • A fast Monte Carlo based radiation transport code ARCHER was developed. • ARCHER supports different hardware including CPU, GPU and Intel Xeon Phi coprocessor. • Code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation only takes 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package, called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented.
Research on Monte Carlo simulation method of industry CT system
International Nuclear Information System (INIS)
Li Junli; Zeng Zhi; Qui Rui; Wu Zhen; Li Chunyan
2010-01-01
There are a series of radiation physics problems in the design and production of an industry CT system (ICTS), including limit quality index analysis and the effects of scattering, detector efficiency and crosstalk on the system. Usually the Monte Carlo (MC) method is applied to resolve these problems. Most of them involve very small probabilities, so direct simulation is very difficult, and existing MC methods and programs cannot meet the needs. To resolve these difficulties, particle flux point auto-important sampling (PFPAIS) is given on the basis of auto-important sampling. Then, on the basis of PFPAIS, a particular ICTS simulation method, MCCT, is realized. Compared with existing MC methods, MCCT is shown to simulate the ICTS more exactly and effectively. Furthermore, the effects of various disturbances of the ICTS are simulated and analyzed by MCCT. To some extent, MCCT can guide research on the radiation physics problems in ICTS. (author)
Monte Carlo simulation of discrete γ-ray detectors
International Nuclear Information System (INIS)
Bakkali, A.; Tamda, N.; Parmentier, M.; Chavanelle, J.; Pousse, A.; Kastler, B.
2005-01-01
Needs in medical diagnosis, especially for early and reliable breast cancer detection, lead us to consider developments in scintillation crystals and position-sensitive photomultiplier tubes (PSPMTs) in order to develop a high-resolution medium-field γ-ray imaging device. However, the ideal detector for γ-rays represents a compromise between many conflicting requirements. In order to optimize the different parameters involved in the detection process, we have developed Monte Carlo simulation software. Its aim was to study the light distribution produced by a gamma photon interacting with a pixellated scintillation crystal coupled to a PSPMT array. Several crystal properties were taken into account, as well as the intrinsic response of the PSPMTs. Images obtained by simulation are compared with experimental results. Agreement between the simulation and experimental results validates our simulation model.
Quantum Monte Carlo Simulation of Frustrated Kondo Lattice Models
Sato, Toshihiro; Assaad, Fakher F.; Grover, Tarun
2018-03-01
The absence of the negative sign problem in quantum Monte Carlo simulations of spin and fermion systems has different origins. World-line based algorithms for spins require positivity of matrix elements whereas auxiliary field approaches for fermions depend on symmetries such as particle-hole symmetry. For negative-sign-free spin and fermionic systems, we show that one can formulate a negative-sign-free auxiliary field quantum Monte Carlo algorithm that allows Kondo coupling of fermions with the spins. Using this general approach, we study a half-filled Kondo lattice model on the honeycomb lattice with geometric frustration. In addition to the conventional Kondo insulator and antiferromagnetically ordered phases, we find a partial Kondo screened state where spins are selectively screened so as to alleviate frustration, and the lattice rotation symmetry is broken nematically.
The MCLIB library: Monte Carlo simulation of neutron scattering instruments
Energy Technology Data Exchange (ETDEWEB)
Seeger, P.A.
1995-09-01
Monte Carlo is a method to integrate over a large number of variables. Random numbers are used to select a value for each variable, and the integrand is evaluated. The process is repeated a large number of times and the resulting values are averaged. For a neutron transport problem, first select a neutron from the source distribution, project it through the instrument using either deterministic or probabilistic algorithms to describe its interaction whenever it hits something, and then (if it hits the detector) tally it in a histogram representing where and when it was detected. This is intended to simulate the process of running an actual experiment (but it is much slower). This report describes the philosophy and structure of MCLIB, a Fortran library of Monte Carlo subroutines which has been developed for the design of neutron scattering instruments. A pair of programs (LQDGEOM and MC_RUN) which use the library are shown as an example.
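The opening description (select random values, evaluate the integrand, average) can be written down in a few lines; the following is a generic sketch, not MCLIB code:

```python
import random

def mc_integrate(f, a, b, n, rng):
    """Estimate the integral of f over [a, b] by averaging f at n
    uniformly random points and scaling by the interval length."""
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

rng = random.Random(0)
# Integral of x^2 over [0, 1] is exactly 1/3.
est = mc_integrate(lambda x: x * x, 0.0, 1.0, 100_000, rng)
```

The statistical error shrinks as 1/sqrt(n), independent of the number of integration variables, which is why the method scales to the high-dimensional phase spaces of transport problems.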
The MCLIB library: Monte Carlo simulation of neutron scattering instruments
International Nuclear Information System (INIS)
Seeger, P.A.
1995-01-01
Monte Carlo is a method to integrate over a large number of variables. Random numbers are used to select a value for each variable, and the integrand is evaluated. The process is repeated a large number of times and the resulting values are averaged. For a neutron transport problem, first select a neutron from the source distribution, project it through the instrument using either deterministic or probabilistic algorithms to describe its interaction whenever it hits something, and then (if it hits the detector) tally it in a histogram representing where and when it was detected. This is intended to simulate the process of running an actual experiment (but it is much slower). This report describes the philosophy and structure of MCLIB, a Fortran library of Monte Carlo subroutines which has been developed for the design of neutron scattering instruments. A pair of programs (LQDGEOM and MC_RUN) which use the library are shown as an example.
Monte Carlo simulation of a mammographic test phantom
International Nuclear Information System (INIS)
Hunt, R. A.; Dance, D. R.; Pachoud, M.; Carlsson, G. A.; Sandborg, M.; Ullman, G.
2005-01-01
A test phantom, including a wide range of mammographic tissue equivalent materials and test details, was imaged on a digital mammographic system. In order to quantify the effect of scatter on the contrast obtained for the test details, calculations of the scatter-to-primary ratio (S/P) have been made using a Monte Carlo simulation of the digital mammographic imaging chain, grid and test phantom. The results show that the S/P values corresponding to the imaging conditions used were in the range 0.084-0.126. Calculated and measured pixel values in different regions of the image were compared as a validation of the model and showed excellent agreement. The results indicate the potential of Monte Carlo methods in the image quality-patient dose process optimisation, especially in the assessment of imaging conditions not available on standard mammographic units. (authors)
On Monte Carlo Simulation and Analysis of Electricity Markets
International Nuclear Information System (INIS)
Amelin, Mikael
2004-07-01
This dissertation is about how Monte Carlo simulation can be used to analyse electricity markets. There is a wide range of applications for simulation; for example, players in the electricity market can use simulation to decide whether or not an investment can be expected to be profitable, and authorities can by means of simulation find out which consequences a certain market design can be expected to have on electricity prices, environmental impact, etc. In the first part of the dissertation, the focus is on which electricity market models are suitable for Monte Carlo simulation. The starting point is a definition of an ideal electricity market. Such an electricity market is partly practical from a mathematical point of view (it is simple to formulate and does not require overly complex calculations) and partly a representation of the best possible resource utilisation. The definition of the ideal electricity market is followed by an analysis of how reality differs from the ideal model, what consequences the differences have on the rules of the electricity market and the strategies of the players, and how non-ideal properties can be included in a mathematical model. In particular, questions about environmental impact, forecast uncertainty and grid costs are studied. The second part of the dissertation treats the Monte Carlo technique itself. To reduce the number of samples necessary to obtain accurate results, variance reduction techniques can be used. Here, six different variance reduction techniques are studied and possible applications are pointed out. The conclusions of these studies are turned into a method for efficient simulation of basic electricity markets. The method is applied to some test systems, and the results show that the chosen variance reduction techniques can produce equal or better results using 99% fewer samples compared to when the same system is simulated without any variance reduction technique. More complex electricity market models
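The abstract does not name the six variance reduction techniques studied, but one standard example, antithetic variates, illustrates how the same accuracy can be reached with fewer samples (a generic sketch; the integrand is illustrative, not an electricity market model):

```python
import random

def plain_mc(f, n, rng):
    """Ordinary Monte Carlo estimate of E[f(U)] for U uniform on [0, 1]."""
    return sum(f(rng.random()) for _ in range(n)) / n

def antithetic_mc(f, n, rng):
    """Pair each uniform draw u with its mirror 1 - u. For a monotone f
    the two evaluations are negatively correlated, so their average has
    lower variance than two independent samples."""
    total = 0.0
    for _ in range(n // 2):
        u = rng.random()
        total += 0.5 * (f(u) + f(1.0 - u))
    return total / (n // 2)

f = lambda u: u ** 4            # E[f(U)] = 1/5 exactly
rng = random.Random(1)
est = antithetic_mc(f, 10_000, rng)
```

Antithetic sampling costs the same number of function evaluations as plain Monte Carlo; the gain is entirely in the reduced variance of each pair.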
Partial multicanonical algorithm for molecular dynamics and Monte Carlo simulations.
Okumura, Hisashi
2008-09-28
A partial multicanonical algorithm is proposed for molecular dynamics and Monte Carlo simulations. The partial multicanonical simulation samples a wide range of a part of the potential-energy terms, which is necessary to sample the conformational space widely, whereas a wide range of the total potential energy is sampled in the multicanonical algorithm. Thus, in the partial multicanonical simulation one can concentrate the effort of determining the weight factor on the important energy terms only. The partial multicanonical, multicanonical, and canonical molecular dynamics algorithms were applied to an alanine dipeptide in explicit water solvent. The canonical simulation sampled the P(II), C(5), alpha(R), and alpha(P) states. The multicanonical simulation covered the alpha(L) state as well as these states. The partial multicanonical simulation also sampled the C(7)(ax) state in addition to the states sampled by the multicanonical simulation. Furthermore, in the partial multicanonical simulation the backbone dihedral angles phi and psi rotated more frequently than in the multicanonical and canonical simulations. These results mean that the partial multicanonical algorithm has a higher sampling efficiency than the multicanonical and canonical algorithms.
Lattice Boltzmann accelerated direct simulation Monte Carlo for dilute gas flow simulations.
Di Staso, G; Clercx, H J H; Succi, S; Toschi, F
2016-11-13
Hybrid particle-continuum computational frameworks permit the simulation of gas flows by locally adjusting the resolution to the degree of non-equilibrium displayed by the flow in different regions of space and time. In this work, we present a new scheme that couples the direct simulation Monte Carlo (DSMC) with the lattice Boltzmann (LB) method in the limit of isothermal flows. The former handles strong non-equilibrium effects, as they typically occur in the vicinity of solid boundaries, whereas the latter is in charge of the bulk flow, where non-equilibrium can be dealt with perturbatively, i.e. according to Navier-Stokes hydrodynamics. The proposed concurrent multiscale method is applied to the dilute gas Couette flow, showing major computational gains when compared with the full DSMC scenarios. In addition, it is shown that the coupling with LB in the bulk flow can speed up the DSMC treatment of the Knudsen layer with respect to the full DSMC case. In other words, LB acts as a DSMC accelerator. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).
Massively parallel quantum computer simulator
De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.
2007-01-01
We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray
A hybrid transport-diffusion method for Monte Carlo radiative-transfer simulations
International Nuclear Information System (INIS)
Densmore, Jeffery D.; Urbatsch, Todd J.; Evans, Thomas M.; Buksas, Michael W.
2007-01-01
Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Monte Carlo particle-transport simulations in diffusive media. If standard Monte Carlo is used in such media, particle histories will consist of many small steps, resulting in a computationally expensive calculation. In DDMC, particles take discrete steps between spatial cells according to a discretized diffusion equation. Each discrete step replaces many small Monte Carlo steps, thus increasing the efficiency of the simulation. In addition, given that DDMC is based on a diffusion equation, it should produce accurate solutions if used judiciously. In practice, DDMC is combined with standard Monte Carlo to form a hybrid transport-diffusion method that can accurately simulate problems with both diffusive and non-diffusive regions. In this paper, we extend previously developed DDMC techniques in several ways that improve the accuracy and utility of DDMC for nonlinear, time-dependent, radiative-transfer calculations. The use of DDMC in these types of problems is advantageous since, due to the underlying linearizations, optically thick regions appear to be diffusive. First, we employ a diffusion equation that is discretized in space but is continuous in time. Not only is this methodology theoretically more accurate than temporally discretized DDMC techniques, but it also has the benefit that a particle's time is always known. Thus, there is no ambiguity regarding what time to assign a particle that leaves an optically thick region (where DDMC is used) and begins transporting by standard Monte Carlo in an optically thin region. Also, we treat the interface between optically thick and optically thin regions with an improved method, based on the asymptotic diffusion-limit boundary condition, that can produce accurate results regardless of the angular distribution of the incident Monte Carlo particles. Finally, we develop a technique for estimating radiation momentum deposition during the
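The core idea that one discrete diffusion step can replace many small Monte Carlo steps can be illustrated with a toy 1D analogue (not the authors' DDMC scheme): a single Gaussian jump whose variance matches the mean-square displacement of many short exponential flights.

```python
import random
import math

def transport_steps(n_scatter, mfp, rng):
    """Analogue random walk: n_scatter isotropic 1D flights whose lengths
    are exponentially distributed with mean free path mfp."""
    x = 0.0
    for _ in range(n_scatter):
        direction = 1.0 if rng.random() < 0.5 else -1.0
        x += direction * rng.expovariate(1.0 / mfp)
    return x

def diffusion_step(n_scatter, mfp, rng):
    """One Gaussian jump with the same mean-square displacement:
    Var = n_scatter * E[s^2] = 2 * n_scatter * mfp^2 for exponential
    flight lengths of mean mfp."""
    return rng.gauss(0.0, math.sqrt(2.0 * n_scatter) * mfp)

rng = random.Random(11)
n, mfp, samples = 200, 0.05, 4000
msd_walk = sum(transport_steps(n, mfp, rng) ** 2 for _ in range(samples)) / samples
msd_jump = sum(diffusion_step(n, mfp, rng) ** 2 for _ in range(samples)) / samples
```

Both estimates of the mean-square displacement agree (here, about 1.0), but the diffusion version costs one random draw per particle instead of hundreds, which is the efficiency gain DDMC exploits in optically thick regions.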
Understanding Quantum Tunneling through Quantum Monte Carlo Simulations.
Isakov, Sergei V; Mazzola, Guglielmo; Smelyanskiy, Vadim N; Jiang, Zhang; Boixo, Sergio; Neven, Hartmut; Troyer, Matthias
2016-10-28
The tunneling between the two ground states of an Ising ferromagnet is a typical example of many-body tunneling processes between two local minima, as they occur during quantum annealing. Performing quantum Monte Carlo (QMC) simulations, we find that the QMC tunneling rate displays the same scaling with system size as the rate of incoherent tunneling. The scaling in both cases is O(Δ²), where Δ is the tunneling splitting (or, equivalently, the minimum spectral gap). An important consequence is that QMC simulations can be used to predict the performance of a quantum annealer for tunneling through a barrier. Furthermore, by using open instead of periodic boundary conditions in imaginary time, equivalent to a projector QMC algorithm, we obtain a quadratic speedup for QMC simulations and achieve linear scaling in Δ. We provide a physical understanding of these results and their range of applicability based on an instanton picture.
Monte Carlo simulations for design of the KFUPM PGNAA facility
Naqvi, A A; Maslehuddin, M; Kidwai, S
2003-01-01
Monte Carlo simulations were carried out to design a 2.8 MeV neutron-based prompt gamma ray neutron activation analysis (PGNAA) setup for elemental analysis of cement samples. The elemental analysis was carried out using prompt gamma rays produced through capture of thermal neutrons in sample nuclei. The basic design of the PGNAA setup consists of a cylindrical cement sample enclosed in a cylindrical high-density polyethylene moderator placed between a neutron source and a gamma ray detector. In these simulations the predominant geometrical parameters of the PGNAA setup were optimized, including moderator size, sample size and shielding of the detector. Using the results of the simulations, an experimental PGNAA setup was then fabricated at the 350 kV Accelerator Laboratory of this University. The design calculations were checked experimentally through thermal neutron flux measurements inside the PGNAA moderator. A test prompt gamma ray spectrum of the PGNAA setup was also acquired from a Portland cement samp...
Monte Carlo Simulation for Statistical Decay of Compound Nucleus
Directory of Open Access Journals (Sweden)
Chadwick M.B.
2012-02-01
We perform Monte Carlo simulations for neutron and γ-ray emissions from a compound nucleus based on the Hauser-Feshbach statistical theory. This Monte Carlo Hauser-Feshbach (MCHF) method gives us correlated information between the emitted particles and γ-rays, and it will be a powerful tool in many applications, as nuclear reactions can be probed in a more microscopic way. We have been developing the MCHF code CGM, which solves the Hauser-Feshbach theory with the Monte Carlo method. The code includes all the standard models used in a standard Hauser-Feshbach code, namely the particle transmission generator, the level density module, the interface to the discrete level database, and so on. CGM can emit multiple neutrons, as long as the excitation energy of the compound nucleus is larger than the neutron separation energy. The γ-ray competition is always included at each compound decay stage, and angular momentum and parity are conserved. Some calculations for the fission fragment 140Xe are shown as examples of the MCHF method, and the correlation between neutrons and γ-rays is discussed.
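The MCHF sampling loop (emit neutrons while the excitation energy exceeds the separation energy, with the remainder going to γ-rays) can be caricatured as follows; the Maxwellian evaporation spectrum, the constant temperature, and the hard energy cutoff are simplifying assumptions, not the physics models in CGM:

```python
import random

def sample_evaporation(T, rng):
    """Kinetic energy from a Maxwellian evaporation spectrum
    f(e) ~ e * exp(-e/T), i.e. a Gamma(2, T) variate (sum of two
    exponentials with mean T)."""
    return rng.expovariate(1.0 / T) + rng.expovariate(1.0 / T)

def decay(e_star, s_n, T, rng):
    """Emit neutrons while the excitation energy e_star exceeds the
    neutron separation energy s_n; leftover energy is assumed to go to
    gamma rays (crude cutoff: stop when the sampled kinetic energy
    cannot be afforded). Returns (neutron multiplicity, gamma energy)."""
    nu = 0
    while e_star > s_n:
        e_kin = sample_evaporation(T, rng)
        if e_kin > e_star - s_n:
            break                        # remaining energy goes to gammas
        e_star -= s_n + e_kin
        nu += 1
    return nu, e_star

rng = random.Random(7)
# Hypothetical numbers: 20 MeV excitation, 6 MeV separation energy, T = 1 MeV.
results = [decay(20.0, 6.0, 1.0, rng) for _ in range(5000)]
mean_nu = sum(nu for nu, _ in results) / len(results)
```

Because each history carries its own multiplicity and residual γ-energy, event-by-event correlations between neutrons and γ-rays fall out of the simulation directly, which is the point the abstract emphasizes.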
Directory of Open Access Journals (Sweden)
Eric Dumonteil
2017-09-01
The Monte Carlo criticality simulation of decoupled systems, as for instance in large reactor cores, has been a challenging issue for a long time. In particular, due to limited computer time resources, the number of neutrons simulated per generation is still many orders of magnitude below realistic statistics, even during the start-up phases of reactors. This limited number of neutrons triggers a strong clustering effect in the neutron population that affects Monte Carlo tallies. Below a certain threshold, not only the variance but also the estimation of the eigenvectors is affected. In this paper we build a time-dependent diffusion equation that takes into account both spatial correlations and population control (a fixed number of neutrons along generations). We show that its solution obeys a traveling wave dynamic, and we discuss the mechanism that explains this biasing of local tallies whenever leakage boundary conditions are applied to the system.
A Pipelined and Parallel Architecture for Quantum Monte Carlo Simulations on FPGAs
Directory of Open Access Journals (Sweden)
Akila Gothandaraman
2010-01-01
Recent advances in Field-Programmable Gate Array (FPGA) technology make reconfigurable computing using FPGAs an attractive platform for accelerating scientific applications. We develop a deeply pipelined and parallel architecture for Quantum Monte Carlo simulations using FPGAs. Quantum Monte Carlo simulations enable us to obtain the structural and energetic properties of atomic clusters. We experiment with different pipeline structures for each component of the design and develop a deeply pipelined architecture that provides the best performance in terms of achievable clock rate, while at the same time making modest use of the FPGA resources. We discuss the details of the pipelined and generic architecture that is used to obtain the potential energy and wave function of a cluster of atoms.
Monte Carlo Simulations Validation Study: Vascular Brachytherapy Beta Sources
International Nuclear Information System (INIS)
Orion, I.; Koren, K.
2004-01-01
During the last decade many versions of angioplasty irradiation treatment have been proposed. The purpose of this unique brachytherapy is to deliver a sufficient radiation dose to the vein walls in order to prevent restenosis, a clinical sequel to balloon angioplasty. The most suitable sources for this vascular brachytherapy are β-emitters such as Re-188, P-32, and Sr-90/Y-90, with maximum energies of up to 2.1 MeV [1,2,3]. The radioactive catheter configurations offered for these treatments can be a simple wire [4], a fluid-filled balloon or a coated stent. Each source is positioned differently inside the blood vessel, and the ranges of the emitted electrons therefore vary. Many types of sources and configurations have been studied either experimentally or with the Monte Carlo calculation technique, with most of the Monte Carlo simulations carried out using EGS4 [5] or MCNP [6]. In this study we compared the beta-source absorbed dose versus radial distance for two treatment configurations using MCNP and EGS4 simulations. This comparison was aimed at discovering the differences between the MCNP and EGS4 simulation code systems in intermediate-energy electron transport.
Monte Carlo simulation of lattice bosons in three dimensions
International Nuclear Information System (INIS)
Blaer, A.; Han, J.
1992-01-01
We present an algorithm for calculating the thermodynamic properties of a system of nonrelativistic bosons on a three-dimensional spatial lattice. The method, which maps the three-dimensional quantum system onto a four-dimensional classical system, uses Monte Carlo sampling of configurations in either the canonical or the grand canonical ensemble. Our procedure is applicable to any system of lattice bosons with arbitrary short-range interactions. We test the algorithm by computing the temperature dependence of the energy, the heat capacity, and the condensate fraction of the free Bose gas.
Proceedings of the first symposium on Monte Carlo simulation
International Nuclear Information System (INIS)
2001-01-01
The first symposium on Monte Carlo simulation was held at the Mitsubishi Research Institute, Otemachi, Tokyo, on the 10th and 11th of September, 1998. The symposium was organized by the Nuclear Code Research Committee at the Japan Atomic Energy Research Institute. In the sessions, 21 papers were presented orally on code development, parallel calculation, reactor physics, burn-up, criticality, shielding safety, dose evaluation, nuclear fusion reactors, thermonuclear fusion plasma, nuclear transmutation, electromagnetic cascades, and fuel cycle facilities. The presented papers are compiled in these proceedings, and the 21 papers are indexed individually. (J.P.N.)
Monte Carlo simulation of radiation treatment machine heads
International Nuclear Information System (INIS)
Mohan, R.
1988-01-01
Monte Carlo simulations of radiation treatment machine heads provide practical means for obtaining energy spectra and angular distributions of photons and electrons. So far, most of the work published in the literature has been limited to photons and the contaminant electrons knocked out by photons. This chapter will be confined to megavoltage photon beams produced by medical linear accelerators and 60Co teletherapy units. The knowledge of energy spectra and angular distributions of photons and contaminant electrons emerging from such machines is important for a variety of applications in radiation dosimetry
Monte Carlo simulation experiments on box-type radon dosimeter
Energy Technology Data Exchange (ETDEWEB)
Jamil, Khalid, E-mail: kjamil@comsats.edu.pk; Kamran, Muhammad; Illahi, Ahsan; Manzoor, Shahid
2014-11-11
Epidemiological studies show that inhalation of radon gas (222Rn) may be carcinogenic, especially for mine workers, people living in closed, energy-conserving indoor environments, and underground dwellers. It is therefore of paramount importance to measure 222Rn concentrations (Bq/m3) in indoor environments. For this purpose, box-type passive radon dosimeters employing an ion track detector such as CR-39 are widely used. The fraction of radon alphas emitted in the volume of the box-type dosimeter that results in latent track formation on the CR-39 is the latent track registration efficiency, which is ultimately required to evaluate the radon concentration and, consequently, the effective dose and the radiological hazards. In this research, Monte Carlo simulation experiments were carried out to study the alpha latent track registration efficiency of a box-type radon dosimeter as a function of the dosimeter's dimensions and the range of alpha particles in air. Two different self-developed Monte Carlo simulation techniques were employed: (a) the surface ratio (SURA) method and (b) the ray hitting (RAHI) method. The simulation experiments revealed that there are two types of efficiency, i.e. intrinsic efficiency (η_int) and alpha hit efficiency (η_hit). η_int depends only on the dimensions of the dosimeter, whereas η_hit depends on both the dimensions of the dosimeter and the range of the alpha particles. The total latent track registration efficiency is the product of the intrinsic and hit efficiencies. It was concluded that if the diagonal length of the box-type dosimeter is kept smaller than the range of the alpha particle, a hit efficiency of 100% is achieved; nevertheless, the intrinsic efficiency still plays its role. The Monte Carlo simulation results have been found helpful in understanding the intricate track registration mechanisms in the box-type dosimeter. This paper
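A generic ray-hitting estimate in the spirit of the RAHI method might look like the following sketch (the box dimensions, the detector placed on one face, and straight-line alpha tracks are illustrative assumptions, not the authors' exact geometry):

```python
import random
import math

def hit_efficiency(lx, ly, lz, alpha_range, n, rng):
    """Fraction of alphas, emitted uniformly and isotropically inside a
    box of size lx * ly * lz, whose straight-line track reaches the
    detector face at z = 0 within the alpha range in air."""
    hits = 0
    for _ in range(n):
        x = lx * rng.random()
        y = ly * rng.random()
        z = lz * rng.random()
        cos_t = 2.0 * rng.random() - 1.0           # isotropic direction
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        phi = 2.0 * math.pi * rng.random()
        wx, wy, wz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
        if wz >= 0.0:
            continue                               # moving away from the face
        t = z / -wz                                # path length to z = 0
        if t > alpha_range:
            continue                               # alpha stops in air first
        hx, hy = x + t * wx, y + t * wy
        if 0.0 <= hx <= lx and 0.0 <= hy <= ly:
            hits += 1
    return hits / n

rng = random.Random(3)
# Hypothetical 5 cm cube with a 4 cm alpha range in air.
eff = hit_efficiency(5.0, 5.0, 5.0, 4.0, 20_000, rng)
```

Varying the box dimensions relative to the alpha range in such a sketch reproduces the qualitative trend the abstract describes: the hit efficiency saturates once the detector is within range of every emission point.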
Phase transition in nonuniform Josephson arrays: Monte Carlo simulations
Lozovik, Yu. E.; Pomirchy, L. M.
1994-01-01
A disordered 2D system with Josephson interactions is considered. The disordered XY model describes granular films, Josephson arrays, etc. Two types of disorder are analyzed: (1) a randomly diluted system, in which the Josephson coupling constants J_ij are equal to J with probability p or zero (the bond percolation problem); (2) coupling constants J_ij that are positive and distributed randomly and uniformly in some interval, either including the vicinity of zero or apart from it. These systems are simulated by the Monte Carlo method. The behaviour of the potential energy, specific heat, phase correlation function and helicity modulus is analyzed. The phase diagram of the diluted system in the T_c-p plane is obtained.
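Disorder type (1), bond dilution, is straightforward to set up; the sketch below builds a diluted coupling table and evaluates the XY energy of a configuration (a minimal illustration with assumed lattice size and dilution, not the authors' simulation code):

```python
import random
import math

def diluted_bonds(L, p, J, rng):
    """Horizontal and vertical couplings on an L x L lattice: J with
    probability p, zero otherwise (bond dilution, case (1) above)."""
    keep = lambda: J if rng.random() < p else 0.0
    horiz = [[keep() for _ in range(L)] for _ in range(L)]
    vert = [[keep() for _ in range(L)] for _ in range(L)]
    return horiz, vert

def xy_energy(theta, horiz, vert, L):
    """E = -sum over bonds of J_ij * cos(theta_i - theta_j), with
    periodic boundary conditions."""
    e = 0.0
    for i in range(L):
        for j in range(L):
            e -= horiz[i][j] * math.cos(theta[i][j] - theta[i][(j + 1) % L])
            e -= vert[i][j] * math.cos(theta[i][j] - theta[(i + 1) % L][j])
    return e

rng = random.Random(5)
L, p, J = 16, 0.7, 1.0
horiz, vert = diluted_bonds(L, p, J, rng)
theta = [[0.0] * L for _ in range(L)]       # fully aligned configuration
e_ground = xy_energy(theta, horiz, vert, L)
n_bonds = sum(v > 0 for row in horiz + vert for v in row)
```

For the fully aligned configuration every surviving bond contributes -J, so the energy equals -J times the number of kept bonds, a handy consistency check before adding Metropolis updates on the angles.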
Monte Carlo simulation of fully Markovian stochastic geometries
International Nuclear Information System (INIS)
Lepage, Thibaut; Delaby, Lucie; Malvagi, Fausto; Mazzolo, Alain
2010-01-01
Interest in solving the transport equation in stochastic media has continued to increase in recent years. For binary stochastic media it is often assumed that the geometry is Markovian, which is never the case in usual environments. In the present paper, based on rigorous mathematical theorems, we construct fully two-dimensional Markovian stochastic geometries and study their main properties. In particular, we determine a percolation threshold p_c = 0.586 ± 0.0015 for such geometries. Finally, Monte Carlo simulations are performed through these geometries and the results compared to homogeneous geometries. (author)
Monte Carlo simulation of particle-induced bit upsets
Wrobel, Frédéric; Touboul, Antoine; Vaillé, Jean-Roch; Boch, Jérôme; Saigné, Frédéric
2017-09-01
We investigate the issue of radiation-induced failures in electronic devices by developing a Monte Carlo tool called MC-Oracle. It is able to transport the particles in a device, to calculate the energy deposited in the sensitive region of the device, and to calculate the transient current induced by the primary particle and the secondary particles produced during nuclear reactions. We compare our simulation results with experiments on SRAMs irradiated with neutrons, protons and ions. The agreement is very good and shows that it is possible to predict the soft error rate (SER) for a given device in a given environment.
Monte Carlo simulation of particle-induced bit upsets
Directory of Open Access Journals (Sweden)
Wrobel Frédéric
2017-01-01
We investigate the issue of radiation-induced failures in electronic devices by developing a Monte Carlo tool called MC-Oracle. It is able to transport the particles in a device, to calculate the energy deposited in the sensitive region of the device, and to calculate the transient current induced by the primary particle and the secondary particles produced during nuclear reactions. We compare our simulation results with experiments on SRAMs irradiated with neutrons, protons and ions. The agreement is very good and shows that it is possible to predict the soft error rate (SER) for a given device in a given environment.
Application of direct simulation Monte Carlo method for analysis of AVLIS evaporation process
International Nuclear Information System (INIS)
Nishimura, Akihiko
1995-01-01
A computation code based on the direct simulation Monte Carlo (DSMC) method was developed in order to analyze atomic vapor evaporation in atomic vapor laser isotope separation (AVLIS). The atomic excitation temperatures of the gadolinium atom were calculated for a model with five low-lying states. The calculation results were compared with experimental results obtained by laser absorption spectroscopy. Two types of DSMC simulations, differing in the inelastic collision procedure, were carried out. It was concluded that energy transfer is forbidden unless the total energy of the colliding atoms exceeds a threshold value. (author)
Analysis of skin tissues spatial fluorescence distribution by the Monte Carlo simulation
International Nuclear Information System (INIS)
Churmakov, D Y; Meglinski, I V; Piletsky, S A; Greenhalgh, D A
2003-01-01
A novel Monte Carlo technique of simulation of spatial fluorescence distribution within the human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores, which would arise due to the structure of collagen fibres, compared to the epidermis and stratum corneum where the distribution of fluorophores is assumed to be homogeneous. The results of simulation suggest that distribution of auto-fluorescence is significantly suppressed in the near-infrared spectral region, whereas the spatial distribution of fluorescence sources within a sensor layer embedded in the epidermis is localized at an 'effective' depth
Analysis of skin tissues spatial fluorescence distribution by the Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Churmakov, D Y [School of Engineering, Cranfield University, Cranfield, MK43 0AL (United Kingdom); Meglinski, I V [School of Engineering, Cranfield University, Cranfield, MK43 0AL (United Kingdom); Piletsky, S A [Institute of BioScience and Technology, Cranfield University, Silsoe, MK45 4DT (United Kingdom); Greenhalgh, D A [School of Engineering, Cranfield University, Cranfield, MK43 0AL (United Kingdom)
2003-07-21
A novel Monte Carlo technique of simulation of spatial fluorescence distribution within the human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores, which would arise due to the structure of collagen fibres, compared to the epidermis and stratum corneum where the distribution of fluorophores is assumed to be homogeneous. The results of simulation suggest that distribution of auto-fluorescence is significantly suppressed in the near-infrared spectral region, whereas the spatial distribution of fluorescence sources within a sensor layer embedded in the epidermis is localized at an 'effective' depth.
Analysis of skin tissues spatial fluorescence distribution by the Monte Carlo simulation
Y Churmakov, D.; Meglinski, I. V.; Piletsky, S. A.; Greenhalgh, D. A.
2003-07-01
A novel Monte Carlo technique of simulation of spatial fluorescence distribution within the human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores, which would arise due to the structure of collagen fibres, compared to the epidermis and stratum corneum where the distribution of fluorophores is assumed to be homogeneous. The results of simulation suggest that distribution of auto-fluorescence is significantly suppressed in the near-infrared spectral region, whereas the spatial distribution of fluorescence sources within a sensor layer embedded in the epidermis is localized at an `effective' depth.
Clinical treatment planning for stereotactic radiotherapy, evaluation by Monte Carlo simulation
International Nuclear Information System (INIS)
Kairn, T.; Aland, T.; Kenny, J.; Knight, R.T.; Crowe, S.B.; Langton, C.M.; Franich, R.D.; Johnston, P.N.
2010-01-01
Full text: This study re-evaluates the doses delivered by a series of clinical stereotactic radiotherapy treatments, to test the accuracy of treatment planning predictions for very small radiation fields. Stereotactic radiotherapy treatment plans for meningiomas near the petrous temporal bone and the foramen magnum (incorporating fields smaller than 1 cm2) were examined using Monte Carlo simulations. Important differences between treatment planning predictions and Monte Carlo calculations of doses delivered to stereotactic radiotherapy patients are apparent. For example, in one case the Monte Carlo calculation shows that delivery of a planned meningioma treatment would spare the patient's critical structures (eyes, brainstem) more effectively than the treatment plan predicted, and therefore suggests that this patient could safely receive an increased dose to the tumour. Monte Carlo simulations can thus be used to test the dose predictions made by a conventional treatment planning system for dosimetrically challenging small fields, and can thereby suggest valuable modifications to clinical treatment plans. This research was funded by the Wesley Research Institute, Australia. The authors wish to thank Andrew Fielding and David Schlect for valuable discussions of aspects of this work. The authors are also grateful to Muhammad Kakakhel, for assisting with the design and calibration of our linear accelerator model, and to the stereotactic radiation therapy team at Premion, who designed the treatment plans. Computational resources and services used in this work were provided by the HPC and Research Support Unit, QUT, Brisbane, Australia. (author)
Molecular Monte Carlo Simulations Using Graphics Processing Units: To Waste Recycle or Not?
Kim, Jihan; Rodgers, Jocelyn M; Athènes, Manuel; Smit, Berend
2011-10-11
In the waste recycling Monte Carlo (WRMC) algorithm, (1) multiple trial states may be simultaneously generated and utilized during Monte Carlo moves to improve the statistical accuracy of the simulations, suggesting that such an algorithm may be well posed for implementation in parallel on graphics processing units (GPUs). In this paper, we implement two waste recycling Monte Carlo algorithms in CUDA (Compute Unified Device Architecture) using uniformly distributed random trial states and trial states based on displacement random-walk steps, and we test the methods on a methane-zeolite MFI framework system to evaluate their utility. We discuss the specific implementation details of the waste recycling GPU algorithm and compare the methods to other parallel algorithms optimized for the framework system. We analyze the relationship between the statistical accuracy of our simulations and the CUDA block size to determine the efficient allocation of the GPU hardware resources. We make comparisons between the GPU and the serial CPU Monte Carlo implementations to assess speedup over conventional microprocessors. Finally, we apply our optimized GPU algorithms to the important problem of determining free energy landscapes, in this case for molecular motion through the zeolite LTA.
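The core waste-recycling idea — letting rejected trial states contribute to ensemble averages, weighted by their acceptance probabilities, instead of discarding them — can be illustrated without GPUs. Below is a minimal single-trial Python sketch estimating ⟨x²⟩ of a standard Gaussian target; the function name, target distribution and step size are illustrative assumptions, not details from the paper:

```python
import math
import random

def wrmc_mean_x2(n_steps=20000, step=1.0, seed=5):
    """Metropolis sampling of pi(x) ~ exp(-x^2/2) with waste recycling:
    every trial state contributes to the estimator, weighted by its
    acceptance probability, instead of being thrown away on rejection."""
    rng = random.Random(seed)
    x = 0.0
    total = 0.0
    for _ in range(n_steps):
        y = x + rng.uniform(-step, step)
        # Metropolis acceptance probability for a symmetric proposal
        p_acc = min(1.0, math.exp(0.5 * (x * x - y * y)))
        # waste-recycling estimator: blend trial and current states
        total += p_acc * y * y + (1.0 - p_acc) * x * x
        if rng.random() < p_acc:
            x = y
    return total / n_steps
```

Because every trial state contributes, this estimator typically has lower variance than the conventional accepted-states-only average at the same chain length, which is what makes generating many trial states per move on a GPU attractive.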
Kadoura, Ahmad Salim; Sun, Shuyu; Salama, Amgad
2014-01-01
thermodynamically consistent technique to regenerate rapidly Monte Carlo Markov Chains (MCMCs) at different thermodynamic conditions from the existing data points that have been pre-computed with expensive classical simulation. This technique can speed up
Monte Carlo simulation of a clinical linear accelerator
International Nuclear Information System (INIS)
Lin, S.-Y.; Chu, T.-C.; Lin, J.-P.
2001-01-01
The effects of the physical parameters of an electron beam from a Siemens PRIMUS clinical linear accelerator (linac) on the dose distribution in water were investigated by Monte Carlo simulation. The EGS4 user code OMEGA/BEAM was used in this study. Various incident electron beams, for example with different energies, spot sizes and distances from the point source, were simulated using the detailed linac head structure in the 6 MV photon mode. Approximately 10 million particles were collected in the scoring plane, which was set under the reticle, to form the so-called phase space file. The phase space file served as a source for simulating the dose distribution in water using DOSXYZ. Dose profiles at D_max (1.5 cm) and PDD curves were calculated after simulating about 1 billion histories for dose profiles and 500 million histories for percent depth dose (PDD) curves in a 30×30×30 cm³ water phantom. The simulation results were compared with the data measured by a CEA film and an ion chamber. The results show that the dose profiles are influenced by the energy and the spot size, while PDD curves are primarily influenced by the energy of the incident beam. The effect of the distance from the point source on the dose profile is not significant, and this distance is recommended to be set at infinity. We also recommend adjusting the beam energy by using PDD curves and then adjusting the spot size by using the dose profile to maintain the consistency of the Monte Carlo results and measured data.
International Nuclear Information System (INIS)
Vautrin, M.
2011-01-01
Contrast-enhanced stereotactic synchrotron radiation therapy (SSRT) is an innovative technique based on localized dose-enhancement effects obtained by reinforced photoelectric absorption in the tumor. Medium-energy monochromatic X-rays (50 - 100 keV) are used for irradiating tumors previously loaded with a high-Z element. Clinical trials of SSRT are being prepared at the European Synchrotron Radiation Facility (ESRF), where an iodinated contrast agent will be used. In order to compute the energy deposited in the patient (dose), a dedicated treatment planning system (TPS) has been developed for the clinical trials, based on the ISOgray TPS. This work focuses on the SSRT-specific modifications of the TPS, especially to the PENELOPE-based Monte Carlo dose engine. The TPS uses a dedicated Monte Carlo simulation of medium-energy polarized photons to compute the deposited energy in the patient. Simulations are performed considering the synchrotron source, the modeled beamline geometry and finally the patient. Specific materials were also implemented in the voxelized geometry of the patient, to account for iodine concentrations in the tumor. The computation process has been optimized and parallelized. Finally, a specific computation of absolute doses and associated irradiation times (instead of monitor units) was implemented. The dedicated TPS was validated with depth dose curves, dose profiles and absolute dose measurements performed at the ESRF in a water tank and solid water phantoms with or without bone slabs. (author)
CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC
International Nuclear Information System (INIS)
Wu, Y.; Song, J.; Zheng, H.; Sun, G.; Hao, L.; Long, P.; Hu, L.
2013-01-01
SuperMC is a Computer-Aided-Design (CAD) based Monte Carlo (MC) program for integrated simulation of nuclear systems developed by the FDS Team (China), making use of a hybrid MC-deterministic method and advanced computer technologies. The design aim, architecture and main methodology of SuperMC are presented in this paper. Its treatment of multi-physics processes and its use of advanced computer technologies, such as automatic geometry modeling, intelligent data analysis and visualization, high-performance parallel computing and cloud computing, contribute to the efficiency of the code. SuperMC2.1, the latest version of the code for neutron, photon and coupled neutron-photon transport calculation, has been developed and validated by using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model.
Kadoura, Ahmad Salim
2014-03-17
Molecular simulation can provide a detailed description of fluid systems compared to experimental techniques, and can also replace equations of state; however, molecular simulation usually entails considerable computational effort. Several techniques have been developed to overcome such high computational costs. In this paper, two early rejection schemes, a conservative one and a hybrid one, are introduced. In these two methods, undesired configurations generated by the Monte Carlo trials are rejected earlier than they would be when using conventional algorithms. The methods are tested for structureless single-component Lennard-Jones particles in both the canonical and NVT-Gibbs ensembles. A reduction in computational time is observed for both ensembles over a wide range of thermodynamic conditions. Results show that computational time savings are directly proportional to the rejection rate of Monte Carlo trials. The proposed conservative scheme is shown to save up to 40% of the computational time in the canonical ensemble and up to 30% in the NVT-Gibbs ensemble when compared to standard algorithms. In addition, it preserves the exact Markov chains produced by the Metropolis scheme. Further enhancement for the NVT-Gibbs ensemble is achieved by combining this technique with the bond-formation early rejection technique. The hybrid method achieves more than 50% saving of central processing unit (CPU) time.
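The conservative early-rejection idea can be sketched as follows: since the Metropolis test u < exp(-βΔE) is equivalent to ΔE < -ln(u)/β, the random number can be drawn before the energy sum is complete, and the trial abandoned as soon as a lower bound on the new energy already exceeds the threshold. A minimal Python sketch for Lennard-Jones particles under stated assumptions (the names and the O(N) energy loop are illustrative; production codes use cell lists and cutoffs):

```python
import math
import random

EPS, SIG = 1.0, 1.0   # Lennard-Jones parameters (reduced units)
LJ_MIN = -EPS         # lower bound of a single LJ pair energy

def lj(r2):
    s6 = (SIG * SIG / r2) ** 3
    return 4.0 * EPS * s6 * (s6 - 1.0)

def pair_energy(coords, i, pos, box):
    """Energy of particle i at `pos` with all others (minimum image)."""
    e = 0.0
    for j, q in enumerate(coords):
        if j == i:
            continue
        d2 = sum(min(abs(a - b), box - abs(a - b)) ** 2
                 for a, b in zip(pos, q))
        e += lj(d2)
    return e

def move_early_rejection(coords, box, beta, dmax, rng):
    """One displacement trial with conservative early rejection.
    Produces exactly the same Markov chain as standard Metropolis."""
    i = rng.randrange(len(coords))
    old = coords[i]
    new = tuple((c + rng.uniform(-dmax, dmax)) % box for c in old)
    e_old = pair_energy(coords, i, old, box)
    # Accept iff e_new - e_old < -ln(u)/beta: draw u first, fix threshold.
    threshold = -math.log(1.0 - rng.random()) / beta + e_old
    e_new, remaining = 0.0, len(coords) - 1
    for j, q in enumerate(coords):
        if j == i:
            continue
        d2 = sum(min(abs(a - b), box - abs(a - b)) ** 2
                 for a, b in zip(new, q))
        e_new += lj(d2)
        remaining -= 1
        if e_new + remaining * LJ_MIN > threshold:
            return False   # early rejection: no remaining pair can save it
    coords[i] = new
    return True
```

The check inside the loop uses the fact that each remaining pair can contribute at most LJ_MIN = -ε, so once the partial sum plus the best possible remainder exceeds the threshold, the final outcome is already decided.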
Monte Carlo simulation of the HEGRA cosmic ray detector performance
Energy Technology Data Exchange (ETDEWEB)
Martinez, S. [Universidad Complutense de Madrid (Spain). Dept. de Fisica Atomica, Molecular y Nuclear; Arqueros, F. [Universidad Complutense de Madrid (Spain). Dept. de Fisica Atomica, Molecular y Nuclear; Fonseca, V. [Universidad Complutense de Madrid (Spain). Dept. de Fisica Atomica, Molecular y Nuclear; Karle, A. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, D80805 Munich (Germany); Lorenz, E. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, D80805 Munich (Germany); Plaga, R. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, D80805 Munich (Germany); Rozanska, M. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, D80805 Munich (Germany)]|[Institute of Nuclear Physics, ul.Kawiory 26a, PL30-055 Cracow (Poland)
1995-04-21
Models of the scintillator and wide-angle air Cherenkov (AIROBICC) arrays of the HEGRA experiment are described here. Their response to extensive air showers generated by cosmic rays in the 10 to 1000 TeV range has been assessed using a detailed Monte Carlo simulation of air shower development and associated Cherenkov emission. Protons, γ-rays and oxygen and iron nuclei have been considered as primary particles. For both arrays, the angular resolution as determined from the Monte Carlo simulation is compared with experimental data. Shower size N_e can be reconstructed from the scintillator signals with an error ranging from 10% (N_e = 2×10^5) to 35% (N_e = 3×10^3). The energy threshold of AIROBICC is 14 TeV for primary gammas and 27 TeV for protons, and an angular resolution of 0.25° can be obtained. The measurement of the Cherenkov light at 90 m from the shower core provides an accurate determination of primary energy E_0 as long as the nature of the primary particle is known. For gammas, an error in the energy prediction ranging from 8% (E_0 = 5×10^14 eV) to 15% (E_0 = 2×10^13 eV) is achieved. This detector is therefore a powerful tool for γ-ray astronomy. (orig.)
Characterization of a cylindrical plastic β-detector with Monte Carlo simulations of optical photons
Energy Technology Data Exchange (ETDEWEB)
Guadilla, V., E-mail: victor.guadilla@ific.uv.es [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Algora, A. [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Institute of Nuclear Research of the Hungarian Academy of Sciences, Debrecen H-4026 (Hungary); Tain, J.L.; Agramunt, J. [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Äystö, J. [University of Jyvaskyla, Department of Physics, P.O. Box 35, FI-40014 (Finland); Briz, J.A.; Cucoanes, A. [Subatech, CNRS/IN2P3, Nantes, EMN, F-44307 Nantes (France); Eronen, T. [University of Jyvaskyla, Department of Physics, P.O. Box 35, FI-40014 (Finland); Estienne, M.; Fallot, M. [Subatech, CNRS/IN2P3, Nantes, EMN, F-44307 Nantes (France); Fraile, L.M. [Universidad Complutense, Grupo de Física Nuclear, CEI Moncloa, E-28040 Madrid (Spain); Ganioğlu, E. [Department of Physics, Istanbul University, 34134 Istanbul (Turkey); Gelletly, W. [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Department of Physics, University of Surrey, GU2 7XH Guildford (United Kingdom); Gorelov, D.; Hakala, J.; Jokinen, A. [University of Jyvaskyla, Department of Physics, P.O. Box 35, FI-40014 (Finland); Jordan, D. [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Kankainen, A.; Kolhinen, V.; Koponen, J. [University of Jyvaskyla, Department of Physics, P.O. Box 35, FI-40014 (Finland); and others
2017-05-11
In this work we report on the Monte Carlo study performed to understand and reproduce experimental measurements of a new plastic β-detector with cylindrical geometry. Since energy deposition simulations differ from the experimental measurements for such a geometry, we show how simulating the production and transport of optical photons allows one to obtain the shapes of the experimental spectra. Moreover, given the computational effort associated with this kind of simulation, we develop a method to convert the simulated deposited energy into collected light, depending only on the interaction point in the detector. This method represents a useful solution when extensive simulations have to be done, as in the case of the calculation of the response function of the spectrometer in a total absorption γ-ray spectroscopy analysis.
BOMAB phantom manufacturing quality assurance study using Monte Carlo computations
International Nuclear Information System (INIS)
Mallett, M.W.
1994-01-01
Monte Carlo calculations have been performed to assess the importance of, and to quantify, quality assurance protocols in the manufacturing of the Bottle-Manikin-Absorption (BOMAB) phantom for calibrating in vivo measurement systems. The parameters characterizing the BOMAB phantom that were examined included height, fill volume, fill material density, wall thickness, and source concentration. Transport simulation was performed for monoenergetic photon sources of 0.200, 0.662, and 1.460 MeV. A linear response was observed in the photon current exiting the exterior surface of the BOMAB phantom due to variations in these parameters. Sensitivity studies were also performed for an in vivo system in operation at the Pacific Northwest Laboratories in Richland, WA. Variations in detector current for this in vivo system are reported for changes in the BOMAB phantom parameters studied here. Physical justifications for the observed results are also discussed.
Monte Carlo simulation of medical linear accelerator using primo code
International Nuclear Information System (INIS)
Omer, Mohamed Osman Mohamed Elhasan
2014-12-01
The use of Monte Carlo simulation has become very important in the medical field, especially for calculations in radiotherapy. Various Monte Carlo codes have been developed to simulate the interactions of particles and photons with matter. One of these codes is PRIMO, which simulates radiation transport from the primary electron source of a linac to estimate the absorbed dose in a water phantom or computerized tomography (CT) volume. PRIMO is based on the PENELOPE Monte Carlo code. Measurements of the 6 MV photon beam PDD and profile were made for an Elekta Precise linear accelerator at the Radiation and Isotopes Center Khartoum using a computerized Blue water phantom and a CC13 ionization chamber. The Accept software was used to control the phantom and to measure and verify the dose distribution. An Elekta linac from the list of available linacs in PRIMO was tuned to model the Elekta Precise linear accelerator. Beam parameters of 6.0 MeV initial electron energy, 0.20 MeV energy FWHM, and 0.20 cm focal spot FWHM were used, and an error of 4% between calculated and measured curves was found. The buildup depth Z_max was 1.40 cm, and homogeneous profiles in the crossline and inline directions were acquired. A number of studies were done to verify the model's usability; one of them examined the effect of the number of histories on the accuracy of the simulation and the resulting profile for the same beam parameters. The effect was noticeable, and inaccuracies in the profile were reduced by increasing the number of histories. Another study examined the effect of side-step errors on the calculated dose, which was compared with the measured dose for the same setting. It was in the range of 2% for a 5 cm shift, but it was higher for the calculated dose because of the small difference between the tuned model and the measured dose curves. Future developments include simulating asymmetrical fields, calculating the dose distribution in a computerized tomographic (CT) volume, and studying the effect of beam modifiers on the beam profile for both electron and photon beams. (Author)
Monte Carlo simulation of electron swarms in H2
International Nuclear Information System (INIS)
Hunter, S.R.
1977-01-01
A Monte Carlo simulation of the motion of an electron swarm in molecular hydrogen has been studied in the range E/N = 1.4-170 Td. The simulation was performed for 400-600 electrons at several values of E/N for two different sets of inelastic collision cross sections at high E/N. Results were obtained for the longitudinal diffusion coefficient D_L, lateral diffusion coefficient D, swarm drift velocity W, average swarm energy and ionization and excitation production coefficients, and these were compared with experimental data where available. It is found that the results differ significantly from the experimental values and this is attributed to the isotropic scattering model used in this work. However, the results lend support to the experimental technique used recently by Blevin et al. to determine these transport parameters, and in particular confirm their result that D_L > D at high values of E/N. (Author)
Comparison of Bootstrap Confidence Intervals Using Monte Carlo Simulations
Directory of Open Access Journals (Sweden)
Roberto S. Flowers-Cano
2018-02-01
Full Text Available Design of hydraulic works requires the estimation of design hydrological events by statistical inference from a probability distribution. Using Monte Carlo simulations, we compared the coverage of confidence intervals constructed with four bootstrap techniques: percentile bootstrap (BP), bias-corrected bootstrap (BC), accelerated bias-corrected bootstrap (BCA) and a modified version of the standard bootstrap (MSB). Different simulation scenarios were analyzed. In some cases, the mother distribution function was fit to the random samples that were generated. In other cases, a distribution function different from the mother distribution was fit to the samples. When the fitted distribution had three parameters, and was the same as the mother distribution, the intervals constructed with the four techniques had acceptable coverage. However, the bootstrap techniques failed in several of the cases in which the fitted distribution had two parameters.
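The coverage experiment described above can be sketched for the simplest of the four techniques, the percentile bootstrap (BP). This is an illustrative Python sketch with a normal mother distribution and the sample mean as the estimated quantity, not the hydrological distributions and quantiles of the study:

```python
import random
import statistics

def percentile_bootstrap_ci(sample, rng, n_boot=1000, alpha=0.05):
    """Percentile-bootstrap (BP) confidence interval for the mean."""
    n = len(sample)
    means = sorted(statistics.fmean(rng.choices(sample, k=n))
                   for _ in range(n_boot))
    return means[int(alpha / 2 * n_boot)], means[int((1 - alpha / 2) * n_boot) - 1]

def coverage(mu=10.0, sigma=2.0, n=30, trials=200, seed=1):
    """Monte Carlo coverage: fraction of trials whose CI contains mu."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        lo, hi = percentile_bootstrap_ci(sample, rng)
        hits += lo <= mu <= hi
    return hits / trials
```

For a nominal 95% interval, the Monte Carlo coverage estimate should come out near 0.95; systematic undercoverage for small samples is exactly the kind of failure the study quantifies.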
CORPORATE VALUATION USING TWO-DIMENSIONAL MONTE CARLO SIMULATION
Directory of Open Access Journals (Sweden)
Toth Reka
2010-12-01
Full Text Available In this paper, we present a corporate valuation model. The model combines several valuation methods in order to obtain more accurate results. To determine the corporate asset value we used a Gordon-like two-stage asset valuation model based on the calculation of the free cash flow to the firm. We used the free cash flow to the firm to determine the corporate market value, which was calculated with the Black-Scholes option pricing model in the framework of the two-dimensional Monte Carlo simulation method. The combined model and the use of the two-dimensional simulation provide a better opportunity for corporate value estimation.
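A hedged sketch of the nested ("two-dimensional") Monte Carlo idea: an inner loop values a European option under risk-neutral geometric Brownian motion, and an outer loop samples an uncertain input (here volatility). All function names and parameter choices are illustrative assumptions, not the paper's calibration:

```python
import math
import random

def mc_call_value(s0, k, r, sigma, t, n_paths=100_000, seed=7):
    """Inner loop: risk-neutral Monte Carlo value of a European call
    using the GBM terminal-price formula."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    payoff = 0.0
    for _ in range(n_paths):
        st = s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        payoff += max(st - k, 0.0)
    return math.exp(-r * t) * payoff / n_paths

def two_dimensional_value(s0, k, r, t, vol_low, vol_high, n_outer=50, seed=3):
    """Outer loop: sample the uncertain volatility, value the option
    for each draw, and average over the parameter uncertainty."""
    rng = random.Random(seed)
    values = []
    for i in range(n_outer):
        sigma = rng.uniform(vol_low, vol_high)
        values.append(mc_call_value(s0, k, r, sigma, t,
                                    n_paths=2_000, seed=seed + i))
    return sum(values) / len(values)
```

Separating the two loops keeps parameter uncertainty (outer dimension) distinct from path-by-path stochastic uncertainty (inner dimension), which is the essence of a two-dimensional simulation.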
Monte Carlo Simulation of a Linear Accelerator and Electron Beam Parameters Used in Radiotherapy
Directory of Open Access Journals (Sweden)
Mohammad Taghi Bahreyni Toossi
2009-06-01
Full Text Available Introduction: In recent decades, several Monte Carlo codes have been introduced for research and medical applications. These methods provide both accurate and detailed calculation of particle transport from linear accelerators. The main drawback of Monte Carlo techniques is the extremely long computing time required to obtain a dose distribution with good statistical accuracy. Material and Methods: In this study, the MCNP-4C Monte Carlo code was used to simulate the electron beams generated by a Neptun 10 PC linear accelerator. The depth dose curves, parameters related to depth dose, and beam profiles were calculated for 6, 8 and 10 MeV electron beams with different field sizes, and these data were compared with the corresponding measured values. The actual dosimetry was performed with a Welhofer-Scanditronix dose scanning system, semiconductor detectors and ionization chambers. Results: The results showed good agreement (better than 2% between calculated and measured depth doses and lateral dose profiles for all energies in different field sizes. Good agreement was also achieved between calculated and measured electron beam parameters such as E0, Rq, Rp and R50. Conclusion: The simulated model of the linac developed in this study is capable of computing electron beam data in a water phantom for different field sizes, and the resulting data can be used to predict the dose distributions in other complex geometries.
Energy Technology Data Exchange (ETDEWEB)
Rojas C, E. L., E-mail: leticia.rojas@inin.gob.m [ININ, Gerencia de Ciencias Ambientales, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)
2010-07-01
At the present time, the use of computers to solve important problems extends to all areas, whether social, economic, engineering, or basic and applied science. With appropriate handling of computer programs and information, calculations and simulations of real models can be carried out in order to study them and to solve theoretical or applied problems. Processes that contain random variables are amenable to the Monte Carlo method. This is a numerical method that, thanks to improvements in computer processors, can now be applied to many more tasks than in the early days of its practical application (at the beginning of the 1950s). In this work, the Monte Carlo method is applied to the simulation of the interaction of radiation with matter, to investigate dosimetric aspects of some problems in the medical physics area. An introduction reviewing some historical background and general concepts related to Monte Carlo simulation is also included. (Author)
Dynamic connectivity algorithms for Monte Carlo simulations of the random-cluster model
International Nuclear Information System (INIS)
Elçi, Eren Metin; Weigel, Martin
2014-01-01
We review Sweeny's algorithm for Monte Carlo simulations of the random cluster model. Straightforward implementations suffer from the problem of computational critical slowing down, where the computational effort per edge operation scales with a power of the system size. By using a tailored dynamic connectivity algorithm we are able to perform all operations with a poly-logarithmic computational effort. This approach is shown to be efficient in keeping online connectivity information and is of use for a number of applications also beyond cluster-update simulations, for instance in monitoring droplet shape transitions. As the handling of the relevant data structures is non-trivial, we provide a Python module with a full implementation for future reference.
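A minimal sketch of Sweeny's single-bond heat-bath update in Python, using a naive BFS connectivity check in place of the poly-logarithmic dynamic connectivity structure the paper develops; the lattice construction and names are illustrative assumptions:

```python
import random
from collections import deque

def connected_without(edges_open, u, v, e):
    """BFS: are u and v connected via open edges other than e?
    Naive O(V+E) check; Sweeny's method replaces exactly this step
    with a dynamic connectivity algorithm of poly-log cost."""
    adj = {}
    for (a, b), is_open in edges_open.items():
        if is_open and (a, b) != e:
            adj.setdefault(a, []).append(b)
            adj.setdefault(b, []).append(a)
    seen, queue = {u}, deque([u])
    while queue:
        x = queue.popleft()
        if x == v:
            return True
        for y in adj.get(x, ()):
            if y not in seen:
                seen.add(y)
                queue.append(y)
    return False

def sweeny_sweep(edges_open, p, q, rng):
    """One heat-bath sweep over all edges of the random-cluster model
    with weight p^|A| (1-p)^(|E|-|A|) q^k(A)."""
    for e in edges_open:
        u, v = e
        if connected_without(edges_open, u, v, e):
            prob_open = p                        # opening leaves k(A) unchanged
        else:
            prob_open = p / (p + q * (1.0 - p))  # opening merges two clusters
        edges_open[e] = rng.random() < prob_open
```

For q = 1 both branches reduce to prob_open = p, recovering independent bond percolation, which is a convenient sanity check; for general q the cost is dominated by the connectivity query, hence the paper's focus on dynamic connectivity.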
Cai, Han-Jie; Zhang, Zhi-Lei; Fu, Fen; Li, Jian-Yang; Zhang, Xun-Chao; Zhang, Ya-Ling; Yan, Xue-Song; Lin, Ping; Xv, Jian-Ya; Yang, Lei
2018-02-01
The dense granular flow spallation target is a new target concept chosen for the Accelerator-Driven Subcritical (ADS) project in China. For the R&D of this kind of target concept, a dedicated Monte Carlo (MC) program named GMT was developed to perform the simulation study of the beam-target interaction. Owing to the complexities of the target geometry, the computational cost of the MC simulation of particle tracks is highly expensive. Thus, improvement of computational efficiency will be essential for the detailed MC simulation studies of the dense granular target. Here we present the special design of the GMT program and its high efficiency performance. In addition, the speedup potential of the GPU-accelerated spallation models is discussed.
Monte Carlo simulation of x-ray spectra in mammography
Energy Technology Data Exchange (ETDEWEB)
Ng, K.P. [Department of Optometry and Radiography, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong (China). E-mail: benngkp at netvigator.com; Kwok, C.S.; Ng, K.P.; Tang, F.H. [Department of Optometry and Radiography, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong (China)
2000-05-01
A model for generating x-ray spectra in mammography is presented. This model used the ITS version 3 Monte Carlo code for simulating the radiation transport. Various target/filter combinations such as tungsten/aluminium, molybdenum/molybdenum, molybdenum/rhodium and rhodium/rhodium were used in the simulation. Both bremsstrahlung and characteristic x-ray production were included in the model. The simulated x-ray emission spectra were compared with two sets of spectra, those of Boone et al (1997 Med. Phys. 24 1863-74) and IPEM report 78. The χ² test was used for the overall goodness of fit of the spectral data. There is good agreement between the simulated x-ray spectra and the comparison spectra as the test yielded a probability value of nearly 1. When the transmitted x-ray spectra for specific target/filter combinations were generated and compared with a measured molybdenum/rhodium spectrum and spectra generated in IPEM report 78, close agreement is also observed. This was demonstrated by the probability value for the χ² test being almost 1 for all the cases. However, minor differences between the simulated spectra and the 'standard' ones are observed. (author)
Monte Carlo simulation of gamma ray tomography for image reconstruction
Energy Technology Data Exchange (ETDEWEB)
Guedes, Karlos A.N.; Moura, Alex; Dantas, Carlos; Melo, Silvio; Lima, Emerson, E-mail: karlosguedes@hotmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Meric, Ilker [University of Bergen (Norway)
2015-07-01
Monte Carlo simulations of an object of known density and shape were validated against gamma ray tomography in static experiments. An aluminum half-moon piece placed inside a steel pipe was the MC simulation test object, which was also measured by means of gamma ray transmission. The wall effect of the steel pipe due to the irradiation geometry in a single source-detector-pair tomography was evaluated by comparison with theoretical data. The MCNPX code requires a defined geometry for each photon trajectory, which practically prevents its use for tomography reconstruction simulation. The solution was found by writing a program in the Delphi language to automate the creation of input files. Simulations of tomography data by the automated MCNPX code were carried out and validated against experimental data. Working in this sequence, the produced data required a databank for storage. The experimental setup used a cesium-137 isotopic radioactive source (7.4 × 10^9 Bq) and a NaI(Tl) scintillation detector with a (51 × 51) × 10^-3 m crystal size coupled to a multichannel analyzer, together with a stainless steel tube of 0.154 m internal diameter and 0.014 m wall thickness. The results show that the MCNPX simulation code adapted to automated input files is useful for generating a data matrix M(θ,t) of a computerized gamma ray tomography for any object of known density and regular shape. Experimental validation used RMSE from gamma ray paths and from attenuation coefficient data. (author)
Gorshkov, Anton V; Kirillin, Mikhail Yu
2015-08-01
Over the past two decades, the Monte Carlo technique has become a gold standard in the simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general-purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach to porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator reduces the computational time of MC simulation, with a speed-up comparable to that of a GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
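The innermost loop of such photon-transport MC codes is small and embarrassingly parallel, which is why it maps well onto accelerators. A deliberately simplified analog sketch in Python for an infinite homogeneous medium (isotropic scattering is assumed for brevity; tissue codes use the Henyey-Greenstein phase function, photon weights and layered geometry):

```python
import math
import random

def path_to_absorption(mu_a, mu_s, rng):
    """Analog random walk: total path length travelled by one photon
    before absorption in an infinite homogeneous medium with
    absorption coefficient mu_a and scattering coefficient mu_s."""
    mu_t = mu_a + mu_s
    path = 0.0
    while True:
        # free flight to the next interaction, exponentially distributed
        path += -math.log(1.0 - rng.random()) / mu_t
        if rng.random() < mu_a / mu_t:   # absorption vs. scattering
            return path
        # isotropic scattering: for this observable the new direction
        # is irrelevant, so no direction bookkeeping is needed
```

A useful analytic check: the total path to absorption is exponentially distributed with rate mu_a, so its mean is 1/mu_a regardless of mu_s.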
Energy Technology Data Exchange (ETDEWEB)
Dixon, D.A., E-mail: ddixon@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, MS P365, Los Alamos, NM 87545 (United States); Prinja, A.K., E-mail: prinja@unm.edu [Department of Nuclear Engineering, MSC01 1120, 1 University of New Mexico, Albuquerque, NM 87131-0001 (United States); Franke, B.C., E-mail: bcfrank@sandia.gov [Sandia National Laboratories, Albuquerque, NM 87123 (United States)
2015-09-15
This paper presents the theoretical development and numerical demonstration of a moment-preserving Monte Carlo electron transport method. Foremost, a full implementation of the moment-preserving (MP) method within the Geant4 particle simulation toolkit is demonstrated. Beyond implementation details, it is shown that the MP method is a viable alternative to the condensed history (CH) method for inclusion in current and future generation transport codes through demonstration of the key features of the method including: systematically controllable accuracy, computational efficiency, mathematical robustness, and versatility. A wide variety of results common to electron transport are presented illustrating the key features of the MP method. In particular, it is possible to achieve accuracy that is statistically indistinguishable from analog Monte Carlo, while remaining up to three orders of magnitude more efficient than analog Monte Carlo simulations. Finally, it is shown that the MP method can be generalized to any applicable analog scattering DCS model by extending previous work on the MP method beyond analytical DCSs to the partial-wave (PW) elastic tabulated DCS data.
International Nuclear Information System (INIS)
Taillade, Frédéric; Dumont, Eric; Belin, Etienne
2008-01-01
We propose an analytical model for backscattered luminance in fog and derive an expression for the visibility signal-to-noise ratio as a function of meteorological visibility distance. The model uses single scattering processes. It is based on the Mie theory and the geometry of the optical device (emitter and receiver). In particular, we present an overlap function and take the phase function of fog into account. The results for the backscattered luminance obtained with our analytical model are compared to simulations made using the Monte Carlo method based on multiple scattering processes. An excellent agreement is found, in that the discrepancy between the results is smaller than the Monte Carlo standard uncertainties. If we take no account of the geometry of the optical device, the results of the model-estimated backscattered luminance differ from the simulations by a factor of 20. We also conclude that the signal-to-noise ratio computed with the Monte Carlo method and our analytical model is in good agreement with experimental results, since the mean difference between the calculations and experimental measurements is smaller than the experimental uncertainty.
The impact of Monte Carlo simulation: a scientometric analysis of scholarly literature
Pia, Maria Grazia; Bell, Zane W; Dressendorfer, Paul V
2010-01-01
A scientometric analysis of Monte Carlo simulation and Monte Carlo codes has been performed over a set of representative scholarly journals related to radiation physics. The results of this study are reported and discussed. They document and quantitatively appraise the role of Monte Carlo methods and codes in scientific research and engineering applications.
Kinetic Monte Carlo Simulation of Cation Diffusion in Low-K Ceramics
Good, Brian
2013-01-01
Low thermal conductivity (low-K) ceramic materials are of interest to the aerospace community for use as the thermal barrier component of coating systems for turbine engine components. In particular, zirconia-based materials exhibit both low thermal conductivity and structural stability at high temperature, making them suitable for such applications. Because creep is one of the potential failure modes, and because diffusion is a mechanism by which creep takes place, we have performed computer simulations of cation diffusion in a variety of zirconia-based low-K materials. The kinetic Monte Carlo simulation method is an alternative to the more widely known molecular dynamics (MD) method. It is designed to study "infrequent-event" processes, such as diffusion, for which MD simulation can be highly inefficient. We describe the results of kinetic Monte Carlo computer simulations of cation diffusion in several zirconia-based materials, specifically, zirconia doped with Y, Gd, Nb and Yb. Diffusion paths are identified, and migration energy barriers are obtained from density functional calculations and from the literature. We present results on the temperature dependence of the diffusivity, and on the effects of the presence of oxygen vacancies in cation diffusion barrier complexes.
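The residence-time (BKL) algorithm at the heart of kinetic Monte Carlo can be sketched in a few lines. The following Python toy model is illustrative only (not the authors' code; the 1-D lattice, the attempt frequency and the barrier values are assumptions): a single cation hops left or right with Arrhenius rates, and the clock advances by exponentially distributed waiting times.

```python
import math
import random

def kmc_hop_trace(n_steps, barriers_ev, temp_k, nu0=1e13, seed=1):
    """Residence-time (BKL) kinetic Monte Carlo for a single cation
    hopping on a 1-D lattice.  barriers_ev[i] is the migration barrier
    in eV for a hop in direction i (0 = left, 1 = right)."""
    kb = 8.617333e-5                       # Boltzmann constant, eV/K
    rng = random.Random(seed)
    rates = [nu0 * math.exp(-eb / (kb * temp_k)) for eb in barriers_ev]
    total_rate = sum(rates)
    pos, t = 0, 0.0
    for _ in range(n_steps):
        # pick an event with probability proportional to its rate
        r = rng.random() * total_rate
        acc = 0.0
        for i, rate in enumerate(rates):
            acc += rate
            if r < acc:
                pos += (-1, +1)[i]
                break
        # advance the clock by an exponentially distributed residence time
        t += -math.log(1.0 - rng.random()) / total_rate
    return pos, t
```

Because every step executes a hop and the clock advances by the inverse total rate, the method skips the many vibrational periods between rare hops, which is exactly why it outperforms MD for diffusion.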
Construction of the quantitative analysis environment using Monte Carlo simulation
International Nuclear Information System (INIS)
Shirakawa, Seiji; Ushiroda, Tomoya; Hashimoto, Hiroshi; Tadokoro, Masanori; Uno, Masaki; Tsujimoto, Masakazu; Ishiguro, Masanobu; Toyama, Hiroshi
2013-01-01
Thoracic phantom images of the axial section were acquired to construct source and density maps for Monte Carlo (MC) simulation. The phantom was a Heart/Liver Type HL (Kyoto Kagaku Co., Ltd.); the single photon emission CT (SPECT)/CT machine was a Symbia T6 (Siemens) with the LMEGP (low-medium energy general purpose) collimator. Maps were constructed from CT images with in-house software written in Visual Studio C# (Microsoft). The code Simulation of Imaging Nuclear Detectors (SIMIND) was used for MC simulation, the Prominence processor (Nihon Medi-Physics) for filter processing and image reconstruction, and a DELL Precision T7400 for all image processing. For the actual experiment, the phantom was given 15 MBq of 99m Tc in its myocardial portion, assuming 2% uptake at a dose of 740 MBq, and SPECT images were acquired and reconstructed with a Butterworth filter and the filtered back projection method. CT images were similarly obtained in 0.3 mm thick slices, compiled into a single file formatted in Digital Imaging and Communications in Medicine (DICOM), and then processed for application to SIMIND for mapping the source and density. Physical and measurement factors such as attenuation, scattering, spatial resolution deterioration and statistical fluctuation were examined against ideal images by sequentially excluding and simulating each factor. The gamma energy spectrum, SPECT projections and reconstructed images given by the simulation were found to agree well with the actual data, and the precision of the MC simulation was confirmed. Physical and measurement factors were found to be evaluable individually, suggesting the usefulness of the simulation for assessing the precision of their correction. (T.T.)
Li, Pengcheng; Liu, Celong; Li, Xianpeng; He, Honghui; Ma, Hui
2016-09-20
In earlier studies, we developed scattering models and the corresponding CPU-based Monte Carlo simulation programs to study the behavior of polarized photons as they propagate through complex biological tissues. The high degrees of freedom in these studies created a demand for massive simulation tasks. In this paper, we report a parallel implementation of the simulation program based on the compute unified device architecture running on a graphics processing unit (GPU). Different schemes for sphere-only simulations and sphere-cylinder mixture simulations were developed. Diverse optimization methods were employed to achieve the best acceleration. The final-version GPU program is hundreds of times faster than the CPU version. The dependence of the performance on input parameters and precision was also studied. It is shown that using single precision in the GPU simulations results in very limited losses in accuracy. Consumer-level graphics cards, even those in laptop computers, are more cost-effective than scientific graphics cards for single-precision computation.
Monte Carlo simulation of the spear reflectometer at LANSCE
International Nuclear Information System (INIS)
Smith, G.S.
1995-01-01
The Monte Carlo instrument simulation code, MCLIB, contains elements to represent several components found in neutron spectrometers, including slits, choppers, detectors, sources and various samples. Using these elements to represent the components of a neutron scattering instrument, one can simulate, for example, an inelastic spectrometer, a small angle scattering machine, or a reflectometer. In order to benchmark the code, we chose to compare simulated data from the MCLIB code with an actual experiment performed on the SPEAR reflectometer at LANSCE. This was done by first fitting an actual SPEAR data set to obtain the model scattering-length-density profile, β(z), for the sample and the substrate. These parameters were then used as input values for the sample scattering function. A simplified model of SPEAR was chosen which contained all of the essential components of the instrument. A code containing the MCLIB subroutines was then written to simulate this simplified instrument. The resulting data were then fit and compared to the actual data set in terms of statistics, resolution and accuracy
Detailed Monte Carlo simulation of electron elastic scattering
International Nuclear Information System (INIS)
Chakarova, R.
1994-04-01
A detailed Monte Carlo model is described which simulates the transport of electrons penetrating a medium without energy loss. The trajectory of each electron is constructed as a series of successive interaction events, elastic or inelastic scattering. Differential elastic scattering cross sections and elastic and inelastic mean free paths are used to describe the interaction process. It is presumed that the cross-section data are available; the Monte Carlo algorithm does not include their evaluation. Electrons suffering successive elastic collisions are followed until they escape from the medium or (if the absorption is negligible) their path length exceeds a certain value. The inelastic events are thus treated as absorption. The medium geometry is a layered infinite slab. The electron source can be an incident electron beam or electrons created inside the material. The objective is to obtain the angular distribution, the path length and depth distributions, and the collision number distribution of electrons emitted through the surface of the medium. The model is applied successfully to electrons with energies between 0.4 and 20 keV reflected from semi-infinite homogeneous materials with different scattering properties. 16 refs, 9 figs
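A minimal version of such a detailed-history simulation fits in a few dozen lines. The sketch below is a simplification, not the author's code: it assumes a single-parameter screened-Rutherford angular law and, as in the abstract, treats every inelastic event as absorption; it tracks electrons entering a slab and tallies the backscattered fraction.

```python
import math
import random

def backscatter_fraction(n_electrons, mfp, p_inelastic, screening=0.1,
                         thickness=float("inf"), seed=7):
    """Toy detailed-history Monte Carlo: electrons enter a slab at z=0
    travelling in +z, suffer elastic collisions with a screened-Rutherford
    angular law, and are lost at each collision (inelastic event treated
    as absorption) with probability p_inelastic."""
    rng = random.Random(seed)
    reflected = 0
    for _ in range(n_electrons):
        z, mu = 0.0, 1.0                  # depth and direction cosine
        while True:
            # exponentially distributed free path along the current direction
            z += mu * (-mfp * math.log(1.0 - rng.random()))
            if z < 0.0:
                reflected += 1            # escaped back through the surface
                break
            if z > thickness:
                break                     # transmitted through the slab
            if rng.random() < p_inelastic:
                break                     # inelastic event -> absorption
            # sample cos(theta) from the screened-Rutherford law:
            # cos(theta) = 1 - 2*a*r / (1 + a - r), a = screening parameter
            r = rng.random()
            cos_t = 1.0 - 2.0 * screening * r / (1.0 + screening - r)
            sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
            phi = 2.0 * math.pi * rng.random()
            # update the direction cosine with a random azimuth
            mu = mu * cos_t + math.sqrt(max(0.0, 1.0 - mu * mu)) * sin_t * math.cos(phi)
    return reflected / n_electrons
```

Tracking only the depth and direction cosine is enough here because the slab geometry is azimuthally symmetric, which keeps the bookkeeping per collision to two numbers.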
Rambalakos, Andreas
Current federal aviation regulations in the United States and around the world mandate the need for aircraft structures to meet damage tolerance requirements through out the service life. These requirements imply that the damaged aircraft structure must maintain adequate residual strength in order to sustain its integrity that is accomplished by a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy comprised of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a parallel system comprised by three elements to a parallel system comprised with up to six elements. These newly developed expressions will be used to check the accuracy of the implementation of a Monte Carlo simulation algorithm to determine the probability of failure of a parallel system comprised of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme by utilizing the residual strength of the fasteners subjected to various initial load distributions and then subjected to a new unequal load distribution resulting from subsequent fastener sequential failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. The uncertainties associated with the time to crack initiation, the probability of crack detection, the
Power-feedwater temperature operating domain for Sbwr applying Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Aguilar M, L. A.; Quezada G, S.; Espinosa M, E. G.; Vazquez R, A.; Varela H, J. R.; Cazares R, R. I.; Espinosa P, G., E-mail: sequega@gmail.com [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Mexico D. F. (Mexico)
2014-10-15
In this work the analysis of feedwater temperature effects on reactor power in a simplified boiling water reactor (Sbwr), applying a methodology based on Monte Carlo simulation, is presented. The Monte Carlo methodology was applied systematically to establish the operating domain; because the Sbwr is not yet in operation, the analysis of the nuclear and thermal-hydraulic processes must rely on numerical modeling, with the purpose of developing or confirming the design basis and qualifying existing or new computer codes to enable reliable analyses. The results show that the reactor power is inversely proportional to the feedwater temperature: the reactor power changes by 8% when the feedwater temperature changes by 8%. (Author)
Non-Boltzmann Ensembles and Monte Carlo Simulations
International Nuclear Information System (INIS)
Murthy, K. P. N.
2016-01-01
Boltzmann sampling based on the Metropolis algorithm has been extensively used for simulating a canonical ensemble and for calculating macroscopic properties of a closed system at desired temperatures. An estimate of a mechanical property, like energy, of an equilibrium system is made by averaging over a large number of microstates generated by Boltzmann Monte Carlo methods. This is possible because we can assign a numerical value for energy to each microstate. However, a thermal property like entropy is not easily accessible to these methods. The reason is simple. We cannot assign a numerical value for entropy to a microstate. Entropy is not a property associated with any single microstate. It is a collective property of all the microstates. Toward calculating entropy and other thermal properties, a non-Boltzmann Monte Carlo technique called umbrella sampling was proposed some forty years ago. Umbrella sampling has since undergone several metamorphoses and we now have multicanonical Monte Carlo, entropic sampling, flat histogram methods, the Wang-Landau algorithm, etc. This class of methods generates non-Boltzmann ensembles, which are unphysical. However, physical quantities can be calculated as follows. First un-weight a microstate of the entropic ensemble; then re-weight it to the desired physical ensemble. Carry out a weighted average over the entropic ensemble to estimate physical quantities. In this talk I shall tell you of the most recent non-Boltzmann Monte Carlo method and show how to calculate free energy for a few systems. We first consider estimation of free energy as a function of energy at different temperatures to characterize the phase transition in a hairpin DNA in the presence of an unzipping force. Next we consider free energy as a function of order parameter, and to this end we estimate the density of states g(E, M) as a function of both energy E and order parameter M. This is carried out in two stages. We estimate g(E) in the first stage
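The flat-histogram idea can be illustrated on a system whose density of states is known exactly. The sketch below is my illustrative example, not code from the talk: it runs a simplified Wang-Landau iteration (halving the modification factor on a fixed schedule rather than after a histogram-flatness test) for N non-interacting Ising spins, whose exact g(E) is a binomial coefficient; the run recovers ln g(E) up to an additive constant.

```python
import math
import random

def wang_landau_density(n_spins=8, n_stages=18, steps_per_stage=20000, seed=5):
    """Simplified Wang-Landau sampling of ln g(E) for N independent
    Ising spins, E = -sum(s_i).  Single spin flips are accepted with
    probability min(1, g(E)/g(E_new)), which drives a flat histogram
    over the energy levels."""
    rng = random.Random(seed)
    ln_g = {e: 0.0 for e in range(-n_spins, n_spins + 1, 2)}
    spins = [1] * n_spins
    e = -n_spins
    ln_f = 1.0                                 # modification factor
    for _ in range(n_stages):
        for _ in range(steps_per_stage):
            i = rng.randrange(n_spins)
            e_new = e + 2 * spins[i]           # flipping s_i changes E by 2*s_i
            if rng.random() < math.exp(min(0.0, ln_g[e] - ln_g[e_new])):
                spins[i] = -spins[i]
                e = e_new
            ln_g[e] += ln_f                    # raise the running estimate
        ln_f /= 2.0                            # refine the modification factor
    return ln_g
```

Once ln g(E) is known, canonical averages at any temperature follow by re-weighting with exp(ln g(E) - E/kT), which is the un-weight/re-weight step described in the abstract.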
Zaidi, H; Morel, Christian
1998-01-01
This paper describes the implementation of the Eidolon Monte Carlo program designed to simulate fully three-dimensional (3D) cylindrical positron tomographs on a MIMD parallel architecture. The original code was written in Objective-C and developed under the NeXTSTEP development environment. The different steps involved in porting the software to a parallel architecture based on PowerPC 604 processors running under AIX 4.1 are presented. Basic aspects and strategies of running Monte Carlo calculations on parallel computers are described. A linear decrease of the computing time with the number of computing nodes was achieved. The improved time performance resulting from parallelisation of the Monte Carlo calculations makes it an attractive tool for modelling photon transport in 3D positron tomography. The parallelisation paradigm used in this work is independent of the chosen parallel architecture
Random number generators tested on quantum Monte Carlo simulations.
Hongo, Kenta; Maezono, Ryo; Miura, Kenichi
2010-08-01
We have tested and compared several (pseudo) random number generators (RNGs) applied to a practical application: ground state energy calculations of molecules using variational and diffusion Monte Carlo methods. A new multiple recursive generator with 8th-order recursion (MRG8) and the Mersenne Twister generator (MT19937) are tested and compared with the RANLUX generator at five luxury levels (RANLUX-[0-4]). Both MRG8 and MT19937 are shown to give the same total energy as that evaluated with RANLUX-4 (highest luxury level) within the statistical error bars, with less computational cost to generate the sequence. We also tested the notoriously flawed linear congruential generator (LCG) RANDU for comparison. (c) 2010 Wiley Periodicals, Inc.
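RANDU's defect is easy to demonstrate, because its successive triples satisfy an exact lattice relation. The snippet below is my illustration of that defect, unrelated to the paper's QMC code: it checks the relation x_{k+2} - 6*x_{k+1} + 9*x_k = 0 (mod 2^31) for RANDU and, for contrast, for Python's built-in Mersenne Twister.

```python
import random

def randu(seed):
    """RANDU: the infamous IBM linear congruential generator
    x_{k+1} = 65539 * x_k mod 2**31 (seed should be odd)."""
    x = seed
    while True:
        x = (65539 * x) % 2**31
        yield x

def triple_correlation_fraction(stream, n_triples):
    """Fraction of successive triples satisfying the exact RANDU
    lattice relation x_{k+2} - 6*x_{k+1} + 9*x_k = 0 (mod 2**31)."""
    xs = [next(stream) for _ in range(n_triples + 2)]
    hits = sum((xs[k + 2] - 6 * xs[k + 1] + 9 * xs[k]) % 2**31 == 0
               for k in range(n_triples))
    return hits / n_triples
```

Every RANDU triple satisfies the relation (because 65539^2 = 6*65539 - 9 mod 2^31), which is why its points fall on only 15 planes in the unit cube; MT19937 triples essentially never do.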
Monte Carlo modelling of Schottky diode for rectenna simulation
Bernuchon, E.; Aniel, F.; Zerounian, N.; Grimault-Jacquin, A. S.
2017-09-01
Before designing a detector circuit, the extraction of the electrical parameters of the Schottky diode is a critical step. This article is based on a Monte Carlo (MC) solver of the Boltzmann Transport Equation (BTE) including different transport mechanisms at the metal-semiconductor contact, such as the image force effect and tunneling. The weight of the tunneling and thermionic currents is quantified according to different degrees of tunneling modelling. The I-V characteristic highlights the dependence of the ideality factor and the saturation current on bias. Harmonic Balance (HB) simulation of a rectifier circuit within the Advanced Design System (ADS) software shows that considering a non-linear ideality factor and saturation current in the electrical model of the Schottky diode does not seem essential. Indeed, bias-independent values extracted in the forward regime of the I-V curve are sufficient. However, the non-linear series resistance extracted from a small signal analysis (SSA) strongly influences the conversion efficiency at low input powers.
Monte-Carlo Tree Search for Simulated Car Racing
DEFF Research Database (Denmark)
Fischer, Jacob; Falsted, Nikolaj; Vielwerth, Mathias
2015-01-01
Monte Carlo Tree Search (MCTS) has recently seen considerable success in playing certain types of games, most of which are discrete, fully observable zero-sum games. Consequently there is currently considerable interest within the research community in investigating what other games this algorithm might play well, and how it can be modified to achieve this. In this paper, we investigate the application of MCTS to simulated car racing, in particular the open-source racing game TORCS. The presented approach is based on the development of an efficient forward model and the discretization of the action space. This combination allows the controller to effectively search the tree of potential future states. Results show that it is indeed possible to implement a competent MCTS-based racing controller. The controller generalizes to most road tracks as long as a warm-up period is provided.
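The core MCTS/UCT loop (selection, expansion, simulation, backpropagation) fits in a short sketch. The code below is not the paper's controller: instead of TORCS, whose forward model is far more involved, it applies UCT to a toy counting game in which players alternately add 1, 2 or 3 to a running total and whoever reaches 21 wins.

```python
import math
import random

class Node:
    """One game state in the search tree: `total` so far, `player` to move."""
    def __init__(self, total, player, parent=None, move=None):
        self.total, self.player = total, player
        self.parent, self.move = parent, move
        self.children = []
        self.untried = [m for m in (1, 2, 3) if total + m <= 21]
        self.wins, self.visits = 0.0, 0

def uct_search(root_total, n_iter, rng, c=1.4):
    """Return the move chosen by UCT for the player to move at root_total."""
    root = Node(root_total, player=0)
    for _ in range(n_iter):
        node = root
        # 1. selection: descend via the UCB1 rule while fully expanded
        while not node.untried and node.children:
            node = max(node.children,
                       key=lambda ch: ch.wins / ch.visits
                       + c * math.sqrt(math.log(node.visits) / ch.visits))
        # 2. expansion: add one untried child
        if node.untried:
            m = node.untried.pop(rng.randrange(len(node.untried)))
            child = Node(node.total + m, 1 - node.player, parent=node, move=m)
            node.children.append(child)
            node = child
        # 3. simulation: random playout to the end of the game
        total, player = node.total, node.player
        winner = 1 - player if total == 21 else None
        while winner is None:
            total += rng.choice([m for m in (1, 2, 3) if total + m <= 21])
            if total == 21:
                winner = player
            player = 1 - player
        # 4. backpropagation: credit the player who moved into each node
        while node is not None:
            node.visits += 1
            if winner == 1 - node.player:
                node.wins += 1.0
            node = node.parent
    return max(root.children, key=lambda ch: ch.visits).move
```

The paper's approach replaces the random playout with an efficient forward model of the car and discretizes the steering/throttle space into a small move set, but the four-phase loop is the same.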
Dynamic Monte Carlo simulations of radiatively accelerated GRB fireballs
Chhotray, Atul; Lazzati, Davide
2018-05-01
We present a novel Dynamic Monte Carlo code (DynaMo code) that self-consistently simulates the Compton-scattering-driven dynamic evolution of a plasma. We use the DynaMo code to investigate the time-dependent expansion and acceleration of dissipationless gamma-ray burst fireballs by varying their initial opacities and baryonic content. We study the opacity and energy density evolution of an initially optically thick, radiation-dominated fireball across its entire phase space - in particular during the Rph matter-dominated fireballs due to Thomson scattering. We quantify the new phases by providing analytical expressions of Lorentz factor evolution, which will be useful for deriving jet parameters.
MONTE CARLO SIMULATION OF MULTIFOCAL STOCHASTIC SCANNING SYSTEM
Directory of Open Access Journals (Sweden)
LIXIN LIU
2014-01-01
Multifocal multiphoton microscopy (MMM) has greatly improved the utilization of excitation light and imaging speed due to parallel multiphoton excitation of the samples and simultaneous detection of the signals, which allows it to perform three-dimensional fast fluorescence imaging. Stochastic scanning can provide continuous, uniform and high-speed excitation of the sample, which makes it a suitable scanning scheme for MMM. In this paper, the graphical programming language LabVIEW is used to achieve stochastic scanning of the two-dimensional galvo scanners by using white noise signals to control the x and y mirrors independently. Moreover, the stochastic scanning process is simulated using the Monte Carlo method. Our results show that MMM can avoid oversampling or subsampling in the scanning area and meet the requirements of uniform sampling by stochastically scanning the individual units of the N × N foci array. Therefore, continuous and uniform scanning in the whole field of view is implemented.
Monte Carlo simulation of magnetic multi-core nanoparticles
International Nuclear Information System (INIS)
Schaller, Vincent; Wahnstroem, Goeran; Sanz-Velasco, Anke; Enoksson, Peter; Johansson, Christer
2009-01-01
In this paper, a Monte Carlo simulation is carried out to evaluate the equilibrium magnetization of magnetic multi-core nanoparticles in a liquid and subjected to a static magnetic field. The particles contain a magnetic multi-core consisting of a cluster of magnetic single-domains of magnetite. We show that the magnetization of multi-core nanoparticles cannot be fully described by a Langevin model. Inter-domain dipolar interactions and domain magnetic anisotropy contribute to decrease the magnetization of the particles, whereas the single-domain size distribution yields an increase in magnetization. Also, we show that the interactions affect the effective magnetic moment of the multi-core nanoparticles.
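The Langevin baseline against which such simulations are compared can itself be checked with a few lines of Metropolis sampling. The sketch below is a generic single-moment example, not the paper's multi-core model: it samples cos θ for one non-interacting classical moment in a field and compares the average with the Langevin function L(x) = coth(x) - 1/x.

```python
import math
import random

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x."""
    return 1.0 / math.tanh(x) - 1.0 / x

def mc_mean_cos(x, n_steps=200000, seed=11):
    """Metropolis estimate of <cos(theta)> for one classical moment in a
    static field, with reduced field x = mu*B/(kB*T) and energy
    -x*cos(theta).  Proposals are uniform in cos(theta), which is the
    correct solid-angle measure for an isotropic moment."""
    rng = random.Random(seed)
    cos_t, total, count = 1.0, 0.0, 0
    for step in range(n_steps):
        proposal = 2.0 * rng.random() - 1.0
        # Metropolis acceptance for the energy change -x*(proposal - cos_t)
        if rng.random() < math.exp(min(0.0, x * (proposal - cos_t))):
            cos_t = proposal
        if step >= n_steps // 10:          # discard burn-in
            total += cos_t
            count += 1
    return total / count
```

For the multi-core particles of the paper, inter-domain dipolar interactions and anisotropy terms enter the energy, which is precisely why the resulting magnetization departs from this Langevin result.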
Dendrimer-magnetic nanostructure: a Monte Carlo simulation
Jabar, A.; Masrour, R.
2017-11-01
In this paper, the magnetic properties of a ternary mixed-spin (σ, S, q) Ising model on a dendrimer nanostructure are studied using Monte Carlo simulations. The ground state phase diagrams of the dendrimer nanostructure with ternary mixed spins σ = 1/2, S = 1 and q = 3/2 are found. The variation of the thermal total and partial magnetizations with the different exchange interactions, the external magnetic fields and the crystal fields has also been studied. The reduced critical temperatures have been deduced. The magnetic hysteresis cycles have been discussed; in particular, the corresponding magnetic coercive field values have been deduced. Multiple hysteresis cycles are found. The dendrimer nanostructure has several applications in medicine.
MCB. A continuous energy Monte Carlo burnup simulation code
International Nuclear Information System (INIS)
Cetnar, J.; Wallenius, J.; Gudowski, W.
1999-01-01
A code for integrated simulation of neutronics and burnup based upon continuous energy Monte Carlo techniques and transmutation trajectory analysis has been developed. Being especially well suited for studies of nuclear waste transmutation systems, the code is an extension of the well validated MCNP transport program of Los Alamos National Laboratory. Among the advantages of the code (named MCB) is a fully integrated data treatment combined with a time-stepping routine that automatically corrects for burnup dependent changes in reaction rates, neutron multiplication, material composition and self-shielding. Fission product yields are treated as continuous functions of incident neutron energy, using a non-equilibrium thermodynamical model of the fission process. In the present paper a brief description of the code and applied methods is given. (author)
Monte Carlo simulations shed light on Bathsheba's suspect breast.
Heijblom, Michelle; Meijer, Linda M; van Leeuwen, Ton G; Steenbergen, Wiendelt; Manohar, Srirang
2014-05-01
In 1654, Rembrandt van Rijn painted his famous painting Bathsheba at her Bath. Over the years, the depiction of Bathsheba's left breast, and especially the presence of local discoloration, has generated debate on whether Rembrandt's Bathsheba suffered from breast cancer. Historical, medical and artistic arguments appeared to be insufficient to prove whether Bathsheba's model truly suffered from breast cancer. However, the bluish discoloration of the breast is an intriguing aspect from a biomedical optics point of view that might help us end the old debate. By using Monte Carlo simulations in combination with the retinex theory of color vision, we showed that it is highly unlikely that breast cancer results in a local bluish discoloration of the skin as is present on Bathsheba's breast. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Monte Carlo simulation of lower hybrid current drive in tokamaks
International Nuclear Information System (INIS)
Sipilae, S.K.; Heikkinen, J.A.
1994-01-01
In this report a method for noninductive current drive studies based on three-dimensional simulation of test particle orbits is presented. A Monte Carlo momentum diffusion operator is developed to model the wave-particle interaction. The scheme can be utilised in studies of current drive efficiency as well as in examining the current density profiles caused by waves with a finite parallel wave number spectrum and a nonuniform power deposition profile in a toroidal configuration space of arbitrary shape. Calculations performed with a uniform power deposition profile of lower hybrid waves for axisymmetric magnetic configurations having different aspect ratios and poloidal cross-section shapes confirm the semianalytic estimates for the current drive efficiency based on the solutions of the flux surface averaged Fokker-Planck equation for configurations with circular poloidal cross section. The consequences of the combined effect of radial diffusion, magnetic trapping and radially nonhomogeneous power deposition and background plasma parameter profiles are investigated
Monte Carlo simulation of ionization in a magnetron plasma
International Nuclear Information System (INIS)
Miranda, J.E.; Goeckner, M.J.; Goree, J.; Sheridan, T.E.
1990-01-01
A Monte Carlo simulation of electrons emitted from the cathode of a planar magnetron is tested against experiments that were reported by Wendt, Lieberman, and Meuth [J. Vac. Sci. Technol. A 6, 1827 (1988)] and by Gu and Lieberman [J. Vac. Sci. Technol. A 6, 2960 (1988)]. Comparing their measurements of the radial profile of current and the axial profile of optical emission to the ionization profiles predicted by the model, we find good agreement for a typical magnetic field strength of 456 G. We also find that at 456 G the product of the average number of ionizations ⟨N_i⟩ and the secondary electron emission coefficient γ is ∼1. This indicates that secondary emission contributes significantly to the ionization that sustains the discharge. At 171 G, however, ⟨N_i⟩γ ≪ 1, revealing that cathode emission is inadequate to sustain a discharge at a low magnetic field
Optimization of reconstruction algorithms using Monte Carlo simulation
International Nuclear Information System (INIS)
Hanson, K.M.
1989-01-01
A method for optimizing reconstruction algorithms is presented that is based on how well a specified task can be performed using the reconstructed images. Task performance is numerically assessed by a Monte Carlo simulation of the complete imaging process including the generation of scenes appropriate to the desired application, subsequent data taking, reconstruction, and performance of the stated task based on the final image. The use of this method is demonstrated through the optimization of the Algebraic Reconstruction Technique (ART), which reconstructs images from their projections by an iterative procedure. The optimization is accomplished by varying the relaxation factor employed in the updating procedure. In some of the imaging situations studied, it is found that the optimization of constrained ART, in which a non-negativity constraint is invoked, can vastly increase the detectability of objects. There is little improvement attained for unconstrained ART. The general method presented may be applied to the problem of designing neutron-diffraction spectrometers. (author)
Monte Carlo simulations and benchmark studies at CERN's accelerator chain
AUTHOR|(CDS)2083190; Brugger, Markus
2016-01-01
Mixed particle and energy radiation fields present at the Large Hadron Collider (LHC) and its accelerator chain are responsible for failures of electronic devices located in the vicinity of the accelerator beam lines. These radiation effects on electronics and, more generally, the overall radiation damage issues have a direct impact on component and system lifetimes, as well as on maintenance requirements and radiation exposure of personnel who have to intervene and fix existing faults. The radiation environments and respective radiation damage issues along CERN's accelerator chain were studied in the framework of the CERN Radiation to Electronics (R2E) project and are presented here. The important interplay between Monte Carlo simulations and radiation monitoring is also highlighted.
Vector Monte Carlo simulations on atmospheric scattering of polarization qubits.
Li, Ming; Lu, Pengfei; Yu, Zhongyuan; Yan, Lei; Chen, Zhihui; Yang, Chuanghua; Luo, Xiao
2013-03-01
In this paper, a vector Monte Carlo (MC) method is proposed to study the influence of atmospheric scattering on polarization qubits for satellite-based quantum communication. The vector MC method utilizes a transmittance method to solve the photon free path for an inhomogeneous atmosphere and random number sampling to determine whether the type of scattering is aerosol scattering or molecule scattering. Simulations are performed for downlink and uplink. The degrees and the rotations of polarization are qualitatively and quantitatively obtained, which agree well with the measured results in the previous experiments. The results show that polarization qubits are well preserved in the downlink and uplink, while the number of received single photons is less than half of the total transmitted single photons for both links. Moreover, our vector MC method can be applied for the scattering of polarized light in other inhomogeneous random media.
Characterization of parallel-hole collimator using Monte Carlo Simulation
International Nuclear Information System (INIS)
Pandey, Anil Kumar; Sharma, Sanjay Kumar; Karunanithi, Sellam; Kumar, Praveen; Bal, Chandrasekhar; Kumar, Rakesh
2015-01-01
Accuracy of in vivo activity quantification improves after the correction of penetrated and scattered photons. However, accurate assessment is not possible with physical experiment. We have used Monte Carlo simulation to accurately assess the contribution of penetrated and scattered photons in the photopeak window. Simulations were performed with the Simulation of Imaging Nuclear Detectors Monte Carlo code. The simulations were set up in such a way that each run provides the geometric, penetration, and scatter components and writes binary images to a data file. These components were analyzed graphically using Microsoft Excel (Microsoft Corporation, USA). Each binary image was imported into ImageJ and a logarithmic transformation was applied for visual assessment of image quality, plotting a profile across the center of the images and calculating the full width at half maximum (FWHM) in the horizontal and vertical directions. The geometric, penetration, and scatter components at 140 keV for the low-energy general-purpose (LEGP) collimator were 93.20%, 4.13%, and 2.67%, respectively. Similarly, at 140 keV they were (94.06%, 3.39%, 2.55%) for the low-energy high-resolution (LEHR), (96.42%, 1.52%, 2.06%) for the medium-energy general-purpose (MEGP), and (96.70%, 1.45%, 1.85%) for the high-energy general-purpose (HEGP) collimator. For the MEGP collimator at 245 keV and the HEGP collimator at 364 keV, the components were 89.10%, 7.08%, 3.82% and 67.78%, 18.63%, 13.59%, respectively. The LEGP and LEHR collimators are best for imaging 140 keV photons. HEGP can be used for 245 keV and 364 keV; however, correction for penetration and scatter must be applied if one wishes to quantify in vivo activity at 364 keV. Due to heavy penetration and scattering, 511 keV photons should not be imaged with the HEGP collimator
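Extracting the FWHM from a count profile, as in the analysis above, amounts to locating the half-maximum crossings on either side of the peak. The helper below is an illustrative reimplementation of that step (not the authors' ImageJ workflow), using linear interpolation between the samples that bracket each crossing.

```python
def fwhm(profile):
    """Full width at half maximum of a 1-D profile (e.g. a line profile
    across a point-source image), in units of the sample spacing."""
    peak = max(profile)
    half = peak / 2.0
    i_peak = profile.index(peak)
    # left crossing: walk down from the peak while above half maximum
    i = i_peak
    while i > 0 and profile[i - 1] > half:
        i -= 1
    if i == 0:
        left = 0.0                       # profile clipped at the edge
    else:
        left = (i - 1) + (half - profile[i - 1]) / (profile[i] - profile[i - 1])
    # right crossing: symmetric walk toward the other edge
    j = i_peak
    while j < len(profile) - 1 and profile[j + 1] > half:
        j += 1
    if j == len(profile) - 1:
        right = float(j)
    else:
        right = j + (profile[j] - half) / (profile[j] - profile[j + 1])
    return right - left
```

The interpolation matters in practice: with coarse pixel sizes, reading the crossings off the nearest samples can bias the FWHM by up to a full pixel.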
An introduction to computer simulation methods applications to physical systems
Gould, Harvey; Christian, Wolfgang
2007-01-01
Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics For all readers interested in developing programming habits in the context of doing phy...
Oxygen transport properties estimation by classical trajectory–direct simulation Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Bruno, Domenico, E-mail: domenico.bruno@cnr.it [Istituto di Metodologie Inorganiche e dei Plasmi, Consiglio Nazionale delle Ricerche– Via G. Amendola 122, 70125 Bari (Italy); Frezzotti, Aldo, E-mail: aldo.frezzotti@polimi.it; Ghiroldi, Gian Pietro, E-mail: gpghiro@gmail.com [Dipartimento di Scienze e Tecnologie Aerospaziali, Politecnico di Milano–Via La Masa 34, 20156 Milano (Italy)
2015-05-15
Coupling direct simulation Monte Carlo (DSMC) simulations with classical trajectory calculations is a powerful tool to improve the predictive capabilities of computational dilute gas dynamics. The considerable increase in computational effort noted in early applications of the method can be compensated by running simulations on massively parallel computers. In particular, Graphics Processing Unit acceleration has been found quite effective in reducing the computing time of classical trajectory (CT)-DSMC simulations. The aim of the present work is to study dilute molecular oxygen flows by modeling binary collisions, in the rigid rotor approximation, through an accurate Potential Energy Surface (PES) obtained from molecular beam scattering. The PES accuracy is assessed by calculating molecular oxygen transport properties with different equilibrium and non-equilibrium CT-DSMC-based simulations, which provide close values of the transport properties. Comparisons with available experimental data are presented and discussed in the temperature range 300–900 K, where vibrational degrees of freedom are expected to play a limited (but not always negligible) role.
Monte Carlo simulation of mixed neutron-gamma radiation fields and dosimetry devices
International Nuclear Information System (INIS)
Zhang, Guoqing
2011-01-01
For different incident angles of neutrons, the responses were calculated. To correct the track overlapping effect at high track densities, density correction factors were computed with the Monte Carlo method. A computer code has been developed to handle all the calculations with different parameters. To verify the simulation results, experiments were performed.
Penelope-2006: a code system for Monte Carlo simulation of electron and photon transport
International Nuclear Information System (INIS)
2006-01-01
The computer code system PENELOPE (version 2006) performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials for a wide energy range, from a few hundred eV to about 1 GeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A geometry package called PENGEOM permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the PENELOPE code system, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm. These proceedings contain the corresponding manual and teaching notes of the PENELOPE-2006 workshop and training course, held on 4-7 July 2006 in Barcelona, Spain. (author)
MCViNE – An object oriented Monte Carlo neutron ray tracing simulation package
Energy Technology Data Exchange (ETDEWEB)
Lin, Jiao Y.Y., E-mail: linjiao@ornl.gov [Caltech Center for Advanced Computing Research, California Institute of Technology (United States); Department of Applied Physics and Materials Science, California Institute of Technology (United States); Neutron Data Analysis and Visualization Division, Oak Ridge National Laboratory (United States); Smith, Hillary L. [Department of Applied Physics and Materials Science, California Institute of Technology (United States); Granroth, Garrett E., E-mail: granrothge@ornl.gov [Neutron Data Analysis and Visualization Division, Oak Ridge National Laboratory (United States); Abernathy, Douglas L.; Lumsden, Mark D.; Winn, Barry; Aczel, Adam A. [Quantum Condensed Matter Division, Oak Ridge National Laboratory (United States); Aivazis, Michael [Caltech Center for Advanced Computing Research, California Institute of Technology (United States); Fultz, Brent, E-mail: btf@caltech.edu [Department of Applied Physics and Materials Science, California Institute of Technology (United States)
2016-02-21
MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software package for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example, we used object-oriented programming concepts to represent neutron scatterers and detector systems, and recursive algorithms to implement multiple scattering. Combining these features in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray-tracing packages, which facilitates porting instrument models from those codes. Furthermore, it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
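The component-grouping idea can be illustrated with a minimal composite pattern. The classes below are hypothetical stand-ins, not MCViNE's actual API: a "mirror" that simply attenuates a neutron's statistical weight, and a group that behaves as a single component while delegating to its members in turn.

```python
from dataclasses import dataclass

@dataclass
class Neutron:
    x: float        # position along the beam
    v: float        # speed
    weight: float   # statistical weight

class Component:
    def process(self, n: Neutron) -> Neutron:
        raise NotImplementedError

class Mirror(Component):
    """Toy mirror: attenuates the neutron weight (stand-in for reflectivity)."""
    def __init__(self, reflectivity):
        self.r = reflectivity
    def process(self, n):
        n.weight *= self.r
        return n

class Group(Component):
    """Composite: simple components grouped into a more complex one.
    A neutron entering the group is operated on by every member in turn."""
    def __init__(self, members):
        self.members = members
    def process(self, n):
        for m in self.members:
            n = m.process(n)
        return n

# A "guide" built from four mirrors is itself usable as a single component.
guide = Group([Mirror(0.98)] * 4)
n = guide.process(Neutron(x=0.0, v=1000.0, weight=1.0))
print(n.weight)   # 0.98**4
```

Because a `Group` is itself a `Component`, groups nest arbitrarily, which is what lets a few basic optical modules compose into guides, benders, and full instruments.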
Testing Lorentz Invariance Emergence in the Ising Model using Monte Carlo simulations
Dias Astros, Maria Isabel
2017-01-01
In the context of Lorentz invariance as an emergent phenomenon at low energy scales, relevant to the study of quantum gravity, a system composed of two interacting 3D Ising models (one with an anisotropy in one direction) was proposed. Two Monte Carlo simulations were run: one for the 2D Ising model and one for the target model. In both cases the observables (energy, magnetization, heat capacity and magnetic susceptibility) were computed for different lattice sizes, and a Binder cumulant was introduced in order to estimate the critical temperature of the systems. Moreover, the correlation function was calculated for the 2D Ising model.
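A minimal sketch of the kind of simulation described, restricted to the 2D Ising model: a standard Metropolis update and the usual Binder cumulant U = 1 − ⟨m⁴⟩/(3⟨m²⟩²), which tends to 2/3 deep in the ordered phase and 0 in the disordered phase. The lattice size, sweep counts, and ordered start are illustrative choices, not the thesis' parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def sweep(s, beta):
    """One Metropolis sweep over a 2D Ising lattice with periodic boundaries."""
    L = s.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        # sum of the four nearest neighbours
        nn = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
        dE = 2.0 * s[i, j] * nn              # energy cost of flipping s[i, j]
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] = -s[i, j]

def binder(L, beta, sweeps=2000, therm=500):
    """Binder cumulant U = 1 - <m^4> / (3 <m^2>^2) of the magnetization."""
    s = np.ones((L, L), dtype=int)           # ordered start: fast equilibration below Tc
    m2 = m4 = 0.0
    for t in range(sweeps):
        sweep(s, beta)
        if t >= therm:
            m = s.mean()
            m2 += m * m
            m4 += m ** 4
    n = sweeps - therm
    m2, m4 = m2 / n, m4 / n
    return 1.0 - m4 / (3.0 * m2 * m2)

U = binder(8, beta=1.0)    # deep in the ordered phase (beta_c ~ 0.44 in 2D)
print(U)                   # close to 2/3 here
```

The crossing point of U(T) curves for different lattice sizes L is what localizes the critical temperature, since U is (approximately) size-independent at Tc.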
Power-feedwater enthalpy operating domain for SBWR applying Monte Carlo simulation
International Nuclear Information System (INIS)
Quezada-Garcia, S.; Espinosa-Martinez, E.-G.; Vazquez-Rodriguez, A.; Varela-Ham, J.R.; Espinosa-Paredes, G.
2014-01-01
In this work, an analysis of feedwater enthalpy effects on reactor power in a simplified boiling water reactor (SBWR), applying a methodology based on Monte Carlo simulation (MCS), is presented. The MCS methodology was applied systematically to establish the operating domain. Because SBWRs are not yet in operation, the analysis of the nuclear and thermal-hydraulic processes must rely on numerical modeling, with the purpose of developing or confirming the design basis and qualifying existing or new computer codes to enable reliable analyses. (author)
Magnetic properties of Ni/Au core/shell studied by Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Masrour, R., E-mail: rachidmasrour@hotmail.com [Laboratory of Materials, Processes, Environment and Quality, Cady Ayyed University, National School of Applied Sciences, Sidi Bouzid, Safi, 63 4600 (Morocco); LMPHE (URAC 12), Faculté des Sciences, Université Mohammed V-Agdal, Av. Ibn Batouta, B.P. 1014, Rabat (Morocco); Bahmad, L. [LMPHE (URAC 12), Faculté des Sciences, Université Mohammed V-Agdal, Av. Ibn Batouta, B.P. 1014, Rabat (Morocco); Hamedoun, M. [Institute of Nanomaterials and Nanotechnologies, MAScIR, Rabat (Morocco); Benyoussef, A. [LMPHE (URAC 12), Faculté des Sciences, Université Mohammed V-Agdal, Av. Ibn Batouta, B.P. 1014, Rabat (Morocco); Institute of Nanomaterials and Nanotechnologies, MAScIR, Rabat (Morocco); Hassan II Academy of Science and Technology, Rabat (Morocco); Hlil, E.K. [Institut Néel, CNRS et Université Joseph Fourier, BP 166, F-38042 Grenoble cedex 9 (France)
2014-01-10
The magnetic properties of ferromagnetic Ni/Au core/shell have been studied using Monte Carlo simulations within the Ising model framework. The considered Hamiltonian includes the exchange interactions between Ni–Ni, Au–Au and Ni–Au and the external magnetic field. The thermal total magnetizations and total magnetic susceptibilities of core/shell Ni/Au are computed, the critical temperature is deduced, and the exchange interaction between Ni and Au atoms is obtained. In addition, the total magnetization versus the external magnetic field and the crystal field for different temperatures is also established.
International Nuclear Information System (INIS)
Sugawara, Hirotake; Mori, Naoki; Sakai, Yosuke; Suda, Yoshiyuki
2007-01-01
Techniques to reduce the computational load in determining electron-molecule collisions in Monte Carlo simulations of electrical discharges are presented. By enhancing the detection efficiency of the no-collision case in the decision scheme for collisional events, the frequency of access to time-consuming subroutines that calculate the electron collision cross sections of the gas molecules to obtain the collision probability can be decreased. A benchmark test and an estimation evaluating the present techniques have shown a practical time-saving efficiency.
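The flavor of such optimizations can be illustrated with the standard null-collision technique, in which a cheap comparison against a constant majorant collision frequency decides most flights without evaluating the expensive cross sections. This is a generic sketch with a made-up collision-frequency model, not the authors' exact scheme; `NU_MAX` and `nu_total` are assumptions.

```python
import math
import random

random.seed(0)

NU_MAX = 1.0e9   # assumed upper bound on the total collision frequency [1/s]

def nu_total(energy_eV):
    """Hypothetical total collision frequency; in a real code this would call
    the expensive cross-section subroutines for each gas species."""
    return NU_MAX * energy_eV / (energy_eV + 10.0)

def time_to_real_collision(energy_eV):
    """Null-collision sampling: draw tentative flight times from the constant
    majorant NU_MAX and accept each tentative collision with probability
    nu(E)/NU_MAX, so nu_total() is evaluated only at tentative collisions."""
    t = 0.0
    while True:
        t += -math.log(random.random()) / NU_MAX   # exponential flight time
        if random.random() < nu_total(energy_eV) / NU_MAX:
            return t    # a real collision
        # otherwise: a "null" collision, the electron just keeps flying

# Mean free time at 5 eV should approach 1/nu(5 eV) = 3e-9 s.
mean_t = sum(time_to_real_collision(5.0) for _ in range(20000)) / 20000
print(mean_t)
```

The closer the majorant hugs the true collision frequency, the fewer null events are wasted, which is the same trade-off the abstract's no-collision detection scheme targets.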
MOCARS: a Monte Carlo code for determining the distribution and simulation limits
International Nuclear Information System (INIS)
Matthews, S.D.
1977-07-01
MOCARS is a computer program designed for the INEL CDC 76-173 operating system to determine the distribution and simulation limits for a function by Monte Carlo techniques. The code randomly samples data from any of 12 user-specified distributions and then either evaluates the cut set system unavailability or a user-specified function with the sample data. After the data are ordered, the values at various quantiles and their associated confidence bounds are calculated for output. Also available for output on microfilm are the frequency and cumulative distribution histograms from the sample data. 29 figures, 4 tables
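The core of such a code, sampling, ordering, and reading off quantiles with nonparametric confidence bounds from order statistics, can be sketched as follows. This is an illustrative reconstruction, not MOCARS itself; the example function and sample sizes are assumptions.

```python
import math
import random

random.seed(42)

def simulate(n, func, sampler):
    """Draw n random inputs, evaluate the user function, return sorted outputs."""
    return sorted(func(sampler()) for _ in range(n))

def quantile(ordered, q):
    """Empirical q-quantile from an ordered sample."""
    k = min(int(q * len(ordered)), len(ordered) - 1)
    return ordered[k]

def upper_confidence_rank(n, q, conf):
    """Smallest 1-based order-statistic rank r such that the r-th ordered value
    is an upper confidence bound on the q-quantile, i.e. the smallest r with
    P(Binomial(n, q) <= r - 1) >= conf."""
    cum = 0.0
    for r in range(n + 1):
        cum += math.comb(n, r) * q ** r * (1 - q) ** (n - r)
        if cum >= conf:
            return min(r + 1, n)
    return n

# Example: 95th percentile of X^2 with X ~ U(0,1); true value is 0.95^2 = 0.9025.
ordered = simulate(200, lambda x: x * x, random.random)
est = quantile(ordered, 0.95)
r = upper_confidence_rank(200, 0.95, 0.95)
ub = ordered[r - 1]                      # 95% upper confidence bound
print(est, ub)
```

The binomial argument is distribution-free: it depends only on how many samples fall below the true quantile, which is why order statistics give valid bounds for any user-specified function.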
Monte Carlo simulations on marker grouping and ordering.
Wu, J; Jenkins, J; Zhu, J; McCarty, J; Watson, C
2003-08-01
Four global algorithms, maximum likelihood (ML), sum of adjacent LOD scores (SALOD), sum of adjacent recombinant fractions (SARF) and product of adjacent recombinant fractions (PARF), and one approximation algorithm, seriation (SER), were compared for marker ordering efficiency on correctly given linkage groups based on doubled haploid (DH) populations. The Monte Carlo simulation results indicated that the marker ordering powers of the five methods were almost identical. Correlation coefficients between grouping power and ordering power were greater than 0.99, indicating that all these methods for marker ordering were reliable; therefore, the main problem for linkage analysis is how to improve the grouping power. Since the SER approach provided the advantage of speed without losing ordering power, it was used for detailed simulations. For greater generality, multiple linkage groups were employed, and population size, linkage cutoff criterion, marker spacing pattern (even or uneven), and marker spacing distance (close or loose) were considered for obtaining acceptable grouping powers. Simulation results indicated that grouping power was related to population size, marker spacing distance, and cutoff criterion. Generally, a large population size provided higher grouping power than a small one, and closely linked markers provided higher grouping power than loosely linked markers. The cutoff criterion range for achieving acceptable grouping and ordering power differed among cases; however, combining all situations in this study, a cutoff criterion ranging from 50 cM to 60 cM is recommended for achieving acceptable grouping and ordering power in different cases.
Energy Technology Data Exchange (ETDEWEB)
Han, Gi Yeong; Kim, Song Hyun; Kim, Do Hyun; Shin, Chang Ho; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of)
2014-05-15
In this study, a geometry splitting method was proposed to increase the calculation efficiency of Monte Carlo simulation, and the effect of the splitting strategy on calculation efficiency was analyzed. First, the neutron distribution characteristics in a deep penetration problem were analyzed. Then, considering the neutron population distribution, a geometry splitting method was devised. Using the proposed method, figures of merit (FOMs) for benchmark problems were estimated and compared with those of the conventional geometry splitting strategy. The results show that the proposed method can considerably increase the calculation efficiency obtained with geometry splitting. It is expected that the proposed method will contribute to optimizing the computational cost as well as reducing human errors in Monte Carlo simulation. Geometry splitting in Monte Carlo (MC) calculation is one of the most popular variance reduction techniques due to its simplicity, reliability and efficiency. To use geometry splitting, the user should determine the locations of the splitting surfaces and assign the relative importance of each region. Generally, the splitting parameters are decided by the user's experience, so they can be selected ineffectively or erroneously. To prevent this, a common recommendation that eliminates guesswork is to split the geometry evenly; the importance of each region is then estimated by a few iterations that preserve the population of particles penetrating each region. However, evenly split geometry can make the calculation inefficient when the mean free path (MFP) of the particles changes.
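The variance-reduction mechanism itself can be illustrated on a toy deep-penetration problem: a pure absorber ten mean free paths thick, cut into unit-thickness regions whose importance doubles with depth, so a particle entering a deeper region is split in two with the weight shared between the copies. This is a generic sketch of geometry splitting, not the authors' proposed strategy; the slab model and splitting ratio are assumptions.

```python
import math
import random

random.seed(7)

def transmit_analog(n, thickness):
    """Analog estimate of transmission exp(-thickness) through a pure absorber
    (path lengths in units of the mean free path)."""
    hits = sum(1 for _ in range(n) if -math.log(random.random()) > thickness)
    return hits / n

def transmit_split(n, thickness):
    """Geometry splitting: each 1-mfp region doubles in importance with depth;
    a particle entering the next region splits into two half-weight copies.
    The estimator stays unbiased because weight is conserved at each split."""
    total = 0.0
    for _ in range(n):
        stack = [(0, 1.0)]                 # (region index, statistical weight)
        while stack:
            region, w = stack.pop()
            if random.random() > math.exp(-1.0):
                continue                   # absorbed inside this 1-mfp region
            if region + 1 == thickness:
                total += w                 # escaped through the far face
            else:                          # crossed into the next region: split
                stack.append((region + 1, w / 2))
                stack.append((region + 1, w / 2))
    return total / n

est_analog = transmit_analog(20000, 10.0)
est_split = transmit_split(20000, 10)
print(est_analog, est_split)   # true answer: exp(-10) ~ 4.54e-5
```

With 20 000 histories the analog run scores only a handful of transmissions (expected ~0.9 hits), while the split run scores many low-weight escapes and lands within a few percent of exp(−10); this gap in the figure of merit is exactly what splitting-parameter choices control.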
Kinetic energy of solid and liquid para-hydrogen: a path integral Monte Carlo simulation
International Nuclear Information System (INIS)
Zoppi, M.; Neumann, M.
1992-01-01
The translational (center-of-mass) kinetic energy of solid and liquid para-hydrogen has recently been measured by means of deep inelastic neutron scattering. We have evaluated the same quantity, under similar thermodynamic conditions, by means of path integral Monte Carlo computer simulation, modelling the system as a set of spherical molecules interacting through a pairwise additive Lennard-Jones potential. In spite of the crude approximations in the interaction potential, the agreement is excellent. The pressure was also computed from the same simulations. This quantity, compared with the equation of state for solid para-hydrogen given by Driessen and Silvera, shows poorer agreement and a negative value for the liquid state. We attribute this discrepancy to the limitations of the Lennard-Jones potential. (orig.)
Simulation of neutral gas flow in a tokamak divertor using the Direct Simulation Monte Carlo method
International Nuclear Information System (INIS)
Gleason-González, Cristian; Varoutis, Stylianos; Hauer, Volker; Day, Christian
2014-01-01
Highlights: • Subdivertor gas flow calculations in tokamaks by coupling the B2-EIRENE and DSMC methods. • The results include pressure, temperature, bulk velocity and particle fluxes in the subdivertor. • A gas recirculation effect towards the plasma chamber through the vertical targets is found. • Comparison between DSMC and the ITERVAC code reveals very good agreement. - Abstract: This paper presents an innovative scientific and engineering approach for describing sub-divertor gas flows of fusion devices by coupling the B2-EIRENE (SOLPS) code and the Direct Simulation Monte Carlo (DSMC) method. The present study exemplifies this with a computational investigation of neutral gas flow in ITER's sub-divertor region. The numerical results include the flow fields and contours of the overall quantities of practical interest, such as the pressure, the temperature and the bulk velocity, assuming helium as the model gas. Moreover, the study unravels the gas recirculation effect located behind the vertical targets, viz. neutral particles flowing towards the plasma chamber. Comparison between calculations performed by the DSMC method and the ITERVAC code reveals a very good agreement along the main sub-divertor ducts
Monte Carlo estimation of the absorbed dose in computed tomography
Energy Technology Data Exchange (ETDEWEB)
Kim, Jin Woo; Youn, Han Bean; Kim, Ho Kyung [Pusan National University, Busan (Korea, Republic of)
2016-05-15
The purpose of this study is to devise an algorithm calculating absorbed dose distributions of patients based on Monte Carlo (MC) methods, including the dose contributions of both primary and secondary (scattered) x-ray photons. Assessment of patient dose in computed tomography (CT) at the population level has become a subject of public attention and concern, and ultimate CT quality assurance and dose optimization have the goal of reducing radiation-induced cancer risks in the examined population. However, the conventional CT dose index (CTDI) concept is not a surrogate of risk; it has rather been designed to measure an average central dose. In addition, the CTDI and the dose-length product have shown shortcomings for helical CT with wider beam collimations. Simple algorithms to estimate a patient-specific CT dose based on the MCNP output data have been introduced. For numerical chest and head phantoms, the spatial dose distributions were calculated, and the results were reasonable. The estimated dose distribution map can be readily converted into the effective dose. Further studies include validating the models against experimental measurements and accelerating the algorithms.
Monte Carlo Simulation of Complete X-Ray Spectra for Use in Scanning Electron Microscopy Analysis
International Nuclear Information System (INIS)
Roet, David; Van Espen, Piet
2003-01-01
Full Text: The interactions of keV electrons and photons with matter can be simulated accurately with the aid of the Monte Carlo (MC) technique. In scanning electron microscopy x-ray analysis (SEM-EDX), such simulations can be used to perform quantitative analysis with a reverse Monte Carlo method even if the samples have irregular geometry. Alternatively, the MC technique can generate spectra of standards for use in quantification with partial least squares regression. The feasibility of these alternatives to the more classical ZAF or phi-rho-Z quantification methods has already been proven. To be applicable for these purposes, the MC code needs to accurately generate not only the characteristic K and L x-ray lines but also the bremsstrahlung continuum, i.e. the complete x-ray spectrum needs to be simulated. Currently two types of MC simulation codes are available. Programs like Electron Flight Simulator and CASINO simulate characteristic x-rays due to electron interaction in a fast and efficient way but lack provision for the simulation of the continuum. On the other hand, programs like EGS4, MCNP4 and PENELOPE, originally developed for high-energy (MeV-GeV) applications, are more complete but difficult to use and still slow, even on today's fastest computers. We therefore started the development of a dedicated MC simulation code for use in quantitative SEM-EDX work. The selection of the most appropriate cross sections for the different interactions will be discussed, and the results obtained will be compared with those from existing MC programs. Examples of the application of MC simulations for quantitative analysis of samples of various compositions will be given
Assessing the convergence of LHS Monte Carlo simulations of wastewater treatment models.
Benedetti, Lorenzo; Claeys, Filip; Nopens, Ingmar; Vanrolleghem, Peter A
2011-01-01
Monte Carlo (MC) simulation appears to be the only currently adopted tool to estimate global sensitivities and uncertainties in wastewater treatment modelling. Such models are highly complex, dynamic and non-linear, requiring long computation times, especially in the scope of MC simulation, due to the large number of simulations usually required. However, no stopping rule to decide on the number of simulations required to achieve a given confidence in the MC simulation results has been adopted so far in the field. In this work, a pragmatic method is proposed to minimize the computation time by using a combination of several criteria. It makes no use of prior knowledge about the model, is very simple, intuitive and can be automated: all convenient features in engineering applications. A case study is used to show an application of the method, and the results indicate that the required number of simulations strongly depends on the model output(s) selected, and on the type and desired accuracy of the analysis conducted. Hence, no prior indication is available regarding the necessary number of MC simulations, but the proposed method is capable of dealing with these variations and stopping the calculations after convergence is reached.
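A single-criterion version of such a stopping rule, far simpler than the multi-criteria method proposed in the paper, can be sketched as follows: keep adding batches of MC runs until the 95% confidence half-width of the output mean falls below a relative tolerance. The scalar "treatment model" here is a hypothetical stand-in for one selected model output.

```python
import math
import random

random.seed(3)

def mc_until_converged(model, rel_tol=0.01, batch=500, max_runs=200000):
    """Run the model in batches and stop when the 95% confidence half-width of
    the output mean drops below rel_tol * |mean| (or max_runs is reached)."""
    n, s, s2 = 0, 0.0, 0.0
    while n < max_runs:
        for _ in range(batch):
            y = model()
            s += y
            s2 += y * y
        n += batch
        mean = s / n
        var = (s2 - n * mean * mean) / (n - 1)     # sample variance of outputs
        half = 1.96 * math.sqrt(var / n)           # 95% CI half-width of mean
        if half < rel_tol * abs(mean):
            return mean, half, n
    return mean, half, n

# Hypothetical model output: mean 2.0 with some spread.
model = lambda: 2.0 + random.gauss(0.0, 0.5)
mean, half, n = mc_until_converged(model)
print(mean, half, n)
```

As the abstract notes, the required `n` depends on the chosen output and tolerance: tightening `rel_tol` by a factor of two roughly quadruples the number of runs, and each monitored output would need its own criterion.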
Computer Simulation of Mutagenesis.
North, J. C.; Dent, M. T.
1978-01-01
A FORTRAN program is described which simulates point-substitution mutations in the DNA strands of typical organisms. Its objective is to help students to understand the significance and structure of the genetic code, and the mechanisms and effect of mutagenesis. (Author/BB)
A Monte Carlo simulation for the field theory with quartic interaction
Energy Technology Data Exchange (ETDEWEB)
Santos, Sergio Mittmann dos [Instituto Federal de Educacao, Ciencia e Tecnologia do Rio Grande do Sul (IFRS), Porto Alegre, RS (Brazil)
2011-07-01
Full text: In the works [1-S. M. Santos, B. E. J. Bodmann and A. T. Gomez, Um novo metodo computacional para a teoria de campos na rede: resultados preliminares, IV Escola do Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, 2002; and 2-S. M. Santos and B. E. J. Bodmann, Simulacao na rede de teorias de campos quanticos, XXVIII Congresso Nacional de Matematica Aplicada e Computacional (CNMAC), Sao Paulo, 2005], a computational method on the lattice was elaborated for the problem known as scalar field theory with quartic interaction (see, for instance: J. R. Klauder, Beyond conventional quantization, Cambridge: Cambridge University Press, 2000). It introduced an algorithm that allows the simulation of a given field theory independently of the lattice spacing, by redefining the fields and the parameters (the mass m and the coupling constant g). This kind of approach permits varying the dimension of the lattice without changing the computational complexity of the algorithm. A simulation was made using the Monte Carlo method, in which the renormalized mass m{sub R}, the renormalized coupling constant g{sub R} and the two-point correlation function were successfully determined. In the present work, the original computational method is used for new simulations. Now, the Monte Carlo method is used not only for the simulation of the algorithm, as in [1, 2], but also for determining the adjustment parameters (the mass and the coupling constant), which were introduced ad hoc in [1, 2]. This work presents the outcomes of the first such simulations, in which better results than those of [1, 2] were obtained for the renormalized mass and the renormalized coupling constant. (author)
A Monte Carlo-based model for simulation of digital chest tomo-synthesis
International Nuclear Information System (INIS)
Ullman, G.; Dance, D. R.; Sandborg, M.; Carlsson, G. A.; Svalkvist, A.; Baath, M.
2010-01-01
The aim of this work was to calculate synthetic digital chest tomo-synthesis projections using a computer simulation model based on the Monte Carlo method. An anthropomorphic chest phantom was scanned in a computed tomography scanner, segmented and included in the computer model to allow for simulation of realistic high-resolution X-ray images. The input parameters to the model were adapted to correspond to the VolumeRAD chest tomo-synthesis system from GE Healthcare. Sixty tomo-synthesis projections were calculated with projection angles ranging from + 15 to -15 deg. The images from primary photons were calculated using an analytical model of the anti-scatter grid and a pre-calculated detector response function. The contributions from scattered photons were calculated using an in-house Monte Carlo-based model employing a number of variance reduction techniques such as the collision density estimator. Tomographic section images were reconstructed by transferring the simulated projections into the VolumeRAD system. The reconstruction was performed for three types of images using: (i) noise-free primary projections, (ii) primary projections including contributions from scattered photons and (iii) projections as in (ii) with added correlated noise. The simulated section images were compared with corresponding section images from projections taken with the real, anthropomorphic phantom from which the digital voxel phantom was originally created. The present article describes a work in progress aiming towards developing a model intended for optimisation of chest tomo-synthesis, allowing for simulation of both existing and future chest tomo-synthesis systems. (authors)
On stochastic error and computational efficiency of the Markov Chain Monte Carlo method
Li, Jun
2014-01-01
In Markov Chain Monte Carlo (MCMC) simulations, thermal equilibria quantities are estimated by ensemble average over a sample set containing a large number of correlated samples. These samples are selected in accordance with the probability distribution function, known from the partition function of equilibrium state. As the stochastic error of the simulation results is significant, it is desirable to understand the variance of the estimation by ensemble average, which depends on the sample size (i.e., the total number of samples in the set) and the sampling interval (i.e., cycle number between two consecutive samples). Although large sample sizes reduce the variance, they increase the computational cost of the simulation. For a given CPU time, the sample size can be reduced greatly by increasing the sampling interval, while having the corresponding increase in variance be negligible if the original sampling interval is very small. In this work, we report a few general rules that relate the variance with the sample size and the sampling interval. These results are observed and confirmed numerically. These variance rules are derived for the MCMC method but are also valid for the correlated samples obtained using other Monte Carlo methods. The main contribution of this work includes the theoretical proof of these numerical observations and the set of assumptions that lead to them. © 2014 Global-Science Press.
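The effect described, that thinning a strongly correlated chain barely increases the variance of the ensemble average, can be demonstrated numerically with an AR(1) process standing in for MCMC output and a batch-means estimate of the variance of the mean. This is an illustration of the phenomenon, not the paper's derivation; the correlation strength and batch count are assumptions.

```python
import random
import statistics

random.seed(11)

def ar1_chain(n, rho=0.9):
    """Correlated sample stream standing in for MCMC output (AR(1), mean 0)."""
    x, out = 0.0, []
    for _ in range(n):
        x = rho * x + random.gauss(0.0, 1.0)
        out.append(x)
    return out

def var_of_mean(samples, n_batches=50):
    """Batch-means estimate of Var[sample mean], valid for correlated samples
    as long as each batch is much longer than the correlation time."""
    b = len(samples) // n_batches
    means = [sum(samples[i * b:(i + 1) * b]) / b for i in range(n_batches)]
    return statistics.variance(means) / n_batches

chain = ar1_chain(200_000, rho=0.9)
v_full = var_of_mean(chain)        # all samples, sampling interval 1
thinned = chain[::10]              # sampling interval 10: one-tenth the samples
v_thin = var_of_mean(thinned)
print(v_full, v_thin)
```

For this chain the integrated correlation time is about 19 cycles, so keeping every tenth sample leaves the variance of the mean nearly unchanged while cutting storage and post-processing tenfold, which is the trade-off the variance rules quantify.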
IB: A Monte Carlo simulation tool for neutron scattering instrument design under PVM and MPI
International Nuclear Information System (INIS)
Zhao Jinkui
2011-01-01
Design of modern neutron scattering instruments relies heavily on Monte Carlo simulation tools for optimization. IB is one such tool written in C++ and implemented under Parallel Virtual Machine and the Message Passing Interface. The program was initially written for the design and optimization of the EQ-SANS instrument at the Spallation Neutron Source. One of its features is the ability to group simple instrument components into more complex ones at the user input level, e.g. grouping neutron mirrors into neutron guides and curved benders. The simulation engine manages the grouped components such that neutrons entering a group are properly operated upon by all components, multiple times if needed, before exiting the group. Thus, only a few basic optical modules are needed at the programming level. For simulations that require higher computer speeds, the program can be compiled and run in parallel modes using either the PVM or the MPI architectures.
Monte Carlo simulations of microchannel plate detectors I: steady-state voltage bias results
Energy Technology Data Exchange (ETDEWEB)
Ming Wu, Craig Kruschwitz, Dane Morgan, Jiaming Morgan
2008-07-01
X-ray detectors based on straight-channel microchannel plates (MCPs) are a powerful diagnostic tool for two-dimensional, time-resolved imaging and time-resolved x-ray spectroscopy in the fields of laser-driven inertial confinement fusion and fast z-pinch experiments. Understanding the behavior of microchannel plates as used in such detectors is critical to understanding the data obtained. The subject of this paper is a Monte Carlo computer code we have developed to simulate the electron cascade in a microchannel plate under a static applied voltage. Also included in the simulation is elastic reflection of low-energy electrons from the channel wall, which is important at lower voltages. When model results were compared to measured microchannel plate sensitivities, good agreement was found. Spatial resolution simulations of MCP-based detectors were also presented and found to agree with experimental measurements.
International Nuclear Information System (INIS)
Yanez, R.; Dempsey, J. F.
2007-01-01
We present studies in support of the development of a magnetic resonance imaging (MRI) guided intensity modulated radiation therapy (IMRT) device for the treatment of cancer patients. Fast and accurate computation of the absorbed ionizing radiation dose delivered in the presence of the MRI magnetic field is required for clinical implementation. The fast Monte Carlo simulation code DPM, optimized for radiotherapy treatment planning, is modified to simulate absorbed doses in uniform, static magnetic fields, and benchmarked against PENELOPE. Simulations of dose deposition in inhomogeneous phantoms in which a low-density material is sandwiched in water show that a lower MRI field strength (0.3 T) is preferable in order to avoid dose build-up near material boundaries. (authors)
Monte Carlo Simulation for LINAC Standoff Interrogation of Nuclear Material
International Nuclear Information System (INIS)
Clarke, Shaun D.; Flaska, Marek; Miller, Thomas Martin; Protopopescu, Vladimir A.; Pozzi, Sara A.
2007-01-01
The development of new techniques for the interrogation of shielded nuclear materials relies on the use of Monte Carlo codes to accurately simulate the entire system, including the interrogation source, the fissile target and the detection environment. The objective of this modeling effort is to develop analysis tools and methods, based on a relevant scenario, which may be applied to the design of future systems for active interrogation at a standoff. For the specific scenario considered here, the analysis focuses on providing the information needed to determine the type and optimum position of the detectors. This report describes the results of simulations for a detection system employing gamma rays to interrogate fissile and nonfissile targets. The simulations were performed using specialized versions of the codes MCNPX and MCNP-PoliMi. Both the prompt neutron and gamma-ray fluxes and the delayed neutron fluxes have been mapped in three dimensions. The time dependence of the prompt neutrons in the system has also been characterized. For this particular scenario, the flux maps generated with the Monte Carlo model indicate that the detectors should be placed approximately 50 cm behind the exit of the accelerator, 40 cm away from the vehicle, and 150 cm above the ground. This position minimizes the number of neutrons coming from the accelerator structure and also receives the maximum flux of prompt neutrons coming from the source. The lead shielding around the accelerator minimizes the gamma-ray background from the accelerator in this area. The number of delayed neutrons emitted from the target is approximately seven orders of magnitude less than the number of prompt neutrons emitted from the system. Therefore, in order to possibly detect the delayed neutrons, the detectors should be active only after all prompt neutrons have scattered out of the system. Preliminary results have shown this time to be greater than 5 μs after the accelerator pulse. This type of system is illustrative of a
Direct Simulation Monte Carlo Application of the Three Dimensional Forced Harmonic Oscillator Model
2017-12-07
Journal article, 7 December 2017 (reporting period 24 February 2017 to 31 December 2017). An implementation of the three-dimensional forced harmonic oscillator model is proposed. The implementation employs precalculated lookup tables for transition probabilities and is suitable for the direct simulation Monte Carlo method. It takes into account the microscopic reversibility between the excitation and deexcitation processes, and it satisfies detailed balance.
International Nuclear Information System (INIS)
Danielson, Thomas; Sutton, Jonathan E.; Hin, Céline; Virginia Polytechnic Institute and State University; Savara, Aditya
2017-01-01
Lattice-based kinetic Monte Carlo (KMC) simulations offer a powerful technique for investigating large reaction networks while retaining spatial configuration information, unlike ordinary differential equations. However, large chemical reaction networks can contain reaction processes with rates spanning multiple orders of magnitude. This can lead to the problem of "KMC stiffness" (similar to stiffness in differential equations), where the computational expense has the potential to be overwhelming: very short time-steps mean the simulation spends an inordinate number of KMC steps and an inordinate amount of CPU time simulating fast frivolous processes (FFPs) without progressing the system (reaction network). In order to achieve simulation times that are experimentally relevant or desired for predictions, a dynamic throttling algorithm involving separation of the processes into speed ranks based on event frequencies has been designed and implemented, with the intent of decreasing the probability of FFP events and increasing the probability of slow-process events, allowing rate-limiting events to become more likely to be observed in KMC simulations. This Staggered Quasi-Equilibrium Rank-based Throttling for Steady-state (SQERTSS) algorithm is designed for use in achieving and simulating steady-state conditions in KMC simulations. Lastly, as shown in this work, the SQERTSS algorithm also works for transient conditions: the correct configuration space and final state will still be achieved if the required assumptions are not violated, with the caveat that the sizes of the time-steps may be distorted during the transient period.
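The stiffness problem described above is easy to reproduce with a minimal rejection-free KMC loop. In this illustrative Python sketch (the process names and rates are invented for the example, not taken from the paper), a process six orders of magnitude faster than the rate-limiting one consumes essentially every KMC step:

```python
import math
import random

def kmc_steps(rates, n_steps=10000, seed=0):
    """Run n_steps of standard rejection-free KMC for a fixed rate table;
    return per-process event counts and the total simulated time."""
    rng = random.Random(seed)
    names = list(rates)
    total_rate = sum(rates.values())
    counts = {k: 0 for k in names}
    t = 0.0
    for _ in range(n_steps):
        # choose a process with probability proportional to its rate
        u = rng.random() * total_rate
        acc = 0.0
        for k in names:
            acc += rates[k]
            if u < acc:
                counts[k] += 1
                break
        # advance time by an exponentially distributed increment
        t += -math.log(rng.random()) / total_rate
    return counts, t

# rates spanning six orders of magnitude: the "stiff" case
counts, t = kmc_steps({"fast_diffusion": 1e6, "slow_reaction": 1.0})
```

With these rates the slow, rate-limiting process is expected to fire roughly once per million steps, which is exactly the waste that rank-based throttling aims to suppress.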
Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries
Directory of Open Access Journals (Sweden)
Ilić Radovan D.
2002-01-01
Full Text Available This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of proton beam characterization by means of the multi-layer Faraday cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice.
Monte Carlo simulations of lattice models for single polymer systems
Hsu, Hsiao-Ping
2014-10-01
Single linear polymer chains in dilute solutions under good solvent conditions are studied by Monte Carlo simulations with the pruned-enriched Rosenbluth method up to the chain length N ˜ O(10^4). Based on the standard simple cubic lattice model (SCLM) with fixed bond length and the bond fluctuation model (BFM) with bond lengths in a range between 2 and sqrt{10}, we investigate the conformations of polymer chains described by self-avoiding walks on the simple cubic lattice, and by random walks and non-reversible random walks in the absence of excluded volume interactions. In addition to flexible chains, we also extend our study to semiflexible chains for different stiffness controlled by a bending potential. The persistence lengths of chains extracted from the orientational correlations are estimated for all cases. We show that chains based on the BFM are more flexible than those based on the SCLM for a fixed bending energy. The microscopic differences between these two lattice models are discussed and the theoretical predictions of scaling laws given in the literature are checked and verified. Our simulations clarify that a different mapping ratio between the coarse-grained models and the atomistically realistic description of polymers is required in a coarse-graining approach due to the different crossovers to the asymptotic behavior.
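The pruned-enriched Rosenbluth method used above builds on plain Rosenbluth sampling of self-avoiding walks, which is compact enough to sketch. The Python below grows walks on the simple cubic lattice and forms a weighted estimate of the mean squared end-to-end distance; it is a minimal illustration (no pruning or enrichment, short chains only), not the code used in the study.

```python
import random

# the six nearest-neighbor displacements on the simple cubic lattice
NEIGHBORS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def rosenbluth_walk(n_steps, rng):
    """Grow one self-avoiding walk; return (sites, weight),
    or (None, 0.0) if the walk traps itself."""
    walk = [(0, 0, 0)]
    occupied = {(0, 0, 0)}
    weight = 1.0
    for _ in range(n_steps):
        x, y, z = walk[-1]
        free = [(x + dx, y + dy, z + dz) for dx, dy, dz in NEIGHBORS
                if (x + dx, y + dy, z + dz) not in occupied]
        if not free:
            return None, 0.0      # trapped: attrition event
        weight *= len(free)       # Rosenbluth weight factor
        nxt = rng.choice(free)
        walk.append(nxt)
        occupied.add(nxt)
    return walk, weight

def mean_squared_end_to_end(n_steps=20, n_samples=2000, seed=7):
    """Weighted (Rosenbluth) estimate of <R^2> for SAWs of n_steps steps."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        walk, w = rosenbluth_walk(n_steps, rng)
        if walk is not None:
            x, y, z = walk[-1]
            num += w * (x * x + y * y + z * z)
            den += w
    return num / den
```

Because excluded volume swells the chain, the estimate should exceed the random-walk value <R^2> = N (here 20) and scale roughly as N^(2*0.588) for long chains.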
Vacuum thermochromatography: physical principles and Monte Carlo simulation
International Nuclear Information System (INIS)
Zvara, I.
2014-01-01
The title method for preparative separation of infinitesimal amounts of relatively volatile elements or compounds with different adsorbability is based on molecular flow in an evacuated open column with an imposed temperature gradient. The analytes, put into the column's closed 'hot' end, begin to migrate owing to random flights of their molecules between two consecutive collisions with the wall. Each strike results in adsorption of the entity on the surface for a random time whose mean increases downstream; as a result, the various analytes come to practical rest in individual temperature ranges. Here, the microscopic picture of the molecular histories is described in quantitative detail, assuming that the velocity vectors of the desorbing molecules obey the cosine-law angular distribution. The probability density functions for the full and projected flight lengths in long cylinders are derived. They were used in Monte Carlo simulation of a great many migration histories to obtain the peak profiles of the deposits. Numerous particular sets of experimental regimes and conditions were simulated to elucidate the influence of these variables on the profiles and on the characteristic deposition temperatures.
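The cosine-law angular distribution assumed for the desorbing molecules can be sampled by inversion: with p(θ) proportional to cos θ sin θ on the hemisphere, the cumulative distribution is sin²θ, so θ = arcsin(√u) for uniform u. A minimal Python sketch (the function and variable names are ours, not the paper's):

```python
import math
import random

def sample_cosine_law(rng):
    """Sample a desorption direction from the cosine (Lambert) law.
    Returns (theta, phi): polar angle from the surface normal and azimuth."""
    theta = math.asin(math.sqrt(rng.random()))  # inversion of CDF = sin^2(theta)
    phi = 2.0 * math.pi * rng.random()          # azimuth is uniform
    return theta, phi
```

A quick consistency check: for the cosine law the mean of cos θ is exactly 2/3, which a sample average should reproduce.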
Monte Carlo Simulations of Necrotic Cell Targeted Alpha Therapy
International Nuclear Information System (INIS)
Penfold, S.N.; Brown, M.P.; Bezak, E.
2011-01-01
Full text: Hypoxic tumour cells are radioresistant and are significant contributors to the locoregional recurrences and distant metastases that mark treatment failure. Due to restricted circulatory supply, hypoxic tumour cells frequently become necrotic, and thus necrotic areas often lie near hypoxic tumour areas. In this study we investigate the feasibility of binding an alpha-emitting conjugate to necrotic cells located in the proximity of hypoxic, viable tumour cells. Monte Carlo radiation transport simulations were performed to investigate the dose distribution resulting from the thorium-227 (Th227) decay chain in a representative tumour geometry. The Geant4 software toolkit was used to simulate the decay and interactions of the Th227 decay chain. The distribution of Th227 was based on a study by Thomlinson and Gray of human lung cancer histological samples (Thomlinson RH, Gray LH. Br J Cancer 1955; 9:539). The normalized dose distribution obtained with Geant4 from a cylindrical Th227 source in water is illustrated in Fig. 1. The relative contribution of the different decay channels is displayed, together with a profile through the centre of the accumulated dose map. The results support the hypothesis that significant α-particle doses will be deposited in the hypoxic tumour tissue immediately surrounding the necrotic core (where the majority of Th227 will be located). As an internal α-particle generator, the Th227 radioimmunoconjugate shows potential as an efficient hypoxic tumour sterilizer.
Monte-Carlo simulation of dispersion fuel meat structure
International Nuclear Information System (INIS)
Xing Zhonghu; Ying Shihao
2003-01-01
Under the irradiation conditions in research reactors, inter-diffusion occurs at the fuel particle and matrix interfaces of U3Si2-Al dispersion fuel. Because of the inter-diffusion reaction, a U3Al7Si2 layer is formed around each U3Si2 particle. The layer thickness grows with irradiation duration and fission density. The formation of the resultant layer causes the consumption of U3Si2 fuel and aluminum matrix. This process leads to the evolution of the geometrical structure of the fuel meat. Based on the stochastic locations of particles in the dispersion, the authors developed a Monte Carlo simulation method for the evolution of the fuel meat structure. Every particle is characterized by its diameter and location. The parameters of the meat structure include particle size distribution, as-fabricated fuel volume fraction, resultant layer thickness, layer volume fraction, U3Si2 fuel volume fraction, aluminum volume fraction, contiguity probability and inter-linkage fraction of particles. In particular, for a dispersion with an as-fabricated fuel volume fraction of 43% and particle sizes in a well-defined normal distribution, more than 13000 sampled particles are simulated in a meat volume of 6 mm x 6 mm x 0.5 mm. The meat structure parameters are calculated as functions of layer thickness in the range from 0 to 16 μm. (authors)
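Volume-fraction parameters of the kind listed above can be estimated by uniform point sampling over the meat volume. The Python sketch below does this for a toy geometry of fuel spheres with an outward-growing layer of fixed thickness; the geometry, the layer model, and all numbers are illustrative assumptions, not the authors' 13000-particle model.

```python
import random

def volume_fractions(particles, t, box, n_points=20000, seed=3):
    """Estimate (fuel, layer, matrix) volume fractions by uniform point
    sampling.  particles: list of (cx, cy, cz, r) spheres; t: layer
    thickness grown outward from each sphere; box: (Lx, Ly, Lz)."""
    rng = random.Random(seed)
    fuel = layer = 0
    for _ in range(n_points):
        p = [rng.uniform(0.0, s) for s in box]
        # signed distance from the point to the nearest sphere surface
        d = min(((p[0] - cx) ** 2 + (p[1] - cy) ** 2
                 + (p[2] - cz) ** 2) ** 0.5 - r
                for cx, cy, cz, r in particles)
        if d < 0.0:
            fuel += 1           # inside a fuel particle
        elif d < t:
            layer += 1          # inside the reaction layer shell
    return fuel / n_points, layer / n_points, 1.0 - (fuel + layer) / n_points
```

For a single unit sphere with a 0.5-thick layer in a 4 x 4 x 4 box the exact fractions are (4π/3)/64 ≈ 0.0654 for fuel and (4π/3)(1.5³ − 1)/64 ≈ 0.155 for the layer, which the sampler should recover to within statistical noise.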
Monte Carlo simulation of electron swarms in H2
International Nuclear Information System (INIS)
Hunter, S.R.
1976-05-01
A Monte Carlo simulation of the motion of an electron swarm in molecular hydrogen was performed in the range E/N = 1.4-170 Td (1 Td = 10^-17 V cm^2). The simulation followed 400-600 electrons at several values of E/N, for two different sets of inelastic collision cross sections at high values of E/N. The longitudinal diffusion coefficient D_L, lateral diffusion coefficient D, swarm drift velocity W, average swarm energy epsilon, and the ionization and excitation production coefficients were obtained and compared with experimental results where available. It was found that the results obtained differ significantly from the experimental values, and this is attributed to the isotropic scattering model used in this work. However, the results lend support to the experimental technique reported by Blevin et al. used to determine these transport parameters, and in particular confirm their result that D_L > D at high values of E/N. (author)
Fluid simulation for computer graphics
Bridson, Robert
2008-01-01
Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.
Fel simulations using distributed computing
Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.
2016-01-01
While simulation tools are available and have been used regularly for simulating light sources, including free-electron lasers, the increasing availability and lower cost of accelerated computing opens up new opportunities. This paper highlights a method for accelerating and parallelizing code
The Cherenkov Telescope Array production system for Monte Carlo simulations and analysis
Arrabito, L.; Bernloehr, K.; Bregeon, J.; Cumani, P.; Hassan, T.; Haupt, A.; Maier, G.; Moralejo, A.; Neyroud, N.; for the CTA Consortium
2017-10-01
The Cherenkov Telescope Array (CTA), an array of many tens of Imaging Atmospheric Cherenkov Telescopes deployed on an unprecedented scale, is the next-generation instrument in the field of very high energy gamma-ray astronomy. An average data stream of about 0.9 GB/s for about 1300 hours of observation per year is expected, therefore resulting in 4 PB of raw data per year and a total of 27 PB/year, including archive and data processing. The start of CTA operation is foreseen in 2018 and it will last about 30 years. The installation of the first telescopes in the two selected locations (Paranal, Chile and La Palma, Spain) will start in 2017. In order to select the best site candidate to host CTA telescopes (in the Northern and in the Southern hemispheres), massive Monte Carlo simulations have been performed since 2012. Once the two sites have been selected, we have started new Monte Carlo simulations to determine the optimal array layout with respect to the obtained sensitivity. Taking into account that CTA may be finally composed of 7 different telescope types coming in 3 different sizes, many different combinations of telescope position and multiplicity as a function of the telescope type have been proposed. This last Monte Carlo campaign represented a huge computational effort, since several hundreds of telescope positions have been simulated, while for future instrument response function simulations, only the operating telescopes will be considered. In particular, during the last 18 months, about 2 PB of Monte Carlo data have been produced and processed with different analysis chains, with a corresponding overall CPU consumption of about 125 M HS06 hours. In these proceedings, we describe the employed computing model, based on the use of grid resources, as well as the production system setup, which relies on the DIRAC interware. Finally, we present the envisaged evolutions of the CTA production system for the off-line data processing during CTA operations and
The Monte Carlo Simulation Method for System Reliability and Risk Analysis
Zio, Enrico
2013-01-01
Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling. Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques. This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
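As a minimal illustration of Monte Carlo reliability estimation in the spirit of the book's academic examples, the Python sketch below estimates the failure probability of a small series-parallel system. The system structure and the component failure probabilities are invented for the example; the exact answer is 1 − (1 − q1)(1 − q2·q3), so the estimate can be checked directly.

```python
import random

def system_failure_probability(q1, q2, q3, n_trials=100000, seed=11):
    """Crude Monte Carlo estimate of system unreliability: the system
    works if component 1 works AND at least one of components 2, 3 works.
    q1, q2, q3 are the independent component failure probabilities."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        c1 = rng.random() >= q1   # True means "component works"
        c2 = rng.random() >= q2
        c3 = rng.random() >= q3
        if not (c1 and (c2 or c3)):
            failures += 1
    return failures / n_trials
```

With q1 = 0.05 and q2 = q3 = 0.2 the exact unreliability is 1 − 0.95 × 0.96 = 0.088.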
Modelling of an industrial environment, part 1.: Monte Carlo simulations of photon transport
International Nuclear Information System (INIS)
Kis, Z.; Eged, K.; Meckbach, R.; Voigt, G.
2002-01-01
After a nuclear accident releasing radioactive material into the environment, the external exposures may contribute significantly to the radiation exposure of the population (UNSCEAR 1988, 2000). For urban populations the external gamma exposure from radionuclides deposited on the surfaces of urban-industrial environments yields the dominant contribution to the total dose to the public (Kelly 1987; Jacob and Meckbach 1990). The radiation field is naturally influenced by the environment around the sources. For calculations of the shielding effect of the structures in complex and realistic urban environments, Monte Carlo methods have turned out to be useful tools (Jacob and Meckbach 1987; Meckbach et al. 1988). Using these methods a complex environment can be set up in which the photon transport can be solved in a reliable way. The accuracy of the methods is in principle limited only by the knowledge of the atomic cross sections and by the computational time. Several papers using Monte Carlo results for calculating doses from external gamma exposures have been published (Jacob and Meckbach 1987, 1990; Meckbach et al. 1988; Rochedo et al. 1996). In these papers the Monte Carlo simulations were run in urban environments and for different photon energies. An industrial environment can be defined as an area where productive and/or commercial activity is carried out; a factory or a supermarket is a good example. An industrial environment can differ considerably from urban ones in the types, structures and dimensions of the buildings. These variations affect the radiation field of such an environment. Hence there is a need to run new Monte Carlo simulations designed specifically for industrial environments.
Computer simulation of high energy displacement cascades
International Nuclear Information System (INIS)
Heinisch, H.L.
1990-01-01
A methodology developed for modeling many aspects of high energy displacement cascades with molecular level computer simulations is reviewed. The initial damage state is modeled in the binary collision approximation (using the MARLOWE computer code), and the subsequent disposition of the defects within a cascade is modeled with a Monte Carlo annealing simulation (the ALSOME code). There are few adjustable parameters, and none are set to physically unreasonable values. The basic configurations of the simulated high energy cascades in copper, i.e., the number, size and shape of damage regions, compare well with observations, as do the measured numbers of residual defects and the fractions of freely migrating defects. The success of these simulations is somewhat remarkable, given the relatively simple models of defects and their interactions that are employed. The reason for this success is that the behavior of the defects is very strongly influenced by their initial spatial distributions, which the binary collision approximation adequately models. The MARLOWE/ALSOME system, with input from molecular dynamics and experiments, provides a framework for investigating the influence of high energy cascades on microstructure evolution. (author)
International Nuclear Information System (INIS)
Dinpajooh, Mohammadhasan; Bai, Peng; Allan, Douglas A.; Siepmann, J. Ilja
2015-01-01
Since the seminal paper by Panagiotopoulos [Mol. Phys. 61, 813 (1987)], the Gibbs ensemble Monte Carlo (GEMC) method has been the most popular particle-based simulation approach for the computation of vapor-liquid phase equilibria. However, the validity of GEMC simulations in the near-critical region has been questioned because rigorous finite-size scaling approaches cannot be applied to simulations with fluctuating volume. Valleau [Mol. Simul. 29, 627 (2003)] has argued that GEMC simulations would lead to a spurious overestimation of the critical temperature. More recently, Patel et al. [J. Chem. Phys. 134, 024101 (2011)] opined that the use of analytical tail corrections would be problematic in the near-critical region. To address these issues, we perform extensive GEMC simulations for Lennard-Jones particles in the near-critical region, varying the system size, the overall system density, and the cutoff distance. For a system with N = 5500 particles, potential truncation at 8σ and analytical tail corrections, an extrapolation of GEMC simulation data at temperatures in the range from 1.27 to 1.305 yields Tc = 1.3128 ± 0.0016, ρc = 0.316 ± 0.004, and pc = 0.1274 ± 0.0013, in excellent agreement with the thermodynamic limit determined by Potoff and Panagiotopoulos [J. Chem. Phys. 109, 10914 (1998)] using grand canonical Monte Carlo simulations and finite-size scaling. Critical properties estimated using GEMC simulations with different overall system densities (0.296 ≤ ρt ≤ 0.336) agree to within the statistical uncertainties. For simulations with tail corrections, data obtained using rcut = 3.5σ yield Tc and pc that are higher by 0.2% and 1.4% than simulations with rcut = 5 and 8σ, but still with overlapping 95% confidence intervals. In contrast, GEMC simulations with a truncated and shifted potential show that rcut = 8σ is insufficient to obtain accurate results. Additional GEMC simulations for hard-core square-well particles with various
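The analytical tail corrections at issue are the standard textbook expressions for a truncated Lennard-Jones fluid in reduced units. The Python below is a transcription of those generic formulas (not code from the paper), useful for seeing how quickly the corrections decay with the cutoff radius:

```python
import math

def lj_tail_energy_per_particle(rho, rcut):
    """Standard tail correction to the energy per particle of a truncated
    Lennard-Jones fluid, reduced units:
    u_tail = (8/3) pi rho [ (1/3) rcut^-9 - rcut^-3 ]"""
    return (8.0 / 3.0) * math.pi * rho * ((1.0 / 3.0) * rcut ** -9 - rcut ** -3)

def lj_tail_pressure(rho, rcut):
    """Standard tail correction to the pressure, reduced units:
    p_tail = (16/3) pi rho^2 [ (2/3) rcut^-9 - rcut^-3 ]"""
    return (16.0 / 3.0) * math.pi * rho ** 2 * ((2.0 / 3.0) * rcut ** -9 - rcut ** -3)
```

Both corrections are negative (the neglected attractive tail lowers energy and pressure) and fall off as rcut^-3, which is why the abstract's comparison of rcut = 3.5σ, 5σ and 8σ is informative.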
MONTE CARLO METHOD AND APPLICATION IN @RISK SIMULATION SYSTEM
Directory of Open Access Journals (Sweden)
Gabriela Ižaríková
2015-12-01
Full Text Available The article is an example of using the simulation software @Risk, designed for simulation in Microsoft Excel spreadsheets, and demonstrates its use as a universal method of solving problems. Simulation is experimenting with computer models based on a real production process in order to optimize the production processes or the system. A simulation model allows one to perform a number of experiments, analyse them, evaluate, optimize and afterwards apply the results to the real system. A simulation model in general represents the modelled system by mathematical formulations and logical relations. In the model it is possible to distinguish controlled inputs (for instance, investment costs) and random inputs (for instance, demand), which the model transforms into outputs (for instance, the mean value of profit). In a simulation experiment the controlled inputs are chosen at the beginning and the random (stochastic) inputs are generated randomly. Simulations belong among quantitative tools, which can be used as a support for decision making.
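The workflow described (fixed controlled inputs, randomly generated inputs, outputs summarized over many replications) can be mirrored in a few lines of plain Python. All input values below are invented for the illustration and stand in for spreadsheet cells:

```python
import random
import statistics

def simulate_profit(n_trials=20000, seed=5):
    """Spreadsheet-style Monte Carlo: controlled inputs are fixed,
    demand is a random input, profit is the output distribution."""
    rng = random.Random(seed)
    unit_price, unit_cost, investment = 12.0, 7.0, 10000.0  # controlled inputs
    profits = []
    for _ in range(n_trials):
        demand = max(rng.gauss(3000.0, 400.0), 0.0)  # random input
        profits.append(demand * (unit_price - unit_cost) - investment)
    return statistics.mean(profits), statistics.stdev(profits)
```

With these inputs the expected profit is 3000 x 5 − 10000 = 5000 with a standard deviation of about 2000, so the replication summary can be sanity-checked against the analytic values.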
Energy Technology Data Exchange (ETDEWEB)
McGrath, M; Siepmann, J I; Kuo, I W; Mundy, C J; VandeVondele, J; Hutter, J; Mohamed, F; Krack, M
2004-12-02
A series of first principles Monte Carlo simulations in the isobaric-isothermal ensemble were carried out for liquid water at ambient conditions (T = 298 K and p = 1 atm). The Becke-Lee-Yang-Parr (BLYP) exchange and correlation energy functionals and norm-conserving Goedecker-Teter-Hutter (GTH) pseudopotentials were employed with the CP2K simulation package to examine systems consisting of 64 water molecules. The fluctuations in the system volume encountered in simulations in the isobaric-isothermal ensemble require a reconsideration of the suitability of the typical charge density cutoff and the regular grid generation method previously used for the computation of the electrostatic energy in first principles simulations in the microcanonical or canonical ensembles. In particular, it is noted that a much higher cutoff is needed and that the most computationally efficient method of creating grids can result in poor simulations. Analysis of the simulation trajectories using a very large charge density cutoff of 1200 Ry and four different grid generation methods points to a substantially underestimated liquid density of about 0.85 g/cm^3, resulting in a somewhat understructured liquid (with a value of about 2.7 for the height of the first peak in the oxygen/oxygen radial distribution function) for BLYP-GTH water at ambient conditions.
Monte Carlo simulation of radiative processes in electron-positron scattering
International Nuclear Information System (INIS)
Kleiss, R.H.P.
1982-01-01
The Monte Carlo simulation of scattering processes has turned out to be one of the most successful methods of translating theoretical predictions into experimentally meaningful quantities. It is the purpose of this thesis to describe how this approach can be applied to higher-order QED corrections to several fundamental processes. In chapter II a very brief overview of the currently interesting phenomena in e+e- scattering is given. It is argued that accurate information on higher-order QED corrections is very important and that the Monte Carlo approach is one of the most flexible and general methods to obtain this information. In chapter III the author describes various techniques which are useful in this context, and makes a few remarks on the numerical aspects of the proposed method. In the following three chapters he applies this to the processes e+e- → μ+μ-(γ) and e+e- → q anti-q(γ). In chapter IV he motivates his choice of these processes in view of their experimental and theoretical relevance. The formulae necessary for a computer simulation of all quantities of interest, up to order α^3, are given. Chapters V and VI describe how this simulation can be performed using the techniques mentioned in chapter III. In chapter VII it is shown how additional dynamical quantities, namely the polarization of the incoming and outgoing particles, can be incorporated in the treatment, and the relevant formulae for the example processes mentioned above are given. Finally, in chapter VIII the author presents some examples of the comparison between theoretical predictions based on Monte Carlo simulations as outlined here, and the results from actual experiments. (Auth.)
Confidence interval procedures for Monte Carlo transport simulations
International Nuclear Information System (INIS)
Pederson, S.P.
1997-01-01
The problem of obtaining valid confidence intervals based on estimates from sampled distributions using Monte Carlo particle transport simulation codes such as MCNP is examined. Such intervals can cover the true parameter of interest at a lower than nominal rate if the sampled distribution is extremely right-skewed by large tallies. Modifications to the standard theory of confidence intervals are discussed and compared with some existing heuristics, including batched means normality tests. Two new types of diagnostics are introduced to assess whether the conditions of central limit theorem-type results are satisfied: the relative variance of the variance determines whether the sample size is sufficiently large, and estimators of the slope of the right tail of the distribution are used to indicate the number of moments that exist. A simulation study is conducted to quantify the relationship between various diagnostics and coverage rates and to find sample-based quantities useful in indicating when intervals are expected to be valid. Simulated tally distributions are chosen to emulate behavior seen in difficult particle transport problems. Measures of variation in the sample variance s^2 are found to be much more effective than existing methods in predicting when coverage will be near nominal rates. Batched means tests are found to be overly conservative in this regard. A simple but pathological MCNP problem is presented as an example of false convergence using existing heuristics. The new methods readily detect the false convergence and show that the results of the problem, which are a factor of 4 too small, should not be used. Recommendations are made for applying these techniques in practice, using the statistical output currently produced by MCNP
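A moment-based version of the "relative variance of the variance" diagnostic can be sketched as follows. This is an illustrative estimator built from sample central moments, not the exact statistic computed by MCNP; its point is that a heavy right tail inflates the fourth moment relative to the squared variance and flags unreliable intervals.

```python
def relative_variance_of_variance(xs):
    """Illustrative VOV-style diagnostic for a list of tallies xs:
    an estimate of Var(s^2) / (s^2)^2 from sample central moments.
    Large values signal a heavy right tail and an untrustworthy
    normal-theory confidence interval."""
    n = len(xs)
    m1 = sum(xs) / n
    m2 = sum((x - m1) ** 2 for x in xs) / n   # sample variance (biased)
    m4 = sum((x - m1) ** 4 for x in xs) / n   # fourth central moment
    return (m4 - m2 * m2) / (n * m2 * m2)
```

For well-behaved data the diagnostic shrinks like 1/n; a single huge tally dominates the fourth moment and makes it jump, which is the false-convergence symptom the abstract describes.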
International Nuclear Information System (INIS)
Li, Junli; Qiu, Rui; Yan, Congchong; Xie, Wenzhang; Zeng, Zhi; Li, Chunyan; Wu, Zhen; Tung, Chuanjong
2015-01-01
The method of Monte Carlo simulation is a powerful tool to investigate the details of radiation biological damage at the molecular level. In this paper, a Monte Carlo code called NASIC (Nanodosimetry Monte Carlo Simulation Code) was developed. It includes a physical module, a pre-chemical module, a chemical module, a geometric module and a DNA damage module. The physical module can simulate physical tracks of low-energy electrons in liquid water event by event. Several sets of inelastic cross sections were calculated by applying the dielectric function method of Emfietzoglou's optical-data treatments, with different optical data sets and dispersion models. In the pre-chemical module, the ionised and excited water molecules undergo dissociation processes. In the chemical module, the produced radiolytic chemical species diffuse and react. In the geometric module, an atomic model of 46 chromatin fibres in a spherical nucleus of a human lymphocyte was established. In the DNA damage module, the direct damage induced by the energy depositions of the electrons and the indirect damage induced by the radiolytic chemical species were calculated. The parameters should be adjusted so that the simulation results agree with the experimental results. In this paper, the influence of the inelastic cross sections and of the vibrational excitation reaction on these parameters and on the DNA strand break yields was studied. Further work on NASIC is underway (authors)
International Nuclear Information System (INIS)
Badal, Andreu; Badano, Aldo
2009-01-01
Purpose: Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
International Nuclear Information System (INIS)
Ozaki, Y.; Watanabe, H.; Kaida, A.; Miura, M.; Nakagawa, K.; Toda, K.; Yoshimura, R.; Sumi, Y.; Kurabayashi, T.
2017-01-01
Early stage oral cancer can be cured with oral brachytherapy, but whole-body radiation exposure status has not been previously studied. Recently, the International Commission on Radiological Protection (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used the Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with {sup 192}Ir hairpins and {sup 198}Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources, and compared them with the results from the Oncentra manual Low Dose Rate Treatment Planning (mLDR) software which is used in day-to-day clinical practice. We successfully obtained data on absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with {sup 192}Ir hairpins and {sup 198}Au grains, respectively. Simulation with PHITS was reliable when compared with an alternative computational technique using the mLDR software. We concluded that the absorbed dose for each organ and whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained.
Energy Technology Data Exchange (ETDEWEB)
Badal, Andreu; Badano, Aldo [Division of Imaging and Applied Mathematics, OSEL, CDRH, U.S. Food and Drug Administration, Silver Spring, Maryland 20993-0002 (United States)
2009-11-15
Purpose: Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
Badal, Andreu; Badano, Aldo
2009-11-01
Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed-up factor was obtained using a GPU compared to a single-core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
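The physics in the records above is far beyond a short example, but the structural point, that each photon history is independent and the workload therefore maps naturally onto one GPU thread per history, can be illustrated with a minimal analog transport sketch in Python (homogeneous slab, uncollided transmission only; the attenuation coefficient and thickness are arbitrary assumptions, not PENELOPE physics):

```python
import math
import random

def transmitted_fraction(n_photons, mu, thickness, seed=0):
    """Analog Monte Carlo estimate of the uncollided photon fraction
    transmitted through a homogeneous slab.  Every history is
    independent of the others, which is the property that makes this
    kind of simulation embarrassingly parallel on a GPU."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n_photons):
        # Free path sampled from the exponential distribution.
        path = -math.log(1.0 - rng.random()) / mu
        if path > thickness:
            passed += 1
    return passed / n_photons

mu, t = 0.2, 5.0   # attenuation coefficient (1/cm) and slab thickness (cm)
est = transmitted_fraction(200_000, mu, t)
print(f"MC estimate: {est:.4f}  (analytic exp(-mu*t) = {math.exp(-mu * t):.4f})")
```

The estimate converges to the analytic Beer-Lambert value exp(-mu*t); in a GPU implementation the loop body would simply run in many threads with independent random streams.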
Computer Simulation of Electron Positron Annihilation Processes
Energy Technology Data Exchange (ETDEWEB)
Chen, y
2003-10-02
With the launch of the Next Linear Collider drawing closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of e{sup +}e{sup -} annihilation processes at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well the interfaces between different sectors of physics, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at a much lower energy scale, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create
Shielding evaluation of neutron generator hall by Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Pujala, U.; Selvakumaran, T.S.; Baskaran, R.; Venkatraman, B. [Radiological Safety Division, Indira Gandhi Center for Atomic Research, Kalpakkam (India); Thilagam, L.; Mohapatra, D.K., E-mail: swathythila2@yahoo.com [Safety Research Institute, Atomic Energy Regulatory Board, Kalpakkam (India)
2017-04-01
A shielded hall was constructed for accommodating a D-D, D-T or D-Be based pulsed neutron generator (NG) with a 4π yield of 10{sup 9} n/s. The neutron shield design of the facility was optimized using the NCRP-51 methodology such that the total dose rates outside the hall areas are well below the regulatory limit for the full occupancy criterion (1 μSv/h). However, the total dose rates at the roof top, cooling room trench exit and labyrinth exit were found to be above this limit for the optimized design. Hence, additional neutron shielding arrangements were proposed for the cooling room trench and labyrinth exits, and the roof top was made inaccessible. The present study is an attempt to evaluate the neutron and associated capture gamma transport through the bulk shields for the complete geometry and materials of the NG hall using the Monte Carlo (MC) codes MCNP and FLUKA. The neutron source terms of the D-D, D-T and D-Be reactions are considered in the simulations. The effect of the proposed additional shielding has been demonstrated through simulations carried out with the additional shielding for the D-Be neutron source term. The results of the MC simulations using the two different codes are found to be consistent with each other for the neutron dose rate estimates. However, deviations of up to 28% are noted between the two codes at a few locations for the capture gamma dose rate estimates. Overall, the dose rates estimated by the MC simulations including the additional shields show that all locations surrounding the hall satisfy the full occupancy criterion for all three types of sources. Additionally, the dose rates due to direct transmission of primary neutrons estimated by FLUKA are compared with the values calculated using the formula given in NCRP-51, which show deviations of up to 50% from each other. The details of the MC simulations and the NCRP-51 methodology for the estimation of the primary neutron dose rate, along with the results, are presented in this paper. (author)
Medical images of patients in voxel structures in high resolution for Monte Carlo simulation
International Nuclear Information System (INIS)
Boia, Leonardo S.; Menezes, Artur F.; Silva, Ademir X.
2011-01-01
This work presents a computational process for converting tomographic and MRI medical images of patients into voxel structures and then into an input file to be manipulated in a Monte Carlo simulation code for radiotherapy treatment of tumors. The scenario inherent to the patient is simulated by this process, using the volume element (voxel) as the unit of computational tracking. The head voxel structure has voxels with volumes of about 1 mm{sup 3} and a population in the millions, which favors a realistic simulation and reduces the digital image processing needed for adjustments and equalizations. With these additional data from the code, a more critical analysis can be developed to determine the volume of the tumor and the required protection. The patients' medical images were provided by Clinicas Oncologicas Integradas (COI/RJ), together with the previously performed planning. To execute this computational process, the SAPDI computational system is used for digital image processing and data optimization; the conversion program Scan2MCNP manipulates, processes, and converts the medical images into voxel-structure input files; and the graphic visualizer Moritz is used to verify the placement of the image geometry. (author)
Medical images of patients in voxel structures in high resolution for Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Boia, Leonardo S.; Menezes, Artur F.; Silva, Ademir X., E-mail: lboia@con.ufrj.b, E-mail: ademir@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear; Salmon Junior, Helio A. [Clinicas Oncologicas Integradas (COI), Rio de Janeiro, RJ (Brazil)
2011-07-01
This work presents a computational process for converting tomographic and MRI medical images of patients into voxel structures and then into an input file to be manipulated in a Monte Carlo simulation code for radiotherapy treatment of tumors. The scenario inherent to the patient is simulated by this process, using the volume element (voxel) as the unit of computational tracking. The head voxel structure has voxels with volumes of about 1 mm{sup 3} and a population in the millions, which favors a realistic simulation and reduces the digital image processing needed for adjustments and equalizations. With these additional data from the code, a more critical analysis can be developed to determine the volume of the tumor and the required protection. The patients' medical images were provided by Clinicas Oncologicas Integradas (COI/RJ), together with the previously performed planning. To execute this computational process, the SAPDI computational system is used for digital image processing and data optimization; the conversion program Scan2MCNP manipulates, processes, and converts the medical images into voxel-structure input files; and the graphic visualizer Moritz is used to verify the placement of the image geometry. (author)
International Nuclear Information System (INIS)
Schoen, M.
1995-01-01
In this article the Taylor-expansion method is introduced, by which Monte Carlo (MC) simulations in the canonical ensemble can be sped up significantly. Substantial gains in computational speed of 20-40% over conventional implementations of the MC technique are obtained over a wide range of densities in homogeneous bulk phases. The basic philosophy behind the Taylor-expansion method is a division of the neighborhood of each atom (or molecule) into three different spatial zones. Interactions between atoms belonging to each zone are treated at different levels of computational sophistication. For example, only interactions between atoms belonging to the primary zone immediately surrounding an atom are treated explicitly before and after displacement. The change in the configurational energy contribution from secondary-zone interactions is obtained from the first-order term of a Taylor expansion of the configurational energy in terms of the displacement vector d. Interactions with atoms in the tertiary zone adjacent to the secondary zone are neglected throughout. The Taylor-expansion method is not restricted to the canonical ensemble but may be employed to enhance the computational efficiency of MC simulations in other ensembles as well. This is demonstrated for grand canonical ensemble MC simulations of an inhomogeneous fluid, which can be performed essentially on a modern personal computer
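The secondary-zone shortcut can be sketched numerically. The snippet below assumes a Lennard-Jones pair potential in reduced units (an illustrative choice; the article's actual potentials and zone radii are not reproduced here): instead of re-evaluating the pair energy after a displacement, the energy change is approximated by the first-order Taylor term, which only needs the force already available at the old configuration.

```python
import math

def lj_energy(r):
    """Lennard-Jones pair energy in reduced units."""
    inv6 = r ** -6
    return 4.0 * (inv6 * inv6 - inv6)

def lj_force(r):
    """Signed radial force, -dU/dr, for the Lennard-Jones potential."""
    inv6 = r ** -6
    return 24.0 * (2.0 * inv6 * inv6 - inv6) / r

def delta_u_taylor(r, dr):
    """First-order Taylor estimate of U(r + dr) - U(r): the cheap
    update used for secondary-zone neighbours instead of a full
    recomputation (tertiary-zone neighbours are neglected entirely)."""
    return -lj_force(r) * dr

r, dr = 1.5, 0.01   # secondary-zone separation and a small displacement
exact = lj_energy(r + dr) - lj_energy(r)
approx = delta_u_taylor(r, dr)
print(f"exact dU = {exact:.6f}, first-order dU = {approx:.6f}")
```

For displacements that are small compared with the pair separation, the first-order estimate tracks the exact energy change closely while avoiding the expensive re-evaluation.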
Monte Carlo Simulation of the Echo Signals from Low-Flying Targets for Airborne Radar
Directory of Open Access Journals (Sweden)
Mingyuan Man
2014-01-01
Full Text Available A hybrid method combining the half-space physical optics (PO) method, graphical-electromagnetic computing (GRECO), and the Monte Carlo method is presented for simulating the echo signals from low-flying targets in a realistic environment for airborne radar. The half-space physical optics method, combined with the GRECO method to eliminate the shadow regions quickly and rebuild the target automatically, is employed to calculate the radar cross section (RCS) of conducting targets in half space quickly and accurately. The direct echo is computed from the radar equation. Paths reflected from the sea or ground surface cause multipath effects. In order to obtain the echo signals accurately, the phase factors are modified for the fluctuations in multipath, and the statistical average of the echo signals is obtained using the Monte Carlo method. A typical simulation is performed, and the numerical results demonstrate the accuracy of the proposed method.
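The multipath averaging step can be illustrated with a toy two-ray model: a direct return interferes with a surface-reflected return whose phase carries a random perturbation, and the Monte Carlo method averages the echo power over that fluctuation. All numbers below (path lengths, reflection coefficient, phase jitter) are illustrative assumptions, not values from the paper:

```python
import cmath
import math
import random

def two_ray_echo_power(direct_len, reflected_len, wavelength,
                       reflect_coeff=-0.9, phase_jitter=0.5,
                       n_samples=20_000, seed=4):
    """Monte Carlo average of relative echo power for a low-flying
    target: the direct return interferes with a surface-reflected
    return whose phase is perturbed by random surface fluctuations."""
    rng = random.Random(seed)
    k = 2.0 * math.pi / wavelength
    total = 0.0
    for _ in range(n_samples):
        jitter = rng.gauss(0.0, phase_jitter)
        direct = cmath.exp(-1j * k * direct_len)
        reflected = reflect_coeff * cmath.exp(-1j * (k * reflected_len + jitter))
        total += abs(direct + reflected) ** 2
    return total / n_samples

# Path lengths in metres; 3 cm wavelength (X-band) is an assumption.
p = two_ray_echo_power(10_000.0, 10_003.0, wavelength=0.03)
print(f"mean relative echo power: {p:.3f}")
```

Depending on the path-length difference, the averaged power lies between deep fade and constructive reinforcement of the two rays.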
Monte Carlo simulation of single accident airport risk profile
1979-01-01
A computer simulation model was developed for estimating the potential economic impacts of a carbon fiber release upon facilities within an 80 kilometer radius of a major airport. The model simulated the possible range of release conditions and the resulting dispersion of the carbon fibers. Each iteration of the model generated a specific release scenario, which would cause a specific amount of dollar loss to the surrounding community. By repeated iterations, a risk profile was generated, showing the probability distribution of losses from one accident. Using accident probability estimates, the risk profile for annual losses was derived. The mechanics of the simulation model, the required input data, and the risk profiles generated for the 26 large hub airports are described.
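The iterate-and-aggregate structure of such a model is easy to sketch. The distributions below are placeholders, not the study's carbon-fiber release inputs; the point is how repeated scenario draws build up a risk profile:

```python
import random

def simulate_single_accident_loss(rng):
    """One hypothetical release scenario: the released mass and a
    dispersion/impact factor are sampled, giving a dollar loss.
    Both distributions are illustrative assumptions."""
    mass = rng.lognormvariate(0.0, 1.0)    # mass released (kg)
    impact = rng.uniform(100.0, 10_000.0)  # dollars of loss per kg
    return mass * impact

def risk_profile(n_iter=50_000, seed=42):
    """Repeated scenario draws give the probability distribution of
    losses from one accident; percentiles summarize the profile."""
    rng = random.Random(seed)
    losses = sorted(simulate_single_accident_loss(rng) for _ in range(n_iter))
    median = losses[n_iter // 2]
    p95 = losses[int(0.95 * n_iter)]
    return median, p95

median, p95 = risk_profile()
print(f"median loss ~ ${median:,.0f}, 95th percentile ~ ${p95:,.0f}")
```

Scaling the single-accident profile by accident probability estimates would then give the annual-loss profile described in the abstract.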
International Nuclear Information System (INIS)
Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.
1989-01-01
Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers. (orig.)
International Nuclear Information System (INIS)
Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.
1989-01-01
Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers
Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz Ram model
Morin, Mario A.; Ficarazzo, Francesco
2006-04-01
Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications. These technologies include increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables such as rock mass properties, site geology, in situ fracturing and blasting parameters, and as such has no complete theoretical solution for its prediction. However, empirical models for the estimation of the size distribution of rock fragments have been developed. In this study, a blast fragmentation Monte Carlo-based simulator, based on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact and jointed rock properties, the type and properties of explosives and the drilling pattern. Results produced by this simulator were quite favorable when compared with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as increase our confidence in these empirical models. This understanding will translate into improvements in blasting operations, their corresponding costs and the overall economics of open pit mines and rock quarries.
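A hedged sketch of such a simulator: the commonly quoted Kuznetsov equation gives the mean fragment size, the Rosin-Rammler curve gives the cumulative size distribution, and Monte Carlo sampling propagates uncertainty in the rock and design parameters. The input values and the uniformity index below are illustrative, not the paper's calibrated data:

```python
import math
import random

def kuz_ram_x50(rock_factor, powder_factor, charge_per_hole, rws=115.0):
    """Kuznetsov mean fragment size (cm); rws is the explosive's
    relative weight strength."""
    return (rock_factor * powder_factor ** -0.8
            * charge_per_hole ** (1.0 / 6.0)
            * (115.0 / rws) ** (19.0 / 20.0))

def fraction_passing(x, x50, n):
    """Rosin-Rammler cumulative distribution: fraction finer than x."""
    return 1.0 - math.exp(-0.693 * (x / x50) ** n)

def simulate(n_iter=20_000, seed=7):
    """Propagate uncertainty in rock and design inputs into the
    distribution of the fraction finer than 25 cm."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_iter):
        a = rng.gauss(7.0, 1.0)      # rock factor (uncertain)
        k = rng.gauss(0.55, 0.05)    # powder factor, kg/m^3 (uncertain)
        x50 = kuz_ram_x50(a, k, charge_per_hole=120.0)
        out.append(fraction_passing(25.0, x50, n=1.4))
    return out

fractions = simulate()
print(f"mean fraction finer than 25 cm: {sum(fractions) / len(fractions):.3f}")
```

Rather than a single predicted curve, the output is a distribution of possible fragmentation outcomes, which is exactly what the Monte Carlo wrapper adds to the deterministic Kuz-Ram model.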
Kosmidis, Kosmas; Argyrakis, Panos; Macheras, Panos
2003-07-01
To verify the Higuchi law and study drug release from cylindrical and spherical matrices by means of Monte Carlo computer simulation. A one-dimensional matrix, based on the theoretical assumptions of the derivation of the Higuchi law, was simulated and its time evolution was monitored. Cylindrical and spherical three-dimensional lattices were simulated, with sites at the boundary of the lattice denoted as leak sites. Particles were allowed to move inside the lattice using the random walk model. Excluded volume interactions between the particles were assumed. We monitored the system's time evolution for different lattice sizes and different initial particle concentrations. The Higuchi law was verified using the Monte Carlo technique in a one-dimensional lattice. It was found that Fickian drug release from cylindrical matrices can be approximated nicely with the Weibull function. A simple linear relation between the Weibull function parameters and the specific surface of the system was found. Drug release from a matrix, as a result of a diffusion process assuming excluded volume interactions between the drug molecules, can be described using a Weibull function. This model, although approximate and semiempirical, has the benefit of providing a simple physical connection between the model parameters and the system geometry, which was something missing from other semiempirical models.
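The one-dimensional case can be sketched in a short Monte Carlo simulation, assuming a fully loaded lattice with a single leak boundary and excluded-volume random walks (lattice size and step counts are arbitrary choices, not the paper's):

```python
import random

def simulate_release(n_sites=100, mc_steps=2000, seed=5):
    """Drug release from a fully loaded 1D lattice.  Particles random
    walk with excluded volume; the site just past the right edge is a
    leak.  Early-time cumulative release should follow Higuchi's
    square-root-of-time law."""
    rng = random.Random(seed)
    occupied = [True] * n_sites
    released, history = 0, []
    for _ in range(mc_steps):
        for _ in range(n_sites):          # one MC step = n_sites attempts
            i = rng.randrange(n_sites)
            if not occupied[i]:
                continue
            j = i + rng.choice((-1, 1))
            if j == n_sites:              # escape through the leak site
                occupied[i] = False
                released += 1
            elif 0 <= j < n_sites and not occupied[j]:
                occupied[i], occupied[j] = False, True
        history.append(released)
    return history

h = simulate_release()
print(f"released after  500 MC steps: {h[499]}")
print(f"released after 2000 MC steps: {h[1999]}")
```

Plotting the cumulative release against the square root of time (before the finite lattice depletes) is the direct numerical check of the Higuchi law described in the abstract.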
Multi-Subband Ensemble Monte Carlo simulations of scaled GAA MOSFETs
Donetti, L.; Sampedro, C.; Ruiz, F. G.; Godoy, A.; Gamiz, F.
2018-05-01
We developed a Multi-Subband Ensemble Monte Carlo simulator for non-planar devices, taking into account two-dimensional quantum confinement. It couples self-consistently the solution of the 3D Poisson equation, the 2D Schrödinger equation, and the 1D Boltzmann transport equation with the Ensemble Monte Carlo method. This simulator was employed to study MOS devices based on ultra-scaled Gate-All-Around Si nanowires with diameters in the range from 4 nm to 8 nm and gate lengths from 8 nm to 14 nm. We studied the output and transfer characteristics, interpreting the behavior in the sub-threshold region and in the ON state in terms of the spatial charge distribution and the mobility computed with the same simulator. We analyzed the results, highlighting the contribution of different valleys and subbands and the effect of the gate bias on the energy and velocity profiles. Finally, the scaling behavior was studied, showing that only the devices with D = 4 nm maintain good control of the short channel effects down to a gate length of 8 nm.
Atomic scale Monte Carlo simulations of BF3 plasma immersion ion implantation in Si
International Nuclear Information System (INIS)
La Magna, Antonino; Fisicaro, Giuseppe; Nicotra, Giuseppe; Spiegel, Yohann; Torregrosa, Frank
2014-01-01
We present a numerical model aimed to accurately simulate the plasma immersion ion implantation (PIII) process in micro- and nano-patterned Si samples. The code, based on the Monte Carlo approach, is designed to reproduce all the relevant physical phenomena involved in the process. The particle-based simulation technique is fundamental to efficiently compute the material modifications promoted by the plasma implantation at atomic resolution. The accuracy in the description of the process kinetics is achieved by linking (one to one) each virtual Monte Carlo event to each possible atomic phenomenon (e.g. ion penetration, neutral absorption, ion induced surface modification, etc.). The code is designed to be coupled with a generic plasma status, characterized by the particle types (ions and neutrals), their flow rates and their energy/angle distributions. The coupling with a Poisson solver allows the simulation of the correct trajectories of charged particles in the void regions of the micro-structures. The implemented model is able to predict the 2D implantation profiles and significantly support the process design. (copyright 2014 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)
Monte Carlo simulation in UWB1 depletion code
International Nuclear Information System (INIS)
Lovecky, M.; Prehradny, J.; Jirickova, J.; Skoda, R.
2015-01-01
UWB1 depletion code is being developed as a fast computational tool for the study of burnable absorbers at the University of West Bohemia in Pilsen, Czech Republic. In order to achieve higher precision, the newly developed code was extended by adding a Monte Carlo solver. Research on fuel depletion aims at the development and introduction of advanced types of burnable absorbers in nuclear fuel. Burnable absorbers (BA) allow the compensation of the initial reactivity excess of nuclear fuel and result in an increase of fuel cycle lengths with higher-enriched fuels. The paper describes depletion calculations of VVER nuclear fuel doped with rare earth oxides as burnable absorbers. Based on the performed depletion calculations, rare earth oxides are divided into two equally numerous groups, suitable burnable absorbers and poisoning absorbers. According to residual poisoning and BA reactivity worth, the rare earth oxides marked as suitable burnable absorbers are Nd, Sm, Eu, Gd, Dy, Ho and Er, while the poisoning absorbers include Sc, La, Lu, Y, Ce, Pr and Tb. The presentation slides have been added to the article
Monte Carlo simulation for radiation dose in children radiology
International Nuclear Information System (INIS)
Mendes, Hitalo R.; Tomal, Alessandra
2016-01-01
Dosimetry in pediatric radiology is essential because children are at higher risk than adults. The focus of this study is how the dose varies with depth in a 10-year-old and in a newborn; for this purpose, simulations were performed using the Monte Carlo method. Tube potentials of 70 and 90 kVp were considered for the 10-year-old, and 70 and 80 kVp for the newborn. The results show that in both cases the dose at the skin surface is larger for the smaller potential; however, it decreases faster for larger potentials. Another observation is that, because the newborn is thinner, the ratio between the entrance and exit doses is lower than for the 10-year-old, showing that an image can be made with a smaller entrance skin dose while keeping the same exposure level at the detector. (author)
Monte Carlo simulation of the OCP freezing transition
International Nuclear Information System (INIS)
DeWitt, H.E.; Slattery, W.L.; Yang, Juxing
1992-09-01
The One Component Plasma (OCP) in three dimensions is a system of classical point charges moving in a fixed uniform neutralizing background. In nature the OCP is a rough approximation of the conditions in a white dwarf star, in which one has fully ionized nuclei such as carbon, oxygen, and smaller amounts of heavier elements up to iron all moving in a nearly uniform background provided by relativistically degenerate electrons. The OCP is also a mathematical limiting model for a non-neutral plasma of ions confined in a Penning trap and cooled to strongly coupled conditions. Similarly, a collection of charged colloidal particles suspended in water can exhibit the Coulomb freezing behavior of the OCP. A single dimensionless parameter, Γ, is sufficient to describe the system. For very weak coupling, Γ ≪ 1, the thermodynamic properties of the OCP are given rigorously by the Debye-Hückel theory. This paper reports on Monte Carlo simulation of the freezing of the OCP from a random start for particle numbers ranging from 500 to 2000. In one case the authors obtained a perfect bcc lattice, but in most cases the final state would be an imperfect crystal or two different microcrystals, fcc and bcc, growing into each other. With a cluster analysis program the authors looked at the formation of nucleating clusters, and followed the actual freezing process. Roughly 80 particles are needed in a cluster before it starts to grow rapidly and freeze
Monte Carlo simulation techniques for predicting annual power production
International Nuclear Information System (INIS)
Cross, J.P.; Bulandr, P.J.
1991-01-01
As the owner and operator of a number of small to mid-sized hydroelectric sites, STS HydroPower has been faced with the need to accurately predict anticipated hydroelectric revenues over a period of years. The typical approach to this problem has been to look at each site from a deterministic mathematical perspective and evaluate the annual production from historic streamflows. Average annual production is simply taken to be the area under the flow duration curve defined by the operating and design characteristics of the selected turbines. Minimum annual production is taken to be a historic dry-year scenario, and maximum production is viewed as power generated under the most ideal conditions. Such an approach creates two problems. First, in viewing the characteristics of a single site, it does not take into account the probability of such an event occurring. Second, in viewing all sites in a single organization's portfolio together, it does not reflect the varying flow conditions at the different sites. This paper addresses the first of these two concerns: the creation of a simulation model for a single site using the Monte Carlo method. The result of the analysis is a picture of production at the site that both better represents anticipated conditions and is defined probabilistically
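The single-site Monte Carlo idea can be sketched as follows: instead of one historic flow record, many synthetic annual flow sequences are drawn and run through the turbine's operating envelope, yielding a probability distribution of annual energy rather than a single deterministic number. The flow distribution, head, efficiency, and turbine limits below are invented placeholders:

```python
import random

def annual_energy_mwh(flows_cms, head_m=20.0, efficiency=0.85,
                      q_min=2.0, q_max=15.0):
    """Energy from a year of daily flows (m^3/s) for a single turbine.
    Flow below q_min produces nothing; flow above q_max is spilled.
    Power is rho * g * Q * H * eta, converted to MW."""
    total_mwh = 0.0
    for q in flows_cms:
        if q < q_min:
            continue
        q_used = min(q, q_max)
        power_mw = 1000.0 * 9.81 * q_used * head_m * efficiency / 1e6
        total_mwh += power_mw * 24.0
    return total_mwh

def monte_carlo_production(n_years=2000, seed=11):
    """Sample synthetic annual flow sequences and return the median
    production and a dry-year (5th percentile) figure."""
    rng = random.Random(seed)
    annuals = sorted(
        annual_energy_mwh([rng.lognormvariate(1.8, 0.6) for _ in range(365)])
        for _ in range(n_years))
    return annuals[n_years // 2], annuals[int(0.05 * n_years)]

median, dry_p5 = monte_carlo_production()
print(f"median annual production: {median:,.0f} MWh")
print(f"dry-year (5th percentile): {dry_p5:,.0f} MWh")
```

A production-house model would replace the lognormal draws with flows fitted to the site's hydrology, but the probabilistic output format is the point of the approach.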
Optimization of reconstruction algorithms using Monte Carlo simulation
International Nuclear Information System (INIS)
Hanson, K.M.
1989-01-01
A method for optimizing reconstruction algorithms is presented that is based on how well a specified task can be performed using the reconstructed images. Task performance is numerically assessed by a Monte Carlo simulation of the complete imaging process, including the generation of scenes appropriate to the desired application, subsequent data taking, reconstruction, and performance of the stated task based on the final image. The use of this method is demonstrated through the optimization of the Algebraic Reconstruction Technique (ART), which reconstructs images from their projections by an iterative procedure. The optimization is accomplished by varying the relaxation factor employed in the updating procedure. In some of the imaging situations studied, it is found that the optimization of constrained ART, in which a nonnegativity constraint is invoked, can vastly increase the detectability of objects. There is little improvement attained for unconstrained ART. The general method presented may be applied to the problem of designing neutron-diffraction spectrometers. 11 refs., 6 figs., 2 tabs
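A minimal sketch of ART with a relaxation factor and an optional nonnegativity constraint, in the spirit of the study above (the tiny test system, relaxation value, and iteration count are illustrative choices, not the paper's setup):

```python
def art_reconstruct(rows, b, n_unknowns, relaxation=0.5,
                    iterations=200, nonnegative=True):
    """Algebraic Reconstruction Technique (Kaczmarz iteration): each
    equation rows[k] . x = b[k] pulls the estimate toward its
    hyperplane, scaled by the relaxation factor; the nonnegativity
    constraint clips the image after every update."""
    x = [0.0] * n_unknowns
    for _ in range(iterations):
        for a, bk in zip(rows, b):
            norm2 = sum(v * v for v in a)
            if norm2 == 0.0:
                continue
            step = relaxation * (bk - sum(v * xi for v, xi in zip(a, x))) / norm2
            x = [xi + step * v for v, xi in zip(a, x)]
            if nonnegative:
                x = [max(0.0, xi) for xi in x]
    return x

# A 2x2 "image" observed through its row and column sums (4 projections).
truth = [1.0, 0.0, 0.0, 2.0]
rows = [[1, 1, 0, 0], [0, 0, 1, 1],   # row sums
        [1, 0, 1, 0], [0, 1, 0, 1]]   # column sums
b = [sum(v * t for v, t in zip(a, truth)) for a in rows]
x = art_reconstruct(rows, b, 4)
residual = max(abs(sum(v * xi for v, xi in zip(a, x)) - bk)
               for a, bk in zip(rows, b))
print("reconstruction:", [round(v, 3) for v in x], "max residual:", residual)
```

The relaxation factor is exactly the quantity the paper tunes; in this toy system the projections are under-determined, so the iteration converges to one nonnegative solution consistent with the data rather than necessarily to the original image.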
Monte Carlo simulations of ionization potential depression in dense plasmas
Energy Technology Data Exchange (ETDEWEB)
Stransky, M., E-mail: stransky@fzu.cz [Department of Radiation and Chemical Physics, Institute of Physics ASCR, Na Slovance 2, 182 21 Prague 8 (Czech Republic)
2016-01-15
A particle-particle grand canonical Monte Carlo model with Coulomb pair potential interaction was used to simulate the modification of ionization potentials by electrostatic microfields. The Barnes-Hut tree algorithm [J. Barnes and P. Hut, Nature 324, 446 (1986)] was used to speed up calculations of the electric potential. Atomic levels were approximated to be independent of the microfields, as was assumed in the original paper by Ecker and Kröll [Phys. Fluids 6, 62 (1963)]; however, the available levels were limited by the corresponding mean inter-particle distance. The code was tested on hydrogen and dense aluminum plasmas. The amount of depression was up to 50% higher in the Debye-Hückel regime for hydrogen plasmas; in the high-density limit, reasonable agreement was found with the Ecker-Kröll model for hydrogen plasmas and with the Stewart-Pyatt model [J. Stewart and K. Pyatt, Jr., Astrophys. J. 144, 1203 (1966)] for aluminum plasmas. Our 3D code is an improvement over the spherically symmetric simplifications of the Ecker-Kröll and Stewart-Pyatt models and is also not limited to high atomic numbers, as is the underlying Thomas-Fermi model used in the Stewart-Pyatt model.
Calculation of beam quality correction factor using Monte Carlo simulation
International Nuclear Information System (INIS)
Kawachi, T.; Saitoh, H.; Myojoyama, A.; Katayose, T.; Kojima, T.; Fukuda, K.; Inoue, M.
2005-01-01
In recent years, the number of CyberKnife systems (Accuray Inc., U.S.) in clinical use has increased significantly. However, the CyberKnife has a unique treatment head structure and beam collimating system, so the global standard dosimetry protocols cannot be applied directly to absolute absorbed dose measurement in the CyberKnife beam. In this work, the energy spectra of photons and electrons from the CyberKnife treatment head at 80 cm SSD and at several depths in water are simulated with detailed geometry using the EGS Monte Carlo code. Furthermore, for calculation of the beam quality correction factor k_Q, the mean restricted mass stopping power and the mass energy absorption coefficient of air, water, and several chamber wall and waterproofing sleeve materials are calculated. As a result, the factors k_Q in the CyberKnife beam are determined for several ionization chambers, and the relationship between the beam quality index PDD(10)_x in the CyberKnife beam and k_Q is described in this report. (author)
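In the TRS-398-style formalism, k_Q is composed from exactly the quantities the abstract computes; a minimal sketch, with placeholder numbers that are assumptions and not results from this work:

```python
# TRS-398-style composition of k_Q. The numerical values below are assumed
# placeholders for illustration, not results from this study.
def k_Q(sw_air_Q, sw_air_Co, p_Q, p_Co):
    """Beam quality correction factor: ratio of the Spencer-Attix water/air
    stopping-power ratios and overall chamber perturbation factors at the
    user beam quality Q and at the Co-60 reference quality."""
    return (sw_air_Q * p_Q) / (sw_air_Co * p_Co)

kq = k_Q(sw_air_Q=1.120, sw_air_Co=1.133, p_Q=0.992, p_Co=0.995)
```

The Monte Carlo spectra enter through the stopping-power ratios, which are spectrum-weighted averages; the chamber materials enter through the perturbation factors.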
A Monte Carlo simulation technique to determine the optimal portfolio
Directory of Open Access Journals (Sweden)
Hassan Ghodrati
2014-03-01
During the past few years, there have been several studies on portfolio management. One of the primary concerns on any stock market is to detect the risk associated with various assets. One recognized method to measure, forecast, and manage this risk is Value at Risk (VaR), which has drawn much attention from financial institutions in recent years. VaR is a method for identifying and evaluating risk using standard statistical techniques, and it has increasingly been applied in other fields as well. The present study measured the value at risk of 26 companies from the chemical industry on the Tehran Stock Exchange over the period 2009-2011 using Monte Carlo simulation at the 95% confidence level. The variable used was the daily return resulting from daily stock price changes. Moreover, the optimal investment weight for each selected stock was determined using a hybrid Markowitz and Winker model. The results showed that, at the 95% confidence level, the maximum loss would not exceed 1,259,432 Rials in the next day.
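The Monte Carlo VaR procedure can be sketched as follows; the return statistics and portfolio value are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed daily-return statistics and portfolio value; the study estimated
# these quantities from 2009-2011 Tehran Stock Exchange daily price data.
mu, sigma = 0.0005, 0.02          # mean and std of daily returns (assumed)
portfolio_value = 100_000_000     # Rials, illustrative

def monte_carlo_var(mu, sigma, value, alpha=0.95, n_sims=100_000):
    """One-day VaR at confidence alpha: simulate daily returns, convert to
    losses, and take the loss exceeded in only (1 - alpha) of scenarios."""
    returns = rng.normal(mu, sigma, n_sims)
    losses = -value * returns
    return float(np.quantile(losses, alpha))

var95 = monte_carlo_var(mu, sigma, portfolio_value)
```

With normal returns this converges to the analytic value value*(1.645*sigma - mu); the Monte Carlo approach matters when returns are drawn from an empirical or fat-tailed distribution instead.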
Monte Carlo simulations of ionization potential depression in dense plasmas
International Nuclear Information System (INIS)
Stransky, M.
2016-01-01
A particle-particle grand canonical Monte Carlo model with Coulomb pair potential interaction was used to simulate modification of ionization potentials by electrostatic microfields. The Barnes-Hut tree algorithm [J. Barnes and P. Hut, Nature 324, 446 (1986)] was used to speed up calculations of the electric potential. Atomic levels were approximated to be independent of the microfields, as was assumed in the original paper by Ecker and Kröll [Phys. Fluids 6, 62 (1963)]; however, the available levels were limited by the corresponding mean inter-particle distance. The code was tested on hydrogen and dense aluminum plasmas. The amount of depression was up to 50% higher in the Debye-Hückel regime for hydrogen plasmas; in the high-density limit, reasonable agreement was found with the Ecker-Kröll model for hydrogen plasmas and with the Stewart-Pyatt model [J. Stewart and K. Pyatt, Jr., Astrophys. J. 144, 1203 (1966)] for aluminum plasmas. Our 3D code is an improvement over the spherically symmetric simplifications of the Ecker-Kröll and Stewart-Pyatt models and is also not limited to high atomic numbers as is the underlying Thomas-Fermi model used in the Stewart-Pyatt model.
International Nuclear Information System (INIS)
Orkoulas, G.; Panagiotopoulos, A.Z.
1994-01-01
In this work, we investigate the liquid--vapor phase transition of the restricted primitive model of ionic fluids. We show that at the low temperatures where the phase transition occurs, the system cannot be studied by conventional molecular simulation methods because convergence to equilibrium is slow. To accelerate convergence, we propose cluster Monte Carlo moves capable of moving more than one particle at a time. We then address the issue of charged particle transfers in grand canonical and Gibbs ensemble Monte Carlo simulations, for which we propose a biased particle insertion/destruction scheme capable of sampling short interparticle distances. We compute the chemical potential for the restricted primitive model as a function of temperature and density from grand canonical Monte Carlo simulations and the phase envelope from Gibbs Monte Carlo simulations. Our calculated phase coexistence curve is in agreement with recent results of Caillol obtained on the four-dimensional hypersphere and our own earlier Gibbs ensemble simulations with single-ion transfers, with the exception of the critical temperature, which is lower in the current calculations. Our best estimates for the critical parameters are T*_c = 0.053 and ρ*_c = 0.025. We conclude with possible future applications of the biased techniques developed here to phase equilibrium calculations for ionic fluids.
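The idea of the biased insertion can be illustrated by sampling the counterion separation from a contact-peaked distribution whose known density is then divided out of the Metropolis acceptance probability. A sketch in reduced units, with all parameters assumed (this is the general importance-sampling idea, not the authors' specific scheme):

```python
import numpy as np

rng = np.random.default_rng(1)

sigma_hs = 1.0    # RPM hard-sphere diameter (reduced units, assumed)

def sample_biased_separation(r_max=3.0, beta_bias=2.0):
    """Draw the counterion separation r in [sigma_hs, r_max] with density
    proportional to exp(-beta_bias*(r - sigma_hs)), favoring the short,
    strongly bound separations that dominate at low temperature. Returns
    r and its sampling density, which must divide the acceptance
    probability to keep the biased move exact (inverse-CDF sampling)."""
    norm = 1.0 - np.exp(-beta_bias * (r_max - sigma_hs))
    u = rng.random()
    r = sigma_hs - np.log(1.0 - u * norm) / beta_bias
    pdf = beta_bias * np.exp(-beta_bias * (r - sigma_hs)) / norm
    return r, pdf

samples = np.array([sample_biased_separation()[0] for _ in range(10_000)])
```

At the low temperatures of interest, an unbiased insertion almost never lands a counterion in the deep contact well, which is why such biasing is essential for reasonable acceptance rates.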
International Nuclear Information System (INIS)
Rowley, A.
1998-01-01
An ionic interaction model is developed which accounts for the effects of the ionic environment upon the electron densities of both cations and anions through changes in their size and shape and is transferable between materials. These variations are represented by additional dynamical variables which are handled within the model using the techniques of the Car-Parrinello method. The model parameters are determined as far as possible by input from external ab initio electronic structure calculations directed at examining the individual effects of the ionic environment upon the ions, particularly the oxide ion. Techniques for the evaluation of dipolar and quadrupolar Ewald sums in non-cubic simulation cells and the calculation of the pressure due to the terms in the potential are presented. This model is applied to the description of the perfect crystal properties and phonon dispersion curves of MgO. Consideration of the high-symmetry phonon modes allows parameterization of the remaining model parameters in an unambiguous fashion. The same procedure is used to obtain parameters for CaO. These two parameter sets are examined to determine how they may be used to generate the parameters for SrO, and simple scaling relationships based on ionic radii and polarizabilities are formulated. The transferability of the model to Cr2O3 is investigated using parameters generated from the alkaline earth oxides. The importance of lower-symmetry model terms, particularly quadrupolar interactions, at the low-symmetry ion sites in the crystal structure is demonstrated. The correct ground-state crystal structure is predicted and the calculated surface energies and relaxation phenomena are found to agree well with previous ab initio studies. The model is applied to GeO2 as a strong test of its applicability to ion environments very different from those encountered in MgO. A good description of the crystal structures is obtained and the interplay of dipolar and quadrupolar effects is discussed.
Monte Carlo Simulations of Photospheric Emission in Relativistic Outflows
Bhattacharya, Mukul; Lu, Wenbin; Kumar, Pawan; Santana, Rodolfo
2018-01-01
We study the spectra of photospheric emission from highly relativistic gamma-ray burst outflows using a Monte Carlo code. We consider the Comptonization of photons with a fast-cooled synchrotron spectrum in a relativistic jet with a realistic photon-to-electron number ratio N_γ/N_e = 10^5, using mono-energetic protons that interact with thermalized electrons through the Coulomb interaction. The photons, electrons, and protons are cooled adiabatically as the jet expands outward. We find that the initial energy distributions of the protons and electrons do not have any appreciable effect on the photon peak energy E_γ,peak or the power-law spectrum above E_γ,peak. The Coulomb interaction between the electrons and the protons does not affect the output photon spectrum significantly, as the energy of the electrons is elevated only marginally. E_γ,peak and the spectral indices for the low- and high-energy power-law tails of the photon spectrum remain practically unchanged even with electron-proton coupling. Increasing the initial optical depth τ_in results in a slightly shallower photon spectrum below E_γ,peak and fewer photons at the high-energy tail, although f_ν ∝ ν^(-0.5) above E_γ,peak and up to ∼1 MeV, independent of τ_in. We find that E_γ,peak determines the peak energy and the shape of the output photon spectrum. Finally, we find that our simulation results are quite sensitive to N_γ/N_e, for N_e = 3×10^3. For almost all our simulations, we obtain an output photon spectrum with a power-law tail above E_γ,peak extending up to ∼1 MeV.
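A heavily simplified stand-in for such a Comptonization calculation uses the non-relativistic mean fractional energy shift per scattering instead of full Compton kinematics and photon transport; every parameter value below is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# All values assumed for illustration; the paper's code also tracks proton
# coupling and adiabatic cooling, which are omitted here.
kTe = 0.05                        # electron temperature in units of m_e c^2
tau = 10.0                        # initial Thomson optical depth
n_scat = int(max(tau, tau**2))    # typical number of scatterings before escape

def comptonize(e0, n_photons=20_000):
    """Evolve photon energies (units of m_e c^2) through repeated scatterings,
    using the non-relativistic mean fractional shift <dE>/E = 4*kTe - E per
    scattering, with an ad hoc stochastic spread around the mean."""
    e = np.full(n_photons, e0)
    for _ in range(n_scat):
        gain = 4.0 * kTe - e                   # Compton gain vs. recoil loss
        e *= 1.0 + gain * (1.0 + 0.5 * rng.standard_normal(n_photons))
        e = np.clip(e, 1e-8, None)
    return e

# Soft seed photons are boosted until they saturate near e ~ 4*kTe.
spectrum = comptonize(e0=1e-4)
```

This captures the saturation of the photon energy at ~4 kTe for large optical depth; reproducing the power-law tails discussed in the abstract requires the full angular and kinematic treatment of the actual Monte Carlo code.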
Monte Carlo simulation of nuclear spin relaxation in disordered system
International Nuclear Information System (INIS)
Luo, X.; Sholl, C.A.
2002-01-01
Nuclear spin relaxation is a very useful technique for obtaining information about diffusion in solids. The present work is motivated by relaxation experiments on H diffusing in disordered systems such as metallic glasses or quasicrystalline materials. A theory of the spectral density functions of the magnetic dipolar interactions between diffusing spins is required in order to relate the experimental data to diffusional parameters. In simple ordered systems the spectral density functions are well understood, and a simple BPP (exponential correlation function) model is often used to interpret the data. Diffusion in disordered systems involves a distribution of activation energies, and the simple extension of the BPP model that has been used traditionally is of doubtful validity. A more rigorously based BPP model has been developed and has recently been applied to H diffusion in a metal quasicrystal. The improved BPP model still involves approximations, however, and the accuracy of the parameters deduced from it is not clear. The present work involves Monte Carlo simulation of diffusion in disordered systems and the calculation of the spectral density functions and relaxation rates. The simulations use two algorithms (discrete time and continuous time) for the time development of the system, correctly incorporate the Fermi-Dirac distribution for the equilibrium occupation of sites, as required by the principle of detailed balance, and allow only single occupancy of sites. The results are compared with the BPP models for some site- and barrier-energy distributions arising from the structural disorder of the system. The improved BPP model is found to give reasonable values for the diffusion and disorder parameters. Quantitative estimates of the errors involved are determined.
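The BPP-type comparison model can be sketched by averaging Lorentzian spectral densities over a distribution of activation energies; all parameter values below are assumed for illustration, not taken from the simulations:

```python
import numpy as np

kB = 8.617e-5                     # Boltzmann constant, eV/K

def J(omega, tau):
    """BPP spectral density for an exponential correlation function."""
    return tau / (1.0 + (omega * tau) ** 2)

def rate_disordered(omega, T, E_mean=0.5, E_sigma=0.05, tau0=1e-13, n=2001):
    """Relaxation rate 1/T1 (up to a dipolar coupling prefactor):
    J(omega) + 4*J(2*omega), weighted over Gaussian-distributed activation
    energies E with Arrhenius correlation times tau = tau0*exp(E/(kB*T))."""
    E = np.linspace(E_mean - 4 * E_sigma, E_mean + 4 * E_sigma, n)
    w = np.exp(-0.5 * ((E - E_mean) / E_sigma) ** 2)
    w /= w.sum()
    tau = tau0 * np.exp(E / (kB * T))
    return float(np.sum(w * (J(omega, tau) + 4.0 * J(2.0 * omega, tau))))

omega = 2 * np.pi * 40e6          # 40 MHz Larmor frequency (assumed)
temps = np.linspace(300.0, 900.0, 61)
rates = np.array([rate_disordered(omega, T) for T in temps])
# The 1/T1 maximum sits where omega*tau ~ 1; the energy spread broadens it.
```

The Monte Carlo simulations test how well this kind of averaged-BPP expression reproduces the spectral densities of an actual disordered lattice, where site occupation correlations are not captured by a simple distribution of independent correlation times.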