WorldWideScience

Sample records for carlo computer simulation

  1. Radiotherapy Monte Carlo simulation using cloud computing technology.

    Science.gov (United States)

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows vast computational resources to be leveraged quickly and easily, in bursts, as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware.
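
The completion-time and billing behaviour described in this abstract can be sketched in a few lines. The per-machine-hour rounding is our assumption about the billing model, and the function names are illustrative:

```python
import math

def completion_time(total_cpu_hours, n_machines):
    """Wall-clock hours assuming the paper's ideal 1/n parallel scaling."""
    return total_cpu_hours / n_machines

def relative_cost(total_cpu_hours, n_machines):
    """Cost relative to a single machine when each machine is billed in
    whole hours, rounded up -- our assumption about the billing model."""
    billed_hours = n_machines * math.ceil(total_cpu_hours / n_machines)
    return billed_hours / total_cpu_hours

# A 12 CPU-hour job: relative cost returns to 1.0 exactly when the
# machine count divides the total simulation time in hours.
for n in (3, 4, 5, 6):
    print(n, completion_time(12, n), relative_cost(12, n))
```

For n = 5, each of the 5 machines is billed 3 whole hours (15 billed for 12 used), so the relative cost rises to 1.25, matching the abstract's observation that cost is optimal when n divides the total hours.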

  2. Radiotherapy Monte Carlo simulation using cloud computing technology

    International Nuclear Information System (INIS)

    Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.

    2012-01-01

    Cloud computing allows vast computational resources to be leveraged quickly and easily, in bursts, as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware.

  3. CloudMC: a cloud computing application for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-01-01

    This work presents CloudMC, a cloud computing application—developed in Windows Azure®, the platform of the Microsoft® cloud—for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based—the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with PENELOPE were performed on different instance (virtual machine) sizes and for different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU time on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage billing. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice. (note)
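
The Amdahl's-law scaling the authors report can be reproduced with a one-line model. The 1.15% serial fraction below is our illustrative fit to the reported 37× speedup on 64 instances, not a figure from the paper:

```python
def amdahl_speedup(n_instances, serial_fraction):
    """Amdahl's law: speedup on n instances when serial_fraction of the
    run time cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_instances)

# ~1.15% serial time reproduces roughly the reported 37x on 64 instances
# (the 1.15% figure is our illustrative fit, not a number from the paper).
print(round(amdahl_speedup(64, 0.0115), 1))  # 37.1
```

The "slight deviation" the authors mention corresponds to the serial fraction itself growing with the instance count, which plain Amdahl's law treats as constant.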

  4. CloudMC: a cloud computing application for Monte Carlo simulation.

    Science.gov (United States)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-04-21

    This work presents CloudMC, a cloud computing application—developed in Windows Azure®, the platform of the Microsoft® cloud—for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based—the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with PENELOPE were performed on different instance (virtual machine) sizes and for different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU time on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage billing. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.

  5. Parallel Monte Carlo simulations on an ARC-enabled computing grid

    International Nuclear Information System (INIS)

    Nilsen, Jon K; Samset, Bjørn H

    2011-01-01

    Grid computing opens new possibilities for running heavy Monte Carlo simulations of physical systems in parallel. The presentation gives an overview of GaMPI, a system for running an MPI-based random walker simulation on grid resources. Integrating the ARC middleware and the new storage system Chelonia with the Ganga grid job submission and control system, we show that MPI jobs can be run on a world-wide computing grid with good performance and promising scaling properties. Results for relatively communication-heavy Monte Carlo simulations run on multiple heterogeneous, ARC-enabled computing clusters in several countries are presented.

  6. Monte Carlo simulation with the Gate software using grid computing

    International Nuclear Information System (INIS)

    Reuillon, R.; Hill, D.R.C.; Gouinaud, C.; El Bitar, Z.; Breton, V.; Buvat, I.

    2009-03-01

    Monte Carlo simulations are widely used in emission tomography, for protocol optimization, design of processing or data analysis methods, tomographic reconstruction, or tomograph design optimization. Monte Carlo simulations needing many replicates to obtain good statistical results can be easily executed in parallel using the 'Multiple Replications In Parallel' approach. However, several precautions have to be taken in the generation of the parallel streams of pseudo-random numbers. In this paper, we present the distribution of Monte Carlo simulations performed with the GATE software using local clusters and grid computing. We obtained very convincing results with this large medical application, thanks to the EGEE Grid (Enabling Grids for E-sciencE), achieving in one week computations that would have taken more than 3 years of processing on a single computer. This work was achieved thanks to a generic object-oriented toolbox called DistMe, which we designed to automate this kind of parallelization for Monte Carlo simulations. This toolbox, written in Java, is freely available on SourceForge and helped to ensure a rigorous distribution of pseudo-random number streams. It is based on the use of a documented XML format for random number generator statuses. (authors)
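
The 'Multiple Replications In Parallel' precaution the authors describe—giving each replicate its own non-overlapping pseudo-random stream—can be sketched with NumPy's seed-spawning API. NumPy stands in here for the paper's Java DistMe toolbox; the π-estimation payload is our toy example:

```python
import numpy as np

# One master seed; spawn statistically independent child streams,
# one per replication, so no two replicates share random numbers.
master = np.random.SeedSequence(20090301)
streams = [np.random.default_rng(s) for s in master.spawn(4)]

def estimate_pi(rng, n=100_000):
    """One replicate: estimate pi by sampling points in the unit square."""
    xy = rng.random((n, 2))
    return 4.0 * np.mean(np.sum(xy**2, axis=1) <= 1.0)

# Averaging independent replicates sharpens the estimate, which is only
# statistically valid because the streams do not overlap.
estimates = [estimate_pi(rng) for rng in streams]
print(np.mean(estimates))  # close to 3.14159
```

Naively seeding each replicate with, say, consecutive integers gives no such independence guarantee, which is exactly the pitfall the paper's XML generator-status format is designed to avoid.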

  7. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. Through this work, the problems in vector processing of Monte Carlo codes on vector processors have become clear. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report, the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)

  8. Computed radiography simulation using the Monte Carlo code MCNPX

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.

    2009-01-01

    Simulating x-ray images has been of great interest in recent years, as it makes possible an analysis of how x-ray images are affected by relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector, as well as the characteristic noise of a 16-bit computed radiography system, were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it yields results comparable with experimental data. (author)

  9. Computed radiography simulation using the Monte Carlo code MCNPX

    Energy Technology Data Exchange (ETDEWEB)

    Correa, S.C.A. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Centro Universitario Estadual da Zona Oeste (CCMAT)/UEZO, Av. Manuel Caldeira de Alvarenga, 1203, Campo Grande, 23070-200, Rio de Janeiro, RJ (Brazil); Souza, E.M. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Silva, A.X., E-mail: ademir@con.ufrj.b [PEN/COPPE-DNC/Poli CT, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Cassiano, D.H. [Instituto de Radioprotecao e Dosimetria/CNEN Av. Salvador Allende, s/n, Recreio, 22780-160, Rio de Janeiro, RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil)

    2010-09-15

    Simulating X-ray images has been of great interest in recent years, as it makes possible an analysis of how X-ray images are affected by relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector, as well as the characteristic noise of a 16-bit computed radiography system, were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it yields results comparable with experimental data.

  10. A computer code package for electron transport Monte Carlo simulation

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M.

    1999-01-01

    A computer code package was developed for solving various electron transport problems by Monte Carlo simulation. It is based on the condensed-history Monte Carlo algorithm. In order to get reliable results over wide ranges of electron energies and target atomic numbers, specific electron transport techniques were implemented, such as Molière multiple-scattering angular distributions, the Blunck-Leisegang multiple-scattering energy distribution, and sampling of individual electron-electron and bremsstrahlung interactions. Path-length and lateral-displacement correction algorithms and a module for computing the collision, radiative and total restricted stopping powers and ranges of electrons are also included. Comparisons of simulation results with experimental measurements are finally presented. (author)

  11. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    International Nuclear Information System (INIS)

    Chow, J

    2015-01-01

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It was found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of the long computing time in 4D treatment planning, requiring Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimal number of compute nodes selected for simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.
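
The trade-off behind the 5-15 node optimum can be sketched with a toy cost model: Monte Carlo work that parallelizes as 1/n, a serial dose-reconstruction step, and a per-node overhead that grows with cluster size. All hour figures below are illustrative assumptions, not values from the study:

```python
def plan_time_h(n_nodes, mc_cpu_hours=2.0, recon_hours=0.2,
                per_node_overhead_h=0.01):
    """Toy model of 4D-plan wall time: serial reconstruction, plus
    parallel Monte Carlo work, plus overhead that grows with node
    count (all numbers illustrative, not from the study)."""
    return recon_hours + mc_cpu_hours / n_nodes + per_node_overhead_h * n_nodes

# With these assumed parameters the fastest cluster size falls inside
# the 5-15 node range the authors recommend.
best = min(range(1, 65), key=plan_time_h)
print(best)
```

Adding nodes past the optimum makes the run slower, not just less cost-efficient, because the per-node overhead eventually outweighs the shrinking 1/n term.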

  12. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chow, J [Princess Margaret Cancer Center, Toronto, ON (Canada)

    2015-06-15

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used in dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It was found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of the long computing time in 4D treatment planning, requiring Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimal number of compute nodes selected for simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.

  13. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons, such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated, and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method was highly effective for a simulation that requires a large number of iterations; assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.

  14. Microcanonical Monte Carlo approach for computing melting curves by atomistic simulations

    OpenAIRE

    Davis, Sergio; Gutiérrez, Gonzalo

    2017-01-01

    We report microcanonical Monte Carlo simulations of melting and superheating of a generic Lennard-Jones system starting from the crystalline phase. The isochoric curve, the melting temperature $T_m$, and the critical superheating temperature $T_{LS}$ obtained are in close agreement (well within the microcanonical temperature fluctuations) with standard molecular dynamics one-phase and two-phase methods. These results validate the use of microcanonical Monte Carlo to compute melting points, a ...

  15. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms.

    Science.gov (United States)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
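
One simple form of the device-level load balancing mentioned above is a static split of the photon budget in proportion to each device's measured throughput. This is a hedged sketch of the idea, not the paper's actual strategy, and the function name and throughput figures are ours:

```python
def split_photons(total_photons, throughputs):
    """Device-level static load balancing: give each device a share of the
    photon budget proportional to its measured throughput (photons/s), so
    heterogeneous CPUs and GPUs finish at roughly the same time."""
    s = sum(throughputs)
    shares = [int(total_photons * t / s) for t in throughputs]
    shares[0] += total_photons - sum(shares)  # rounding remainder to device 0
    return shares

# e.g. one GPU measured ~10x faster than one CPU
print(split_photons(1_000_000, [100.0, 10.0]))  # [909091, 90909]
```

An equal split would leave the fast device idle while the slow one finishes; proportional splitting equalizes the finish times, which is the point of the multi-device strategies the abstract describes.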

  16. Monte Carlo simulation for IRRMA

    International Nuclear Information System (INIS)

    Gardner, R.P.; Liu Lianyan

    2000-01-01

    Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications (IRRMA). The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low-cost, large-memory, fast personal computers; and (3) the general availability of general-purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA, including the general-purpose (GP) and specific-purpose (SP) Monte Carlo codes, and future needs, primarily from the experience of the authors.

  17. Atomic-level computer simulation

    International Nuclear Information System (INIS)

    Adams, J.B.; Rockett, Angus; Kieffer, John; Xu Wei; Nomura, Miki; Kilian, K.A.; Richards, D.F.; Ramprasad, R.

    1994-01-01

    This paper provides a broad overview of the methods of atomic-level computer simulation. It discusses methods of modelling atomic bonding, and computer simulation methods such as energy minimization, molecular dynamics, Monte Carlo, and lattice Monte Carlo. ((orig.))

  18. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasushi [Department of Fuel Cycle Safety Research, Nuclear Safety Research Center, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Tamaki, Hitoshi [Department of Safety Research Technical Support, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Kanai, Shigeru [Fuji Research Institute Corporation, Tokyo (Japan)

    2000-04-01

    In a plant system consisting of complex equipment and components, such as a reprocessing facility, there may be grace time between an initiating event and a resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, without difficulty, and to obtain an accident occurrence frequency. Firstly, the basic methods of the component Monte Carlo simulation for obtaining an accident occurrence frequency are introduced, and then the basic performance, such as precision, convergence, and parallelization of the calculation, is shown through calculation of a prototype accident sequence model. As an example illustrating applicability to a real-scale plant model, a red-oil explosion in a German reprocessing plant model is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show another aspect of its performance, and a proposal is made for a new input-data format adapted to component Monte Carlo simulation. The present paper describes the calculational method, performance, applicability to a real-scale plant, and the new proposal for the TITAN code. In the Appendixes, a conventional analytical method is shown that avoids the complex and laborious calculation needed to obtain an exact solution of the accident occurrence frequency, for comparison with the Monte Carlo method. The user's manual and the list/structure of the program are also contained in the Appendixes to facilitate use of the TITAN computer program. (author)
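
The grace-time mechanism TITAN models can be illustrated with a minimal component Monte Carlo: an initiating event escalates to an accident only if the remedial action is not completed within the grace time. All rates and distributions below are invented for the example; TITAN's reliability model is far richer:

```python
import random

def accident_frequency(initiator_rate_per_year, grace_h, mean_repair_h,
                       years=1_000_000, seed=1):
    """Minimal component Monte Carlo in the spirit of TITAN: an initiating
    event becomes an accident only if the remedial action (drawn from an
    exponential distribution) takes longer than the grace time.
    All rates here are invented for illustration."""
    rng = random.Random(seed)
    accidents = 0
    t = 0.0
    while True:
        t += rng.expovariate(initiator_rate_per_year)  # next initiating event
        if t >= years:
            break
        if rng.expovariate(1.0 / mean_repair_h) > grace_h:
            accidents += 1  # remedial action not completed within grace time
    return accidents / years

# Analytic check: rate * P(repair > grace) = 0.1 * exp(-24/12) ~ 0.0135 / year
freq = accident_frequency(0.1, grace_h=24.0, mean_repair_h=12.0)
print(freq)
```

For this toy model an exact answer exists, which mirrors the paper's Appendix comparison of the Monte Carlo estimate against an analytical solution; TITAN's value lies in handling models too complex for such closed forms.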

  19. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    International Nuclear Information System (INIS)

    Nomura, Yasushi; Tamaki, Hitoshi; Kanai, Shigeru

    2000-04-01

    In a plant system consisting of complex equipment and components, such as a reprocessing facility, there may be grace time between an initiating event and a resultant serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. A component Monte Carlo simulation computer program, TITAN, has been developed to analyze such a complex reliability model, including the grace time, without difficulty, and to obtain an accident occurrence frequency. Firstly, the basic methods of the component Monte Carlo simulation for obtaining an accident occurrence frequency are introduced, and then the basic performance, such as precision, convergence, and parallelization of the calculation, is shown through calculation of a prototype accident sequence model. As an example illustrating applicability to a real-scale plant model, a red-oil explosion in a German reprocessing plant model is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show another aspect of its performance, and a proposal is made for a new input-data format adapted to component Monte Carlo simulation. The present paper describes the calculational method, performance, applicability to a real-scale plant, and the new proposal for the TITAN code. In the Appendixes, a conventional analytical method is shown that avoids the complex and laborious calculation needed to obtain an exact solution of the accident occurrence frequency, for comparison with the Monte Carlo method. The user's manual and the list/structure of the program are also contained in the Appendixes to facilitate use of the TITAN computer program. (author)

  20. Mean field simulation for Monte Carlo integration

    CERN Document Server

    Del Moral, Pierre

    2013-01-01

    In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer science, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; and adaptive and interacting Markov ...

  1. A Monte Carlo simulation of scattering reduction in spectral x-ray computed tomography

    DEFF Research Database (Denmark)

    Busi, Matteo; Olsen, Ulrik Lund; Bergbäck Knudsen, Erik

    2017-01-01

    In X-ray computed tomography (CT), scattered radiation plays an important role in the accurate reconstruction of the inspected object, leading to a loss of contrast between the different materials in the reconstruction volume and cupping artifacts in the images. We present a Monte Carlo simulation...

  2. Dosimetry in radiotherapy and brachytherapy by Monte-Carlo GATE simulation on computing grid

    International Nuclear Information System (INIS)

    Thiam, Ch.O.

    2007-10-01

    Accurate radiotherapy treatment requires the delivery of a precise dose to the tumour volume and a good knowledge of the dose deposited in the neighbouring zones. Computation of the treatments is usually carried out by a Treatment Planning System (TPS), which needs to be precise and fast. The GATE platform for Monte Carlo simulation, based on GEANT4, is an emerging tool for nuclear medicine applications that provides functionalities for fast and reliable dosimetric calculations. In this thesis, we studied in parallel a validation of the GATE platform for the modelling of low-energy electron and photon sources and the optimized use of grid infrastructures to reduce simulation computing time. GATE was validated for the dose calculation of point kernels for mono-energetic electrons and compared with the results of other Monte Carlo studies. A detailed study was made of the energy deposit during electron transport in GEANT4. In order to validate GATE for very low energy photons (<35 keV), three models of radioactive sources used in brachytherapy and containing iodine-125 (2301 of Best Medical International; Symmetra of Uro-Med/Bebig; and 6711 of Amersham) were simulated. Our results were analyzed according to the recommendations of Task Group No. 43 of the American Association of Physicists in Medicine (AAPM). They show a good agreement between GATE, the reference studies and the AAPM recommended values. The use of Monte Carlo simulations for a better definition of the dose deposited in the tumour volumes requires long computing times. In order to reduce them, we exploited the EGEE grid infrastructure, where simulations are distributed using innovative technologies taking into account the grid status. The time necessary for computing a radiotherapy planning simulation using electrons was reduced by a factor of 30. A Web platform based on the GENIUS portal was developed to make easily available all the methods to submit and manage G...

  3. Monte Carlo simulations of neutron scattering instruments

    International Nuclear Information System (INIS)

    Aestrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.

    2001-01-01

    A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind, and some of the basic principles of the McStas software are discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)

  4. GATE Monte Carlo simulation of dose distribution using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Liu, Yangchuan; Tang, Yuguo; Gao, Xin

    2017-12-01

    The GATE Monte Carlo simulation platform has good application prospects for treatment planning and quality assurance. However, accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, time decreases of 41× and 32× were achieved compared to the single-worker-node case and the single-threaded case, respectively. The test of Hadoop's fault tolerance showed that the simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
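
The split/aggregate pattern described above (sub-macros executed in Map tasks, merged in Reduce tasks) can be sketched without Hadoop. The function names and the one-unit-of-dose-per-particle stand-in are ours, not GATE's:

```python
from functools import reduce

def split_macro(total_particles, n_tasks):
    """'Map' setup: split one particle budget into self-contained sub-jobs,
    mirroring how the authors split a GATE macro into sub-macros (this is
    the budget arithmetic only, not GATE macro syntax)."""
    base, rem = divmod(total_particles, n_tasks)
    return [base + (1 if i < rem else 0) for i in range(n_tasks)]

def run_sub_simulation(n_particles):
    """Stand-in for one Map task: pretend each particle deposits one unit
    of dose, so the aggregation is easy to verify."""
    return {"particles": n_particles, "dose": float(n_particles)}

def merge(a, b):
    """'Reduce' step: sum the per-task tallies into one final output."""
    return {k: a[k] + b[k] for k in a}

parts = split_macro(10_000_000, 64)
result = reduce(merge, (run_sub_simulation(p) for p in parts))
print(result["particles"])  # 10000000
```

Because dose tallies are additive, the merged result is identical to a single run over the full budget, which is why the authors can claim the parallel simulation matches the single-threaded one.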

  5. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
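
The reported timing (2.58 h locally, 3.3 min on 100 nodes, a 47× speed-up) is consistent with a simple fixed-overhead model. The 1.75 min overhead below is our illustrative fit, not a measured value from the paper:

```python
def cloud_time_min(t1_min, n_nodes, overhead_min):
    """Simple model: perfectly parallel work plus a fixed cluster
    setup/teardown overhead per run (overhead value is our fit)."""
    return t1_min / n_nodes + overhead_min

t1 = 2.58 * 60  # the 2.58 h single-machine run, in minutes
t_cloud = cloud_time_min(t1, 100, overhead_min=1.75)
print(round(t1 / t_cloud))  # 47
```

The same model explains why the authors call the overhead "negligible for large simulations": as t1 grows, the fixed overhead becomes a vanishing fraction of the cloud run time and the speed-up approaches the node count.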

  7. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  8. Atmosphere Re-Entry Simulation Using Direct Simulation Monte Carlo (DSMC Method

    Directory of Open Access Journals (Sweden)

    Francesco Pellicani

    2016-05-01

    Aerothermodynamic investigations of hypersonic re-entry vehicles provide fundamental input to other important disciplines, such as materials and structures, and support the development of thermal protection systems (TPS) that are efficient and lightweight. In the transitional flow regime, where thermal and chemical equilibrium is almost absent, a dedicated numerical method has been introduced for such studies: the direct simulation Monte Carlo (DSMC) technique. The acceptance and applicability of the DSMC method have increased significantly in the 50 years since its invention, thanks to increases in computer speed and to parallel computing. Nevertheless, further verification and validation efforts are needed for it to gain wider acceptance. In this study, the DSMC solvers in OpenFOAM and Sparta were studied and benchmarked against numerical and theoretical data for inert and chemically reactive flows; the same will be done against experimental data in the near future. The results show the validity of the data obtained with DSMC. The best settings of the fundamental parameters used by a DSMC simulator are presented for each code and compared with the guidelines derived from the theory behind the Monte Carlo method. In particular, the number of particles per cell was found to be the most relevant parameter for achieving valid and optimized results. It is shown that a simulation with a mean value of one particle per cell gives sufficiently good results at very low computational cost. This finding prompts a reconsideration of the appropriate investigation method in the transitional regime, where both direct simulation Monte Carlo (DSMC) and computational fluid dynamics (CFD) can work, but with different computational effort.

  9. Dosimetry in radiotherapy and brachytherapy by Monte-Carlo GATE simulation on computing grid; Dosimetrie en radiotherapie et curietherapie par simulation Monte-Carlo GATE sur grille informatique

    Energy Technology Data Exchange (ETDEWEB)

    Thiam, Ch O

    2007-10-15

    Accurate radiotherapy treatment requires the delivery of a precise dose to the tumour volume and a good knowledge of the dose deposited in the neighbouring zones. Computation of the treatments is usually carried out by a treatment planning system (TPS), which needs to be precise and fast. The GATE platform for Monte Carlo simulation, based on GEANT4, is an emerging tool for nuclear medicine applications that provides functionalities for fast and reliable dosimetric calculations. In this thesis, we studied in parallel the validation of the GATE platform for the modelling of low-energy electron and photon sources and the optimized use of grid infrastructures to reduce simulation computing time. GATE was validated for the dose calculation of point kernels for mono-energetic electrons and compared with the results of other Monte Carlo studies. A detailed study was made of the energy deposit during electron transport in GEANT4. In order to validate GATE for very low energy photons (<35 keV), three models of radioactive sources used in brachytherapy and containing iodine-125 (2301 of Best Medical International; Symmetra of Uro-Med/Bebig; and 6711 of Amersham) were simulated. Our results were analyzed according to the recommendations of Task Group No. 43 of the American Association of Physicists in Medicine (AAPM). They show a good agreement between GATE, the reference studies, and the AAPM recommended values. The use of Monte Carlo simulations for a better definition of the dose deposited in tumour volumes requires long computing times. In order to reduce them, we exploited the EGEE grid infrastructure, where simulations are distributed using innovative technologies that take the grid status into account. The time necessary for computing a radiotherapy planning simulation using electrons was reduced by a factor of 30. A web platform based on the GENIUS portal was developed to make easily available all the methods to submit and manage G

  10. Monte Carlo Simulation in Statistical Physics An Introduction

    CERN Document Server

    Binder, Kurt

    2010-01-01

    Monte Carlo Simulation in Statistical Physics deals with the computer simulation of many-body systems in condensed-matter physics and related fields of physics, chemistry and beyond (traffic flows, stock market fluctuations, etc.). Using random numbers generated by a computer, probability distributions are calculated, allowing the estimation of the thermodynamic properties of various systems. This book describes the theoretical background to several variants of these Monte Carlo methods and gives a systematic presentation from which newcomers can learn to perform such simulations and to analyze their results. The fifth edition covers classical as well as quantum Monte Carlo methods. Furthermore, a new chapter on the sampling of free-energy landscapes has been added. To help students in their work a special web server has been installed to host programs and discussion groups (http://wwwcp.tphys.uni-heidelberg.de). Prof. Binder was awarded the Berni J. Alder CECAM Award for Computational Physics 2001 as well ...
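
    As a minimal illustration of the book's subject, using computer-generated random numbers to sample a probability distribution and estimate a thermodynamic property, here is a Metropolis Monte Carlo sketch of the 2D Ising model (an illustrative example, not code from the book):

```python
import math
import random

def metropolis_ising(L=16, T=1.0, sweeps=200, seed=1):
    """Metropolis sampling of the 2D Ising model (J = 1, k_B = 1).
    Returns the magnetization per spin after `sweeps` lattice sweeps."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]          # cold start: all spins up
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            # sum of the four nearest neighbours (periodic boundaries)
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2 * spins[i][j] * nb            # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] *= -1                # accept the flip
    return sum(sum(row) for row in spins) / (L * L)
```

    Below the critical temperature (T_c ≈ 2.27) the sampled magnetization stays close to ±1; above it, it averages to roughly zero, which is the kind of thermodynamic estimate the book builds on.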

  11. The Development and Comparison of Molecular Dynamics Simulation and Monte Carlo Simulation

    Science.gov (United States)

    Chen, Jundong

    2018-03-01

    Molecular dynamics is an integrated technology that combines physics, mathematics and chemistry. The molecular dynamics method is a computer simulation technique and a powerful tool for studying condensed-matter systems. It not only yields the trajectories of the atoms but also allows the microscopic details of atomic motion to be observed. By studying the numerical integration algorithms used in molecular dynamics simulation, we can analyze the microstructure, the motion of the particles and their relationship to the macroscopic properties of the material, and study more conveniently the link between interatomic interactions and those macroscopic properties. Monte Carlo simulation, similar to molecular dynamics, is a tool for studying the nature of molecules and particles at the microscopic scale. In this paper, the theoretical background of computer numerical simulation is introduced, and the specific numerical integration methods are summarized, including the Verlet, leap-frog and velocity Verlet methods. At the same time, the method and principle of Monte Carlo simulation are introduced. Finally, the similarities and differences between Monte Carlo simulation and molecular dynamics simulation are discussed.
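
    Of the integrators named in the abstract, velocity Verlet is the most widely used; a minimal one-dimensional, single-particle sketch (an illustration, not the paper's code) is:

```python
def velocity_verlet(x, v, force, dt, steps, m=1.0):
    """Integrate one 1-D particle with the velocity Verlet scheme.
    `force` maps position to force; returns the final (x, v)."""
    a = force(x) / m
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt       # position update
        a_new = force(x) / m                     # force at the new position
        v = v + 0.5 * (a + a_new) * dt           # velocity update with averaged acceleration
        a = a_new
    return x, v

# Harmonic oscillator check: for F = -x the total energy 0.5*v^2 + 0.5*x^2
# should stay very close to its initial value of 0.5.
x, v = velocity_verlet(1.0, 0.0, lambda q: -q, dt=0.01, steps=1000)
```

    The scheme is time-reversible and symplectic, which is why its energy error stays bounded over long runs instead of drifting.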

  12. Region-oriented CT image representation for reducing computing time of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Sarrut, David; Guigues, Laurent

    2008-01-01

    Purpose. We propose a new method for efficient particle transportation in voxelized geometry for Monte Carlo simulations and describe its use for calculating dose distributions in CT images for radiation therapy. Material and methods. The proposed approach, based on an implicit volume representation named segmented volume, coupled with an adapted segmentation procedure and a distance map, allows us to minimize the number of boundary crossings, which slow down the simulation. The method was implemented with the GEANT4 toolkit and compared to four other methods: one box per voxel, parameterized volumes, octree-based volumes, and nested parameterized volumes. For each representation, we compared dose distribution, time, and memory consumption. Results. The proposed method allows us to decrease computational time by up to a factor of 15, while keeping memory consumption low and without any modification of the transportation engine. The speed-up is related to the geometry complexity and the number of different materials used. We obtained an optimal number of steps by removing all unnecessary steps between adjacent voxels sharing a similar material. However, the cost of each step is increased. When the number of steps cannot be decreased enough, due, for example, to a large number of material boundaries, the method is not suitable. Conclusion. This feasibility study shows that optimizing the representation of an image in memory can increase computing efficiency. We used the GEANT4 toolkit, but other Monte Carlo simulation codes could potentially be used. The method introduces a tradeoff between speed and geometry accuracy, allowing computational time gains. However, simulations with GEANT4 remain slow, and further work is needed to speed up the procedure while preserving the desired accuracy.
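
    The core idea, removing unnecessary particle steps between adjacent voxels that share a material, can be illustrated by collapsing a row of voxels into homogeneous segments (a simplified sketch, not the authors' GEANT4 implementation):

```python
def merge_voxel_row(materials):
    """Run-length-merge consecutive voxels of identical material into
    (material, length) segments, so a particle needs one boundary crossing
    per segment instead of one per voxel."""
    segments = []
    for m in materials:
        if segments and segments[-1][0] == m:
            material, length = segments[-1]
            segments[-1] = (material, length + 1)
        else:
            segments.append((m, 1))
    return segments

row = ["air", "air", "air", "bone", "bone", "tissue", "tissue", "tissue"]
# Eight voxels collapse into three homogeneous segments.
```

    The gain therefore grows with the size of homogeneous regions and shrinks as the number of distinct materials (and hence boundaries) increases, matching the abstract's observation.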

  13. Pediatric personalized CT-dosimetry Monte Carlo simulations, using computational phantoms

    International Nuclear Information System (INIS)

    Papadimitroulas, P; Kagadis, G C; Ploussi, A; Kordolaimi, S; Papamichail, D; Karavasilis, E; Syrgiamiotis, V; Loudos, G

    2015-01-01

    For the last 40 years, Monte Carlo (MC) simulations have served as a “gold standard” tool for a wide range of applications in the field of medical physics and tend to be essential in daily clinical practice. Regarding diagnostic imaging applications, such as computed tomography (CT), the assessment of deposited energy is of high interest, so as to better analyze the risks and the benefits of the procedure. In the last few years a big effort has been made towards personalized dosimetry, especially in pediatric applications. In the present study the GATE toolkit was used and computational pediatric phantoms were modeled for the assessment of CT examination dosimetry. The pediatric models used come from the XCAT and IT'IS series. The X-ray spectrum of a Brightspeed CT scanner was simulated and validated with experimental data. Specifically, a DCT-10 ionization chamber was irradiated twice using 120 kVp with 100 mAs and 200 mAs, for 1 s in 1 central axial slice (thickness = 10 mm). The absorbed dose was measured in air, resulting in differences lower than 4% between the experimental and simulated data. The simulations were acquired using ∼10^10 primaries in order to achieve low statistical uncertainties. Dose maps were also saved for quantification of the absorbed dose in several critical organs of children during CT acquisition. (paper)

  14. Rapid Monte Carlo Simulation of Gravitational Wave Galaxies

    Science.gov (United States)

    Breivik, Katelyn; Larson, Shane L.

    2015-01-01

    With the detection of gravitational waves on the horizon, astrophysical catalogs produced by gravitational wave observatories can be used to characterize the populations of sources and validate different galactic population models. Efforts to simulate gravitational wave catalogs and source populations generally focus on population synthesis models that require extensive time and computational power to produce a single simulated galaxy. Monte Carlo simulations of gravitational wave source populations can also be used to generate observation catalogs from the gravitational wave source population. Monte Carlo simulations have the advantages of flexibility and speed, enabling rapid galactic realizations as a function of galactic binary parameters with less time and fewer computational resources required. We present a Monte Carlo method for rapid galactic simulations of gravitational wave binary populations.

  15. Improving computational efficiency of Monte Carlo simulations with variance reduction

    International Nuclear Information System (INIS)

    Turner, A.; Davis, A.

    2013-01-01

    CCFE perform Monte-Carlo transport simulations on large and complex tokamak models such as ITER. Such simulations are challenging since streaming and deep penetration effects are equally important. In order to make such simulations tractable, both variance reduction (VR) techniques and parallel computing are used. It has been found that the application of VR techniques in such models significantly reduces the efficiency of parallel computation due to 'long histories'. VR in MCNP can be accomplished using energy-dependent weight windows. The weight window represents an 'average behaviour' of particles, and large deviations in the arriving weight of a particle give rise to extreme amounts of splitting being performed and a long history. When running on parallel clusters, a long history can have a detrimental effect on the parallel efficiency - if one process is computing the long history, the other CPUs complete their batch of histories and wait idle. Furthermore some long histories have been found to be effectively intractable. To combat this effect, CCFE has developed an adaptation of MCNP which dynamically adjusts the WW where a large weight deviation is encountered. The method effectively 'de-optimises' the WW, reducing the VR performance but this is offset by a significant increase in parallel efficiency. Testing with a simple geometry has shown the method does not bias the result. This 'long history method' has enabled CCFE to significantly improve the performance of MCNP calculations for ITER on parallel clusters, and will be beneficial for any geometry combining streaming and deep penetration effects. (authors)
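
    The weight-window mechanics described here, splitting heavy particles, playing Russian roulette with light ones, and capping the splitting to curb long histories, can be sketched as follows (an illustrative simplification, not CCFE's MCNP adaptation):

```python
import random

def apply_weight_window(weight, w_low, w_high, rng, max_split=10):
    """Return the list of particle weights after applying a weight window.

    Above the window: split into n particles of weight/n, with n capped at
    `max_split` (the cap mimics 'de-optimising' the window to avoid extreme
    splitting and long histories).  Below the window: Russian roulette,
    killing the particle or promoting a survivor so the expected total
    weight is preserved."""
    if weight > w_high:
        n = min(int(weight / w_high) + 1, max_split)
        return [weight / n] * n
    if weight < w_low:
        w_survive = 0.5 * (w_low + w_high)       # survivor weight at window centre
        if rng.random() < weight / w_survive:    # survival probability preserves expectation
            return [w_survive]
        return []
    return [weight]                              # inside the window: unchanged
```

    Splitting conserves weight exactly, while roulette conserves it only in expectation; that is why the scheme reduces variance without biasing the tally, as the abstract's simple-geometry test confirms.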

  16. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report was aimed at structuring the design of architectures and studying performance measurement of a parallel computing environment using a Monte Carlo simulation for particle therapy using a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed an approximately 28 times faster speed than seen with single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost. (author)

  17. Monte Carlo simulation of grain growth

    Directory of Open Access Journals (Sweden)

    Paulo Blikstein

    1999-07-01

    Understanding and predicting grain growth is important in metallurgy. Monte Carlo methods have been used in computer simulations in many different fields of knowledge. Grain growth simulation using this method is especially attractive, as the statistical behavior of the atoms is properly reproduced: microstructural evolution depends only on the real topology of the grains and not on any kind of geometric simplification. Computer simulation has the advantage of allowing the user to visualize the process graphically, even dynamically and in three dimensions. Single-phase alloy grain growth was simulated by calculating the free energy of each atom in the lattice (with its present crystallographic orientation) and comparing this value to one calculated for a different, random orientation. When the resulting free energy is lower than or equal to the initial value, the new orientation replaces the former. The measure of time is the Monte Carlo step (MCS), which involves a series of trials throughout the lattice. A very close relationship between experimental and theoretical values of the grain growth exponent (n) was observed.
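
    The acceptance rule described above (reorient a site at random and keep the new orientation whenever the energy does not increase) is that of a zero-temperature Q-state Potts model; a minimal sketch (illustrative, not the paper's program) is:

```python
import random

def boundary_energy(lattice, i, j, s):
    """Number of unlike nearest neighbours if site (i, j) had orientation s
    (periodic boundaries); each unlike pair carries grain-boundary energy."""
    L = len(lattice)
    nbrs = (lattice[(i + 1) % L][j], lattice[(i - 1) % L][j],
            lattice[i][(j + 1) % L], lattice[i][(j - 1) % L])
    return sum(1 for n in nbrs if n != s)

def monte_carlo_step(lattice, q, rng):
    """One Monte Carlo step (MCS): L*L reorientation trials over the lattice."""
    L = len(lattice)
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        new = rng.randrange(q)
        # accept if the local boundary energy does not increase
        if boundary_energy(lattice, i, j, new) <= boundary_energy(lattice, i, j, lattice[i][j]):
            lattice[i][j] = new

def total_energy(lattice):
    """Total grain-boundary energy (each unlike pair counted once)."""
    L = len(lattice)
    return sum(int(lattice[i][j] != lattice[(i + 1) % L][j])
               + int(lattice[i][j] != lattice[i][(j + 1) % L])
               for i in range(L) for j in range(L))
```

    Starting from a random orientation map, repeated MCS sweeps monotonically shrink the total boundary energy, which is the microstructural coarsening the paper measures via the growth exponent n.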

  18. General purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.

    1983-01-01

    A general-purpose computer code called MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility, the code is organized like a general-purpose computer, operating on a vector describing the time-dependent state of the system under simulation. The instruction set of the 'computer' is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations.

  19. Simulation and the Monte Carlo method

    CERN Document Server

    Rubinstein, Reuven Y

    2016-01-01

    Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition over more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...

  20. Linking computer-aided design (CAD) to Geant4-based Monte Carlo simulations for precise implementation of complex treatment head geometries

    International Nuclear Information System (INIS)

    Constantin, Magdalena; Constantin, Dragos E; Keall, Paul J; Narula, Anisha; Svatos, Michelle; Perl, Joseph

    2010-01-01

    Most of the treatment head components of medical linear accelerators used in radiation therapy have complex geometrical shapes. They are typically designed using computer-aided design (CAD) applications. In Monte Carlo simulations of radiotherapy beam transport through the treatment head components, the relevant beam-generating and beam-modifying devices are inserted in the simulation toolkit using geometrical approximations of these components. Depending on their complexity, such approximations may introduce errors that can be propagated throughout the simulation. This drawback can be minimized by exporting a more precise geometry of the linac components from CAD and importing it into the Monte Carlo simulation environment. We present a technique that links three-dimensional CAD drawings of the treatment head components to Geant4 Monte Carlo simulations of dose deposition. (note)

  1. Monte Carlo simulation of Markov unreliability models

    International Nuclear Information System (INIS)

    Lewis, E.E.; Boehm, F.

    1984-01-01

    A Monte Carlo method is formulated for the evaluation of the unrealibility of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process allowing dependences between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitudes over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
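
    An analog (unbiased) version of the random walk the authors accelerate can be sketched for a single repairable component with failure rate λ and repair rate μ (an illustrative baseline, not the paper's forced-transition scheme):

```python
import random

def unavailability(lam, mu, t_end, n_walks, seed=1):
    """Analog Monte Carlo estimate of the probability that a component with
    exponential failure rate `lam` and repair rate `mu`, starting in the
    working state, is in the failed state at time `t_end`."""
    rng = random.Random(seed)
    failed = 0
    for _ in range(n_walks):
        t, up = 0.0, True
        while True:
            t += rng.expovariate(lam if up else mu)   # holding time to next transition
            if t > t_end:
                break
            up = not up                               # fail or be repaired
        if not up:
            failed += 1
    return failed / n_walks

# Steady-state unavailability is lam / (lam + mu); for lam = 1, mu = 9
# the estimate should approach 0.1 once t_end >> 1 / (lam + mu).
```

    For highly reliable components (tiny lam) almost no analog walk ever visits the failed state, which is precisely why forced transitions and failure biasing yield the orders-of-magnitude efficiency gains reported above.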

  2. Monte Carlo computation in the applied research of nuclear technology

    International Nuclear Information System (INIS)

    Xu Shuyan; Liu Baojie; Li Qin

    2007-01-01

    This article briefly introduces Monte Carlo methods and their properties. It describes Monte Carlo methods with emphasis on their applications to several domains of nuclear technology. Monte Carlo simulation methods and several computer codes commonly used to implement them are also introduced. The proposed methods are demonstrated with a real example. (authors)

  3. Computational physics an introduction to Monte Carlo simulations of matrix field theory

    CERN Document Server

    Ydri, Badis

    2017-01-01

    This book is divided into two parts. In the first part we give an elementary introduction to computational physics consisting of 21 simulations which originated from a formal course of lectures and laboratory simulations delivered since 2010 to physics students at Annaba University. The second part is much more advanced and deals with the problem of how to set up working Monte Carlo simulations of matrix field theories which involve finite dimensional matrix regularizations of noncommutative and fuzzy field theories, fuzzy spaces and matrix geometry. The study of matrix field theory in its own right has also become very important to the proper understanding of all noncommutative, fuzzy and matrix phenomena. The second part, which consists of 9 simulations, was delivered informally to doctoral students who are working on various problems in matrix field theory. Sample codes as well as sample key solutions are also provided for convenience and completeness. An appendix containing an executive Arabic summary of t...

  4. Exploring Various Monte Carlo Simulations for Geoscience Applications

    Science.gov (United States)

    Blais, R.

    2010-12-01

    Computer simulations are increasingly important in geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN), or chaotic random number (CRN) generators. Equidistributed quasi-random numbers (QRNs) can also be used in Monte Carlo simulations. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as Importance Sampling and Stratified Sampling can be implemented to significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on examples of geodetic applications of gravimetric terrain corrections and gravity inversion, conclusions and recommendations concerning their performance and general applicability are included.

  5. Exploring pseudo- and chaotic random Monte Carlo simulations

    Science.gov (United States)

    Blais, J. A. Rod; Zhang, Zhan

    2011-07-01

    Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer-generated random numbers, uniformly distributed on (0, 1), can be very different depending on the selection of pseudo-random number (PRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the resulting error variances can even be of different orders of magnitude. Furthermore, practical techniques for variance reduction such as importance sampling and stratified sampling can be applied in most Monte Carlo simulations and significantly improve the results. A comparative analysis of these strategies has been carried out for computational applications in planar and spatial contexts. Based on these experiments, and on some practical examples of geodetic direct and inverse problems, conclusions and recommendations concerning their performance and general applicability are included.
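
    The variance reduction the authors report for stratified sampling is easy to reproduce on a simple definite integral, here ∫₀¹ x² dx = 1/3 (an illustrative sketch, not the authors' experiments):

```python
import random

def plain_mc(f, n, rng):
    """Plain Monte Carlo estimate of the integral of f over (0, 1)."""
    return sum(f(rng.random()) for _ in range(n)) / n

def stratified_mc(f, n, rng):
    """Stratified sampling: one uniform sample in each of n equal strata.
    For smooth integrands the error drops from O(n^-1/2) to O(n^-3/2),
    i.e. the error variances differ by orders of magnitude, as reported."""
    return sum(f((k + rng.random()) / n) for k in range(n)) / n
```

    With n = 1000 samples, the plain estimate typically lands within ~0.01 of 1/3, while the stratified estimate is within ~1e-4; the gap widens as n grows.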

  6. Evaluation and characterization of X-ray scattering in tissues and mammographic simulators using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Oliveira, Monica G. Nunes; Braz, Delson; Silva, Regina Cely B. da S.

    2005-01-01

    Computer simulation has been widely used in physics research in recent decades, owing both to the viability of the codes and to the growth in the power of computers. The EGS4 Monte Carlo code is a simulation program used in the area of radiation transport. Simulators (surrogate tissues, phantoms) are objects used to perform studies of dosimetric quantities and image quality testing. Simulators have scattering and absorption characteristics similar to those of the tissues that make up the body. The aim of this work is to characterize the effects of radiation interactions in real healthy and diseased breast tissues and in simulators, using the EGS4 Monte Carlo simulation code.

  7. A general purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.; Rochester Univ., NY

    1984-01-01

    A general-purpose computer code MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility the code is organized like a general purpose computer, operating on a vector describing the time dependent state of the system under simulation. The instruction set of the 'computer' is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations. (orig.)

  8. Monte Carlo simulation of neutron counters for safeguards applications

    International Nuclear Information System (INIS)

    Looman, Marc; Peerani, Paolo; Tagziria, Hamid

    2009-01-01

    MCNP-PTA is a new Monte Carlo code for the simulation of neutron counters for nuclear safeguards applications developed at the Joint Research Centre (JRC) in Ispra (Italy). After some preliminary considerations outlining the general aspects involved in the computational modelling of neutron counters, this paper describes the specific details and approximations which make up the basis of the model implemented in the code. One of the major improvements allowed by the use of Monte Carlo simulation is a considerable reduction in both the experimental work and in the reference materials required for the calibration of the instruments. This new approach to the calibration of counters using Monte Carlo simulation techniques is also discussed.

  9. Monte Carlo simulation in statistical physics an introduction

    CERN Document Server

    Binder, Kurt

    1992-01-01

    The Monte Carlo method is a computer simulation method which uses random numbers to simulate statistical fluctuations. The method is used to model complex systems with many degrees of freedom. Probability distributions for these systems are generated numerically, and the method then yields numerically exact information on the models. Such simulations may be used to see how well a model system approximates a real one, or to see how valid the assumptions are in an analytical theory. A short and systematic theoretical introduction to the method forms the first part of this book. The second part is a practical guide with plenty of examples and exercises for the student. Problems treated by simple sampling (random and self-avoiding walks, percolation clusters, etc.) are included, along with such topics as finite-size effects and guidelines for the analysis of Monte Carlo simulations. The two parts together provide an excellent introduction to the theory and practice of Monte Carlo simulations.

  10. Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Lai, Chao-Jen, E-mail: cjlai3711@gmail.com; Zhong, Yuncheng; Yi, Ying; Wang, Tianpeng; Shaw, Chris C. [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030-4009 (United States)

    2015-06-15

    Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region’s visibility. In this study, the authors performed Monte Carlo simulations to estimate average breast dose and average glandular dose (AGD) for the VOI breast CT technique. Methods: Electron–Gamma-Shower system code-based Monte Carlo codes were used to simulate breast CT. The Monte Carlo codes were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full cone x-ray source was then collimated to simulate half cone beam x-rays, to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. Hemi-ellipsoidal homogeneous phantoms 13 cm in diameter and 10 cm long were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fractions (VGFs) were used to investigate the influence of glandularity on breast dose. The simulated half cone beam x-rays were then collimated to a narrow x-ray beam with a 2.5 × 2.5 cm² field of view at the isocenter plane to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique. Results: The ratios of air kerma ratios and dose measurement results from the Monte Carlo simulation to those from the physical

  11. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  12. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules, the analysis of which demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol....

  13. Monte Carlo simulations on SIMD computer architectures

    International Nuclear Information System (INIS)

    Burmester, C.P.; Gronsky, R.; Wille, L.T.

    1992-01-01

    In this paper algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures
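The geometric (lattice-partitioning) parallelism described above is commonly realized as a checkerboard decomposition: sites of one colour have no nearest-neighbour couplings among themselves, so they can all be updated at once. The sketch below emulates this with NumPy array operations standing in for the SIMD processor array; it is an illustrative reconstruction, not the MasPar code, and the parameters are arbitrary:

```python
import numpy as np

def checkerboard_sweep(spins, T, rng):
    """One Metropolis sweep of a 2D Ising lattice using the two-colour
    (checkerboard) partition: each colour's sites are updated simultaneously,
    mimicking the per-processor-element updates of a SIMD machine."""
    L = spins.shape[0]
    ii, jj = np.indices((L, L))
    for colour in (0, 1):
        mask = (ii + jj) % 2 == colour
        # Sum of the 4 nearest neighbours (periodic boundaries via roll)
        nb = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0)
              + np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        dE = 2.0 * spins * nb
        # Metropolis acceptance, evaluated for the whole lattice at once
        accept = (dE <= 0) | (rng.random((L, L)) < np.exp(-dE / T))
        spins = np.where(mask & accept, -spins, spins)
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(8, 8))
spins = checkerboard_sweep(spins, T=2.5, rng=rng)
```

Because same-colour sites are never neighbours, the simultaneous flips within one colour pass are exactly equivalent to sequential updates, which is what makes the partition safe on SIMD hardware.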

  14. Monte Carlo simulation for theoretical calculations of damage and sputtering processes

    International Nuclear Information System (INIS)

    Yamamura, Yasunori

    1984-01-01

    The radiation damage accompanying ion irradiation, and the various problems associated with it, should in principle be determined by solving Boltzmann's equations. In reality, however, these equations cannot generally be solved for a semi-infinite system, and crystal effects, oblique incidence, and so on make the situation still more difficult. The analysis of the complicated collision phenomena in solids, and of the radiation damage and sputtering that accompany them, is in most cases possible only by computer simulation. At present, the methods of simulating atomic collision phenomena in solids are roughly classified into the molecular dynamics method and the Monte Carlo method. In molecular dynamics, Newton's equations are integrated numerically in time as they stand; this has the great merit that many-body and nonlinear effects can be taken into consideration, but much computing time is required. The features and problems of Monte Carlo simulation and nonlinear Monte Carlo simulation are described. The Monte Carlo simulation codes based on the two-body collision approximation, MARLOWE, TRIM and ACAT, were compared through calculation of the backscattering spectra of light ions. (Kako, I.)

  15. A computationally efficient simulator for three-dimensional Monte Carlo simulation of ion implantation into complex structures

    International Nuclear Information System (INIS)

    Li Di; Wang Geng; Chen Yang; Li Lin; Shrivastav, Gaurav; Oak, Stimit; Tasch, Al; Banerjee, Sanjay; Obradovic, Borna

    2001-01-01

    A physically-based three-dimensional Monte Carlo simulator has been developed within UT-MARLOWE, which is capable of simulating ion implantation into multi-material systems and arbitrary topography. Introducing the third dimension can result in a severe CPU time penalty. In order to minimize this penalty, a three-dimensional trajectory replication algorithm has been developed, implemented and verified. Savings in CPU time of more than two orders of magnitude have been observed. An unbalanced Octree structure was used to decompose three-dimensional structures. It effectively simplifies the structure, offers a good balance between modeling accuracy and computational efficiency, and allows the Octree to be mapped onto the desired structure with arbitrary precision. Using the well-established and validated physical models in UT-MARLOWE 5.0, this simulator has been extensively verified by comparing the integrated one-dimensional simulation results with secondary ion mass spectroscopy (SIMS). Two options were selected to simulate ion implantation into poly-silicon with this simulator: the typical case of implantation into a random, amorphous network, and the worst-case channeling condition of implantation into (1 1 0) oriented wafers

  16. Monte Carlo simulation of experiments

    International Nuclear Information System (INIS)

    Opat, G.I.

    1977-07-01

    An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the program SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ⁺ → e⁺ νₑ ν̄ γ and π⁺ → e⁺ νₑ γ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)

  17. Monte Carlo simulations of adult and pediatric computed tomography exams: Validation studies of organ doses with physical phantoms

    International Nuclear Information System (INIS)

    Long, Daniel J.; Lee, Choonsik; Tien, Christopher; Fisher, Ryan; Hoerner, Matthew R.; Hintenlang, David; Bolch, Wesley E.

    2013-01-01

    Purpose: To validate the accuracy of a Monte Carlo source model of the Siemens SOMATOM Sensation 16 CT scanner using organ doses measured in physical anthropomorphic phantoms. Methods: The x-ray output of the Siemens SOMATOM Sensation 16 multidetector CT scanner was simulated within the Monte Carlo radiation transport code, MCNPX version 2.6. The resulting source model was able to perform various simulated axial and helical computed tomographic (CT) scans of varying scan parameters, including beam energy, filtration, pitch, and beam collimation. Two custom-built anthropomorphic phantoms were used to take dose measurements on the CT scanner: an adult male and a 9-month-old. The adult male is a physical replica of the University of Florida reference adult male hybrid computational phantom, while the 9-month-old is a replica of the University of Florida Series B 9-month-old voxel computational phantom. Each phantom underwent a series of axial and helical CT scans, during which organ doses were measured using fiber-optic coupled plastic scintillator dosimeters developed at the University of Florida. The physical setup was reproduced and simulated in MCNPX using the CT source model and the computational phantoms upon which the anthropomorphic phantoms were constructed. Average organ doses were then calculated based upon these MCNPX results. Results: For all CT scans, good agreement was seen between measured and simulated organ doses. For the adult male, the percent differences were within 16% for axial scans, and within 18% for helical scans. For the 9-month-old, the percent differences were all within 15% for both the axial and helical scans. These results are comparable to previously published validation studies using GE scanners and commercially available anthropomorphic phantoms.
Conclusions: Overall results of this study show that the Monte Carlo source model can be used to accurately and reliably calculate organ doses for patients undergoing a variety of axial or helical CT

  18. Monte Carlo and analytic simulations in nanoparticle-enhanced radiation therapy

    Directory of Open Access Journals (Sweden)

    Paro AD

    2016-09-01

    Full Text Available Autumn D Paro,1 Mainul Hossain,2 Thomas J Webster,1,3,4 Ming Su1,4 1Department of Chemical Engineering, Northeastern University, Boston, MA, USA; 2NanoScience Technology Center and School of Electrical Engineering and Computer Science, University of Central Florida, Orlando, Florida, USA; 3Excellence for Advanced Materials Research, King Abdulaziz University, Jeddah, Saudi Arabia; 4Wenzhou Institute of Biomaterials and Engineering, Chinese Academy of Science, Wenzhou Medical University, Zhejiang, People’s Republic of China Abstract: Analytical and Monte Carlo simulations have been used to predict dose enhancement factors in nanoparticle-enhanced X-ray radiation therapy. Both simulations predict an increase in dose enhancement in the presence of nanoparticles, but the two methods predict different levels of enhancement over the studied energy range, nanoparticle materials, and concentration regimes, for several reasons. The Monte Carlo simulation calculates energy deposited by electrons and photons, while the analytical one only calculates energy deposited by source photons and photoelectrons; the Monte Carlo simulation accounts for electron–hole recombination, while the analytical one does not; and the Monte Carlo simulation randomly samples photon or electron paths and accounts for particle interactions, while the analytical simulation assumes a linear trajectory. This study demonstrates that the Monte Carlo simulation is the better choice for evaluating dose enhancement with nanoparticles in radiation therapy. Keywords: nanoparticle, dose enhancement, Monte Carlo simulation, analytical simulation, radiation therapy, tumor cell, X-ray

  19. Monte Carlo methods of PageRank computation

    NARCIS (Netherlands)

    Litvak, Nelli

    2004-01-01

    We describe and analyze an on-line Monte Carlo method of PageRank computation. The PageRank is estimated based on the results of a large number of short, independent simulation runs initiated from each page that contains outgoing hyperlinks. The method does not require any storage of the hyperlink
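One common form of such an estimator counts where short random walks terminate. The sketch below is an illustrative reconstruction of that idea, assuming a damping factor c = 0.85 and a toy three-page web (both illustrative choices, not details from this record):

```python
import random

def mc_pagerank(links, c=0.85, runs_per_page=200, seed=0):
    """Estimate PageRank by short independent random walks (end-point method).

    From every page we start `runs_per_page` walks; each walk follows a random
    outgoing link with probability c and terminates with probability 1 - c.
    The frequency with which walks end at a page estimates its PageRank.
    """
    rng = random.Random(seed)
    pages = list(links)
    ends = {p: 0 for p in pages}
    total = 0
    for start in pages:
        for _ in range(runs_per_page):
            node = start
            while rng.random() < c:
                out = links[node]
                if not out:              # dangling node: jump uniformly at random
                    node = rng.choice(pages)
                else:
                    node = rng.choice(out)
            ends[node] += 1
            total += 1
    return {p: ends[p] / total for p in pages}

# Tiny 3-page web: 'a' is linked by both other pages, 'c' by none,
# so 'a' should far outrank 'c'.
web = {'a': ['b'], 'b': ['a'], 'c': ['a']}
pr = mc_pagerank(web)
```

Because the walks are short and independent, the estimate can be refreshed continuously as the hyperlink structure changes, which is the "on-line" appeal of the approach.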

  20. Monte Carlo simulation of VHTR particle fuel with chord length sampling

    International Nuclear Information System (INIS)

    Ji, W.; Martin, W. R.

    2007-01-01

    The Very High Temperature Gas-Cooled Reactor (VHTR) poses a problem for neutronic analysis due to the double heterogeneity of the particle fuel and either the fuel compacts, in the case of the prismatic block reactor, or the fuel pebbles, in the case of the pebble bed reactor. Direct Monte Carlo simulation has been used in recent years to analyze these VHTR configurations, but it is computationally challenged when space-dependent phenomena such as depletion or temperature feedback are considered. As an alternative approach, we have considered chord length sampling to reduce the computational burden of the Monte Carlo simulation. We have improved on an existing method called 'limited chord length sampling' and have used it to analyze stochastic media representative of either pebble bed or prismatic VHTR fuel geometries. Based on the assumption that the PDF has an exponential form, a theoretical chord length distribution is derived and shown to be an excellent model for a wide range of packing fractions. This chord length PDF was then used to analyze a stochastic medium that was constructed using the RSA (Random Sequential Addition) algorithm, and the results were compared to a benchmark Monte Carlo simulation of the actual stochastic geometry. The results are promising and suggest that the theoretical chord length PDF can be used instead of a full Monte Carlo random walk simulation in the stochastic medium, saving orders of magnitude in computational time (and memory demand) to perform the simulation. (authors)
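An exponential chord length PDF of the kind assumed above, f(s) = (1/ℓ) exp(−s/ℓ), can be sampled by inverse transform. The sketch below is illustrative Python with an arbitrary mean chord length ℓ, checking the sample mean against the nominal value:

```python
import math
import random

def sample_chords(mean_chord, n=100_000, seed=42):
    """Draw chord lengths from the exponential PDF f(s) = (1/l) exp(-s/l).

    In chord length sampling, the distance to the next fuel-particle surface
    is drawn from such a PDF instead of ray-tracing the explicit stochastic
    geometry, with l set by the particle size and packing fraction.
    """
    rng = random.Random(seed)
    # Inverse-transform sampling: s = -l * ln(1 - u) for u uniform on [0, 1)
    return [-mean_chord * math.log(1.0 - rng.random()) for _ in range(n)]

chords = sample_chords(mean_chord=0.2)
sample_mean = sum(chords) / len(chords)
```

Replacing explicit surface tracking with this one-line draw is what yields the orders-of-magnitude savings the abstract reports: the stochastic geometry never needs to be stored or searched.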

  1. Parallel Algorithms for Monte Carlo Particle Transport Simulation on Exascale Computing Architectures

    Science.gov (United States)

    Romano, Paul Kollath

    Monte Carlo particle transport methods are being considered as a viable option for high-fidelity simulation of nuclear reactors. While Monte Carlo methods offer several potential advantages over deterministic methods, there are a number of algorithmic shortcomings that would prevent their immediate adoption for full-core analyses. In this thesis, algorithms are proposed both to ameliorate the degradation in parallel efficiency typically observed for large numbers of processors and to offer a means of decomposing large tally data that will be needed for reactor analysis. A nearest-neighbor fission bank algorithm was proposed and subsequently implemented in the OpenMC Monte Carlo code. A theoretical analysis of the communication pattern shows that the expected cost is O(√N), whereas traditional fission bank algorithms are O(N) at best. The algorithm was tested on two supercomputers, the Intrepid Blue Gene/P and the Titan Cray XK7, and demonstrated nearly linear parallel scaling up to 163,840 processor cores on a full-core benchmark problem. An algorithm for reducing network communication arising from tally reduction was analyzed and implemented in OpenMC. The proposed algorithm groups only particle histories on a single processor into batches for tally purposes; in doing so, it prevents all network communication for tallies until the very end of the simulation. The algorithm was tested, again on a full-core benchmark, and shown to reduce network communication substantially. A model was developed to predict the impact of load imbalances on the performance of domain decomposed simulations. The analysis demonstrated that load imbalances in domain decomposed simulations arise from two distinct phenomena: non-uniform particle densities and non-uniform spatial leakage. The dominant performance penalty for domain decomposition was shown to come from these physical effects rather than insufficient network bandwidth or high latency. The model predictions were verified with
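The history-batching idea for tallies can be sketched in serial form: only per-batch means are kept, and the standard error of the tally is estimated from their spread, so no per-history reduction is needed until the end of the run. This toy Python version uses random exponential "scores" in place of real particle tallies (the distribution and counts are illustrative, not from the thesis):

```python
import random
import statistics

def batched_tally(histories=10_000, batches=50, seed=7):
    """Group particle histories into batches and tally per-batch means.

    Scoring only batch means (rather than reducing every history's tally
    across the network) defers all tally communication to the end of the
    simulation. Each 'history' here is just a toy random score with mean 1.
    """
    rng = random.Random(seed)
    per_batch = histories // batches
    batch_means = []
    for _ in range(batches):
        scores = [rng.expovariate(1.0) for _ in range(per_batch)]  # toy tally scores
        batch_means.append(sum(scores) / per_batch)
    mean = statistics.fmean(batch_means)
    # Standard error estimated from the spread of independent batch means
    sem = statistics.stdev(batch_means) / len(batch_means) ** 0.5
    return mean, sem

mean, sem = batched_tally()
```

Treating batch means as independent samples is what makes the variance estimate valid without any cross-processor bookkeeping during the run.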

  2. Hybrid Multilevel Monte Carlo Simulation of Stochastic Reaction Networks

    KAUST Repository

    Moraes, Alvaro

    2015-01-01

    even more, we want to achieve this objective with near optimal computational work. We first introduce a hybrid path-simulation scheme based on the well-known stochastic simulation algorithm (SSA) [3] and the tau-leap method [2]. Then, we introduce a Multilevel Monte Carlo strategy that allows us to achieve a computational complexity of order O(TOL⁻²); this is the same computational complexity as in an exact method, but with a smaller constant. We provide numerical examples to show our results.
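The multilevel idea can be sketched in its simplest two-level form: many cheap coarse samples plus a few samples of the coupled fine-minus-coarse correction, whose variance is small. The models below are toy stand-ins chosen for illustration; the actual method in this record couples exact (SSA) and approximate (tau-leap) path simulations of a reaction network:

```python
import random

def mlmc_two_level(n_coarse=20_000, n_fine=500, seed=3):
    """Two-level Monte Carlo sketch for estimating E[fine(U)], U uniform on [0,1).

    Level 0 uses many cheap coarse samples; the correction level uses few
    samples of the (fine - coarse) difference, which has small variance
    because the two models share the same random input.
    """
    rng = random.Random(seed)

    def coarse(u):   # cheap, biased approximation
        return u * u

    def fine(u):     # accurate model, assumed expensive
        return u * u + 0.1 * u

    # Level 0: plain Monte Carlo with the coarse model
    level0 = sum(coarse(rng.random()) for _ in range(n_coarse)) / n_coarse
    # Level 1: correction E[fine - coarse], sampled with coupled inputs
    corr = 0.0
    for _ in range(n_fine):
        u = rng.random()
        corr += fine(u) - coarse(u)
    corr /= n_fine
    return level0 + corr

est = mlmc_two_level()  # true value is 1/3 + 0.05 ~= 0.3833
```

Because Var(fine − coarse) is far smaller than Var(fine), the expensive level needs only a handful of samples, which is where the complexity gain comes from.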

  3. Computation cluster for Monte Carlo calculations

    Energy Technology Data Exchange (ETDEWEB)

    Petriska, M.; Vitazek, K.; Farkas, G.; Stacho, M.; Michalek, S. [Dep. Of Nuclear Physics and Technology, Faculty of Electrical Engineering and Information, Technology, Slovak Technical University, Ilkovicova 3, 81219 Bratislava (Slovakia)

    2010-07-01

    Two computation clusters based on the Rocks Clusters 5.1 Linux distribution, with Intel Core Duo and Intel Core Quad based computers, were built at the Department of Nuclear Physics and Technology. The clusters were used for Monte Carlo calculations, specifically for MCNP calculations applied in nuclear reactor core simulations. Optimization for computation speed was performed on both a hardware and a software basis. Hardware cluster parameters, such as memory size, network speed, CPU speed, number of processors per computation, and number of processors per computer, were tested for shortening the calculation time. For software optimization, different Fortran compilers, MPI implementations and CPU multi-core libraries were tested. Finally, the computer cluster was used in finding the weighting functions of the neutron ex-core detectors of a VVER-440. (authors)

  4. Computation cluster for Monte Carlo calculations

    International Nuclear Information System (INIS)

    Petriska, M.; Vitazek, K.; Farkas, G.; Stacho, M.; Michalek, S.

    2010-01-01

    Two computation clusters based on the Rocks Clusters 5.1 Linux distribution, with Intel Core Duo and Intel Core Quad based computers, were built at the Department of Nuclear Physics and Technology. The clusters were used for Monte Carlo calculations, specifically for MCNP calculations applied in nuclear reactor core simulations. Optimization for computation speed was performed on both a hardware and a software basis. Hardware cluster parameters, such as memory size, network speed, CPU speed, number of processors per computation, and number of processors per computer, were tested for shortening the calculation time. For software optimization, different Fortran compilers, MPI implementations and CPU multi-core libraries were tested. Finally, the computer cluster was used in finding the weighting functions of the neutron ex-core detectors of a VVER-440. (authors)

  5. Monte Carlo simulation of hybrid systems: An example

    International Nuclear Information System (INIS)

    Bacha, F.; D'Alencon, H.; Grivelet, J.; Jullien, E.; Jejcic, A.; Maillard, J.; Silva, J.; Zukanovich, R.; Vergnes, J.

    1997-01-01

    Simulation of hybrid systems requires tracking of particles from the GeV range (incident proton beam) down to a fraction of an eV (thermal neutrons). We show how a GEANT-based Monte Carlo program can achieve this within realistic computing time, together with accompanying tools. An example of a dedicated original actinide burner is simulated with this chain. 8 refs., 5 figs

  6. Octree indexing of DICOM images for voxel number reduction and improvement of Monte Carlo simulation computing efficiency

    International Nuclear Information System (INIS)

    Hubert-Tremblay, Vincent; Archambault, Louis; Tubic, Dragan; Roy, Rene; Beaulieu, Luc

    2006-01-01

    The purpose of the present study is to introduce a compression algorithm for the CT (computed tomography) data used in Monte Carlo simulations. Performing simulations on CT data implies large computational costs as well as large memory requirements, since the number of voxels in such data typically reaches into the hundreds of millions. CT data, however, contain homogeneous regions which can be regrouped to form larger voxels without affecting the simulation's accuracy. Exploiting this property, we propose a compression algorithm based on octrees: in homogeneous regions the algorithm replaces groups of voxels with a smaller number of larger voxels. This reduces the number of voxels while keeping the critical high-density-gradient areas. Results obtained using the present algorithm on both phantom and clinical data show that compression rates up to 75% are possible without losing the dosimetric accuracy of the simulation
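The merging step can be illustrated in two dimensions with a quadtree, the 2D analogue of the octree scheme above (an illustrative sketch; the grid, tolerance, and power-of-two sizing are assumptions for simplicity):

```python
def compress_quadtree(grid, tol=0.0):
    """Recursively merge homogeneous square regions of a 2D density grid.

    A region whose values all agree within `tol` becomes a single leaf
    (one merged voxel); otherwise it is split into four quadrants.
    Returns the number of leaves, i.e. voxels remaining after merging.
    The grid is assumed square with power-of-two side length.
    """
    def leaves(x0, y0, size):
        vals = [grid[y][x] for y in range(y0, y0 + size)
                           for x in range(x0, x0 + size)]
        if size == 1 or max(vals) - min(vals) <= tol:
            return 1                  # homogeneous: one merged voxel
        h = size // 2                 # heterogeneous: recurse on quadrants
        return (leaves(x0, y0, h) + leaves(x0 + h, y0, h)
                + leaves(x0, y0 + h, h) + leaves(x0 + h, y0 + h, h))

    return leaves(0, 0, len(grid))

# 8x8 grid, uniform except one dense corner voxel: 64 voxels collapse to 10
grid = [[1.0] * 8 for _ in range(8)]
grid[0][0] = 2.0
n_voxels = compress_quadtree(grid)
```

Only the subtree containing the density gradient stays refined; everywhere else the voxels collapse, which is exactly the behaviour that preserves dosimetric accuracy in the high-gradient region.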

  7. Reactor physics simulations with coupled Monte Carlo calculation and computational fluid dynamics

    International Nuclear Information System (INIS)

    Seker, V.; Thomas, J.W.; Downar, T.J.

    2007-01-01

    A computational code system based on coupling the Monte Carlo code MCNP5 and the Computational Fluid Dynamics (CFD) code STAR-CD was developed as an audit tool for lower order nuclear reactor calculations. This paper presents the methodology of the developed computer program 'McSTAR'. McSTAR is written in the FORTRAN90 programming language and couples MCNP5 and the commercial CFD code STAR-CD. MCNP uses a continuous energy cross section library produced by the NJOY code system from the raw ENDF/B data. A major part of the work was to develop and implement methods to update the cross section library with the temperature distribution calculated by STAR-CD for every region. Three different methods were investigated and implemented in McSTAR. The user subroutines in STAR-CD are modified to read the power density data and assign them to the appropriate variables in the program, and to write an output data file containing the temperature, density and indexing information to perform the mapping between MCNP and STAR-CD cells. Preliminary testing of the code was performed using a 3x3 PWR pin-cell problem. The preliminary results are compared with those obtained from a STAR-CD coupled calculation with the deterministic transport code DeCART. Good agreement in the k_eff and the power profile was observed. Increased computational capabilities and improvements in computational methods have accelerated interest in high fidelity modeling of nuclear reactor cores during the last several years. High fidelity has been achieved by utilizing full core neutron transport solutions for the neutronics calculation and computational fluid dynamics solutions for the thermal-hydraulics calculation. Previous researchers have reported the coupling of 3D deterministic neutron transport methods to CFD and their application to practical reactor analysis problems.
One of the principal motivations of the work here was to utilize Monte Carlo methods to validate the coupled deterministic neutron transport

  8. A computer code package for Monte Carlo photon-electron transport simulation: Comparisons with experimental benchmarks

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M.

    2000-01-01

    A computer code package (PTSIM) for particle transport Monte Carlo simulation was developed using object oriented techniques of design and programming. A flexible system for simulation of coupled photon-electron transport, facilitating the development of efficient simulation applications, was obtained. For photons: Compton and photo-electric effects, pair production and Rayleigh interactions are simulated, while for electrons, a class II condensed history scheme was considered, in which catastrophic interactions (Moeller electron-electron interaction, bremsstrahlung, etc.) are treated in detail and all other interactions with reduced individual effect on electron history are grouped together using continuous slowing down approximation and energy straggling theories. Electron angular straggling is simulated using Moliere theory or a mixed model in which scatters at large angles are treated as distinct events. Comparisons with experimental benchmarks for electron transmission, for bremsstrahlung emission energy and angular spectra, and for dose calculations are presented

  9. Monte Carlo simulation in nuclear medicine

    International Nuclear Information System (INIS)

    Morel, Ch.

    2007-01-01

    The Monte Carlo method allows random processes to be simulated by using series of pseudo-random numbers. It has become an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions into data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will make it possible to tackle imaging and dosimetry issues simultaneously, and Monte Carlo simulations may soon become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)
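Among the sampling methods such a review covers, rejection sampling is one of the most basic: propose uniformly, then accept with probability proportional to the target PDF. A minimal sketch (the triangular target PDF is an illustrative choice, not from this record):

```python
import random

def rejection_sample(pdf, pdf_max, lo, hi, n, seed=11):
    """Rejection sampling: propose x uniformly on [lo, hi] and accept it
    with probability pdf(x) / pdf_max, repeating until n samples are kept."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = lo + (hi - lo) * rng.random()
        if rng.random() * pdf_max <= pdf(x):
            out.append(x)
    return out

# Example: sample the triangular PDF p(x) = 2x on [0, 1]; its mean is 2/3
samples = rejection_sample(lambda x: 2.0 * x, 2.0, 0.0, 1.0, 50_000)
mean = sum(samples) / len(samples)
```

The method needs only pointwise evaluation of the PDF, which is why Monte Carlo codes use it for cross sections and emission spectra that have no closed-form inverse CDF.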

  10. Estimation of computed tomography dose index in cone beam computed tomography: MOSFET measurements and Monte Carlo simulations.

    Science.gov (United States)

    Kim, Sangroh; Yoshizumi, Terry; Toncheva, Greta; Yoo, Sua; Yin, Fang-Fang; Frush, Donald

    2010-05-01

    To address the lack of an accurate dose estimation method in cone beam computed tomography (CBCT), we performed point dose metal oxide semiconductor field-effect transistor (MOSFET) measurements and Monte Carlo (MC) simulations. A Varian On-Board Imager (OBI) was employed to measure point doses in polymethyl methacrylate (PMMA) CT phantoms with MOSFETs for standard and low dose modes. An MC model of the OBI x-ray tube was developed using the BEAMnrc/EGSnrc MC system and validated against the half value layer, the x-ray spectrum, and lateral and depth dose profiles. We compared the weighted computed tomography dose index (CTDIw) between MOSFET measurements and MC simulations. The CTDIw was found to be 8.39 cGy for the head scan and 4.58 cGy for the body scan from the MOSFET measurements in standard dose mode, and 1.89 cGy for the head and 1.11 cGy for the body in low dose mode, respectively. The CTDIw values from MC agreed with the MOSFET measurements to within 5%. In conclusion, an MC model for Varian CBCT has been established, and this approach may be easily extended from the CBCT geometry to multi-detector CT geometry.
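For reference, the weighted CT dose index combines center and periphery phantom point doses as CTDIw = (1/3)·CTDI_center + (2/3)·CTDI_periphery, where the periphery term is the mean over the peripheral measurement positions. A small helper illustrating the formula (the numeric values below are hypothetical, not the paper's measurements):

```python
def ctdi_w(center_dose, periphery_doses):
    """Weighted CT dose index from phantom point doses (same units in, out):
    CTDIw = (1/3) * CTDI_center + (2/3) * mean(CTDI_periphery)."""
    periphery = sum(periphery_doses) / len(periphery_doses)
    return center_dose / 3.0 + 2.0 * periphery / 3.0

# Hypothetical head-phantom point doses in cGy: one center, four periphery holes
w = ctdi_w(7.8, [8.5, 8.7, 8.6, 8.8])
```

The 1/3 : 2/3 weighting approximates the area-weighted average dose over the phantom cross section, which is why it is the standard figure of merit compared between the MOSFET measurements and the MC model.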

  11. Applications of the Monte Carlo simulation in dosimetry and medical physics problems

    International Nuclear Information System (INIS)

    Rojas C, E. L.

    2010-01-01

    At the present time, the use of computers to solve important problems extends to all areas, whether social, economic, engineering, or basic and applied science. With appropriate handling of computer programs and data, calculations and simulations of real models can be carried out in order to study them and to solve theoretical or applied problems. Processes that involve random variables are amenable to the Monte Carlo method. This is a numerical method which, thanks to improvements in computer processors, can now be applied to far more tasks than in the early days of its practical application (the beginning of the 1950s). In this work, the Monte Carlo method is applied to the simulation of the interaction of radiation with matter, in order to investigate dosimetric aspects of some problems in the medical physics area. An introduction reviewing some historical background and general concepts related to Monte Carlo simulation is also included. (Author)

  12. On the inclusion of macroscopic theory in Monte Carlo simulation using game theory

    International Nuclear Information System (INIS)

    Tatarkiewicz, J.

    1980-01-01

    This paper presents the inclusion of macroscopic damage theory into Monte Carlo particle-range simulation using game theory. A new computer code called RADDI was developed on the basis of this inclusion. Results of Monte Carlo damage simulation after 6.3 MeV proton bombardment of silicon are compared with experimental data of Bulgakov et al. (orig.)

  13. Coupling photon Monte Carlo simulation and CAD software. Application to X-ray nondestructive evaluation

    International Nuclear Information System (INIS)

    Tabary, J.; Gliere, A.

    2001-01-01

    A Monte Carlo radiation transport simulation program, EGS Nova, and a computer aided design software package, BRL-CAD, have been coupled within the framework of Sindbad, a nondestructive evaluation (NDE) simulation system. In its current status, the program is very valuable in an NDE laboratory context, as it helps simulate the images due to the uncollided and scattered photon fluxes in a single NDE software environment, without having to switch to a separate Monte Carlo code and parameter set. Numerical validations show good agreement with EGS4-computed and published data. As the program's major drawback is the execution time, computational efficiency improvements are foreseen. (orig.)

  14. Prediction of beam hardening artefacts in computed tomography using Monte Carlo simulations

    DEFF Research Database (Denmark)

    Thomsen, M.; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær

    2015-01-01

    We show how radiological images of both single and multi material samples can be simulated using the Monte Carlo simulation tool McXtrace and how these images can be used to make a three dimensional reconstruction. Good numerical agreement between the X-ray attenuation coefficient in experimental...

  15. Massive Parallelism of Monte-Carlo Simulation on Low-End Hardware using Graphic Processing Units

    International Nuclear Information System (INIS)

    Mburu, Joe Mwangi; Hah, Chang Joo Hah

    2014-01-01

    Within the past decade, research has been done on utilizing massive GPU parallelization in core simulation with impressive results, but unfortunately not much commercial application has been made in the nuclear field, especially in reactor core simulation. The purpose of this paper is to give an introductory concept on the topic and to illustrate the potential of exploiting the massively parallel nature of GPU computing on a simple Monte Carlo simulation with very minimal hardware specifications. To perform a comparative analysis, a simple two-dimensional Monte Carlo simulation is implemented for both the CPU and the GPU in order to evaluate the performance gain of each computing device. The heterogeneous platform utilized in this analysis is a low-end notebook with only a 1 GHz processor. The end results are quite surprising: the speedups obtained are almost a factor of 10. In this work, we have utilized heterogeneous computing in a GPU-based approach to a potentially arithmetic-intensive calculation. By running a complex Monte Carlo simulation on the GPU platform, we have sped up the computational process by almost a factor of 10 based on one million neutrons. This shows how easy, cheap and efficient it is to use GPUs to accelerate scientific computing, and the results should encourage further exploration of this avenue, especially in nuclear reactor physics simulation, where both deterministic and stochastic calculations lend themselves well to parallelization.

  16. Massive Parallelism of Monte-Carlo Simulation on Low-End Hardware using Graphic Processing Units

    Energy Technology Data Exchange (ETDEWEB)

    Mburu, Joe Mwangi; Hah, Chang Joo Hah [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-05-15

    Within the past decade, research on exploiting massive GPU parallelization in core simulation has produced impressive results, but unfortunately little commercial application has followed in the nuclear field, especially in reactor core simulation. The purpose of this paper is to give an introductory treatment of the topic and to illustrate the potential of exploiting the massively parallel nature of GPU computing on a simple Monte Carlo simulation with very minimal hardware specifications. For a comparative analysis, a simple two-dimensional Monte Carlo simulation is implemented for both the CPU and the GPU in order to evaluate the performance gain of each computing device. The heterogeneous platform used in this analysis is a slow notebook with only a 1 GHz processor. The end results are quite surprising: the speedups obtained approach a factor of 10. In this work, we have applied heterogeneous, GPU-based computing to a potentially arithmetic-intensive calculation. By running the Monte Carlo simulation of one million neutrons on the GPU platform, we sped up the computation by almost a factor of 10. This shows how easy, cheap and efficient it is to use GPUs to accelerate scientific computing, and the results should encourage further exploration of this avenue, especially in nuclear reactor physics, where both deterministic and stochastic calculations lend themselves well to parallelization.
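
    As a rough illustration of why Monte Carlo particle transport parallelizes so well, note that each neutron history is statistically independent, so histories can be processed in bulk (on a GPU, typically one thread per history). The sketch below is not the paper's code; it uses NumPy vectorization as a CPU stand-in for that bulk processing, with an invented one-dimensional slab model (thickness, cross section, and absorption probability are arbitrary):

```python
import numpy as np

def transmission(n_histories, thickness=2.0, sigma_t=1.0, p_absorb=0.3, seed=None):
    """Estimate slab transmission from a batch of independent neutron histories.

    Every history evolves independently, which is exactly the property that
    maps this style of simulation onto one-thread-per-history GPU kernels.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(n_histories)                  # positions in the slab
    mu = np.ones(n_histories)                  # flight directions (+1 = inward)
    alive = np.ones(n_histories, dtype=bool)
    transmitted = np.zeros(n_histories, dtype=bool)
    while alive.any():
        # exponential free-flight lengths for all surviving histories at once
        step = rng.exponential(1.0 / sigma_t, size=alive.sum())
        x[alive] += mu[alive] * step
        # leakage through the far face (transmission) or back out the front
        out_far = alive & (x > thickness)
        out_front = alive & (x < 0.0)
        transmitted |= out_far
        alive &= ~(out_far | out_front)
        # collision inside the slab: absorb with fixed probability...
        absorbed = alive & (rng.random(n_histories) < p_absorb)
        alive &= ~absorbed
        # ...otherwise scatter isotropically (1-D: new direction is +1 or -1)
        mu[alive] = rng.choice([-1.0, 1.0], size=alive.sum())
    return transmitted.mean()

print(transmission(100_000, seed=42))
```

    The whole population advances one flight per loop iteration; on a GPU the same structure becomes a kernel launched over the history index.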

  17. Computer simulation of stochastic processes through model-sampling (Monte Carlo) techniques.

    Science.gov (United States)

    Sheppard, C W.

    1969-03-01

    A simple Monte Carlo simulation program is outlined which can be used for the investigation of random-walk problems, for example in diffusion, or the movement of tracers in the blood circulation. The results given by the simulation are compared with those predicted by well-established theory, and it is shown how the model can be expanded to deal with drift, and with reflexion from or adsorption at a boundary.
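
    A random-walk simulation of the kind described can be sketched in a few lines; the drift probability and barrier position below are arbitrary illustrative values, and the theoretical mean displacement n(2p - 1) provides the kind of check against established theory that the abstract mentions:

```python
import random

def random_walk(steps, p_right=0.6, barrier=None, seed=None):
    """1-D random walk with drift; optional reflecting barrier at x = barrier."""
    rng = random.Random(seed)
    x = 0
    for _ in range(steps):
        x += 1 if rng.random() < p_right else -1
        if barrier is not None and x > barrier:
            x = 2 * barrier - x        # reflect the walker back off the boundary
    return x

# Free walk: the mean displacement after n steps should be n * (2p - 1).
n, trials = 200, 4000
mean_x = sum(random_walk(n, seed=i) for i in range(trials)) / trials
print(mean_x, n * (2 * 0.6 - 1))       # sample mean vs. theoretical drift
```

    Absorption at a boundary would be handled analogously by terminating the walk when the barrier is reached.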

  18. Propagation of uncertainty by Monte Carlo simulations in case of basic geodetic computations

    Directory of Open Access Journals (Sweden)

    Wyszkowska Patrycja

    2017-12-01

    Full Text Available The determination of the accuracy of functions of measured or adjusted values may be a problem in geodetic computations. The general law of covariance propagation or, in the case of uncorrelated observations, the propagation of variance (the Gaussian formula) is commonly used for that purpose. That approach is theoretically justified for linear functions. For non-linear functions, the first-order Taylor series expansion is usually used, but that solution is affected by the expansion error. The aim of the study is to determine the applicability of the general variance propagation law to the non-linear functions used in basic geodetic computations. The paper presents the errors which result from neglecting the higher-order terms and determines the range of validity of that simplification. The analysis rests on a comparison of the results obtained by the law of propagation of variance and by a probabilistic approach, namely Monte Carlo simulations. Both methods are used to determine the accuracy of the following geodetic computations: the Cartesian coordinates of an unknown point in the three-point resection problem, azimuths and distances derived from Cartesian coordinates, and height differences in trigonometric and geometric levelling. The simulations and the analysis of their results confirm that the general law of variance propagation can be applied in basic geodetic computations even when the functions are non-linear, provided that the accuracy of the observations is not too low. Generally, this is not a problem with present geodetic instruments.

  19. Propagation of uncertainty by Monte Carlo simulations in case of basic geodetic computations

    Science.gov (United States)

    Wyszkowska, Patrycja

    2017-12-01

    The determination of the accuracy of functions of measured or adjusted values may be a problem in geodetic computations. The general law of covariance propagation or, in the case of uncorrelated observations, the propagation of variance (the Gaussian formula) is commonly used for that purpose. That approach is theoretically justified for linear functions. For non-linear functions, the first-order Taylor series expansion is usually used, but that solution is affected by the expansion error. The aim of the study is to determine the applicability of the general variance propagation law to the non-linear functions used in basic geodetic computations. The paper presents the errors which result from neglecting the higher-order terms and determines the range of validity of that simplification. The analysis rests on a comparison of the results obtained by the law of propagation of variance and by a probabilistic approach, namely Monte Carlo simulations. Both methods are used to determine the accuracy of the following geodetic computations: the Cartesian coordinates of an unknown point in the three-point resection problem, azimuths and distances derived from Cartesian coordinates, and height differences in trigonometric and geometric levelling. The simulations and the analysis of their results confirm that the general law of variance propagation can be applied in basic geodetic computations even when the functions are non-linear, provided that the accuracy of the observations is not too low. Generally, this is not a problem with present geodetic instruments.
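
    The comparison the paper describes can be sketched for one simple non-linear function, the distance computed from Cartesian coordinates. The coordinates and standard deviation below are arbitrary illustrative values, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Distance between two points computed from noisy Cartesian coordinates.
x1, y1, x2, y2 = 100.0, 200.0, 400.0, 600.0   # assumed "true" coordinates [m]
sigma = 0.01                                   # std dev of each coordinate [m]

# First-order (linearized) propagation of variance: for
# d = sqrt((x2-x1)^2 + (y2-y1)^2) with four independent, equally accurate
# coordinates, the partial derivatives give sigma_d = sigma * sqrt(2).
sigma_lin = sigma * np.sqrt(2.0)

# Monte Carlo propagation: perturb the coordinates and recompute d directly.
n = 200_000
X1 = x1 + sigma * rng.standard_normal(n)
Y1 = y1 + sigma * rng.standard_normal(n)
X2 = x2 + sigma * rng.standard_normal(n)
Y2 = y2 + sigma * rng.standard_normal(n)
d_mc = np.hypot(X2 - X1, Y2 - Y1)

print(sigma_lin, d_mc.std())   # the two uncertainty estimates agree closely
```

    With observation noise this small relative to the distance, the linearization error is negligible, which is the paper's conclusion for well-measured geodetic networks; inflating sigma makes the two estimates drift apart.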

  20. PENELOPE, an algorithm and computer code for Monte Carlo simulation of electron-photon showers

    Energy Technology Data Exchange (ETDEWEB)

    Salvat, F.; Fernandez-Varea, J.M.; Baro, J.; Sempau, J.

    1996-10-01

    The FORTRAN 77 subroutine package PENELOPE performs Monte Carlo simulation of electron-photon showers in arbitrary materials for a wide energy range, from about 1 keV to several hundred MeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A simple geometry package permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the simulation package, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm.

  1. PENELOPE, an algorithm and computer code for Monte Carlo simulation of electron-photon showers

    Energy Technology Data Exchange (ETDEWEB)

    Salvat, F; Fernandez-Varea, J M; Baro, J; Sempau, J

    1996-07-01

    The FORTRAN 77 subroutine package PENELOPE performs Monte Carlo simulation of electron-photon showers in arbitrary materials for a wide energy range, from 1 keV to several hundred MeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A simple geometry package permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the simulation package, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm. (Author) 108 refs.

  2. Monte Carlo simulation of AB-copolymers with saturating bonds

    DEFF Research Database (Denmark)

    Chertovich, A.C.; Ivanov, V.A.; Khokhlov, A.R.

    2003-01-01

    Structural transitions in a single AB-copolymer chain where saturating bonds can be formed between A- and B-units are studied by means of Monte Carlo computer simulations using the bond fluctuation model. Three transitions are found, coil-globule, coil-hairpin and globule-hairpin, depending...

  3. Patient-specific scatter correction in clinical cone beam computed tomography imaging made possible by the combination of Monte Carlo simulations and a ray tracing algorithm

    DEFF Research Database (Denmark)

    Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto

    2013-01-01

    Abstract Purpose. Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability of predicting the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from...

  4. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    Energy Technology Data Exchange (ETDEWEB)

    Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and ICTEAM Institute, Université catholique de Louvain, Louvain-la-Neuve 1348 (Belgium)]; Sterpin, Edmond [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and Department of Oncology, Katholieke Universiteit Leuven, O&N I Herestraat 49, 3000 Leuven (Belgium)]

    2016-04-15

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suited to MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%/1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables accurate MC calculation within a computation time adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  5. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    International Nuclear Information System (INIS)

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-01-01

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suited to MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%/1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables accurate MC calculation within a computation time adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  6. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures.

    Science.gov (United States)

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-04-01

    Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suited to MC methods. The class-II condensed history algorithm of MCsquare provides a fast yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with GATE/GEANT4 for various geometries show deviations within 2%/1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables accurate MC calculation within a computation time adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  7. Monte Carlo simulation of electrothermal atomization on a desktop personal computer

    Science.gov (United States)

    Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.

    1996-07-01

    Monte Carlo simulations have been applied to electrothermal atomization (ETA) in a tubular atomizer (e.g. a graphite furnace) because of the complexity of the geometry, heating, molecular interactions, etc. The intense computation time needed to model ETA accurately often limited its effective implementation to supercomputers. With the advent of more powerful desktop processors, however, this is no longer the case. A C-based program has been developed that runs under Windows or DOS. With this program, basic parameters such as furnace dimensions, sample placement and furnace heating, as well as kinetic parameters such as activation energies for desorption and adsorption, can be varied to show how the absorbance profile depends on them. Even data such as the time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to allow comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files and text files.

  8. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches...

  9. Exploring Monte Carlo Simulation Strategies for Geoscience Applications

    Science.gov (United States)

    Blais, J.; Grebenitcharsky, R.; Zhang, Z.

    2008-12-01

    Computer simulations are an increasingly important area of geoscience research and development. At the core of stochastic or Monte Carlo simulations are the random number sequences that are assumed to be distributed with specific characteristics. Computer generated random numbers, uniformly distributed on [0, 1], can be very different depending on the selection of pseudo-random number (PRN), quasi-random number (QRN) or chaotic random number (CRN) generators. In the evaluation of some definite integrals, the expected error variances are generally of different orders for the same number of random numbers. A comparative analysis of these three strategies has been carried out for geodetic and related applications in planar and spherical contexts. Based on these computational experiments, conclusions and recommendations concerning their performance and error variances are included.
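
    A minimal version of such a comparison, using a Halton-type sequence as the quasi-random generator and the separable test integrand x*y (whose integral over the unit square is exactly 1/4). This is an illustrative sketch, not the authors' experiment:

```python
import numpy as np

def halton(n, base):
    """First n points of the van der Corput sequence in the given base."""
    seq = np.zeros(n)
    for i in range(n):
        f, r, k = 1.0, 0.0, i + 1
        while k > 0:
            f /= base
            r += f * (k % base)     # next digit of k in this base, scaled
            k //= base
        seq[i] = r
    return seq

def estimate(points):
    x, y = points
    return np.mean(x * y)           # MC estimate of the integral of x*y on [0,1]^2

n = 4096
rng = np.random.default_rng(0)
prn = (rng.random(n), rng.random(n))       # pseudo-random (PRN) points
qrn = (halton(n, 2), halton(n, 3))         # quasi-random (QRN) Halton points

exact = 0.25
print("PRN error:", abs(estimate(prn) - exact))
print("QRN error:", abs(estimate(qrn) - exact))
```

    The PRN error decreases as O(1/sqrt(n)), whereas the low-discrepancy QRN points typically achieve a much smaller error for the same n, which is the kind of error-variance difference the abstract refers to.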

  10. Monte Carlo simulations on a 9-node PC cluster

    International Nuclear Information System (INIS)

    Gouriou, J.

    2001-01-01

    Monte Carlo simulation methods are frequently used in the fields of medical physics, dosimetry and metrology of ionising radiation. Nevertheless, the main drawback of this technique is that it is computationally slow, because the statistical uncertainty of the result decreases only as the inverse square root of the computation time. We present a method which reduces the effective running time by a factor of 10 to 20. In practice, the aim was to reduce the calculation time of the LNHB metrological applications from several weeks to a few days. The approach uses a PC cluster running the Linux operating system and the PVM parallel library (version 3.4). The Monte Carlo codes EGS4, MCNP and PENELOPE have been implemented on this platform, the latter two adapted to run under the PVM environment. The maximum observed speedup ranges from a factor of 13 to 18, depending on the code and the problem to be simulated. (orig.)
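
    The farming strategy behind such a cluster port is to give each node an independently seeded batch of histories and merge the tallies at the end. A sequential stand-in might look like the sketch below (hypothetical batch sizes; each batch_pi call would run on a separate PVM worker node):

```python
import random

def batch_pi(n, seed):
    """One worker's job: count dart hits inside the unit quarter circle,
    using its own independent RNG stream."""
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

# On a PVM/MPI cluster each batch runs on a different node; here we simply
# run the independently seeded batches in sequence and merge the tallies.
n_per_batch, batches = 50_000, 8
hits = sum(batch_pi(n_per_batch, seed) for seed in range(batches))
pi_hat = 4.0 * hits / (n_per_batch * batches)
print(pi_hat)
```

    Because the batches are statistically independent, the merged estimate is identical in distribution to a single long run, which is why near-linear speedups (13 to 18 on the codes above) are achievable.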

  11. Monte Carlo simulation for soot dynamics

    KAUST Repository

    Zhou, Kun

    2012-01-01

    A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.

  12. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is a long computation time. This paper describes a platform-independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied by an on-the-fly generated, cluster-specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to their higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.

  13. Suppression of the initial transient in Monte Carlo criticality simulations

    Energy Technology Data Exchange (ETDEWEB)

    Richet, Y

    2006-12-15

    Criticality Monte Carlo calculations aim at estimating the effective multiplication factor (k-effective) of a fissile system through iterations simulating neutron propagation (forming a Markov chain). An arbitrary initialization of the neutron population can strongly bias the k-effective estimate, defined as the mean of the k-effective values computed at each iteration. A simplified model of this cycle k-effective sequence is built, based on the characteristics of industrial criticality Monte Carlo calculations. Statistical tests, inspired by Brownian bridge properties, are designed to assess the stationarity of the cycle k-effective sequence. The detected initial transient is then discarded in order to improve the estimate of the system k-effective. The different versions of this methodology are detailed and compared, first on a set of numerical tests modelled on criticality Monte Carlo calculations, and second on real criticality calculations. Finally, the best-performing methodologies observed in these tests are selected and shown to improve industrial Monte Carlo criticality calculations. (author)
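
    A toy illustration of why discarding the initial transient matters: below, a synthetic cycle k-effective sequence carries an exponentially decaying initialization bias on top of stationary noise. All constants are invented for the sketch and are not taken from the thesis, and the fixed 100-cycle cutoff stands in for the statistical stationarity tests it develops:

```python
import math
import random

random.seed(7)
k_true, bias, tau, noise = 1.000, 0.05, 30.0, 0.003
cycles = 500

# Synthetic cycle k-effective sequence: the poor initial source guess decays
# away with time constant tau, leaving stationary fluctuations about k_true.
k = [k_true + bias * math.exp(-i / tau) + random.gauss(0.0, noise)
     for i in range(cycles)]

naive = sum(k) / len(k)          # mean over all cycles: biased by the transient
settled = k[100:]                # crude suppression: drop the first 100 cycles
trimmed = sum(settled) / len(settled)
print(naive - k_true, trimmed - k_true)
```

    The untrimmed mean keeps a bias of roughly bias*tau/cycles, while the trimmed mean is limited only by the stationary statistical noise; an adaptive test replaces the hard-coded cutoff in practice.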

  14. Temporal acceleration of spatially distributed kinetic Monte Carlo simulations

    International Nuclear Information System (INIS)

    Chatterjee, Abhijit; Vlachos, Dionisios G.

    2006-01-01

    The computational intensity of kinetic Monte Carlo (KMC) simulation is a major impediment to simulating large length and time scales. In recent work, an approximate method for KMC simulation of spatially uniform systems, termed the binomial τ-leap method, was introduced [A. Chatterjee, D.G. Vlachos, M.A. Katsoulakis, Binomial distribution based τ-leap accelerated stochastic simulation, J. Chem. Phys. 122 (2005) 024112], in which bundles of molecular events instead of individual processes are executed over coarse-grained time increments. This temporal coarse-graining can lead to significant computational savings, but its generalization to spatial lattice KMC simulation had not yet been realized. Here we extend the binomial τ-leap method to lattice KMC simulations by combining it with spatially adaptive coarse-graining. Absolute stability and computational speed-up analyses for spatial systems, along with simulations, provide insight into the conditions under which accuracy and substantial acceleration of the new spatio-temporal coarse-graining method are ensured. Model systems demonstrate that the r-time increment criterion of Chatterjee et al. obeys the absolute stability limit for values of r up to near 1.
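
    The binomial τ-leap idea can be illustrated on the simplest possible system, first-order decay A → 0: rather than simulating decay events one at a time, the number of firings within each leap is drawn as a single binomial bundle. This is a generic sketch of the spatially uniform method, not the authors' lattice extension, and the rate constant, leap size, and population are arbitrary:

```python
import math
import random

rng = random.Random(0)

def tau_leap_decay(n0, c, t_end, tau):
    """Binomial tau-leap simulation of first-order decay A -> 0 at rate c.

    The number of decays in each leap of length tau is drawn as one
    Binomial(n, p) bundle instead of n0 individual event executions.
    """
    n, t = n0, 0.0
    p = 1.0 - math.exp(-c * tau)   # per-molecule decay probability within tau
    while t < t_end and n > 0:
        fired = sum(rng.random() < p for _ in range(n))  # Binomial(n, p) draw
        n -= fired
        t += tau
    return n

# The average over runs should track the deterministic decay n0 * exp(-c*t).
n0, c, t_end = 1000, 0.5, 2.0
runs = [tau_leap_decay(n0, c, t_end, tau=0.05) for _ in range(100)]
mean_n = sum(runs) / len(runs)
print(mean_n, n0 * math.exp(-c * t_end))
```

    Using a binomial (rather than Poisson) bundle guarantees the population can never go negative, which is the stability property the τ-leap criterion protects.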

  15. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z [Reading Hospital, West Reading, PA (United States); Gao, M [ProCure Treatment Centers, Warrenville, IL (United States)

    2014-06-01

    Purpose: Monte Carlo simulation plays an important role in the proton pencil beam scanning (PBS) technique. However, MC simulation demands high computing power and is limited to the few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing for the MC simulation of PBS beams. Methods: A GATE/GEANT4-based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single-spot integral depth dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the StarCluster software developed at MIT, a Linux cluster with 2-100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud, where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm^2, 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. The EC2 instance type m1.medium was selected considering the CPU/memory requirements, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and the worker nodes as spot instances. The hourly cost for the 40-node cluster was $0.63, and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour, and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform for running proton PBS MC simulations. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.

  16. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Wang, Z; Gao, M

    2014-01-01

    Purpose: Monte Carlo simulation plays an important role in the proton pencil beam scanning (PBS) technique. However, MC simulation demands high computing power and is limited to the few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing for the MC simulation of PBS beams. Methods: A GATE/GEANT4-based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single-spot integral depth dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the StarCluster software developed at MIT, a Linux cluster with 2-100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud, where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm^2, 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. The EC2 instance type m1.medium was selected considering the CPU/memory requirements, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and the worker nodes as spot instances. The hourly cost for the 40-node cluster was $0.63, and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour, and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform for running proton PBS MC simulations. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.

  17. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis of data obtained by computer simulation of the neutron transport process using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been performed. The influence of the physical dimensions of the materials and of the sample size on the reliability of the results was investigated. The objective was to optimize the sample size in order to obtain reliable results while keeping the computation time low. (author). 5 refs, 8 figs
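
    The trade-off behind such a sample-size optimization is the square-root law of Monte Carlo convergence: halving the statistical uncertainty costs four times the histories. The integrand below is an arbitrary test case chosen for the sketch, not one of the paper's transport problems:

```python
import math
import random

def mc_integral(n, seed):
    """Estimate the integral of sin(x) on [0, pi] (exact value: 2)."""
    rng = random.Random(seed)
    total = sum(math.sin(rng.uniform(0.0, math.pi)) for _ in range(n))
    return math.pi * total / n

# The spread of repeated estimates shrinks as 1/sqrt(n): a 100x larger
# sample reduces the standard deviation by about a factor of 10.
sds = {}
for n in (100, 10_000):
    estimates = [mc_integral(n, seed) for seed in range(100)]
    mean = sum(estimates) / len(estimates)
    sds[n] = math.sqrt(sum((e - mean) ** 2 for e in estimates) / len(estimates))
    print(n, round(sds[n], 4))
```

    Picking the smallest n whose predicted standard deviation meets the required reliability is exactly the optimization the abstract describes.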

  18. Systematic uncertainties on Monte Carlo simulation of lead based ADS

    International Nuclear Information System (INIS)

    Embid, M.; Fernandez, R.; Garcia-Sanz, J.M.; Gonzalez, E.

    1999-01-01

    Computer simulations of the neutronic behaviour of ADS systems foreseen for actinide and fission product transmutation are affected by many sources of systematic uncertainties, both from the nuclear data and by the methodology selected when applying the codes. Several actual ADS Monte Carlo simulations are presented, comparing different options both for the data and for the methodology, evaluating the relevance of the different uncertainties. (author)

  19. Topics in computer simulations of statistical systems

    International Nuclear Information System (INIS)

    Salvador, R.S.

    1987-01-01

    Several computer simulations studying a variety of topics in statistical mechanics and lattice gauge theories are performed. The first study describes a Monte Carlo simulation performed on Ising systems defined on Sierpinski carpets of dimensions between one and four. The critical coupling and the exponent γ are measured as a function of dimension. The Ising gauge theory in d = 4 − ε, for ε → 0⁺, is then studied by performing a Monte Carlo simulation for the theory defined on fractals. A high-statistics Monte Carlo simulation for the three-dimensional Ising model is presented for lattices of sizes 8³ to 44³. All the data obtained agree completely, within statistical errors, with the forms predicted by finite-size scaling. Finally, a method to estimate numerically the partition function of statistical systems is developed.
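    A minimal Metropolis Monte Carlo for an Ising system, of the kind used throughout these studies, fits in a few lines. The sketch below uses a small 2-D lattice for brevity (the work above treated fractal and 3-D lattices) and returns the magnetization per spin:

```python
import math
import random

def metropolis_ising(L=8, beta=0.3, sweeps=200, seed=1):
    """Metropolis Monte Carlo for a 2-D Ising model (J = 1) with periodic
    boundaries; returns |magnetization| per spin after the given sweeps."""
    rng = random.Random(seed)
    spin = [[1] * L for _ in range(L)]          # start fully ordered
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nn = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
              + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
        dE = 2 * spin[i][j] * nn                # energy change of a flip
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spin[i][j] = -spin[i][j]            # accept the flip
    m = abs(sum(sum(row) for row in spin)) / (L * L)
    return m
```

    Below the critical temperature (beta above ~0.44 in 2-D) the magnetization stays near 1; at high temperature it collapses toward 0, which is the qualitative behavior finite-size scaling quantifies.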

  20. Computer simulation of HTGR fuel microspheres using a Monte-Carlo statistical approach

    International Nuclear Information System (INIS)

    Hedrick, C.E.

    1976-01-01

    The concept and computational aspects of a Monte-Carlo statistical approach in relating structure of HTGR fuel microspheres to the uranium content of fuel samples have been verified. Results of the preliminary validation tests and the benefits to be derived from the program are summarized

  1. Applications of the Monte Carlo simulation in dosimetry and medical physics problems; Aplicaciones de la simulacion Monte Carlo en dosimetria y problemas de fisica medica

    Energy Technology Data Exchange (ETDEWEB)

    Rojas C, E. L., E-mail: leticia.rojas@inin.gob.m [ININ, Gerencia de Ciencias Ambientales, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2010-07-01

    At present, the use of computers to solve important problems extends to all areas, whether social, economic, engineering, or basic and applied science. With appropriate handling of computer programs and data, calculations and simulations of real models can be carried out in order to study them and to solve theoretical or applied problems. Processes that involve random variables are amenable to the Monte Carlo method. This is a numerical method that, thanks to improvements in computer processors, can now be applied to many more tasks than in the early days of its practical application (the beginning of the 1950s). In this work the Monte Carlo method is applied to the simulation of the interaction of radiation with matter, in order to investigate dosimetric aspects of some problems in the area of medical physics. An introduction reviewing some historical background and general concepts related to Monte Carlo simulation is also included. (Author)

  2. Data decomposition of Monte Carlo particle transport simulations via tally servers

    International Nuclear Information System (INIS)

    Romano, Paul K.; Siegel, Andrew R.; Forget, Benoit; Smith, Kord

    2013-01-01

    An algorithm for decomposing large tally data in Monte Carlo particle transport simulations is developed, analyzed, and implemented in a continuous-energy Monte Carlo code, OpenMC. The algorithm is based on a non-overlapping decomposition of compute nodes into tracking processors and tally servers. The former are used to simulate the movement of particles through the domain while the latter continuously receive and update tally data. A performance model for this approach is developed, suggesting that, for a range of parameters relevant to LWR analysis, the tally server algorithm should perform with minimal overhead on contemporary supercomputers. An implementation of the algorithm in OpenMC is then tested on the Intrepid and Titan supercomputers, supporting the key predictions of the model over a wide range of parameters. We thus conclude that the tally server algorithm is a successful approach to circumventing classical on-node memory constraints en route to unprecedentedly detailed Monte Carlo reactor simulations

  3. Application of Macro Response Monte Carlo method for electron spectrum simulation

    International Nuclear Information System (INIS)

    Perles, L.A.; Almeida, A. de

    2007-01-01

    In recent years, several variance reduction techniques for Monte Carlo electron transport have been developed in order to reduce the electron transport computation time for absorbed dose distributions. We have implemented the Macro Response Monte Carlo (MRMC) method to evaluate the electron spectrum, which can be used as a phase-space input for other simulation programs. This technique uses probability distributions for electron histories previously simulated in spheres (called kugels). These probabilities are used to sample the final state of the primary electron, as well as the creation of secondary electrons and photons. We have compared the MRMC electron spectra simulated in a homogeneous phantom against Geant4 spectra. The results showed agreement better than 6% in the spectral peak energies, and the MRMC code is up to 12 times faster than Geant4 simulations.

  4. Fast Monte Carlo-simulator with full collimator and detector response modelling for SPECT

    International Nuclear Information System (INIS)

    Sohlberg, A.O.; Kajaste, M.T.

    2012-01-01

    Monte Carlo (MC) simulations have proved to be a valuable tool in studying single photon emission computed tomography (SPECT) reconstruction algorithms. Despite their popularity, the use of Monte Carlo simulations is still often limited by their large computation demand. This is especially true in situations where full collimator and detector modelling with septal penetration, scatter and X-ray fluorescence needs to be included. This paper presents a rapid and simple MC simulator, which can effectively reduce the computation times. The simulator was built on the convolution-based forced detection principle, which can markedly lower the number of simulated photons. Full collimator and detector response look-up tables are pre-simulated and then later used in the actual MC simulations to model the system response. The developed simulator was validated by comparing it against ¹²³I point source measurements made with a clinical gamma camera system and against ⁹⁹ᵐTc software phantom simulations made with the SIMIND MC package. The results showed good agreement between the new simulator, the measurements and the SIMIND package. The new simulator provided near noise-free projection data in approximately 1.5 min per projection with ⁹⁹ᵐTc, which was less than one-tenth of SIMIND's time. The developed MC simulator can markedly decrease the simulation time without sacrificing image quality. (author)

  5. 11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing

    CERN Document Server

    Nuyens, Dirk

    2016-01-01

    This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.

  6. Understanding quantum tunneling using diffusion Monte Carlo simulations

    Science.gov (United States)

    Inack, E. M.; Giudici, G.; Parolini, T.; Santoro, G.; Pilati, S.

    2018-03-01

    In simple ferromagnetic quantum Ising models characterized by an effective double-well energy landscape the characteristic tunneling time of path-integral Monte Carlo (PIMC) simulations has been shown to scale as the incoherent quantum-tunneling time, i.e., as 1/Δ², where Δ is the tunneling gap. Since incoherent quantum tunneling is employed by quantum annealers (QAs) to solve optimization problems, this result suggests that there is no quantum advantage in using QAs with respect to quantum Monte Carlo (QMC) simulations. A counterexample is the recently introduced shamrock model (Andriyash and Amin, arXiv:1703.09277), where topological obstructions cause an exponential slowdown of the PIMC tunneling dynamics with respect to incoherent quantum tunneling, leaving open the possibility for potential quantum speedup, even for stoquastic models. In this work we investigate the tunneling time of projective QMC simulations based on the diffusion Monte Carlo (DMC) algorithm without guiding functions, showing that it scales as 1/Δ, i.e., even more favorably than the incoherent quantum-tunneling time, both in a simple ferromagnetic system and in the more challenging shamrock model. However, a careful comparison between the DMC ground-state energies and the exact solution available for the transverse-field Ising chain indicates an exponential scaling of the computational cost required to keep a fixed relative error as the system size increases.
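    The projective QMC method discussed here, diffusion Monte Carlo without a guiding function, can be illustrated on a textbook problem. The sketch below (an independent toy example, not the authors' code) applies bare DMC with branching and population control to a 1-D harmonic oscillator, whose exact ground-state energy is 0.5 in natural units:

```python
import math
import random

def dmc_harmonic(n_target=500, dt=0.01, steps=2000, seed=2):
    """Bare diffusion Monte Carlo (no guiding function) for a 1-D harmonic
    oscillator, V(x) = x^2 / 2.  Walkers diffuse freely, branch according
    to exp(-dt (V - E_ref)), and E_ref is adjusted to hold the population
    near n_target; its late-time average estimates the ground-state energy."""
    rng = random.Random(seed)
    walkers = [0.0] * n_target
    e_ref, e_sum, n_e = 0.5, 0.0, 0
    for step in range(steps):
        new = []
        for x in walkers:
            x += rng.gauss(0.0, math.sqrt(dt))            # free diffusion
            w = math.exp(-dt * (0.5 * x * x - e_ref))     # branching weight
            for _ in range(int(w + rng.random())):        # stochastic replication
                new.append(x)
        walkers = new or [0.0]
        e_ref += 0.1 * math.log(n_target / len(walkers))  # population control
        if step >= steps // 2:                            # discard equilibration
            e_sum += e_ref
            n_e += 1
    return e_sum / n_e
```

    Without a guiding function the walker weights fluctuate strongly, which is one face of the exponential cost scaling the abstract reports for fixed relative error.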

  7. Mosaic crystal algorithm for Monte Carlo simulations

    CERN Document Server

    Seeger, P A

    2002-01-01

    An algorithm is presented for calculating reflectivity, absorption, and scattering of mosaic crystals in Monte Carlo simulations of neutron instruments. The algorithm uses multi-step transport through the crystal with an exact solution of the Darwin equations at each step. It relies on the kinematical model for Bragg reflection (with parameters adjusted to reproduce experimental data). For computation of thermal effects (the Debye-Waller factor and coherent inelastic scattering), an expansion of the Debye integral as a rapidly converging series of exponential terms is also presented. Any crystal geometry and plane orientation may be treated. The algorithm has been incorporated into the neutron instrument simulation package NISP. (orig.)

  8. Development of 1-year-old computational phantom and calculation of organ doses during CT scans using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Pan, Yuxi; Qiu, Rui; Ge, Chaoyong; Xie, Wenzhang; Li, Junli; Gao, Linfeng; Zheng, Junzheng

    2014-01-01

    With the rapidly growing number of CT examinations, the consequential radiation risk has aroused more and more attention. The average dose in each organ during CT scans can only be obtained by using Monte Carlo simulation with computational phantoms. Since children tend to have higher radiation sensitivity than adults, the radiation dose of pediatric CT examinations requires special attention and needs to be assessed accurately. So far, studies on organ doses from CT exposures for pediatric patients are still limited. In this work, a 1-year-old computational phantom was constructed. The body contour was obtained from the CT images of a 1-year-old physical phantom and the internal organs were deformed from an existing Chinese reference adult phantom. To ensure the organ locations in the 1-year-old computational phantom were consistent with those of the physical phantom, the organ locations in 1-year-old computational phantom were manually adjusted one by one, and the organ masses were adjusted to the corresponding Chinese reference values. Moreover, a CT scanner model was developed using the Monte Carlo technique and the 1-year-old computational phantom was applied to estimate organ doses derived from simulated CT exposures. As a result, a database including doses to 36 organs and tissues from 47 single axial scans was built. It has been verified by calculation that doses of axial scans are close to those of helical scans; therefore, this database could be applied to helical scans as well. Organ doses were calculated using the database and compared with those obtained from the measurements made in the physical phantom for helical scans. The differences between simulation and measurement were less than 25% for all organs. The result shows that the 1-year-old phantom developed in this work can be used to calculate organ doses in CT exposures, and the dose database provides a method for the estimation of 1-year-old patient doses in a variety of CT examinations. (paper)

  9. Investigation of attenuation correction in SPECT using textural features, Monte Carlo simulations, and computational anthropomorphic models.

    Science.gov (United States)

    Spirou, Spiridon V; Papadimitroulas, Panagiotis; Liakou, Paraskevi; Georgoulias, Panagiotis; Loudos, George

    2015-09-01

    To present and evaluate a new methodology to investigate the effect of attenuation correction (AC) in single-photon emission computed tomography (SPECT) using textural features analysis, Monte Carlo techniques, and a computational anthropomorphic model. The GATE Monte Carlo toolkit was used to simulate SPECT experiments using the XCAT computational anthropomorphic model, filled with a realistic biodistribution of (99m)Tc-N-DBODC. The simulated gamma camera was the Siemens ECAM Dual-Head, equipped with a parallel hole lead collimator, with an image resolution of 3.54 × 3.54 mm(2). Thirty-six equispaced camera positions, spanning a full 360° arc, were simulated. Projections were calculated after applying a ± 20% energy window or after eliminating all scattered photons. The activity of the radioisotope was reconstructed using the MLEM algorithm. Photon attenuation was accounted for by calculating the radiological pathlength in a perpendicular line from the center of each voxel to the gamma camera. Twenty-two textural features were calculated on each slice, with and without AC, using 16 and 64 gray levels. A mask was used to identify only those pixels that belonged to each organ. Twelve of the 22 features showed almost no dependence on AC, irrespective of the organ involved. In both the heart and the liver, the mean and SD were the features most affected by AC. In the liver, six features were affected by AC only on some slices. Depending on the slice, skewness decreased by 22-34% with AC, kurtosis by 35-50%, long-run emphasis mean by 71-91%, and long-run emphasis range by 62-95%. In contrast, gray-level non-uniformity mean increased by 78-218% compared with the value without AC and run percentage mean by 51-159%. These results were not affected by the number of gray levels (16 vs. 64) or the data used for reconstruction: with the energy window or without scattered photons. The mean and SD were the main features affected by AC. In the heart, no other feature was

  10. Efficient Monte Carlo Simulations of Gas Molecules Inside Porous Materials.

    Science.gov (United States)

    Kim, Jihan; Smit, Berend

    2012-07-10

    Monte Carlo (MC) simulations are commonly used to obtain adsorption properties of gas molecules inside porous materials. In this work, we discuss various optimization strategies that lead to faster MC simulations with CO2 gas molecules inside host zeolite structures used as a test system. The reciprocal space contribution of the gas-gas Ewald summation and both the direct and the reciprocal gas-host potential energy interactions are stored inside energy grids to reduce the wall time in the MC simulations. Additional speedup can be obtained by selectively calling the routine that computes the gas-gas Ewald summation, which does not impact the accuracy of the zeolite's adsorption characteristics. We utilize a two-level density-biased sampling technique in the grand canonical Monte Carlo (GCMC) algorithm to restrict CO2 insertion moves into low-energy regions within the zeolite materials to accelerate convergence. Finally, we make use of graphics processing unit (GPU) hardware to conduct multiple MC simulations in parallel by judiciously mapping the GPU threads to the available workload. As a result, we can obtain a CO2 adsorption isotherm curve with 14 pressure values (up to 10 atm) for a zeolite structure within a minute of total compute wall time.
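    The core of a grand canonical Monte Carlo run is the insertion/deletion acceptance rule. The sketch below strips away the paper's optimizations (energy grids, density-biased sampling, GPUs) and the interactions altogether, leaving bare GCMC for an ideal gas, where the average particle number must converge to activity × volume:

```python
import random

def gcmc_ideal_gas(activity=2.0, volume=10.0, steps=200000, seed=3):
    """Grand canonical Monte Carlo for an ideal (non-interacting) gas with
    insertion/deletion moves only.  With no interaction energy, the
    acceptance probabilities reduce to zV/(N+1) for insertion and N/(zV)
    for deletion, and <N> converges to zV."""
    rng = random.Random(seed)
    n, total, count = 0, 0, 0
    for _ in range(steps):
        if rng.random() < 0.5:                            # attempt insertion
            if rng.random() < activity * volume / (n + 1):
                n += 1
        elif n > 0:                                       # attempt deletion
            if rng.random() < n / (activity * volume):
                n -= 1
        total += n
        count += 1
    return total / count
```

    In a real adsorption calculation the acceptance ratios also carry Boltzmann factors of the insertion/deletion energy, and it is precisely those energy evaluations that the paper's grids and biased sampling accelerate.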

  11. Simulation of Satellite, Airborne and Terrestrial LiDAR with DART (I): Waveform Simulation with Quasi-Monte Carlo Ray Tracing

    Science.gov (United States)

    Gastellu-Etchegorry, Jean-Philippe; Yin, Tiangang; Lauret, Nicolas; Grau, Eloi; Rubio, Jeremy; Cook, Bruce D.; Morton, Douglas C.; Sun, Guoqing

    2016-01-01

    Light Detection And Ranging (LiDAR) provides unique data on the 3-D structure of atmosphere constituents and the Earth's surface. Simulating LiDAR returns for different laser technologies and Earth scenes is fundamental for evaluating and interpreting signal and noise in LiDAR data. Different types of models are capable of simulating LiDAR waveforms of Earth surfaces. Semi-empirical and geometric models can be imprecise because they rely on simplified simulations of Earth surfaces and light interaction mechanisms. On the other hand, Monte Carlo ray tracing (MCRT) models are potentially accurate but require long computational time. Here, we present a new LiDAR waveform simulation tool that is based on the introduction of a quasi-Monte Carlo ray tracing approach in the Discrete Anisotropic Radiative Transfer (DART) model. Two new approaches, the so-called "box method" and "Ray Carlo method", are implemented to provide robust and accurate simulations of LiDAR waveforms for any landscape, atmosphere and LiDAR sensor configuration (view direction, footprint size, pulse characteristics, etc.). The box method accelerates the selection of the scattering direction of a photon in the presence of scatterers with non-invertible phase function. The Ray Carlo method brings traditional ray-tracking into MCRT simulation, which makes computational time independent of LiDAR field of view (FOV) and reception solid angle. Both methods are fast enough for simulating multi-pulse acquisition. Sensitivity studies with various landscapes and atmosphere constituents are presented, and the simulated LiDAR signals compare favorably with their associated reflectance images and Laser Vegetation Imaging Sensor (LVIS) waveforms. The LiDAR module is fully integrated into DART, enabling more detailed simulations of LiDAR sensitivity to specific scene elements (e.g., atmospheric aerosols, leaf area, branches, or topography) and sensor configuration for airborne or satellite LiDAR sensors.

  12. Odd-flavor Simulations by the Hybrid Monte Carlo

    CERN Document Server

    Takaishi, Tetsuya; Takaishi, Tetsuya; De Forcrand, Philippe

    2001-01-01

    The standard hybrid Monte Carlo algorithm is known to simulate only even numbers of quark flavors in QCD. Simulations with odd numbers of flavors, however, can also be performed in the framework of the hybrid Monte Carlo algorithm when the inverse of the fermion matrix is approximated by a polynomial. In this exploratory study we perform three-flavor QCD simulations. We compare the hybrid Monte Carlo algorithm with the R-algorithm, which also simulates odd-flavor systems but has step-size errors. We find that results from our hybrid Monte Carlo algorithm are in agreement with those from the R-algorithm obtained at very small step size.
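    The essential ingredients of hybrid Monte Carlo, a leapfrog trajectory followed by a Metropolis accept/reject step, can be shown on a trivial target. The sketch below (an illustration, not a lattice QCD code) samples a standard normal; the accept/reject step is what makes HMC exact, in contrast to the step-size errors of the R-algorithm mentioned above:

```python
import math
import random

def hmc_gaussian(n_samples=5000, eps=0.2, n_leap=10, seed=4):
    """Hybrid (Hamiltonian) Monte Carlo sampling of a standard normal,
    U(x) = x^2 / 2.  The Metropolis test at the end of each leapfrog
    trajectory exactly corrects the finite-step-size integration error."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)                  # refresh the momentum
        x0 = x
        h0 = 0.5 * p * p + 0.5 * x * x           # initial Hamiltonian
        for _ in range(n_leap):                  # leapfrog integration
            p -= 0.5 * eps * x                   # half kick (dU/dx = x)
            x += eps * p                         # full drift
            p -= 0.5 * eps * x                   # half kick
        h1 = 0.5 * p * p + 0.5 * x * x
        if rng.random() >= math.exp(min(0.0, h0 - h1)):
            x = x0                               # reject: restore position
        samples.append(x)
    return samples
```

    Dropping the accept/reject test and extrapolating the step size to zero is, in miniature, what the R-algorithm does instead.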

  13. Kinetic Monte Carlo Simulation of Cation Diffusion in Low-K Ceramics

    Science.gov (United States)

    Good, Brian

    2013-01-01

    Low thermal conductivity (low-K) ceramic materials are of interest to the aerospace community for use as the thermal barrier component of coating systems for turbine engine components. In particular, zirconia-based materials exhibit both low thermal conductivity and structural stability at high temperature, making them suitable for such applications. Because creep is one of the potential failure modes, and because diffusion is a mechanism by which creep takes place, we have performed computer simulations of cation diffusion in a variety of zirconia-based low-K materials. The kinetic Monte Carlo simulation method is an alternative to the more widely known molecular dynamics (MD) method. It is designed to study "infrequent-event" processes, such as diffusion, for which MD simulation can be highly inefficient. We describe the results of kinetic Monte Carlo computer simulations of cation diffusion in several zirconia-based materials, specifically, zirconia doped with Y, Gd, Nb and Yb. Diffusion paths are identified, and migration energy barriers are obtained from density functional calculations and from the literature. We present results on the temperature dependence of the diffusivity, and on the effects of the presence of oxygen vacancies in cation diffusion barrier complexes as well.
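    The residence-time kinetic Monte Carlo loop for a diffusion problem is short: pick an event from a rate catalog, then advance the clock by an exponential waiting time. The sketch below is a 1-D toy, not the authors' code; the attempt frequency and barrier are illustrative placeholders, and it recovers the diffusivity via the Einstein relation:

```python
import math
import random

def kmc_diffusivity(barrier_ev=1.0, temp_k=1500.0, n_walkers=400, n_hops=500, seed=5):
    """Residence-time kinetic Monte Carlo for a cation hopping on a 1-D
    lattice.  Every step is a hop (equal rates r left and right) and the
    clock advances by an exponential waiting time with mean 1/(2r).
    Returns D from the Einstein relation D = <x^2> / (2 t)."""
    kb = 8.617e-5                 # Boltzmann constant, eV/K
    nu = 1.0e13                   # attempt frequency, 1/s (illustrative)
    a = 1.0                       # hop distance (arbitrary units)
    r = nu * math.exp(-barrier_ev / (kb * temp_k))    # hop rate one way
    rng = random.Random(seed)
    msd, t_tot = 0.0, 0.0
    for _ in range(n_walkers):
        x, t = 0.0, 0.0
        for _ in range(n_hops):
            x += a if rng.random() < 0.5 else -a                 # choose event
            t += -math.log(1.0 - rng.random()) / (2.0 * r)       # advance clock
        msd += x * x
        t_tot += t
    return (msd / n_walkers) / (2.0 * (t_tot / n_walkers))
```

    Because every step is an accepted event, KMC spends no time on failed attempts, which is why it handles infrequent-event processes that would stall a molecular dynamics run.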

  14. CTmod—A toolkit for Monte Carlo simulation of projections including scatter in computed tomography

    Czech Academy of Sciences Publication Activity Database

    Malušek, Alexandr; Sandborg, M.; Alm Carlsson, G.

    2008-01-01

    Roč. 90, č. 2 (2008), s. 167-178 ISSN 0169-2607 Institutional research plan: CEZ:AV0Z10480505 Keywords : Monte Carlo * computed tomography * cone beam * scatter Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.220, year: 2008 http://dx.doi.org/10.1016/j.cmpb.2007.12.005

  15. Oxygen transport properties estimation by classical trajectory–direct simulation Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Bruno, Domenico, E-mail: domenico.bruno@cnr.it [Istituto di Metodologie Inorganiche e dei Plasmi, Consiglio Nazionale delle Ricerche– Via G. Amendola 122, 70125 Bari (Italy); Frezzotti, Aldo, E-mail: aldo.frezzotti@polimi.it; Ghiroldi, Gian Pietro, E-mail: gpghiro@gmail.com [Dipartimento di Scienze e Tecnologie Aerospaziali, Politecnico di Milano–Via La Masa 34, 20156 Milano (Italy)

    2015-05-15

    Coupling direct simulation Monte Carlo (DSMC) simulations with classical trajectory calculations is a powerful tool to improve the predictive capabilities of computational dilute gas dynamics. The considerable increase in computational effort outlined in early applications of the method can be compensated by running simulations on massively parallel computers. In particular, Graphics Processing Unit acceleration has been found quite effective in reducing the computing time of classical trajectory (CT)-DSMC simulations. The aim of the present work is to study dilute molecular oxygen flows by modeling binary collisions, in the rigid rotor approximation, through an accurate Potential Energy Surface (PES) obtained from molecular beam scattering. The PES accuracy is assessed by calculating molecular oxygen transport properties by different equilibrium and non-equilibrium CT-DSMC based simulations that provide close values of the transport properties. Comparisons with available experimental data are presented and discussed in the temperature range 300–900 K, where vibrational degrees of freedom are expected to play a limited (but not always negligible) role.

  16. Development and applications of Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Y., E-mail: yican.wu@fds.org.cn [Inst. of Nuclear Energy Safety Technology, Hefei, Anhui (China)

    2015-07-01

    'Full text:' Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for integrated simulation of nuclear system by making use of hybrid MC-deterministic method and advanced computer technologies. The main usability features are automatic modeling of geometry and physics, visualization and virtual simulation and cloud computing service. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculation. SuperMC has been verified by more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of International Thermonuclear Experimental Reactor (ITER) and China Lead-based reactor (CLEAR). Development and applications of SuperMC are introduced in this presentation. (author)

  17. Development and applications of Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems

    International Nuclear Information System (INIS)

    Wu, Y.

    2015-01-01

    'Full text:' Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for integrated simulation of nuclear system by making use of hybrid MC-deterministic method and advanced computer technologies. The main usability features are automatic modeling of geometry and physics, visualization and virtual simulation and cloud computing service. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculation. SuperMC has been verified by more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of International Thermonuclear Experimental Reactor (ITER) and China Lead-based reactor (CLEAR). Development and applications of SuperMC are introduced in this presentation. (author)

  18. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands computing resources. ► The analytical method is limited to small perturbations on material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because the administrative margin of subcriticality has a substantial impact on the economics and safety of nuclear fuel cycle operations, growing interest in reducing it has made uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
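    The sampling-based method surveyed above can be illustrated on a toy model. The sketch below propagates an assumed 5% cross-section uncertainty through a one-group infinite-medium multiplication formula by brute-force resampling; all numbers are illustrative, not evaluated nuclear data:

```python
import random
import statistics

def k_uncertainty(n_samples=2000, seed=6):
    """Sampling-based uncertainty propagation for a toy infinite-medium
    multiplication factor k_inf = nu * sigma_f / (sigma_f + sigma_c).
    Each trial redraws the capture cross section from its assumed
    uncertainty distribution and recomputes k; the spread of the results
    estimates the k uncertainty (all values illustrative)."""
    rng = random.Random(seed)
    nu, sigma_f = 2.4, 1.0
    ks = []
    for _ in range(n_samples):
        sigma_c = rng.gauss(1.4, 0.07)   # assumed 5% one-sigma uncertainty
        ks.append(nu * sigma_f / (sigma_f + sigma_c))
    return statistics.mean(ks), statistics.stdev(ks)
```

    In a real analysis each sample would rerun a full Monte Carlo criticality calculation with perturbed data, which is exactly the computing-resource cost the highlights note; the analytical (perturbation-theory) alternative avoids the reruns but only holds for small perturbations.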

  19. PhyloSim - Monte Carlo simulation of sequence evolution in the R statistical computing environment

    Directory of Open Access Journals (Sweden)

    Massingham Tim

    2011-04-01

    Background: The Monte Carlo simulation of sequence evolution is routinely used to assess the performance of phylogenetic inference methods and sequence alignment algorithms. Progress in the field of molecular evolution fuels the need for more realistic and hence more complex simulations, adapted to particular situations, yet current software makes unreasonable assumptions such as homogeneous substitution dynamics or a uniform distribution of indels across the simulated sequences. This calls for an extensible simulation framework written in a high-level functional language, offering new functionality and making it easy to incorporate further complexity. Results: PhyloSim is an extensible framework for the Monte Carlo simulation of sequence evolution, written in R, using the Gillespie algorithm to integrate the actions of many concurrent processes such as substitutions, insertions and deletions. Uniquely among sequence simulation tools, PhyloSim can simulate arbitrarily complex patterns of rate variation and multiple indel processes, and allows for the incorporation of selective constraints on indel events. User-defined complex patterns of mutation and selection can be easily integrated into simulations, allowing PhyloSim to be adapted to specific needs. Conclusions: Close integration with R and the wide range of features implemented offer unmatched flexibility, making it possible to simulate sequence evolution under a wide range of realistic settings. We believe that PhyloSim will be useful to future studies involving simulated alignments.
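    The Gillespie algorithm that PhyloSim builds on reduces, for independent sites with equal rates, to drawing exponential waiting times and applying one substitution per event. A minimal Jukes-Cantor sketch (an illustration in Python rather than R, and not PhyloSim's code):

```python
import random

def gillespie_jc(seq, rate=1.0, t_max=0.5, seed=7):
    """Gillespie-style simulation of sequence evolution under a
    Jukes-Cantor model: the total event rate is rate * length, waiting
    times between substitutions are exponential, and each event changes
    one uniformly chosen site to one of the three other bases."""
    rng = random.Random(seed)
    seq = list(seq)
    total_rate = rate * len(seq)            # sites are independent and equal
    t = 0.0
    while True:
        t += rng.expovariate(total_rate)    # waiting time to the next event
        if t > t_max:
            return "".join(seq)
        i = rng.randrange(len(seq))         # pick the site uniformly
        seq[i] = rng.choice([b for b in "ACGT" if b != seq[i]])
```

    PhyloSim's generality comes from replacing this single uniform rate with per-site, per-process rates (substitutions, insertions, deletions, selective constraints) inside the same event loop.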

  20. Genetic algorithms and Monte Carlo simulation for optimal plant design

    International Nuclear Information System (INIS)

    Cantoni, M.; Marseguerra, M.; Zio, E.

    2000-01-01

    We present an approach to optimal plant design (choice of system layout and components) under conflicting safety and economic constraints, based on coupling a Monte Carlo evaluation of plant operation with a genetic algorithm maximization procedure. The Monte Carlo simulation model provides a flexible tool that describes relevant aspects of plant design and operation, such as standby modes and deteriorating repairs, not easily captured by analytical models. The effects of deteriorating repairs are described by means of a modified Brown-Proschan model of imperfect repair, which accounts for the possibility of an increased proneness to failure of a component after a repair. The transitions of a component from standby to active, and vice versa, are simulated using a multiplicative correlation model. The genetic algorithm is tasked with optimizing a profit function that accounts for the plant's safety and economic performance and that is evaluated, for each possible design, by the above Monte Carlo simulation. To avoid an overwhelming use of computer time, for each potential solution proposed by the genetic algorithm we perform only a few hundred Monte Carlo histories, and then exploit the fact that during the evolution of the genetic algorithm population the fit chromosomes appear repeatedly many times, so that the results for the solutions of interest (i.e. the best ones) attain statistical significance.
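    The coupling described above, a genetic algorithm whose fitness is a cheap, noisy Monte Carlo estimate, can be sketched generically. The toy below (not the authors' plant model) maximizes a noisy bit-counting fitness; as in the paper, good chromosomes recur across generations and are re-scored many times, so their noisy evaluations average out:

```python
import random

def ga_optimize(fitness, n_bits=10, pop_size=20, generations=40, seed=8):
    """Minimal genetic algorithm: truncation selection, one-point
    crossover, occasional bit-flip mutation.  The fitness function may be
    a noisy Monte Carlo estimate; frequently recurring chromosomes get
    re-evaluated every generation, averaging the noise away."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]             # keep the fitter half
        pop = parents[:]
        while len(pop) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)
            child = a[:cut] + b[cut:]                # one-point crossover
            if rng.random() < 0.2:                   # bit-flip mutation
                i = rng.randrange(n_bits)
                child[i] ^= 1
            pop.append(child)
    return max(pop, key=fitness)

def noisy_onemax(bits, _rng=random.Random(9)):
    # stand-in for a few-history Monte Carlo profit estimate:
    # the true score (count of ones) plus sampling noise
    return sum(bits) + _rng.gauss(0.0, 0.3)
```

    In the plant-design setting, `noisy_onemax` would be replaced by a short Monte Carlo run of plant operation returning an estimated profit for the candidate layout.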

  1. Markov chains analytic and Monte Carlo computations

    CERN Document Server

    Graham, Carl

    2014-01-01

    Markov Chains: Analytic and Monte Carlo Computations introduces the main notions related to Markov chains and provides explanations on how to characterize, simulate, and recognize them. Starting with basic notions, this book leads progressively to advanced and recent topics in the field, allowing the reader to master the main aspects of the classical theory. This book also features: Numerous exercises with solutions as well as extended case studies. A detailed and rigorous presentation of Markov chains with discrete time and state space. An appendix presenting probabilistic notions that are nec
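    The Monte Carlo side of the book's title amounts to simulating trajectories and counting visits. A minimal sketch (illustrative, not from the book): by the ergodic theorem, the visit frequencies of one long trajectory converge to the stationary distribution, which can also be computed analytically from the transition matrix:

```python
import random

def simulate_chain(P, steps=100000, seed=10):
    """Monte Carlo simulation of a discrete-time Markov chain given its
    transition matrix P (rows sum to 1).  Follows one long trajectory and
    returns the state-visit frequencies, which converge to the stationary
    distribution for an ergodic chain."""
    rng = random.Random(seed)
    counts = [0] * len(P)
    state = 0
    for _ in range(steps):
        u, acc = rng.random(), 0.0
        for j, p in enumerate(P[state]):   # inverse-CDF sampling of next state
            acc += p
            if u < acc:
                state = j
                break
        counts[state] += 1
    return [c / steps for c in counts]
```

    For the two-state chain P = [[0.9, 0.1], [0.2, 0.8]], balance gives the stationary distribution (2/3, 1/3), and the simulated frequencies match it to within sampling error.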

  2. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    International Nuclear Information System (INIS)

    Badal, Andreu; Badano, Aldo

    2009-01-01

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
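    The reason photon transport maps so well onto a GPU is that each history is independent. The CPU sketch below (a 1-D toy with isotropic scattering and made-up cross sections, not the PENELOPE physics of the paper) shows the per-history loop that a GPU code would run once per thread:

```python
import math
import random

def photon_histories(n, mu_total=0.2, mu_abs=0.05, thickness=10.0, seed=11):
    """Analog Monte Carlo photon transport through a 1-D slab with
    energy-independent cross sections and isotropic (forward/backward)
    scattering.  Returns the transmitted fraction.  Every history is
    independent, which is what makes one-history-per-GPU-thread natural."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n):                       # each iteration = one GPU thread
        x, direction = 0.0, 1.0
        while True:
            # sample the free path from the exponential attenuation law
            x += direction * -math.log(1.0 - rng.random()) / mu_total
            if x >= thickness:
                transmitted += 1             # escaped through the far face
                break
            if x < 0.0:
                break                        # reflected back out of the slab
            if rng.random() < mu_abs / mu_total:
                break                        # absorbed at this collision
            direction = 1.0 if rng.random() < 0.5 else -1.0   # scatter
    return transmitted / n
```

    The divergence-free structure is only approximate (histories end at different collision counts), which is one reason real GPU transport codes like the one described above need careful thread-to-workload mapping.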

  3. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit

    Energy Technology Data Exchange (ETDEWEB)

    Badal, Andreu; Badano, Aldo [Division of Imaging and Applied Mathematics, OSEL, CDRH, U.S. Food and Drug Administration, Silver Spring, Maryland 20993-0002 (United States)

    2009-11-15

    Purpose: It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  4. Accelerating Monte Carlo simulations of photon transport in a voxelized geometry using a massively parallel graphics processing unit.

    Science.gov (United States)

    Badal, Andreu; Badano, Aldo

    2009-11-01

It is a known fact that Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: The use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speed up factor was obtained using a GPU compared to a single core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.

  5. Parallel Monte Carlo simulation of aerosol dynamics

    KAUST Repository

    Zhou, K.

    2014-01-01

A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (the Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts in the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI). The parallel computing efficiency is investigated through numerical examples. Nearly 60% parallel efficiency is achieved for the largest test case, with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified by simulating various test cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only a small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and so forth, that is, the low-order moments of the Particle Size Distribution (PSD) function. Accurately predicting the high-order moments of the PSD requires a dramatically larger number of MC particles.
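The split described above (deterministic growth substeps alternated with stochastic Marcus-Lushnikov coagulation substeps) can be sketched serially in a few lines. This is an illustrative toy, not the authors' MPI implementation; the constant kernel, monodisperse initial condition, and time step are assumptions chosen so the per-step event probability stays small.

```python
import random

def growth_substep(vols, g_rate, dt):
    """Deterministic surface growth dv/dt = g_rate (forward Euler)."""
    return [v + g_rate * dt for v in vols]

def coagulation_substep(vols, kernel, dt, rng):
    """Stochastic (Marcus-Lushnikov) coagulation: with event probability
    kernel * n(n-1)/2 * dt, merge one uniformly chosen particle pair."""
    n = len(vols)
    if n >= 2 and rng.random() < kernel * n * (n - 1) / 2 * dt:
        i, j = rng.sample(range(n), 2)
        vols[i] += vols[j]   # merged particle carries the combined volume
        vols.pop(j)
    return vols

def simulate(n0=100, steps=2000, dt=1e-4, kernel=0.01, g_rate=0.0, seed=7):
    """Operator splitting: growth substep, then coagulation substep."""
    rng = random.Random(seed)
    vols = [1.0] * n0
    for _ in range(steps):
        vols = growth_substep(vols, g_rate, dt)
        vols = coagulation_substep(vols, kernel, dt, rng)
    return vols
```

With the growth rate set to zero, coagulation alone conserves total particle volume while the particle count decreases, which is a convenient sanity check for the splitting.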

  6. Monte Carlo in radiotherapy: experience in a distributed computational environment

    Science.gov (United States)

    Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.

    2007-06-01

New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapeutical treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results on using Monte Carlo (MC) codes in dose calculation for treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS), along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research), has been used to perform a complete MC simulation to compute dose distributions on phantoms irradiated with a radiotherapy accelerator. Using the BEAMnrc and GEANT4 MC codes we calculated dose distributions on a plain water phantom and an air/water phantom. Experimental and calculated dose values agreed within ±2% (for depths between 5 mm and 130 mm), both in PDD (Percentage Depth Dose) and in transversal sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.

  7. PRELIMINARY COUPLING OF THE MONTE CARLO CODE OPENMC AND THE MULTIPHYSICS OBJECT-ORIENTED SIMULATION ENVIRONMENT (MOOSE) FOR ANALYZING DOPPLER FEEDBACK IN MONTE CARLO SIMULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Ellis; Derek Gaston; Benoit Forget; Kord Smith

    2011-07-01

In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17x17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that, for a simplified model, the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.

  8. Monte Carlo computer simulation of sedimentation of charged hard spherocylinders

    International Nuclear Information System (INIS)

    Viveros-Méndez, P. X.; Aranda-Espinoza, S.; Gil-Villegas, Alejandro

    2014-01-01

In this article we present a NVT Monte Carlo computer simulation study of sedimentation of an electroneutral mixture of oppositely charged hard spherocylinders (CHSC) with aspect ratio L/σ = 5, where L and σ are the length and diameter of the cylinder and hemispherical caps, respectively, for each particle. This system is an extension of the restricted primitive model for spherical particles, where L/σ = 0, and it is assumed that the ions are immersed in a structureless solvent, i.e., a continuum with dielectric constant D. The system consisted of N = 2000 particles and the Wolf method was implemented to handle the coulombic interactions of the inhomogeneous system. Results are presented for different values of the strength ratio between the gravitational and electrostatic interactions, Γ = (mgσ)/(e²/Dσ), where m is the mass per particle, e is the electron's charge and g is the gravitational acceleration. A semi-infinite simulation cell was used with dimensions Lx ≈ Ly and Lz = 5Lx, where Lx, Ly, and Lz are the box dimensions in Cartesian coordinates, and the gravitational force acts along the z-direction. Sedimentation effects were studied by looking at every layer formed by the CHSC along the gravitational field. By increasing Γ, particles tend to get more packed at each layer and to arrange in local domains with orientational ordering along two perpendicular axes, a feature not observed in the uncharged system with the same hard-body geometry. This type of arrangement, known as the tetratic phase, has been observed in two-dimensional systems of hard rectangles and rounded hard squares. In this way, the coupling of gravitational and electric interactions in the CHSC system induces the arrangement of particles in layers, with the formation of quasi-two-dimensional tetratic phases near the surface.

  9. Power distribution system reliability evaluation using dagger-sampling Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y.; Zhao, S.; Ma, Y. [North China Electric Power Univ., Hebei (China). Dept. of Electrical Engineering

    2009-03-11

    A dagger-sampling Monte Carlo simulation method was used to evaluate power distribution system reliability. The dagger-sampling technique was used to record the failure of a component as an incident and to determine its occurrence probability by generating incident samples using random numbers. The dagger sampling technique was combined with the direct sequential Monte Carlo method to calculate average values of load point indices and system indices. Results of the 2 methods with simulation times of up to 100,000 years were then compared. The comparative evaluation showed that less computing time was required using the dagger-sampling technique due to its higher convergence speed. When simulation times were 1000 years, the dagger-sampling method required 0.05 seconds to accomplish an evaluation, while the direct method required 0.27 seconds. 12 refs., 3 tabs., 4 figs.
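A compact illustration of the dagger-sampling idea, as commonly described in the reliability literature: a single uniform random number drives ⌊1/p⌋ negatively correlated Bernoulli trials, one per subinterval of width p, so at most one trial per batch records a failure. The component failure probability and batch count below are arbitrary assumptions, and the sketch covers only the sampling trick, not the load-point and system index calculations.

```python
import random

def dagger_batch(p, u):
    """One uniform u in [0, 1) yields n = round(1/p) correlated Bernoulli(p)
    trials: trial k records a failure iff u lands in [k*p, (k+1)*p).
    Assumes p roughly divides 1 (e.g. p = 0.2 gives n = 5 trials)."""
    n = max(1, round(1.0 / p))
    return [1 if k * p <= u < (k + 1) * p else 0 for k in range(n)]

def estimate_failure_prob(p, batches, seed=42):
    """Average the failure indicator over all trials of all batches;
    each batch consumes a single random number."""
    rng = random.Random(seed)
    trials = []
    for _ in range(batches):
        trials.extend(dagger_batch(p, rng.random()))
    return sum(trials) / len(trials)
```

Because the trials within a batch are negatively correlated (at most one can fail), the estimator converges faster than independent sampling for the same number of random draws.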

  10. ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments

    International Nuclear Information System (INIS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2015-01-01

Highlights: • A fast Monte Carlo based radiation transport code ARCHER was developed. • ARCHER supports different hardware including CPU, GPU and Intel Xeon Phi coprocessor. • The code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation takes only 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented

  11. Direct Measurement of Power Dissipated by Monte Carlo Simulations on CPU and FPGA Platforms

    DEFF Research Database (Denmark)

    Albicocco, Pietro; Papini, Davide; Nannarelli, Alberto

In this technical report, we describe how power dissipation measurements on different computing platforms (a desktop computer and an FPGA board) are performed by using a Hall effect-based current sensor. The chosen application is a Monte Carlo simulation for European option pricing which is a popu...

  12. Domain-growth kinetics and aspects of pinning: A Monte Carlo simulation study

    DEFF Research Database (Denmark)

    Castán, T.; Lindgård, Per-Anker

    1991-01-01

By means of Monte Carlo computer simulations we study the domain-growth kinetics after a quench across a first-order line to very low and moderate temperatures in a multidegenerate system with nonconserved order parameter. The model is a continuous spin model relevant for martensitic transformations. ... The growth exponent n is found to cross over from n = 1/4 at T approximately 0 to n = 1/2 with temperature for models with pinnings of types (a) and (b). For topological pinnings at T approximately 0, n is consistent with n = 1/8, a value conceivable for several levels of hierarchically interrelated domain-wall movement. When...

  13. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for master's or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at the Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...

  14. 'Odontologic dosimetric card' experiments and simulations using Monte Carlo methods

    International Nuclear Information System (INIS)

    Menezes, C.J.M.; Lima, R. de A.; Peixoto, J.E.; Vieira, J.W.

    2008-01-01

The techniques for data processing, combined with the development of fast and more powerful computers, make the Monte Carlo method one of the most widely used tools in radiation transport simulation. For applications in diagnostic radiology, this method generally uses anthropomorphic phantoms to evaluate the absorbed dose to patients during exposure. In this paper, Monte Carlo techniques were used to simulate a testing device designed for intra-oral X-ray equipment performance evaluation, called the Odontologic Dosimetric Card (CDO, from 'Cartao Dosimetrico Odontologico' in Portuguese), for different thermoluminescent detectors. Two computational models of exposure, RXD/EGS4 and CDO/EGS4, were used. In the first model, the simulation results are compared with experimental data obtained under similar conditions. The second model presents the same characteristics as the testing device studied (CDO). For the irradiations, the X-ray spectra were generated with the IPEM Report 78 spectrum processor. The attenuated spectrum was obtained for IEC 61267 qualities and various additional filters for a Pantak 320 industrial X-ray unit. The results obtained for the study of the copper filters used in the determination of the kVp were compared with experimental data, validating the model proposed for the characterization of the CDO. The CDO will be used in quality assurance programs in order to guarantee that the equipment fulfills the requirements of Norm SVS No. 453/98 MS (Brazil), 'Directives of Radiation Protection in Medical and Dental Radiodiagnostic'. We conclude that EGS4 is a suitable Monte Carlo code to simulate thermoluminescent dosimeters and the experimental procedures employed in the routine of a quality control laboratory in diagnostic radiology. (author)

  15. A general transform for variance reduction in Monte Carlo simulations

    International Nuclear Information System (INIS)

    Becker, T.L.; Larsen, E.W.

    2011-01-01

This paper describes a general transform to reduce the variance of the Monte Carlo estimate of some desired solution, such as flux or biological dose. This transform implicitly includes many standard variance reduction techniques, including source biasing, collision biasing, the exponential transform for path-length stretching, and weight windows. Rather than optimizing each of these techniques separately or choosing semi-empirical biasing parameters based on the experience of a seasoned Monte Carlo practitioner, this General Transform unites all these variance reduction techniques to achieve one objective: a distribution of Monte Carlo particles that attempts to optimize the desired solution. Specifically, this transform allows Monte Carlo particles to be distributed according to the user's specification by using information obtained from a computationally inexpensive deterministic simulation of the problem. For this reason, we consider the General Transform to be a hybrid Monte Carlo/deterministic method. The numerical results confirm that the General Transform distributes particles according to the user-specified distribution and generally provides reasonable results for shielding applications. (author)
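Source biasing, one of the techniques the General Transform subsumes, reduces in its simplest form to importance sampling: sample from a biased distribution and weight each history by the density ratio so the estimator stays unbiased. The standard-normal tail probability below is an illustrative stand-in for a shielding-type rare-event estimate, not an example from the paper.

```python
import math
import random

def norm_pdf(x, mu=0.0):
    """Unit-variance normal density centered at mu."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def tail_prob_is(n=50000, shift=3.0, seed=0):
    """Estimate P(X > 3) for X ~ N(0,1) by sampling from the biased
    distribution N(shift, 1) and weighting each hit by the likelihood
    ratio, which keeps the estimator unbiased."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x > 3.0:
            total += norm_pdf(x) / norm_pdf(x, shift)
    return total / n
```

Shifting the sampling distribution into the rare-event region makes almost every history contribute, so far fewer samples are needed than with unbiased sampling for the same statistical error.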

  16. Dynamic bounds coupled with Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Rajabalinejad, M., E-mail: M.Rajabalinejad@tudelft.n [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands); Meester, L.E. [Delft Institute of Applied Mathematics, Delft University of Technology, Delft (Netherlands); Gelder, P.H.A.J.M. van; Vrijling, J.K. [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands)

    2011-02-15

    For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper describes a method to reduce the simulation cost even further, while retaining the accuracy of Monte Carlo, by taking into account widely present monotonicity. For models exhibiting monotonic (decreasing or increasing) behavior, dynamic bounds (DB) are defined, which in a coupled Monte Carlo simulation are updated dynamically, resulting in a failure probability estimate, as well as a strict (non-probabilistic) upper and lower bounds. Accurate results are obtained at a much lower cost than an equivalent ordinary Monte Carlo simulation. In a two-dimensional and a four-dimensional numerical example, the cost reduction factors are 130 and 9, respectively, where the relative error is smaller than 5%. At higher accuracy levels, this factor increases, though this effect is expected to be smaller with increasing dimension. To show the application of DB method to real world problems, it is applied to a complex finite element model of a flood wall in New Orleans.
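The monotonicity argument behind the dynamic bounds can be sketched directly: for a limit-state function g that decreases monotonically in every coordinate, a sample that componentwise dominates a known failure point must fail, and one dominated by a known safe point must be safe, so the expensive model call is skipped. The two-dimensional linear limit state below is an assumed toy, not the flood-wall finite element model.

```python
import random

def classify(x, safe, fail, g, counter):
    """Classify a sample using dynamically accumulated bounds; fall back
    to evaluating g (the 'expensive' model) only when the bounds are
    inconclusive, and record the new point for future samples."""
    if any(all(a >= b for a, b in zip(x, f)) for f in fail):
        return True    # dominates a known failure point: must fail
    if any(all(a <= b for a, b in zip(x, s)) for s in safe):
        return False   # dominated by a known safe point: must be safe
    counter[0] += 1    # count actual model evaluations
    is_fail = g(x) < 0.0
    (fail if is_fail else safe).append(x)
    return is_fail

def mc_with_bounds(n=4000, seed=3):
    """Monte Carlo failure-probability estimate with dynamic bounds."""
    g = lambda x: 3.0 - x[0] - x[1]   # assumed monotone limit state (failure if < 0)
    rng = random.Random(seed)
    safe, fail, counter = [], [], [0]
    hits = 0
    for _ in range(n):
        x = (rng.uniform(0.0, 2.0), rng.uniform(0.0, 2.0))
        hits += classify(x, safe, fail, g, counter)
    return hits / n, counter[0]
```

For uniform samples on [0, 2]² the exact failure probability of this toy limit state is 0.125, and the evaluation counter stays well below the sample count, which is the cost reduction the abstract describes.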

  17. A multi-transputer system for parallel Monte Carlo simulations of extensive air showers

    International Nuclear Information System (INIS)

    Gils, H.J.; Heck, D.; Oehlschlaeger, J.; Schatz, G.; Thouw, T.

    1989-01-01

A multiprocessor computer system has been brought into operation at the Kernforschungszentrum Karlsruhe. It is dedicated to Monte Carlo simulations of extensive air showers induced by ultra-high energy cosmic rays. The architecture consists of two independently working VMEbus systems, each with a 68020 microprocessor as host computer and twelve T800 transputers for parallel processing. The two systems are linked via Ethernet for data exchange. The T800 transputers are equipped with 4 Mbyte RAM each, sufficient to run rather large codes. The host computers are operated under UNIX 5.3. On the transputers, compilers for PARALLEL FORTRAN, C, and PASCAL are available. The simple modular architecture of this parallel computer reflects the single purpose for which it is intended. The hardware of the multiprocessor computer is described, as well as the way the user software is handled and distributed to the 24 working processors. The performance of the parallel computer is demonstrated by well-known benchmarks and by realistic Monte Carlo simulations of air showers. Comparisons with other types of microprocessors and with large universal computers are made. It is demonstrated that this system achieves a cost reduction of more than a factor of 20 compared to a universal computer. (orig.)

  18. Suppression of the initial transient in Monte Carlo criticality simulations

    International Nuclear Information System (INIS)

    Richet, Y.

    2006-12-01

    Criticality Monte Carlo calculations aim at estimating the effective multiplication factor (k-effective) for a fissile system through iterations simulating neutrons propagation (making a Markov chain). Arbitrary initialization of the neutron population can deeply bias the k-effective estimation, defined as the mean of the k-effective computed at each iteration. A simplified model of this cycle k-effective sequence is built, based on characteristics of industrial criticality Monte Carlo calculations. Statistical tests, inspired by Brownian bridge properties, are designed to discriminate stationarity of the cycle k-effective sequence. The initial detected transient is, then, suppressed in order to improve the estimation of the system k-effective. The different versions of this methodology are detailed and compared, firstly on a plan of numerical tests fitted on criticality Monte Carlo calculations, and, secondly on real criticality calculations. Eventually, the best methodologies observed in these tests are selected and allow to improve industrial Monte Carlo criticality calculations. (author)
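As a rough illustration of the idea (a much simplified stand-in for the Brownian-bridge-based stationarity tests developed in the thesis), one can estimate the stationary mean and spread from the second half of the cycle k-effective sequence and discard leading cycles until the values enter that band. The synthetic exponentially decaying transient below is an assumption for demonstration only.

```python
import math
import statistics

def discard_transient(keff, band=2.0):
    """Crude transient cutoff: take the second half of the sequence as
    provisionally stationary, then drop leading cycles until a value
    falls within mean +/- band*std of that half."""
    tail = keff[len(keff) // 2:]
    mu, sd = statistics.mean(tail), statistics.stdev(tail)
    for i, k in enumerate(keff):
        if abs(k - mu) <= band * sd:
            return keff[i:]
    return tail

# Synthetic cycle k-effective sequence with a decaying initial transient.
cycles = [1.0 + 0.5 * math.exp(-i / 10.0) for i in range(100)]
trimmed = discard_transient(cycles)
```

On this synthetic sequence the trimmed mean is far closer to the asymptotic value of 1.0 than the mean over all cycles, which is exactly the bias the suppression methodology targets.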

  19. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    International Nuclear Information System (INIS)

    Xu, Zuwei; Zhao, Haibo; Zheng, Chuguang

    2015-01-01

This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of the particle size distribution with low statistical noise over the full size range and to reduce the number of time loops as far as possible. Here three coagulation rules are highlighted, and it is found that constructing an appropriate coagulation rule provides a route to a compromise between the accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates used in the acceptance-rejection processes by a single loop over all particles, while the mean time-step of a coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly, becoming proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by the multiple cores of a GPU that can execute massively threaded data-parallel tasks to obtain a remarkable speedup ratio (compared with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches of PBMC are

  20. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

This work generalizes a multilevel forward Euler Monte Carlo method introduced in Michael Giles. Oper. Res. 56(3):607–617, 2008, for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale methods in science and engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL): from O(TOL^-3) for a single-level version of the adaptive algorithm down to O((TOL^-1 log(TOL))^2).
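The multilevel construction can be sketched for a plain (non-adaptive, uniform-time-step) forward Euler discretization of geometric Brownian motion: fine and coarse paths at each level share the same Brownian increments, and the estimator sums the level-wise corrections. The SDE, parameters, and payoff E[X_T] are illustrative assumptions, not the stopped-diffusion example of the paper.

```python
import math
import random

def euler_pair(l, T=1.0, mu=0.05, sigma=0.2, x0=1.0):
    """One coupled fine/coarse Euler path for dX = mu*X dt + sigma*X dW.
    The coarse path reuses the fine path's Brownian increments; at level
    l = 0 only the single-step fine path exists."""
    nf = 2 ** l
    dtf = T / nf
    xf = xc = x0
    dw_sum = 0.0
    for n in range(nf):
        dw = random.gauss(0.0, math.sqrt(dtf))
        xf += mu * xf * dtf + sigma * xf * dw
        dw_sum += dw
        if l > 0 and n % 2 == 1:      # one coarse step per two fine steps
            xc += mu * xc * (2 * dtf) + sigma * xc * dw_sum
            dw_sum = 0.0
    return xf - (xc if l > 0 else 0.0)

def mlmc(L, N):
    """Multilevel estimator: sum over levels of the mean correction,
    telescoping to the finest-level expectation."""
    return sum(sum(euler_pair(l) for _ in range(N)) / N for l in range(L + 1))
```

Because the coupled fine and coarse paths are strongly correlated, the correction terms have small variance, so most samples can be spent on the cheap coarse levels; the adaptive method in the paper refines this further with path-dependent time steps.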

  1. Estimation of whole-body radiation exposure from brachytherapy for oral cancer using a Monte Carlo simulation

    International Nuclear Information System (INIS)

    Ozaki, Y.; Watanabe, H.; Kaida, A.; Miura, M.; Nakagawa, K.; Toda, K.; Yoshimura, R.; Sumi, Y.; Kurabayashi, T.

    2017-01-01

Early-stage oral cancer can be cured with oral brachytherapy, but whole-body radiation exposure status has not been previously studied. Recently, the International Commission on Radiological Protection (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used the Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with 192Ir hairpins and 198Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources and compared them with the results from the Oncentra manual Low Dose Rate Treatment Planning (mLDR) software used in day-to-day clinical practice. We successfully obtained data on absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with 192Ir hairpins and 198Au grains, respectively. Simulation with PHITS was reliable when compared with the alternative computational technique using mLDR software. We conclude that the absorbed dose for each organ and whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained.

  2. Personal Supercomputing for Monte Carlo Simulation Using a GPU

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-05-15

Since the usability, accessibility, and maintenance of a personal computer (PC) are very good, a PC is a useful computer simulation tool for researchers. With the improved performance of PC CPUs, it has enough calculation power to simulate a small-scale system. However, if a system is large or evolves over a long time scale, we need a cluster computer or supercomputer. Recently, great changes have occurred in the PC calculation environment. A graphics processing unit (GPU) on a graphics card, formerly used only to calculate display data, has a calculation capability superior to a PC's CPU. This GPU calculation performance matches that of a supercomputer from around 2000. Although it has such great calculation potential, it is not easy to program a simulation code for a GPU due to the difficult programming techniques needed to convert a calculation matrix to a 3D rendering image using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) as the programming environment for NVIDIA's graphics cards, called the Compute Unified Device Architecture (CUDA). It makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPU and CUDA, and carries out a performance benchmark for the Monte Carlo simulation.

  3. Personal Supercomputing for Monte Carlo Simulation Using a GPU

    International Nuclear Information System (INIS)

    Oh, Jae-Yong; Koo, Yang-Hyun; Lee, Byung-Ho

    2008-01-01

Since the usability, accessibility, and maintenance of a personal computer (PC) are very good, a PC is a useful computer simulation tool for researchers. With the improved performance of PC CPUs, it has enough calculation power to simulate a small-scale system. However, if a system is large or evolves over a long time scale, we need a cluster computer or supercomputer. Recently, great changes have occurred in the PC calculation environment. A graphics processing unit (GPU) on a graphics card, formerly used only to calculate display data, has a calculation capability superior to a PC's CPU. This GPU calculation performance matches that of a supercomputer from around 2000. Although it has such great calculation potential, it is not easy to program a simulation code for a GPU due to the difficult programming techniques needed to convert a calculation matrix to a 3D rendering image using graphics APIs. In 2006, NVIDIA provided a Software Development Kit (SDK) as the programming environment for NVIDIA's graphics cards, called the Compute Unified Device Architecture (CUDA). It makes programming on the GPU easy without knowledge of the graphics APIs. This paper describes the basic architectures of NVIDIA's GPU and CUDA, and carries out a performance benchmark for the Monte Carlo simulation

  4. Anybody can do Value at Risk: A Teaching Study using Parametric Computation and Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Yun Hsing Cheung

    2012-12-01

Full Text Available The three main Value at Risk (VaR) methodologies are historical, parametric and Monte Carlo Simulation. Cheung & Powell (2012), using a step-by-step teaching study, showed how a nonparametric historical VaR model could be constructed using Excel, thus benefitting teachers and researchers by providing them with a readily useable teaching study and an inexpensive and flexible VaR modelling option. This article extends that work by demonstrating how parametric and Monte Carlo Simulation VaR models can also be constructed in Excel, thus providing a total Excel modelling package encompassing all three VaR methods.
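The same parametric and Monte Carlo constructions carry over from a spreadsheet to a few lines of code. This sketch assumes normally distributed P&L; the return parameters and confidence level are arbitrary illustrative choices.

```python
import random
from statistics import NormalDist

def parametric_var(mu, sigma, alpha=0.95):
    """Parametric (variance-covariance) VaR: the alpha-level loss quantile
    of a normal P&L distribution, reported as a positive number."""
    return -(mu + sigma * NormalDist().inv_cdf(1.0 - alpha))

def monte_carlo_var(mu, sigma, alpha=0.95, n=100000, seed=1):
    """Monte Carlo VaR: simulate P&L outcomes and read the VaR off the
    empirical (1 - alpha) quantile of the sorted simulated losses."""
    rng = random.Random(seed)
    pnl = sorted(rng.gauss(mu, sigma) for _ in range(n))
    return -pnl[int((1.0 - alpha) * n)]
```

For a normal P&L the two agree up to sampling error; the Monte Carlo version becomes genuinely useful once the simulated distribution is non-normal (fat tails, nonlinear positions), which the parametric formula cannot capture.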

  5. A hybrid transport-diffusion method for Monte Carlo radiative-transfer simulations

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.; Urbatsch, Todd J.; Evans, Thomas M.; Buksas, Michael W.

    2007-01-01

    Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Monte Carlo particle-transport simulations in diffusive media. If standard Monte Carlo is used in such media, particle histories will consist of many small steps, resulting in a computationally expensive calculation. In DDMC, particles take discrete steps between spatial cells according to a discretized diffusion equation. Each discrete step replaces many small Monte Carlo steps, thus increasing the efficiency of the simulation. In addition, given that DDMC is based on a diffusion equation, it should produce accurate solutions if used judiciously. In practice, DDMC is combined with standard Monte Carlo to form a hybrid transport-diffusion method that can accurately simulate problems with both diffusive and non-diffusive regions. In this paper, we extend previously developed DDMC techniques in several ways that improve the accuracy and utility of DDMC for nonlinear, time-dependent, radiative-transfer calculations. The use of DDMC in these types of problems is advantageous since, due to the underlying linearizations, optically thick regions appear to be diffusive. First, we employ a diffusion equation that is discretized in space but is continuous in time. Not only is this methodology theoretically more accurate than temporally discretized DDMC techniques, but it also has the benefit that a particle's time is always known. Thus, there is no ambiguity regarding what time to assign a particle that leaves an optically thick region (where DDMC is used) and begins transporting by standard Monte Carlo in an optically thin region. Also, we treat the interface between optically thick and optically thin regions with an improved method, based on the asymptotic diffusion-limit boundary condition, that can produce accurate results regardless of the angular distribution of the incident Monte Carlo particles. Finally, we develop a technique for estimating radiation momentum deposition during the
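
    The core idea, one diffusion-sized jump replacing many small transport steps, can be sketched in one dimension. This is an illustrative analogy, not the paper's algorithm, and all numerical values are made up: after many flights of mean free path lam, a particle's position is statistically equivalent to a single Gaussian jump with the matching diffusion variance.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, tau = 0.01, 1e-3          # flight length and flight time (illustrative)
n_steps, n_particles = 500, 5_000

# Standard Monte Carlo: each particle takes many small +/-lam flights.
steps = rng.choice([-lam, lam], size=(n_particles, n_steps))
x_transport = steps.sum(axis=1)

# DDMC-like step: replace the n_steps flights by one Gaussian jump whose
# variance 2*D*t (with D = lam**2 / (2*tau), t = n_steps*tau) matches
# the random walk's variance n_steps * lam**2.
D = lam ** 2 / (2.0 * tau)
t = n_steps * tau
x_diffusion = rng.normal(0.0, np.sqrt(2.0 * D * t), n_particles)

print(x_transport.var(), x_diffusion.var())
```

Both ensembles have the same spatial spread, but the second is generated with one random number per particle instead of hundreds.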

  6. Adding computationally efficient realism to Monte Carlo turbulence simulation

    Science.gov (United States)

    Campbell, C. W.

    1985-01-01

    Frequently in aerospace vehicle flight simulation, random turbulence is generated using the assumption that the craft is small compared to the length scales of turbulence. The turbulence is presumed to vary only along the flight path of the vehicle but not across the vehicle span. The addition of the realism of three-dimensionality is a worthy goal, but any such attempt will not gain acceptance in the simulator community unless it is computationally efficient. A concept for adding three-dimensional realism with a minimum of computational complexity is presented. The concept involves the use of close rational approximations to irrational spectra and cross-spectra so that systems of stable, explicit difference equations can be used to generate the turbulence.
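
    The "stable, explicit difference equations" mentioned above amount to recursive digital filters driven by white noise. As a hedged sketch (parameter values are illustrative, not from the paper), a first-order recursion reproducing an exponential autocorrelation of time constant tau looks like:

```python
import numpy as np

def ar1_turbulence(n, dt, tau, sigma, seed=3):
    """Generate a gust-velocity series with autocorrelation exp(-dt/tau)
    using a stable, explicit first-order difference equation."""
    phi = np.exp(-dt / tau)
    rng = np.random.default_rng(seed)
    # Innovation variance chosen so the output variance is sigma**2.
    noise = rng.normal(0.0, sigma * np.sqrt(1.0 - phi ** 2), n)
    u = np.empty(n)
    u[0] = rng.normal(0.0, sigma)
    for k in range(1, n):
        u[k] = phi * u[k - 1] + noise[k]
    return u

u = ar1_turbulence(200_000, dt=0.01, tau=0.5, sigma=1.5)
```

Higher-order rational spectrum approximations extend the same recursion with more lagged terms, which keeps the per-step cost constant, the property the abstract identifies as essential for real-time simulators.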

  7. Monte Carlo Simulation Tool Installation and Operation Guide

    Energy Technology Data Exchange (ETDEWEB)

    Aguayo Navarrete, Estanislao; Ankney, Austin S.; Berguson, Timothy J.; Kouzes, Richard T.; Orrell, John L.; Troy, Meredith D.; Wiseman, Clinton G.

    2013-09-02

    This document provides information on software and procedures for Monte Carlo simulations based on the Geant4 toolkit, the ROOT data analysis software and the CRY cosmic ray library. These tools have been chosen for their application to shield design and activation studies as part of the simulation task for the Majorana Collaboration. This document includes instructions for the installation, operation and modification of the simulation code in a high cyber-security computing environment, such as the Pacific Northwest National Laboratory network. It is intended as a living document and will be periodically updated. It is a starting point for information collection by an experimenter, and is not the definitive source. Users should consult with one of the authors for guidance on how to find the most current information for their needs.

  8. The Cherenkov Telescope Array production system for Monte Carlo simulations and analysis

    Science.gov (United States)

    Arrabito, L.; Bernloehr, K.; Bregeon, J.; Cumani, P.; Hassan, T.; Haupt, A.; Maier, G.; Moralejo, A.; Neyroud, N.; for the CTA Consortium and DIRAC Consortium

    2017-10-01

    The Cherenkov Telescope Array (CTA), an array of many tens of Imaging Atmospheric Cherenkov Telescopes deployed on an unprecedented scale, is the next-generation instrument in the field of very high energy gamma-ray astronomy. An average data stream of about 0.9 GB/s for about 1300 hours of observation per year is expected, resulting in 4 PB of raw data per year and a total of 27 PB/year including archive and data processing. The start of CTA operation is foreseen in 2018 and it will last about 30 years. The installation of the first telescopes at the two selected locations (Paranal, Chile and La Palma, Spain) will start in 2017. In order to select the best site candidates to host CTA telescopes (in the Northern and in the Southern hemispheres), massive Monte Carlo simulations have been performed since 2012. Once the two sites had been selected, we started new Monte Carlo simulations to determine the optimal array layout with respect to the obtained sensitivity. Taking into account that CTA may finally be composed of 7 different telescope types coming in 3 different sizes, many different combinations of telescope position and multiplicity as a function of the telescope type have been proposed. This last Monte Carlo campaign represented a huge computational effort, since several hundred telescope positions have been simulated, while for future instrument response function simulations only the operating telescopes will be considered. In particular, during the last 18 months, about 2 PB of Monte Carlo data have been produced and processed with different analysis chains, with a corresponding overall CPU consumption of about 125 M HS06 hours. In these proceedings, we describe the employed computing model, based on the use of grid resources, as well as the production system setup, which relies on the DIRAC interware. Finally, we present the envisaged evolutions of the CTA production system for the off-line data processing during CTA operations and

  9. Coded aperture optimization using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Martineau, A.; Rocchisani, J.M.; Moretti, J.L.

    2010-01-01

    Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. First, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparing experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with the conventional correlation method. The results indicate that the artifacts are reduced and the three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
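
    The MLEM update itself is compact: forward-project the current estimate, compare with the measurement, and back-project the ratio. A minimal sketch on a deliberately tiny toy system (the 2x2 matrix and ground truth below are invented for illustration; in the study the projection matrix is estimated by Monte Carlo):

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """Maximum-Likelihood Expectation-Maximization reconstruction.
    A maps source voxels to detector bins; y holds the measured
    (coded) projections."""
    x = np.ones(A.shape[1])      # flat initial estimate
    sens = A.sum(axis=0)         # sensitivity image (column sums of A)
    for _ in range(n_iter):
        proj = A @ x                                   # forward projection
        ratio = np.where(proj > 0.0, y / proj, 0.0)    # measured / estimated
        x *= (A.T @ ratio) / sens                      # multiplicative update
    return x

# A 2-voxel toy system with noiseless data and a known ground truth.
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])
x_true = np.array([4.0, 1.0])
x_rec = mlem(A, A @ x_true)
print(x_rec)  # converges to x_true
```

The multiplicative form guarantees nonnegative voxel values, one reason MLEM suppresses the artifacts that linear correlation decoding produces.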

  10. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Contents: Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing phy...

  11. Monte Carlo simulation of mixed neutron-gamma radiation fields and dosimetry devices

    International Nuclear Information System (INIS)

    Zhang, Guoqing

    2011-01-01

    different incident angles of neutrons, the responses were calculated. To correct the track overlapping effect for high track densities, density correction factors are computed with the Monte Carlo method. A computer code has been developed to handle all the calculations with different parameters. To verify the simulation results, experiments were performed.

  12. Monte Carlo simulation of mixed neutron-gamma radiation fields and dosimetry devices

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Guoqing

    2011-12-22

    different incident angles of neutrons, the responses were calculated. To correct the track overlapping effect for high track densities, density correction factors are computed with the Monte Carlo method. A computer code has been developed to handle all the calculations with different parameters. To verify the simulation results, experiments were performed.

  13. Icarus: A 2-D Direct Simulation Monte Carlo (DSMC) Code for Multi-Processor Computers

    International Nuclear Information System (INIS)

    BARTEL, TIMOTHY J.; PLIMPTON, STEVEN J.; GALLIS, MICHAIL A.

    2001-01-01

    Icarus is a 2D Direct Simulation Monte Carlo (DSMC) code which has been optimized for the parallel computing environment. The code is based on the DSMC method of Bird[11.1] and models from free-molecular to continuum flowfields in either Cartesian (x, y) or axisymmetric (z, r) coordinates. Computational particles, representing a given number of molecules or atoms, are tracked as they have collisions with other particles or surfaces. Multiple species, internal energy modes (rotation and vibration), chemistry, and ion transport are modeled. A new trace species methodology for collisions and chemistry is used to obtain statistics for small species concentrations. Gas phase chemistry is modeled using steric factors derived from Arrhenius reaction rates or in a manner similar to continuum modeling. Surface chemistry is modeled with surface reaction probabilities; an optional site density, energy dependent, coverage model is included. Electrons are modeled either by a local charge-neutrality assumption or as discrete simulation particles. Ion chemistry is modeled with electron impact chemistry rates and charge exchange reactions. Coulomb collision cross-sections are used instead of Variable Hard Sphere values for ion-ion interactions. The electrostatic fields can be externally input, computed from a Langmuir-Tonks model, or obtained from a Green's function (boundary element) based Poisson solver. Icarus has been used for subsonic to hypersonic, chemically reacting, and plasma flows. The Icarus software package includes the grid generation, parallel processor decomposition, post-processing, and restart software. The commercial graphics package, Tecplot, is used for graphics display. All of the software packages are written in standard Fortran.
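
    At the heart of any DSMC code is the per-cell collision step. A hedged sketch of the standard No-Time-Counter pair-selection scheme for hard spheres (every numerical value below is illustrative, not taken from Icarus, and this is Python rather than the code's Fortran):

```python
import numpy as np

rng = np.random.default_rng(4)

# One DSMC cell: n computational particles, each representing f_num molecules.
n, f_num, volume, dt = 500, 1e10, 1e-9, 1e-6
v = rng.normal(0.0, 300.0, (n, 3))       # thermal velocities, m/s (assumed)
diameter = 4e-10                          # hard-sphere diameter (assumed)
sigma = np.pi * diameter ** 2             # collision cross-section

# No-Time-Counter: the candidate-pair count uses an upper bound (sigma*g)_max;
# each candidate pair is then accepted with probability sigma*g/(sigma*g)_max.
g_max = 2000.0                            # upper bound on relative speed, m/s
n_cand = int(0.5 * n * (n - 1) * f_num * sigma * g_max * dt / volume)
accepted = 0
for _ in range(n_cand):
    i, j = rng.choice(n, 2, replace=False)
    g = np.linalg.norm(v[i] - v[j])       # relative speed of the pair
    if g / g_max > rng.random():
        accepted += 1                     # a real code would also scatter v[i], v[j]
print(n_cand, accepted)
```

The over-counting in the candidate estimate cancels against the rejection step, so the accepted collision rate is correct on average without ever computing every pairwise relative speed.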

  14. Monte Carlo simulations and dosimetric studies of an irradiation facility

    Energy Technology Data Exchange (ETDEWEB)

    Belchior, A. [Instituto Tecnologico e Nuclear, Estrada nacional no. 10, Apartado 21, 2686-953 Sacavem (Portugal)], E-mail: anabelchior@itn.pt; Botelho, M.L; Vaz, P. [Instituto Tecnologico e Nuclear, Estrada nacional no. 10, Apartado 21, 2686-953 Sacavem (Portugal)

    2007-09-21

    There is an increasing utilization of ionizing radiation for industrial applications. Additionally, radiation technology offers a variety of advantages in areas such as sterilization and food preservation. For these applications, dosimetric tests are of crucial importance in order to assess the dose distribution throughout the sample being irradiated. The use of Monte Carlo methods and computational tools in support of the assessment of the dose distributions in irradiation facilities can prove to be economically effective, representing savings in the utilization of dosemeters, among other benefits. One of the purposes of this study is the development of a Monte Carlo simulation, using a state-of-the-art computational tool, MCNPX, in order to determine the dose distribution inside a cobalt-60 irradiation facility. This irradiation facility is currently in operation at the ITN campus and will feature an automation and robotics component, which will allow its remote utilization by an external user, under the REEQ/996/BIO/2005 project. The detailed geometrical description of the irradiation facility has been implemented in MCNPX, which features an accurate and full simulation of the electron-photon processes involved. The validation of the simulation results obtained was performed by chemical dosimetry methods, namely with a Fricke solution. The Fricke dosimeter is a standard dosimeter and is widely used in radiation processing for calibration purposes.
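
    The Fricke dosimeter converts a spectrophotometric reading into absorbed dose through a simple linear relation. A sketch of that conversion, using the commonly tabulated constants for ferrous sulphate dosimetry (the constants and the example reading are assumptions of this sketch, not values from the article):

```python
# Dose from a Fricke (ferrous sulphate) dosimeter reading.
EPSILON = 2196.0   # molar absorptivity of Fe3+ at 304 nm, L/(mol*cm), 25 C
RHO = 1.024        # density of the Fricke solution, kg/L
G_FE3 = 1.607e-6   # radiation chemical yield of Fe3+, mol/J

def fricke_dose(delta_absorbance, path_length_cm=1.0):
    """Absorbed dose (Gy) from the change in optical absorbance at 304 nm:
    D = delta_A / (epsilon * l * rho * G)."""
    return delta_absorbance / (EPSILON * path_length_cm * RHO * G_FE3)

print(fricke_dose(0.36))  # a delta-A of 0.36 corresponds to roughly 100 Gy
```

Such readings at several positions in the irradiated volume are what the MCNPX dose map is validated against.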

  15. A Monte Carlo simulation for the field theory with quartic interaction

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Sergio Mittmann dos [Instituto Federal de Educacao, Ciencia e Tecnologia do Rio Grande do Sul (IFRS), Porto Alegre, RS (Brazil)

    2011-07-01

    Full text: In the works [1-S. M. Santos, B. E. J. Bodmann and A. T. Gomez, Um novo metodo computacional para a teoria de campos na rede: resultados preliminares, IV Escola do Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, 2002; and 2-S. M. Santos and B. E. J. Bodmann, Simulacao na rede de teorias de campos quanticos, XXVIII Congresso Nacional de Matematica Aplicada e Computacional (CNMAC), Sao Paulo, 2005], a computational method on the lattice was elaborated for the problem known as scalar field theory with quartic interaction (see, for instance, J. R. Klauder, Beyond Conventional Quantization, Cambridge: Cambridge University Press, 2000). That work introduced an algorithm that allows the simulation of a given field theory independently of the lattice spacing, by redefining the fields and the parameters (the mass m and the coupling constant g). This kind of approach permits varying the dimension of the lattice without changing the computational complexity of the algorithm. A simulation was made using the Monte Carlo method, in which the renormalized mass m{sub R}, the renormalized coupling constant g{sub R} and the two-point correlation function were determined successfully. In the present work, the same computational method is used for new simulations. Now, the Monte Carlo method is used not only for the simulation of the algorithm, as in [1, 2], but also for determining the adjustment parameters (the mass and the coupling constant), which were introduced ad hoc in [1, 2]. This work presents the outcomes of the first simulations, in which better results than in [1, 2] were obtained for the renormalized mass and the renormalized coupling constant. (author)
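
    A standard Monte Carlo treatment of lattice phi^4 theory updates one site at a time with a Metropolis accept/reject step. The sketch below is the textbook algorithm, not the authors' lattice-spacing-independent reformulation, and the lattice size and couplings are illustrative:

```python
import numpy as np

def site_action(p, nb, m2, g):
    """Site-dependent part of the 2D lattice phi^4 action: kinetic term
    against the sum nb of the 4 nearest neighbours, mass and quartic terms."""
    return 2.0 * p * p - p * nb + 0.5 * m2 * p * p + (g / 24.0) * p ** 4

def metropolis_phi4(L, m2, g, n_sweeps, seed=5):
    """Metropolis simulation of phi^4 theory on an L x L periodic lattice."""
    rng = np.random.default_rng(seed)
    phi = np.zeros((L, L))
    accepted = 0
    for _ in range(n_sweeps):
        for i in range(L):
            for j in range(L):
                nb = (phi[(i + 1) % L, j] + phi[(i - 1) % L, j]
                      + phi[i, (j + 1) % L] + phi[i, (j - 1) % L])
                old, new = phi[i, j], phi[i, j] + rng.uniform(-1.0, 1.0)
                d_action = site_action(new, nb, m2, g) - site_action(old, nb, m2, g)
                if rng.random() < np.exp(-d_action):
                    phi[i, j] = new
                    accepted += 1
    return phi, accepted / (n_sweeps * L * L)

phi, acc = metropolis_phi4(L=8, m2=1.0, g=0.5, n_sweeps=200)
print(acc, (phi ** 2).mean())
```

Observables such as the two-point correlation function are then averaged over many such sweeps after thermalization.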

  16. Multi-Subband Ensemble Monte Carlo simulations of scaled GAA MOSFETs

    Science.gov (United States)

    Donetti, L.; Sampedro, C.; Ruiz, F. G.; Godoy, A.; Gamiz, F.

    2018-05-01

    We developed a Multi-Subband Ensemble Monte Carlo simulator for non-planar devices, taking into account two-dimensional quantum confinement. It self-consistently couples the solution of the 3D Poisson equation, the 2D Schrödinger equation, and the 1D Boltzmann transport equation with the Ensemble Monte Carlo method. This simulator was employed to study MOS devices based on ultra-scaled Gate-All-Around Si nanowires with diameters from 4 nm to 8 nm and gate lengths from 8 nm to 14 nm. We studied the output and transfer characteristics, interpreting the behavior in the sub-threshold region and in the ON state in terms of the spatial charge distribution and the mobility computed with the same simulator. We analyzed the results, highlighting the contribution of different valleys and subbands and the effect of the gate bias on the energy and velocity profiles. Finally, the scaling behavior was studied, showing that only the devices with D = 4 nm maintain good control of the short-channel effects down to a gate length of 8 nm.

  17. Monte Carlo Simulation for Particle Detectors

    CERN Document Server

    Pia, Maria Grazia

    2012-01-01

    Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...

  18. A virtual source method for Monte Carlo simulation of Gamma Knife Model C

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Hoon; Kim, Yong Kyun [Hanyang University, Seoul (Korea, Republic of); Chung, Hyun Tai [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    2016-05-15

    The Monte Carlo simulation method has been used for dosimetry of radiation treatment. Monte Carlo simulation determines particle paths and doses using random numbers. Recently, owing to the fast processing power of computers, it has become possible to treat a patient more precisely. However, longer simulation times are needed to reduce the statistical uncertainty. When particles are generated from the cobalt sources in a simulation, many of them are cut off, so an accurate simulation is time consuming. For efficiency, we generated a virtual source with the phase-space distribution acquired from a single Gamma Knife channel. We performed simulations using virtual sources on all 201 channels and compared the measurements with simulations using virtual and real sources. A virtual source file was generated to reduce the simulation time of a Gamma Knife Model C. Simulations with a virtual source executed about 50 times faster than the original source code, and there was no statistically significant difference in the simulated results.
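
    The virtual-source idea separates the expensive and cheap parts of the simulation. A hedged sketch of the two phases (the energies, angles and the collimation cut below are invented for illustration; a real code would write the recorded particles to a phase-space file rather than keep them in memory):

```python
import numpy as np

rng = np.random.default_rng(6)

# Phase 1 (expensive, done once): transport particles from the real source
# and record those that survive the channel collimation.
n_source = 100_000
energy = rng.normal(1.25, 0.05, n_source)   # MeV, a Co-60-like line (assumed)
cos_theta = rng.random(n_source)            # cosine of the emission angle
survives = cos_theta > 0.95                 # stand-in for the collimator cut
phase_space = np.column_stack([energy, cos_theta])[survives]

# Phase 2 (cheap, repeated per run): the virtual source resamples the stored
# phase space instead of re-simulating all the cut-off particles.
idx = rng.integers(0, len(phase_space), size=10_000)
particles = phase_space[idx]
print(len(phase_space), particles.shape)
```

Because most source particles never pass the collimator, skipping their re-simulation is where the reported ~50x speed-up comes from.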

  19. A virtual source method for Monte Carlo simulation of Gamma Knife Model C

    International Nuclear Information System (INIS)

    Kim, Tae Hoon; Kim, Yong Kyun; Chung, Hyun Tai

    2016-01-01

    The Monte Carlo simulation method has been used for dosimetry of radiation treatment. Monte Carlo simulation determines particle paths and doses using random numbers. Recently, owing to the fast processing power of computers, it has become possible to treat a patient more precisely. However, longer simulation times are needed to reduce the statistical uncertainty. When particles are generated from the cobalt sources in a simulation, many of them are cut off, so an accurate simulation is time consuming. For efficiency, we generated a virtual source with the phase-space distribution acquired from a single Gamma Knife channel. We performed simulations using virtual sources on all 201 channels and compared the measurements with simulations using virtual and real sources. A virtual source file was generated to reduce the simulation time of a Gamma Knife Model C. Simulations with a virtual source executed about 50 times faster than the original source code, and there was no statistically significant difference in the simulated results.

  20. Availability of fusion plants employing a Monte Carlo simulation computer code

    International Nuclear Information System (INIS)

    Musicki, Z.

    1984-01-01

    The fusion facilities being built or designed will have availability problems due to their complexity and their employment of not yet fully developed technologies. Low availability of test facilities will have an adverse impact on the learning time and will therefore push back the commercialization date of fusion. Low availability of commercial electric power plants will increase the cost of electricity and make fusion a less attractive power source. Thus, the time to study the availability problems of fusion plants and suggest improvements is now, before costly mistakes are committed. This study is an initial effort in the area. It attempts to develop methods for calculating a system's performance, specifically its availability; to start collecting the necessary data and identify the areas where data are lacking; and to point out the subsystems where resources need to be applied in order to bring about acceptable system performance. The method used to study availability is a simulation computer code based on the Monte Carlo process and developed by the author. The fusion systems analyzed were TASKA (a tandem mirror test facility design) and MARS (a tandem mirror power plant design). The model and available data were employed to find that the most critical subsystems needing further work are the neutral beams, the RF heating subsystems, the direct convertor, and certain magnets.
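
    The Monte Carlo approach to availability is to simulate many histories of alternating failure and repair intervals and average the fraction of time the system is up. A minimal sketch of that process (the MTBF/MTTR figures are hypothetical, not values from the TASKA or MARS studies, and a real code would combine many subsystems):

```python
import numpy as np

def simulate_availability(mtbf, mttr, horizon, n_histories=500, seed=7):
    """Estimate availability by Monte Carlo: each history alternates
    exponentially distributed up (time-to-failure) and down (repair)
    intervals; availability is the fraction of the horizon spent up."""
    rng = np.random.default_rng(seed)
    up_total = 0.0
    for _ in range(n_histories):
        t = 0.0
        while t < horizon:
            ttf = rng.exponential(mtbf)
            up_total += min(ttf, horizon - t)   # clip uptime at the horizon
            t += ttf + rng.exponential(mttr)
    return up_total / (n_histories * horizon)

# Hypothetical subsystem numbers: MTBF 100 h, MTTR 10 h.
a = simulate_availability(mtbf=100.0, mttr=10.0, horizon=5_000.0)
print(a)  # close to the steady-state value MTBF/(MTBF+MTTR) ~ 0.909
```

The simulation reproduces the analytic steady-state result for this simple case, but unlike the formula it extends directly to redundancy, repair queues and non-exponential distributions.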

  1. A Pipelined and Parallel Architecture for Quantum Monte Carlo Simulations on FPGAs

    Directory of Open Access Journals (Sweden)

    Akila Gothandaraman

    2010-01-01

    Full Text Available Recent advances in Field-Programmable Gate Array (FPGA technology make reconfigurable computing using FPGAs an attractive platform for accelerating scientific applications. We develop a deeply pipelined and parallel architecture for Quantum Monte Carlo simulations using FPGAs. Quantum Monte Carlo simulations enable us to obtain the structural and energetic properties of atomic clusters. We experiment with different pipeline structures for each component of the design and develop a deeply pipelined architecture that provides the best performance in terms of achievable clock rate, while at the same time has a modest use of the FPGA resources. We discuss the details of the pipelined and generic architecture that is used to obtain the potential energy and wave function of a cluster of atoms.

  2. Applications of Monte Carlo simulations of gamma-ray spectra

    International Nuclear Information System (INIS)

    Clark, D.D.

    1995-01-01

    A short, convenient computer program based on the Monte Carlo method that was developed to generate simulated gamma-ray spectra has been found to have useful applications in research and teaching. In research, we use it to predict spectra in neutron activation analysis (NAA), particularly in prompt gamma-ray NAA (PGNAA). In teaching, it is used to illustrate the dependence of detector response functions on the nature of gamma-ray interactions, the incident gamma-ray energy, and detector geometry
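
    A toy version of such a spectrum generator makes the teaching point concrete: each photon either deposits its full energy (photopeak) or a Compton-scatter fraction of it, and the result is smeared by the detector resolution. All parameters below (peak fraction, resolution, the flat Compton continuum) are simplifying assumptions of this sketch, not the cited program:

```python
import numpy as np

def simulate_spectrum(e_gamma, n_events, peak_prob=0.3, fwhm=0.05, seed=8):
    """Toy Monte Carlo detector response for a monoenergetic gamma ray:
    photopeak events deposit e_gamma; the rest deposit a uniform energy up
    to the Compton edge; all deposits get Gaussian resolution smearing."""
    rng = np.random.default_rng(seed)
    full = rng.random(n_events) < peak_prob
    e_edge = e_gamma / (1.0 + 0.511 / (2.0 * e_gamma))  # Compton edge, MeV
    deposited = np.where(full, e_gamma, rng.random(n_events) * e_edge)
    sigma = fwhm * e_gamma / 2.355                       # FWHM -> sigma
    return deposited + rng.normal(0.0, sigma, n_events)

spectrum = simulate_spectrum(e_gamma=0.662, n_events=50_000)  # Cs-137-like
counts, edges = np.histogram(spectrum, bins=200, range=(0.0, 0.8))
```

Even this crude model shows how the response function depends on the interaction type, the incident energy and the resolution, which is the dependence the teaching application illustrates.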

  3. Dynamic connectivity algorithms for Monte Carlo simulations of the random-cluster model

    International Nuclear Information System (INIS)

    Elçi, Eren Metin; Weigel, Martin

    2014-01-01

    We review Sweeny's algorithm for Monte Carlo simulations of the random cluster model. Straightforward implementations suffer from the problem of computational critical slowing down, where the computational effort per edge operation scales with a power of the system size. By using a tailored dynamic connectivity algorithm we are able to perform all operations with a poly-logarithmic computational effort. This approach is shown to be efficient in keeping online connectivity information and is of use for a number of applications also beyond cluster-update simulations, for instance in monitoring droplet shape transitions. As the handling of the relevant data structures is non-trivial, we provide a Python module with a full implementation for future reference.

  4. Treatment planning in radiosurgery: parallel Monte Carlo simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Scielzo, G [Galliera Hospitals, Genova (Italy). Dept. of Hospital Physics; Grillo Ruggieri, F [Galliera Hospitals, Genova (Italy) Dept. for Radiation Therapy; Modesti, M; Felici, R [Electronic Data System, Rome (Italy); Surridge, M [University of South Hampton (United Kingdom). Parallel Apllication Centre

    1995-12-01

    The main objective of this research was to evaluate the possibility of direct Monte Carlo simulation for accurate dosimetry with short computation times. We made use of a graphics workstation, a linear accelerator, and water, PMMA and anthropomorphic phantoms for validation purposes; ionometric, film and thermoluminescent techniques for dosimetry; and a treatment planning system for comparison. Benchmarking results suggest that short computing times can be obtained with the parallel version of EGS4 that was developed. Parallelism was obtained by assigning simulated incident photons to separate processors, and the development of a parallel random number generator was necessary. Validation consisted of phantom irradiation and comparison of predicted and measured values, which showed good agreement in PDD and dose profiles. Experiments on anthropomorphic phantoms (with inhomogeneities) were carried out, and these values are being compared with results obtained with the conventional treatment planning system.
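
    The need for a parallel random number generator arises because naively giving each processor the same seed would produce identical, correlated histories. A modern sketch of the idea (illustrative, not the EGS4 generator the authors developed) uses independently spawned streams:

```python
import numpy as np

# Each worker gets an independent, reproducible random stream; incident
# photons are assigned to processors and simulated with that stream.
n_workers, photons_per_worker = 4, 1000
root = np.random.SeedSequence(12345)
streams = [np.random.default_rng(s) for s in root.spawn(n_workers)]

# Stand-in for per-photon physics: sample exponential free-path lengths.
histories = [rng.exponential(1.0, photons_per_worker) for rng in streams]

# Because the streams are statistically independent, the partial results
# can simply be merged at the end of the run.
all_paths = np.concatenate(histories)
print(all_paths.mean())
```

Spawned streams are both non-overlapping and reproducible, so a parallel run can be repeated exactly, which matters when validating against measurements.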

  5. Monte Carlo computer simulation of sedimentation of charged hard spherocylinders

    Energy Technology Data Exchange (ETDEWEB)

    Viveros-Méndez, P. X., E-mail: xviveros@fisica.uaz.edu.mx; Aranda-Espinoza, S. [Unidad Académica de Física, Universidad Autónoma de Zacatecas, Calzada Solidaridad esq. Paseo, La Bufa s/n, 98060 Zacatecas, Zacatecas, México (Mexico); Gil-Villegas, Alejandro [Departamento de Ingeniería Física, División de Ciencias e Ingenierías, Campus León, Universidad de Guanajuato, Loma del Bosque 103, Lomas del Campestre, 37150 León, Guanajuato, México (Mexico)

    2014-07-28

    In this article we present a NVT Monte Carlo computer simulation study of sedimentation of an electroneutral mixture of oppositely charged hard spherocylinders (CHSC) with aspect ratio L/σ = 5, where L and σ are the length and diameter of the cylinder and hemispherical caps, respectively, for each particle. This system is an extension of the restricted primitive model for spherical particles, where L/σ = 0, and it is assumed that the ions are immersed in a structureless solvent, i.e., a continuum with dielectric constant D. The system consisted of N = 2000 particles and the Wolf method was implemented to handle the coulombic interactions of the inhomogeneous system. Results are presented for different values of the strength ratio between the gravitational and electrostatic interactions, Γ = (mgσ)/(e{sup 2}/Dσ), where m is the mass per particle, e is the electron's charge and g is the gravitational acceleration value. A semi-infinite simulation cell was used with dimensions L{sub x} ≈ L{sub y} and L{sub z} = 5L{sub x}, where L{sub x}, L{sub y}, and L{sub z} are the box dimensions in Cartesian coordinates, and the gravitational force acts along the z-direction. Sedimentation effects were studied by looking at every layer formed by the CHSC along the gravitational field. By increasing Γ, particles tend to get more packed at each layer and to arrange in local domains with an orientational ordering along two perpendicular axes, a feature not observed in the uncharged system with the same hard-body geometry. This type of arrangement, known as tetratic phase, has been observed in two-dimensional systems of hard rectangles and rounded hard squares. In this way, the coupling of gravitational and electric interactions in the CHSC system induces the arrangement of particles in layers, with the formation of quasi-two-dimensional tetratic phases near the surface.

  6. A Monte Carlo-based model for simulation of digital chest tomo-synthesis

    International Nuclear Information System (INIS)

    Ullman, G.; Dance, D. R.; Sandborg, M.; Carlsson, G. A.; Svalkvist, A.; Baath, M.

    2010-01-01

    The aim of this work was to calculate synthetic digital chest tomo-synthesis projections using a computer simulation model based on the Monte Carlo method. An anthropomorphic chest phantom was scanned in a computed tomography scanner, segmented and included in the computer model to allow for simulation of realistic high-resolution X-ray images. The input parameters to the model were adapted to correspond to the VolumeRAD chest tomo-synthesis system from GE Healthcare. Sixty tomo-synthesis projections were calculated with projection angles ranging from + 15 to -15 deg. The images from primary photons were calculated using an analytical model of the anti-scatter grid and a pre-calculated detector response function. The contributions from scattered photons were calculated using an in-house Monte Carlo-based model employing a number of variance reduction techniques such as the collision density estimator. Tomographic section images were reconstructed by transferring the simulated projections into the VolumeRAD system. The reconstruction was performed for three types of images using: (i) noise-free primary projections, (ii) primary projections including contributions from scattered photons and (iii) projections as in (ii) with added correlated noise. The simulated section images were compared with corresponding section images from projections taken with the real, anthropomorphic phantom from which the digital voxel phantom was originally created. The present article describes a work in progress aiming towards developing a model intended for optimisation of chest tomo-synthesis, allowing for simulation of both existing and future chest tomo-synthesis systems. (authors)

  7. Monte Carlo simulation applied to alpha spectrometry

    International Nuclear Information System (INIS)

    Baccouche, S.; Gharbi, F.; Trabelsi, A.

    2007-01-01

    Alpha particle spectrometry is a widely-used analytical method, in particular when we deal with pure alpha emitting radionuclides. Monte Carlo simulation is an adequate tool to investigate the influence of various phenomena on this analytical method. We performed an investigation of those phenomena using the simulation code GEANT of CERN. The results concerning the geometrical detection efficiency in different measurement geometries agree with analytical calculations. This work confirms that Monte Carlo simulation of solid angle of detection is a very useful tool to determine with very good accuracy the detection efficiency.
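
    The geometrical efficiency calculation the abstract validates against analytical results can be reproduced for the simplest case, an on-axis point source facing a circular detector. A sketch (the source-detector geometry below is illustrative, not the measurement geometries of the study):

```python
import numpy as np

def detection_efficiency(distance, radius, n=1_000_000, seed=9):
    """Monte Carlo geometrical efficiency of a circular detector of the
    given radius facing an on-axis point source: sample isotropic emission
    directions and count those within the cone subtended by the detector."""
    rng = np.random.default_rng(seed)
    cos_t = rng.uniform(-1.0, 1.0, n)   # isotropic: cos(theta) is uniform
    hits = cos_t > distance / np.hypot(distance, radius)
    return hits.mean()

eff = detection_efficiency(distance=5.0, radius=1.0)
# Analytical fractional solid angle for the same geometry:
analytic = 0.5 * (1.0 - 5.0 / np.hypot(5.0, 1.0))
print(eff, analytic)
```

The Monte Carlo estimate converges to the analytical solid-angle fraction, which is exactly the cross-check reported; the Monte Carlo approach then extends to extended sources and absorbers where no closed form exists.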

  8. BRAND program complex for neutron-physical experiment simulation by the Monte-Carlo method

    International Nuclear Information System (INIS)

    Androsenko, A.A.; Androsenko, P.A.

    1984-01-01

    The capabilities of the BRAND program complex for simulating neutron and γ-radiation transport by the Monte Carlo method are briefly described. The complex includes the following modules: a geometry module, a source module, a detector module, and modules for sampling the particle's direction of motion after an interaction and its free path. The complex is written in the FORTRAN language and implemented on the BESM-6 computer

  9. Numerical stabilization of entanglement computation in auxiliary-field quantum Monte Carlo simulations of interacting many-fermion systems.

    Science.gov (United States)

    Broecker, Peter; Trebst, Simon

    2016-12-01

    In the absence of a fermion sign problem, auxiliary-field (or determinantal) quantum Monte Carlo (DQMC) approaches have long been the numerical method of choice for unbiased, large-scale simulations of interacting many-fermion systems. More recently, the conceptual scope of this approach has been expanded by introducing ingenious schemes to compute entanglement entropies within its framework. On a practical level, these approaches, however, suffer from a variety of numerical instabilities that have largely impeded their applicability. Here we report on a number of algorithmic advances to overcome many of these numerical instabilities and significantly improve the calculation of entanglement measures in the zero-temperature projective DQMC approach, ultimately allowing us to reach similar system sizes as for the computation of conventional observables. We demonstrate the applicability of this improved DQMC approach by providing an entanglement perspective on the quantum phase transition from a magnetically ordered Mott insulator to a band insulator in the bilayer square lattice Hubbard model at half filling.

  10. Hybrid Multilevel Monte Carlo Simulation of Stochastic Reaction Networks

    KAUST Repository

    Moraes, Alvaro

    2015-01-07

    Stochastic reaction networks (SRNs) are a class of continuous-time Markov chains intended to describe, from the kinetic point of view, the time evolution of chemical systems in which molecules of different chemical species undergo a finite set of reaction channels. This talk is based on articles [4, 5, 6], where we are interested in the following problem: given an SRN, X, defined through its set of reaction channels and its initial state, x0, estimate E (g(X(T))); that is, the expected value of a scalar observable, g, of the process, X, at a fixed time, T. This problem leads us to define a series of Monte Carlo estimators, M, that with high probability produce values close to the quantity of interest, E (g(X(T))). More specifically, given a user-selected tolerance, TOL, and a small confidence level, η, we seek an estimator, M, based on approximate sampled paths of X, such that P (|E (g(X(T))) − M| ≤ TOL) ≥ 1 − η; moreover, we want to achieve this objective with near-optimal computational work. We first introduce a hybrid path-simulation scheme based on the well-known stochastic simulation algorithm (SSA) [3] and the tau-leap method [2]. Then, we introduce a multilevel Monte Carlo strategy that allows us to achieve a computational complexity of order O(TOL⁻²), the same computational complexity as an exact method but with a smaller constant. We provide numerical examples to show our results.
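
The stochastic simulation algorithm (SSA) on which the hybrid scheme builds can be sketched as follows; this is a generic textbook Gillespie implementation with a hypothetical pure-decay reaction, not the authors' code.

```python
import random

def ssa(x0, rates, stoich, T, rng):
    """Gillespie's stochastic simulation algorithm (SSA).

    x0     : initial copy numbers (list)
    rates  : list of propensity functions a_j(x)
    stoich : list of state-change vectors nu_j
    T      : final time
    Returns the state X(T).
    """
    x, t = list(x0), 0.0
    while True:
        a = [r(x) for r in rates]
        a0 = sum(a)
        if a0 == 0.0:
            return x                      # absorbed: no reaction can fire
        t += rng.expovariate(a0)          # time to next reaction ~ Exp(a0)
        if t > T:
            return x
        # pick reaction j with probability a_j / a0
        u, acc, j = rng.random() * a0, 0.0, 0
        for j, aj in enumerate(a):
            acc += aj
            if u < acc:
                break
        x = [xi + dj for xi, dj in zip(x, stoich[j])]

# Hypothetical example: pure decay X -> 0 with rate c = 1 and X(0) = 100,
# so E[X(1)] = 100 * exp(-1) ~ 36.8.
rng = random.Random(0)
samples = [ssa([100], [lambda x: 1.0 * x[0]], [[-1]], T=1.0, rng=rng)[0]
           for _ in range(2000)]
mean = sum(samples) / len(samples)
```

A plain Monte Carlo estimator of E (g(X(T))) averages g over such exact paths; the multilevel strategy of the talk replaces most of these exact paths with cheaper tau-leap approximations.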

  11. Monte Carlo simulation of the Leksell Gamma Knife: I. Source modelling and calculations in homogeneous media

    Energy Technology Data Exchange (ETDEWEB)

    Moskvin, Vadim [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)]. E-mail: vmoskvin@iupui.edu; DesRosiers, Colleen; Papiez, Lech; Timmerman, Robert; Randall, Marcus; DesRosiers, Paul [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)

    2002-06-21

    The Monte Carlo code PENELOPE has been used to simulate photon flux from the Leksell Gamma Knife, a precision method for treating intracranial lesions. Radiation from a single {sup 60}Co assembly traversing the collimator system was simulated, and phase space distributions at the output surface of the helmet for photons and electrons were calculated. The characteristics describing the emitted final beam were used to build a two-stage Monte Carlo simulation of irradiation of a target. A dose field inside a standard spherical polystyrene phantom, usually used for Gamma Knife dosimetry, has been computed and compared with experimental results, with calculations performed by other authors with the use of the EGS4 Monte Carlo code, and data provided by the treatment planning system Gamma Plan. Good agreement was found between these data and results of simulations in homogeneous media. Owing to this established accuracy, PENELOPE is suitable for simulating problems relevant to stereotactic radiosurgery. (author)

  12. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    Science.gov (United States)

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over the past two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general-purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach to porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator reduces the computational time of MC simulation and yields a speed-up comparable to that of a GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.

  13. Validation of a Monte Carlo model used for simulating tube current modulation in computed tomography over a wide range of phantom conditions/challenges

    Energy Technology Data Exchange (ETDEWEB)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.; McNitt-Gray, Michael F. [Departments of Biomedical Physics and Radiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States); DeMarco, John J. [Department of Radiation Oncology, University of California, Los Angeles, Los Angeles, California 90095 (United States)

    2014-11-01

    Purpose: Monte Carlo (MC) simulation methods have been widely used in patient dosimetry in computed tomography (CT), including estimating patient organ doses. However, most simulation methods have undergone a limited set of validations, often using homogeneous phantoms with simple geometries. As clinical scanning has become more complex and the use of tube current modulation (TCM) has become pervasive in the clinic, MC simulations should include these techniques in their methodologies and therefore should also be validated using a variety of phantoms with different shapes and material compositions that give rise to a variety of differently modulated tube current profiles. The purpose of this work is to perform the measurements and simulations needed to validate a Monte Carlo model under a variety of test conditions in which fixed tube current (FTC) and TCM were used. Methods: A previously developed MC model for estimating dose from CT scans that models TCM, built on the MCNPX platform, was used for CT dose quantification. In order to validate the suitability of this model to accurately simulate patient dose from FTC and TCM CT scans, measurements and simulations were compared over a wide range of conditions. Phantoms used for testing ranged from simple geometries with homogeneous composition (16 and 32 cm computed tomography dose index phantoms) to more complex phantoms, including a rectangular homogeneous water-equivalent phantom, an elliptical phantom with three sections (each section homogeneous, but of a different material), and a heterogeneous, complex-geometry anthropomorphic phantom. Each phantom elicits varying levels of x-, y- and z-modulation. Each phantom was scanned on a multidetector-row CT (Sensation 64) scanner under the conditions of both FTC and TCM. Dose measurements were made at various surface and depth positions within each phantom. Simulations using each phantom were performed for FTC, detailed x–y–z TCM, and z-axis-only TCM to obtain

  14. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
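
Of the four approaches named above, discrete-event simulation is perhaps the simplest to sketch: patients arrive at random, queue for a single provider, and waiting times emerge from the sequence of events. The sketch below is a minimal single-provider queue in that spirit; the arrival and service rates are hypothetical, for illustration only, and real ED models would track many more resources.

```python
import random

def simulate_ed(n_patients, arrival_rate, service_rate, seed=0):
    """Minimal discrete-event-style simulation of a single-provider queue.

    Arrivals and service times are exponentially distributed (M/M/1).
    Each arrival either starts service immediately or waits until the
    provider frees up. Returns the mean waiting time before service.
    """
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)   # inter-arrival times
        arrivals.append(t)
    free_at = 0.0                            # time the provider next frees up
    waits = []
    for a in arrivals:
        start = max(a, free_at)              # wait if provider still busy
        waits.append(start - a)
        free_at = start + rng.expovariate(service_rate)
    return sum(waits) / len(waits)

# Hypothetical rates: one arrival per 10 min, mean service time 5 min,
# so queueing theory predicts a mean wait near 5 min at this load.
mean_wait = simulate_ed(5000, arrival_rate=1 / 10, service_rate=1 / 5)
```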

  15. Reactor physics simulations with coupled Monte Carlo calculation and computational fluid dynamics

    International Nuclear Information System (INIS)

    Seker, V.; Thomas, J. W.; Downar, T. J.

    2007-01-01

    The interest in high-fidelity modeling of nuclear reactor cores has increased over the last few years and has become computationally more feasible because of the dramatic improvements in processor speed and the availability of low-cost parallel platforms. In the research presented here, high-fidelity, multi-physics analyses were performed by solving the neutron transport equation using Monte Carlo methods and by solving the thermal-hydraulics equations using computational fluid dynamics. A computational tool based on coupling the Monte Carlo code MCNP5 and the Computational Fluid Dynamics (CFD) code STAR-CD was developed as an audit tool for lower-order nuclear reactor calculations. This paper presents the methodology of the developed computer program 'McSTAR' along with the verification and validation efforts. McSTAR is written in the PERL programming language and couples MCNP5 and the commercial CFD code STAR-CD. MCNP uses a continuous-energy cross section library produced by the NJOY code system from the raw ENDF/B data. A major part of the work was to develop and implement methods to update the cross section library with the temperature distribution calculated by STAR-CD for every region. Three different methods were investigated, and two of them are implemented in McSTAR. The user subroutines in STAR-CD are modified to read the power density data and assign them to the appropriate variables in the program, and to write an output data file containing the temperature, density and indexing information needed to perform the mapping between MCNP and STAR-CD cells. The necessary input file manipulation, data file generation, normalization and multi-processor calculation settings are all handled through the program flow in McSTAR. Initial testing of the code was performed using a single pin cell and a 3X3 PWR pin-cell problem. The preliminary results of the single pin-cell problem are compared with those obtained from a STAR-CD coupled calculation with the deterministic transport code De

  16. Enhancement of high-energy distribution tail in Monte Carlo semiconductor simulations using a Variance Reduction Scheme

    Directory of Open Access Journals (Sweden)

    Vincenza Di Stefano

    2009-11-01

    Full Text Available The Multicomb variance reduction technique has been introduced into direct Monte Carlo simulation of submicrometric semiconductor devices. The method has been implemented for bulk silicon. The simulations show that the statistical variance of hot electrons is reduced at some computational cost. The method is efficient and easy to implement in existing device simulators.

  17. Fast Monte Carlo for ion beam analysis simulations

    International Nuclear Information System (INIS)

    Schiettekatte, Francois

    2008-01-01

    A Monte Carlo program for the simulation of ion beam analysis data is presented. It combines four main features: (i) ion slowdown is computed separately from the main scattering/recoil event, which is directed towards the detector. (ii) A virtual detector, that is, a detector larger than the actual one, can be used, followed by trajectory correction. (iii) For each collision during ion slowdown, scattering angle components are extracted from tables. (iv) Tables of scattering angle components, stopping power and energy straggling are indexed using the binary representation of floating point numbers, which allows logarithmic distribution of these tables without the computation of logarithms to access them. Tables are sufficiently fine-grained that interpolation is not necessary. Ion slowdown computation thus avoids trigonometric, inverse and transcendental function calls and, as much as possible, divisions. All these improvements make possible the computation of 10⁷ collisions/s on current PCs. Results for transmitted ions of several masses in various substrates compare well to those obtained using SRIM-2006 in terms of both angular and energy distributions, as long as a sufficiently large number of collisions is considered for each ion. Examples of simulated spectra show good agreement with experimental data, although a large detector rather than the virtual detector has to be used to properly simulate background signals that are due to plural collisions. The program, written in standard C, is open-source and distributed under the terms of the GNU General Public License
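
Feature (iv) can be illustrated in a few lines: because IEEE-754 stores the exponent in the high bits of a floating point number, the top bits of its binary representation already form an approximately logarithmically spaced bin index, with no log() call. This is a generic illustration of the trick, not the program's actual table layout; the 12-bit index width is an arbitrary choice (11 exponent bits plus one mantissa bit, giving two bins per octave).

```python
import struct

def float_bits_index(x, bits=12):
    """Map a positive float to a table index using its IEEE-754 bit pattern.

    A double is laid out as sign (1 bit), exponent (11 bits), mantissa
    (52 bits). Keeping the top `bits` bits after the sign therefore yields
    an index that grows logarithmically with x, without computing log(x).
    """
    (u,) = struct.unpack("<Q", struct.pack("<d", x))
    return (u >> (63 - bits)) & ((1 << bits) - 1)

# Doubling x advances the index by a constant step: log-spaced binning.
i1 = float_bits_index(1.0)
i2 = float_bits_index(2.0)   # i1 + 2 (one exponent increment = 2 bins here)
i4 = float_bits_index(4.0)   # i2 + 2
```

A lookup table indexed this way is denser at small arguments and sparser at large ones, which matches how stopping power and scattering tables are typically tabulated.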

  18. Direct Measurement of Power Dissipated by Monte Carlo Simulations on CPU and FPGA Platforms

    OpenAIRE

    Albicocco, Pietro; Papini, Davide; Nannarelli, Alberto

    2012-01-01

    In this technical report, we describe how power dissipation measurements on different computing platforms (a desktop computer and an FPGA board) are performed using a Hall effect-based current sensor. The chosen application is a Monte Carlo simulation for European option pricing, a popular algorithm used in financial computations. The Hall effect probe measurements complement the measurements performed on the core of the FPGA by a built-in Xilinx power monitoring system.

  19. A Monte Carlo Simulation Framework for Testing Cosmological Models

    Directory of Open Access Journals (Sweden)

    Heymann Y.

    2014-10-01

    Full Text Available We tested alternative cosmologies using Monte Carlo simulations based on the sampling method of the zCosmos galactic survey. The survey encompasses a collection of observable galaxies with respective redshifts that have been obtained for a given spectroscopic area of the sky. Using a cosmological model, we can convert the redshifts into light-travel times and, by slicing the survey into small redshift buckets, compute a curve of galactic density over time. Because foreground galaxies obstruct the images of more distant galaxies, we simulated the theoretical galactic density curve using an average galactic radius. By comparing the galactic density curves of the simulations with that of the survey, we could assess the cosmologies. We applied the test to the expanding-universe cosmology of de Sitter and to a dichotomous cosmology.

  20. Dosimetric reconstruction of radiological accidents by numerical simulations associating an anthropomorphic model and a Monte Carlo computation code

    International Nuclear Information System (INIS)

    Courageot, Estelle

    2010-01-01

    After a description of the context of radiological accidents (definition, history, context, exposure types, the clinical symptoms associated with irradiation and contamination, medical treatment, lessons learned) and a presentation of dose assessment in the case of external exposure (clinical, biological and physical dosimetry), this research thesis describes the principles of numerical reconstruction of a radiological accident, presents some computation codes (Monte Carlo code, MCNPX code) and the SESAME tool, and reports an application to an actual case (an accident which occurred in Ecuador in April 2009). The next part reports the developments performed to modify the posture of voxelized phantoms, together with their experimental and numerical validation. The last part reports a feasibility study for the reconstruction of radiological accidents occurring in external radiotherapy. This work is based on a Monte Carlo simulation of a linear accelerator, with the aim of identifying the most relevant parameters to be implemented in SESAME in the case of external radiotherapy

  1. Tally and geometry definition influence on the computing time in radiotherapy treatment planning with MCNP Monte Carlo code.

    Science.gov (United States)

    Juste, B; Miro, R; Gallardo, S; Santos, A; Verdu, G

    2006-01-01

    The present work has simulated the photon and electron transport in a Theratron 780 (MDS Nordion) (60)Co radiotherapy unit, using the Monte Carlo transport code MCNP (Monte Carlo N-Particle), version 5. With a view to the computational efficiency required for practical radiotherapy treatment planning, this work focuses mainly on the analysis of dose results and on the computing time required by the different tallies applied in the model to speed up calculations.

  2. Monte Carlo simulated dynamical magnetization of single-chain magnets

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jun; Liu, Bang-Gui, E-mail: bgliu@iphy.ac.cn

    2015-03-15

    Here, a dynamical Monte-Carlo (DMC) method is used to study the temperature-dependent dynamical magnetization of the well-known Mn{sub 2}Ni system as a typical example of single-chain magnets with strong magnetic anisotropy. Simulated magnetization curves are in good agreement with experimental results at typical temperatures and sweeping rates, and simulated coercive fields as functions of temperature are also consistent with experimental curves. Further analysis indicates that the magnetization reversal is determined by both thermally activated effects and quantum spin tunneling. These results can help explore the basic properties and applications of such important magnetic systems. - Highlights: • Monte Carlo simulated magnetization curves are in good agreement with experimental results. • Simulated coercive fields as functions of temperature are consistent with experimental results. • The magnetization reversal is understood in terms of the Monte Carlo simulations.
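
The field-sweep protocol behind such simulated magnetization curves (record the magnetization while the applied field ramps down and back up, one Monte Carlo sweep per field value) can be sketched with a generic kinetic Metropolis simulation of a 1D Ising chain. The chain length, temperature and sweep parameters below are hypothetical, and the model is far simpler than the anisotropic Mn2Ni Hamiltonian; it only illustrates how hysteresis and a coercive field emerge from thermally activated flips.

```python
import math, random

def sweep_magnetization(n=100, J=1.0, T=0.5, h_max=2.0, steps=400, seed=1):
    """Kinetic Monte Carlo sketch of a hysteresis loop for a 1D Ising chain.

    The field h is swept h_max -> -h_max -> h_max; at each field value one
    Metropolis sweep is performed and the magnetization m = <s> recorded.
    Returns (fields, m), two lists of equal length.
    """
    rng = random.Random(seed)
    s = [1] * n                                    # start fully magnetized
    down = [h_max - 2 * h_max * k / steps for k in range(steps + 1)]
    fields = down + down[::-1]                     # down sweep, then up sweep
    m = []
    for h in fields:
        for i in range(n):
            nb = s[(i - 1) % n] + s[(i + 1) % n]   # nearest-neighbor sum
            dE = 2 * s[i] * (J * nb + h)           # cost of flipping spin i
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                s[i] = -s[i]                       # Metropolis acceptance
        m.append(sum(s) / n)
    return fields, m

fields, m = sweep_magnetization()
# m stays near +1 at strong positive field, reverses to near -1 on the
# down sweep, and returns to near +1 on the up sweep: an open loop.
```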

  3. Power-feedwater temperature operating domain for Sbwr applying Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar M, L. A.; Quezada G, S.; Espinosa M, E. G.; Vazquez R, A.; Varela H, J. R.; Cazares R, R. I.; Espinosa P, G., E-mail: sequega@gmail.com [Universidad Autonoma Metropolitana, Unidad Iztapalapa, San Rafael Atlixco No. 186, Col. Vicentina, 09340 Mexico D. F. (Mexico)

    2014-10-15

    In this work, an analysis of the effects of feedwater temperature on reactor power in a simplified boiling water reactor (Sbwr), applying a methodology based on Monte Carlo simulation, is presented. The Monte Carlo methodology was applied systematically to establish the operating domain; because the Sbwr is not yet in operation, the analysis of the nuclear and thermal-hydraulic processes must rely on numerical modeling, with the purpose of developing or confirming the design basis and qualifying existing or new computer codes to enable reliable analyses. The results show that the reactor power is inversely proportional to the feedwater temperature: reactor power changes by 8% when the feedwater temperature changes by 8%. (Author)

  4. Power-feedwater temperature operating domain for Sbwr applying Monte Carlo simulation

    International Nuclear Information System (INIS)

    Aguilar M, L. A.; Quezada G, S.; Espinosa M, E. G.; Vazquez R, A.; Varela H, J. R.; Cazares R, R. I.; Espinosa P, G.

    2014-10-01

    In this work, an analysis of the effects of feedwater temperature on reactor power in a simplified boiling water reactor (Sbwr), applying a methodology based on Monte Carlo simulation, is presented. The Monte Carlo methodology was applied systematically to establish the operating domain; because the Sbwr is not yet in operation, the analysis of the nuclear and thermal-hydraulic processes must rely on numerical modeling, with the purpose of developing or confirming the design basis and qualifying existing or new computer codes to enable reliable analyses. The results show that the reactor power is inversely proportional to the feedwater temperature: reactor power changes by 8% when the feedwater temperature changes by 8%. (Author)

  5. A conservative and a hybrid early rejection schemes for accelerating Monte Carlo molecular simulation

    KAUST Repository

    Kadoura, Ahmad Salim

    2014-03-17

    Molecular simulation can provide a more detailed description of fluid systems than experimental techniques, and can also replace equations of state; however, molecular simulation usually demands considerable computational effort. Several techniques have been developed to overcome such high computational costs. In this paper, two early rejection schemes, a conservative one and a hybrid one, are introduced. In these two methods, undesired configurations generated by the Monte Carlo trials are rejected earlier than they would be when using conventional algorithms. The methods are tested for structureless single-component Lennard-Jones particles in both the canonical and NVT-Gibbs ensembles. A reduction in computational time is observed for both ensembles over a wide range of thermodynamic conditions. Results show that computational time savings are directly proportional to the rejection rate of Monte Carlo trials. The proposed conservative scheme has been shown to save up to 40% of the computational time in the canonical ensemble and up to 30% in the NVT-Gibbs ensemble when compared to standard algorithms; in addition, it preserves the exact Markov chains produced by the Metropolis scheme. Further enhancement for the NVT-Gibbs ensemble is achieved by combining this technique with the bond-formation early rejection scheme. The hybrid method achieves more than 50% saving of central processing unit (CPU) time.
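
The conservative idea (abort the energy summation as soon as rejection is certain, while reproducing the exact Metropolis decision) can be sketched as follows. The Metropolis uniform u is drawn before the energy sum, turning the acceptance test exp(-beta*dE) > u into a threshold on dE; the pairwise sum is then aborted once even the most favorable remaining pairs could not bring dE below that threshold. The per-pair lower bound de_min and the example numbers are hypothetical, not the authors' Lennard-Jones bounds.

```python
import math, random

def metropolis_move_early_reject(pair_deltas, beta, de_min, rng):
    """One Metropolis accept/reject decision with conservative early rejection.

    pair_deltas : per-pair energy changes of the trial move
    de_min      : proven lower bound on any single pair's energy change
    Accept iff total dE < de_max, where de_max = -ln(u)/beta for the
    pre-drawn uniform u. The sum is aborted as soon as the accumulated dE
    plus `remaining * de_min` already reaches de_max, i.e. when rejection
    is certain; the decision is identical to standard Metropolis, so the
    Markov chain is preserved exactly.
    """
    u = rng.random()
    de_max = -math.log(u) / beta
    total, n = 0.0, len(pair_deltas)
    for k, de in enumerate(pair_deltas):
        total += de
        remaining = n - k - 1
        if total + remaining * de_min >= de_max:
            return False                  # certain rejection: stop summing
    return total < de_max

rng = random.Random(3)
# Hypothetical pair energy changes; de_min = -1.0 bounds each from below.
accepted = metropolis_move_early_reject([0.5, 2.0, 3.0, -0.2], beta=1.0,
                                        de_min=-1.0, rng=rng)
```

The time saved grows with the rejection rate, which matches the proportionality reported in the abstract.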

  6. GPU acceleration of Monte Carlo simulations for polarized photon scattering in anisotropic turbid media.

    Science.gov (United States)

    Li, Pengcheng; Liu, Celong; Li, Xianpeng; He, Honghui; Ma, Hui

    2016-09-20

    In earlier studies, we developed scattering models and the corresponding CPU-based Monte Carlo simulation programs to study the behavior of polarized photons as they propagate through complex biological tissues. Exploring the simulation results over many degrees of freedom created a demand for massive simulation tasks. In this paper, we report a parallel implementation of the simulation program based on the compute unified device architecture running on a graphics processing unit (GPU). Different schemes for sphere-only simulations and sphere-cylinder mixture simulations were developed. Diverse optimization methods were employed to achieve the best acceleration. The final version of the GPU program is hundreds of times faster than the CPU version. The dependence of performance on input parameters and precision was also studied. It is shown that using single precision in the GPU simulations results in very limited losses in accuracy. Consumer-level graphics cards, even those in laptop computers, are more cost-effective than scientific graphics cards for single-precision computation.

  7. Molecular Monte Carlo Simulations Using Graphics Processing Units: To Waste Recycle or Not?

    Science.gov (United States)

    Kim, Jihan; Rodgers, Jocelyn M; Athènes, Manuel; Smit, Berend

    2011-10-11

    In the waste recycling Monte Carlo (WRMC) algorithm, (1) multiple trial states may be simultaneously generated and utilized during Monte Carlo moves to improve the statistical accuracy of the simulations, suggesting that such an algorithm may be well suited to implementation in parallel on graphics processing units (GPUs). In this paper, we implement two waste recycling Monte Carlo algorithms in CUDA (Compute Unified Device Architecture), using uniformly distributed random trial states and trial states based on displacement random-walk steps, and we test the methods on a methane-zeolite MFI framework system to evaluate their utility. We discuss the specific implementation details of the waste recycling GPU algorithm and compare the methods to other parallel algorithms optimized for the framework system. We analyze the relationship between the statistical accuracy of our simulations and the CUDA block size to determine the efficient allocation of the GPU hardware resources. We make comparisons between the GPU and the serial CPU Monte Carlo implementations to assess speedup over conventional microprocessors. Finally, we apply our optimized GPU algorithms to the important problem of determining free energy landscapes, in this case for molecular motion through the zeolite LTA.
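
The essence of waste recycling can be shown with a single-trial Metropolis chain: instead of sampling only the state the chain lands in, each step contributes the acceptance-weighted average of the trial and current states to the running average, so rejected trials are not wasted. The 1D Gaussian target below is a toy stand-in, not the methane-zeolite system of the paper.

```python
import math, random

def sample_means(n_steps=20000, step=1.5, seed=7):
    """Metropolis sampling of a 1D Gaussian target exp(-x^2 / 2).

    Returns (standard estimate, waste-recycling estimate) of <x^2>.
    The waste-recycling estimator counts every trial state, weighted by
    its Metropolis acceptance probability, instead of only visited states.
    """
    rng = random.Random(seed)
    x = 0.0
    s_std = s_wr = 0.0
    for _ in range(n_steps):
        y = x + rng.uniform(-step, step)                 # trial state
        p_acc = min(1.0, math.exp((x * x - y * y) / 2))  # Metropolis prob.
        s_wr += p_acc * y * y + (1.0 - p_acc) * x * x    # recycle the trial
        if rng.random() < p_acc:
            x = y
        s_std += x * x                                   # standard estimator
    return s_std / n_steps, s_wr / n_steps

est_std, est_wr = sample_means()   # both should be close to <x^2> = 1
```

With K trial states per move, the same weighting extends to all K candidates, which is what makes the scheme attractive for GPU parallelism.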

  8. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • The new developed CAD-based Monte Carlo program named SuperMC for integrated simulation of nuclear system makes use of hybrid MC-deterministic method and advanced computer technologies. SuperMC is designed to perform transport calculation of various types of particles, depletion and activation calculation including isotope burn-up, material activation and shutdown dose, and multi-physics coupling calculation including thermo-hydraulics, fuel performance and structural mechanics. The bi-directional automatic conversion between general CAD models and physical settings and calculation models can be well performed. Results and process of simulation can be visualized with dynamical 3D dataset and geometry model. Continuous-energy cross section, burnup, activation, irradiation damage and material data etc. are used to support the multi-process simulation. Advanced cloud computing framework makes the computation and storage extremely intensive simulation more attractive just as a network service to support design optimization and assessment. The modular design and generic interface promotes its flexible manipulation and coupling of external solvers. • The new developed and incorporated advanced methods in SuperMC was introduced including hybrid MC-deterministic transport method, particle physical interaction treatment method, multi-physics coupling calculation method, geometry automatic modeling and processing method, intelligent data analysis and visualization method, elastic cloud computing technology and parallel calculation method. • The functions of SuperMC2.1 integrating automatic modeling, neutron and photon transport calculation, results and process visualization was introduced. It has been validated by using a series of benchmarking cases such as the fusion reactor ITER model and the fast reactor BN-600 model. - Abstract: Monte Carlo (MC) method has distinct advantages to simulate complicated nuclear systems and is envisioned as a routine

  9. Optimizing maintenance and repair policies via a combination of genetic algorithms and Monte Carlo simulation

    International Nuclear Information System (INIS)

    Marseguerra, M.; Zio, E.

    2000-01-01

    In this paper we present an optimization approach based on the combination of a genetic algorithm maximization procedure with Monte Carlo simulation. The approach is applied within the context of plant logistic management, as concerns the choice of maintenance and repair strategies. A stochastic model of plant operation is developed from the standpoint of its reliability/availability behavior, i.e. of the failure/repair/maintenance processes of its components. The model is evaluated by Monte Carlo simulation in terms of the economic costs and revenues of operation. The flexibility of the Monte Carlo method allows us to include several practical aspects such as stand-by operation modes, deteriorating repairs, aging, sequences of periodic maintenances, the number of repair teams available for different kinds of repair interventions (mechanical, electronic, hydraulic, etc.), and component priority rankings. A genetic algorithm is then utilized to optimize the component maintenance periods and the number of repair teams. The fitness function that is the object of the optimization is a profit function which inherently accounts for the safety and economic performance of the plant and whose value is computed by the above Monte Carlo simulation model. For an efficient combination of genetic algorithms and Monte Carlo simulation, only a few hundred Monte Carlo histories are performed for each potential solution proposed by the genetic algorithm. Statistical significance of the results for the solutions of interest (i.e. the best ones) is then attained by exploiting the fact that, during the population evolution, the fit chromosomes appear repeatedly many times. The proposed optimization approach is applied to two case studies of increasing complexity
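
The coupling described above (cheap, few-history Monte Carlo estimates of the fitness inside a genetic algorithm loop, with a larger-sample evaluation reserved for the final candidates) can be sketched with a toy stand-in objective. The quadratic "profit" function, the noise level and the GA parameters below are all hypothetical, not the plant model of the paper; the point is only the division of Monte Carlo effort.

```python
import random

def mc_fitness(x, n_hist, rng):
    """Stand-in for a Monte Carlo profit estimate: the (hypothetical) true
    profit -(x - 3)^2 observed through noise, averaged over n_hist cheap
    'histories'. Few histories keep each candidate evaluation inexpensive."""
    return sum(-(x - 3.0) ** 2 + rng.gauss(0.0, 1.0)
               for _ in range(n_hist)) / n_hist

def ga_optimize(pop_size=30, generations=60, n_hist=20, seed=11):
    """Genetic algorithm over a scalar decision variable in [0, 10]."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=lambda x: mc_fitness(x, n_hist, rng),
                        reverse=True)
        elite = ranked[: pop_size // 3]              # selection
        pop = elite + [
            min(10.0, max(0.0, rng.choice(elite) + rng.gauss(0.0, 0.3)))
            for _ in range(pop_size - len(elite))    # mutation of elites
        ]
    # Final ranking with many more histories for statistical significance.
    return max(pop, key=lambda x: mc_fitness(x, 200, rng))

best = ga_optimize()   # the noiseless optimum is x = 3
```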

  10. Direct Monte Carlo simulation of nanoscale mixed gas bearings

    Directory of Open Access Journals (Sweden)

    Kyaw Sett Myo

    2015-06-01

    Full Text Available Sealed hard drives filled with helium gas mixtures have recently been proposed as an alternative to current hard drives, to achieve higher reliability and lower position error. It is therefore important to understand the effects of different helium gas mixtures on the slider bearing characteristics in the head–disk interface. In this article, helium/air and helium/argon gas mixtures are applied as the working fluids and their effects on the bearing characteristics are studied using the direct simulation Monte Carlo method. From the direct simulation Monte Carlo simulations, the physical properties of these gas mixtures, such as mean free path and dynamic viscosity, are obtained and compared with those from theoretical models; both sets of results are comparable. Using these gas mixture properties, the bearing pressure distributions are calculated for different helium fractions with conventional molecular gas lubrication models. The outcomes reveal that the molecular gas lubrication results agree relatively well with those of the direct simulation Monte Carlo simulations, especially for the pure air, helium, or argon gas cases. For gas mixtures, the bearing pressures predicted by the molecular gas lubrication model are slightly larger than those from the direct simulation Monte Carlo simulation.
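
The kinetic-theory mean free path against which such DSMC values are checked can be computed directly. The sketch below uses the standard hard-sphere mixture formula; the helium and air diameters, masses and number density are rough assumed values for illustration, not data from the article.

```python
import math

def mean_free_path(i, diameters, masses, number_densities):
    """Hard-sphere mean free path of species i in a gas mixture.

    Kinetic theory: lambda_i = 1 / sum_j [pi * d_ij^2 * n_j * sqrt(1 + m_i/m_j)]
    with the collision diameter d_ij = (d_i + d_j) / 2. For a pure gas this
    reduces to the familiar lambda = 1 / (sqrt(2) * pi * d^2 * n).
    """
    total = 0.0
    for dj, mj, nj in zip(diameters, masses, number_densities):
        dij = 0.5 * (diameters[i] + dj)
        total += math.pi * dij ** 2 * nj * math.sqrt(1.0 + masses[i] / mj)
    return 1.0 / total

# Helium/air mixture (assumed rough values, for illustration only):
d = [2.18e-10, 3.70e-10]   # effective molecular diameters of He, air [m]
m = [6.65e-27, 4.81e-26]   # molecular masses of He, air [kg]
n_total = 2.5e25           # total number density near 1 atm, 20 deg C [1/m^3]

lam_pure = mean_free_path(0, d, m, [n_total, 0.0])                # pure He
lam_mix = mean_free_path(0, d, m, [0.9 * n_total, 0.1 * n_total])
# Adding the larger air molecules at fixed total density shortens the
# helium mean free path (lam_mix < lam_pure).
```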

  11. Monte Carlo simulation of radiative processes in electron-positron scattering

    International Nuclear Information System (INIS)

    Kleiss, R.H.P.

    1982-01-01

    The Monte Carlo simulation of scattering processes has turned out to be one of the most successful methods of translating theoretical predictions into experimentally meaningful quantities. It is the purpose of this thesis to describe how this approach can be applied to higher-order QED corrections to several fundamental processes. In chapter II a very brief overview of the currently interesting phenomena in e⁺e⁻ scattering is given. It is argued that accurate information on higher-order QED corrections is very important and that the Monte Carlo approach is one of the most flexible and general methods to obtain this information. In chapter III the author describes various techniques which are useful in this context, and makes a few remarks on the numerical aspects of the proposed method. In the following three chapters he applies this to the processes e⁺e⁻ → μ⁺μ⁻(γ) and e⁺e⁻ → qq̄(γ). In chapter IV he motivates his choice of these processes in view of their experimental and theoretical relevance. The formulae necessary for a computer simulation of all quantities of interest, up to order α³, are given. Chapters V and VI describe how this simulation can be performed using the techniques mentioned in chapter III. In chapter VII it is shown how additional dynamical quantities, namely the polarization of the incoming and outgoing particles, can be incorporated in this treatment, and the relevant formulae for the example processes mentioned above are given. Finally, in chapter VIII the author presents some examples of the comparison between theoretical predictions based on Monte Carlo simulations as outlined here, and the results from actual experiments. (Auth.)

  12. Penelope-2006: a code system for Monte Carlo simulation of electron and photon transport

    International Nuclear Information System (INIS)

    2006-01-01

    The computer code system PENELOPE (version 2006) performs Monte Carlo simulation of coupled electron-photon transport in arbitrary materials for a wide energy range, from a few hundred eV to about 1 GeV. Photon transport is simulated by means of the standard, detailed simulation scheme. Electron and positron histories are generated on the basis of a mixed procedure, which combines detailed simulation of hard events with condensed simulation of soft interactions. A geometry package called PENGEOM permits the generation of random electron-photon showers in material systems consisting of homogeneous bodies limited by quadric surfaces, i.e. planes, spheres, cylinders, etc. This report is intended not only to serve as a manual of the PENELOPE code system, but also to provide the user with the necessary information to understand the details of the Monte Carlo algorithm. These proceedings contain the corresponding manual and teaching notes of the PENELOPE-2006 workshop and training course, held on 4-7 July 2006 in Barcelona, Spain. (author)

  13. A computer simulation model to compute the radiation transfer of mountainous regions

    Science.gov (United States)

    Li, Yuguang; Zhao, Feng; Song, Rui

    2011-11-01

    In mountainous regions, the radiometric signal recorded at the sensor depends on a number of factors such as sun angle, atmospheric conditions, surface cover type, and topography. In this paper, a computer simulation model of radiation transfer is designed and evaluated. This model implements Monte Carlo ray-tracing techniques and is specifically dedicated to the study of light propagation in mountainous regions. The radiative interactions between sunlight and the objects within the mountainous region are modeled using forward Monte Carlo ray-tracing methods. The performance of the model is evaluated through detailed comparisons with the well-established 3D computer simulation model RGM (Radiosity-Graphics combined Model), based on the same scenes and identical spectral parameters, showing good agreement between the two models' results. Using the newly developed computer model, series of typical mountainous scenes are generated to analyze the physical mechanism of mountainous radiation transfer. The results show that the effects of adjacent slopes are important for deep valleys and particularly affect shadowed pixels, and that topographic effects need to be considered in mountainous terrain before accurate inferences can be made from remotely sensed data.

  14. Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds

    Science.gov (United States)

    Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.

    2012-11-01

    A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.

  15. TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Badal, A [U.S. Food and Drug Administration (CDRH/OSEL), Silver Spring, MD (United States); Zbijewski, W [Johns Hopkins University, Baltimore, MD (United States); Bolch, W [University of Florida, Gainesville, FL (United States); Sechopoulos, I [Emory University, Atlanta, GA (United States)

    2014-06-15

    Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primary ratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10{sup 7} x-ray/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the

  16. TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging

    International Nuclear Information System (INIS)

    Badal, A; Zbijewski, W; Bolch, W; Sechopoulos, I

    2014-01-01

    Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primary ratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10^7 x-ray/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the virtual
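    The sparse-scoring strategy described in this record — estimate the quantity at a few detector pixels, then interpolate to the full detector — can be sketched in a few lines. The smooth scatter profile and noise level below are hypothetical, purely to illustrate the idea:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical smooth scatter profile across 256 detector pixels.
    pixels = np.arange(256)
    true_scatter = 100.0 * np.exp(-((pixels - 128.0) / 60.0) ** 2)

    # Score a (noisy) Monte Carlo estimate at every 16th pixel only, then
    # linearly interpolate to all pixels, as the abstract describes.
    sparse = pixels[::16]
    noisy = true_scatter[sparse] + rng.normal(0.0, 1.0, sparse.size)
    full = np.interp(pixels, sparse, noisy)

    max_err = np.max(np.abs(full - true_scatter))
    ```

    Because scatter varies slowly across the detector, interpolating 16 scored pixels recovers the full profile to within a few counts while the simulation only tracks photons to 1/16th of the pixels.
    
    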

  17. Monte Carlo simulation of continuous-space crystal growth

    International Nuclear Information System (INIS)

    Dodson, B.W.; Taylor, P.A.

    1986-01-01

    We describe a method, based on Monte Carlo techniques, for simulating the atomic growth of crystals without the discrete lattice space assumed by conventional Monte Carlo growth simulations. Since no lattice space is assumed, problems involving epitaxial growth, heteroepitaxy, phonon-driven mechanisms, surface reconstruction, and many other phenomena incompatible with the lattice-space approximation can be studied. Use of the Monte Carlo method also circumvents, to some extent, the extreme limitations on simulated timescale inherent in crystal-growth techniques based on molecular dynamics. The implementation of the new method is illustrated by studying the growth of strained-layer superlattice (SLS) interfaces in two-dimensional Lennard-Jones atomic systems. Despite the extreme simplicity of such systems, the qualitative features of SLS growth seen here are similar to those observed experimentally in real semiconductor systems.

  18. Cost effective distributed computing for Monte Carlo radiation dosimetry

    International Nuclear Information System (INIS)

    Wise, K.N.; Webb, D.V.

    2000-01-01

    Full text: An inexpensive computing facility has been established for performing repetitive Monte Carlo simulations with the BEAM and EGS4/EGSnrc codes of linear accelerator beams, of effective dose from diagnostic imaging procedures, and of ion chambers and phantoms used for the Australian high energy absorbed dose standards. The facility currently consists of 3 dual-processor 450 MHz PCs linked by a high speed LAN. The 3 PCs can be accessed either locally, from a single keyboard/monitor/mouse combination using a SwitchView controller, or remotely via a computer network from PCs with suitable communications software (e.g. Telnet, Kermit, etc.). All 3 PCs are identically configured with the Red Hat Linux 6.0 operating system. A Fortran compiler and the BEAM and EGS4/EGSnrc codes are available on all 3 PCs. The preparation of sequences of jobs utilising the Monte Carlo codes is simplified using load-distributing software (enFuzion 6.0, marketed by TurboLinux Inc, formerly Clustor from Active Tools) which efficiently distributes the computing load amongst all 6 processors. We describe 3 applications of the system: (a) energy spectra from radiotherapy sources, (b) mean mass-energy absorption coefficients and stopping powers for absolute absorbed dose standards, and (c) dosimetry for diagnostic procedures; (a) and (b) are based on the transport codes BEAM and FLURZnrc while (c) is a Fortran/EGS code developed at ARPANSA. Efficiency gains ranged from 3 for (c) to close to the theoretical maximum of 6 for (a) and (b), with the gain depending on the amount of 'bookkeeping' needed to begin each task and the time taken to complete a single task. We have found the use of a load-balancing batch processing system with many PCs to be an economical way of achieving greater productivity for Monte Carlo calculations or for any computer-intensive task requiring many runs with different parameters. Copyright (2000) Australasian College of Physical Scientists and

  19. LCG MCDB - a Knowledgebase of Monte Carlo Simulated Events

    CERN Document Server

    Belov, S; Galkin, E; Gusev, A; Pokorski, Witold; Sherstnev, A V

    2008-01-01

    In this paper we report on the LCG Monte Carlo Data Base (MCDB) and the software which has been developed to operate it. The main purpose of the LCG MCDB project is to provide a storage and documentation system for sophisticated event samples simulated for the LHC collaborations by experts. In many cases, modern Monte Carlo simulation of physical processes requires expert knowledge of Monte Carlo generators or a significant amount of CPU time to produce the events. MCDB is a knowledgebase intended mainly to accumulate simulated events of this type. The main motivation behind LCG MCDB is to make sophisticated MC event samples available to various physics groups. All the data in MCDB are accessible in several convenient ways. LCG MCDB is being developed within the CERN LCG Application Area Simulation project.

  20. Communication: Predicting virial coefficients and alchemical transformations by extrapolating Mayer-sampling Monte Carlo simulations

    Science.gov (United States)

    Hatch, Harold W.; Jiao, Sally; Mahynski, Nathan A.; Blanco, Marco A.; Shen, Vincent K.

    2017-12-01

    Virial coefficients are predicted over a large range of both temperatures and model parameter values (i.e., alchemical transformation) from an individual Mayer-sampling Monte Carlo simulation by statistical mechanical extrapolation with minimal increase in computational cost. With this extrapolation method, a Mayer-sampling Monte Carlo simulation of the SPC/E (extended simple point charge) water model quantitatively predicted the second virial coefficient as a continuous function spanning over four orders of magnitude in value and over three orders of magnitude in temperature with less than a 2% deviation. In addition, the same simulation predicted the second virial coefficient if the site charges were scaled by a constant factor, from an increase of 40% down to zero charge. This method is also shown to perform well for the third virial coefficient and the exponential parameter for a Lennard-Jones fluid.
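    For context, the second virial coefficient that the paper extrapolates is the integral B2 = -(1/2) ∫ f(r) d³r over the Mayer function f(r) = exp(-u(r)/kT) - 1. A minimal brute-force Monte Carlo sketch for hard spheres, where B2 = 2πσ³/3 exactly — this is plain direct sampling, not the Mayer-sampling importance scheme or the extrapolation method of the paper:

    ```python
    import math
    import random

    random.seed(2)

    SIGMA = 1.0  # hard-sphere diameter (reduced units)

    def mayer_f(r):
        # Mayer function for hard spheres: exp(-beta*u) - 1 = -1 inside the core.
        return -1.0 if r < SIGMA else 0.0

    # B2 = -(1/2) * integral of f over all space; f vanishes outside the core,
    # so it suffices to sample uniformly in a cube enclosing the sphere r < SIGMA.
    n = 100000
    half = SIGMA
    vol = (2.0 * half) ** 3
    acc = 0.0
    for _ in range(n):
        x, y, z = (random.uniform(-half, half) for _ in range(3))
        acc += mayer_f(math.sqrt(x * x + y * y + z * z))
    b2 = -0.5 * vol * acc / n

    exact = 2.0 * math.pi * SIGMA ** 3 / 3.0  # analytic hard-sphere value
    ```

    For realistic potentials like SPC/E water the integrand depends on temperature and charge parameters through exp(-u/kT), which is what makes the paper's single-simulation extrapolation over T and charge scaling possible.
    
    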

  1. Rare event simulation using Monte Carlo methods

    CERN Document Server

    Rubino, Gerardo

    2009-01-01

    In a probabilistic model, a rare event is an event with a very small probability of occurrence. Forecasting rare events is a formidable task but is important in many areas: for instance, a catastrophic failure in a transport system or a nuclear power plant, or the failure of an information processing system in a bank or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, the simulation of corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented along with an exposition of how to apply these tools to a variety of fields ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...
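    Importance sampling, the first technique the book presents, can be illustrated with a standard toy problem: estimating a small Gaussian tail probability by sampling from a shifted distribution and reweighting by the likelihood ratio. The threshold and sample count below are arbitrary:

    ```python
    import math
    import random

    random.seed(3)

    A = 4.0  # threshold; P(X > 4) for a standard normal is about 3.2e-5

    def exact_tail(a):
        # Exact standard normal tail probability via the complementary error function.
        return 0.5 * math.erfc(a / math.sqrt(2.0))

    # Importance sampling: draw from N(A, 1) so the "rare" region is hit about
    # half the time, and reweight each hit by the likelihood ratio
    # phi(x) / phi(x - A) = exp(-A*x + A^2/2).
    n = 100000
    acc = 0.0
    for _ in range(n):
        x = random.gauss(A, 1.0)
        if x > A:
            acc += math.exp(-A * x + A * A / 2.0)
    p_is = acc / n
    p_exact = exact_tail(A)
    ```

    A crude (unshifted) Monte Carlo estimator would need on the order of 1/p ≈ 30,000 samples just to see one event; the shifted estimator reaches a few percent relative error with the same budget.
    
    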

  2. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling.   Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques.   This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...

  3. Randomized quasi-Monte Carlo simulation of fast-ion thermalization

    Science.gov (United States)

    Höök, L. J.; Johnson, T.; Hellsten, T.

    2012-01-01

    This work investigates the applicability of the randomized quasi-Monte Carlo method for simulation of fast-ion thermalization processes in fusion plasmas, e.g. for simulation of neutral beam injection and radio frequency heating. In contrast to the standard Monte Carlo method, the quasi-Monte Carlo method uses deterministic numbers instead of pseudo-random numbers and has a statistical weak convergence close to O(N^-1), where N is the number of markers. We have compared different quasi-Monte Carlo methods for a neutral beam injection scenario, which is solved by many realizations of the associated stochastic differential equation, discretized with the Euler-Maruyama scheme. The statistical convergence of the methods is measured for time steps up to 2^14.
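    The randomization idea can be sketched in one dimension with a van der Corput low-discrepancy sequence plus a Cranley-Patterson random shift modulo 1, which makes the deterministic points an unbiased estimator; the test integrand is arbitrary and unrelated to the thermalization problem:

    ```python
    import random

    random.seed(4)

    def radical_inverse_base2(i):
        # van der Corput sequence: reflect the binary digits of i about the point.
        inv, f = 0.0, 0.5
        while i:
            inv += f * (i & 1)
            i >>= 1
            f *= 0.5
        return inv

    def rqmc_mean(f, n, shift):
        # Randomized QMC: low-discrepancy points with a random shift modulo 1.
        return sum(f((radical_inverse_base2(i) + shift) % 1.0)
                   for i in range(n)) / n

    f = lambda x: x * x  # integral over [0, 1] is exactly 1/3
    n = 2 ** 12
    # Average several independent random shifts; each shift gives an unbiased,
    # low-discrepancy estimate, and the spread across shifts measures the error.
    estimates = [rqmc_mean(f, n, random.random()) for _ in range(8)]
    err = abs(sum(estimates) / len(estimates) - 1.0 / 3.0)
    ```

    With 2^12 points the error is orders of magnitude below the ~N^(-1/2) pseudo-random level, consistent with the near-O(N^-1) convergence quoted in the abstract.
    
    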

  4. Clinical treatment planning for stereotactic radiotherapy, evaluation by Monte Carlo simulation

    International Nuclear Information System (INIS)

    Kairn, T.; Aland, T.; Kenny, J.; Knight, R.T.; Crowe, S.B.; Langton, C.M.; Franich, R.D.; Johnston, P.N.

    2010-01-01

    Full text: This study re-evaluates the doses delivered by a series of clinical stereotactic radiotherapy treatments, to test the accuracy of treatment planning predictions for very small radiation fields. Stereotactic radiotherapy treatment plans for meningiomas near the petrous temporal bone and the foramen magnum (incorporating fields smaller than 1 cm²) were examined using Monte Carlo simulations. Important differences between treatment planning predictions and Monte Carlo calculations of doses delivered to stereotactic radiotherapy patients are apparent. For example, in one case the Monte Carlo calculation shows that delivery of a planned meningioma treatment would spare the patient's critical structures (eyes, brainstem) more effectively than the treatment plan predicted, and therefore suggests that this patient could safely receive an increased dose to their tumour. Monte Carlo simulations can be used to test the dose predictions made by a conventional treatment planning system for dosimetrically challenging small fields, and can thereby suggest valuable modifications to clinical treatment plans. This research was funded by the Wesley Research Institute, Australia. The authors wish to thank Andrew Fielding and David Schlect for valuable discussions of aspects of this work. The authors are also grateful to Muhammad Kakakhel, for assisting with the design and calibration of our linear accelerator model, and to the stereotactic radiation therapy team at Premion, who designed the treatment plans. Computational resources and services used in this work were provided by the HPC and Research Support Unit, QUT, Brisbane, Australia. (author)

  5. Study of Gamma spectra by Monte Carlo simulation

    International Nuclear Information System (INIS)

    Cantaragiu, A.; Gheorghies, A.; Borcia, C.

    2008-01-01

    The purpose of this paper is to obtain gamma-ray spectra from a scintillation detector by applying the Monte Carlo statistical simulation method using the EGS4 program. The Monte Carlo algorithm describes the physical system by probability density functions from which random numbers are generated, and the result is taken as an average of the observed quantities. The EGS4 program allows the simulation of the following physical processes: the photoelectric effect, the Compton effect, electron-positron pair production, and Rayleigh scattering. The gamma rays recorded by the detector are converted into electrical pulses, and the gamma-ray spectra are acquired and processed by means of a Nomad Plus portable spectrometer connected to a computer. 137Cs and 60Co are used as gamma-ray sources; their spectra are recorded and used to study the interaction of gamma radiation with the scintillation detector. The parameters varied during the acquisition of the gamma-ray spectra are the distance between source and detector and the measuring time. Due to the statistical processes in the detector, each peak has the shape of a Gaussian distribution. The gamma quantum energy values are identified from the peaks of the experimental spectra, giving the position, width, and area of each peak. By means of the EGS4 program a simulation is run using these parameters and an 'ideal' spectrum is obtained, i.e. a spectrum which is not influenced by the statistical processes which take place inside the detector. The convolution of the spectra with a normalised Gaussian function is then performed. There is a close match between the experimental results and those simulated with the EGS4 program, because the interactions which occur during the simulation have a statistical behaviour close to the real one. (authors)
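    The final step described above — convolving the 'ideal' simulated spectrum with a normalised Gaussian to mimic the detector's statistical broadening — can be sketched as follows; the resolution value is illustrative, not from the paper:

    ```python
    import numpy as np

    # "Ideal" simulated spectrum: a single full-energy peak at channel 662
    # (the 662 keV line of 137Cs on a 1 keV/channel axis).
    channels = np.arange(1024, dtype=float)
    ideal = np.zeros(1024)
    ideal[662] = 1000.0

    # Detector response: a normalised Gaussian kernel (sigma in channels).
    sigma = 10.0  # illustrative resolution, not a measured value
    kernel_x = np.arange(-50, 51, dtype=float)
    kernel = np.exp(-0.5 * (kernel_x / sigma) ** 2)
    kernel /= kernel.sum()  # normalise so total counts are conserved

    # Convolution turns the delta-like peak into the Gaussian photopeak
    # observed experimentally.
    measured = np.convolve(ideal, kernel, mode="same")
    peak_channel = int(np.argmax(measured))
    ```

    Normalising the kernel conserves the peak area, so the fitted Gaussian area in the broadened spectrum still measures the number of full-energy events.
    
    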

  6. Comparative evaluation of photon cross section libraries for materials of interest in PET Monte Carlo simulations

    CERN Document Server

    Zaidi, H

    1999-01-01

    The many applications of Monte Carlo modelling in nuclear medicine imaging make it desirable to increase the accuracy and computational speed of Monte Carlo codes. The accuracy of Monte Carlo simulations strongly depends on the accuracy of the probability functions and thus on the cross section libraries used for photon transport calculations. A comparison between different photon cross section libraries and parametrizations implemented in Monte Carlo simulation packages developed for positron emission tomography and the most recent Evaluated Photon Data Library (EPDL97) developed by the Lawrence Livermore National Laboratory was performed for several human tissues and common detector materials for energies from 1 keV to 1 MeV. Different photon cross section libraries and parametrizations show quite large variations as compared to the EPDL97 coefficients. This latter library is more accurate and was carefully designed in the form of look-up tables providing efficient data storage, access, and management. Toge...

  7. Importance estimation in Monte Carlo modelling of neutron and photon transport

    International Nuclear Information System (INIS)

    Mickael, M.W.

    1992-01-01

    The estimation of neutron and photon importance in a three-dimensional geometry is achieved using a coupled Monte Carlo and diffusion theory calculation. The parameters required for the solution of the multigroup adjoint diffusion equation are estimated from an analog Monte Carlo simulation of the system under investigation. The solution of the adjoint diffusion equation is then used as an estimate of the particle importance in the actual simulation. This approach provides an automated and efficient variance reduction method for Monte Carlo simulations. The technique has been successfully applied to Monte Carlo simulation of neutron and coupled neutron-photon transport in the nuclear well-logging field. The results show that the importance maps obtained in a few minutes of computer time using this technique are in good agreement with Monte Carlo generated importance maps that require prohibitive computing times. The application of this method to Monte Carlo modelling of the response of neutron porosity and pulsed neutron instruments has resulted in major reductions in computation time. (Author)

  8. Forest canopy BRDF simulation using Monte Carlo method

    NARCIS (Netherlands)

    Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.

    2006-01-01

    Monte Carlo method is a random statistic method, which has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopy in the field of visible remote sensing. The random process between photons and forest canopy was designed using Monte Carlo method.

  9. GPU-based high performance Monte Carlo simulation in neutron transport

    Energy Technology Data Exchange (ETDEWEB)

    Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Inteligencia Artificial Aplicada], e-mail: cmnap@ien.gov.br

    2009-07-01

    Graphics Processing Units (GPU) are high performance co-processors intended, originally, to improve the use and quality of computer graphics applications. Since researchers and practitioners realized the potential of using GPU for general purpose, their application has been extended to other fields out of computer graphics scope. The main objective of this work is to evaluate the impact of using GPU in neutron transport simulation by Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple, but time-consuming problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)
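    The "simple, but time-consuming problem" is not specified in the abstract; a minimal CPU-only analog of such a history loop is sketched below for neutron transmission through a purely absorbing slab, where the exact answer is exp(-Σ_t·d). Each history is independent, which is exactly the property that makes the method map well onto GPU threads:

    ```python
    import math
    import random

    random.seed(5)

    SIGMA_T = 1.0    # total macroscopic cross section, 1/cm (illustrative)
    THICKNESS = 3.0  # slab thickness, cm (illustrative)

    def transmitted(n_histories):
        """Analog Monte Carlo: sample a free-flight distance for each neutron
        and count those crossing the slab without colliding (pure absorber,
        normal incidence)."""
        hits = sum(
            1 for _ in range(n_histories)
            if -math.log(1.0 - random.random()) / SIGMA_T > THICKNESS
        )
        return hits / n_histories

    estimate = transmitted(200000)
    exact = math.exp(-SIGMA_T * THICKNESS)  # analytic transmission, ~0.0498
    ```

    On a GPU, each thread would run one (or a batch of) such histories with its own random stream, and the per-thread tallies would be reduced at the end; the 15x speedup quoted above comes from that parallel reduction of independent histories.
    
    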

  10. GPU-based high performance Monte Carlo simulation in neutron transport

    International Nuclear Information System (INIS)

    Heimlich, Adino; Mol, Antonio C.A.; Pereira, Claudio M.N.A.

    2009-01-01

    Graphics Processing Units (GPU) are high performance co-processors intended, originally, to improve the use and quality of computer graphics applications. Since researchers and practitioners realized the potential of using GPU for general purpose, their application has been extended to other fields out of computer graphics scope. The main objective of this work is to evaluate the impact of using GPU in neutron transport simulation by Monte Carlo method. To accomplish that, GPU- and CPU-based (single and multicore) approaches were developed and applied to a simple, but time-consuming problem. Comparisons demonstrated that the GPU-based approach is about 15 times faster than a parallel 8-core CPU-based approach also developed in this work. (author)

  11. Randomized quasi-Monte Carlo simulation of fast-ion thermalization

    International Nuclear Information System (INIS)

    Höök, L J; Johnson, T; Hellsten, T

    2012-01-01

    This work investigates the applicability of the randomized quasi-Monte Carlo method for simulation of fast-ion thermalization processes in fusion plasmas, e.g. for simulation of neutral beam injection and radio frequency heating. In contrast to the standard Monte Carlo method, the quasi-Monte Carlo method uses deterministic numbers instead of pseudo-random numbers and has a statistical weak convergence close to O(N^-1), where N is the number of markers. We have compared different quasi-Monte Carlo methods for a neutral beam injection scenario, which is solved by many realizations of the associated stochastic differential equation, discretized with the Euler-Maruyama scheme. The statistical convergence of the methods is measured for time steps up to 2^14. (paper)

  12. Free energy and phase equilibria for the restricted primitive model of ionic fluids from Monte Carlo simulations

    International Nuclear Information System (INIS)

    Orkoulas, G.; Panagiotopoulos, A.Z.

    1994-01-01

    In this work, we investigate the liquid-vapor phase transition of the restricted primitive model of ionic fluids. We show that at the low temperatures where the phase transition occurs, the system cannot be studied by conventional molecular simulation methods because convergence to equilibrium is slow. To accelerate convergence, we propose cluster Monte Carlo moves capable of moving more than one particle at a time. We then address the issue of charged particle transfers in grand canonical and Gibbs ensemble Monte Carlo simulations, for which we propose a biased particle insertion/destruction scheme capable of sampling short interparticle distances. We compute the chemical potential for the restricted primitive model as a function of temperature and density from grand canonical Monte Carlo simulations and the phase envelope from Gibbs Monte Carlo simulations. Our calculated phase coexistence curve is in agreement with recent results of Caillol obtained on the four-dimensional hypersphere and our own earlier Gibbs ensemble simulations with single-ion transfers, with the exception of the critical temperature, which is lower in the current calculations. Our best estimates for the critical parameters are T_c* = 0.053, ρ_c* = 0.025. We conclude with possible future applications of the biased techniques developed here for phase equilibrium calculations for ionic fluids.

  13. Preliminary performance evaluation of on-the-fly doppler broadening capability for Monte Carlo simulation in MCS

    International Nuclear Information System (INIS)

    Khassenov, A.; Choi, S.; Lee, H.; Zhang, P.; Zheng, Y.; Lee, D.

    2015-01-01

    This paper examines the multipole representation of the cross section and its Doppler broadening in the resolved resonance region. In the first step, the resonance parameters from the nuclear data file are converted to multipoles, i.e. the corresponding poles and residues. The multipole representation allows the cross section to be generated at the target temperature without pre-generating 0 K cross section libraries. In order to reduce the computational time for cross section generation, the energy window concept was implemented and tested. On-the-fly Doppler broadening modules based on the multipole and windowed multipole representations were implemented in a Monte Carlo code, and a pin cell problem was simulated. Simulation times and multiplication factors for the different cases were compared with the original Monte Carlo simulation results. (author)

  14. A backward Monte Carlo method for efficient computation of runaway probabilities in runaway electron simulation

    Science.gov (United States)

    Zhang, Guannan; Del-Castillo-Negrete, Diego

    2017-10-01

    Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the PDFs of RE. Despite the simplifications involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time consuming, especially in the computation of asymptotic-type observables including the runaway probability, the slowing-down and runaway mean times, and the energy limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than brute-force MC methods, significantly reducing the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as a direct MC code, which paves the way for conducting large-scale RE simulations. This work is supported by DOE FES and ASCR under Contract Numbers ERKJ320 and ERAT377.
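
As a toy illustration of the Feynman-Kac connection the abstract relies on, the sketch below estimates a terminal-state probability for a one-dimensional drift-diffusion model by brute-force forward Monte Carlo, the baseline the backward BSDE method is designed to beat. The drift, diffusion, and threshold values are hypothetical; the paper's backward solver is not reproduced here.

```python
import math
import random

def terminal_prob_mc(x0, mu, sigma, T, b, n_paths, n_steps, seed=1):
    """Forward Euler-Maruyama estimate of u(x0, 0) = P(X_T > b) for
    dX = mu dt + sigma dW. Feynman-Kac identifies this expectation with
    the solution of the corresponding backward Kolmogorov equation."""
    rng = random.Random(seed)
    dt = T / n_steps
    sq = math.sqrt(dt)
    hits = 0
    for _ in range(n_paths):
        x = x0
        for _ in range(n_steps):
            x += mu * dt + sigma * sq * rng.gauss(0.0, 1.0)
        if x > b:
            hits += 1
    return hits / n_paths

def phi(z):
    """Standard normal CDF, for the analytic check X_T ~ N(x0 + mu*T, sigma^2*T)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

For constant coefficients the Euler scheme is exact, so the estimate converges to 1 - Φ((b - x0 - μT)/(σ√T)) at the usual 1/√N Monte Carlo rate, which is exactly the cost the backward formulation attacks.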

  15. Implementation of a Monte Carlo simulation environment for fully 3D PET on a high-performance parallel platform

    CERN Document Server

    Zaidi, H; Morel, Christian

    1998-01-01

    This paper describes the implementation of the Eidolon Monte Carlo program designed to simulate fully three-dimensional (3D) cylindrical positron tomographs on a MIMD parallel architecture. The original code was written in Objective-C and developed under the NeXTSTEP development environment. The different steps involved in porting the software to a parallel architecture based on PowerPC 604 processors running under AIX 4.1 are presented. Basic aspects and strategies of running Monte Carlo calculations on parallel computers are described. A linear decrease of the computing time with the number of computing nodes was achieved. The improved time performance resulting from parallelisation of the Monte Carlo calculations makes it an attractive tool for modelling photon transport in 3D positron tomography. The parallelisation paradigm used in this work is independent of the chosen parallel architecture.

  16. Monte-Carlo simulation of electromagnetic showers

    International Nuclear Information System (INIS)

    Amatuni, Ts.A.

    1984-01-01

    The universal ELSS-1 program for Monte Carlo simulation of high-energy electromagnetic showers in homogeneous absorbers of arbitrary geometry has been written. The major processes and effects of electron and photon interaction with matter, in particular the Landau-Pomeranchuk-Migdal effect, are taken into account in the simulation procedures. The simulation results are compared with experimental data. Some characteristics of shower detectors and of electromagnetic showers for energies up to 1 TeV are calculated.

  17. Lattice Boltzmann accelerated direct simulation Monte Carlo for dilute gas flow simulations.

    Science.gov (United States)

    Di Staso, G; Clercx, H J H; Succi, S; Toschi, F

    2016-11-13

    Hybrid particle-continuum computational frameworks permit the simulation of gas flows by locally adjusting the resolution to the degree of non-equilibrium displayed by the flow in different regions of space and time. In this work, we present a new scheme that couples the direct simulation Monte Carlo (DSMC) with the lattice Boltzmann (LB) method in the limit of isothermal flows. The former handles strong non-equilibrium effects, as they typically occur in the vicinity of solid boundaries, whereas the latter is in charge of the bulk flow, where non-equilibrium can be dealt with perturbatively, i.e. according to Navier-Stokes hydrodynamics. The proposed concurrent multiscale method is applied to the dilute gas Couette flow, showing major computational gains when compared with the full DSMC scenarios. In addition, it is shown that the coupling with LB in the bulk flow can speed up the DSMC treatment of the Knudsen layer with respect to the full DSMC case. In other words, LB acts as a DSMC accelerator. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).

  18. Diffusion Monte Carlo approach versus adiabatic computation for local Hamiltonians

    Science.gov (United States)

    Bringewatt, Jacob; Dorland, William; Jordan, Stephen P.; Mink, Alan

    2018-02-01

    Most research regarding quantum adiabatic optimization has focused on stoquastic Hamiltonians, whose ground states can be expressed with only real non-negative amplitudes and for which destructive interference is therefore not manifest. This raises the question of whether classical Monte Carlo algorithms can efficiently simulate quantum adiabatic optimization with stoquastic Hamiltonians. Recent results have given counterexamples in which path-integral and diffusion Monte Carlo fail to do so. However, most adiabatic optimization algorithms, such as those for solving MAX-k-SAT problems, use k-local Hamiltonians, whereas our previous counterexample for diffusion Monte Carlo involved n-body interactions. Here we present a 6-local counterexample which demonstrates that even for these local Hamiltonians there are cases where diffusion Monte Carlo cannot efficiently simulate quantum adiabatic optimization. Furthermore, we perform empirical testing of diffusion Monte Carlo on a standard well-studied class of permutation-symmetric tunneling problems and similarly find large advantages for quantum optimization over diffusion Monte Carlo.

  19. Modern analysis of ion channeling data by Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Nowicki, Lech [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland)]. E-mail: lech.nowicki@fuw.edu.pl; Turos, Andrzej [Institute of Electronic Materials Technology, Wolczynska 133, 01-919 Warsaw (Poland); Ratajczak, Renata [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Stonert, Anna [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Garrido, Frederico [Centre de Spectrometrie Nucleaire et Spectrometrie de Masse, CNRS-IN2P3-Universite Paris-Sud, 91405 Orsay (France)

    2005-10-15

    The basic scheme of Monte Carlo simulation of ion channeling spectra is reformulated in terms of statistical sampling. The McChasy simulation code is described, and two example applications of the code are presented: the calculation of the projectile flux in a uranium dioxide crystal and defect analysis for an ion-implanted InGaAsP/InP superlattice. Virtues and pitfalls of defect analysis using Monte Carlo simulations are discussed.

  20. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    Science.gov (United States)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate that aircraft structures meet damage tolerance requirements throughout the service life. These requirements imply that a damaged aircraft structure must maintain adequate residual strength in order to sustain its integrity, which is accomplished by a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy, comprised of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed-form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a parallel system of three elements to parallel systems of up to six elements. These newly developed expressions are used to check the accuracy of a Monte Carlo simulation algorithm for determining the probability of failure of a parallel system comprised of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme, by utilizing the residual strength of the fasteners subjected to various initial load distributions and then to a new unequal load distribution resulting from subsequent sequential fastener failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach.
The uncertainties associated with the time to crack initiation, the probability of crack detection, the
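
A minimal sketch of the direct Monte Carlo estimate described above, for a parallel system of brittle elements that share the load equally among survivors (elements fail weakest-first and the load is redistributed). The uniform strength distribution used below is an illustrative assumption, not the thesis's model.

```python
import random

def system_capacity(strengths):
    """Capacity of a parallel system with equal load sharing: if the i
    weakest elements have failed, each of the n - i survivors carries
    W / (n - i), so the capacity is max over i of (n - i) * s_(i+1)
    with strengths sorted in ascending order."""
    s = sorted(strengths)
    n = len(s)
    return max((n - i) * s[i] for i in range(n))

def failure_prob(n_elem, load, n_sim, draw, seed=0):
    """Direct Monte Carlo estimate of P(capacity < load) for i.i.d.
    element strengths drawn by draw(rng)."""
    rng = random.Random(seed)
    fails = sum(
        system_capacity([draw(rng) for _ in range(n_elem)]) < load
        for _ in range(n_sim))
    return fails / n_sim
```

For a single element the capacity CDF reduces to the strength CDF, which gives a closed-form check of the simulation, in the same spirit as the thesis's three- to six-element expressions.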

  1. Parallel computing by Monte Carlo codes MVP/GMVP

    International Nuclear Information System (INIS)

    Nagaya, Yasunobu; Nakagawa, Masayuki; Mori, Takamasa

    2001-01-01

    General-purpose Monte Carlo codes MVP/GMVP are well vectorized and thus enable high-speed Monte Carlo calculations. In order to achieve further speedups, we parallelized the codes on different types of parallel computing platforms, either natively or by using the standard parallelization library MPI. The platforms used for benchmark calculations were a distributed-memory vector-parallel computer (Fujitsu VPP500), a distributed-memory massively parallel computer (Intel Paragon) and distributed-memory scalar-parallel computers (Hitachi SR2201, IBM SP2). As is generally the case, linear speedup could be obtained for large-scale problems, but parallelization efficiency decreased as the batch size per processing element (PE) became smaller. It was also found that the statistical uncertainty for assembly powers was less than 0.1% for a PWR full-core calculation with more than 10 million histories, which took about 1.5 hours with massively parallel computing. (author)
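
The quoted statistical uncertainty of assembly powers is the kind of quantity Monte Carlo codes estimate from batch statistics: histories are grouped into batches, and the spread of the batch means gives the standard error of the overall mean. A minimal sketch, with illustrative sample data rather than real tally output, might look like:

```python
import math
import random

def batch_mean_rel_error(sample, n_batches):
    """Relative standard error of the mean estimated from batch means:
    split the sample into n_batches equal batches, take the standard
    deviation of the batch means, and divide by sqrt(n_batches) and by
    the overall mean."""
    size = len(sample) // n_batches
    means = [sum(sample[i * size:(i + 1) * size]) / size
             for i in range(n_batches)]
    m = sum(means) / n_batches
    var = sum((x - m) ** 2 for x in means) / (n_batches - 1)
    return math.sqrt(var / n_batches) / m
```

Because the relative error shrinks as 1/sqrt(N), shrinking the batch size per PE (as in the abstract) raises per-batch overhead without improving statistics, which is one reason parallel efficiency drops.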

  2. Optimizing the HLT Buffer Strategy with Monte Carlo Simulations

    CERN Document Server

    AUTHOR|(CDS)2266763

    2017-01-01

    This project aims to optimize the strategy for utilizing the disk buffer of the High Level Trigger (HLT) of the LHCb experiment with the help of Monte-Carlo simulations. A method is developed which simulates the Event Filter Farm (EFF) -- the computing cluster for the High Level Trigger -- as a compound of nodes with different performance properties. In this way, the behavior of the computing farm can be analyzed at a deeper level than before. It is demonstrated that the current operating strategy might be improved when data taking approaches a mid-year scheduled stop or the year-end technical stop. The processing time of the buffered data can be lowered by distributing the detector data according to the processing power of the nodes instead of the relative disk size, as long as the occupancy level of the buffer is low enough. Moreover, this ensures that data taken and stored on the buffer at the same time are processed by different nodes nearly simultaneously, which reduces the load on the infrastructure.

  3. Computer system for Monte Carlo experimentation

    International Nuclear Information System (INIS)

    Grier, D.A.

    1986-01-01

    A new computer system for Monte Carlo experimentation is presented. The new system speeds up and simplifies the process of coding and preparing a Monte Carlo experiment; it also encourages the proper design of Monte Carlo experiments and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as S, ISP, or Minitab, or with a user-supplied program. Both the experimental results and the experimental design may be loaded directly into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or Latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language.

  4. Simulation of Rossi-α method with analog Monte-Carlo method

    International Nuclear Information System (INIS)

    Lu Yuzhao; Xie Qilin; Song Lingli; Liu Hangang

    2012-01-01

    An analog Monte-Carlo code for simulating the Rossi-α method, based on Geant4, was developed. The prompt neutron decay constant α of six metal uranium configurations at Oak Ridge National Laboratory was calculated. α was also calculated by the burst-neutron method, and the result was consistent with that of the Rossi-α method. There is a difference between the results of the analog Monte-Carlo simulation and the experiment; the reason for the difference is the gaps between the uranium layers. The influence of the gaps decreases as the sub-criticality deepens: the relative difference between the simulated and experimental results changes from 19% to 0.19%. (authors)

  5. Direct utilization of information from nuclear data files in Monte Carlo simulation of neutron and photon transport

    International Nuclear Information System (INIS)

    Androseno, P.; Zholudov, D.; Kompaniyets, A.; Smirnova, O.

    2000-01-01

    In order to improve both the economics of nuclear power plants (NPPs) and their safety, data and computer codes that perform benchmark calculations while simulating NPP parameters must be utilized. This work is mainly concerned with the application of computer codes using the Monte Carlo method, which provides high accuracy in the calculations. (authors)

  6. Monte Carlo simulation of neutron scattering instruments

    International Nuclear Information System (INIS)

    Seeger, P.A.; Daemen, L.L.; Hjelm, R.P. Jr.

    1998-01-01

    A code package consisting of the Monte Carlo Library MCLIB, the executing code MC RUN, the web application MC Web, and various ancillary codes is proposed as an open standard for simulation of neutron scattering instruments. The architecture of the package includes structures to define surfaces, regions, and optical elements contained in regions. A particle is defined by its vector position and velocity, its time of flight, its mass and charge, and a polarization vector. The MC RUN code handles neutron transport and bookkeeping, while the action on the neutron within any region is computed using algorithms that may be deterministic, probabilistic, or a combination. Complete versatility is possible because the existing library may be supplemented by any procedures a user is able to code. Some examples are shown

  7. Dynamic bounds coupled with Monte Carlo simulations

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza; Meester, L.E.; van Gelder, P.H.A.J.M.; Vrijling, J.K.

    2011-01-01

    For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper

  8. Multilevel Monte Carlo in Approximate Bayesian Computation

    KAUST Repository

    Jasra, Ajay

    2017-02-13

    In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed, and it is shown under some assumptions that, for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
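
For orientation, plain rejection ABC, the i.i.d. baseline that the MLMC and sequential Monte Carlo variants improve upon, can be sketched as follows. The normal prior/likelihood and the tolerance below are hypothetical choices for illustration, not the article's examples.

```python
import random

def abc_rejection(y_obs, n_prop, eps, seed=0):
    """Basic rejection ABC: draw theta from the prior, simulate data
    from the model, and keep theta whenever the simulated data lands
    within tolerance eps of the observation. The kept samples
    approximate the ABC posterior; MLMC replaces i.i.d. sampling at the
    finest eps with a telescoping sum over a hierarchy of tolerances."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_prop):
        theta = rng.gauss(0.0, 1.0)      # prior: N(0, 1)
        y_sim = rng.gauss(theta, 1.0)    # model/likelihood: N(theta, 1)
        if abs(y_sim - y_obs) < eps:
            kept.append(theta)
    return kept
```

With this conjugate setup the exact posterior mean given a single observation y is y/2, so the accepted-sample mean provides a quick sanity check; shrinking eps reduces the ABC bias but also the acceptance rate, which is precisely the cost trade-off MLMC addresses.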

  9. Could running experience on SPMD computers contribute to the architectural choices for future dedicated computers for high energy physics simulation?

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.

    1989-01-01

    Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers

  10. Analysis of Monte Carlo methods for the simulation of photon transport

    International Nuclear Information System (INIS)

    Carlsson, G.A.; Kusoffsky, L.

    1975-01-01

    In connection with the transport of low-energy photons (30 - 140 keV) through layers of water of different thicknesses, various aspects of Monte Carlo methods are examined in order to improve their efficiency (to produce statistically more reliable results with shorter computer times) and to bridge the gap between the more physical methods and the more mathematical ones. The calculations are compared with results of experiments involving the simulation of photon transport, using direct methods and collision-density ones. (J.S.)

  11. Analysis of skin tissues spatial fluorescence distribution by the Monte Carlo simulation

    International Nuclear Information System (INIS)

    Churmakov, D Y; Meglinski, I V; Piletsky, S A; Greenhalgh, D A

    2003-01-01

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores that arises from the structure of the collagen fibres, in contrast to the epidermis and stratum corneum, where the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the near-infrared spectral region, whereas the spatial distribution of fluorescence sources within a sensor layer embedded in the epidermis is localized at an 'effective' depth

  12. Analysis of skin tissues spatial fluorescence distribution by the Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Churmakov, D Y [School of Engineering, Cranfield University, Cranfield, MK43 0AL (United Kingdom); Meglinski, I V [School of Engineering, Cranfield University, Cranfield, MK43 0AL (United Kingdom); Piletsky, S A [Institute of BioScience and Technology, Cranfield University, Silsoe, MK45 4DT (United Kingdom); Greenhalgh, D A [School of Engineering, Cranfield University, Cranfield, MK43 0AL (United Kingdom)

    2003-07-21

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores that arises from the structure of the collagen fibres, in contrast to the epidermis and stratum corneum, where the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the near-infrared spectral region, whereas the spatial distribution of fluorescence sources within a sensor layer embedded in the epidermis is localized at an 'effective' depth.

  13. Analysis of skin tissues spatial fluorescence distribution by the Monte Carlo simulation

    Science.gov (United States)

    Y Churmakov, D.; Meglinski, I. V.; Piletsky, S. A.; Greenhalgh, D. A.

    2003-07-01

    A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores that arises from the structure of the collagen fibres, in contrast to the epidermis and stratum corneum, where the distribution of fluorophores is assumed to be homogeneous. The results of the simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the near-infrared spectral region, whereas the spatial distribution of fluorescence sources within a sensor layer embedded in the epidermis is localized at an 'effective' depth.

  14. Steady-State Electrodiffusion from the Nernst-Planck Equation Coupled to Local Equilibrium Monte Carlo Simulations.

    Science.gov (United States)

    Boda, Dezső; Gillespie, Dirk

    2012-03-13

    We propose a procedure to compute the steady-state transport of charged particles based on the Nernst-Planck (NP) equation of electrodiffusion. To close the NP equation and to establish a relation between the concentration and electrochemical potential profiles, we introduce the Local Equilibrium Monte Carlo (LEMC) method. In this method, Grand Canonical Monte Carlo simulations are performed using the electrochemical potential specified for the distinct volume elements. An iteration procedure that self-consistently solves the NP and flux continuity equations with LEMC is shown to converge quickly. This NP+LEMC technique can be used in systems with diffusion of charged or uncharged particles in complex three-dimensional geometries, including systems with low concentrations and small applied voltages that are difficult for other particle simulation techniques.

  15. Stock Price Simulation Using Bootstrap and Monte Carlo

    Directory of Open Access Journals (Sweden)

    Pažický Martin

    2017-06-01

    In this paper, the bootstrap experiment and the Monte Carlo experiment for stock price simulation are assessed and compared. Since the future evolution of a stock price is extremely important for investors, we attempt to find the best method to determine the future price of BNP Paribas' stock. The aim of the paper is to define the value of European and Asian options on BNP Paribas' stock at the maturity date. Four different methods are employed for the simulation. The first method is a bootstrap experiment with a homoscedastic error term, the second is a block bootstrap experiment with a heteroscedastic error term, the third is a Monte Carlo simulation with a heteroscedastic error term and the last is a Monte Carlo simulation with a homoscedastic error term. In the last method it is necessary to model the volatility using an econometric GARCH model. The main purpose of the paper is to compare the mentioned methods and select the most reliable. A further aim of the paper is to examine the difference between the classical European option and the exotic Asian option on the basis of the experimental results.
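
The homoscedastic Monte Carlo variant can be sketched as geometric Brownian motion with constant volatility, pricing a European call and an arithmetic-average Asian call on the same simulated paths. All parameter values below are illustrative; the paper calibrates its error terms to BNP Paribas data rather than assuming GBM outright.

```python
import math
import random

def mc_option_prices(s0, k, r, sigma, T, n_steps, n_paths, seed=0):
    """Monte Carlo prices of a European call (on the terminal price) and
    an arithmetic-average Asian call (on the path average) under
    geometric Brownian motion with constant volatility sigma."""
    rng = random.Random(seed)
    dt = T / n_steps
    drift = (r - 0.5 * sigma ** 2) * dt     # risk-neutral log-drift per step
    vol = sigma * math.sqrt(dt)
    disc = math.exp(-r * T)
    eu = asia = 0.0
    for _ in range(n_paths):
        s, total = s0, 0.0
        for _ in range(n_steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            total += s
        eu += max(s - k, 0.0)
        asia += max(total / n_steps - k, 0.0)
    return disc * eu / n_paths, disc * asia / n_paths
```

The European leg can be checked against the Black-Scholes closed form, and the Asian price comes out lower because averaging reduces the payoff variance, which is the European-versus-Asian contrast the paper examines.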

  16. Toward high-efficiency and detailed Monte Carlo simulation study of the granular flow spallation target

    Science.gov (United States)

    Cai, Han-Jie; Zhang, Zhi-Lei; Fu, Fen; Li, Jian-Yang; Zhang, Xun-Chao; Zhang, Ya-Ling; Yan, Xue-Song; Lin, Ping; Xv, Jian-Ya; Yang, Lei

    2018-02-01

    The dense granular-flow spallation target is a new target concept chosen for the Accelerator-Driven Subcritical (ADS) project in China. For the R&D of this target concept, a dedicated Monte Carlo (MC) program named GMT was developed to simulate the beam-target interaction. Owing to the complexity of the target geometry, the computational cost of the MC simulation of particle tracks is very high. Thus, improved computational efficiency is essential for detailed MC simulation studies of the dense granular target. Here we present the special design of the GMT program and its high-efficiency performance. In addition, the speedup potential of the GPU-accelerated spallation models is discussed.

  17. Monte-Carlo Simulation for PDC-Based Optical CDMA System

    Directory of Open Access Journals (Sweden)

    FAHIM AZIZ UMRANI

    2010-10-01

    This paper presents a Monte-Carlo simulation of Optical CDMA (Code Division Multiple Access) systems and analyses their performance in terms of the BER (Bit Error Rate). The spreading sequences chosen for CDMA are Perfect Difference Codes. Furthermore, this paper derives the expressions for the noise variances from first principles, in order to calibrate the noise for both bipolar (electrical domain) and unipolar (optical domain) signalling as required for the Monte-Carlo simulation. The simulated results conform to theory and show that receiver gain mismatch and splitter loss at the transceiver degrade the system performance.
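
A stripped-down version of such a BER calibration, for bipolar signalling in additive white Gaussian noise with the noise variance set from Eb/N0, might look like the sketch below. This generic BPSK-style example omits the PDC spreading and the optical-domain chain of the paper.

```python
import math
import random

def ber_bpsk_mc(ebn0_db, n_bits, seed=0):
    """Monte Carlo bit-error rate for bipolar (+/-1) signalling in AWGN.
    The noise variance is calibrated from Eb/N0 as sigma^2 = 1/(2*Eb/N0)
    for unit-energy bits, and a zero-threshold detector is used."""
    rng = random.Random(seed)
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    sigma = math.sqrt(1.0 / (2.0 * ebn0))
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice((-1.0, 1.0))
        rx = bit + rng.gauss(0.0, sigma)
        if (rx >= 0.0) != (bit > 0.0):
            errors += 1
    return errors / n_bits

def q_func(x):
    """Gaussian tail function; the theoretical BER is Q(sqrt(2*Eb/N0))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))
```

Comparing the simulated BER against Q(sqrt(2 Eb/N0)) is the standard way to confirm that the noise variances were derived and calibrated correctly before adding impairments such as gain mismatch.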

  18. Assessing the convergence of LHS Monte Carlo simulations of wastewater treatment models.

    Science.gov (United States)

    Benedetti, Lorenzo; Claeys, Filip; Nopens, Ingmar; Vanrolleghem, Peter A

    2011-01-01

    Monte Carlo (MC) simulation appears to be the only tool currently adopted to estimate global sensitivities and uncertainties in wastewater treatment modelling. Such models are highly complex, dynamic and non-linear, requiring long computation times, especially in the scope of MC simulation, due to the large number of simulations usually required. However, no stopping rule to decide on the number of simulations required to achieve a given confidence in the MC simulation results has been adopted so far in the field. In this work, a pragmatic method is proposed to minimize the computation time by using a combination of several criteria. It makes no use of prior knowledge about the model, is very simple, intuitive and can be automated: all convenient features in engineering applications. A case study is used to show an application of the method, and the results indicate that the required number of simulations strongly depends on the model output(s) selected, and on the type and desired accuracy of the analysis conducted. Hence, no prior indication is available regarding the necessary number of MC simulations, but the proposed method is capable of dealing with these variations and stopping the calculations after convergence is reached.
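
One pragmatic stopping rule of this general kind halts the simulation campaign once the 95% confidence half-width of the running mean drops below a relative tolerance, checked after each block of runs. The block size and tolerance below are illustrative, and this single criterion is a simplification of the paper's combination of criteria.

```python
import math
import random

def run_until_converged(sampler, rel_tol=0.01, block=1000, max_sims=10**6):
    """Run Monte Carlo simulations in blocks and stop once the 95% CI
    half-width of the running mean falls below rel_tol * |mean|.
    Returns (mean, half_width, number_of_simulations)."""
    n = 0
    s = s2 = 0.0
    while n < max_sims:
        for _ in range(block):
            x = sampler()       # one model evaluation
            s += x
            s2 += x * x
        n += block
        mean = s / n
        var = (s2 - n * mean * mean) / (n - 1)
        half = 1.96 * math.sqrt(max(var, 0.0) / n)
        if half < rel_tol * abs(mean):
            break
    return mean, half, n
```

As the abstract notes, the number of simulations at which such a criterion fires depends strongly on the output's variance, so it cannot be fixed in advance.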

  19. Monte Carlo Simulation of the Echo Signals from Low-Flying Targets for Airborne Radar

    Directory of Open Access Journals (Sweden)

    Mingyuan Man

    2014-01-01

    A hybrid method combining the half-space physical optics (PO) method, graphical electromagnetic computing (GRECO) and the Monte Carlo method is demonstrated for computing the echo signals from low-flying targets in a realistic environment for airborne radar. The half-space physical optics method, combined with the GRECO method to eliminate the shadow regions quickly and rebuild the target automatically, is employed to calculate the radar cross section (RCS) of conductive targets in half space quickly and accurately. The direct echo is computed from the radar equation. The paths reflected from the sea or ground surface cause multipath effects. In order to obtain the echo signals accurately, the phase factors are modified for the fluctuations in multipath, and the statistical average of the echo signals is obtained using the Monte Carlo method. A typical simulation is performed, and the numerical results show the accuracy of the proposed method.

  20. Multiple histogram method and static Monte Carlo sampling

    NARCIS (Netherlands)

    Inda, M.A.; Frenkel, D.

    2004-01-01

    We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From

  1. Application of direct simulation Monte Carlo method for analysis of AVLIS evaporation process

    International Nuclear Information System (INIS)

    Nishimura, Akihiko

    1995-01-01

    A computation code based on the direct simulation Monte Carlo (DSMC) method was developed in order to analyze atomic vapor evaporation in atomic vapor laser isotope separation (AVLIS). The atomic excitation temperatures of gadolinium atoms were calculated for a model with five low-lying states. The calculated results were compared with experimental data obtained by laser absorption spectroscopy. Two types of DSMC simulations, which differed in the inelastic collision procedure, were carried out. It was concluded that energy transfer is forbidden unless the total energy of the colliding atoms exceeds a threshold value. (author)

  2. Could running experience on SPMD computers contribute to the architectural choices for future dedicated computers for high energy physics simulation?

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Silva, J.; Auguin, M.; Boeri, F.

    1989-01-01

    Results obtained on a strongly coupled parallel computer are reported. They concern Monte-Carlo simulation and pattern recognition. Though the calculations were made on an experimental computer of rather low processing power, it is believed that the quoted figures could give useful indications on architectural choices for dedicated computers. (orig.)

  3. Direct Simulation Monte Carlo (DSMC) on the Connection Machine

    International Nuclear Information System (INIS)

    Wong, B.C.; Long, L.N.

    1992-01-01

    The massively parallel Connection Machine is used to implement an improved version of the direct simulation Monte Carlo (DSMC) method for solving flows governed by the Boltzmann equation. Kinetic theory is required for analyzing hypersonic aerospace applications, and the features and capabilities of the DSMC particle-simulation technique are discussed. The DSMC method is shown to be inherently massively parallel and data parallel; the algorithm is based on moving molecules, cross-referencing their locations, locating collisions within cells, and sampling macroscopic quantities in each cell. The serial DSMC code is compared to the present parallel DSMC code, and timing results show that the speedup of the parallel version is approximately linear. The correct physics can be resolved from the results of the complete DSMC method implemented on the Connection Machine using the data-parallel approach. 41 refs

  4. Monte Carlo techniques in radiation therapy

    CERN Document Server

    Verhaegen, Frank

    2013-01-01

    Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book-the first of its kind-addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...

  5. Use of Monte Carlo simulation software for calculating effective dose in cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Gomes B, W. O., E-mail: wilsonottobatista@gmail.com [Instituto Federal da Bahia, Rua Emidio dos Santos s/n, Barbalho 40301-015, Salvador de Bahia (Brazil)

    2016-10-15

    This study aimed to develop an irradiation geometry applicable to the PCXMC software and to use it to calculate effective doses in cone beam computed tomography (CBCT) applications. We evaluated three different CBCT units for dental applications: the Carestream CS 9000 3D tomograph, the Classical i-CAT, and the GENDEX GXCB-500. Each preset protocol was first characterized by measuring the entrance surface air kerma and the air kerma-area product, P_KA, with RADCAL solid-state detectors and a PTW transmission chamber. The technical parameters and geometric conditions of each preset protocol were then entered into the PCXMC software to obtain effective dose values. The calculated effective doses ranged from 9.0 to 15.7 μSv for the CS 9000 3D, from 44.5 to 89 μSv for the GXCB-500, and from 62 to 111 μSv for the Classical i-CAT. These values were compared with dosimetry results obtained with TLDs implanted in an anthropomorphic phantom and were found to be consistent. The effective dose results are very sensitive to the irradiation geometry (beam position on the mathematical phantom), which is a weakness of the software; nevertheless, it is a very useful tool for obtaining quick answers regarding protocol optimization. We conclude that the PCXMC Monte Carlo simulation software is useful for assessing protocols in dental CBCT examinations. (Author)

  6. Development and validation of Monte Carlo dose computations for contrast-enhanced stereotactic synchrotron radiation therapy

    International Nuclear Information System (INIS)

    Vautrin, M.

    2011-01-01

    Contrast-enhanced stereotactic synchrotron radiation therapy (SSRT) is an innovative technique based on localized dose-enhancement effects obtained by reinforced photoelectric absorption in the tumor. Medium-energy monochromatic X-rays (50-100 keV) are used for irradiating tumors previously loaded with a high-Z element. Clinical trials of SSRT are being prepared at the European Synchrotron Radiation Facility (ESRF), where an iodinated contrast agent will be used. In order to compute the energy deposited in the patient (dose), a dedicated treatment planning system (TPS) has been developed for the clinical trials, based on the ISOgray TPS. This work focuses on the SSRT-specific modifications of the TPS, especially to the PENELOPE-based Monte Carlo dose engine. The TPS uses a dedicated Monte Carlo simulation of medium-energy polarized photons to compute the deposited energy in the patient. Simulations are performed considering the synchrotron source, the modeled beamline geometry, and finally the patient. Specific materials were also implemented in the voxelized geometry of the patient to account for iodine concentrations in the tumor. The computation process has been optimized and parallelized. Finally, a specific computation of absolute doses and associated irradiation times (instead of monitor units) was implemented. The dedicated TPS was validated with depth dose curves, dose profiles, and absolute dose measurements performed at the ESRF in a water tank and in solid water phantoms with or without bone slabs. (author) [fr

  7. Scouting the feasibility of Monte Carlo reactor dynamics simulations

    International Nuclear Information System (INIS)

    Legrady, David; Hoogenboom, J. Eduard

    2008-01-01

    In this paper we present an overview of the methodological questions related to Monte Carlo simulation of time-dependent power transients in nuclear reactors. Using a small fictional 3D reactor with isotropic scattering and a single energy group, we have performed direct Monte Carlo transient calculations with simulation of delayed neutrons, both with and without thermal feedback. Using biased delayed neutron sampling and population control at time step boundaries, calculation times were kept reasonably low. We have identified the initial source determination and the prompt chain simulations as the key issues that require most attention. (authors)

  8. Scouting the feasibility of Monte Carlo reactor dynamics simulations

    Energy Technology Data Exchange (ETDEWEB)

    Legrady, David [Forschungszentrum Dresden-Rossendorf, Dresden (Germany); Hoogenboom, J. Eduard [Delft University of Technology, Delft (Netherlands)

    2008-07-01

    In this paper we present an overview of the methodological questions related to Monte Carlo simulation of time-dependent power transients in nuclear reactors. Using a small fictional 3D reactor with isotropic scattering and a single energy group, we have performed direct Monte Carlo transient calculations with simulation of delayed neutrons, both with and without thermal feedback. Using biased delayed neutron sampling and population control at time step boundaries, calculation times were kept reasonably low. We have identified the initial source determination and the prompt chain simulations as the key issues that require most attention. (authors)

  9. Monte Carlo Simulations of Phosphate Polyhedron Connectivity in Glasses

    Energy Technology Data Exchange (ETDEWEB)

    ALAM,TODD M.

    1999-12-21

    Monte Carlo simulations of phosphate tetrahedron connectivity distributions in alkali and alkaline earth phosphate glasses are reported. By utilizing a discrete bond model, the distribution of next-nearest-neighbor connectivities between phosphate polyhedra for random, alternating, and clustering bonding scenarios was evaluated as a function of the relative bond energy difference. The simulated distributions are compared to connectivities observed experimentally in solid-state two-dimensional exchange and double-quantum NMR experiments on phosphate glasses. These Monte Carlo simulations demonstrate that the polyhedron connectivity is best described by a random distribution in lithium phosphate and calcium phosphate glasses.

  10. A computationally efficient moment-preserving Monte Carlo electron transport method with implementation in Geant4

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, D.A., E-mail: ddixon@lanl.gov [Los Alamos National Laboratory, P.O. Box 1663, MS P365, Los Alamos, NM 87545 (United States); Prinja, A.K., E-mail: prinja@unm.edu [Department of Nuclear Engineering, MSC01 1120, 1 University of New Mexico, Albuquerque, NM 87131-0001 (United States); Franke, B.C., E-mail: bcfrank@sandia.gov [Sandia National Laboratories, Albuquerque, NM 87123 (United States)

    2015-09-15

    This paper presents the theoretical development and numerical demonstration of a moment-preserving Monte Carlo electron transport method. Foremost, a full implementation of the moment-preserving (MP) method within the Geant4 particle simulation toolkit is demonstrated. Beyond implementation details, it is shown that the MP method is a viable alternative to the condensed history (CH) method for inclusion in current and future generation transport codes through demonstration of the key features of the method including: systematically controllable accuracy, computational efficiency, mathematical robustness, and versatility. A wide variety of results common to electron transport are presented illustrating the key features of the MP method. In particular, it is possible to achieve accuracy that is statistically indistinguishable from analog Monte Carlo, while remaining up to three orders of magnitude more efficient than analog Monte Carlo simulations. Finally, it is shown that the MP method can be generalized to any applicable analog scattering DCS model by extending previous work on the MP method beyond analytical DCSs to the partial-wave (PW) elastic tabulated DCS data.

  11. Lattice gauge theories and Monte Carlo simulations

    International Nuclear Information System (INIS)

    Rebbi, C.

    1981-11-01

    After some preliminary considerations, the discussion of quantum gauge theories on a Euclidean lattice takes up the definition of Euclidean quantum theory and the treatment of the continuum limit; an analogy is made with statistical mechanics. Perturbative methods can produce useful results for strong or weak coupling. In attempts to investigate the properties of the systems for intermediate coupling, numerical methods known as Monte Carlo simulations have proved valuable. The bulk of this paper illustrates the basic ideas underlying the Monte Carlo numerical techniques and the major results achieved with them, according to the following program: Monte Carlo simulations (general theory, practical considerations); the phase structure of Abelian and non-Abelian models; the observables (the coefficient of the linear term in the potential between two static sources at large separation, the mass of the lowest excited state with the quantum numbers of the vacuum (the so-called glueball), the potential between two static sources at very small distance, and the critical temperature at which sources become deconfined); gauge fields coupled to bosonic matter (Higgs) fields; and systems with fermions

  12. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    International Nuclear Information System (INIS)

    Brown, Forrest B.; Univ. of New Mexico, Albuquerque, NM

    2016-01-01

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. 
The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations
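
The "just simulate the particle behavior" idea from these notes fits in a few lines. Below is a minimal analog Monte Carlo sketch (our illustration, not taken from the lecture notes): free paths are sampled from the exponential distribution and the fraction of particles crossing a purely absorbing slab is compared with the analytic answer.

```python
import math
import random

def slab_transmission(sigma_t, thickness, n_histories, seed=0):
    """Analog Monte Carlo estimate of uncollided transmission through a
    purely absorbing slab.  The distance to the first collision is sampled
    as -ln(xi)/sigma_t; the analytic answer is exp(-sigma_t * thickness)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_histories):
        path = -math.log(1.0 - rng.random()) / sigma_t  # exponential free path
        if path > thickness:
            transmitted += 1
    return transmitted / n_histories

estimate = slab_transmission(sigma_t=1.0, thickness=2.0, n_histories=100_000)
exact = math.exp(-2.0)   # the estimate converges to this as 1/sqrt(N)
```

As the notes say, the devil is in the details: adding scattering, tallies, and variance reduction is where the rest of the course material comes in.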

  13. Monte Carlo Techniques for Nuclear Systems - Theory Lectures

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications Group; Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.

    2016-11-29

    These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. 
The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations

  14. Monte Carlo simulations in skin radiotherapy

    International Nuclear Information System (INIS)

    Sarvari, A.; Jeraj, R.; Kron, T.

    2000-01-01

    The primary goal of this work was to develop a procedure for calculating the appropriate filter shape for a brachytherapy applicator used in skin radiotherapy. In the applicator, a radioactive source is positioned close to the skin. Without a filter, the resulting dose distribution would be highly nonuniform, whereas high uniformity is usually required. This can be achieved using an appropriately shaped filter, which flattens the dose profile. Because of the complexity of the transport and geometry, Monte Carlo simulations had to be used. A ¹⁹²Ir high dose rate photon source was used. All necessary transport parameters were simulated with the MCNP4B Monte Carlo code. A highly efficient iterative procedure was developed, which enabled calculation of the optimal filter shape in only a few iterations. The initially non-uniform dose distributions became uniform to within a percent when the filter calculated by this procedure was applied. (author)

  15. Monte Carlo simulations in theoretical physics

    International Nuclear Information System (INIS)

    Billoire, A.

    1991-01-01

    After a presentation of the principle of the Monte Carlo method, the method is applied, first to the calculation of critical exponents in the three-dimensional Ising model, and second to discrete quantum chromodynamics, with computation times given as a function of computer power. 28 refs., 4 tabs
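
As a concrete instance of the Monte Carlo method applied to the Ising model, here is a minimal Metropolis sketch (in two dimensions rather than the three discussed above, to keep it short; lattice size and couplings are illustrative assumptions):

```python
import math
import random

def ising_metropolis(L, beta, sweeps, seed=0):
    """Metropolis sampling of the 2D Ising model (J = 1) on an L x L
    periodic lattice; returns the mean |magnetization| per spin."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]
    mags = []
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2 * spins[i][j] * nn          # energy cost of flipping
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                spins[i][j] = -spins[i][j]     # Metropolis acceptance
        mags.append(abs(sum(map(sum, spins))) / (L * L))
    return sum(mags) / len(mags)
```

With these illustrative parameters, beta = 0.6 (ordered phase; the 2D critical coupling is about 0.44) gives |m| close to 1, while beta = 0.1 gives a small |m|; critical exponents are extracted from how such observables scale near the transition.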

  16. Closed-shell variational quantum Monte Carlo simulation for the ...

    African Journals Online (AJOL)

    Closed-shell variational quantum Monte Carlo simulation for the electric dipole moment calculation of hydrazine molecule using casino-code. ... Nigeria Journal of Pure and Applied Physics ... The variational quantum Monte Carlo (VQMC) technique used in this work employed the restricted Hartree-Fock (RHF) scheme.

  17. Multi-Scale Coupling Between Monte Carlo Molecular Simulation and Darcy-Scale Flow in Porous Media

    KAUST Repository

    Saad, Ahmed Mohamed

    2016-06-01

    In this work, an efficient coupling between Monte Carlo (MC) molecular simulation and Darcy-scale flow in porous media is presented. The cell-centered finite difference method with a non-uniform rectangular mesh was used to discretize the simulation domain and solve the governing equations. To speed up the MC simulations, we implemented a recently developed scheme that quickly generates MC Markov chains out of pre-computed ones, based on a reweighting and reconstruction algorithm. This method reduces the computational time required by MC simulations from hours to seconds. To demonstrate the strength of the proposed coupling in terms of computational time efficiency and numerical accuracy in fluid properties, various numerical experiments covering different compressible single-phase flow scenarios were conducted. The novelty of the introduced scheme lies in allowing an efficient coupling of the molecular scale and the Darcy scale in reservoir simulators. This leads to an accurate description of the thermodynamic behavior of the simulated reservoir fluids, consequently enhancing confidence in the flow predictions in porous media.

  18. Atomistic Monte Carlo Simulation of Lipid Membranes

    Directory of Open Access Journals (Sweden)

    Daniel Wüstner

    2014-01-01

    Full Text Available Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain the challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction to the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate, for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches. We use our recently devised chain breakage/closure (CBC) local move set in bond-/torsion-angle space with the constant-bond-length approximation (CBLA) for the phospholipid dipalmitoylphosphatidylcholine (DPPC). We demonstrate rapid conformational equilibration for a single DPPC molecule, as assessed by calculation of molecular energies and entropies. We also show the transition from a crystalline-like to a fluid DPPC bilayer with the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example for studying multi-component lipid membranes containing cholesterol.

  19. Hybrid Monte Carlo methods in computational finance

    NARCIS (Netherlands)

    Leitao Rodriguez, A.

    2017-01-01

    Monte Carlo methods are highly appreciated and intensively employed in computational finance in the context of financial derivatives valuation or risk management. The method offers valuable advantages like flexibility, easy interpretation and straightforward implementation. Furthermore, the

  20. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    International Nuclear Information System (INIS)

    Setiani, Tia Dwi; Suprijadi; Haryanto, Freddy

    2016-01-01

    Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 cores and 2304 cores. In the GPU simulations each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that simulation times on the GPU were significantly shorter than on the CPU. The simulations on the 2304-core GPU were about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained with at least 10⁸ histories and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.

  1. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    Energy Technology Data Exchange (ETDEWEB)

    Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Suprijadi [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Nuclear Physics and Biophysics Reaserch Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Haryanto, Freddy [Nuclear Physics and Biophysics Reaserch Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia)

    2016-03-11

    Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 cores and 2304 cores. In the GPU simulations each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that simulation times on the GPU were significantly shorter than on the CPU. The simulations on the 2304-core GPU were about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained with at least 10⁸ histories and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.

  2. Initial Assessment of Parallelization of Monte Carlo Calculation using Graphics Processing Units

    International Nuclear Information System (INIS)

    Choi, Sung Hoon; Joo, Han Gyu

    2009-01-01

    Monte Carlo (MC) simulation is an effective tool for calculating neutron transport in complex geometry. However, because Monte Carlo simulates each neutron behavior one by one, it takes a very long computing time if enough neutrons are used for high precision of the calculation. Accordingly, methods that reduce the computing time are required. A Monte Carlo code is well suited to parallel calculation, since it simulates the behavior of each neutron independently and thus parallel computation is natural. The parallelization of Monte Carlo codes, however, has been done using multiple CPUs. Driven by the global demand for high-quality 3D graphics, the Graphics Processing Unit (GPU) has developed into a highly parallel, multi-core processor. This parallel processing capability of GPUs can be made available to engineering computing once a suitable interface is provided. Recently, NVIDIA introduced CUDA, a general-purpose parallel computing architecture. CUDA is a software environment that allows developers to program the GPU using C/C++ or other languages. In this work, a GPU-based Monte Carlo code is developed and an initial assessment of its parallel performance is presented
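
The "each history is independent" property that makes Monte Carlo naturally parallel can be sketched without any GPU: histories are decomposed across workers with independent random streams and the partial tallies are summed. The toy below (our illustration, using a π estimate rather than neutron transport) runs the worker loop serially, but every call is independent, so each could run on its own CPU or GPU thread:

```python
import math
import random

def mc_pi_worker(n, seed):
    """One worker's share of the histories, with its own random stream."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 < 1.0)

def parallel_mc_pi(n_total, n_workers):
    """Decompose histories over workers; each gets an independent seed and
    the partial tallies are summed at the end.  The loop below is serial,
    but the calls share no state, which is exactly why MC parallelizes."""
    per_worker = n_total // n_workers
    hits = sum(mc_pi_worker(per_worker, seed=w) for w in range(n_workers))
    return 4.0 * hits / (per_worker * n_workers)

estimate = parallel_mc_pi(400_000, n_workers=8)   # approaches math.pi
```

The only real subtlety, on a GPU as here, is giving every worker a statistically independent stream; production codes use dedicated parallel RNGs rather than the per-seed streams assumed in this sketch.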

  3. Monte Carlo Simulation of a Linear Accelerator and Electron Beam Parameters Used in Radiotherapy

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Bahreyni Toossi

    2009-06-01

    Full Text Available Introduction: In recent decades, several Monte Carlo codes have been introduced for research and medical applications. These methods provide accurate and detailed calculation of particle transport from linear accelerators. The main drawback of Monte Carlo techniques is the extremely long computing time required to obtain a dose distribution with good statistical accuracy. Materials and Methods: In this study, the MCNP-4C Monte Carlo code was used to simulate the electron beams generated by a Neptun 10 PC linear accelerator. The depth dose curves, beam profiles, and related depth-dose parameters were calculated for 6, 8 and 10 MeV electron beams with different field sizes, and these data were compared with the corresponding measured values. The actual dosimetry was performed with a Wellhofer-Scanditronix dose scanning system, semiconductor detectors, and ionization chambers. Results: The results showed good agreement (better than 2%) between calculated and measured depth doses and lateral dose profiles for all energies and field sizes. Good agreement was also achieved between calculated and measured electron beam parameters such as E0, Rq, Rp and R50. Conclusion: The simulated model of the linac developed in this study is capable of computing electron beam data in a water phantom for different field sizes, and the resulting data can be used to predict dose distributions in other, more complex geometries.
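
Beam-quality parameters such as R50 (the depth at which the dose falls to half its maximum) are read off a depth dose curve by interpolation. A hedged sketch with synthetic, illustrative data (not the Neptun 10 PC measurements above):

```python
def r50(depths, doses):
    """Depth at which the dose first falls to 50% of its maximum,
    found by linear interpolation on the descending part of the curve."""
    d_max = max(doses)
    i_max = doses.index(d_max)
    half = 0.5 * d_max
    for i in range(i_max, len(doses) - 1):
        if doses[i] >= half > doses[i + 1]:
            # Linear interpolation between the two bracketing points
            frac = (doses[i] - half) / (doses[i] - doses[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    raise ValueError("dose never falls below 50% of maximum")

# Synthetic (illustrative) electron depth-dose data: depth in cm, relative dose
depths = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
doses  = [80, 95, 100, 98, 85, 60, 30, 8, 1]
half_value_depth = r50(depths, doses)   # ≈ 2.67 cm for this synthetic curve
```

The practical range Rp and the other parameters mentioned above are extracted from the same curve in a similar way (tangent at the steepest descent rather than a 50% crossing).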

  4. Monte Carlo simulation on kinetics of batch and semi-batch free radical polymerization

    KAUST Repository

    Shao, Jing

    2015-10-27

    Based on Monte Carlo simulation technology, we propose a hybrid routine that combines the reaction mechanism with coarse-grained molecular simulation to study the kinetics of free radical polymerization. By comparing with previous experimental and simulation studies, we show the capability of our Monte Carlo scheme to represent polymerization kinetics in batch and semi-batch processes. Various kinds of kinetic information, such as instantaneous monomer conversion, molecular weight, and polydispersity, are readily calculated from the Monte Carlo simulation. Kinetic constants such as the polymerization rate k_p are determined in the simulation without invoking the "steady-state" hypothesis. We explore the mechanisms behind the variation of polymerization kinetics observed in previous studies, as well as polymerization-induced phase separation. Our Monte Carlo simulation scheme is versatile for studying polymerization kinetics in batch and semi-batch processes.
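
The kind of kinetic information described here is typically produced by event-driven stochastic simulation. As an illustration (not the authors' coarse-grained scheme), here is a minimal Gillespie-type kinetic Monte Carlo for free radical polymerization; the species counts and rate constants are hypothetical:

```python
import math
import random

def kmc_polymerization(n_monomer=5000, n_initiator=50,
                       k_i=0.1, k_p=1.0, k_t=0.5, seed=0):
    """Gillespie-type kinetic Monte Carlo for free radical polymerization
    with initiation (I -> 2R), propagation (R + M -> R) and combination
    termination (R + R -> dead).  Rate constants are illustrative, not
    fitted to any real system.  Returns monomer conversion and time."""
    rng = random.Random(seed)
    M, I, R = n_monomer, n_initiator, 0
    t = 0.0
    while M > 0 and (I > 0 or R > 0):
        a_i = k_i * I                            # initiation propensity
        a_p = k_p * R * M / n_monomer            # propagation propensity
        a_t = k_t * R * (R - 1) / 2 / n_monomer  # termination propensity
        a_tot = a_i + a_p + a_t
        if a_tot == 0:
            break
        t += -math.log(1 - rng.random()) / a_tot  # exponential waiting time
        u = rng.random() * a_tot                  # pick the next reaction
        if u < a_i:
            I -= 1; R += 2
        elif u < a_i + a_p:
            M -= 1
        else:
            R -= 2
    return 1 - M / n_monomer, t
```

Because every reaction event is simulated explicitly, quantities like instantaneous conversion follow directly from the state, with no steady-state assumption; tracking per-chain lengths (omitted here) would likewise give molecular weight and polydispersity.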

  5. Autocorrelations in hybrid Monte Carlo simulations

    International Nuclear Information System (INIS)

    Schaefer, Stefan; Virotta, Francesco

    2010-11-01

    Simulations of QCD suffer from severe critical slowing down towards the continuum limit. This problem is known to be most prominent in the topological charge; however, all observables are affected to varying degrees by these slow modes in the Monte Carlo evolution. We investigate the slowing down in high-statistics simulations and propose a new error analysis method, which gives a realistic estimate of the contribution of the slow modes to the errors. (orig.)
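
The error-analysis issue at stake can be illustrated with the textbook integrated autocorrelation time, which inflates the naive error bar by sqrt(2·tau_int). A minimal sketch (a fixed summation window stands in for the automatic windowing a real analysis would use), demonstrated on synthetic AR(1) data whose exact tau_int is (1 + rho) / (2(1 - rho)):

```python
import random

def tau_int(series, window):
    """Integrated autocorrelation time, with the autocorrelation function
    summed up to a fixed window.  The effective number of independent
    samples is roughly n / (2 * tau_int)."""
    n = len(series)
    mean = sum(series) / n
    dev = [x - mean for x in series]
    c0 = sum(d * d for d in dev) / n
    tau = 0.5
    for t in range(1, window):
        c_t = sum(dev[i] * dev[i + t] for i in range(n - t)) / (n - t)
        tau += c_t / c0
    return tau

# Synthetic correlated data: AR(1) with rho = 0.9, exact tau_int = 9.5
rng = random.Random(3)
rho, x, xs = 0.9, 0.0, []
for _ in range(20000):
    x = rho * x + rng.gauss(0.0, 1.0)
    xs.append(x)
tau = tau_int(xs, window=100)   # statistically consistent with 9.5
```

Slow modes such as the topological charge correspond to very large tau_int, which is exactly why naive, autocorrelation-blind error bars on QCD observables are too small.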

  6. Simulating the reactions of CO2 in aqueous monoethanolamine solution by reaction ensemble Monte Carlo using the continuous fractional component method

    NARCIS (Netherlands)

    Balaji, S.P.; Gangarapu, S.; Ramdin, M.; Torres-Knoop, A.; Zuilhof, H.; Goetheer, E.L.V.; Dubbeldam, D.; Vlugt, T.J.H.

    2015-01-01

    Molecular simulations were used to compute the equilibrium concentrations of the different species in CO2/monoethanolamine solutions for different CO2 loadings. Simulations were performed in the Reaction Ensemble using the continuous fractional component Monte Carlo method at temperatures of 293,

  7. Non-analogue Monte Carlo method, application to neutron simulation; Methode de Monte Carlo non analogue, application a la simulation des neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Morillon, B.

    1996-12-31

    With most traditional and contemporary techniques, it is still impossible to solve the transport equation if one takes into account a fully detailed geometry and studies precisely the interactions between particles and matter. Only the Monte Carlo method offers such a possibility. However, with significant attenuation, the natural (analog) simulation remains inefficient: it becomes necessary to use biasing techniques, for which the solution of the adjoint transport equation is essential. The Monte Carlo code Tripoli has been using such techniques successfully for a long time with different approximate adjoint solutions: these methods require the user to find suitable parameters. If these parameters are not optimal or nearly so, the biased simulations may yield small figures of merit. This paper presents a description of the most important biasing techniques of the Monte Carlo code Tripoli; we then show how to calculate the importance function for a general geometry in multigroup cases. We present a completely automatic biasing technique in which the parameters of the biased simulation are deduced from the solution of the adjoint transport equation calculated by collision probabilities. In this study we estimate the importance function through the collision probabilities method and evaluate its possibilities by means of a Monte Carlo calculation. We compare different biased simulations with the importance function calculated by collision probabilities for one-group and multigroup problems. We have run simulations with the new biasing method for one-group transport problems with isotropic scattering and for multigroup problems with anisotropic scattering. The results show that for one-group, homogeneous-geometry transport problems the method is nearly optimal without splitting and Russian roulette, but for multigroup, heterogeneous X-Y geometry problems the figures of merit are higher if splitting and Russian roulette are added.
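
Two of the building blocks named here, survival biasing (implicit capture) and Russian roulette, fit in a short sketch. This toy is an assumption-laden illustration (forward-only streaming in a 1D slab, not Tripoli's scheme): absorption is replaced by weight reduction, and Russian roulette kills low-weight histories without biasing the answer, which for forward-only transport is exp(-sigma_a · thickness):

```python
import math
import random

def transmission_implicit_capture(sigma_t, sigma_a, thickness,
                                  n_histories, w_cut=0.1, seed=0):
    """Implicit capture with Russian roulette.  Instead of killing a
    particle at an absorption, its weight is multiplied by the scattering
    probability; low-weight particles play Russian roulette (kill with
    probability 1/2, double the survivor's weight) so the estimator stays
    unbiased.  Forward-only scattering keeps the sketch one-dimensional."""
    rng = random.Random(seed)
    score = 0.0
    for _ in range(n_histories):
        x, w = 0.0, 1.0
        while True:
            x += -math.log(1 - rng.random()) / sigma_t   # next collision
            if x >= thickness:
                score += w                    # crossed the slab: tally weight
                break
            w *= 1 - sigma_a / sigma_t        # survive the collision implicitly
            if w < w_cut:
                if rng.random() < 0.5:
                    w *= 2.0                  # roulette survivor
                else:
                    break                     # roulette kill, unbiased on average
    return score / n_histories

estimate = transmission_implicit_capture(1.0, 0.5, 2.0, 100_000)
# converges to exp(-sigma_a * thickness) = exp(-1) for these parameters
```

The roulette step is what keeps the game fair: the expected weight before and after the kill-or-double decision is identical, so efficiency is gained without introducing bias.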

  8. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met
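
The flavor of such sampling-based methods in one example (our illustration, not taken from the book): a crude Monte Carlo estimate of the integral of eˣ over [0, 1], together with its 1/√N standard error.

```python
import math
import random

def mc_integral(f, n, seed=0):
    """Crude Monte Carlo estimate of the integral of f over [0, 1],
    with the usual 1/sqrt(n) standard-error estimate."""
    rng = random.Random(seed)
    samples = [f(rng.random()) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, math.sqrt(var / n)

est, err = mc_integral(math.exp, 100_000)
exact = math.e - 1    # ≈ 1.71828; est should agree within a few err
```

Quadrupling the sample size only halves the error, which is why the advanced variance-reduction methods such books cover matter in practice.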

  9. Power-feedwater enthalpy operating domain for SBWR applying Monte Carlo simulation

    International Nuclear Information System (INIS)

    Quezada-Garcia, S.; Espinosa-Martinez, E.-G.; Vazquez-Rodriguez, A.; Varela-Ham, J.R.; Espinosa-Paredes, G.

    2014-01-01

    In this work, an analysis of the effects of feedwater enthalpy on reactor power in a simplified boiling water reactor (SBWR), applying a methodology based on Monte Carlo simulation (MCS), is presented. The MCS methodology was applied systematically to establish the operating domain. Because the SBWR is not yet in operation, the analysis of the nuclear and thermal-hydraulic processes must rely on numerical modeling, with the purpose of developing or confirming the design basis and qualifying existing or new computer codes to enable reliable analyses. (author)

  10. Monte Carlo molecular simulation of phase-coexistence for oil production and processing

    KAUST Repository

    Li, Jun

    2011-01-01

    The Gibbs-NVT ensemble Monte Carlo method is used to simulate the liquid-vapor coexistence diagram, and the simulation results for methane agree well with the experimental data over a wide range of temperatures. For systems with two components, the Gibbs-NPT ensemble Monte Carlo method is employed to obtain the mole fraction of each component in each phase, with each component modeled as a Lennard-Jones fluid. As the results of Monte Carlo simulations usually carry large statistical errors, the blocking method is used to estimate the variance of the simulation results. Additionally, in order to improve the simulation efficiency, the step sizes of the different trial moves are adjusted automatically so that their acceptance probabilities approach preset values.
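    The blocking method referred to above can be sketched as a generic batch-means estimator (a sketch, not the authors' code; the AR(1) chain below stands in for a correlated sequence of Monte Carlo configurations):

```python
import random
import statistics

def blocking_error(samples, block_size):
    """Batch-means ('blocking') estimate of the standard error of the mean
    for a correlated Monte Carlo time series: average within blocks, then
    take the variance of the nearly independent block means."""
    n_blocks = len(samples) // block_size
    means = [
        sum(samples[i * block_size:(i + 1) * block_size]) / block_size
        for i in range(n_blocks)
    ]
    return (statistics.pvariance(means) / n_blocks) ** 0.5

# correlated series: AR(1) chain mimicking successive MC configurations
rng = random.Random(0)
x, series = 0.0, []
for _ in range(20_000):
    x = 0.9 * x + rng.gauss(0.0, 1.0)
    series.append(x)

naive = (statistics.pvariance(series) / len(series)) ** 0.5  # ignores correlation
blocked = blocking_error(series, block_size=200)             # honest error bar
```

    For uncorrelated data the blocked and naive estimates agree; for correlated data the blocked standard error grows with block size until it plateaus at the true value, here several times the naive estimate.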

  11. Influence of radioactive sources discretization in the Monte Carlo computational simulations of brachytherapy procedures: a case study on the procedures for treatment of prostate cancer

    International Nuclear Information System (INIS)

    Barbosa, Antonio Konrado de Santana; Vieira, Jose Wilson; Costa, Kleber Souza Silva; Lima, Fernando Roberto de Andrade

    2011-01-01

    Radiotherapy computational simulation procedures using Monte Carlo (MC) methods have proven increasingly important to the improvement of cancer-fighting strategies. One source of bias in this practice is the discretization of the radioactive source in brachytherapy simulations, which often does not match the real situation. This study aimed to identify and measure the influence of radioactive source discretization in brachytherapy MC simulations, compared with simulations without discretization, using prostate brachytherapy with the Iodine-125 radionuclide as a model. Simulations were carried out with 10⁸ events for both types of sources, using the EGSnrc code associated with the MASH phantom in orthostatic and supine positions with some anatomic adaptations. Significant alterations were found, especially regarding the bladder, the rectum and the prostate itself. It can be concluded that sources need to be discretized in brachytherapy simulations to ensure representativeness. (author)

  12. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Y.; Song, J.; Zheng, H.; Sun, G.; Hao, L.; Long, P.; Hu, L.

    2013-01-01

    SuperMC is a Computer-Aided-Design (CAD) based Monte Carlo (MC) program for integrated simulation of nuclear systems, developed by the FDS Team (China), making use of a hybrid MC-deterministic method and advanced computer technologies. The design aims, architecture and main methodology of SuperMC are presented in this paper. The treatment of multi-physics processes and the use of advanced computer technologies such as automatic geometry modeling, intelligent data analysis and visualization, high-performance parallel computing and cloud computing contribute to the efficiency of the code. SuperMC2.1, the latest version of the code for neutron, photon and coupled neutron-photon transport calculation, has been developed and validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model.

  13. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Constraints can likewise be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and the Conditional Value-at-Risk (CVaR).
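    As a hedged illustration of the CVaR objective (not the authors' optimizer; the cost distribution and penalty weight are invented for the example), Monte Carlo CVaR estimation reduces to averaging the worst tail of the sampled costs:

```python
import random
import statistics

def cvar(losses, alpha=0.95):
    """Conditional Value-at-Risk: the mean loss in the worst (1 - alpha) tail."""
    tail = sorted(losses)[int(alpha * len(losses)):]
    return sum(tail) / len(tail)

# hypothetical execution-cost samples from some simulated strategy
rng = random.Random(42)
costs = [rng.gauss(100.0, 15.0) for _ in range(50_000)]

expected_cost = statistics.fmean(costs)
cvar_95 = cvar(costs)
# a penalized objective trading mean cost against tail risk
objective = expected_cost + 0.5 * cvar_95
```

    For a Gaussian cost with mean 100 and standard deviation 15, the 95% CVaR is about 131, i.e. the average of the worst 5% of outcomes; a parametric strategy search would minimize such an objective over the strategy coefficients.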

  14. Stabilization effect of fission source in coupled Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Borge; Dufek, Jan [Div. of Nuclear Reactor Technology, KTH Royal Institute of Technology, AlbaNova University Center, Stockholm (Sweden)

    2017-08-15

    A fission source can act as a stabilization element in coupled Monte Carlo simulations. We have observed this while studying numerical instabilities in nonlinear steady-state simulations performed by a Monte Carlo criticality solver that is coupled to a xenon feedback solver via fixed-point iteration. While fixed-point iteration is known to be numerically unstable for some problems, resulting in large spatial oscillations of the neutron flux distribution, we show that it is possible to stabilize it by reducing the number of Monte Carlo criticality cycles simulated within each iteration step. While global convergence is ensured, development of any possible numerical instability is prevented by not allowing the fission source to converge fully within a single iteration step, which is achieved by setting a small number of criticality cycles per iteration step. Moreover, under these conditions, the fission source may converge even faster than in criticality calculations with no feedback, as we demonstrate in our numerical test simulations.
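    The stabilization mechanism can be caricatured with a scalar fixed-point iteration (an illustrative analogy, not the authors' neutronics model; the map and its gain are invented): letting the source converge fully each step applies the feedback map outright, while running only a few criticality cycles per step acts like under-relaxation.

```python
def iterate(gain, relax, x0=2.0, steps=60):
    """Scalar caricature of a coupled flux/feedback fixed-point iteration.

    'gain' mimics a destabilizing feedback slope; 'relax' is the fraction of
    source convergence allowed per iteration step (fewer Monte Carlo cycles
    per step -> smaller 'relax'). The fixed point is x* = 1."""
    x = x0
    for _ in range(steps):
        fully_converged = 1.0 - gain * (x - 1.0)   # response if the source converged
        x = (1.0 - relax) * x + relax * fully_converged  # partial source update
    return abs(x - 1.0)                            # distance from the fixed point

unstable = iterate(gain=3.0, relax=1.0)  # full convergence each step: diverges
stable = iterate(gain=3.0, relax=0.3)    # partial updates damp the oscillation
```

    With a feedback slope of 3, the fully converged update oscillates with growing amplitude, while allowing only 30% of the update per step contracts toward the fixed point; for this caricature the stability condition is relax < 2/(1 + gain).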

  15. Stabilization effect of fission source in coupled Monte Carlo simulations

    Directory of Open Access Journals (Sweden)

    Börge Olsen

    2017-08-01

    Full Text Available A fission source can act as a stabilization element in coupled Monte Carlo simulations. We have observed this while studying numerical instabilities in nonlinear steady-state simulations performed by a Monte Carlo criticality solver that is coupled to a xenon feedback solver via fixed-point iteration. While fixed-point iteration is known to be numerically unstable for some problems, resulting in large spatial oscillations of the neutron flux distribution, we show that it is possible to stabilize it by reducing the number of Monte Carlo criticality cycles simulated within each iteration step. While global convergence is ensured, development of any possible numerical instability is prevented by not allowing the fission source to converge fully within a single iteration step, which is achieved by setting a small number of criticality cycles per iteration step. Moreover, under these conditions, the fission source may converge even faster than in criticality calculations with no feedback, as we demonstrate in our numerical test simulations.

  16. Experimental and Monte Carlo simulation studies of open cylindrical radon monitoring device using CR-39 detector

    Energy Technology Data Exchange (ETDEWEB)

    Rehman, Fazal-ur- E-mail: fazalr@kfupm.edu.sa; Jamil, K.; Zakaullah, M.; Abu-Jarad, F.; Mujahid, S.A

    2003-07-01

    There are several methods of measuring radon concentrations, but nuclear track detector cylindrical dosimeters are widely employed. In this investigation, the effect of the dosimeters' effective volumes on the registration of alpha tracks in a CR-39 detector was studied. In a series of experiments, an optimum radius for a CR-39-based open cylindrical radon dosimeter was found to be about 3 cm. Monte Carlo simulation techniques have been employed to verify the experimental results. In this context, a computer code, Monte Carlo simulation dosimetry (MOCSID), was developed. Monte Carlo simulation experiments gave the optimum radius of the dosimeters as 3.0 cm. The experimental results are in good agreement with those obtained by the Monte Carlo design calculations. In addition, plate-out effects of radon progeny were also studied. It was observed that the contribution of radon progeny ({sup 218}Po and {sup 214}Po) plated out on the wall of the dosimeters increases with dosimeter radius and then decreases to zero at a radius of about 3 cm when a point detector is installed at the center of the dosimeter base. In the code MOCSID, different types of random number generators were employed. The results of this research are very useful for designing an optimum size of radon dosimeters.

  17. Experimental and Monte Carlo simulation studies of open cylindrical radon monitoring device using CR-39 detector

    International Nuclear Information System (INIS)

    Rehman, Fazal-ur-; Jamil, K.; Zakaullah, M.; Abu-Jarad, F.; Mujahid, S.A.

    2003-01-01

    There are several methods of measuring radon concentrations, but nuclear track detector cylindrical dosimeters are widely employed. In this investigation, the effect of the dosimeters' effective volumes on the registration of alpha tracks in a CR-39 detector was studied. In a series of experiments, an optimum radius for a CR-39-based open cylindrical radon dosimeter was found to be about 3 cm. Monte Carlo simulation techniques have been employed to verify the experimental results. In this context, a computer code, Monte Carlo simulation dosimetry (MOCSID), was developed. Monte Carlo simulation experiments gave the optimum radius of the dosimeters as 3.0 cm. The experimental results are in good agreement with those obtained by the Monte Carlo design calculations. In addition, plate-out effects of radon progeny were also studied. It was observed that the contribution of radon progeny (²¹⁸Po and ²¹⁴Po) plated out on the wall of the dosimeters increases with dosimeter radius and then decreases to zero at a radius of about 3 cm when a point detector is installed at the center of the dosimeter base. In the code MOCSID, different types of random number generators were employed. The results of this research are very useful for designing an optimum size of radon dosimeters.

  18. Volume Measurement Algorithm for Food Product with Irregular Shape using Computer Vision based on Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Joko Siswantoro

    2014-11-01

    Full Text Available Volume is one of the important issues in the production and processing of food products. Traditionally, volume can be measured using the water displacement method based on Archimedes' principle, but this method is inaccurate and destructive. Computer vision offers an accurate and nondestructive method for measuring the volume of food products. This paper proposes an algorithm for volume measurement of irregularly shaped food products using computer vision based on the Monte Carlo method. Five images of the object were acquired from five different views and then processed to obtain the silhouettes of the object. From the silhouettes, the Monte Carlo method was applied to approximate the volume of the object. The simulation results show that the algorithm produces high accuracy and precision for volume measurement.
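    The Monte Carlo volume step can be sketched independently of the image processing (a minimal sketch; the unit sphere below stands in for the object reconstructed from the silhouettes): sample uniform points in a bounding box and scale the box volume by the hit fraction.

```python
import random

def mc_volume(inside, bounds, n=200_000, seed=0):
    """Monte Carlo volume estimate: the fraction of uniform random points in
    the bounding box that fall inside the object, times the box volume."""
    rng = random.Random(seed)
    (x0, x1), (y0, y1), (z0, z1) = bounds
    box = (x1 - x0) * (y1 - y0) * (z1 - z0)
    hits = 0
    for _ in range(n):
        p = (rng.uniform(x0, x1), rng.uniform(y0, y1), rng.uniform(z0, z1))
        if inside(p):
            hits += 1
    return box * hits / n

# test object: the unit sphere, standing in for the silhouette-based test
sphere = lambda p: p[0] ** 2 + p[1] ** 2 + p[2] ** 2 <= 1.0
vol = mc_volume(sphere, bounds=((-1, 1), (-1, 1), (-1, 1)))
# exact volume is 4*pi/3 ~ 4.18879
```

    With 200 000 points the estimate is typically within about 0.02 of the exact value; the accuracy scales as 1/√n, so precision can be traded directly against computing time.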

  19. Kinetic Monte Carlo simulations for transient thermal fields: Computational methodology and application to the submicrosecond laser processes in implanted silicon.

    Science.gov (United States)

    Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A

    2012-09-01

    Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by laser irradiation. The reference systems are nonideal crystals containing point-defect excesses an order of magnitude larger than the equilibrium density, due to a pre-irradiation ion implantation process. The thermal and, eventually, melting problem is solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field are then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest a significant annihilation of the implantation damage, which can be well controlled by the laser fluence.

  20. Monte Carlo simulation of the microcanonical ensemble

    International Nuclear Information System (INIS)

    Creutz, M.

    1984-01-01

    We consider simulating statistical systems with a random walk on a constant energy surface. This combines features of deterministic molecular dynamics techniques and conventional Monte Carlo simulations. For discrete systems the method can be programmed to run an order of magnitude faster than other approaches. It does not require high quality random numbers and may also be useful for nonequilibrium studies. 10 references
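    Creutz's scheme, often called the demon algorithm, is easy to sketch for a 1D Ising ring (an illustrative reconstruction under stated assumptions, not the paper's code; lattice size and initial demon energy are invented):

```python
def demon_ising(n=1000, sweeps=200, demon=40):
    """Creutz's microcanonical 'demon' algorithm for a 1D Ising ring (J = 1).

    The demon carries a small non-negative energy reservoir and visits each
    site in turn; a spin flip is accepted whenever the demon can pay for it,
    so the total energy (system + demon) is exactly conserved and acceptance
    requires no random numbers at all."""
    spins = [1] * n          # start in the ground state; the demon injects energy
    for _ in range(sweeps):
        for i in range(n):
            # energy change if spins[i] flips: dE = 2 s_i (s_{i-1} + s_{i+1})
            dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n])
            if demon - dE >= 0:          # demon pays for (or absorbs) the change
                spins[i] = -spins[i]
                demon -= dE
    return spins, demon

spins, demon = demon_ising()
```

    Because acceptance is a deterministic energy-bookkeeping test, no high-quality random numbers are needed, and the temperature can be read off afterwards from the demon's energy histogram, P(E_d) ∝ exp(-E_d/kT).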

  1. Monte Carlo simulation of the turbulent transport of airborne contaminants

    International Nuclear Information System (INIS)

    Watson, C.W.; Barr, S.

    1975-09-01

    A generalized, three-dimensional Monte Carlo model and computer code (SPOOR) are described for simulating atmospheric transport and dispersal of small pollutant clouds. A cloud is represented by a large number of particles that we track by statistically sampling simulated wind and turbulence fields. These fields are based on generalized wind data for large-scale flow and turbulent energy spectra for the micro- and mesoscales. The large-scale field can be input from a climatological data base, or by means of real-time analyses, or from a separate, subjectively defined data base. We introduce the micro- and mesoscale wind fluctuations through a power spectral density, to include effects from a broad spectrum of turbulent-energy scales. The role of turbulence is simulated in both meander and dispersal. Complex flow fields and time-dependent diffusion rates are accounted for naturally, and shear effects are simulated automatically in the ensemble of particle trajectories. An important adjunct has been the development of computer-graphics displays. These include two- and three-dimensional (perspective) snapshots and color motion pictures of particle ensembles, plus running displays of differential and integral cloud characteristics. The model's versatility makes it a valuable atmospheric research tool that we can adapt easily into broader, multicomponent systems-analysis codes. Removal, transformation, dry or wet deposition, and resuspension of contaminant particles can be readily included
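    The core particle-tracking idea can be sketched with a zeroth-order Lagrangian random walk (a toy sketch, not the SPOOR code; the wind speed and turbulence intensity are invented, and the spectral wind-field model is replaced by white Gaussian fluctuations):

```python
import random
import statistics

def plume(n_particles=2000, steps=400, dt=1.0, u_mean=5.0,
          sigma_turb=0.8, seed=7):
    """Release tracer particles at the origin and track them under a mean
    wind (x direction) plus Gaussian turbulent velocity fluctuations."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n_particles):
        x = y = 0.0
        for _ in range(steps):
            x += (u_mean + rng.gauss(0.0, sigma_turb)) * dt
            y += rng.gauss(0.0, sigma_turb) * dt
        xs.append(x)
        ys.append(y)
    return xs, ys

xs, ys = plume()
centroid = statistics.fmean(xs)   # advection: ~ u_mean * steps * dt = 2000
spread = statistics.stdev(ys)     # dispersal: ~ sigma_turb * sqrt(steps) * dt ~ 16
```

    The ensemble of trajectories reproduces both plume transport (the centroid drifts with the mean wind) and dispersal (the crosswind spread grows as the square root of travel time); correlated velocity fluctuations, as in the spectral model of the abstract, would widen the spread further.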

  2. Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries

    Directory of Open Access Journals (Sweden)

    Ilić Radovan D.

    2002-01-01

    Full Text Available This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtained through the PETRA and GEANT programs. The simulation of proton beam characterization by means of the Multi-Layer Faraday Cup and the spatial distribution of positron emitters obtained by our program indicate the imminent application of Monte Carlo techniques in clinical practice.

  3. Monte Carlo simulations of plutonium gamma-ray spectra

    International Nuclear Information System (INIS)

    Koenig, Z.M.; Carlson, J.B.; Wang, Tzu-Fang; Ruhter, W.D.

    1993-01-01

    Monte Carlo calculations were investigated as a means of simulating the gamma-ray spectra of Pu. These simulated spectra will be used to develop and evaluate gamma-ray analysis techniques for various nondestructive measurements. Simulated spectra of calculational standards can be used for code intercomparisons, to understand systematic biases, and to estimate minimum detection levels of existing and proposed nondestructive analysis instruments. The capability to simulate gamma-ray spectra from HPGe detectors could significantly reduce the costs of preparing large numbers of real reference materials. MCNP was used for the Monte Carlo transport of the photons. Results from the MCNP calculations were folded with a detector response function to obtain a realistic spectrum. Plutonium spectrum peaks were produced with Lorentzian shapes for the x-rays and Gaussian distributions for the gamma-ray peaks. The MGA code determined the Pu isotopes and specific power from this calculated spectrum and compared it to a similar analysis of a measured spectrum.

  4. An analytical model for backscattered luminance in fog: comparisons with Monte Carlo computations and experimental results

    International Nuclear Information System (INIS)

    Taillade, Frédéric; Dumont, Eric; Belin, Etienne

    2008-01-01

    We propose an analytical model for backscattered luminance in fog and derive an expression for the visibility signal-to-noise ratio as a function of meteorological visibility distance. The model uses single scattering processes. It is based on the Mie theory and the geometry of the optical device (emitter and receiver). In particular, we present an overlap function and take the phase function of fog into account. The results of the backscattered luminance obtained with our analytical model are compared to simulations made using the Monte Carlo method based on multiple scattering processes. An excellent agreement is found, in that the discrepancy between the results is smaller than the Monte Carlo standard uncertainties. If we take no account of the geometry of the optical device, the results of the model-estimated backscattered luminance differ from the simulations by a factor of 20. We also conclude that the signal-to-noise ratio computed with the Monte Carlo method and our analytical model is in good agreement with experimental results, since the mean difference between the calculations and experimental measurements is smaller than the experimental uncertainty.

  5. A study on the shielding element using Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Jeong [Dept. of Radiology, Konkuk University Medical Center, Seoul (Korea, Republic of); Shim, Jae Goo [Dept. of Radiologic Technology, Daegu Health College, Daegu (Korea, Republic of)

    2017-06-15

    In this research, we simulated the shielding ability of individual elements using Monte Carlo simulation, with the aim of developing a medical radiation shielding sheet that can replace existing lead. In selecting the elements, we considered mainly metal elements with large atomic numbers, which are known to have high shielding performance; since various composite materials have recently improved shielding performance, 21 elements were selected in consideration of weight reduction, processability, radioactivity, and related factors. The simulations used the Monte Carlo method. Simulating the shielding performance of each element, the shielding ratios were estimated to be highest for tungsten and gold, at 98.82% and 98.44%, respectively.

  6. Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation

    Science.gov (United States)

    Ziegenhein, Peter; Pirner, Sven; Kamerling, Cornelis Ph; Oelfke, Uwe

    2015-08-01

    Monte-Carlo (MC) simulations are considered the most accurate method for calculating dose distributions in radiotherapy. Their clinical application, however, is still limited by the long runtimes conventional implementations of MC algorithms require to deliver sufficiently accurate results on high-resolution imaging data. In order to overcome this obstacle we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well-verified dose planning method (DPM). We demonstrate that PhiMC delivers dose distributions in excellent agreement with DPM. The multi-core implementation of PhiMC scales well across different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on an NVIDIA Tesla C2050. Since CPUs can work with several hundred GB of RAM, the typical GPU memory limitation does not apply to our implementation, and high-resolution clinical plans can be calculated.

  7. A new scaling approach for the mesoscale simulation of magnetic domain structures using Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, B., E-mail: radhakrishnb@ornl.gov; Eisenbach, M.; Burress, T.A.

    2017-06-15

    Highlights: • Developed new scaling technique for dipole–dipole interaction energy. • Developed new scaling technique for exchange interaction energy. • Used scaling laws to extend atomistic simulations to the micrometer length scale. • Demonstrated transition from mono-domain to vortex magnetic structure. • Simulated domain wall width and transition length scale agree with experiments. - Abstract: A new scaling approach has been proposed for the spin exchange and the dipole–dipole interaction energy as a function of the system size. The computed scaling laws are used in atomistic Monte Carlo simulations of magnetic moment evolution to predict the transition from a single domain to a vortex structure as the system size increases. The width of a 180° domain wall extracted from the simulated structures is in close agreement with experimentally measured values for an Fe–Si alloy. The transition size from a single domain to a vortex structure is also in close agreement with theoretically predicted and experimentally measured values for Fe.

  8. Comparative evaluations of the Monte Carlo-based light propagation simulation packages for optical imaging

    Directory of Open Access Journals (Sweden)

    Lin Wang

    2018-01-01

    Full Text Available Monte Carlo simulation of light propagation in turbid media has been studied for years, and a number of software packages have been developed to address the problem. However, it is hard to compare these simulation packages, especially for tissues with complex heterogeneous structures. Here, we first designed a group of mesh datasets generated with the Iso2Mesh software, and used them to cross-validate the accuracy and evaluate the performance of four Monte Carlo-based simulation packages: Monte Carlo model of steady-state light transport in multi-layered tissues (MCML), tetrahedron-based inhomogeneous Monte Carlo optical simulator (TIMOS), Molecular Optical Simulation Environment (MOSE), and Mesh-based Monte Carlo (MMC). The performance of each package was evaluated on the designed mesh datasets, and the merits and demerits of each package are discussed. Comparative results showed that the TIMOS package provided the best performance, proving to be a reliable, efficient, and stable MC simulation package for users.

  9. Simulation of quantum systems by the tomography Monte Carlo method

    International Nuclear Information System (INIS)

    Bogdanov, Yu I

    2007-01-01

    A new method of statistical simulation of quantum systems is presented, based on the generation of data by the Monte Carlo method and their purposeful tomography with energy minimisation. The numerical solution of the problem is based on the optimisation of a target functional providing a compromise between maximisation of the statistical likelihood function and energy minimisation. The method does not involve complicated and ill-posed multidimensional computational procedures and can be used to calculate the wave functions and energies of the ground and excited stationary states of complex quantum systems. Applications of the method are illustrated. (Fifth seminar in memory of D.N. Klyshko)

  10. A stochastic Markov chain approach for tennis: Monte Carlo simulation and modeling

    Science.gov (United States)

    Aslam, Kamran

    This dissertation describes the computational formulation of probability density functions (pdfs) that facilitate head-to-head match simulations in tennis, along with ranking systems developed from their use. A background on the statistical method used to develop the pdfs, the Monte Carlo method, and the resulting rankings are included, along with a discussion of ranking methods currently used both in professional sports and in other applications. Using the analytical theory developed by Newton and Keller in [34], which defines a tennis player's probability of winning a game, set, match and single-elimination tournament, a computational simulation has been developed in Matlab that allows modeling not previously possible with the analytical theory alone. Such experimentation consists of exploring non-iid effects, considers the varying importance of points in a match, and allows an unlimited number of matches to be simulated between unlikely opponents. The results of these studies have provided pdfs that accurately model an individual tennis player's ability, along with a realistic, fair and mathematically sound platform for ranking players.
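    The game-level building block of such a simulation is compact (a sketch under the iid-points assumption, not the dissertation's Matlab code); the Monte Carlo estimate can be checked against the Newton-Keller closed form for the probability of winning a game from a point-win probability p:

```python
import random

def win_game(p, rng):
    """Simulate one tennis game with iid points: first to 4, win by 2."""
    a = b = 0
    while True:
        if rng.random() < p:
            a += 1
        else:
            b += 1
        if a >= 4 and a - b >= 2:
            return True
        if b >= 4 and b - a >= 2:
            return False

def p_game(p, n=100_000, seed=11):
    """Monte Carlo estimate of the server's probability of winning a game."""
    rng = random.Random(seed)
    return sum(win_game(p, rng) for _ in range(n)) / n

est = p_game(0.6)
# closed form: p^4 (1 + 4q + 10q^2) + 20 p^3 q^3 * p^2 / (1 - 2pq), q = 1 - p
p, q = 0.6, 0.4
exact = p**4 * (1 + 4 * q + 10 * q * q) + 20 * p**3 * q**3 * p * p / (1 - 2 * p * q)
```

    With p = 0.6 both routes give about 0.74, illustrating how a modest point-level edge is amplified at the game level; non-iid effects are introduced by letting p vary from point to point.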

  11. EGS code system: computer programs for the Monte Carlo simulation of electromagnetic cascade showers. Version 3

    International Nuclear Information System (INIS)

    Ford, R.L.; Nelson, W.R.

    1978-06-01

    A code to simulate almost any electron-photon transport problem conceivable is described. The report begins with a lengthy historical introduction and a description of the shower generation process. Then the detailed physics of the shower processes and the methods used to simulate them are presented. Ideas of sampling theory, transport techniques, particle interactions in general, and programming details are discussed. Next, EGS calculations are compared with various experiments and other Monte Carlo results. The remainder of the report consists of user manuals for the EGS, PEGS, and TESTSR codes; options, input specifications, and typical output are included. 38 figures, 12 tables

  12. Monte Carlo applications to radiation shielding problems

    International Nuclear Information System (INIS)

    Subbaiah, K.V.

    2009-01-01

    Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling of physical and mathematical systems to compute their results. However, basic concepts of MC are both simple and straightforward and can be learned by using a personal computer. Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling. In Monte Carlo simulation of radiation transport, the history (track) of a particle is viewed as a random sequence of free flights that end with an interaction event where the particle changes its direction of movement, loses energy and, occasionally, produces secondary particles. The Monte Carlo simulation of a given experimental arrangement (e.g., an electron beam, coming from an accelerator and impinging on a water phantom) consists of the numerical generation of random histories. To simulate these histories we need an interaction model, i.e., a set of differential cross sections (DCS) for the relevant interaction mechanisms. The DCSs determine the probability distribution functions (pdf) of the random variables that characterize a track; 1) free path between successive interaction events, 2) type of interaction taking place and 3) energy loss and angular deflection in a particular event (and initial state of emitted secondary particles, if any). Once these pdfs are known, random histories can be generated by using appropriate sampling methods. If the number of generated histories is large enough, quantitative information on the transport process may be obtained by simply averaging over the simulated histories. The Monte Carlo method yields the same information as the solution of the Boltzmann transport equation, with the same interaction model, but is easier to implement. In particular, the simulation of radiation
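    The pdf-sampling steps described above can be sketched directly (a generic sketch with invented cross sections, not a particular code's implementation): the free path is sampled by inverting the exponential distribution, and the interaction type is chosen in proportion to the partial cross sections.

```python
import math
import random

def free_path(sigma_total, rng):
    """Sample the flight distance from p(s) = sigma * exp(-sigma * s)
    by inverting the cumulative distribution: s = -ln(xi) / sigma."""
    return -math.log(rng.random()) / sigma_total

def interaction(sigma_scatter, sigma_absorb, rng):
    """Choose the interaction type in proportion to its cross section."""
    if rng.random() < sigma_scatter / (sigma_scatter + sigma_absorb):
        return "scatter"
    return "absorb"

rng = random.Random(5)
sigma_s, sigma_a = 0.7, 0.3
paths = [free_path(sigma_s + sigma_a, rng) for _ in range(100_000)]
mean_path = sum(paths) / len(paths)   # ~ 1 / (sigma_s + sigma_a) = 1.0
scatter_frac = sum(interaction(sigma_s, sigma_a, rng) == "scatter"
                   for _ in range(100_000)) / 100_000   # ~ 0.7
```

    A full history alternates these two draws with the sampling of energy loss and deflection angle at each event, and tallies are averaged over many histories, exactly as the abstract describes.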

  13. A Monte Carlo simulation model for stationary non-Gaussian processes

    DEFF Research Database (Denmark)

    Grigoriu, M.; Ditlevsen, Ove Dalager; Arwade, S. R.

    2003-01-01

    A class of stationary non-Gaussian processes, referred to as the class of mixtures of translation processes, is defined by their finite dimensional distributions consisting of mixtures of finite dimensional distributions of translation processes. The class of mixtures of translation processes includes translation processes and is useful for both Monte Carlo simulation and analytical studies. As for translation processes, the mixture of translation processes can have a wide range of marginal distributions and correlation functions, and these processes can match a broader range of second-order properties. The paper illustrates the proposed Monte Carlo algorithm and compares features of translation processes and mixtures of translation processes. Keywords: Monte Carlo simulation, non-Gaussian processes, sampling theorem, stochastic processes, translation processes

  14. Effective dose evaluation of NORM-added consumer products using Monte Carlo simulations and the ICRP computational human phantoms

    International Nuclear Information System (INIS)

    Lee, Hyun Cheol; Yoo, Do Hyeon; Testa, Mauro; Shin, Wook-Geun; Choi, Hyun Joon; Ha, Wi-Ho; Yoo, Jaeryong; Yoon, Seokwon; Min, Chul Hee

    2016-01-01

    The aim of this study is to evaluate the potential hazard of consumer products with added naturally occurring radioactive material (NORM). Using the Monte Carlo method, the radioactive products were simulated with the ICRP reference phantom, and the organ doses were calculated according to the usage scenario. Finally, the annual effective doses for 44 products were evaluated to be lower than the public dose limit of 1 mSv y⁻¹. It was demonstrated that NORM-added consumer products can be quantitatively assessed for safety regulation. - Highlights: • Consumer products that may include NORM should be regulated. • 44 products were collected and their gamma activities measured with an HPGe detector. • Through Monte Carlo simulation, organ equivalent doses and effective doses in the human phantom were calculated. • All annual effective doses for the products were evaluated as lower than the dose limit for the public.
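    The final step, combining simulated organ equivalent doses into an effective dose, is the ICRP tissue-weighted sum E = Σ_T w_T H_T. A minimal sketch follows; the weighting factors are the published ICRP 103 values, but the organ doses are invented placeholders, not the paper's results.

```python
# ICRP Publication 103 tissue weighting factors w_T (they sum to 1.0)
W_T = {
    "gonads": 0.08, "bone_marrow": 0.12, "colon": 0.12, "lung": 0.12,
    "stomach": 0.12, "breast": 0.12, "remainder": 0.12, "bladder": 0.04,
    "oesophagus": 0.04, "liver": 0.04, "thyroid": 0.04,
    "bone_surface": 0.01, "brain": 0.01, "salivary_glands": 0.01, "skin": 0.01,
}

def effective_dose(organ_doses_mSv):
    """E = sum over tissues T of w_T * H_T, with H_T in mSv per year."""
    return sum(W_T[t] * h for t, h in organ_doses_mSv.items())

# hypothetical per-organ equivalent doses from a simulated usage scenario
annual = effective_dose({"lung": 0.30, "stomach": 0.25,
                         "thyroid": 0.10, "skin": 0.80})  # ~0.078 mSv
```

    The resulting effective dose is then compared directly against the 1 mSv y⁻¹ public dose limit, as done for each of the 44 products in the study.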

  15. Radiation dose considerations by intra-individual Monte Carlo simulations in dual source spiral coronary computed tomography angiography with electrocardiogram-triggered tube current modulation and adaptive pitch

    International Nuclear Information System (INIS)

    May, Matthias S.; Kuettner, Axel; Lell, Michael M.; Wuest, Wolfgang; Scharf, Michael; Uder, Michael; Deak, Paul; Kalender, Willi A.; Keller, Andrea K.; Haeberle, Lothar; Achenbach, Stephan; Seltmann, Martin

    2012-01-01

    To evaluate radiation dose levels in patients undergoing spiral coronary computed tomography angiography (CTA) on a dual-source system in clinical routine. Coronary CTA was performed for 56 patients with electrocardiogram-triggered tube current modulation (TCM) and heart-rate (HR) dependent pitch adaptation. Individual Monte Carlo (MC) simulations were performed for dose assessment. Retrospective simulations with constant tube current (CTC) served as reference. Lung tissue was segmented and used for organ and effective dose (ED) calculation. The estimated mean relative ED was 7.1 ± 2.1 mSv/100 mAs for TCM and 12.5 ± 5.3 mSv/100 mAs for CTC (P 70 bpm, 29 ± 12%). However, the lowest ED is achieved at high HR (5.2 ± 1.5 mSv/100 mAs), compared with intermediate (6.7 ± 1.6 mSv/100 mAs) and low (8.3 ± 2.1 mSv/100 mAs) HR when automated pitch adaptation is applied. Radiation dose savings of up to 52% are achievable by TCM at low and regular HR. However, the lowest ED is attained at high HR by pitch adaptation, despite inferior radiation dose reduction by TCM. • Monte Carlo simulations allow for individual radiation dose calculations. (orig.)

  16. Investigating the impossible: Monte Carlo simulations

    International Nuclear Information System (INIS)

    Kramer, Gary H.; Crowley, Paul; Burns, Linda C.

    2000-01-01

    Designing and testing new equipment can be an expensive and time-consuming process, or the desired performance characteristics may preclude its construction owing to technological shortcomings. Cost may also prevent equipment from being purchased for other scenarios to be tested. An alternative is to use Monte Carlo simulations to perform the investigations. This presentation exemplifies how Monte Carlo code calculations can be used to fill the gap. An example is given for the investigation of two sizes of germanium detector (70 mm and 80 mm diameter) at four different crystal thicknesses (15, 20, 25, and 30 mm), with predictions of how size affects the counting efficiency and the Minimum Detectable Activity (MDA). The Monte Carlo simulations have shown that detector efficiencies can be adequately modelled using photon transport if the data are used to investigate trends. The investigation of the effect of detector thickness on the counting efficiency has shown that thickness for a fixed-diameter detector of either 70 mm or 80 mm is unimportant up to 60 keV. At higher photon energies, the counting efficiency begins to decrease as the thickness decreases, as expected. The simulations predict that the MDA of either the 70 mm or 80 mm diameter detectors does not differ by more than a factor of 1.15 at 17 keV or 1.2 at 60 keV when comparing detectors of equivalent thicknesses. The MDA is slightly increased at 17 keV, and rises by about 52% at 660 keV, when the thickness is decreased from 30 mm to 15 mm. One could conclude from this information that the extra cost associated with the larger-area Ge detectors may not be justified by the slight improvement predicted in the MDA. (author)
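    MDA comparisons of this kind are conventionally made with the Currie formula; a minimal Python sketch with illustrative numbers (not the study's data), showing how a modest efficiency gain from a larger crystal translates directly into the MDA factors discussed above:

```python
import math

def mda(background_counts, efficiency, count_time_s, gamma_yield):
    """Currie minimum detectable activity (Bq): detection limit
    L_D = 2.71 + 4.65*sqrt(B) counts, converted to activity by dividing
    by detection efficiency, counting time, and gamma emission yield."""
    detection_limit = 2.71 + 4.65 * math.sqrt(background_counts)
    return detection_limit / (efficiency * count_time_s * gamma_yield)

# Illustrative numbers: a 20% efficiency gain from a larger crystal lowers
# the MDA by the same factor of 1.2 that the study compares.
mda_small = mda(background_counts=400, efficiency=0.010, count_time_s=3600, gamma_yield=0.85)
mda_large = mda(background_counts=400, efficiency=0.012, count_time_s=3600, gamma_yield=0.85)
```

With the background held fixed, the MDA ratio is simply the inverse ratio of efficiencies, which is why small efficiency gains yield only slight MDA improvements.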

  17. Ultrafast cone-beam CT scatter correction with GPU-based Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Yuan Xu

    2014-03-01

    Full Text Available Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to automatically finish the whole process, including both scatter correction and reconstruction, within 30 seconds. Methods: The method consists of six steps: 1) FDK reconstruction using raw projection data; 2) rigid registration of the planning CT to the FDK results; 3) MC scatter calculation at sparse view angles using the planning CT; 4) interpolation of the calculated scatter signals to other angles; 5) removal of scatter from the raw projections; 6) FDK reconstruction using the scatter-corrected projections. In addition to using a GPU to accelerate MC photon simulations, we also use a small number of photons and a down-sampled CT image in simulation to further reduce computation time. A novel denoising algorithm is used to eliminate MC noise from the simulated scatter images caused by low photon numbers. The method is validated on one simulated head-and-neck case with 364 projection angles. Results: We have examined the variation of the scatter signal among projection angles using Fourier analysis. It is found that scatter images at 31 angles are sufficient to restore those at all angles with < 0.1% error. For the simulated patient case with a resolution of 512 × 512 × 100, we simulated 5 × 10⁶ photons per angle. The total computation time is 20.52 seconds on an Nvidia GTX Titan GPU, and the time at each step is 2.53, 0.64, 14.78, 0.13, 0.19, and 2.25 seconds, respectively. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. It accomplishes the whole procedure of scatter correction and reconstruction within 30 seconds.
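    Step 4 of the pipeline, spreading sparse-angle MC scatter estimates to every projection angle, can be sketched as follows. This is an illustrative linear interpolation in Python, not the authors' implementation (their angle count of 31 was justified by Fourier analysis):

```python
def interpolate_scatter(sparse_angles, sparse_scatter, all_angles):
    """Linearly interpolate scalar scatter estimates computed at sparse MC
    view angles (sorted, in degrees) to every projection angle; angles
    outside the sparse range fall back to the last sparse estimate
    (wrap-around over 360 degrees is ignored in this sketch)."""
    out = []
    for a in all_angles:
        for i in range(len(sparse_angles) - 1):
            lo, hi = sparse_angles[i], sparse_angles[i + 1]
            if lo <= a <= hi:
                t = (a - lo) / (hi - lo)
                out.append((1 - t) * sparse_scatter[i] + t * sparse_scatter[i + 1])
                break
        else:
            out.append(sparse_scatter[-1])
    return out
```

Step 5 then amounts to subtracting the interpolated scatter from each raw projection before the final FDK reconstruction.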

  18. Proceedings of the conference on frontiers of Quantum Monte Carlo

    International Nuclear Information System (INIS)

    Gubernatis, J.E.

    1986-01-01

    This journal of conference proceedings includes papers on topics such as: computers and science; Quantum Monte Carlo; condensed matter physics (with papers including the statistical error of Green's Function Monte Carlo, a study of Trotter-like approximations, simulations of the Hubbard model, and stochastic simulation of fermions); chemistry (including papers on quantum simulations of aqueous systems, Fourier path integral methods, and a study of electron solvation in polar solvents using path integral calculations); atomic, molecular, and nuclear physics; high-energy physics; and advanced computer designs

  19. Uncertainty Propagation Analysis for the Monte Carlo Time-Dependent Simulations

    International Nuclear Information System (INIS)

    Shaukata, Nadeem; Shim, Hyung Jin

    2015-01-01

    In this paper, a conventional method to control the neutron population for super-critical systems is implemented. Instead of considering cycles, the simulation is divided into time intervals. At the end of each time interval, neutron population control is applied to the banked neutrons: randomly selected neutrons are discarded until the size of the neutron population matches the number of neutron histories at the beginning of the time simulation. A time-dependent simulation mode has also been implemented in the development version of the SERPENT 2 Monte Carlo code, in which a sequential population control mechanism has been proposed for modeling prompt super-critical systems. A Monte Carlo method has been used in the TART code for dynamic criticality calculations, where for super-critical systems the neutron population is allowed to grow over a period of time and is then uniformly combed to return it to the population size at the beginning of the time boundary. In this study, a conventional time-dependent Monte Carlo (TDMC) algorithm is implemented. For super-critical systems there is an exponential growth of the neutron population in the estimation of the neutron density tally, and the number of neutrons being tracked exceeds the memory of the computer. In order to control this exponential growth, a conventional time cut-off population control strategy is applied at the end of each time boundary in TDMC, and a scale factor is introduced to tally the desired neutron density at the end of each time boundary. The main purpose of this paper is the quantification of uncertainty propagation in neutron densities at the end of each time boundary for super-critical systems; this uncertainty is caused by the introduction of the scale factor. The effectiveness of TDMC is examined for a one-group infinite homogeneous problem (the rod model) and a two-group infinite homogeneous problem. The desired neutron density is tallied by the introduction of the scale factor.
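    The population control applied at each time boundary can be sketched as follows; this is a hypothetical helper, assuming unweighted neutron histories, with the recorded scale factor used to keep the density tally unbiased:

```python
import random

def comb_population(banked_neutrons, target_size, rng=random):
    """At a time boundary, reduce a super-critical neutron bank back to the
    starting population size by discarding randomly selected neutrons.
    Returns the survivors plus the scale factor that multiplies subsequent
    density tallies, since each survivor now represents n/target_size
    physical neutrons."""
    n = len(banked_neutrons)
    if n <= target_size:
        return list(banked_neutrons), 1.0
    survivors = rng.sample(banked_neutrons, target_size)
    scale = n / target_size
    return survivors, scale
```

Because the scale factor is itself a random quantity (it depends on how many neutrons were banked), it contributes the tally uncertainty that the paper sets out to quantify.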

  20. Uncertainty Propagation Analysis for the Monte Carlo Time-Dependent Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Shaukata, Nadeem; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)

    2015-10-15

    In this paper, a conventional method to control the neutron population for super-critical systems is implemented. Instead of considering cycles, the simulation is divided into time intervals. At the end of each time interval, neutron population control is applied to the banked neutrons: randomly selected neutrons are discarded until the size of the neutron population matches the number of neutron histories at the beginning of the time simulation. A time-dependent simulation mode has also been implemented in the development version of the SERPENT 2 Monte Carlo code, in which a sequential population control mechanism has been proposed for modeling prompt super-critical systems. A Monte Carlo method has been used in the TART code for dynamic criticality calculations, where for super-critical systems the neutron population is allowed to grow over a period of time and is then uniformly combed to return it to the population size at the beginning of the time boundary. In this study, a conventional time-dependent Monte Carlo (TDMC) algorithm is implemented. For super-critical systems there is an exponential growth of the neutron population in the estimation of the neutron density tally, and the number of neutrons being tracked exceeds the memory of the computer. In order to control this exponential growth, a conventional time cut-off population control strategy is applied at the end of each time boundary in TDMC, and a scale factor is introduced to tally the desired neutron density at the end of each time boundary. The main purpose of this paper is the quantification of uncertainty propagation in neutron densities at the end of each time boundary for super-critical systems; this uncertainty is caused by the introduction of the scale factor. The effectiveness of TDMC is examined for a one-group infinite homogeneous problem (the rod model) and a two-group infinite homogeneous problem. The desired neutron density is tallied by the introduction of the scale factor.

  1. Medical images of patients in voxel structures in high resolution for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Boia, Leonardo S.; Menezes, Artur F.; Silva, Ademir X.

    2011-01-01

    This work aims to present a computational process for converting patients' tomographic and MRI medical images into voxel structures and then into an input file to be manipulated by a Monte Carlo simulation code for radiotherapy treatment of tumors. The problem scenario inherent to the patient is simulated by such a process, using the volume element (voxel) as the unit of computational tracing. The head voxel structure geometry has voxels with volumes of around 1 mm³ and a population of millions, which allows for a realistic simulation and reduces the need for digital image processing techniques for adjustments and equalizations. With such additional data from the code, a more critical analysis can be developed in order to determine the volume of the tumor and the required protection. The patients' medical images were provided by Clinicas Oncologicas Integradas (COI/RJ), together with the previously performed planning. In order to execute this computational process, the SAPDI computational system is used for digital image processing and optimization of the data; the conversion program Scan2MCNP manipulates, processes, and converts the medical images into voxel structures for input files; and the graphic visualizer Moritz is used to verify the placement of the image geometry. (author)

  2. TU-H-CAMPUS-IeP1-01: Bias and Computational Efficiency of Variance Reduction Methods for the Monte Carlo Simulation of Imaging Detectors

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, D; Badano, A [Division of Imaging, Diagnostics and Software Reliability, OSEL/CDRH, Food & Drug Administration, MD (United States); Sempau, J [Technical University of Catalonia, Barcelona (Spain)

    2016-06-15

    Purpose: Variance reduction techniques (VRTs) are employed in Monte Carlo simulations to obtain estimates with reduced statistical uncertainty for a given simulation time. In this work, we study the bias and efficiency of a VRT for estimating the response of imaging detectors. Methods: We implemented Directed Sampling (DS), preferentially directing a fraction of emitted optical photons directly towards the detector by altering the isotropic model. The weight of each optical photon is appropriately modified to keep simulation estimates unbiased. We use a Monte Carlo tool called fastDETECT2 (part of the hybridMANTIS open-source package) for optical transport, modified for VRT. The weight of each photon is calculated as the ratio of the original probability (no VRT) to the new probability for a particular direction. For our analysis of bias and efficiency, we use pulse height spectra, point response functions, and Swank factors. We obtain results for a variety of cases, including analog (no VRT, isotropic distribution) and DS with fractions of 0.2 and 0.8 of the optical photons directed towards the sensor plane. We used 10,000 25-keV primaries. Results: The Swank factor for all cases in our simplified model converged quickly (within the first 100 primaries) to a stable value of 0.9. The root-mean-square error per pixel of the point response function between the analog and VRT cases was approximately 5e-4. Conclusion: Our preliminary results suggest that DS VRT does not affect the estimate of the mean of the Swank factor. Our findings indicate that it may be possible to design VRTs for imaging detector simulations that increase computational efficiency without introducing bias.
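    The weight bookkeeping described above (original isotropic probability over modified probability) can be sketched as follows. This is an illustrative reconstruction, not the fastDETECT2 implementation, and it reduces the geometry to the solid-angle fraction s of the sphere covered by the sensor cone:

```python
def ds_weight(in_cone, f, s):
    """Directed sampling: a fraction f of optical photons is emitted
    uniformly into the sensor cone covering a fraction s of the full
    sphere; the remainder are emitted isotropically. The statistical
    weight is the ratio of the isotropic direction density to the
    modified one."""
    if in_cone:
        return s / (f + (1.0 - f) * s)   # density is boosted inside the cone
    return 1.0 / (1.0 - f)               # and depleted outside it

# Unbiasedness check: weighted probability mass equals the isotropic mass.
f, s = 0.8, 0.05
mass_in = (f + (1.0 - f) * s) * ds_weight(True, f, s)      # equals s
mass_out = (1.0 - f) * (1.0 - s) * ds_weight(False, f, s)  # equals 1 - s
```

Photons forced toward the sensor carry weights below one, and the few emitted away carry weights above one, so the weighted average reproduces the analog estimate.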

  3. Particle-transport simulation with the Monte Carlo method

    International Nuclear Information System (INIS)

    Carter, L.L.; Cashwell, E.D.

    1975-01-01

    Attention is focused on the application of the Monte Carlo method to particle transport problems, with emphasis on neutron and photon transport. Topics covered include sampling methods, mathematical prescriptions for simulating particle transport, mechanics of simulating particle transport, neutron transport, and photon transport. A literature survey of 204 references is included. (GMT)
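    As one concrete instance of the sampling methods covered, the distance to a particle's next collision follows the exponential attenuation law and is sampled by inverting its CDF; a minimal sketch:

```python
import math
import random

def sample_path_length(sigma_t, rng=random):
    """Distance to the next collision for a particle in a medium with total
    macroscopic cross section sigma_t (1/cm): s = -ln(1 - xi) / sigma_t,
    which samples the exponential law p(s) = sigma_t * exp(-sigma_t * s).
    Using 1 - xi keeps the logarithm's argument in (0, 1]."""
    return -math.log(1.0 - rng.random()) / sigma_t
```

The sampled mean free path is 1/sigma_t, which the test below checks statistically.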

  4. An evaluation of the Monte Carlo simulation of SPECT projection data using MCNP and SimSPECT

    International Nuclear Information System (INIS)

    Selcow, E.C.; Dobrzeniecki, A.B.; Yanch, J.C.; Lu, A.; Belanger, M.J.

    1996-01-01

    Simulation of the complete nuclear medicine imaging situation for SPECT (Single Photon Emission Computed Tomography) produces synthetic images that are useful in the analysis and improvement of existing imaging systems and in the design of new and improved systems. The simulation methods the authors employ are based on probabilistic numerical calculations (Monte Carlo); they require enormous amounts of computer time and employ highly complex models (the tomographic acquisition of images through intricate collimators). The presentation consists of three parts. In the first, they describe the techniques developed to achieve reasonable simulation times and the tools built to allow interactive and effective analysis and processing of the resultant synthetic images. In the next part, they explore the limitations of such techniques for performing simulations of medical imaging situations. In the final part, they describe the areas of research that are promising for increasing the quality and breadth of the simulation process

  5. Topological zero modes in Monte Carlo simulations

    International Nuclear Information System (INIS)

    Dilger, H.

    1994-08-01

    We present an improvement of global Metropolis updating steps, the instanton hits, used in a hybrid Monte Carlo simulation of the two-flavor Schwinger model with staggered fermions. These hits are designed to change the topological sector of the gauge field. In order to match these hits to an unquenched simulation with pseudofermions, the approximate zero mode structure of the lattice Dirac operator has to be considered explicitly. (orig.)

  6. Kinetic energy of solid and liquid para-hydrogen: a path integral Monte Carlo simulation

    International Nuclear Information System (INIS)

    Zoppi, M.; Neumann, M.

    1992-01-01

    The translational (center of mass) kinetic energy of solid and liquid para-hydrogen has recently been measured by means of Deep Inelastic Neutron Scattering. We have evaluated the same quantity, under similar thermodynamic conditions, by means of Path Integral Monte Carlo computer simulation, modelling the system as a set of spherical molecules interacting through a pairwise additive Lennard-Jones potential. In spite of the crude approximations in the interaction potential, the agreement is excellent. The pressure was also computed by means of the same simulations. This quantity, compared with the equation of state for solid para-hydrogen given by Driessen and Silvera, shows poorer agreement and a negative value for the liquid state. We attribute this discrepancy to the limitations of the Lennard-Jones potential. (orig.)

  7. Magnetic properties of Ni/Au core/shell studied by Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Masrour, R., E-mail: rachidmasrour@hotmail.com [Laboratory of Materials, Processes, Environment and Quality, Cady Ayyed University, National School of Applied Sciences, Sidi Bouzid, Safi, 63 4600 (Morocco); LMPHE (URAC 12), Faculté des Sciences, Université Mohammed V-Agdal, Av. Ibn Batouta, B.P. 1014, Rabat (Morocco); Bahmad, L. [LMPHE (URAC 12), Faculté des Sciences, Université Mohammed V-Agdal, Av. Ibn Batouta, B.P. 1014, Rabat (Morocco); Hamedoun, M. [Institute of Nanomaterials and Nanotechnologies, MAScIR, Rabat (Morocco); Benyoussef, A. [LMPHE (URAC 12), Faculté des Sciences, Université Mohammed V-Agdal, Av. Ibn Batouta, B.P. 1014, Rabat (Morocco); Institute of Nanomaterials and Nanotechnologies, MAScIR, Rabat (Morocco); Hassan II Academy of Science and Technology, Rabat (Morocco); Hlil, E.K. [Institut Néel, CNRS et Université Joseph Fourier, BP 166, F-38042 Grenoble cedex 9 (France)

    2014-01-10

    The magnetic properties of ferromagnetic Ni/Au core/shell have been studied using Monte Carlo simulations within the Ising model framework. The considered Hamiltonian includes the exchange interactions between Ni–Ni, Au–Au and Ni–Au and the external magnetic field. The thermal total magnetizations and total magnetic susceptibilities of core/shell Ni/Au are computed. The critical temperature is deduced. The exchange interaction between Ni and Au atoms is obtained. In addition, the total magnetizations versus the external magnetic field and the crystal field for different temperatures are also established.
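    A minimal Metropolis sketch of the kind of Ising simulation described, reduced here to a 1D ring with a single coupling J and field h rather than the paper's Ni–Ni/Au–Au/Ni–Au core/shell couplings:

```python
import math
import random

def metropolis_chain(spins, J, h, T, steps, rng):
    """Single-site Metropolis updates for a ring of Ising spins (+/-1) with
    Hamiltonian H = -J * sum(s_i * s_{i+1}) - h * sum(s_i). Flipping spin i
    costs dE = 2 * s_i * (J * (left + right) + h); accept with the usual
    min(1, exp(-dE/T)) rule. Returns the final magnetization per spin."""
    n = len(spins)
    for _ in range(steps):
        i = rng.randrange(n)
        dE = 2.0 * spins[i] * (J * (spins[(i - 1) % n] + spins[(i + 1) % n]) + h)
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            spins[i] = -spins[i]
    return sum(spins) / n

# At low temperature with an aligned field the chain stays nearly saturated.
m = metropolis_chain([1] * 50, J=1.0, h=0.5, T=0.5, steps=5000, rng=random.Random(0))
```

The paper's model replaces the single J with three exchange constants and sweeps temperature to locate the susceptibility peak (the critical temperature).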

  8. Crop canopy BRDF simulation and analysis using Monte Carlo method

    NARCIS (Netherlands)

    Huang, J.; Wu, B.; Tian, Y.; Zeng, Y.

    2006-01-01

    The authors model the random interaction process between photons and the crop canopy. A Monte Carlo model has been developed to simulate the Bi-directional Reflectance Distribution Function (BRDF) of a crop canopy. Comparing the Monte Carlo model to the MCRM model, this paper analyzes the variations of different LAD and

  9. Monte Carlo simulation of radiative transfer in scattering, emitting, absorbing slab with gradient index

    International Nuclear Information System (INIS)

    Huang Yong; Liang Xingang; Xia Xinlin

    2005-01-01

    The Monte Carlo method is used to simulate the thermal emission of an absorbing-emitting-scattering slab with a gradient refractive index. Three Monte Carlo ray-tracing strategies are considered. The first strategy keeps the real distribution of the refractive index and traces bundles along curved paths. The second strategy discretizes the slab into sub-layers, each having a constant refractive index; the bundle is traced along a straight path in each sub-layer, and reflection at the inner interfaces is taken into account. The third strategy is similar to the second, but only total reflection at the inner interfaces is computed. Little difference is observed among the apparent thermal emission results of the three ray-tracing strategies. The results also show that the apparent hemispherical emissivity varies non-monotonically with increasing optical thickness for a strongly scattering gradient-index slab. Many parameters can greatly influence the apparent thermal emission.

  10. Characterization of a cylindrical plastic β-detector with Monte Carlo simulations of optical photons

    Energy Technology Data Exchange (ETDEWEB)

    Guadilla, V., E-mail: victor.guadilla@ific.uv.es [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Algora, A. [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Institute of Nuclear Research of the Hungarian Academy of Sciences, Debrecen H-4026 (Hungary); Tain, J.L.; Agramunt, J. [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Äystö, J. [University of Jyvaskyla, Department of Physics, P.O. Box 35, FI-40014 (Finland); Briz, J.A.; Cucoanes, A. [Subatech, CNRS/IN2P3, Nantes, EMN, F-44307 Nantes (France); Eronen, T. [University of Jyvaskyla, Department of Physics, P.O. Box 35, FI-40014 (Finland); Estienne, M.; Fallot, M. [Subatech, CNRS/IN2P3, Nantes, EMN, F-44307 Nantes (France); Fraile, L.M. [Universidad Complutense, Grupo de Física Nuclear, CEI Moncloa, E-28040 Madrid (Spain); Ganioğlu, E. [Department of Physics, Istanbul University, 34134 Istanbul (Turkey); Gelletly, W. [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Department of Physics, University of Surrey, GU2 7XH Guildford (United Kingdom); Gorelov, D.; Hakala, J.; Jokinen, A. [University of Jyvaskyla, Department of Physics, P.O. Box 35, FI-40014 (Finland); Jordan, D. [Instituto de Física Corpuscular, CSIC-Universidad de Valencia, E-46071 Valencia (Spain); Kankainen, A.; Kolhinen, V.; Koponen, J. [University of Jyvaskyla, Department of Physics, P.O. Box 35, FI-40014 (Finland); and others

    2017-05-11

    In this work we report on the Monte Carlo study performed to understand and reproduce experimental measurements of a new plastic β-detector with cylindrical geometry. Since energy deposition simulations differ from the experimental measurements for such a geometry, we show how the simulation of production and transport of optical photons does allow one to obtain the shapes of the experimental spectra. Moreover, taking into account the computational effort associated with this kind of simulation, we develop a method to convert the simulations of energy deposited into light collected, depending only on the interaction point in the detector. This method represents a useful solution when extensive simulations have to be done, as in the case of the calculation of the response function of the spectrometer in a total absorption γ-ray spectroscopy analysis.

  11. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    Science.gov (United States)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed. Catalogue identifier: AERO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland

  12. Profit Forecast Model Using Monte Carlo Simulation in Excel

    Directory of Open Access Journals (Sweden)

    Petru BALOGH

    2014-01-01

    Full Text Available Profit forecasting is very important for any company. The purpose of this study is to provide a method to estimate the profit and the probability of obtaining the expected profit. Monte Carlo methods are stochastic techniques, meaning they are based on the use of random numbers and probability statistics to investigate problems. Monte Carlo simulation furnishes the decision-maker with a range of possible outcomes and the probabilities with which they will occur for any choice of action. Our example of Monte Carlo simulation in Excel is a simplified profit forecast model. Each step of the analysis is described in detail. The input data for the case presented (the number of leads per month, the percentage of leads that result in sales, the cost of a single lead, the profit per sale, and the fixed cost) allow one to obtain the profit and the associated probability of achieving it.
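    The same model can be mirrored outside Excel in a few lines of Python; the distributions and numbers below are illustrative assumptions, not the article's inputs:

```python
import random

def simulate_profit(n_trials, rng=random):
    """Monte Carlo profit forecast with the article's input structure:
    leads per month, lead-to-sale conversion rate, cost per lead, profit
    per sale, and fixed cost (all figures here are illustrative). Returns
    the expected profit and the probability of reaching a target profit."""
    profits = []
    for _ in range(n_trials):
        leads = rng.randint(80, 120)          # leads per month
        conversion = rng.uniform(0.10, 0.20)  # share of leads that become sales
        sales = leads * conversion
        profit = sales * 500.0 - leads * 20.0 - 3000.0
        profits.append(profit)
    expected = sum(profits) / n_trials
    p_target = sum(p >= 2000.0 for p in profits) / n_trials
    return expected, p_target

expected, p_hit = simulate_profit(20000, random.Random(1))
```

Each random draw plays the role of one recalculated Excel row; the histogram of `profits` is the "range of possible outcomes" the abstract refers to.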

  13. Monte Carlo Simulation of Influence of Input Parameters Uncertainty on Output Data

    International Nuclear Information System (INIS)

    Sobek, Lukas

    2010-01-01

    The input parameters of a complex system in a probabilistic simulation are treated by means of probability density functions (PDFs). The results of the simulation therefore also have a probabilistic character. Monte Carlo simulation is widely used to obtain predictions concerning the probability of risk. The Monte Carlo method was used to calculate histograms of the PDF of the release rate given the uncertainty in the distribution coefficients of the radionuclides ¹³⁵Cs and ²³⁵U.
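    A minimal sketch of this kind of uncertainty propagation: sample the distribution coefficient Kd from a PDF and collect the resulting release rates. The lognormal PDF and the retardation-style release model below are illustrative assumptions, not those of the paper:

```python
import math
import random

def release_rate_samples(n, kd_mean, kd_sigma, rng=random):
    """Propagate uncertainty in the distribution coefficient Kd (sampled
    from a lognormal PDF, a common choice for sorption data) through a toy
    release-rate model R = R0 / (1 + (rho_b/theta) * Kd)."""
    R0, rho_over_theta = 1.0, 4.0  # illustrative constants
    out = []
    for _ in range(n):
        kd = rng.lognormvariate(math.log(kd_mean), kd_sigma)
        out.append(R0 / (1.0 + rho_over_theta * kd))
    return out

samples = release_rate_samples(5000, kd_mean=1.0, kd_sigma=0.5, rng=random.Random(2))
```

Binning `samples` into a histogram yields exactly the kind of release-rate PDF the abstract describes.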

  14. A brief history of the introduction of generalized ensembles to Markov chain Monte Carlo simulations

    Science.gov (United States)

    Berg, Bernd A.

    2017-03-01

    The most efficient weights for Markov chain Monte Carlo calculations of physical observables are not necessarily those of the canonical ensemble. Generalized ensembles, which do not exist in nature but can be simulated on computers, often lead to much faster convergence. In particular, they have been used for simulations of first-order phase transitions and for simulations of complex systems in which conflicting constraints lead to a rugged free-energy landscape. Starting off with the Metropolis algorithm and Hastings' extension, I present a minireview which focuses on the explosive use of generalized ensembles in the early 1990s. Illustrations are given, ranging from spin models to peptides.
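    The core idea, replacing the canonical weight exp(-E/T) in the Metropolis acceptance step with an arbitrary weight w(E), can be sketched as:

```python
import math
import random

def metropolis_generalized(energies, log_w, steps, rng):
    """Metropolis sampling over a discrete set of states with an arbitrary
    weight function, accepting with min(1, w(E_new)/w(E_old)) under a
    symmetric uniform proposal. Passing log_w = lambda E: -E/T recovers the
    canonical ensemble; a flat log_w samples all states with equal weight
    (the spirit of multicanonical/generalized-ensemble methods).
    Returns per-state visit counts."""
    state = 0
    visits = [0] * len(energies)
    for _ in range(steps):
        trial = rng.randrange(len(energies))
        delta = log_w(energies[trial]) - log_w(energies[state])
        if delta >= 0 or rng.random() < math.exp(delta):
            state = trial
        visits[state] += 1
    return visits

visits_flat = metropolis_generalized([0.0, 1.0, 2.0, 3.0], lambda E: 0.0, 40000, random.Random(3))
visits_cold = metropolis_generalized([0.0, 1.0, 2.0, 3.0], lambda E: -E / 0.3, 40000, random.Random(3))
```

With flat weights the chain visits every energy level about equally, which is exactly what lets generalized ensembles cross the free-energy barriers that trap a canonical simulation at a first-order transition.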

  15. Monte-Carlo simulation of heavy-ion collisions

    International Nuclear Information System (INIS)

    Schenke, Bjoern; Jeon, Sangyong; Gale, Charles

    2011-01-01

    We present Monte-Carlo simulations for heavy-ion collisions combining PYTHIA and the McGill-AMY formalism to describe the evolution of hard partons in a soft background, modelled using hydrodynamic simulations. MARTINI generates full event configurations in the high p T region that take into account thermal QCD and QED effects as well as effects of the evolving medium. This way it is possible to perform detailed quantitative comparisons with experimental observables.

  16. Atomic scale Monte Carlo simulations of BF3 plasma immersion ion implantation in Si

    International Nuclear Information System (INIS)

    La Magna, Antonino; Fisicaro, Giuseppe; Nicotra, Giuseppe; Spiegel, Yohann; Torregrosa, Frank

    2014-01-01

    We present a numerical model aimed at accurately simulating the plasma immersion ion implantation (PIII) process in micro- and nano-patterned Si samples. The code, based on the Monte Carlo approach, is designed to reproduce all the relevant physical phenomena involved in the process. The particle-based simulation technique is fundamental to efficiently computing the material modifications promoted by the plasma implantation at atomic resolution. The accuracy in the description of the process kinetics is achieved by linking (one to one) each virtual Monte Carlo event to each possible atomic phenomenon (e.g. ion penetration, neutral absorption, ion-induced surface modification, etc.). The code is designed to be coupled with a generic plasma state, characterized by the particle types (ions and neutrals), their flow rates and their energy/angle distributions. The coupling with a Poisson solver allows the simulation of the correct trajectories of charged particles in the void regions of the micro-structures. The implemented model is able to predict the 2D implantation profiles and significantly support the process design. (copyright 2014 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)
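    The one-to-one link between Monte Carlo events and atomic phenomena is typically realized by rate-proportional event selection (the kinetic Monte Carlo pattern); a minimal sketch with hypothetical process names and rates:

```python
import random

def choose_event(rates, rng=random):
    """Kinetic-Monte-Carlo event selection: each possible atomic process
    (e.g. ion penetration, neutral absorption, surface modification) has a
    rate; pick one with probability proportional to its rate by locating a
    uniform draw within the cumulative rate sum."""
    total = sum(rates.values())
    r = rng.random() * total
    acc = 0.0
    for name, rate in rates.items():
        acc += rate
        if r < acc:
            return name
    return name  # guard against floating-point rounding at the upper edge
```

Over many draws, the event frequencies converge to the rate ratios, which is what ties the simulated event stream to the physical kinetics.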

  17. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

    To reduce the patient dose, several approaches, such as spectral imaging using photon-counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images at low dose, the true signal-to-noise properties are mainly determined by the image quality in the projections. We are developing an analytical simulation platform describing projections, to investigate how the quantum-interaction physics in each component of a CT system affects image quality in the projections. This simulator will be very useful for the economical design or optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform, with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. The remaining work before the meeting includes the following: each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations; we will build up energy-dependent scatter and pixel-crosstalk kernels and show their effects on image quality in projections and reconstructed images; and we will investigate the effects of projections obtained under various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented, with discussions of its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method in this study is simple and can be used conveniently in a lab environment

  18. Monte Carlo Simulations of Neutron Oil well Logging Tools

    International Nuclear Information System (INIS)

    Azcurra, Mario

    2002-01-01

    Monte Carlo simulations of simple neutron oil well logging tools in typical geological formations are presented. The simulated tools consist of both 14 MeV pulsed and continuous Am-Be neutron sources with time-gated and continuous gamma-ray detectors, respectively. The geological formation consists of pure limestone with 15% absolute porosity over a wide range of oil saturation. The particle transport was performed with the Monte Carlo N-Particle Transport Code System, MCNP-4B. Several gamma-ray spectra were obtained at the detector position that allow composition analysis of the formation. In particular, the C/O ratio was analyzed as an indicator of oil saturation. Further calculations are proposed to simulate actual detector responses, in order to help understand the relation between the detector response and the formation composition

  19. Monte Carlo simulations of neutron oil well logging tools

    International Nuclear Information System (INIS)

    Azcurra, Mario O.; Zamonsky, Oscar M.

    2003-01-01

    Monte Carlo simulations of simple neutron oil well logging tools in typical geological formations are presented. The simulated tools consist of both 14 MeV pulsed and continuous Am-Be neutron sources with time-gated and continuous gamma-ray detectors, respectively. The geological formation consists of pure limestone with 15% absolute porosity over a wide range of oil saturation. The particle transport was performed with the Monte Carlo N-Particle Transport Code System, MCNP-4B. Several gamma-ray spectra were obtained at the detector position that allow composition analysis of the formation. In particular, the C/O ratio was analyzed as an indicator of oil saturation. Further calculations are proposed to simulate actual detector responses, in order to help understand the relation between the detector response and the formation composition. (author)

  20. Multilevel Monte Carlo methods using ensemble level mixed MsFEM for two-phase flow and transport simulations

    KAUST Repository

    Efendiev, Yalchin R.

    2013-08-21

    In this paper, we propose multilevel Monte Carlo (MLMC) methods that use ensemble level mixed multiscale methods in the simulations of multiphase flow and transport. The contribution of this paper is twofold: (1) a design of ensemble level mixed multiscale finite element methods and (2) a novel use of mixed multiscale finite element methods within multilevel Monte Carlo techniques to speed up the computations. The main idea of ensemble level multiscale methods is to construct local multiscale basis functions that can be used for any member of the ensemble. In this paper, we consider two ensemble level mixed multiscale finite element methods: (1) the no-local-solve-online ensemble level method (NLSO); and (2) the local-solve-online ensemble level method (LSO). The first approach was proposed in Aarnes and Efendiev (SIAM J. Sci. Comput. 30(5):2319-2339, 2008) while the second approach is new. Both mixed multiscale methods use a number of snapshots of the permeability media in generating multiscale basis functions. As a result, in the off-line stage, we construct multiple basis functions for each coarse region where basis functions correspond to different realizations. In the no-local-solve-online ensemble level method, one uses the whole set of precomputed basis functions to approximate the solution for an arbitrary realization. In the local-solve-online ensemble level method, one uses the precomputed functions to construct a multiscale basis for a particular realization. With this basis, the solution corresponding to this particular realization is approximated in LSO mixed multiscale finite element method (MsFEM). In both approaches, the accuracy of the method is related to the number of snapshots computed based on different realizations that one uses to precompute a multiscale basis. In this paper, ensemble level multiscale methods are used in multilevel Monte Carlo methods (Giles 2008a, Oper. Res. 56(3):607-617, b). In multilevel Monte Carlo methods, more accurate
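The telescoping idea behind multilevel Monte Carlo — estimating the finest-level expectation as a coarse estimate plus cheap level-to-level corrections, with fewer samples on the finer, more expensive levels — can be sketched on a toy Euler–Maruyama problem. This is a generic MLMC illustration, not the ensemble-level MsFEM method of the paper; all rates and sample counts are invented:

```python
import random, math

def mlmc_level(l, n, r=0.05, sig=0.2, T=1.0, s0=1.0):
    """Mean of (fine - coarse) payoff differences at level l (Euler-Maruyama).

    Level l uses 2**l timesteps; the coupled coarse path (level l-1) reuses
    the same Brownian increments, summed in pairs. Level 0 returns the plain
    coarse estimate. All parameters are illustrative.
    """
    mf = 2 ** l            # fine steps
    hf = T / mf
    total = 0.0
    for _ in range(n):
        sf = sc = s0
        dwc = 0.0
        for step in range(mf):
            dw = random.gauss(0.0, math.sqrt(hf))
            sf += r * sf * hf + sig * sf * dw
            dwc += dw
            if step % 2 == 1 and l > 0:   # coarse path uses summed increments
                sc += r * sc * 2 * hf + sig * sc * dwc
                dwc = 0.0
        total += sf - (sc if l > 0 else 0.0)
    return total / n

random.seed(1)
# Telescoping sum: E[P_L] = sum_l E[P_l - P_{l-1}], fewer samples on finer levels
estimate = sum(mlmc_level(l, n) for l, n in [(0, 4000), (1, 1000), (2, 250)])
```

Because the coupled differences have small variance, the fine levels need far fewer samples than a single-level estimator of the same accuracy.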

  1. MCViNE – An object oriented Monte Carlo neutron ray tracing simulation package

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Jiao Y.Y., E-mail: linjiao@ornl.gov [Caltech Center for Advanced Computing Research, California Institute of Technology (United States); Department of Applied Physics and Materials Science, California Institute of Technology (United States); Neutron Data Analysis and Visualization Division, Oak Ridge National Laboratory (United States); Smith, Hillary L. [Department of Applied Physics and Materials Science, California Institute of Technology (United States); Granroth, Garrett E., E-mail: granrothge@ornl.gov [Neutron Data Analysis and Visualization Division, Oak Ridge National Laboratory (United States); Abernathy, Douglas L.; Lumsden, Mark D.; Winn, Barry; Aczel, Adam A. [Quantum Condensed Matter Division, Oak Ridge National Laboratory (United States); Aivazis, Michael [Caltech Center for Advanced Computing Research, California Institute of Technology (United States); Fultz, Brent, E-mail: btf@caltech.edu [Department of Applied Physics and Materials Science, California Institute of Technology (United States)

    2016-02-21

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
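The two ideas this abstract highlights — neutron scatterers represented as objects, and multiple scattering handled by recursion — can be caricatured in a few lines. This is not the MCViNE API; the class, parameter names, and probabilities are invented for illustration:

```python
import random

class Scatterer:
    """Toy scatterer object: each interaction absorbs, scatters, or transmits."""

    def __init__(self, scatter_prob, absorb_prob):
        self.p_scat, self.p_abs = scatter_prob, absorb_prob

    def interact(self, weight, depth=0, max_depth=5):
        """Return the detected weight; recurse once per scattering event."""
        if depth >= max_depth:
            return 0.0
        u = random.random()
        if u < self.p_abs:
            return 0.0                      # neutron absorbed
        if u < self.p_abs + self.p_scat:    # scattered: follow it recursively
            return self.interact(weight, depth + 1)
        return weight                       # transmitted toward the detector

random.seed(0)
sample = Scatterer(scatter_prob=0.3, absorb_prob=0.1)
detected = sum(sample.interact(1.0) for _ in range(10000)) / 10000
```

In a real instrument model the recursion would also change direction and energy at each scattering event; here only the bookkeeping structure is shown.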

  2. Evaluation of cobalt-60 energy deposit in mouse and monkey using Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Woo, Sang Keun; Kim, Wook; Park, Yong Sung; Kang, Joo Hyun; Lee, Yong Jin [Korea Institute of Radiological and Medical Sciences, KIRAMS, Seoul (Korea, Republic of); Cho, Doo Wan; Lee, Hong Soo; Han, Su Cheol [Jeonbuk Department of Inhalation Research, Korea Institute of toxicology, KRICT, Jeongeup (Korea, Republic of)

    2016-12-15

    These absorbed doses can be calculated using the Monte Carlo transport code MCNP (Monte Carlo N-Particle transport code). The absorbed dose in internal radiotherapy is calculated either with conventional software such as OLINDA/EXM or with Monte Carlo simulation. However, OLINDA/EXM cannot calculate individual absorbed doses or doses to non-standard organs such as tumors, whereas Monte Carlo simulation can calculate specific absorbed doses, including those to non-standard organs, using individual CT images. For external radiotherapy, the absorbed dose can be calculated from the specific absorbed energy in each organ using Monte Carlo simulation. The specific absorbed energy in each organ differs between species, and even within the same species, because organ size, position, and density differ. The aim of this study was to individually evaluate the cobalt-60 energy deposit in a mouse and a monkey using Monte Carlo simulation. The absorbed energy in the mouse heart was 54.6-fold higher than that in the monkey heart; likewise, the factors were 88.4 for the lung, 16.0 for the liver, and 29.4 for the urinary bladder. This indicates that the distances between organs and the organ masses determine the absorbed energy. These results may help to calculate absorbed dose more accurately and to plan external beam radiotherapy and internal radiotherapy.

  3. Evaluation of cobalt-60 energy deposit in mouse and monkey using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Woo, Sang Keun; Kim, Wook; Park, Yong Sung; Kang, Joo Hyun; Lee, Yong Jin; Cho, Doo Wan; Lee, Hong Soo; Han, Su Cheol

    2016-01-01

    These absorbed doses can be calculated using the Monte Carlo transport code MCNP (Monte Carlo N-Particle transport code). The absorbed dose in internal radiotherapy is calculated either with conventional software such as OLINDA/EXM or with Monte Carlo simulation. However, OLINDA/EXM cannot calculate individual absorbed doses or doses to non-standard organs such as tumors, whereas Monte Carlo simulation can calculate specific absorbed doses, including those to non-standard organs, using individual CT images. For external radiotherapy, the absorbed dose can be calculated from the specific absorbed energy in each organ using Monte Carlo simulation. The specific absorbed energy in each organ differs between species, and even within the same species, because organ size, position, and density differ. The aim of this study was to individually evaluate the cobalt-60 energy deposit in a mouse and a monkey using Monte Carlo simulation. The absorbed energy in the mouse heart was 54.6-fold higher than that in the monkey heart; likewise, the factors were 88.4 for the lung, 16.0 for the liver, and 29.4 for the urinary bladder. This indicates that the distances between organs and the organ masses determine the absorbed energy. These results may help to calculate absorbed dose more accurately and to plan external beam radiotherapy and internal radiotherapy.
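The closing observation of this record — that absorbed energy depends on organ mass and source-organ distance — follows directly from dose being energy per unit mass, with the intercepted energy falling off roughly as 1/r². A toy sketch with invented masses, distances, and cross-sections (not values from the study):

```python
import math

def absorbed_dose(emitted_j, organ_mass_kg, distance_m, cross_section_m2):
    """Toy absorbed dose (Gy = J/kg) for an isotropic point source.

    The organ intercepts the solid-angle fraction of emitted energy given by
    its cross-section over the sphere at its distance; all deposited energy
    is assumed absorbed. Purely illustrative geometry, no attenuation.
    """
    fraction = cross_section_m2 / (4 * math.pi * distance_m ** 2)
    deposited = emitted_j * fraction
    return deposited / organ_mass_kg

# Same cross-section at the same distance: the lighter organ gets more dose
mouse_heart = absorbed_dose(1.0, 0.00015, 0.05, 1e-4)   # masses are invented
monkey_heart = absorbed_dose(1.0, 0.025, 0.05, 1e-4)
ratio = mouse_heart / monkey_heart
```

With equal intercepted energy, the dose ratio reduces to the inverse mass ratio, which is the qualitative effect the abstract reports.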

  4. Monte Carlo simulations of the particle transport in semiconductor detectors of fast neutrons

    International Nuclear Information System (INIS)

    Sedlačková, Katarína; Zaťko, Bohumír; Šagátová, Andrea; Nečas, Vladimír

    2013-01-01

    Several Monte Carlo all-particle transport codes are under active development around the world. In this paper we focus on the capabilities of the MCNPX code (Monte Carlo N-Particle eXtended) to follow the particle transport in a semiconductor detector of fast neutrons. A semiconductor detector based on semi-insulating GaAs was the object of our investigation. As a converter material capable of producing charged particles via the (n,p) interaction, high-density polyethylene (HDPE) was employed. As the source of fast neutrons, a ²³⁹Pu–Be neutron source was used in the model. The simulations were performed using the MCNPX code, which makes it possible to track not only neutrons but also recoil protons at all energies of interest. Hence, the MCNPX code enables seamless particle tracking, and no other computer program is needed to process the particle transport. The determination of the optimal thickness of the conversion layer and the minimum thickness of the active region of the semiconductor detector, as well as the simulation of energy spectra, were the principal goals of the computer modeling. Theoretical detector responses showed that the best detection efficiency is achieved for a 500 μm thick HDPE converter layer. The minimum detector active region thickness has been estimated to be about 400 μm. -- Highlights: ► Application of the MCNPX code for fast neutron detector design is demonstrated. ► Simulations of the particle transport through the conversion film of HDPE are presented. ► Simulations of the particle transport through the detector active region are presented. ► The optimal thickness of the HDPE conversion film has been calculated. ► Detection efficiency of 0.135% was reached for the 500 μm thick HDPE conversion film
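The converter-thickness optimum described here arises from a trade-off: a thicker HDPE layer converts more neutrons, but recoil protons born too far from the converter/detector interface never reach the active region. A 1-D toy Monte Carlo of that trade-off — the mean free path and proton range below are illustrative numbers, not the paper's:

```python
import random

def efficiency(thickness_um, mfp_um=20000.0, proton_range_um=600.0, n=20000):
    """Toy 1-D detection efficiency of an HDPE converter of given thickness.

    A neutron converts at an exponentially distributed depth x; the recoil
    proton is counted only if it is born inside the converter and within its
    range of the back face. All parameters are illustrative assumptions.
    """
    hits = 0
    for _ in range(n):
        x = random.expovariate(1.0 / mfp_um)      # conversion depth
        if x < thickness_um and thickness_um - x < proton_range_um:
            hits += 1
    return hits / n

random.seed(2)
thin, thick = efficiency(100.0), efficiency(2000.0)
```

Efficiency grows with thickness until the converter is a few proton ranges thick, after which the extra material only absorbs protons; that saturation is what makes a finite optimum like the reported 500 μm plausible.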

  5. Monte Carlo simulations of adsorption-induced segregation

    DEFF Research Database (Denmark)

    Christoffersen, Ebbe; Stoltze, Per; Nørskov, Jens Kehlet

    2002-01-01

    Through the use of Monte Carlo simulations we study the effect of adsorption-induced segregation. From the bulk composition, degree of dispersion and the partial pressure of the gas phase species we calculate the surface composition of bimetallic alloys. We show that both segregation and adsorption...
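The competition this record describes — a segregation energy pushing one species into the bulk versus an adsorption energy pulling it to the surface — can be sketched with a minimal Metropolis model of independent surface sites exchanging with a bulk reservoir. All energies are illustrative and in units of kT; this is not the authors' model:

```python
import random, math

def surface_fraction_A(e_seg=1.0, e_ads=0.0, n_sites=500, steps=20000):
    """Metropolis sketch: fraction of surface sites occupied by species A.

    e_seg > 0 means A prefers the bulk; e_ads > 0 is the extra adsorption
    energy gained when A sits at the surface. Values are illustrative.
    """
    surface = [0] * n_sites        # 1 = A at this surface site, 0 = B
    for _ in range(steps):
        i = random.randrange(n_sites)
        new = 1 - surface[i]
        d_e = (e_seg - e_ads) * (new - surface[i])   # energy change of the swap
        if d_e <= 0 or random.random() < math.exp(-d_e):
            surface[i] = new
    return sum(surface) / n_sites

random.seed(3)
clean = surface_fraction_A(e_ads=0.0)     # no gas phase: A avoids the surface
covered = surface_fraction_A(e_ads=2.0)   # strong adsorption: A segregates
```

Turning on the adsorption energy reverses the sign of the effective segregation energy, and the surface composition flips accordingly, which is the qualitative effect studied in the paper.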

  6. Gamma irradiation of cultural artifacts for disinfection using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Choi, Jong-il; Yoon, Minchul; Kim, Dongho

    2012-01-01

    In this study, the disinfection of Korean cultural artifacts by gamma irradiation was investigated by simulating the absorbed dose distribution in the objects with the Monte Carlo method. Fungal contamination was identified on two traditional Korean agricultural tools, Hongdukkae and Holtae, which had been stored in a museum. Nine primary species were identified from these items: Bjerkandera adusta, Dothideomycetes sp., Penicillium sp., Cladosporium tenuissimum, Aspergillus versicolor, Penicillium sp., Entrophospora sp., Aspergillus sydowii, and Corynascus sepedonium. However, these fungi were completely inactivated by gamma irradiation at an absorbed dose of 20 kGy on the front side. The Monte Carlo N-Particle Transport Code was used to simulate the doses applied to these cultural artifacts, and the measured dose distributions were well predicted by the simulations. These results show that irradiation is effective for the disinfection of cultural artifacts and that the dose distribution can be predicted with Monte Carlo simulations, allowing optimization of the radiation treatment. - Highlights: ► Radiation was applied for the disinfection of Korean cultural artifacts. ► Fungi on the artifacts were completely inactivated by the irradiation. ► The Monte Carlo N-Particle Transport Code was used to predict the dose distribution. ► This study is applicable to the preservation of cultural artifacts by irradiation.

  7. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    ias

    on the development of nuclear weapons in Los Alamos … cantly improved the paper. … Carlo simulations of solids, Reviews of Modern Physics, Vol. 73, pp. 33– … The computer algorithms are usually based on a random seed that starts the …

  8. Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method

    CERN Document Server

    2002-01-01

    This report condenses basic theories and advanced applications of neutron/gamma-ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and the cross-section libraries used in continuous-energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, the design of ITER, experimental analyses of a fast critical assembly, core analyses of the JMTR, simulation of a pulsed neutron experiment, core analyses of the HTTR, duct streaming calculations, bulk shielding calculations, and neutron/gamma-ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes and the parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.

  9. Clustering and traveling waves in the Monte Carlo criticality simulation of decoupled and confined media

    Directory of Open Access Journals (Sweden)

    Eric Dumonteil

    2017-09-01

    The Monte Carlo criticality simulation of decoupled systems, as for instance in large reactor cores, has been a challenging issue for a long time. In particular, due to limited computer time resources, the number of neutrons simulated per generation is still many orders of magnitude below realistic statistics, even during the start-up phases of reactors. This limited number of neutrons triggers a strong clustering effect of the neutron population that affects Monte Carlo tallies. Below a certain threshold, not only is the variance affected but also the estimation of the eigenvectors. In this paper we will build a time-dependent diffusion equation that takes into account both spatial correlations and population control (a fixed number of neutrons along generations). We will show that its solution obeys a traveling-wave dynamic, and we will discuss the mechanism that explains this biasing of local tallies whenever leakage boundary conditions are applied to the system.
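The clustering effect described in this abstract can be reproduced in a toy 1-D model: critical branching plus fixed-size resampling (population control) makes the surviving neutrons share ever more recent ancestors, so an initially uniform population collapses into spatial clusters. A hedged sketch, with all parameters invented:

```python
import random, statistics

def spread_after(generations, n=100, step=0.001, seed=4):
    """Population spread (std. dev. of positions) after some generations.

    Each generation every neutron diffuses, then dies or branches into two
    with equal probability (critical branching); the population is resampled
    back to size n, mimicking population control. Illustrative parameters.
    """
    rng = random.Random(seed)
    pos = [rng.random() for _ in range(n)]      # uniform start in [0, 1)
    for _ in range(generations):
        pos = [p + rng.gauss(0.0, step) for p in pos]
        children = [p for p in pos for _ in range(rng.choice([0, 2]))]
        pos = [rng.choice(children) for _ in range(n)]   # fixed-size resample
    return statistics.pstdev(pos)

initial, late = spread_after(0), spread_after(400)
```

The spread shrinks far below the uniform value because resampled neutrons are copies of a shrinking set of ancestors; this genealogical collapse is the mechanism behind the biased local tallies the paper analyzes.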

  10. Monte Carlo simulation of x-ray spectra in diagnostic radiology and mammography using MCNP4C

    Energy Technology Data Exchange (ETDEWEB)

    Ay, M R [Department of Physics and Nuclear Sciences, AmirKabir University of Technology, Tehran (Iran, Islamic Republic of); Shahriari, M [Department of Nuclear Engineering, Shahid Beheshti University, Tehran (Iran, Islamic Republic of); Sarkar, S [Department of Medical Physics, Tehran University of Medical Science, Tehran (Iran, Islamic Republic of); Adib, M [TPP Co., GE Medical Systems, Iran Authorized Distributor, Tehran (Iran, Islamic Republic of); Zaidi, H [Division of Nuclear Medicine, Geneva University Hospital, 1211 Geneva (Switzerland)

    2004-11-07

    The general purpose Monte Carlo N-particle radiation transport computer code (MCNP4C) was used for the simulation of x-ray spectra in diagnostic radiology and mammography. The electrons were transported until they slow down and stop in the target. Both bremsstrahlung and characteristic x-ray production were considered in this work. We focus on the simulation of various target/filter combinations to investigate the effect of tube voltage, target material and filter thickness on x-ray spectra in the diagnostic radiology and mammography energy ranges. The simulated x-ray spectra were compared with experimental measurements and spectra calculated by IPEM report number 78. In addition, the anode heel effect and off-axis x-ray spectra were assessed for different anode angles and target materials and the results were compared with EGS4-based Monte Carlo simulations and measured data. Quantitative evaluation of the differences between our Monte Carlo simulated and comparison spectra was performed using Student's t-test statistical analysis. Generally, there is good agreement between the simulated x-ray and comparison spectra, although there are systematic differences between the simulated and reference spectra, especially in the K-characteristic x-ray intensity. Nevertheless, no statistically significant differences have been observed between the IPEM spectra and the simulated spectra. It has been shown that the difference between MCNP simulated spectra and IPEM spectra in the low energy range is the result of the overestimation of characteristic photons following the normalization procedure. The transmission curves produced by MCNP4C agree well with the IPEM report, especially for tube voltages of 50 kV and 80 kV. The systematic discrepancy for higher tube voltages is the result of systematic differences between the corresponding spectra.

  11. Monte Carlo simulation of x-ray spectra in diagnostic radiology and mammography using MCNP4C

    Science.gov (United States)

    Ay, M. R.; Shahriari, M.; Sarkar, S.; Adib, M.; Zaidi, H.

    2004-11-01

    The general purpose Monte Carlo N-particle radiation transport computer code (MCNP4C) was used for the simulation of x-ray spectra in diagnostic radiology and mammography. The electrons were transported until they slow down and stop in the target. Both bremsstrahlung and characteristic x-ray production were considered in this work. We focus on the simulation of various target/filter combinations to investigate the effect of tube voltage, target material and filter thickness on x-ray spectra in the diagnostic radiology and mammography energy ranges. The simulated x-ray spectra were compared with experimental measurements and spectra calculated by IPEM report number 78. In addition, the anode heel effect and off-axis x-ray spectra were assessed for different anode angles and target materials and the results were compared with EGS4-based Monte Carlo simulations and measured data. Quantitative evaluation of the differences between our Monte Carlo simulated and comparison spectra was performed using Student's t-test statistical analysis. Generally, there is good agreement between the simulated x-ray and comparison spectra, although there are systematic differences between the simulated and reference spectra, especially in the K-characteristic x-ray intensity. Nevertheless, no statistically significant differences have been observed between the IPEM spectra and the simulated spectra. It has been shown that the difference between MCNP simulated spectra and IPEM spectra in the low energy range is the result of the overestimation of characteristic photons following the normalization procedure. The transmission curves produced by MCNP4C agree well with the IPEM report, especially for tube voltages of 50 kV and 80 kV. The systematic discrepancy for higher tube voltages is the result of systematic differences between the corresponding spectra.
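A filtered bremsstrahlung spectrum of the kind being simulated in these two records can be caricatured with Kramers' law, N(E) ∝ (E_max − E)/E, hardened by an exponential filtration factor. The aluminium attenuation model below is a crude 1/E³ stand-in, not tabulated data, and characteristic lines are omitted entirely:

```python
import math

def filtered_spectrum(kvp=80.0, filter_mm=2.5, n_bins=79):
    """Relative filtered bremsstrahlung spectrum, binned in energy (keV).

    Shape: Kramers' law (kvp - E) / E times exp(-mu_Al(E) * t). The mu
    model is a rough illustrative 1/E^3 dependence, not real attenuation data.
    """
    def mu_al(e_kev):                                # per mm, illustrative
        return 25.0 * (30.0 / e_kev) ** 3 / 10.0
    spectrum = {}
    for i in range(1, n_bins + 1):
        e = i * kvp / n_bins
        if e >= kvp:
            continue
        n = (kvp - e) / e                            # Kramers' law shape
        spectrum[round(e, 1)] = n * math.exp(-mu_al(e) * filter_mm)
    return spectrum

spec = filtered_spectrum()
peak_energy = max(spec, key=spec.get)
```

Even this caricature shows the characteristic beam-hardening behavior: filtration suppresses the low-energy tail and pushes the spectral peak to roughly two thirds of the tube voltage.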

  12. Simulating Controlled Radical Polymerizations with mcPolymer—A Monte Carlo Approach

    Directory of Open Access Journals (Sweden)

    Georg Drache

    2012-07-01

    Utilizing model calculations may lead to a better understanding of the complex kinetics of controlled radical polymerization. We developed a universal simulation tool (mcPolymer), which is based on the widely used Monte Carlo simulation technique. This article focuses on the software architecture of the program, including its data management and optimization approaches. We were able to simulate polymer chains as individual objects, allowing us to gain more detailed microstructural information about the polymeric products. For all given examples of controlled radical polymerization (nitroxide-mediated radical polymerization (NMRP) homo- and copolymerization, atom transfer radical polymerization (ATRP), and reversible addition-fragmentation chain transfer (RAFT) polymerization), we present detailed performance analyses demonstrating the influence of the system size, the concentrations of reactants, and the peculiarities of the data. Different possibilities are illustrated by example for finding an adequate balance between precision, memory consumption, and computation time of the simulation. Due to its flexible software architecture, the application of mcPolymer is not limited to controlled radical polymerization but can be adjusted in a straightforward manner to further polymerization models.
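The chains-as-individual-objects idea can be sketched with a Gillespie-type kinetic Monte Carlo in which each living chain carries its own length and reaction channels are chosen in proportion to their rates. Rate constants and species counts below are illustrative, not from mcPolymer, and the reaction scheme is reduced to propagation plus termination by combination:

```python
import random

def simulate(n_monomer=50000, n_initiator=50, kp=1.0, kt=0.02, seed=5):
    """Toy kinetic MC of radical polymerization tracking individual chains.

    Each event is propagation (a random living chain grows by one monomer)
    or termination by combination (two living chains join into a dead one),
    chosen with probability proportional to its rate. Illustrative constants.
    """
    rng = random.Random(seed)
    living, dead = [0] * n_initiator, []              # chain lengths
    m = n_monomer
    while living and m > 0:
        r_prop = kp * len(living) * m
        r_term = kt * len(living) * (len(living) - 1)
        if rng.random() < r_prop / (r_prop + r_term):
            living[rng.randrange(len(living))] += 1   # propagation
            m -= 1
        else:                                         # termination
            a = living.pop(rng.randrange(len(living)))
            b = living.pop(rng.randrange(len(living)))
            dead.append(a + b)
        # (exponential waiting times for physical time are omitted for brevity)
    return living, dead

living, dead = simulate()
lengths = [*living, *dead]
number_avg = sum(lengths) / len(lengths)
```

Because every chain is an object, full length distributions (and, in a fuller model, copolymer sequences) fall out of the simulation for free, which is the microstructural advantage the abstract emphasizes.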

  13. Simplified monte carlo simulation for Beijing spectrometer

    International Nuclear Information System (INIS)

    Wang Taijie; Wang Shuqin; Yan Wuguang; Huang Yinzhi; Huang Deqiang; Lang Pengfei

    1986-01-01

    A Monte Carlo method based on parametrizing the performance of the detectors and transforming the values of kinematical variables into 'measured' ones by means of smearing has been used to program a simulation of the performance of the Beijing Spectrometer (BES) in FORTRAN, named BESMC. It can be used to investigate the multiplicity, the particle types, and the four-momentum distributions of the final states of electron-positron collisions, as well as the response of the BES to these final states. Thus, it provides a means to examine whether the overall design of the BES is reasonable and to decide the physics topics of the BES
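The smearing transformation mentioned here amounts to replacing each true kinematic value by a Gaussian draw centred on it, with the detector resolution as the width. A sketch with an illustrative resolution figure (not an actual BES parameter):

```python
import random, math

def smear_momentum(p_true_gev, rel_resolution=0.02, rng=random):
    """'Measured' momentum: the true value smeared by a Gaussian resolution.

    rel_resolution is an illustrative 2% relative tracking resolution.
    """
    return rng.gauss(p_true_gev, rel_resolution * p_true_gev)

rng = random.Random(6)
measured = [smear_momentum(1.0, rng=rng) for _ in range(20000)]
mean = sum(measured) / len(measured)
width = math.sqrt(sum((x - mean) ** 2 for x in measured) / len(measured))
```

The smeared sample is unbiased on average but carries the detector width, which is exactly the behavior such a fast parametrized simulation trades full particle transport for.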

  14. Modelling of an industrial environment, part 1.: Monte Carlo simulations of photon transport

    International Nuclear Information System (INIS)

    Kis, Z.; Eged, K.; Meckbach, R.; Voigt, G.

    2002-01-01

    After a nuclear accident releasing radioactive material into the environment, external exposures may contribute significantly to the radiation exposure of the population (UNSCEAR 1988, 2000). For urban populations, the external gamma exposure from radionuclides deposited on the surfaces of urban-industrial environments yields the dominant contribution to the total dose to the public (Kelly 1987; Jacob and Meckbach 1990). The radiation field is naturally influenced by the environment around the sources. For calculations of the shielding effect of the structures in complex and realistic urban environments, Monte Carlo methods turned out to be useful tools (Jacob and Meckbach 1987; Meckbach et al. 1988). Using these methods, a complex environment can be set up in which the photon transport can be solved in a reliable way. The accuracy of the methods is in principle limited only by the knowledge of the atomic cross sections and the computational time. Several papers using Monte Carlo results for calculating doses from external gamma exposures have been published (Jacob and Meckbach 1987, 1990; Meckbach et al. 1988; Rochedo et al. 1996). In these papers the Monte Carlo simulations were run in urban environments and for different photon energies. An industrial environment can be defined as an area where productive and/or commercial activity is carried out; a good example is a factory or a supermarket. An industrial environment can differ considerably from urban ones in the types, structures, and dimensions of its buildings. These variations will affect the radiation field of this environment. Hence there is a need to run new Monte Carlo simulations designed specifically for industrial environments

  15. Many-integrated core (MIC) technology for accelerating Monte Carlo simulation of radiation transport: A study based on the code DPM

    Science.gov (United States)

    Rodriguez, M.; Brualla, L.

    2018-04-01

    Monte Carlo simulation of radiation transport is computationally demanding if reasonably low statistical uncertainties of the estimated quantities are to be obtained. Therefore, it can benefit to a large extent from high-performance computing. This work is aimed at assessing the performance of the first generation of the many-integrated-core architecture (MIC) Xeon Phi coprocessor with respect to that of a CPU consisting of two 12-core Xeon processors in Monte Carlo simulation of coupled electron-photon showers. The comparison was made in two ways: first, through a suite of basic tests, including parallel versions of the random number generators Mersenne Twister and a modified implementation of RANECU, intended to establish a baseline comparison between the two devices; and secondly, through the pDPM code developed in this work. pDPM is a parallel version of the Dose Planning Method (DPM) program for fast Monte Carlo simulation of radiation transport in voxelized geometries. A variety of techniques aimed at obtaining large scalability on the Xeon Phi were implemented in pDPM. Maximum scalabilities of 84.2× and 107.5× were obtained on the Xeon Phi for simulations of electron and photon beams, respectively. Nevertheless, in none of the tests involving radiation transport did the Xeon Phi perform better than the CPU. The disadvantage of the Xeon Phi with respect to the CPU is due to the low performance of its single core: a single core of the Xeon Phi was more than 10 times less efficient than a single core of the CPU for all radiation transport simulations.

  16. Monte Carlo simulation of the resolution volume for the SEQUOIA spectrometer

    Directory of Open Access Journals (Sweden)

    Granroth G.E.

    2015-01-01

    Monte Carlo ray tracing simulations of direct geometry spectrometers have been particularly useful in instrument design and characterization. However, these tools can also be useful for experiment planning and analysis. To this end, the McStas Monte Carlo ray tracing model of SEQUOIA, the fine-resolution Fermi chopper spectrometer at the Spallation Neutron Source (SNS) of Oak Ridge National Laboratory (ORNL), has been modified to include the time-of-flight resolution sample and detector components. With these components, the resolution ellipsoid can be calculated for any detector pixel and energy bin of the instrument. The simulation is split into two pieces. First, the incident beamline up to the sample is simulated for 1 × 10¹¹ neutron packets (4 days on 30 cores). This provides a virtual source for the back end, which includes the resolution sample and monitor components. Next, a series of detector and energy pixels are computed in parallel. It takes on the order of 30 s to calculate a single resolution ellipsoid on a single core. Python scripts have been written to transform the ellipsoid into the space of an oriented single crystal and to characterize the ellipsoid in various ways. Though this tool is under development as a planning tool, we have successfully used it to provide the resolution function for convolution with theoretical models. Specifically, theoretical calculations of the spin waves in YFeO3 were compared to measurements taken on SEQUOIA. Though the overall features of the spectra can be explained while neglecting resolution effects, the variation in intensity of the modes is well described once the resolution is included. As this was a single sharp mode, the simulated half-intensity value of the resolution ellipsoid was used to provide the resolution width. A description of the simulation, its use, and paths forward for this technique will be discussed.
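Characterizing a resolution ellipsoid from Monte Carlo events reduces to forming the covariance matrix of the event coordinates and extracting its principal axes (eigenvectors of a symmetric matrix). A 2-D sketch in a (Q, E) plane, with mock correlated events standing in for actual ray-tracing output; all widths are invented:

```python
import math, random

def principal_widths(events):
    """Principal standard deviations of 2-D (Q, E) events.

    Builds the 2x2 covariance matrix and returns the square roots of its
    eigenvalues (major, minor), computed in closed form.
    """
    n = len(events)
    mq = sum(q for q, _ in events) / n
    me = sum(e for _, e in events) / n
    sqq = sum((q - mq) ** 2 for q, _ in events) / n
    see = sum((e - me) ** 2 for _, e in events) / n
    sqe = sum((q - mq) * (e - me) for q, e in events) / n
    # eigenvalues of [[sqq, sqe], [sqe, see]] via trace/determinant
    tr, det = sqq + see, sqq * see - sqe ** 2
    disc = math.sqrt(tr * tr / 4 - det)
    return math.sqrt(tr / 2 + disc), math.sqrt(tr / 2 - disc)

def mock_events(n, rng):
    """Tilted Gaussian cloud in (Q, E) mimicking correlated resolution events."""
    out = []
    for _ in range(n):
        q = rng.gauss(0.0, 0.05)
        out.append((q, 0.8 * q + rng.gauss(0.0, 0.02)))
    return out

rng = random.Random(7)
major, minor = principal_widths(mock_events(5000, rng))
```

The tilt of the major axis encodes the Q-E correlation of the resolution function, which is what makes full-ellipsoid convolution more faithful than convolving with a single energy width.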

  17. Estimation of the dose deposited by electron beams in radiotherapy in voxelised phantoms using the Monte Carlo simulation platform GATE based on GEANT4 in a grid environment

    International Nuclear Information System (INIS)

    Perrot, Y.

    2011-01-01

    Radiation therapy treatment planning requires accurate determination of the absorbed dose in the patient. Monte Carlo simulation is the most accurate method for solving the transport problem of particles in matter. This thesis is the first study dealing with the validation of the Monte Carlo simulation platform GATE (GEANT4 Application for Tomographic Emission), based on the GEANT4 (Geometry And Tracking) libraries, for the computation of absorbed dose deposited by electron beams. This thesis aims at demonstrating that GATE/GEANT4 calculations are able to meet treatment planning requirements in situations where analytical algorithms are not satisfactory. The goal is to prove that GATE/GEANT4 is useful for treatment planning using electrons and competes with well-validated Monte Carlo codes. This is demonstrated by simulations with GATE/GEANT4 of realistic electron beams and electron sources used for external radiation therapy or targeted radiation therapy. The computed absorbed dose distributions are in agreement with experimental measurements and/or calculations from other Monte Carlo codes. Furthermore, guidelines are proposed to fix the physics parameters of the GATE/GEANT4 simulations in order to ensure the accuracy of absorbed dose calculations according to radiation therapy requirements. (author)

  18. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. The evaluation of the effects of ionizing radiation and the risk of radiation exposure on the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of models for Monte Carlo (MC) simulation are tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created with MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Planning System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)

  19. Analysis of possibility to apply new mathematical methods (R-function theory) in Monte Carlo simulation of complex geometry

    International Nuclear Information System (INIS)

    Altiparmakov, D.

    1988-12-01

    This analysis is part of the report 'Implementation of the geometry module of the 05R code in another Monte Carlo code', chapter 6.0: establishment of future activity related to geometry in the Monte Carlo method. The introduction points out some problems in solving complex three-dimensional models, which induce the need for developing more efficient geometry modules for Monte Carlo calculations. The second part includes the formulation of the problem and the geometry module. Two fundamental questions to be solved are defined: (1) for a given point, it is necessary to determine the material region or boundary to which it belongs, and (2) for a given direction, all intersection points with material regions should be determined. The third part deals with a possible connection to Monte Carlo calculations for computer simulation of geometry objects. R-function theory enables creation of a geometry module based on the same logic (complex regions are constructed by set operations on elementary regions) as existing constructive geometry codes. R-functions can efficiently replace functions of three-valued logic in all significant models. They are even more appropriate for application since three-valued logic is not typical for digital computers, which operate in two-valued logic. This shows that there is a need for work in this field. It is shown that there is a possibility to develop an interactive code for computer modeling of geometry objects in parallel with development of the geometry module. [sr]

  20. Lectures on Monte Carlo methods

    CERN Document Server

    Madras, Neal

    2001-01-01

    Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathematical …

  1. Radiation dose considerations by intra-individual Monte Carlo simulations in dual source spiral coronary computed tomography angiography with electrocardiogram-triggered tube current modulation and adaptive pitch

    Energy Technology Data Exchange (ETDEWEB)

    May, Matthias S.; Kuettner, Axel; Lell, Michael M.; Wuest, Wolfgang; Scharf, Michael; Uder, Michael [University of Erlangen, Department of Radiology, Erlangen (Germany); Deak, Paul; Kalender, Willi A. [University of Erlangen, Department of Medical Physics, Erlangen (Germany); Keller, Andrea K.; Haeberle, Lothar [University of Erlangen, Department of Medical Informatics, Biometry and Epidemiology, Erlangen (Germany); Achenbach, Stephan; Seltmann, Martin [University of Erlangen, Department of Cardiology, Erlangen (Germany)

    2012-03-15

    To evaluate radiation dose levels in patients undergoing spiral coronary computed tomography angiography (CTA) on a dual-source system in clinical routine. Coronary CTA was performed for 56 patients with electrocardiogram-triggered tube current modulation (TCM) and heart-rate (HR) dependent pitch adaptation. Individual Monte Carlo (MC) simulations were performed for dose assessment. Retrospective simulations with constant tube current (CTC) served as reference. Lung tissue was segmented and used for organ and effective dose (ED) calculation. The estimated mean relative ED was 7.1 ± 2.1 mSv/100 mAs for TCM and 12.5 ± 5.3 mSv/100 mAs for CTC (P < 0.001). Relative dose reduction at low HR (≤60 bpm) was highest (49 ± 5%) compared to intermediate (60-70 bpm, 33 ± 12%) and high HR (>70 bpm, 29 ± 12%). However, the lowest ED is achieved at high HR (5.2 ± 1.5 mSv/100 mAs), compared with intermediate (6.7 ± 1.6 mSv/100 mAs) and low (8.3 ± 2.1 mSv/100 mAs) HR when automated pitch adaptation is applied. Radiation dose savings of up to 52% are achievable by TCM at low and regular HR. However, the lowest ED is attained at high HR by pitch adaptation, despite the inferior radiation dose reduction by TCM. Monte Carlo simulations allow for individual radiation dose calculations. (orig.)

  2. Massively parallel Monte Carlo for many-particle simulations on GPUs

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Joshua A.; Jankowski, Eric [Department of Chemical Engineering, University of Michigan, Ann Arbor, MI 48109 (United States); Grubb, Thomas L. [Department of Materials Science and Engineering, University of Michigan, Ann Arbor, MI 48109 (United States); Engel, Michael [Department of Chemical Engineering, University of Michigan, Ann Arbor, MI 48109 (United States); Glotzer, Sharon C., E-mail: sglotzer@umich.edu [Department of Chemical Engineering, University of Michigan, Ann Arbor, MI 48109 (United States); Department of Materials Science and Engineering, University of Michigan, Ann Arbor, MI 48109 (United States)

    2013-12-01

    Current trends in parallel processors call for the design of efficient massively parallel algorithms for scientific computing. Parallel algorithms for Monte Carlo simulations of thermodynamic ensembles of particles have received little attention because of the inherent serial nature of the statistical sampling. In this paper, we present a massively parallel method that obeys detailed balance and implement it for a system of hard disks on the GPU. We reproduce results of serial high-precision Monte Carlo runs to verify the method. This is a good test case because the hard disk equation of state over the range where the liquid transforms into the solid is particularly sensitive to small deviations away from the balance conditions. On a Tesla K20, our GPU implementation executes over one billion trial moves per second, which is 148 times faster than on a single Intel Xeon E5540 CPU core, enables 27 times better performance per dollar, and cuts energy usage by a factor of 13. With this improved performance we are able to calculate the equation of state for systems of up to one million hard disks. These large system sizes are required in order to probe the nature of the melting transition, which has been debated for the last forty years. In this paper we present the details of our computational method, and discuss the thermodynamics of hard disks separately in a companion paper.
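    The key constraint in such a scheme is that simultaneous trial moves must not couple particles handled by different parallel threads. A common way to achieve this for hard disks is a checkerboard domain decomposition, in which only cells of one sub-lattice are active at a time and a particle's move is rejected if it would leave its cell; naming this scheme is an assumption here, since the abstract does not spell out the algorithm. Below is a minimal serial sketch of the idea (all sizes and parameters are illustrative, and this is not the paper's GPU implementation):

```python
import math
import random

SIGMA = 1.0          # disk diameter
CELL = 2.5 * SIGMA   # cell width > diameter, so interactions stay local
NCELLS = 4           # 4x4 cells
L = NCELLS * CELL    # box side

def overlaps(p, q):
    """Hard-disk overlap test (no periodic wrap, for brevity)."""
    return math.hypot(p[0] - q[0], p[1] - q[1]) < SIGMA

def cell_of(p):
    return (int(p[0] // CELL), int(p[1] // CELL))

def sweep(disks, delta=0.3, rng=random):
    """One checkerboard sweep: pick one of the 4 sub-lattices at random;
    disks in active cells are moved, but a move that would leave its
    cell is rejected, which is what preserves detailed balance."""
    parity = (rng.randrange(2), rng.randrange(2))
    for i, p in enumerate(disks):
        cx, cy = cell_of(p)
        if (cx % 2, cy % 2) != parity:
            continue
        new = (p[0] + rng.uniform(-delta, delta),
               p[1] + rng.uniform(-delta, delta))
        if cell_of(new) != (cx, cy):       # move leaves the cell: reject
            continue
        if any(j != i and overlaps(new, q) for j, q in enumerate(disks)):
            continue                        # hard-core overlap: reject
        disks[i] = new

# place one disk at the center of each cell and run a few sweeps
rng = random.Random(42)
disks = [(CELL * (i + 0.5), CELL * (j + 0.5))
         for i in range(NCELLS) for j in range(NCELLS)]
for _ in range(100):
    sweep(disks, rng=rng)
assert all(not overlaps(p, q) for a, p in enumerate(disks)
           for b, q in enumerate(disks) if a < b)
```

Because active same-parity cells are separated by at least one inactive cell (wider than a disk plus the maximum displacement), moves in different active cells cannot interact, so they could be executed concurrently without violating detailed balance.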

  3. Monte Carlo simulation of zinc protoporphyrin fluorescence in the retina

    Science.gov (United States)

    Chen, Xiaoyan; Lane, Stephen

    2010-02-01

    We have used Monte Carlo simulation of autofluorescence in the retina to determine that noninvasive detection of nutritional iron deficiency is possible. Nutritional iron deficiency (which leads to iron deficiency anemia) affects more than 2 billion people worldwide, and there is an urgent need for a simple, noninvasive diagnostic test. Zinc protoporphyrin (ZPP) is a fluorescent compound that accumulates in red blood cells and is used as a biomarker for nutritional iron deficiency. We developed a computational model of the eye, using parameters that were identified either by literature search or by direct experimental measurement, to test the possibility of detecting ZPP noninvasively in the retina. By incorporating fluorescence into Steven Jacques' original code for multi-layered tissue, we performed Monte Carlo simulation of fluorescence in the retina and determined that if the beam is not focused on a blood vessel in the neural retina layer, or if only part of the light hits the vessel, ZPP fluorescence will be 10-200 times higher than the background lipofuscin fluorescence coming from the retinal pigment epithelium (RPE) layer directly below. In addition, we found that if the light can be focused entirely onto a blood vessel in the neural retina layer, the fluorescence signal comes only from ZPP; in this case the fluorescence from the layers below does not contribute to the signal. Therefore, the possibility that a device could be built to detect ZPP fluorescence in the retina looks very promising.

  4. Deficiency in Monte Carlo simulations of coupled neutron-gamma-ray fields

    NARCIS (Netherlands)

    Maleka, Peane P.; Maucec, Marko; de Meijer, Robert J.

    2011-01-01

    The deficiency in Monte Carlo simulations of coupled neutron-gamma-ray fields was investigated by benchmarking two simulation codes against experimental data. Simulations showed better correspondence with the experimental data for gamma-ray transport only. In the simulations, the neutron interactions with …

  5. The impact of Monte Carlo simulation: a scientometric analysis of scholarly literature

    CERN Document Server

    Pia, Maria Grazia; Bell, Zane W; Dressendorfer, Paul V

    2010-01-01

    A scientometric analysis of Monte Carlo simulation and Monte Carlo codes has been performed over a set of representative scholarly journals related to radiation physics. The results of this study are reported and discussed. They document and quantitatively appraise the role of Monte Carlo methods and codes in scientific research and engineering applications.

  6. Medical images of patients in voxel structures in high resolution for Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Boia, Leonardo S.; Menezes, Artur F.; Silva, Ademir X., E-mail: lboia@con.ufrj.b, E-mail: ademir@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear; Salmon Junior, Helio A. [Clinicas Oncologicas Integradas (COI), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    This work aims to present a computational process for converting CT and MRI medical images of patients into voxel structures for an input file, which is then used in a Monte Carlo simulation code for radiotherapy treatment of tumors. The patient-specific scenario is simulated by this process, using the volume element (voxel) as the unit of computational tracking. The voxel structure geometry of the head has voxels with volumetric dimensions of around 1 mm³ and a population of millions, which makes for a realistic simulation and reduces the digital image processing needed for adjustments and equalizations. With such additional data from the code, a more critical analysis can be developed to determine the volume of the tumor and the protection. The patients' medical images were provided by Clinicas Oncologicas Integradas (COI/RJ), together with the previously performed planning. To execute this computational process, the SAPDI computational system is used for digital image processing and data optimization; the conversion program Scan2MCNP manipulates, processes, and converts the medical images into voxel structures in input files; and the graphic visualizer Moritz is used to verify the placement of the image geometry. (author)

  7. Track 4: basic nuclear science variance reduction for Monte Carlo criticality simulations. 6. Variational Variance Reduction for Monte Carlo Criticality Calculations

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.; Larsen, Edward W.

    2001-01-01

    Recently, it has been shown that the figure of merit (FOM) of Monte Carlo source-detector problems can be enhanced by using a variational rather than a direct functional to estimate the detector response. The direct functional, which is traditionally employed in Monte Carlo simulations, requires an estimate of the solution of the forward problem within the detector region. The variational functional is theoretically more accurate than the direct functional, but it requires estimates of the solutions of the forward and adjoint source-detector problems over the entire phase-space of the problem. In recent work, we have performed Monte Carlo simulations using the variational functional by (a) approximating the adjoint solution deterministically and representing this solution as a function in phase-space and (b) estimating the forward solution using Monte Carlo. We have called this general procedure variational variance reduction (VVR). The VVR method is more computationally expensive per history than traditional Monte Carlo because extra information must be tallied and processed. However, the variational functional yields a more accurate estimate of the detector response. Our simulations have shown that the VVR reduction in variance usually outweighs the increase in cost, resulting in an increased FOM. In recent work on source-detector problems, we have calculated the adjoint solution deterministically and represented this solution as a linear-in-angle, histogram-in-space function. This procedure has several advantages over previous implementations: (a) it requires much less adjoint information to be stored and (b) it is highly efficient for diffusive problems, due to the accurate linear-in-angle representation of the adjoint solution. (Traditional variance-reduction methods perform poorly for diffusive problems.) 
Here, we extend this VVR method to Monte Carlo criticality calculations, which are often diffusive and difficult for traditional variance-reduction methods.

  8. Testing Lorentz Invariance Emergence in the Ising Model using Monte Carlo simulations

    CERN Document Server

    Dias Astros, Maria Isabel

    2017-01-01

    In the context of Lorentz invariance as an emergent phenomenon at low energy scales in the study of quantum gravity, a system composed of two interacting 3D Ising models (one with an anisotropy in one direction) was proposed. Two Monte Carlo simulations were run: one for the 2D Ising model and one for the target model. In both cases the observables (energy, magnetization, heat capacity and magnetic susceptibility) were computed for different lattice sizes, and a Binder cumulant was introduced in order to estimate the critical temperature of the systems. Moreover, the correlation function was calculated for the 2D Ising model.
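    As a concrete sketch of the method described, the following Metropolis simulation of the 2D Ising model computes the Binder cumulant U = 1 - <m^4>/(3<m^2>^2), which approaches 2/3 in the ordered phase and 0 in the disordered phase; curves of U for different lattice sizes cross near the critical temperature. The lattice size, temperatures, and sweep counts are illustrative choices, not those of the cited study:

```python
import math
import random

def ising_metropolis(L=8, T=2.0, sweeps=200, rng=None):
    """Metropolis sampling of the 2D Ising model (J = 1, periodic
    boundaries). Returns per-spin magnetization samples after burn-in."""
    rng = rng or random.Random(0)
    s = [[1] * L for _ in range(L)]
    mags = []
    for sw in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            nb = s[(i+1) % L][j] + s[(i-1) % L][j] + s[i][(j+1) % L] + s[i][(j-1) % L]
            dE = 2 * s[i][j] * nb            # energy cost of flipping spin (i, j)
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                s[i][j] = -s[i][j]
        if sw >= sweeps // 2:                 # discard first half as burn-in
            mags.append(sum(map(sum, s)) / (L * L))
    return mags

def binder(mags):
    """Binder cumulant U = 1 - <m^4> / (3 <m^2>^2)."""
    m2 = sum(m * m for m in mags) / len(mags)
    m4 = sum(m ** 4 for m in mags) / len(mags)
    return 1 - m4 / (3 * m2 * m2)

low = binder(ising_metropolis(T=1.5))   # ordered phase: U near 2/3
high = binder(ising_metropolis(T=5.0))  # disordered phase: U near 0
assert low > high
```

The critical temperature of the infinite 2D Ising model is Tc = 2/ln(1 + sqrt(2)) ≈ 2.269, so the two temperatures above bracket the transition.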

  9. MOCARS: a Monte Carlo code for determining the distribution and simulation limits

    International Nuclear Information System (INIS)

    Matthews, S.D.

    1977-07-01

    MOCARS is a computer program designed for the INEL CDC 76-173 operating system to determine the distribution and simulation limits for a function by Monte Carlo techniques. The code randomly samples data from any of 12 user-specified distributions and then either evaluates the cut-set system unavailability or a user-specified function with the sample data. After the data are ordered, the values at various quantiles and the associated confidence bounds are calculated for output. Also available for output on microfilm are the frequency and cumulative distribution histograms from the sample data. 29 figures, 4 tables
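    The central computation described — ordering Monte Carlo output and reading off quantile values with confidence bounds — can be sketched with order statistics. The helper names below and the exponential distribution standing in for the sampled function are assumptions for illustration, not MOCARS code:

```python
import math
import random

def mc_quantile(samples, q):
    """Point estimate of the q-th quantile from ordered Monte Carlo output."""
    xs = sorted(samples)
    k = min(len(xs) - 1, max(0, math.ceil(q * len(xs)) - 1))
    return xs[k]

def quantile_upper_bound(samples, q, conf=0.95):
    """Distribution-free upper confidence bound on the q-th quantile:
    with 0-based sorted data, P(xs[k] >= true quantile) equals the
    binomial CDF at k, so return the first order statistic where that
    CDF reaches the requested confidence level."""
    xs = sorted(samples)
    n = len(xs)
    cum = 0.0
    for k in range(n):
        cum += math.comb(n, k) * q**k * (1 - q)**(n - k)
        if cum >= conf:
            return xs[k]
    return xs[-1]

rng = random.Random(1)
data = [rng.expovariate(1.0) for _ in range(200)]  # stand-in for sampled output
est = mc_quantile(data, 0.95)
ub = quantile_upper_bound(data, 0.95)
assert est <= ub
```

For an Exp(1) distribution the true 95th percentile is ln(20) ≈ 3.0; the upper bound sits a few order statistics above the point estimate, widening as the sample size shrinks.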

  10. On stochastic error and computational efficiency of the Markov Chain Monte Carlo method

    KAUST Repository

    Li, Jun

    2014-01-01

    In Markov Chain Monte Carlo (MCMC) simulations, thermal equilibria quantities are estimated by ensemble average over a sample set containing a large number of correlated samples. These samples are selected in accordance with the probability distribution function, known from the partition function of equilibrium state. As the stochastic error of the simulation results is significant, it is desirable to understand the variance of the estimation by ensemble average, which depends on the sample size (i.e., the total number of samples in the set) and the sampling interval (i.e., cycle number between two consecutive samples). Although large sample sizes reduce the variance, they increase the computational cost of the simulation. For a given CPU time, the sample size can be reduced greatly by increasing the sampling interval, while having the corresponding increase in variance be negligible if the original sampling interval is very small. In this work, we report a few general rules that relate the variance with the sample size and the sampling interval. These results are observed and confirmed numerically. These variance rules are derived for the MCMC method but are also valid for the correlated samples obtained using other Monte Carlo methods. The main contribution of this work includes the theoretical proof of these numerical observations and the set of assumptions that lead to them. © 2014 Global-Science Press.
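    The trade-off described — a longer sampling interval shrinks the sample set at almost no cost in variance when consecutive samples are strongly correlated — can be demonstrated numerically. The AR(1) chain below is a stand-in for correlated MCMC output; all parameters are illustrative:

```python
import random
import statistics

def ar1_chain(n, rho, rng):
    """Correlated sample stream standing in for MCMC output (AR(1) toy model)."""
    x, out = 0.0, []
    for _ in range(n):
        x = rho * x + rng.gauss(0.0, 1.0)
        out.append(x)
    return out

def var_of_mean(rho, n=2000, thin=1, reps=200, seed=0):
    """Empirical variance of the ensemble-average estimator when keeping
    every `thin`-th sample of the chain, measured over `reps` replicas."""
    rng = random.Random(seed)
    means = []
    for _ in range(reps):
        xs = ar1_chain(n, rho, rng)[::thin]
        means.append(sum(xs) / len(xs))
    return statistics.pvariance(means)

# Strong correlation (rho = 0.95): keeping every 10th sample shrinks the
# stored set 10x, yet the variance of the mean barely grows, because
# consecutive samples carried almost no independent information.
v_full = var_of_mean(0.95, thin=1)
v_thin = var_of_mean(0.95, thin=10)
assert v_thin < 2 * v_full
```

For rho = 0.95 the integrated autocorrelation factor is (1 + rho)/(1 - rho) ≈ 39, so the 2000-sample chain holds only about 50 effectively independent samples; dropping 9 of every 10 samples therefore costs almost nothing in estimator variance.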

  11. Advanced Mesh-Enabled Monte carlo capability for Multi-Physics Reactor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Paul; Evans, Thomas; Tautges, Tim

    2012-12-24

    This project will accumulate high-precision fluxes throughout reactor geometry on a non-orthogonal grid of cells to support multi-physics coupling, in order to more accurately calculate parameters such as reactivity coefficients and to generate multi-group cross sections. This work will be based upon recent developments to incorporate advanced geometry and mesh capability in a modular Monte Carlo toolkit with computational science technology that is in use in related reactor simulation software development. Coupling this capability with production-scale Monte Carlo radiation transport codes can provide advanced and extensible test-beds for these developments. Continuous-energy Monte Carlo methods are generally considered to be the most accurate computational tool for simulating radiation transport in complex geometries, particularly neutron transport in reactors. Nevertheless, there are several limitations on their use in reactor analysis. Most significantly, there is a trade-off between the fidelity of results in phase space, statistical accuracy, and the amount of computer time required for simulation. Consequently, to achieve an acceptable level of statistical convergence in the high-fidelity results required for modern coupled multi-physics analysis, the required computer time makes Monte Carlo methods prohibitive for design iterations and detailed whole-core analysis. More subtly, the statistical uncertainty is typically not uniform throughout the domain, and the simulation quality is limited by the regions with the largest statistical uncertainty. In addition, the formulation of neutron scattering laws in continuous-energy Monte Carlo methods makes it difficult to calculate the adjoint neutron fluxes required to properly determine important reactivity parameters. Finally, most Monte Carlo codes available for reactor analysis have relied on orthogonal hexahedral grids for tallies that do not conform to the geometric boundaries and are thus generally not well …

  12. Dosimetric control of radiotherapy treatments by Monte Carlo simulation of transmitted portal dose image

    International Nuclear Information System (INIS)

    Badel, Jean-Noel

    2009-01-01

    This research thesis addresses the dosimetric control of radiotherapy treatments by using amorphous silicon digital portal imagery. In a first part, the author reports the analysis of the dosimetric abilities of the imager (iViewGT) which is used in the radiotherapy department. The stability of the imager response on a short and on a long term has been studied. A relationship between the image grey level and the dose has been established for a reference irradiation field. The influence of irradiation parameters on the grey level variation with respect to the dose has been assessed. The obtained results show the possibility to use this system for dosimetry provided that a precise calibration is performed while taking the most influencing irradiation parameters into account, i.e. photon beam nominal energy, field size, and patient thickness. The author reports the development of a Monte Carlo simulation to model the imager response. It models the accelerator head by a generalized source point. Space and energy distributions of photons are calculated. This modelling can also be applied to the calculation of dose distribution within a patient, or to study physical interactions in the accelerator head. Then, the author explores a new approach to dose portal image prediction within the frame of an in vivo dosimetric control. He computes the image transmitted through the patient by Monte Carlo simulation, and measures the portal image of the irradiation field without the patient. Validation experiments are reported, and problems to be solved are highlighted (computation time, improvement of the collimator simulation) [fr]

  13. Single-site Lennard-Jones models via polynomial chaos surrogates of Monte Carlo molecular simulation

    KAUST Repository

    Kadoura, Ahmad Salim

    2016-06-01

    In this work, two Polynomial Chaos (PC) surrogates were generated to reproduce Monte Carlo (MC) molecular simulation results of the canonical (single-phase) and the NVT-Gibbs (two-phase) ensembles for a system of normalized structureless Lennard-Jones (LJ) particles. The main advantage of such surrogates, once generated, is the capability of accurately computing the needed thermodynamic quantities in a few seconds, thus efficiently replacing the computationally expensive MC molecular simulations. Benefiting from the tremendous reduction in computational time, the PC surrogates were used to conduct large-scale optimization in order to propose single-site LJ models for several simple molecules. Experimental data, a set of supercritical isotherms, and part of the two-phase envelope, of several pure components were used for tuning the LJ parameters (ε, σ). Based on the conducted optimization, an excellent fit was obtained for different noble gases (Ar, Kr, and Xe) and other small molecules (CH4, N2, and CO). On the other hand, due to the simplicity of the LJ model used, dramatic deviations between simulation and experimental data were observed, especially in the two-phase region, for more complex molecules such as CO2 and C2H6.
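    In one dimension the surrogate idea reduces to fitting a cheap polynomial to a handful of expensive simulation runs and then optimizing over the polynomial instead. The sketch below uses a plain quadratic least-squares basis rather than the orthogonal (Legendre/Hermite) bases of a full polynomial chaos expansion, and an analytic stand-in replaces the expensive MC simulation; all names and values are illustrative:

```python
def expensive_model(eps):
    """Stand-in for a costly MC molecular simulation returning some
    observable as a function of the LJ epsilon parameter (toy quadratic)."""
    return 1.5 * eps * eps - 2.0 * eps + 0.7

def solve3(M, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    A = [row[:] + [bi] for row, bi in zip(M, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(3):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * c for a, c in zip(A[r], A[col])]
    return [A[i][3] / A[i][i] for i in range(3)]

def fit_quadratic(xs, ys):
    """Least-squares fit of c0 + c1*x + c2*x^2 via the normal equations."""
    phi = [[1.0, x, x * x] for x in xs]
    AtA = [[sum(p[i] * p[j] for p in phi) for j in range(3)] for i in range(3)]
    Aty = [sum(p[i] * y for p, y in zip(phi, ys)) for i in range(3)]
    return solve3(AtA, Aty)

xs = [0.2, 0.6, 1.0, 1.4, 1.8]               # a few "training" runs
ys = [expensive_model(x) for x in xs]
c0, c1, c2 = fit_quadratic(xs, ys)
surrogate = lambda x: c0 + c1 * x + c2 * x * x
assert abs(surrogate(0.9) - expensive_model(0.9)) < 1e-9
```

Once fitted, evaluating the surrogate costs a few arithmetic operations, so a parameter sweep or optimizer can query it millions of times while the expensive model is only run at the training points.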

  14. Computer simulation of high energy displacement cascades

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1990-01-01

    A methodology developed for modeling many aspects of high energy displacement cascades with molecular level computer simulations is reviewed. The initial damage state is modeled in the binary collision approximation (using the MARLOWE computer code), and the subsequent disposition of the defects within a cascade is modeled with a Monte Carlo annealing simulation (the ALSOME code). There are few adjustable parameters, and none are set to physically unreasonable values. The basic configurations of the simulated high energy cascades in copper, i.e., the number, size and shape of damage regions, compare well with observations, as do the measured numbers of residual defects and the fractions of freely migrating defects. The success of these simulations is somewhat remarkable, given the relatively simple models of defects and their interactions that are employed. The reason for this success is that the behavior of the defects is very strongly influenced by their initial spatial distributions, which the binary collision approximation adequately models. The MARLOWE/ALSOME system, with input from molecular dynamics and experiments, provides a framework for investigating the influence of high energy cascades on microstructure evolution. (author)

  15. Fast dose planning Monte Carlo simulations in inhomogeneous phantoms submerged in uniform, static magnetic fields

    International Nuclear Information System (INIS)

    Yanez, R.; Dempsey, J. F.

    2007-01-01

    We present studies in support of the development of a magnetic resonance imaging (MRI) guided intensity modulated radiation therapy (IMRT) device for the treatment of cancer patients. Fast and accurate computation of the absorbed ionizing radiation dose delivered in the presence of the MRI magnetic field is required for clinical implementation. The fast Monte Carlo simulation code DPM, optimized for radiotherapy treatment planning, is modified to simulate absorbed doses in uniform, static magnetic fields, and benchmarked against PENELOPE. Simulations of dose deposition in inhomogeneous phantoms in which a low-density material is sandwiched in water show that a lower MRI field strength (0.3 T) is preferable in order to avoid dose build-up near material boundaries. (authors)

  16. MO-G-17A-04: Internal Dosimetric Calculations for Pediatric Nuclear Imaging Applications, Using Monte Carlo Simulations and High-Resolution Pediatric Computational Models

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)

    2014-06-15

    Purpose: Our purpose is to evaluate the administered absorbed dose in pediatric nuclear imaging studies. Monte Carlo simulations with the incorporation of pediatric computational models can serve as a reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5-14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms in GATE simulations. The resolution of the phantoms was set to 2 mm³. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals used in SPECT and PET applications are tested, following the EANM pediatric dosage protocol. The biodistributions of the several isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5-year-old girl from whole-body exposure to 99mTc-SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, for which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept lower than 5%. The S-factors for each target organ are calculated in Gy/(MBq*sec), with the highest dose being absorbed in kidneys and pancreas (9.29×10^10 and 0.15×10^10, respectively). Conclusion: An approach for accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and …

  17. MO-G-17A-04: Internal Dosimetric Calculations for Pediatric Nuclear Imaging Applications, Using Monte Carlo Simulations and High-Resolution Pediatric Computational Models

    International Nuclear Information System (INIS)

    Papadimitroulas, P; Kagadis, GC; Loudos, G

    2014-01-01

    Purpose: Our purpose is to evaluate the administered absorbed dose in pediatric nuclear imaging studies. Monte Carlo simulations with the incorporation of pediatric computational models can serve as a reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5-14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms in GATE simulations. The resolution of the phantoms was set to 2 mm³. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals used in SPECT and PET applications are tested, following the EANM pediatric dosage protocol. The biodistributions of the several isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5-year-old girl from whole-body exposure to 99mTc-SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, for which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept lower than 5%. The S-factors for each target organ are calculated in Gy/(MBq*sec), with the highest dose being absorbed in kidneys and pancreas (9.29×10^10 and 0.15×10^10, respectively). Conclusion: An approach for accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and evaluating the …

  18. Monte Carlo simulation of virtual compton scattering at MAMI

    International Nuclear Information System (INIS)

    D'Hose, N.; Ducret, J.E.; Gousset, TH.; Guichon, P.A.M.; Kerhoas, S.; Lhuillier, D.; Marchand, C.; Marchand, D.; Martino, J.; Mougey, J.; Roche, J.; Vanderhaeghen, M.; Vernin, P.; Bohm, H.; Distler, M.; Edelhoff, R.; Friedrich, J.M.; Geiges, R.; Jennewein, P.; Kahrau, M.; Korn, M.; Kramer, H.; Krygier, K.W.; Kunde, V.; Liesenfeld, A.; Merkel, H.; Merle, K.; Neuhausen, R.; Pospischil, TH.; Rosner, G.; Sauer, P.; Schmieden, H.; Schardt, S.; Tamas, G.; Wagner, A.; Walcher, TH.; Wolf, S.; Hyde-Wright, CH.; Boeglin, W.U.; Van de Wiele, J.

    1996-01-01

    The Monte Carlo simulation developed specially for the VCS experiments taking place at MAMI is fully described. This simulation can generate events according to the Bethe-Heitler + Born cross-section behaviour and takes into account resolution-deteriorating effects. It is used to determine solid angles for the various experimental settings. (authors)

  19. Monte Carlo Molecular Simulation with Isobaric-Isothermal and Gibbs-NPT Ensembles

    KAUST Repository

    Du, Shouhong

    2012-01-01

    This thesis presents Monte Carlo methods for simulations of the phase behavior of Lennard-Jones fluids. The isobaric-isothermal (NPT) ensemble and the Gibbs-NPT ensemble are introduced in detail. The NPT ensemble is employed to determine the phase diagram of a pure component. The reduced simulation results are verified by comparison with the equation of state of Johnson et al., and results with L-J parameters of methane agree closely with experimental measurements. We adopt the blocking method for variance estimation and error analysis of the simulation results. The relationship between variance and number of Monte Carlo cycles, error propagation and random number generator performance are also investigated. We review the Gibbs-NPT ensemble employed for the phase equilibrium of a binary mixture. The phase equilibrium is achieved by performing three types of trial move: particle displacement, volume rearrangement and particle transfer. The simulation models and details are introduced. The simulation results of phase coexistence for methane and ethane are reported and compared with experimental data; good agreement is found for a wide range of pressures. The contribution of this thesis lies in the study of error analysis with respect to the number of Monte Carlo cycles and the number of particles.
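    The blocking method mentioned above for error analysis can be sketched as follows; the data series here is synthetic white noise (not thesis data), so the block estimate should approach the plain standard error of the mean, 1/√N:

```python
import numpy as np

def block_error(samples, n_blocks):
    """Standard error of the mean estimated from block averages of a series."""
    samples = np.asarray(samples, dtype=float)
    usable = len(samples) - len(samples) % n_blocks   # drop the ragged tail
    blocks = samples[:usable].reshape(n_blocks, -1).mean(axis=1)
    return blocks.std(ddof=1) / np.sqrt(n_blocks)

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)          # uncorrelated toy "Monte Carlo" series
print(block_error(x, 100))           # close to 1/sqrt(10000) = 0.01
```

For correlated Monte Carlo data one increases the block size until the error estimate plateaus; that plateau is the decorrelated standard error.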

  1. Computer simulation of grain growth in HAZ

    Science.gov (United States)

    Gao, Jinhua

    Two different models for Monte Carlo simulation of normal grain growth in metals and alloys were developed. Each simulation model was based on a different approach to coupling the Monte Carlo simulation time to real time-temperature. These models demonstrated the applicability of Monte Carlo simulation to grain growth in materials processing. A grain boundary migration (GBM) model coupled the Monte Carlo simulation to a first-principles grain boundary migration model. The simulation results obtained by applying this model to isothermal grain growth in zone-refined tin showed good agreement with experimental results. An experimental-data-based (EDB) model coupled the Monte Carlo simulation with grain growth kinetics obtained from experiment. The results of applying the EDB model to grain growth during continuous heating of a beta titanium alloy correlated well with experimental data. In order to acquire the grain growth kinetics from experiment, a new mathematical method was developed and utilized to analyze the experimental data on isothermal grain growth. Grain growth in the HAZ of a 0.2% Cu-Al alloy was successfully simulated using the EDB model combined with grain growth kinetics obtained from experiment and thermal cycles measured during the welding process. The simulated grain size distribution in the HAZ was in good agreement with experimental results. The pinning effect of second-phase particles on grain growth was also simulated in this work. The simulation results confirmed that, by introducing the variable R, the degree of contact between grain boundaries and second-phase particles, the Zener pinning model can be modified as $D/r = K/(Rf)$, where D is the pinned grain size, r the mean size of the second-phase particles, K a constant, and f the area fraction (or the volume fraction in 3-D) of the second phase.
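    The modified Zener relation D/r = K/(Rf) can be exercised numerically. All parameter values below are illustrative, not fitted to the alloy data in the abstract:

```python
def pinned_grain_size(r, K, R, f):
    """Pinned grain size D from the modified Zener relation D/r = K/(R*f)."""
    return r * K / (R * f)

# Illustrative values: particle radius r, constant K, contact degree R,
# and second-phase area fraction f.
d1 = pinned_grain_size(r=1.0, K=0.5, R=1.0, f=0.04)
d2 = pinned_grain_size(r=1.0, K=0.5, R=1.0, f=0.02)
print(d1, d2)  # halving the particle fraction doubles the limiting grain size
```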

  2. Monte Carlo simulation of the X-ray response of a germanium microstrip detector with energy and position resolution

    CERN Document Server

    Rossi, G; Fajardo, P; Morse, J

    1999-01-01

    We present Monte Carlo computer simulations of the X-ray response of a microstrip germanium detector over the energy range 30-100 keV. The detector consists of a linear array of lithographically defined 150 μm wide strips on a high-purity monolithic germanium crystal of 6 mm thickness. The simulation code is divided into two parts. We first consider a 10 μm wide X-ray beam striking the detector surface at normal incidence and compute the interaction processes possible for each photon. Photon scattering and absorption inside the detector crystal are simulated using the EGS4 code with the LSCAT extension for low energies. A history of the deposited energies is created, which is read by the second part of the code to compute the energy histogram for each detector strip. Appropriate algorithms are introduced to account for lateral charge spreading during charge-carrier drift to the detector surface, and for Fano and preamplifier electronic noise contributions. Computed spectra for differen...
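    A minimal sketch of the second stage described above, turning a list of (strip, deposited energy) events into per-strip spectra with a single Gaussian broadening standing in for the Fano and electronic noise terms. The function, strip count, binning and noise width are all assumptions for illustration, not the authors' code:

```python
import numpy as np

def strip_spectra(events, n_strips=10, n_bins=50, e_max=100.0, sigma_kev=0.5, seed=1):
    """Per-strip energy histograms with Gaussian noise broadening (keV units)."""
    rng = np.random.default_rng(seed)
    spectra = np.zeros((n_strips, n_bins))
    for strip, energy in events:
        e = energy + rng.normal(0.0, sigma_kev)   # combined noise broadening
        b = int(e / e_max * n_bins)               # 2 keV bins with these defaults
        if 0 <= strip < n_strips and 0 <= b < n_bins:
            spectra[strip, b] += 1
    return spectra

events = [(3, 61.0)] * 1000       # 1000 synthetic photopeak hits on strip 3
spec = strip_spectra(events)
print(int(spec.sum()), int(spec[3].argmax()))  # all counts, peak near bin 30
```

A real treatment would also share charge between neighbouring strips according to the lateral spreading model rather than depositing each event in one strip.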

  3. Computing bubble-points of CO

    NARCIS (Netherlands)

    Ramdin, M.; Balaji, S.P.; Vicent Luna, J.M.; Torres-Knoop, A; Chen, Q.; Dubbeldam, D.; Calero, S; de Loos, T.W.; Vlugt, T.J.H.

    2016-01-01

    Computing bubble-points of multicomponent mixtures using Monte Carlo simulations is a non-trivial task. A new method is used to compute gas compositions from a known temperature, bubble-point pressure, and liquid composition. Monte Carlo simulations are used to calculate the bubble-points of

  4. Combinatorial geometry domain decomposition strategies for Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Li, G.; Zhang, B.; Deng, L.; Mo, Z.; Liu, Z.; Shangguan, D.; Ma, Y.; Li, S.; Hu, Z. [Institute of Applied Physics and Computational Mathematics, Beijing, 100094 (China)

    2013-07-01

    Analysis and modeling of nuclear reactors can lead to memory overload for a single-core processor when it comes to refined modeling. A method to solve this problem is called 'domain decomposition'. In the current work, domain decomposition algorithms for a combinatorial geometry Monte Carlo transport code are developed on JCOGIN (J Combinatorial Geometry Monte Carlo transport INfrastructure). Tree-based decomposition and asynchronous communication of particle information between domains are described in the paper. The combination of domain decomposition and domain replication (particle parallelism) is demonstrated and compared with that of the MERCURY code. A full-core reactor model is simulated to verify the domain decomposition algorithms using the Monte Carlo particle transport code JMCT (J Monte Carlo Transport Code), which is being developed on the JCOGIN infrastructure. In addition, the influence of the domain decomposition algorithms on tally variances is discussed. (authors)

  5. Combinatorial geometry domain decomposition strategies for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Li, G.; Zhang, B.; Deng, L.; Mo, Z.; Liu, Z.; Shangguan, D.; Ma, Y.; Li, S.; Hu, Z.

    2013-01-01

    Analysis and modeling of nuclear reactors can lead to memory overload for a single-core processor when it comes to refined modeling. A method to solve this problem is called 'domain decomposition'. In the current work, domain decomposition algorithms for a combinatorial geometry Monte Carlo transport code are developed on JCOGIN (J Combinatorial Geometry Monte Carlo transport INfrastructure). Tree-based decomposition and asynchronous communication of particle information between domains are described in the paper. The combination of domain decomposition and domain replication (particle parallelism) is demonstrated and compared with that of the MERCURY code. A full-core reactor model is simulated to verify the domain decomposition algorithms using the Monte Carlo particle transport code JMCT (J Monte Carlo Transport Code), which is being developed on the JCOGIN infrastructure. In addition, the influence of the domain decomposition algorithms on tally variances is discussed. (authors)

  6. Monte Carlo simulation of radiation streaming from a radioactive material shipping cask

    International Nuclear Information System (INIS)

    Liu, Y.Y.; Schwarz, R.A.; Tang, J.S.

    1996-01-01

    Simulated detection of gamma radiation streaming from a radioactive material shipping cask has been performed with the Monte Carlo codes MCNP4A and MORSE-SGC/S. Despite inherent difficulties in simulating deep penetration and streaming of radiation, the simulations yielded results that agree within one order of magnitude with the radiation survey data, with reasonable statistics. These simulations have also provided insight into modeling radiation detection, notably on the location and orientation of the radiation detector with respect to photon streaming paths, and on techniques used to reduce variance in the Monte Carlo calculations. 13 refs., 4 figs., 2 tabs

  7. Monte Carlo simulation for the design of industrial gamma-ray transmission tomography

    International Nuclear Information System (INIS)

    Kim, Jongbum; Jung, Sunghee; Moon, Jinho; Kwon, Taekyong; Cho, Gyuseong

    2011-01-01

    The Monte Carlo simulation and experiment were carried out for a large-scale industrial gamma-ray tomographic scanning geometry. The tomographic system has a moving source with 16 stationary detectors, a geometry that is advantageous for the diagnosis of a large-scale industrial plant. The simulation was carried out for the phantom with 32 views, 16 detectors, and different energy bins, and the simulation data were processed for image reconstruction. Image reconstruction was performed with a Diagonally-Scaled Gradient-Ascent algorithm. Experiments were conducted in a 78 cm diameter column filled with polypropylene grains, using sixteen 0.5-inch-thick, 1-inch-long NaI(Tl) cylindrical detectors and a 20 mCi 137Cs radioactive source. The experimental results were similar to the Monte Carlo simulation results, showing that Monte Carlo simulation is useful for predicting the result of the industrial gamma tomographic scan method, and that it can also provide a solution for designing the industrial gamma tomography system and preparing the field experiment. (author)

  8. RNA folding kinetics using Monte Carlo and Gillespie algorithms.

    Science.gov (United States)

    Clote, Peter; Bayegan, Amir H

    2018-04-01

    RNA secondary structure folding kinetics is known to be important for the biological function of certain processes, such as the hok/sok system in E. coli. Although linear algebra provides an exact computational solution of secondary structure folding kinetics with respect to the Turner energy model for tiny ([Formula: see text]20 nt) RNA sequences, the folding kinetics for larger sequences can only be approximated by binning structures into macrostates in a coarse-grained model, or by repeatedly simulating secondary structure folding with either the Monte Carlo algorithm or the Gillespie algorithm. Here we investigate the relation between the Monte Carlo algorithm and the Gillespie algorithm. We prove that asymptotically, the expected time for a K-step trajectory of the Monte Carlo algorithm is equal to [Formula: see text] times that of the Gillespie algorithm, where [Formula: see text] denotes the Boltzmann expected network degree. If the network is regular (i.e. every node has the same degree), then the mean first passage time (MFPT) computed by the Monte Carlo algorithm is equal to the MFPT computed by the Gillespie algorithm multiplied by [Formula: see text]; however, this is not true for non-regular networks. In particular, RNA secondary structure folding kinetics, as computed by the Monte Carlo algorithm, is not equal to the folding kinetics, as computed by the Gillespie algorithm, although the mean first passage times are roughly correlated. Simulation software for RNA secondary structure folding according to the Monte Carlo and Gillespie algorithms is publicly available, as is our software to compute the expected degree of the network of secondary structures of a given RNA sequence; see http://bioinformatics.bc.edu/clote/RNAexpNumNbors .
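    The Gillespie algorithm discussed above can be illustrated on a toy two-state system A ⇌ B rather than an RNA energy landscape; the waiting time in each state is drawn from an exponential with the state's total exit rate, and the time-average occupancy converges to the detailed-balance prediction. The system and rates are invented for the sketch:

```python
import math
import random

def gillespie_two_state(k_ab, k_ba, t_end, seed=42):
    """Fraction of time spent in state A for A <-> B with rates k_ab, k_ba."""
    rng = random.Random(seed)
    t, state, time_in_a = 0.0, "A", 0.0
    while t < t_end:
        rate = k_ab if state == "A" else k_ba            # total exit rate
        dt = -math.log(1.0 - rng.random()) / rate        # exponential waiting time
        if state == "A":
            time_in_a += min(dt, t_end - t)              # clip the final interval
        t += dt
        state = "B" if state == "A" else "A"
    return time_in_a / t_end

# Detailed balance predicts occupancy of A = k_ba / (k_ab + k_ba) = 0.25 here.
print(gillespie_two_state(k_ab=3.0, k_ba=1.0, t_end=10_000.0))
```

The Monte Carlo variant would instead take fixed-length steps with acceptance probabilities, which is why its trajectory time must be rescaled by the expected network degree, as the abstract explains.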

  9. DNA strand breaks induced by electrons simulated with nanodosimetry Monte Carlo simulation code: NASIC

    International Nuclear Information System (INIS)

    Li, Junli; Qiu, Rui; Yan, Congchong; Xie, Wenzhang; Zeng, Zhi; Li, Chunyan; Wu, Zhen; Tung, Chuanjong

    2015-01-01

    The method of Monte Carlo simulation is a powerful tool to investigate the details of radiation biological damage at the molecular level. In this paper, a Monte Carlo code called NASIC (Nanodosimetry Monte Carlo Simulation Code) was developed. It includes a physical module, a pre-chemical module, a chemical module, a geometric module and a DNA damage module. The physical module can simulate physical tracks of low-energy electrons in liquid water event by event. More than one set of inelastic cross sections were calculated by applying the dielectric function method of Emfietzoglou's optical-data treatments, with different optical data sets and dispersion models. In the pre-chemical module, the ionised and excited water molecules undergo dissociation processes. In the chemical module, the produced radiolytic chemical species diffuse and react. In the geometric module, an atomic model of 46 chromatin fibres in a spherical nucleus of a human lymphocyte was established. In the DNA damage module, the direct damage induced by the energy depositions of the electrons and the indirect damage induced by the radiolytic chemical species were calculated. The parameters were adjusted so that the simulation results agree with the experimental results. In this paper, the influence of the inelastic cross sections and of the vibrational excitation reaction on the parameters and on the DNA strand break yields was studied. Further work on NASIC is underway (authors)

  10. Monte Carlo-molecular dynamics simulations for two-dimensional magnets

    International Nuclear Information System (INIS)

    Kawabata, C.; Takeuchi, M.; Bishop, A.R.

    1985-01-01

    A combined Monte Carlo-molecular dynamics simulation technique is used to study the dynamic structure factor on a square lattice for isotropic Heisenberg and planar classical ferromagnetic spin Hamiltonians

  11. Monte Carlo simulations of microchannel plate detectors I: steady-state voltage bias results

    Energy Technology Data Exchange (ETDEWEB)

    Ming Wu, Craig Kruschwitz, Dane Morgan, Jiaming Morgan

    2008-07-01

    X-ray detectors based on straight-channel microchannel plates (MCPs) are a powerful diagnostic tool for two-dimensional, time-resolved imaging and time-resolved x-ray spectroscopy in the fields of laser-driven inertial confinement fusion and fast z-pinch experiments. Understanding the behavior of microchannel plates as used in such detectors is critical to understanding the data obtained. The subject of this paper is a Monte Carlo computer code we have developed to simulate the electron cascade in a microchannel plate under a static applied voltage. Also included in the simulation is elastic reflection of low-energy electrons from the channel wall, which is important at lower voltages. When model results were compared to measured microchannel plate sensitivities, good agreement was found. Spatial resolution simulations of MCP-based detectors are also presented and found to agree with experimental measurements.

  12. Monte-Carlo simulation on the cold neutron guides at CARR

    International Nuclear Information System (INIS)

    Guo Liping; Wang Hongli; Yang Tonghua; Cheng Zhixu; Liu Yi

    2003-01-01

    The designs of the two cold neutron guides to be built at China Advanced Research Reactor (CARR) are simulated with Monte-Carlo simulation software VITESS. Various parameters of the guides, e.g. transmission efficiency, neutron flux, divergence, etc., are obtained. (author)

  13. Variance Reduction Techniques in Monte Carlo Methods

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Ridder, A.A.N.; Rubinstein, R.Y.

    2010-01-01

    Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed, even though computer speed has been increasing dramatically, ever since the
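    The abstract above is truncated, but one classic variance reduction technique it covers, antithetic variates, can be sketched generically: pairing each uniform draw U with 1 − U yields negatively correlated samples and a lower-variance estimator of E[exp(U)]. The integrand and sample sizes are illustrative choices, not taken from the chapter:

```python
import math
import random

def plain_mc(n, seed=7):
    """Crude Monte Carlo estimate of E[exp(U)], U ~ Uniform(0, 1)."""
    rng = random.Random(seed)
    return sum(math.exp(rng.random()) for _ in range(n)) / n

def antithetic_mc(n, seed=7):
    """Same estimate using antithetic pairs (U, 1 - U)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n // 2):
        u = rng.random()
        total += 0.5 * (math.exp(u) + math.exp(1.0 - u))  # negatively correlated pair
    return total / (n // 2)

# True value is e - 1 ~ 1.71828; both estimates use the same number of draws,
# but the antithetic estimator has lower variance.
print(plain_mc(10_000), antithetic_mc(10_000))
```

Antithetic variates help whenever the integrand is monotone in U; for non-monotone integrands the correlation can be weak or even harmful.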

  14. Simulation of dose deposition in stereotactic synchrotron radiation therapy: a fast approach combining Monte Carlo and deterministic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Smekens, F; Freud, N; Letang, J M; Babot, D [CNDRI (Nondestructive Testing using Ionizing Radiations) Laboratory, INSA-Lyon, 69621 Villeurbanne Cedex (France); Adam, J-F; Elleaume, H; Esteve, F [INSERM U-836, Equipe 6 ' Rayonnement Synchrotron et Recherche Medicale' , Institut des Neurosciences de Grenoble (France); Ferrero, C; Bravin, A [European Synchrotron Radiation Facility, Grenoble (France)], E-mail: francois.smekens@insa-lyon.fr

    2009-08-07

    A hybrid approach, combining deterministic and Monte Carlo (MC) calculations, is proposed to compute the distribution of dose deposited during stereotactic synchrotron radiation therapy treatment. The proposed approach divides the computation into two parts: (i) the dose deposited by primary radiation (coming directly from the incident x-ray beam) is calculated in a deterministic way using ray casting techniques and energy-absorption coefficient tables and (ii) the dose deposited by secondary radiation (Rayleigh and Compton scattering, fluorescence) is computed using a hybrid algorithm combining MC and deterministic calculations. In the MC part, a small number of particle histories are simulated. Every time a scattering or fluorescence event takes place, a splitting mechanism is applied, so that multiple secondary photons are generated with a reduced weight. The secondary events are further processed in a deterministic way, using ray casting techniques. The whole simulation, carried out within the framework of the Monte Carlo code Geant4, is shown to converge towards the same results as the full MC simulation. The speed of convergence is found to depend notably on the splitting multiplicity, which can easily be optimized. To assess the performance of the proposed algorithm, we compare it to state-of-the-art MC simulations, accelerated by the track length estimator technique (TLE), considering a clinically realistic test case. It is found that the hybrid approach is significantly faster than the MC/TLE method: in the test case, the gain in speed was about a factor of 25 at constant precision. Therefore, this method appears to be suitable for treatment planning applications.
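    The splitting mechanism described above can be caricatured in a toy absorption/scatter random walk: replacing a photon of weight w by m copies of weight w/m at each scatter leaves the expected tally unchanged, so the mean scored weight per history stays exactly 1. The geometry-free model below is a stand-in for the paper's physics, with a subcritical splitting rate so every cascade terminates:

```python
import random

def tally_with_splitting(n_histories, p_scatter=0.3, m=2, seed=3):
    """Average scored weight per history with m-fold splitting at each scatter."""
    rng = random.Random(seed)
    tally = 0.0
    for _ in range(n_histories):
        stack = [1.0]                        # photon weights awaiting transport
        while stack:                         # terminates since p_scatter * m < 1
            w = stack.pop()
            if rng.random() < p_scatter:
                stack.extend([w / m] * m)    # split: total weight is conserved
            else:
                tally += w                   # "absorbed": score the weight
    return tally / n_histories

print(tally_with_splitting(20_000))  # weight conservation gives exactly 1.0
```

The variance reduction comes from the extra, cheap low-weight copies sampling the secondary-radiation field more densely than unsplit histories would.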

  15. Instantons in Quantum Annealing: Thermally Assisted Tunneling Vs Quantum Monte Carlo Simulations

    Science.gov (United States)

    Jiang, Zhang; Smelyanskiy, Vadim N.; Boixo, Sergio; Isakov, Sergei V.; Neven, Hartmut; Mazzola, Guglielmo; Troyer, Matthias

    2015-01-01

    A recent numerical result (arXiv:1512.02206) from Google suggested that the D-Wave quantum annealer may have an asymptotic speed-up over simulated annealing; however, the asymptotic advantage disappears when it is compared to quantum Monte Carlo (a classical algorithm despite its name). We show analytically that the asymptotic scaling of quantum tunneling is exactly the same as the escape rate in quantum Monte Carlo for a class of problems. Thus, the Google result might be explained in our framework. We also find that the transition state in quantum Monte Carlo corresponds to the instanton solution in quantum tunneling problems, which is observed in numerical simulations.

  16. MC 93 - Proceedings of the International Conference on Monte Carlo Simulation in High Energy and Nuclear Physics

    Science.gov (United States)

    Dragovitsch, Peter; Linn, Stephan L.; Burbank, Mimi

    1994-01-01

    The Table of Contents for the book is as follows: * Preface * Heavy Fragment Production for Hadronic Cascade Codes * Monte Carlo Simulations of Space Radiation Environments * Merging Parton Showers with Higher Order QCD Monte Carlos * An Order-αs Two-Photon Background Study for the Intermediate Mass Higgs Boson * GEANT Simulation of Hall C Detector at CEBAF * Monte Carlo Simulations in Radioecology: Chernobyl Experience * UNIMOD2: Monte Carlo Code for Simulation of High Energy Physics Experiments; Some Special Features * Geometrical Efficiency Analysis for the Gamma-Neutron and Gamma-Proton Reactions * GISMO: An Object-Oriented Approach to Particle Transport and Detector Modeling * Role of MPP Granularity in Optimizing Monte Carlo Programming * Status and Future Trends of the GEANT System * The Binary Sectioning Geometry for Monte Carlo Detector Simulation * A Combined HETC-FLUKA Intranuclear Cascade Event Generator * The HARP Nucleon Polarimeter * Simulation and Data Analysis Software for CLAS * TRAP -- An Optical Ray Tracing Program * Solutions of Inverse and Optimization Problems in High Energy and Nuclear Physics Using Inverse Monte Carlo * FLUKA: Hadronic Benchmarks and Applications * Electron-Photon Transport: Always so Good as We Think? Experience with FLUKA * Simulation of Nuclear Effects in High Energy Hadron-Nucleus Collisions * Monte Carlo Simulations of Medium Energy Detectors at COSY Jülich * Complex-Valued Monte Carlo Method and Path Integrals in the Quantum Theory of Localization in Disordered Systems of Scatterers * Radiation Levels at the SSCL Experimental Halls as Obtained Using the CLOR89 Code System * Overview of Matrix Element Methods in Event Generation * Fast Electromagnetic Showers * GEANT Simulation of the RMC Detector at TRIUMF and Neutrino Beams for KAON * Event Display for the CLAS Detector * Monte Carlo Simulation of High Energy Electrons in Toroidal Geometry * GEANT 3.14 vs. EGS4: A Comparison Using the DØ Uranium/Liquid Argon

  17. Extraction of diffuse correlation spectroscopy flow index by integration of Nth-order linear model with Monte Carlo simulation

    Science.gov (United States)

    Shang, Yu; Li, Ting; Chen, Lei; Lin, Yu; Toborek, Michal; Yu, Guoqiang

    2014-05-01

    The conventional semi-infinite solution for extracting the blood flow index (BFI) from diffuse correlation spectroscopy (DCS) measurements may cause errors in the estimation of BFI (αDB) in tissues with small volume and large curvature. We proposed an algorithm integrating an Nth-order linear model of the autocorrelation function with Monte Carlo simulation of photon migration in tissue for the extraction of αDB. The volume and geometry of the measured tissue were incorporated in the Monte Carlo simulation, which overcomes the semi-infinite restrictions. The algorithm was tested using computer simulations on four tissue models with varied volumes/geometries and applied to an in vivo mouse stroke model. Computer simulations show that the high-order (N ≥ 5) linear algorithm was more accurate in extracting αDB; the errors in extracting αDB were similar to those reconstructed from the noise-free DCS data. In addition, the errors in extracting the relative changes of αDB using both the linear algorithm and the semi-infinite solution were fairly small (errors < ±2.0%) and did not rely on the tissue volume/geometry. The experimental results from the in vivo stroke mice agreed with those in simulations, demonstrating the robustness of the linear algorithm. DCS with the high-order linear algorithm shows potential for inter-subject comparison and longitudinal monitoring of absolute BFI in a variety of tissues/organs with different volumes/geometries.

  18. Monte Carlo simulation of tomography techniques using the platform Gate

    International Nuclear Information System (INIS)

    Barbouchi, Asma

    2007-01-01

    Simulations play a key role in functional imaging, with applications ranging from scanner design to scatter correction and protocol optimisation. GATE (Geant4 Application for Tomographic Emission) is a platform for Monte Carlo simulation. It is based on Geant4 to generate and track particles and to model geometry and physics processes. Explicit modelling of time includes detector motion, time of flight, and tracer kinetics. Interfaces to voxellised models and image reconstruction packages improve the integration of GATE in the global modelling cycle. In this work, Monte Carlo simulations are used to understand and optimise the gamma camera's performance. We study the effect of the source-to-collimator distance, the diameter of the holes and the thickness of the collimator on the spatial resolution, energy resolution and efficiency of the gamma camera. We also study the reduction of simulation time and implement a model of the left ventricle in GATE. (Author). 7 refs

  19. Interface methods for hybrid Monte Carlo-diffusion radiation-transport simulations

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.

    2006-01-01

    Discrete diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Monte Carlo simulations in diffusive media. An important aspect of DDMC is the treatment of interfaces between diffusive regions, where DDMC is used, and transport regions, where standard Monte Carlo is employed. Three previously developed methods exist for treating transport-diffusion interfaces: the Marshak interface method, based on the Marshak boundary condition, the asymptotic interface method, based on the asymptotic diffusion-limit boundary condition, and the Nth-collided source technique, a scheme that allows Monte Carlo particles to undergo several collisions in a diffusive region before DDMC is used. Numerical calculations have shown that each of these interface methods gives reasonable results as part of larger radiation-transport simulations. In this paper, we use both analytic and numerical examples to compare the ability of these three interface techniques to treat simpler, transport-diffusion interface problems outside of a more complex radiation-transport calculation. We find that the asymptotic interface method is accurate regardless of the angular distribution of Monte Carlo particles incident on the interface surface. In contrast, the Marshak boundary condition only produces correct solutions if the incident particles are isotropic. We also show that the Nth-collided source technique has the capacity to yield accurate results if spatial cells are optically small and Monte Carlo particles are allowed to undergo many collisions within a diffusive region before DDMC is employed. These requirements make the Nth-collided source technique impractical for realistic radiation-transport calculations

  20. Monte Carlo simulation for the transport beamline

    Energy Technology Data Exchange (ETDEWEB)

    Romano, F.; Cuttone, G.; Jia, S. B.; Varisano, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania (Italy); Attili, A.; Marchetto, F.; Russo, G. [INFN, Sezione di Torino, Via P.Giuria, 1 10125 Torino (Italy); Cirrone, G. A. P.; Schillaci, F.; Scuderi, V. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Institute of Physics Czech Academy of Science, ELI-Beamlines project, Na Slovance 2, Prague (Czech Republic); Carpinelli, M. [INFN Sezione di Cagliari, c/o Dipartimento di Fisica, Università di Cagliari, Cagliari (Italy); Tramontana, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Università di Catania, Dipartimento di Fisica e Astronomia, Via S. Sofia 64, Catania (Italy)

    2013-07-26

    In the framework of the ELIMED project, Monte Carlo (MC) simulations are widely used to study the physical transport of charged particles generated by laser-target interactions and to preliminarily evaluate fluence and dose distributions. An energy selection system and the experimental setup for the TARANIS laser facility in Belfast (UK) have already been simulated with the GEANT4 (GEometry ANd Tracking) MC toolkit. Preliminary results are reported here. Future developments are planned to implement MC-based 3D treatment planning in order to optimize the number of shots and the dose delivery.

  1. Monte Carlo simulation for the transport beamline

    International Nuclear Information System (INIS)

    Romano, F.; Cuttone, G.; Jia, S. B.; Varisano, A.; Attili, A.; Marchetto, F.; Russo, G.; Cirrone, G. A. P.; Schillaci, F.; Scuderi, V.; Carpinelli, M.; Tramontana, A.

    2013-01-01

    In the framework of the ELIMED project, Monte Carlo (MC) simulations are widely used to study the physical transport of charged particles generated by laser-target interactions and to preliminarily evaluate fluence and dose distributions. An energy selection system and the experimental setup for the TARANIS laser facility in Belfast (UK) have already been simulated with the GEANT4 (GEometry ANd Tracking) MC toolkit. Preliminary results are reported here. Future developments are planned to implement MC-based 3D treatment planning in order to optimize the number of shots and the dose delivery

  2. Direct simulation Monte Carlo modeling of relaxation processes in polyatomic gases

    Science.gov (United States)

    Pfeiffer, M.; Nizenkov, P.; Mirza, A.; Fasoulas, S.

    2016-02-01

    Relaxation processes of polyatomic molecules are modeled and implemented in an in-house Direct Simulation Monte Carlo code in order to enable the simulation of atmospheric entry maneuvers at Mars and Saturn's Titan. The description of rotational and vibrational relaxation processes is derived from basic quantum mechanics using a rigid rotator and a simple harmonic oscillator, respectively. Strategies regarding the vibrational relaxation process are investigated; good agreement with the relaxation time according to the Landau-Teller expression is found for both methods, the established prohibiting-double-relaxation method and the newly proposed multi-mode relaxation. Differences and application areas of these two methods are discussed. Subsequently, two numerical methods used for sampling energy values from multi-dimensional distribution functions are compared. The proposed random-walk Metropolis algorithm enables the efficient treatment of multiple vibrational modes within a time step with reasonable computational effort. The implemented model is verified and validated by means of simple reservoir simulations and comparison to experimental measurements of a hypersonic carbon-dioxide flow around a flat-faced cylinder.
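    A random-walk Metropolis sampler of the kind mentioned above can be sketched on a simple one-dimensional Boltzmann-like target p(E) ∝ exp(−E/T), E ≥ 0. The target density, step size and temperature are illustrative stand-ins, not the DSMC multi-mode vibrational model itself:

```python
import math
import random

def metropolis_exponential(n, T=1.0, step=1.0, seed=11):
    """Random-walk Metropolis chain targeting p(E) ~ exp(-E/T) on E >= 0."""
    rng = random.Random(seed)
    e = 1.0                                      # arbitrary starting energy
    samples = []
    for _ in range(n):
        prop = e + rng.uniform(-step, step)      # symmetric random-walk proposal
        # Accept with probability min(1, p(prop)/p(e)); proposals E < 0 are rejected.
        if prop >= 0.0 and rng.random() < math.exp(-(prop - e) / T):
            e = prop
        samples.append(e)                        # rejected moves repeat the state
    return samples

s = metropolis_exponential(50_000)
print(sum(s) / len(s))  # should approach the exponential mean, T = 1.0
```

The appeal in the DSMC setting is the same as here: the sampler needs only density ratios, so it extends to multi-mode distributions with no normalizing constant available.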

  3. Direct simulation Monte Carlo modeling of relaxation processes in polyatomic gases

    International Nuclear Information System (INIS)

    Pfeiffer, M.; Nizenkov, P.; Mirza, A.; Fasoulas, S.

    2016-01-01

    Relaxation processes of polyatomic molecules are modeled and implemented in an in-house Direct Simulation Monte Carlo code in order to enable the simulation of atmospheric entry maneuvers at Mars and Saturn’s Titan. The description of rotational and vibrational relaxation processes is derived from basic quantum mechanics using a rigid rotator and a simple harmonic oscillator, respectively. Strategies regarding the vibrational relaxation process are investigated, and good agreement with the Landau-Teller relaxation time is found for both methods: the established prohibiting double relaxation method and the newly proposed multi-mode relaxation. Differences and application areas of these two methods are discussed. Subsequently, two numerical methods used for sampling energy values from multi-dimensional distribution functions are compared. The proposed random-walk Metropolis algorithm enables the efficient treatment of multiple vibrational modes within a time step with reasonable computational effort. The implemented model is verified and validated by means of simple reservoir simulations and comparison to experimental measurements of a hypersonic carbon-dioxide flow around a flat-faced cylinder

  4. Direct simulation Monte Carlo modeling of relaxation processes in polyatomic gases

    Energy Technology Data Exchange (ETDEWEB)

    Pfeiffer, M., E-mail: mpfeiffer@irs.uni-stuttgart.de; Nizenkov, P., E-mail: nizenkov@irs.uni-stuttgart.de; Mirza, A., E-mail: mirza@irs.uni-stuttgart.de; Fasoulas, S., E-mail: fasoulas@irs.uni-stuttgart.de [Institute of Space Systems, University of Stuttgart, Pfaffenwaldring 29, D-70569 Stuttgart (Germany)

    2016-02-15

    Relaxation processes of polyatomic molecules are modeled and implemented in an in-house Direct Simulation Monte Carlo code in order to enable the simulation of atmospheric entry maneuvers at Mars and Saturn’s Titan. The description of rotational and vibrational relaxation processes is derived from basic quantum mechanics using a rigid rotator and a simple harmonic oscillator, respectively. Strategies regarding the vibrational relaxation process are investigated, and good agreement with the Landau-Teller relaxation time is found for both methods: the established prohibiting double relaxation method and the newly proposed multi-mode relaxation. Differences and application areas of these two methods are discussed. Subsequently, two numerical methods used for sampling energy values from multi-dimensional distribution functions are compared. The proposed random-walk Metropolis algorithm enables the efficient treatment of multiple vibrational modes within a time step with reasonable computational effort. The implemented model is verified and validated by means of simple reservoir simulations and comparison to experimental measurements of a hypersonic carbon-dioxide flow around a flat-faced cylinder.

  5. Monte Carlo simulation of fission yields, kinetic energy, fission neutron spectrum and decay γ-ray spectrum for 232Th(n,f) reaction induced by 3H(d,n) 4He neutron source

    International Nuclear Information System (INIS)

    Zheng Wei; Zeen Yao; Changlin Lan; Yan Yan; Yunjian Shi; Siqi Yan; Jie Wang; Junrun Wang; Jingen Chen; Chinese Academy of Sciences, Shanghai

    2015-01-01

    The Monte Carlo transport code Geant4 has been successfully utilised to study the neutron-induced fission reaction of 232 Th with transport neutrons generated by a 3 H(d,n) 4 He neutron source. The purpose of this work is to examine the applicability of Monte Carlo simulations for the computation of the fission reaction process. To this end, the characteristics of the fission process of 232 Th(n,f), such as the fission yield distribution, kinetic energy distribution, fission neutron spectrum and decay γ-ray spectrum, were simulated and calculated. This is the first time the process of neutron-induced fission has been simulated using the Geant4 code. Typical computational results for the 232 Th(n,f) reaction are presented. The computational results are compared with previous experimental data and evaluated nuclear data to assess the scientific soundness of the relevant physical process models in Geant4. (author)

  6. A Fast Monte Carlo Simulation for the International Linear Collider Detector

    International Nuclear Information System (INIS)

    Furse, D.

    2005-01-01

    The following paper contains details concerning the motivation for, implementation of, and performance of a Java-based fast Monte Carlo simulation for a detector designed to be used in the International Linear Collider. This simulation, presently included in the SLAC ILC group's org.lcsim package, reads in standard model or SUSY events in STDHEP file format, stochastically simulates the blurring in physics measurements caused by intrinsic detector error, and writes out an LCIO format file containing a set of final particles statistically similar to those that would have been found by a full Monte Carlo simulation. In addition to the reconstructed particles themselves, descriptions of the calorimeter hit clusters and tracks that these particles would have produced are also included in the LCIO output. These output files can then be put through various analysis codes in order to characterize the effectiveness of a hypothetical detector at extracting relevant physical information about an event. Such a tool is extremely useful in preliminary detector research and development, as full simulations are extremely cumbersome and taxing on processor resources; a fast, efficient Monte Carlo can facilitate, and even make possible, detector physics studies that would be very impractical with the full simulation, by sacrificing what is in many cases inappropriate attention to detail for valuable gains in the time required for results
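The core of such a fast Monte Carlo is stochastic smearing of true quantities with a detector resolution model. A minimal sketch of the idea follows; the 2% relative resolution and the momentum values are illustrative assumptions, not the ILC detector specification:

```python
import random

def smear_momenta(momenta_gev, resolution=0.02, seed=42):
    """Blur true momenta with a relative Gaussian resolution, mimicking the
    stochastic detector smearing of a fast MC. The 2% resolution is an
    illustrative value, not a real detector parameter."""
    rng = random.Random(seed)
    return [p * rng.gauss(1.0, resolution) for p in momenta_gev]

true_p = [10.0, 25.0, 50.0]      # hypothetical true track momenta in GeV
reco_p = smear_momenta(true_p)   # "reconstructed" momenta after smearing
```

A full fast MC would apply such smearing per subdetector (tracker, calorimeters) with momentum- and angle-dependent resolutions, but the principle is the same.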

  7. Entrance surface dose distribution and organ dose assessment for cone-beam computed tomography using measurements and Monte Carlo simulations with voxel phantoms

    Science.gov (United States)

    Baptista, M.; Di Maria, S.; Vieira, S.; Vaz, P.

    2017-11-01

    Cone-Beam Computed Tomography (CBCT) enables high-resolution volumetric scanning of the bone and soft tissue anatomy under investigation at the treatment accelerator. This technique is extensively used in Image Guided Radiation Therapy (IGRT) for pre-treatment verification of patient position and target volume localization. When employed daily and several times per patient, CBCT imaging may lead to high cumulative imaging doses to the healthy tissues surrounding the exposed organs. This work aims at (1) evaluating the dose distribution during a CBCT scan and (2) calculating the organ doses involved in this image guidance procedure for clinically available scanning protocols. Both Monte Carlo (MC) simulations and measurements were performed. To model and simulate the kV imaging system mounted on a linear accelerator (Edge™, Varian Medical Systems), the state-of-the-art MC radiation transport program MCNPX 2.7.0 was used. In order to validate the simulation results, measurements of the Computed Tomography Dose Index (CTDI) were performed using standard PMMA head and body phantoms, 150 mm long, and a standard pencil ionizing chamber (IC) 100 mm long. Measurements were acquired for head and pelvis scanning protocols usually adopted in the clinical environment, using two acquisition modes (full-fan and half-fan). To calculate the organ doses, the implemented MC model of the CBCT scanner together with a male voxel phantom ("Golem") was used. The good agreement between the MCNPX simulations and the CTDIw measurements (differences up to 17%) presented in this work reveals that the CBCT MC model was successfully validated, taking the several uncertainties into account. The adequacy of the computational model to map dose distributions during a CBCT scan is discussed in order to identify ways to reduce the total CBCT imaging dose. The organ dose assessment highlights the need to evaluate the therapeutic and the CBCT imaging doses, in a more balanced approach, and the
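For reference, the weighted CTDI used in such validations combines the centre and periphery pencil-chamber readings in the standard 1/3-2/3 weighting. A minimal sketch with illustrative readings (not values measured in this work):

```python
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CT dose index from pencil-chamber readings in the standard
    PMMA head/body phantoms: CTDI_w = (1/3)*center + (2/3)*periphery."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

# illustrative readings in mGy, not data from this study
w = ctdi_w(10.0, 13.0)   # 10/3 + 26/3 = 12.0 mGy
```

The periphery reading is usually the average of several positions around the phantom edge; here a single representative value stands in for that average.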

  8. Monte Carlo Simulations of Compressible Ising Models: Do We Understand Them?

    Science.gov (United States)

    Landau, D. P.; Dünweg, B.; Laradji, M.; Tavazza, F.; Adler, J.; Cannavaccioulo, L.; Zhu, X.

    Extensive Monte Carlo simulations have begun to shed light on our understanding of phase transitions and universality classes for compressible Ising models. A comprehensive analysis of a Landau-Ginzburg-Wilson Hamiltonian for systems with elastic degrees of freedom resulted in the prediction that there should be four distinct cases with different behavior, depending upon symmetries and thermodynamic constraints. We shall provide an account of the results of careful Monte Carlo simulations for a simple compressible Ising model that can be suitably modified so as to replicate all four cases.
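As a point of reference for such studies, a single Metropolis sweep of the rigid (non-compressible) 2D Ising model, which the compressible variants extend with elastic degrees of freedom, can be sketched as follows (illustrative code, not the authors' simulation):

```python
import math
import random

def metropolis_sweep(spins, L, beta, rng):
    """One Metropolis sweep of the rigid 2D Ising model (J = 1, periodic
    boundaries): L*L single-spin-flip attempts."""
    for _ in range(L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * spins[i][j] * nn          # energy change of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] = -spins[i][j]

L = 16
rng = random.Random(1)
spins = [[1] * L for _ in range(L)]          # start fully ordered
for _ in range(200):
    metropolis_sweep(spins, L, beta=1.0, rng=rng)   # well below T_c: stays ordered
m = abs(sum(sum(row) for row in spins)) / (L * L)   # magnetization per spin
```

At beta = 1.0 (far below the critical beta_c ≈ 0.44) the ordered phase persists, so the magnetization stays close to 1.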

  9. On Monte Carlo Simulation and Analysis of Electricity Markets

    International Nuclear Information System (INIS)

    Amelin, Mikael

    2004-07-01

    This dissertation is about how Monte Carlo simulation can be used to analyse electricity markets. There is a wide range of applications for simulation; for example, players in the electricity market can use simulation to decide whether or not an investment can be expected to be profitable, and authorities can by means of simulation find out which consequences a certain market design can be expected to have on electricity prices, environmental impact, etc. In the first part of the dissertation, the focus is on which electricity market models are suitable for Monte Carlo simulation. The starting point is a definition of an ideal electricity market. Such an electricity market is partly practical from a mathematical point of view (it is simple to formulate and does not require overly complex calculations) and partly a representation of the best possible resource utilisation. The definition of the ideal electricity market is followed by an analysis of how reality differs from the ideal model, what consequences the differences have on the rules of the electricity market and the strategies of the players, and how non-ideal properties can be included in a mathematical model. In particular, questions about environmental impact, forecast uncertainty and grid costs are studied. The second part of the dissertation treats the Monte Carlo technique itself. To reduce the number of samples necessary to obtain accurate results, variance reduction techniques can be used. Here, six different variance reduction techniques are studied and possible applications are pointed out. The conclusions of these studies are turned into a method for efficient simulation of basic electricity markets. The method is applied to some test systems and the results show that the chosen variance reduction techniques can produce equal or better results using 99% fewer samples compared to when the same system is simulated without any variance reduction technique.
More complex electricity market models
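Antithetic variates are one classical variance reduction technique of the kind studied here. The toy example below is purely illustrative (it is not one of the dissertation's test systems): pairing each uniform draw u with its mirror 1 - u cancels much of the variance whenever the output is monotone in the input.

```python
import random
import statistics

def plain_mc(f, n, rng):
    """Plain Monte Carlo: n independent uniform draws."""
    return [f(rng.random()) for _ in range(n)]

def antithetic_mc(f, n, rng):
    """Antithetic variates: each draw u is paired with 1 - u and the pair is
    averaged, which cancels variance for monotone f."""
    return [(f(u) + f(1.0 - u)) / 2.0
            for u in (rng.random() for _ in range(n // 2))]

f = lambda u: u ** 2                 # toy monotone output; E[f(U)] = 1/3
rng = random.Random(7)
var_plain = statistics.variance(plain_mc(f, 20_000, rng))
var_anti = statistics.variance(antithetic_mc(f, 20_000, rng))   # far smaller
```

For this f the per-sample variance drops by roughly a factor of 16 (from 4/45 to 1/180), which is the kind of effect the dissertation exploits on a much larger scale.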

  10. Dose-image quality study in digital chest radiography using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.; Yoriyaz, H.

    2008-01-01

    One of the main concerns of diagnostic radiology is to guarantee a good image while sparing dose to the patient. In the present study, Monte Carlo simulations with the MCNPX code, coupled with an adult female voxel model (FAX), were performed to investigate how image quality and dose in digital chest radiography vary with tube voltage (80-150 kV) using the air-gap technique and a computed radiography system. Calculated quantities were normalized to a fixed entrance skin exposure (ESE) of 0.0136 R. The results of the present analysis show that, for chest radiography with an imaging plate, image quality is improved and dose reduced at lower tube voltage

  11. Taylor-expansion Monte Carlo simulations of classical fluids in the canonical and grand canonical ensemble

    International Nuclear Information System (INIS)

    Schoen, M.

    1995-01-01

    In this article the Taylor-expansion method is introduced, by which Monte Carlo (MC) simulations in the canonical ensemble can be sped up significantly. Substantial gains in computational speed of 20-40% over conventional implementations of the MC technique are obtained over a wide range of densities in homogeneous bulk phases. The basic philosophy behind the Taylor-expansion method is a division of the neighborhood of each atom (or molecule) into three different spatial zones. Interactions between atoms belonging to each zone are treated at different levels of computational sophistication. For example, only interactions between atoms belonging to the primary zone immediately surrounding an atom are treated explicitly before and after displacement. The change in the configurational energy contribution from secondary-zone interactions is obtained from the first-order term of a Taylor expansion of the configurational energy in terms of the displacement vector d. Interactions with atoms in the tertiary zone adjacent to the secondary zone are neglected throughout. The Taylor-expansion method is not restricted to the canonical ensemble but may be employed to enhance the computational efficiency of MC simulations in other ensembles as well. This is demonstrated for grand canonical ensemble MC simulations of an inhomogeneous fluid, which can be performed essentially on a modern personal computer
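The secondary-zone approximation can be checked in one dimension: for a small trial displacement d along the pair axis, the change in a Lennard-Jones pair energy is well approximated by the first-order Taylor term u'(r)·d. A minimal sketch in reduced units, with an illustrative separation and step size:

```python
def lj(r):
    """Lennard-Jones pair energy in reduced units."""
    return 4.0 * (r ** -12 - r ** -6)

def dlj(r):
    """Derivative du/dr of the Lennard-Jones pair energy."""
    return 4.0 * (-12.0 * r ** -13 + 6.0 * r ** -7)

# a "secondary-zone" neighbour at r = 1.5 sigma; small trial displacement d = 0.01
r, d = 1.5, 0.01
exact = lj(r + d) - lj(r)          # full recomputation (primary-zone treatment)
first_order = dlj(r) * d           # Taylor-expansion estimate (secondary-zone treatment)
```

For displacements this small the first-order estimate agrees with the exact energy change to within a few percent, which is what makes the zone decomposition profitable.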

  12. Monte Carlo simulation of PET and SPECT imaging of {sup 90}Y

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Akihiko, E-mail: takahsr@hs.med.kyushu-u.ac.jp; Sasaki, Masayuki [Department of Health Sciences, Faculty of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan); Himuro, Kazuhiko; Yamashita, Yasuo; Komiya, Isao [Division of Radiology, Department of Medical Technology, Kyushu University Hospital, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan); Baba, Shingo [Department of Clinical Radiology, Kyushu University Hospital, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan)

    2015-04-15

    Purpose: Yttrium-90 ({sup 90}Y) is traditionally thought of as a pure beta emitter and is used in targeted radionuclide therapy, with imaging performed using bremsstrahlung single-photon emission computed tomography (SPECT). However, because {sup 90}Y also emits positrons through internal pair production with a very small branching ratio, positron emission tomography (PET) imaging is also available. Because of the insufficient image quality of {sup 90}Y bremsstrahlung SPECT, PET imaging has been suggested as an alternative. In this paper, the authors present a Monte Carlo-based simulation-reconstruction framework for {sup 90}Y to comprehensively analyze the PET and SPECT imaging techniques and to quantitatively consider the disadvantages associated with them. Methods: The PET and SPECT simulation modules were developed using Monte Carlo simulation of Electrons and Photons (MCEP), developed by Dr. S. Uehara. The PET code (MCEP-PET) generates a sinogram and reconstructs the tomographic image using a time-of-flight ordered subset expectation maximization (TOF-OSEM) algorithm with attenuation compensation. To evaluate MCEP-PET, simulated results of {sup 18}F PET imaging were compared with experimental results, confirming that MCEP-PET can simulate the experimental results very well. The SPECT code (MCEP-SPECT) models the collimator and NaI detector system and generates the projection images and projection data. To save computational time, the authors adopt prerecorded {sup 90}Y bremsstrahlung photon data calculated by MCEP. The projection data are also reconstructed using the OSEM algorithm. The authors simulated PET and SPECT images of a water phantom containing six hot spheres filled with different concentrations of {sup 90}Y without background activity. The amount of activity was 163 MBq, with an acquisition time of 40 min. Results: The simulated {sup 90}Y-PET image accurately reproduced the experimental results. The PET image is visually
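Both reconstruction paths described above rest on the same MLEM multiplicative update (OSEM simply applies it to subsets of projections). A toy two-pixel illustration with noiseless data follows; it shows the update rule only and is not the MCEP implementation:

```python
def mlem_update(x, A, y):
    """One MLEM update: x_j <- (x_j / s_j) * sum_i A_ij * y_i / (A x)_i,
    where s_j = sum_i A_ij is the sensitivity of pixel j."""
    m, n = len(A), len(x)
    proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]  # forward projection
    new_x = []
    for j in range(n):
        s = sum(A[i][j] for i in range(m))
        back = sum(A[i][j] * y[i] / proj[i] for i in range(m))        # backprojected ratio
        new_x.append(x[j] * back / s)
    return new_x

# toy 2-pixel, 2-bin system with noiseless data from the true image [3, 1]
A = [[1.0, 0.5], [0.5, 1.0]]
truth = [3.0, 1.0]
y = [sum(A[i][j] * truth[j] for j in range(2)) for i in range(2)]
x = [1.0, 1.0]                      # uniform initial estimate
for _ in range(500):
    x = mlem_update(x, A, y)        # converges towards the true image
```

With consistent, noiseless data the iterates approach the true image; in practice iteration counts are limited and TOF and attenuation factors are folded into the system matrix A.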

  13. Monte Carlo simulation experiments on box-type radon dosimeter

    International Nuclear Information System (INIS)

    Jamil, Khalid; Kamran, Muhammad; Illahi, Ahsan; Manzoor, Shahid

    2014-01-01

    Epidemiological studies show that inhalation of radon gas ( 222 Rn) may be carcinogenic, especially to mine workers, people living in closed, energy-conserving indoor environments and underground dwellers. It is, therefore, of paramount importance to measure 222 Rn concentrations (Bq/m 3 ) in indoor environments. For this purpose, box-type passive radon dosimeters employing an ion track detector such as CR-39 are widely used. The fraction of the radon alphas emitted in the volume of the box-type dosimeter that result in latent track formation on the CR-39 is the latent track registration efficiency. The latent track registration efficiency is ultimately required to evaluate the radon concentration, which in turn determines the effective dose and the radiological hazards. In this research, Monte Carlo simulation experiments were carried out to study the alpha latent track registration efficiency of a box-type radon dosimeter as a function of the dosimeter’s dimensions and the range of alpha particles in air. Two different self-developed Monte Carlo simulation techniques were employed, namely: (a) the surface ratio (SURA) method and (b) the ray hitting (RAHI) method. The Monte Carlo simulation experiments revealed that there are two types of efficiencies, i.e. intrinsic efficiency (η int ) and alpha hit efficiency (η hit ). The η int depends only upon the dimensions of the dosimeter, while η hit depends upon both the dimensions of the dosimeter and the range of the alpha particles. The total latent track registration efficiency is the product of the intrinsic and hit efficiencies. It has been concluded that if the diagonal length of the box-type dosimeter is kept smaller than the range of the alpha particles, a hit efficiency of 100% is achieved. Nevertheless, the intrinsic efficiency keeps playing its role. The Monte Carlo simulation experimental results have been found helpful in understanding the intricate track registration mechanisms in the box-type dosimeter. This paper explains how radon
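The ray-hitting idea can be sketched as follows: sample emission points uniformly in the box and isotropic directions, and count the alphas whose straight path reaches the detector face within their range in air. This is a simplified illustration inspired by the RAHI method, with hypothetical box dimensions and alpha range (in cm), not the authors' code:

```python
import math
import random

def hit_efficiency(a, b, h, alpha_range, n, seed=3):
    """Monte Carlo estimate of the fraction of alphas, emitted uniformly and
    isotropically in an a x b x h box, whose straight path reaches the bottom
    (detector) face within their range in air."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x, y, z = rng.uniform(0, a), rng.uniform(0, b), rng.uniform(0, h)
        # isotropic direction: uniform cos(theta) and azimuth
        cos_t = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        ux, uy, uz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
        if uz >= 0.0:
            continue                      # moving away from the detector face
        t = -z / uz                       # path length to the z = 0 plane
        if t > alpha_range:
            continue                      # alpha stops in air before reaching it
        if 0.0 <= x + t * ux <= a and 0.0 <= y + t * uy <= b:
            hits += 1
    return hits / n

eff = hit_efficiency(a=5.0, b=5.0, h=5.0, alpha_range=4.0, n=50_000)
```

Since at most half of the isotropic emissions head toward the detector face at all, the estimate is necessarily below 0.5; shrinking the box relative to the range pushes the hit efficiency toward its geometric limit.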

  14. Monte Carlo simulation experiments on box-type radon dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, Khalid, E-mail: kjamil@comsats.edu.pk; Kamran, Muhammad; Illahi, Ahsan; Manzoor, Shahid

    2014-11-11

    Epidemiological studies show that inhalation of radon gas ({sup 222}Rn) may be carcinogenic, especially to mine workers, people living in closed, energy-conserving indoor environments and underground dwellers. It is, therefore, of paramount importance to measure {sup 222}Rn concentrations (Bq/m{sup 3}) in indoor environments. For this purpose, box-type passive radon dosimeters employing an ion track detector such as CR-39 are widely used. The fraction of the radon alphas emitted in the volume of the box-type dosimeter that result in latent track formation on the CR-39 is the latent track registration efficiency. The latent track registration efficiency is ultimately required to evaluate the radon concentration, which in turn determines the effective dose and the radiological hazards. In this research, Monte Carlo simulation experiments were carried out to study the alpha latent track registration efficiency of a box-type radon dosimeter as a function of the dosimeter’s dimensions and the range of alpha particles in air. Two different self-developed Monte Carlo simulation techniques were employed, namely: (a) the surface ratio (SURA) method and (b) the ray hitting (RAHI) method. The Monte Carlo simulation experiments revealed that there are two types of efficiencies, i.e. intrinsic efficiency (η{sub int}) and alpha hit efficiency (η{sub hit}). The η{sub int} depends only upon the dimensions of the dosimeter, while η{sub hit} depends upon both the dimensions of the dosimeter and the range of the alpha particles. The total latent track registration efficiency is the product of the intrinsic and hit efficiencies. It has been concluded that if the diagonal length of the box-type dosimeter is kept smaller than the range of the alpha particles, a hit efficiency of 100% is achieved. Nevertheless, the intrinsic efficiency keeps playing its role. The Monte Carlo simulation experimental results have been found helpful in understanding the intricate track registration mechanisms in the box-type dosimeter. This paper

  15. Speeding up Monte Carlo molecular simulation by a non-conservative early rejection scheme

    KAUST Repository

    Kadoura, Ahmad Salim

    2015-04-23

    Monte Carlo (MC) molecular simulation describes fluid systems with rich information, and it is capable of predicting many fluid properties of engineering interest. In general, it is more accurate and representative than equations of state. On the other hand, it requires much more computational effort and simulation time. For that purpose, several techniques have been developed in order to speed up MC molecular simulations while preserving their precision. In particular, early rejection schemes are capable of reducing computational cost by reaching the rejection decision for undesired MC trials at an earlier stage than the conventional scheme. In a recent work, we introduced a ‘conservative’ early rejection scheme as a method to accelerate MC simulations while producing exactly the same results as the conventional algorithm. In this paper, we introduce a ‘non-conservative’ early rejection scheme, which is much faster than the conservative scheme, yet preserves the precision of the method. The proposed scheme is tested for systems of structureless Lennard-Jones particles in both the canonical and NVT-Gibbs ensembles. Numerical experiments were conducted at several thermodynamic conditions for different numbers of particles. Results show that at certain thermodynamic conditions, the non-conservative method is capable of doubling the speed of MC molecular simulations in both the canonical and NVT-Gibbs ensembles. © 2015 Taylor & Francis
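The essential trick of an early rejection scheme can be sketched as follows: drawing the Metropolis acceptance threshold -ln(u)/β before summing the pairwise energy contributions allows a trial to be abandoned as soon as the partial sum already guarantees rejection. This toy decision routine (which assumes, non-conservatively, that the remaining contributions are non-negative) is an illustration of the idea only, not the paper's algorithm:

```python
import math
import random

def early_reject_decision(pair_dEs, beta, rng):
    """Metropolis accept/reject with early rejection: the acceptance threshold
    is drawn first, and the pairwise energy-change sum is abandoned as soon as
    it exceeds the threshold (assuming the remaining, unevaluated
    contributions are non-negative)."""
    threshold = -math.log(rng.random()) / beta   # accept iff total dU <= threshold
    partial = 0.0
    for i, dE in enumerate(pair_dEs):
        partial += dE
        if partial > threshold:
            return False, i + 1                  # rejected after i + 1 pair evaluations
    return True, len(pair_dEs)

rng = random.Random(5)
accepted, n_evals = early_reject_decision([2.0, 3.0, 1.0, 4.0], beta=1.0, rng=rng)
```

When all contributions really are non-negative the decision is exact; the "non-conservative" label refers to skipping the rigorous bound on the unevaluated remainder that the conservative scheme would maintain.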

  16. Direct Simulation Monte Carlo Application of the Three Dimensional Forced Harmonic Oscillator Model

    Science.gov (United States)

    2017-12-07

    Journal Article, 24 February 2017 - 31 December 2017. Direct Simulation Monte Carlo Application of the... is proposed. The implementation employs precalculated lookup tables for transition probabilities and is suitable for the direct simulation Monte Carlo method. It takes into account the microscopic reversibility between the excitation and deexcitation processes, and it satisfies the detailed balance

  17. Monte Carlo simulation for dual head gamma camera

    International Nuclear Information System (INIS)

    Osman, Yousif Bashir Soliman

    2015-12-01

    The Monte Carlo (MC) simulation technique is widely used in medical physics applications. In nuclear medicine, MC has been used to design new medical imaging devices such as positron emission tomography (PET), gamma cameras and single photon emission computed tomography (SPECT). It can also be used to study the factors affecting image quality and internal dosimetry. GATE is one of the Monte Carlo codes that offer a number of advantages for the simulation of SPECT and PET. Access to the machines used in clinics is limited because of their workload, which makes it hard to routinely evaluate some of the factors affecting machine performance and also hampers scientific research and the training of students; an MC model can be an optimal solution to this problem. The aim of this study was to use the GATE Monte Carlo code to model the Nucline Spirit (Mediso) dual head gamma camera hosted in the Radiation and Isotopes Centre of Khartoum, which is equipped with low energy general purpose (LEGP) collimators. This model was used to evaluate spatial resolution and sensitivity, which are important factors affecting image quality, and to demonstrate the validity of GATE by comparing experimental results with simulation results on spatial resolution. The GATE model of the Nucline Spirit dual head gamma camera was developed by applying the manufacturer specifications, and the simulation was then run. In the evaluation of spatial resolution, the FWHM was calculated from the image profile of a line source of the gamma emitter Tc-99m (140 keV) at distances of 5, 10, 15, 20, 22, 27, 32 and 37 cm from the modelled camera head; for these distances the spatial resolution was found to be 5.76, 7.73, 10.7, 13.8, 14.01, 16.91, 19.75 and 21.9 mm, respectively. These results show a linear degradation of spatial resolution with increasing distance between the object (line source) and the collimator. The FWHM calculated at 10 cm was compared with experimental results. The
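The FWHM values quoted above are read off a line-source image profile. A minimal sketch of the half-maximum interpolation on a synthetic Gaussian profile, with σ chosen so that FWHM = 2.355·σ ≈ 5.76 mm to match the first quoted value (illustrative data, not the study's measurements):

```python
import math

def fwhm(xs, ys):
    """Estimate the FWHM of a peaked profile by linear interpolation of the
    two half-maximum crossings."""
    half = max(ys) / 2.0
    left = right = None
    for k in range(len(ys) - 1):
        # rising edge: first crossing from below
        if ys[k] < half <= ys[k + 1] and left is None:
            left = xs[k] + (half - ys[k]) * (xs[k + 1] - xs[k]) / (ys[k + 1] - ys[k])
        # falling edge: crossing from above
        if ys[k] >= half > ys[k + 1]:
            right = xs[k] + (ys[k] - half) * (xs[k + 1] - xs[k]) / (ys[k] - ys[k + 1])
    return right - left

sigma = 2.45                                   # mm; FWHM = 2.355 * sigma ≈ 5.77 mm
xs = [0.05 * i - 15.0 for i in range(601)]     # profile positions in mm
ys = [math.exp(-x * x / (2 * sigma * sigma)) for x in xs]
w = fwhm(xs, ys)
```

On real data the profile is noisy and a Gaussian fit is usually preferred to direct interpolation, but the definition of FWHM is the same.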

  18. Methods for Monte Carlo simulations of biomacromolecules.

    Science.gov (United States)

    Vitalis, Andreas; Pappu, Rohit V

    2009-01-01

    The state-of-the-art for Monte Carlo (MC) simulations of biomacromolecules is reviewed. Available methodologies for sampling conformational equilibria and associations of biomacromolecules in the canonical ensemble, given a continuum description of the solvent environment, are reviewed. Detailed sections are provided dealing with the choice of degrees of freedom, the efficiencies of MC algorithms and algorithmic peculiarities, as well as the optimization of simple movesets. The issue of introducing correlations into elementary MC moves, and the applicability of such methods to simulations of biomacromolecules, is discussed. A brief discussion of multicanonical methods and an overview of recent simulation work highlighting the potential of MC methods are also provided. It is argued that MC simulations, while underutilized by the biomacromolecular simulation community, hold promise for simulations of complex systems and phenomena that span multiple length scales, especially when used in conjunction with implicit solvation models or other coarse-graining strategies.

  19. Monte Carlo simulation of Touschek effect

    Directory of Open Access Journals (Sweden)

    Aimin Xiao

    2010-07-01

    We present a Monte Carlo method implementation in the code elegant for simulating Touschek scattering effects in a linac beam. The local scattering rate and the distribution of scattered electrons can be obtained from the code either for a Gaussian-distributed beam or for a general beam whose distribution function is given. In addition, scattered electrons can be tracked through the beam line and the local beam-loss rate and beam halo information recorded.

  20. Single-site Lennard-Jones models via polynomial chaos surrogates of Monte Carlo molecular simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kadoura, Ahmad, E-mail: ahmad.kadoura@kaust.edu.sa; Sun, Shuyu, E-mail: shuyu.sun@kaust.edu.sa [Computational Transport Phenomena Laboratory, The Earth Sciences and Engineering Department, The Physical Sciences and Engineering Division, King Abdullah University of Science and Technology, Thuwal 23955-6900 (Saudi Arabia); Siripatana, Adil, E-mail: adil.siripatana@kaust.edu.sa; Hoteit, Ibrahim, E-mail: ibrahim.hoteit@kaust.edu.sa [Earth Fluid Modeling and Predicting Group, The Earth Sciences and Engineering Department, The Physical Sciences and Engineering Division, King Abdullah University of Science and Technology, Thuwal 23955-6900 (Saudi Arabia); Knio, Omar, E-mail: omar.knio@kaust.edu.sa [Uncertainty Quantification Center, The Applied Mathematics and Computational Science Department, The Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology, Thuwal 23955-6900 (Saudi Arabia)

    2016-06-07

    In this work, two Polynomial Chaos (PC) surrogates were generated to reproduce Monte Carlo (MC) molecular simulation results of the canonical (single-phase) and the NVT-Gibbs (two-phase) ensembles for a system of normalized structureless Lennard-Jones (LJ) particles. The main advantage of such surrogates, once generated, is the capability of accurately computing the needed thermodynamic quantities in a few seconds, thus efficiently replacing the computationally expensive MC molecular simulations. Benefiting from the tremendous computational time reduction, the PC surrogates were used to conduct large-scale optimization in order to propose single-site LJ models for several simple molecules. Experimental data, a set of supercritical isotherms, and part of the two-phase envelope, of several pure components were used for tuning the LJ parameters (ε, σ). Based on the conducted optimization, excellent fit was obtained for different noble gases (Ar, Kr, and Xe) and other small molecules (CH{sub 4}, N{sub 2}, and CO). On the other hand, due to the simplicity of the LJ model used, dramatic deviations between simulation and experimental data were observed, especially in the two-phase region, for more complex molecules such as CO{sub 2} and C{sub 2} H{sub 6}.
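The mechanics of a PC surrogate can be shown in one dimension: project a model response onto an orthogonal polynomial basis (Legendre, for a uniformly distributed input) and then evaluate the cheap expansion instead of the model. A minimal sketch with an analytic stand-in "model" in place of an expensive MC simulation:

```python
def legendre(n, x):
    """Legendre polynomials P_0, P_1, P_2: the orthogonal PC basis for a
    uniformly distributed input on [-1, 1]."""
    return (1.0, x, 1.5 * x * x - 0.5)[n]

def pc_coeffs(model, order=2, m=2000):
    """Spectral projection with a midpoint rule:
    c_n = (2n + 1)/2 * integral of model(x) * P_n(x) over [-1, 1]."""
    coeffs = []
    for n in range(order + 1):
        s = sum(model(x) * legendre(n, x)
                for x in (-1.0 + 2.0 * (k + 0.5) / m for k in range(m)))
        coeffs.append((2 * n + 1) / 2.0 * s * 2.0 / m)
    return coeffs

def surrogate(coeffs, x):
    """Evaluate the cheap PC expansion in place of the expensive model."""
    return sum(c * legendre(n, x) for n, c in enumerate(coeffs))

model = lambda x: 1.0 + 0.5 * x + 0.25 * x * x  # stand-in for an expensive MC run
c = pc_coeffs(model)
```

In the paper the "model" evaluations are themselves noisy MC simulations over the (ε, σ) parameter space, but the projection-then-evaluate structure is the same.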

  1. A Monte Carlo simulation study of associated liquid crystals

    Science.gov (United States)

    Berardi, R.; Fehervari, M.; Zannoni, C.

    We have performed a Monte Carlo simulation study of a system of ellipsoidal particles with donor-acceptor sites modelling complementary hydrogen-bonding groups in real molecules. We have considered elongated Gay-Berne particles with terminal interaction sites allowing particles to associate and form dimers. The changes in the phase transitions and in the molecular organization, and the interplay between orientational ordering and dimer formation, are discussed. Particle flip and dimer moves have been used to increase the convergence rate of the Monte Carlo (MC) Markov chain.

  2. Patient-specific scatter correction in clinical cone beam computed tomography imaging made possible by the combination of Monte Carlo simulations and a ray tracing algorithm

    International Nuclear Information System (INIS)

    Thing, Rune S.; Bernchou, Uffe; Brink, Carsten; Mainegra-Hing, Ernesto

    2013-01-01

    Purpose: Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability to predict the patient-specific scatter contamination in clinical CBCT imaging, but lengthy simulations prevent MC-based scatter correction from being fully implemented in a clinical setting. This study investigates the combination of fast MC simulations to predict scatter distributions with a ray tracing algorithm to allow calibration between simulated and clinical CBCT images. Material and methods: An EGSnrc-based user code (egs_cbct) was used to perform MC simulations of an Elekta XVI CBCT imaging system. A 60 keV x-ray source was used, and air kerma was scored at the detector plane. Several variance reduction techniques (VRTs) were used to increase the scatter calculation efficiency. Three patient phantoms based on CT scans were simulated, namely a brain, a thorax and a pelvis scan. A ray tracing algorithm was used to calculate the detector signal due to primary photons. A total of 288 projections were simulated, one for each thread on the computer cluster used for the investigation. Results: Scatter distributions for the brain, thorax and pelvis scans were simulated within 2% statistical uncertainty in two hours per scan. Within the same time, the ray tracing algorithm provided the primary signal for each of the projections. Thus, all the data needed for MC-based scatter correction in clinical CBCT imaging were obtained within two hours per patient, using a full simulation of the clinical CBCT geometry. Conclusions: This study shows that the use of MC-based scatter corrections in CBCT imaging has great potential to improve CBCT image quality. By use of powerful VRTs to predict scatter distributions and a ray tracing algorithm to calculate the primary signal, it is possible to obtain the necessary data for patient-specific MC scatter correction within two hours per patient
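One simple way to combine the two ingredients is to scale the simulated scatter so that simulated primary plus scatter matches the measured projection on average, and then subtract the scaled scatter. The sketch below is a hypothetical calibration scheme for illustration only, not the algorithm published in the study:

```python
def scatter_correct(raw, primary_mc, scatter_mc):
    """Hypothetical sketch of MC-based scatter correction for one projection:
    scale the simulated scatter so that primary + scaled scatter matches the
    measured signal on average, then subtract the scaled scatter."""
    k = sum(r - p for r, p in zip(raw, primary_mc)) / sum(scatter_mc)
    return [r - k * s for r, s in zip(raw, scatter_mc)]

# illustrative detector-pixel values, not data from the study
raw = [10.0, 12.0, 11.0]       # measured projection
primary = [8.0, 9.5, 8.5]      # ray-traced primary signal
scatter = [2.2, 2.6, 2.4]      # MC-simulated scatter distribution
corrected = scatter_correct(raw, primary, scatter)
```

By construction the corrected projection has the same total signal as the ray-traced primary, while retaining the measured pixel-to-pixel detail.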

  3. Enhancement of precision and accuracy by Monte-Carlo simulation of a well-type pressurized ionization chamber used in radionuclide metrology

    International Nuclear Information System (INIS)

    Kryeziu, D.

    2006-09-01

    The aim of this work was to test and validate the Monte-Carlo (MC) ionization chamber simulation method for calculating the activity of radioactive solutions. This is required when experimental calibration figures are unavailable or insufficient, as well as to improve the accuracy of activity measurements for other radionuclides. Well-type or 4π γ ISOCAL IV ionization chambers (IC) are widely used in many national standards laboratories around the world. As secondary standard measuring systems, these radionuclide calibrators serve to maintain measurement consistency and to ensure the quality of standards disseminated to users for a wide range of radionuclides, many of which are of special interest in nuclear medicine as well as in other applications of radionuclide metrology. For the studied radionuclides, the calibration figures (efficiencies) and their respective volume correction factors are determined by using the PENELOPE MC computer code system. The ISOCAL IV IC filled with nitrogen gas at approximately 1 MPa is simulated. The simulated models of the chamber are designed by means of reduced quadric equations and by applying the appropriate mathematical transformations. The simulations are done for various container geometries of the standard solution: i) a sealed Jena glass 5 ml PTB standard ampoule, ii) a 10 ml (P6) vial and iii) a 10 R Schott Type 1+ vial. Simulation of the ISOCAL IV IC is explained. The effect of density variation of the nitrogen filling gas on the sensitivity of the chamber is investigated. The code is also used to examine the effects of using lead and copper shields, as well as to evaluate the sensitivity of the chamber to electrons and positrons. The Monte-Carlo simulation method has been validated by comparing the calculated calibration figures with experimental ones available from the National Physical Laboratory (NPL), England, which are deduced from the absolute activity

  4. Simulation of neutron transport equation using parallel Monte Carlo for deep penetration problems

    International Nuclear Information System (INIS)

    Bekar, K. K.; Tombakoglu, M.; Soekmen, C. N.

    2001-01-01

    The neutron transport equation is simulated using a parallel Monte Carlo method for a deep penetration neutron transport problem. The Monte Carlo simulation is parallelized using three different techniques: direct parallelization, domain decomposition, and domain decomposition with load balancing, implemented with PVM (Parallel Virtual Machine) software on a LAN (Local Area Network). Results of the parallel simulation are given for various model problems, and the performances of the parallelization techniques are compared with each other. Moreover, the effects of variance reduction techniques on parallelization are discussed
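    Of the three techniques, direct parallelization is the simplest: the total history count is split across workers with independent random number streams, and the tallies are combined at the end. The sketch below mimics that scheme sequentially (one loop iteration per "worker") on a toy slab-penetration tally; the absorption model, seeding scheme, and numbers are invented for illustration, and PVM itself is not used.

    ```python
    import random

    def simulate_histories(n, seed, absorb_prob=0.3, n_slabs=10):
        """Toy deep-penetration tally: count neutrons that survive
        absorption checks in each of n_slabs successive slabs."""
        rng = random.Random(seed)
        survived = 0
        for _ in range(n):
            if all(rng.random() > absorb_prob for _ in range(n_slabs)):
                survived += 1
        return survived

    def direct_parallel(total_histories, n_workers):
        """Direct parallelization: each worker gets an equal share of the
        histories and its own seed; tallies are summed at the end."""
        per = total_histories // n_workers
        tallies = [simulate_histories(per, seed=w) for w in range(n_workers)]
        return sum(tallies) / (per * n_workers)
    ```

    In a real PVM/MPI setting each `simulate_histories` call would run on a separate node; because histories are independent, the combined estimate is statistically identical to a single long serial run.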

  5. Treatment planning for a small animal using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Chow, James C. L.; Leung, Michael K. K.

    2007-01-01

    The development of a small animal model for radiotherapy research requires a complete setup of customized imaging equipment, irradiators, and planning software that matches the sizes of the subjects. The purpose of this study is to develop and demonstrate the use of a flexible in-house research environment for treatment planning on small animals. The software package, called DOSCTP, provides a user-friendly platform for DICOM computed tomography-based Monte Carlo dose calculation using the EGSnrcMP-based DOSXYZnrc code. Validation of the treatment planning was performed by comparing the dose distributions for simple photon beam geometries calculated through the Pinnacle3 treatment planning system and measurements. A treatment plan for a mouse based on a CT image set, using a 360-deg photon arc, is demonstrated. It is shown that it is possible to create 3D conformal treatment plans for small animals with consideration of inhomogeneities, using small photon beam field sizes in the diameter range of 0.5-5 cm, with conformal dose covering the target volume while sparing the surrounding critical tissue. It is also found that Monte Carlo simulation is suitable for carrying out treatment planning dose calculation for small animal anatomy, with a voxel size about one order of magnitude smaller than that used for humans

  6. Variation in computer time with geometry prescription in Monte Carlo code KENO-IV

    International Nuclear Information System (INIS)

    Gopalakrishnan, C.R.

    1988-01-01

    In most studies, the Monte Carlo criticality code KENO-IV has been compared with other Monte Carlo codes, but an evaluation of its performance with different box descriptions has not been done so far. In Monte Carlo computations, any fractional saving of computing time is highly desirable. The variation in computation time with box description in KENO is studied for two different fast reactor fuel subassemblies, of FBTR and PFBR. The k-eff of an infinite array of fuel subassemblies is calculated by modelling the subassemblies in two different ways: (i) multi-region, (ii) multi-box. In addition to these two cases, excess reactivity calculations for FBTR are also performed in two ways to study this effect in a complex geometry. It is observed that the k-eff values calculated by the multi-region and multi-box models agree very well. However, the increase in computation time from the multi-box to the multi-region model is considerable, while the difference in computer storage requirements for the two models is negligible. This variation in computing time arises from the way the neutron is tracked in the two cases. (author)

  7. A simple methodology for characterization of germanium coaxial detectors by using Monte Carlo simulation and evolutionary algorithms

    International Nuclear Information System (INIS)

    Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Gil, J.M.; Rodríguez, R.; Martel, P.; Bolivar, J.P.

    2015-01-01

    The determination of the activity concentration of a specific radionuclide in a sample by gamma spectrometry requires knowledge of the full energy peak efficiency (FEPE) for the energy of interest. The difficulties related to experimental calibration make it advisable to have alternative methods for FEPE determination, such as simulation of the transport of photons in the crystal by the Monte Carlo method, which requires an accurate knowledge of the characteristics and geometry of the detector. The characterization process is mainly carried out by Canberra Industries Inc. using proprietary techniques and methodologies developed by that company. It is a costly procedure (due to shipping and to the cost of the process itself), and for some research laboratories an alternative in situ procedure can be very useful. The main goal of this paper is to find an alternative to this costly characterization process by establishing a method for optimizing the parameters characterizing the detector through a computational procedure that could be reproduced at a standard research lab. This method consists of determining the detector geometric parameters by using Monte Carlo simulation in parallel with an optimization process, based on evolutionary algorithms, starting from a set of reference FEPEs determined experimentally or computationally. The proposed method has proven to be effective and simple to implement. It provides a set of characterization parameters which has been successfully validated for different source-detector geometries, and also for a wide range of environmental samples and certified materials. - Highlights: • A computational method for characterizing an HPGe spectrometer has been developed. • Detector characterized using as reference photopeak efficiencies obtained experimentally or by Monte Carlo calibration. • The characterization obtained has been validated for samples with different geometries and composition. • Good agreement
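    The optimization loop described above can be illustrated with a toy truncation-selection evolutionary search. Here a two-parameter analytic curve stands in for the MC-computed FEPE (in the real method each fitness evaluation is a PENELOPE run over the candidate detector geometry); the energies, parameter ranges, and mutation scales are all invented for the sketch.

    ```python
    import math
    import random

    ENERGIES = [60.0, 122.0, 662.0, 1332.0]   # keV, illustrative only

    def surrogate_fepe(params, e):
        """Stand-in for an MC run: a two-parameter efficiency curve."""
        a, b = params
        return a * math.exp(-b * e)

    def fitness(params, reference):
        """Squared mismatch between candidate and reference FEPEs."""
        return sum((surrogate_fepe(params, e) - r) ** 2
                   for e, r in zip(ENERGIES, reference))

    def evolve(reference, generations=60, pop_size=20, seed=1):
        """Truncation selection + Gaussian mutation over the parameters."""
        rng = random.Random(seed)
        pop = [(rng.uniform(0, 1), rng.uniform(0, 0.01))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda p: fitness(p, reference))
            parents = pop[:pop_size // 4]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.choice(parents)
                children.append((a + rng.gauss(0, 0.02),
                                 max(0.0, b + rng.gauss(0, 0.0005))))
            pop = parents + children
        return min(pop, key=lambda p: fitness(p, reference))
    ```

    Because each fitness evaluation in the real method is itself a Monte Carlo simulation, the cost per generation dominates; the paper's contribution is making this loop practical at a standard research lab.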

  8. Multi-level Monte Carlo Methods for Efficient Simulation of Coulomb Collisions

    Science.gov (United States)

    Ricketson, Lee

    2013-10-01

    We discuss the use of multi-level Monte Carlo (MLMC) schemes--originally introduced by Giles for financial applications--for the efficient simulation of Coulomb collisions in the Fokker-Planck limit. The scheme is based on a Langevin treatment of collisions, and reduces the computational cost of achieving an RMS error scaling as ɛ from O(ɛ^-3)--for standard Langevin methods and binary collision algorithms--to the theoretically optimal scaling O(ɛ^-2) for the Milstein discretization, and to O(ɛ^-2 (log ɛ)^2) with the simpler Euler-Maruyama discretization. In practice, this speeds up simulation by factors up to 100. We summarize standard MLMC schemes, describe some tricks for achieving the optimal scaling, present results from a test problem, and discuss the method's range of applicability. This work was performed under the auspices of the U.S. DOE by the University of California, Los Angeles, under grant DE-FG02-05ER25710, and by LLNL under contract DE-AC52-07NA27344.
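    The MLMC idea is to write the expectation at the finest time step as a telescoping sum: a cheap coarse estimate plus corrections from coupled coarse/fine paths that share the same Brownian increments. The sketch below applies it with Euler-Maruyama to a one-dimensional Ornstein-Uhlenbeck surrogate (dX = -X dt + dW), not the actual Coulomb-collision Langevin equations; level counts and sample sizes are arbitrary choices.

    ```python
    import math
    import random

    def mlmc_level(l, n_samples, rng, n0=2, T=1.0):
        """Mean of P_l - P_{l-1} over coupled paths, where P_l is X_T
        computed with n0 * 2**l Euler-Maruyama steps."""
        nf = n0 * 2 ** l
        dtf = T / nf
        acc = 0.0
        for _ in range(n_samples):
            xf = xc = 1.0
            inc = 0.0
            for step in range(nf):
                dw = rng.gauss(0.0, math.sqrt(dtf))
                xf += -xf * dtf + dw              # fine path
                inc += dw
                if l > 0 and step % 2 == 1:       # coarse path reuses the
                    xc += -xc * (2 * dtf) + inc   # same Brownian increments
                    inc = 0.0
            acc += xf - (xc if l > 0 else 0.0)
        return acc / n_samples

    def mlmc_estimate(levels=4, n_per_level=2000, seed=3):
        """Telescoping MLMC estimate of E[X_T] for dX = -X dt + dW, X0 = 1
        (exact mean is exp(-T) ~ 0.368 for T = 1)."""
        rng = random.Random(seed)
        return sum(mlmc_level(l, n_per_level, rng)
                   for l in range(levels + 1))
    ```

    The coupling is what makes the correction variances shrink with level, so most samples can be spent on the cheap coarse levels; the Milstein variant mentioned in the abstract improves how fast those variances decay.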

  9. Determination of true coincidence correction factors using Monte-Carlo simulation techniques

    Directory of Open Access Journals (Sweden)

    Chionis Dionysios A.

    2014-01-01

    The aim of this work is the numerical calculation of the true coincidence correction factors by means of Monte-Carlo simulation techniques. For this purpose, the Monte Carlo computer code PENELOPE was used and the main program PENMAIN was properly modified in order to include the effect of the true coincidence phenomenon. The modified main program, which takes the true coincidence phenomenon into consideration, was used for the full energy peak efficiency determination of an XtRa Ge detector with relative efficiency 104%, and the results obtained for the 1173 keV and 1332 keV photons of 60Co were found consistent with the respective experimental ones. The true coincidence correction factors were calculated as the ratio of the full energy peak efficiencies determined from the original main program PENMAIN and the modified main program PENMAIN. The developed technique was applied for 57Co, 88Y, and 134Cs and for two source-to-detector geometries. The results obtained were compared with true coincidence correction factors calculated with the "TrueCoinc" program, and the relative bias was found to be less than 2%, 4%, and 8% for 57Co, 88Y, and 134Cs, respectively.
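    The correction factor defined above is simply the ratio of the two efficiencies. In a minimal two-photon cascade model, summing removes a count from the peak whenever the cascade partner also deposits energy in the detector, so the bookkeeping reduces to the sketch below; the closed-form loss model is an illustrative assumption, not what the modified PENMAIN computes.

    ```python
    def fepe_with_summing(fepe, total_eff_partner):
        """FEPE of a line whose cascade partner can also hit the detector:
        a fraction total_eff_partner of peak events is summed out."""
        return fepe * (1.0 - total_eff_partner)

    def correction_factor(fepe, total_eff_partner):
        """True-coincidence correction factor: ratio of the efficiency
        without summing to the efficiency with summing."""
        return fepe / fepe_with_summing(fepe, total_eff_partner)
    ```

    Multiplying a measured (summed) peak count rate by this factor recovers the rate that would have been observed without coincidence losses.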

  10. Monte Carlo Simulation of stepping source in afterloading intracavitary brachytherapy for GZP6 unit

    International Nuclear Information System (INIS)

    Toossi, M.T.B.; Abdollahi, M.; Ghorbani, M.

    2010-01-01

    Full text: A stepping source in brachytherapy systems is used to treat a target lesion longer than the effective treatment length of the source. Dose calculation accuracy plays a vital role in the outcome of brachytherapy treatment. In this study, the stepping source (channel 6) of the GZP6 brachytherapy unit was simulated by Monte Carlo simulation and a matrix shift method. The stepping source of GZP6 was simulated with the Monte Carlo MCNPX code. The mesh tally (type 1) was employed for absorbed dose calculation in a cylindrical water phantom. 5 × 10^8 photon histories were scored, and a 0.2% statistical uncertainty was obtained in the Monte Carlo calculations. Dose distributions were obtained by our matrix shift method for esophageal cancer tumor lengths of 8 and 10 cm. Isodose curves produced by simulation and by the TPS were superimposed to estimate the differences. Results: Comparison of Monte Carlo and TPS dose distributions shows that in the longitudinal direction (source movement direction) the two are comparable. In the transverse direction, dose differences of 7 and 5% were observed for esophageal tumor lengths of 8 and 10 cm, respectively. Conclusions: Although the results show that the maximum difference between Monte Carlo and TPS calculations is about 7%, considering that the certified activity is given with ±10% uncertainty, an error of the order of 20% for the Monte Carlo calculation would be reasonable. It can be suggested that the accuracy of the dose distribution produced by the TPS is acceptable for clinical applications. (author)
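    The matrix shift method builds the dose distribution of a stepping source by superimposing copies of a single-dwell dose matrix, each shifted by the step size along the direction of source travel. A one-dimensional sketch (profile values, step sizes, and the optional dwell weights are invented for illustration):

    ```python
    def matrix_shift_dose(single_dwell, n_steps, step, dwell_weights=None):
        """Superimpose shifted copies of a single-dwell dose profile.

        single_dwell  : 1D dose profile of the source at one dwell position
        n_steps       : number of dwell positions along the travel direction
        step          : shift (in profile bins) between successive dwells
        dwell_weights : optional per-dwell time weights (default: uniform)
        """
        n = len(single_dwell)
        total = [0.0] * (n + step * (n_steps - 1))
        for i in range(n_steps):
            w = 1.0 if dwell_weights is None else dwell_weights[i]
            for j, d in enumerate(single_dwell):
                total[i * step + j] += w * d   # shift by i*step, accumulate
        return total
    ```

    In 2D/3D the same superposition is applied to whole dose matrices, which is why a single MC run of one dwell position suffices for any treatment length.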

  11. Pore-scale uncertainty quantification with multilevel Monte Carlo

    KAUST Repository

    Icardi, Matteo

    2014-01-06

    Computational fluid dynamics (CFD) simulations of pore-scale transport processes in porous media have recently gained large popularity. However, the geometrical details of the pore structures can be known only for a very small number of samples, and the detailed flow computations can be carried out only on a limited number of cases. The explicit introduction of randomness in the geometry and in other setup parameters can be crucial for the optimization of pore-scale investigations for random homogenization. Since there are no generic ways to parametrize the randomness in the pore-scale structures, Monte Carlo techniques are the most accessible way to compute statistics. We propose a multilevel Monte Carlo (MLMC) technique to reduce the computational cost of estimating quantities of interest within a prescribed accuracy constraint. Random samples of pore geometries with a hierarchy of geometrical complexities and grid refinements are synthetically generated and used to propagate the uncertainties in the flow simulations and compute statistics of macro-scale effective parameters.
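    Given per-level correction variances V_l and per-sample costs C_l (here measured over the hierarchy of geometric/grid refinements), the MLMC sample counts that minimize total cost at a fixed variance budget eps^2 follow Giles' rule N_l ∝ sqrt(V_l / C_l). A direct transcription of that allocation formula, with made-up inputs in the usage check:

    ```python
    import math

    def mlmc_sample_allocation(variances, costs, eps):
        """Optimal MLMC sample counts: N_l = eps^-2 * sqrt(V_l / C_l) *
        sum_k sqrt(V_k * C_k), so that sum_l V_l / N_l = eps^2."""
        s = sum(math.sqrt(v * c) for v, c in zip(variances, costs))
        return [math.ceil((s / eps ** 2) * math.sqrt(v / c))
                for v, c in zip(variances, costs)]
    ```

    In practice V_l and C_l are estimated on the fly from pilot samples at each level, and the allocation is updated as levels are added.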

  12. Voxel-based Monte Carlo simulation of X-ray imaging and spectroscopy experiments

    International Nuclear Information System (INIS)

    Bottigli, U.; Brunetti, A.; Golosio, B.; Oliva, P.; Stumbo, S.; Vincze, L.; Randaccio, P.; Bleuet, P.; Simionovici, A.; Somogyi, A.

    2004-01-01

    A Monte Carlo code for the simulation of X-ray imaging and spectroscopy experiments in heterogeneous samples is presented. The energy spectrum, polarization and profile of the incident beam can be defined so that X-ray tube systems as well as synchrotron sources can be simulated. The sample is modeled as a 3D regular grid. The chemical composition and density are given at each point of the grid. Photoelectric absorption, fluorescent emission, elastic and inelastic scattering are included in the simulation. The core of the simulation is a fast routine for the calculation of the path lengths of the photon trajectory intersections with the grid voxels. The voxel representation is particularly useful for samples that cannot be well described by a small set of polyhedra. This is the case of most naturally occurring samples. In such cases, voxel-based simulations are much less expensive in terms of computational cost than simulations on a polygonal representation. The efficient scheme used for calculating the path lengths in the voxels and the use of variance reduction techniques make the code suitable for the detailed simulation of complex experiments on generic samples in a relatively short time. Examples of applications to X-ray imaging and spectroscopy experiments are discussed
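    The core routine mentioned above, path lengths of a ray through a regular grid, is commonly implemented Siddon-style: collect the ray parameters at every grid-plane crossing, then assign each inter-crossing segment to the voxel containing its midpoint. A 2D sketch of that scheme (the actual code is 3D and heavily optimized):

    ```python
    import math

    def voxel_path_lengths(p0, p1, nx, ny, h=1.0):
        """Siddon-style path lengths of segment p0->p1 through an nx-by-ny
        grid of square voxels of side h with a corner at the origin.
        Returns {(i, j): length} for every voxel the segment crosses."""
        (x0, y0), (x1, y1) = p0, p1
        length = math.hypot(x1 - x0, y1 - y0)
        alphas = {0.0, 1.0}
        if x1 != x0:                            # crossings with x = i*h planes
            for i in range(nx + 1):
                a = (i * h - x0) / (x1 - x0)
                if 0.0 < a < 1.0:
                    alphas.add(a)
        if y1 != y0:                            # crossings with y = j*h planes
            for j in range(ny + 1):
                a = (j * h - y0) / (y1 - y0)
                if 0.0 < a < 1.0:
                    alphas.add(a)
        lengths = {}
        a_sorted = sorted(alphas)
        for a, b in zip(a_sorted, a_sorted[1:]):
            mx = x0 + 0.5 * (a + b) * (x1 - x0)  # segment midpoint locates
            my = y0 + 0.5 * (a + b) * (y1 - y0)  # the voxel it lies in
            i, j = int(mx // h), int(my // h)
            if 0 <= i < nx and 0 <= j < ny:
                key = (i, j)
                lengths[key] = lengths.get(key, 0.0) + (b - a) * length
        return lengths
    ```

    These per-voxel lengths, multiplied by each voxel's attenuation coefficient, give the optical depth along the trajectory, which is exactly what the transport kernel needs at every photon step.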

  13. Voxel-based Monte Carlo simulation of X-ray imaging and spectroscopy experiments

    Energy Technology Data Exchange (ETDEWEB)

    Bottigli, U. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Sezione INFN di Cagliari (Italy); Brunetti, A. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Golosio, B. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy) and Sezione INFN di Cagliari (Italy)]. E-mail: golosio@uniss.it; Oliva, P. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Stumbo, S. [Istituto di Matematica e Fisica dell' Universita di Sassari, via Vienna 2, 07100, Sassari (Italy); Vincze, L. [Department of Chemistry, University of Antwerp (Belgium); Randaccio, P. [Dipartimento di Fisica dell' Universita di Cagliari and Sezione INFN di Cagliari (Italy); Bleuet, P. [European Synchrotron Radiation Facility, Grenoble (France); Simionovici, A. [European Synchrotron Radiation Facility, Grenoble (France); Somogyi, A. [European Synchrotron Radiation Facility, Grenoble (France)

    2004-10-08

    A Monte Carlo code for the simulation of X-ray imaging and spectroscopy experiments in heterogeneous samples is presented. The energy spectrum, polarization and profile of the incident beam can be defined so that X-ray tube systems as well as synchrotron sources can be simulated. The sample is modeled as a 3D regular grid. The chemical composition and density are given at each point of the grid. Photoelectric absorption, fluorescent emission, elastic and inelastic scattering are included in the simulation. The core of the simulation is a fast routine for the calculation of the path lengths of the photon trajectory intersections with the grid voxels. The voxel representation is particularly useful for samples that cannot be well described by a small set of polyhedra. This is the case of most naturally occurring samples. In such cases, voxel-based simulations are much less expensive in terms of computational cost than simulations on a polygonal representation. The efficient scheme used for calculating the path lengths in the voxels and the use of variance reduction techniques make the code suitable for the detailed simulation of complex experiments on generic samples in a relatively short time. Examples of applications to X-ray imaging and spectroscopy experiments are discussed.

  14. Monte Carlo-based simulation of dynamic jaws tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S. [Department of Molecular Imaging, Radiotherapy and Oncology, Universite Catholique de Louvain, 54 Avenue Hippocrate, 1200 Brussels, Belgium and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States); 21 Century Oncology., 1240 D' onofrio, Madison, Wisconsin 53719 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); Department of Radiotherapy and Oncology, Universite Catholique de Louvain, St-Luc University Hospital, 10 Avenue Hippocrate, 1200 Brussels (Belgium)

    2011-09-15

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For the RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is

  15. Monte Carlo-based simulation of dynamic jaws tomotherapy

    International Nuclear Information System (INIS)

    Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S.

    2011-01-01

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For the RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is greater than 30% of the prescription dose (gamma analysis

  16. Monte Carlo Simulation of Complete X-Ray Spectra for Use in Scanning Electron Microscopy Analysis

    International Nuclear Information System (INIS)

    Roet, David; Van Espen, Piet

    2003-01-01

    Full Text: The interactions of keV electrons and photons with matter can be simulated accurately with the aid of the Monte Carlo (MC) technique. In scanning electron microscopy x-ray analysis (SEM-EDX), such simulations can be used to perform quantitative analysis with a Reverse Monte Carlo method, even if the samples have irregular geometry. Alternatively, the MC technique can generate spectra of standards for use in quantification with partial least squares regression. The feasibility of these alternatives to the more classical ZAF or phi-rho-Z quantification methods has already been proven. In order to be applicable for these purposes, the MC code needs to generate accurately not only the characteristic K and L x-ray lines but also the bremsstrahlung continuum, i.e., the complete x-ray spectrum needs to be simulated. Currently two types of MC simulation codes are available. Programs like Electron Flight Simulator and CASINO simulate characteristic x-rays due to electron interaction in a fast and efficient way but lack provision for the simulation of the continuum. On the other hand, programs like EGS4, MCNP4 and PENELOPE, originally developed for high energy (MeV-GeV) applications, are more complete but difficult to use and still slow, even on today's fastest computers. We therefore started the development of a dedicated MC simulation code for use in quantitative SEM-EDX work. The selection of the most appropriate cross sections for the different interactions will be discussed, and the results obtained will be compared with those obtained with existing MC programs. Examples of the application of MC simulations for quantitative analysis of samples with various compositions will be given

  17. A hybrid transport-diffusion Monte Carlo method for frequency-dependent radiative-transfer simulations

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.; Thompson, Kelly G.; Urbatsch, Todd J.

    2012-01-01

    Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations in optically thick media. In DDMC, particles take discrete steps between spatial cells according to a discretized diffusion equation. Each discrete step replaces many smaller Monte Carlo steps, thus improving the efficiency of the simulation. In this paper, we present an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold, as optical thickness is typically a decreasing function of frequency. Above this threshold we employ standard Monte Carlo, which results in a hybrid transport-diffusion scheme. With a set of frequency-dependent test problems, we confirm the accuracy and increased efficiency of our new DDMC method.
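    In the optically thick regime, each DDMC step moves a particle a whole cell at a time, replacing the many short scattering steps a standard Monte Carlo particle would take between the same two cells. The toy below uses a symmetric cell-to-cell walk with absorbing boundaries, with no frequency dependence and no actual diffusion coefficients, purely to illustrate the discrete-step mechanics; in the hybrid scheme, particles above the frequency threshold would instead take standard transport steps.

    ```python
    import random

    def ddmc_escape_right(n_cells, start, n_particles, seed=7):
        """Fraction of particles leaking out the right boundary of a 1D
        slab of n_cells cells, starting in cell `start` (0-based).
        Each DDMC-style step jumps one whole cell left or right."""
        rng = random.Random(seed)
        out_right = 0
        for _ in range(n_particles):
            cell = start
            while 0 <= cell < n_cells:           # walk until leakage
                cell += 1 if rng.random() < 0.5 else -1
            if cell == n_cells:
                out_right += 1
        return out_right / n_particles
    ```

    For this symmetric walk, the right-leakage probability has the closed form (start + 1) / (n_cells + 1), which makes the sketch easy to verify; a real DDMC step would weight the jump directions by the face diffusion coefficients of the discretized diffusion equation.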

  18. MBR Monte Carlo Simulation in PYTHIA8

    Science.gov (United States)

    Ciesielski, R.

    We present the MBR (Minimum Bias Rockefeller) Monte Carlo simulation of (anti)proton-proton interactions and its implementation in the PYTHIA8 event generator. We discuss the total, elastic, and total-inelastic cross sections, and three contributions from diffraction dissociation processes that contribute to the latter: single diffraction, double diffraction, and central diffraction or double-Pomeron exchange. The event generation follows a renormalized-Regge-theory model, successfully tested using CDF data. Based on the MBR-enhanced PYTHIA8 simulation, we present cross-section predictions for the LHC and beyond, up to collision energies of 50 TeV.

  19. Monte Carlo simulation of a gas-sampled hadron calorimeter

    Energy Technology Data Exchange (ETDEWEB)

    Chang, C Y; Kunori, S; Rapp, P; Talaga, R; Steinberg, P; Tylka, A J; Wang, Z M

    1988-02-15

    A prototype of the OPAL barrel hadron calorimeter, which is a gas-sampled calorimeter using plastic streamer tubes, was exposed to pions at energies between 1 and 7 GeV. The response of the detector was simulated using the CERN GEANT3 Monte Carlo program. By using the observed high energy muon signals to deduce details of the streamer formation, the Monte Carlo program was able to reproduce the observed calorimeter response. The behavior of the hadron calorimeter when placed behind a lead glass electromagnetic calorimeter was also investigated.

  20. Isobaric-isothermal Monte Carlo simulations from first principles: Application to liquid water at ambient conditions

    Energy Technology Data Exchange (ETDEWEB)

    McGrath, M; Siepmann, J I; Kuo, I W; Mundy, C J; VandeVondele, J; Hutter, J; Mohamed, F; Krack, M

    2004-12-02

    A series of first principles Monte Carlo simulations in the isobaric-isothermal ensemble were carried out for liquid water at ambient conditions (T = 298 K and p = 1 atm). The Becke-Lee-Yang-Parr (BLYP) exchange and correlation energy functionals and norm-conserving Goedecker-Teter-Hutter (GTH) pseudopotentials were employed with the CP2K simulation package to examine systems consisting of 64 water molecules. The fluctuations in the system volume encountered in simulations in the isobaric-isothermal ensemble require a reconsideration of the suitability of the typical charge density cutoff and the regular grid generation method previously used for the computation of the electrostatic energy in first principles simulations in the microcanonical or canonical ensembles. In particular, it is noted that a much higher cutoff is needed and that the most computationally efficient method of creating grids can result in poor simulations. Analysis of the simulation trajectories using a very large charge density cutoff of 1200 Ry and four different grid generation methods points to a substantially underestimated liquid density of about 0.85 g/cm³, resulting in a somewhat understructured liquid (with a value of about 2.7 for the height of the first peak in the oxygen/oxygen radial distribution function) for BLYP-GTH water at ambient conditions.
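    In an isobaric-isothermal (NpT) Monte Carlo simulation, trial volume changes are accepted with the standard Metropolis criterion acc = min(1, exp(-β[ΔU + pΔV] + N ln(V_new/V_old))). A generic sketch of that acceptance rule follows; energies and volumes here are plain numbers in consistent units, whereas in the paper each ΔU is a full first-principles (BLYP/GTH) energy evaluation.

    ```python
    import math
    import random

    def accept_volume_move(u_old, u_new, v_old, v_new, n, p, beta, rng):
        """Metropolis acceptance test for an NpT volume move.

        n    : number of particles (the N ln(V'/V) Jacobian term)
        p    : external pressure; beta = 1/(kB*T)
        All quantities must be in mutually consistent units.
        """
        arg = (-beta * (u_new - u_old)
               - beta * p * (v_new - v_old)
               + n * math.log(v_new / v_old))
        if arg >= 0.0:
            return True                    # always accept downhill moves
        return rng.random() < math.exp(arg)
    ```

    The N ln(V_new/V_old) term comes from rescaling the particle coordinates with the box; omitting it is a classic NpT-MC bug.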

  1. Monte Carlo computations for lattice gauge theories with finite gauge groups

    International Nuclear Information System (INIS)

    Rabbi, G.

    1980-01-01

    Recourse to Monte Carlo simulations for obtaining numerical information about lattice gauge field theories is suggested by the fact that, after a Wick rotation of time to imaginary time, the weighted sum over all configurations used to define quantum expectation values becomes formally identical to a statistical sum of a four-dimensional system. Results obtained in a variety of Monte Carlo investigations are described
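    For a finite gauge group, the statistical sum can be sampled directly by Metropolis updates of individual link variables, with the Wilson plaquette action playing the role of the energy. A minimal Z₂ gauge theory on a small 2D periodic lattice illustrates this; the lattice size, β, and sweep counts are arbitrary choices for the sketch, and the original computations used other groups and dimensions.

    ```python
    import math
    import random

    LAT = 4  # lattice extent; links carry values +1/-1, indexed (x, y, mu)

    def plaquette(links, x, y):
        """Product of the four links around the plaquette at (x, y)."""
        return (links[(x, y, 0)] * links[((x + 1) % LAT, y, 1)]
                * links[(x, (y + 1) % LAT, 0)] * links[(x, y, 1)])

    def metropolis_sweep(links, rng, beta):
        """One sweep: propose flipping each link; flipping negates the
        two plaquettes containing it, so dS = 2 * beta * (their sum)."""
        for x in range(LAT):
            for y in range(LAT):
                for mu in (0, 1):
                    if mu == 0:   # plaquettes containing an x-link
                        s = (plaquette(links, x, y)
                             + plaquette(links, x, (y - 1) % LAT))
                    else:         # plaquettes containing a y-link
                        s = (plaquette(links, x, y)
                             + plaquette(links, (x - 1) % LAT, y))
                    d_action = 2.0 * beta * s
                    if d_action <= 0.0 or rng.random() < math.exp(-d_action):
                        links[(x, y, mu)] *= -1

    def average_plaquette(links):
        return sum(plaquette(links, x, y)
                   for x in range(LAT) for y in range(LAT)) / LAT ** 2
    ```

    In two dimensions the model is exactly solvable, with average plaquette tanh(β), which makes it a convenient correctness check before moving to four dimensions where the Wick-rotated quantum theory lives.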

  2. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    Science.gov (United States)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production takes the biggest part of the computing resources being in use by ATLAS as of now. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS Experiment for Run 2, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  3. Monte Carlo derivation of filtered tungsten anode X-ray spectra for dose computation in digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Paixao, L.; Oliveira, B. B.; Nogueira, M. do S. [Centro de Desenvolvimento da Tecnologia Nuclear, Post-graduation in Science and Technology of Radiations, Minerals and Materials, Pte. Antonio Carlos 6.627, Pampulha, 31270-901 Belo Horizonte (Brazil); Viloria, C. [UFMG, Departamento de Engenharia Nuclear, Post-graduation in Nuclear Sciences and Techniques, Pte. Antonio Carlos 6.627, Pampulha, 31270-901 Belo Horizonte (Brazil); Alves de O, M. [UFMG, Department of Anatomy and Imaging, Prof. Alfredo Balena 190, 30130-100 Belo Horizonte (Brazil); Araujo T, M. H., E-mail: lpr@cdtn.br [Dr Maria Helena Araujo Teixeira Clinic, Guajajaras 40, 30180-100 Belo Horizonte (Brazil)

    2014-08-15

    It is widely accepted that the mean glandular dose (D_G) to the glandular tissue is the most useful quantity for characterizing breast cancer risk. Since D_G is difficult to measure directly in the breast, it is estimated using conversion factors that relate the incident air kerma (K_i) to this dose. Generally, the conversion factors vary with the x-ray spectrum half-value layer and with the breast composition and thickness. Several authors have calculated such factors through computer simulations by the Monte Carlo (MC) method. Many spectral models for D_G computer simulation purposes are available in the diagnostic range; one of the models available generates unfiltered spectra. In this work, the Monte Carlo EGSnrc code package with the C++ class library (egspp) was employed to derive the filtered tungsten x-ray spectra used in digital mammography systems. Filtered spectra for rhodium and aluminium filters were obtained for tube potentials between 26 and 32 kV. The half-value layers of the simulated filtered spectra were compared with those obtained experimentally with a solid state detector (Unfors model 8202031-H Xi R/F and Mam Detector Platinum and 8201023-C Xi Base unit Platinum Plus w mAs) in a Hologic Selenia Dimensions system using Direct Radiography mode. Calculated half-value layer values showed good agreement with those obtained experimentally. These results show that the filtered tungsten anode x-ray spectra and the EGSnrc MC code can be used for D_G determination in mammography. (Author)
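    The half-value layer comparison above boils down to: given a (simulated) spectrum and the attenuation coefficients of the absorber, find the thickness that halves the transmitted air kerma. A sketch using bisection; the spectral weights and μ values in the usage check are made up, and the kerma weighting is simplified (no energy-fluence-to-kerma conversion).

    ```python
    import math

    def transmission(t, weights, mus):
        """Fraction of kerma transmitted through thickness t of absorber.

        weights : per-energy-bin kerma contributions of the spectrum
        mus     : linear attenuation coefficients for the same bins
        """
        k0 = sum(weights)
        k = sum(w * math.exp(-mu * t) for w, mu in zip(weights, mus))
        return k / k0

    def half_value_layer(weights, mus, hi=10.0, tol=1e-9):
        """Bisect for the thickness that halves the transmitted kerma
        (transmission is monotonically decreasing in t)."""
        lo = 0.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if transmission(mid, weights, mus) > 0.5:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)
    ```

    For a monochromatic beam this reduces to ln(2)/μ; for a polychromatic spectrum the HVL lies between the HVLs of the softest and hardest bins because of beam hardening.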

  4. Monte Carlo derivation of filtered tungsten anode X-ray spectra for dose computation in digital mammography

    International Nuclear Information System (INIS)

    Paixao, L.; Oliveira, B. B.; Nogueira, M. do S.; Viloria, C.; Alves de O, M.; Araujo T, M. H.

    2014-08-01

    It is widely accepted that the mean glandular dose (D_G) is the most useful quantity for characterizing breast cancer risk. Because the D_G is difficult to measure directly in the breast, it is estimated using conversion factors that relate the incident air kerma (K_i) to this dose. Generally, the conversion factors vary with the x-ray spectrum half-value layer and with the breast composition and thickness. Several authors have calculated such factors through computer simulations using the Monte Carlo (MC) method. Many spectral models suitable for D_G computer simulations are available in the diagnostic range. One of the available models generates unfiltered spectra. In this work, the Monte Carlo EGSnrc code package with its C++ class library (egspp) was employed to derive the filtered tungsten x-ray spectra used in digital mammography systems. Filtered spectra for rhodium and aluminium filters were obtained for tube potentials between 26 and 32 kV. The half-value layers of the simulated filtered spectra were compared with those obtained experimentally with a solid state detector (Unfors model 8202031-H Xi R/F and Mam Detector Platinum with 8201023-C Xi Base unit Platinum Plus w mAs) in a Hologic Selenia Dimensions system using the Direct Radiography mode. The calculated half-value layer values agreed well with those obtained experimentally. These results show that the filtered tungsten anode x-ray spectra and the EGSnrc MC code can be used for D_G determination in mammography. (Author)
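    The half-value layer comparison described above reduces, numerically, to finding the filter thickness that halves the transmitted intensity of a spectrum. The sketch below illustrates the idea with a bisection search; the attenuation coefficients and the simplified energy-fluence weighting are hypothetical stand-ins, not the paper's EGSnrc-derived spectra or air-kerma weighting.

```python
import numpy as np

def hvl_al(energies_kev, fluence, mu_al_per_mm):
    """Half-value layer (mm Al) of a spectrum: the aluminium thickness
    that halves the transmitted intensity. Bisection on t in exp(-mu*t);
    kerma weighting is simplified to energy fluence for illustration."""
    def transmitted(t):
        return np.sum(fluence * energies_kev * np.exp(-mu_al_per_mm * t))
    target = 0.5 * transmitted(0.0)
    lo, hi = 0.0, 10.0
    for _ in range(60):                      # bisection narrows [lo, hi]
        mid = 0.5 * (lo + hi)
        if transmitted(mid) > target:        # still too much intensity:
            lo = mid                         # need a thicker filter
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative two-line "spectrum" with made-up attenuation coefficients
e = np.array([17.5, 21.0])                  # keV
phi = np.array([1.0, 0.6])                  # relative fluence
mu = np.array([0.18, 0.12])                 # 1/mm in Al (hypothetical)
print(round(hvl_al(e, phi, mu), 3))
```

For a monoenergetic beam the result reduces to the textbook ln(2)/mu, which makes a convenient sanity check.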

  5. Radiation transport simulation in gamma irradiator systems using EGS4 Monte Carlo code and dose mapping calculations based on point kernel technique

    International Nuclear Information System (INIS)

    Raisali, G.R.

    1992-01-01

    A series of computer codes based on the point kernel technique and on the Monte Carlo method have been developed. These codes perform radiation transport calculations for irradiator systems having cartesian, cylindrical and mixed geometries. For the Monte Carlo calculations, the computer code 'EGS4' has been applied to a radiation processing type problem, accompanied by a specific user code. The set of codes developed includes GCELLS, DOSMAPM and DOSMAPC2, which simulate the radiation transport in gamma irradiator systems having cylindrical, cartesian, and mixed geometries, respectively. The program 'DOSMAP3', based on the point kernel technique, has also been developed for dose rate mapping calculations in carrier type gamma irradiators. Another computer program, 'CYLDETM', a user code for EGS4, has also been developed to simulate dose variations near the interface of heterogeneous media in gamma irradiator systems. In addition, a system of computer codes, 'PRODMIX', has been developed which calculates the absorbed dose in products with different densities. Validation studies of the calculated results against experimental dosimetry have been performed and good agreement has been obtained
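    A point kernel code such as the DOSMAP series sums, for every map point, the attenuated inverse-square contribution of each source element, scaled by a buildup factor. A minimal sketch follows; the linear buildup factor B = 1 + mu*r and all names are illustrative assumptions, not the codes described in the abstract (production codes use fitted buildup data for the actual source and medium).

```python
import math

def point_kernel_dose(points, sources, mu):
    """Point-kernel dose mapping sketch: each source element contributes
    S * B(mu*r) * exp(-mu*r) / (4*pi*r^2), with a simple linear buildup
    factor B = 1 + mu*r (illustrative, not a fitted model).
    points:  list of (x, y, z) map positions
    sources: list of (x, y, z, strength) source elements
    mu:      linear attenuation coefficient of the medium"""
    doses = []
    for x, y, z in points:
        d = 0.0
        for sx, sy, sz, strength in sources:
            r = math.dist((x, y, z), (sx, sy, sz))
            mur = mu * r
            d += strength * (1.0 + mur) * math.exp(-mur) / (4.0 * math.pi * r * r)
        doses.append(d)
    return doses

# Dose map point 1 m from a single unit-strength source
print(point_kernel_dose([(100.0, 0.0, 0.0)], [(0.0, 0.0, 0.0, 1.0)], mu=0.07))
```

With mu = 0 (vacuum) the kernel collapses to pure inverse-square, 1/(4*pi*r^2), which is the easiest case to verify by hand.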

  6. Organ doses for reference pediatric and adolescent patients undergoing computed tomography estimated by Monte Carlo simulation

    International Nuclear Information System (INIS)

    Lee, Choonsik; Kim, Kwang Pyo; Long, Daniel J.; Bolch, Wesley E.

    2012-01-01

    Purpose: To establish an organ dose database for pediatric and adolescent reference individuals undergoing computed tomography (CT) examinations by using Monte Carlo simulation. The data will permit rapid estimates of organ and effective doses for patients of different age, gender, examination type, and CT scanner model. Methods: The previously published Monte Carlo simulation model of a Siemens Sensation 16 CT scanner was employed as a base CT scanner model. A set of absorbed doses for 33 organs/tissues normalized to the product of 100 mAs and CTDI vol (mGy/100 mAs·mGy) was established by coupling the CT scanner model with age-dependent reference pediatric hybrid phantoms. A series of single axial scans from the top of the head to the feet of the phantoms was performed at a slice thickness of 10 mm and at tube potentials of 80, 100, and 120 kVp. Using the established CTDI vol - and 100 mAs-normalized dose matrix, organ doses for different pediatric phantoms undergoing head, chest, abdomen-pelvis, and chest-abdomen-pelvis (CAP) scans with the Siemens Sensation 16 scanner were estimated and analyzed. The results were then compared with the values obtained from three independent published methods: CT-Expo software, organ dose for abdominal CT scan derived empirically from patient abdominal circumference, and effective dose per dose-length product (DLP). Results: Organ and effective doses were calculated and normalized to 100 mAs and CTDI vol for different CT examinations. At the same technical settings, doses to organs entirely included in the CT beam coverage were 40-80% higher for newborn phantoms than for 15-year phantoms. An increase of tube potential from 80 to 120 kVp resulted in a 2.5-2.9-fold greater brain dose for head scans. The results from this study were compared with three different published studies and/or techniques. First, organ doses were compared to those given by CT-Expo which revealed dose differences up to

  7. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
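    The SPRT logic simulated by a program like SEQTEST can be sketched as follows: accumulate the log-likelihood ratio of "source present" versus "background only" over successive counting intervals and stop as soon as it crosses one of Wald's two thresholds. The sketch below is for Poisson counts; the function name and parameters are illustrative, not SEQTEST's actual interface.

```python
import math

def sprt_poisson(counts, b, s, alpha=0.05, beta=0.05):
    """Sequential probability ratio test on Poisson counts per interval.
    H0: mean = b (background only);  H1: mean = b + s (source present).
    alpha/beta are the target false-alarm and missed-detection rates.
    Returns ('H1' | 'H0' | 'undecided', number of intervals used)."""
    lower = math.log(beta / (1.0 - alpha))   # accept H0 below this
    upper = math.log((1.0 - beta) / alpha)   # accept H1 above this
    llr = 0.0
    for n, c in enumerate(counts, 1):
        # log-likelihood-ratio increment for one Poisson observation
        llr += c * math.log((b + s) / b) - s
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(counts)

# A clearly contaminated stream triggers an alarm after few intervals
print(sprt_poisson([6, 7, 5], b=1.0, s=4.0))
```

Feeding the function streams of simulated Poisson counts and tallying the decisions and stopping times reproduces the kind of detection-probability and average-trial-time study the abstract describes.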

  8. Effect of phantom dimension variation on Monte Carlo simulation speed and precision

    International Nuclear Information System (INIS)

    Lin Hui; Xu Yuanying; Xu Liangfeng; Li Guoli; Jiang Jia

    2007-01-01

    There is a correlation between Monte Carlo simulation speed and the phantom dimension. The effect of the phantom dimension on Monte Carlo simulation speed and precision was studied based on the fast Monte Carlo code DPM. The results showed that when the thickness of the phantom was reduced, the efficiency increased exponentially without compromising precision, except at the tail of the dose distribution. When the width of the phantom was reduced but remained outside the penumbra, the effect on the efficiency was negligible. However, when it was reduced to within the penumbra, the efficiency increased to some extent without loss of precision. This result was applied to a clinical head case, and a remarkable increase in efficiency was achieved. (authors)

  9. Risk Consideration and Cost Estimation in Construction Projects Using Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Claudius A. Peleskei

    2015-06-01

    Full Text Available Construction projects usually involve high investments. They are, therefore, a risky venture for companies, as actual costs of construction projects nearly always exceed the planned ones. This is due to the various risks and the large uncertainty existing within this industry. Determination and quantification of risks and their impact on project costs within the construction industry is described as one of the most difficult areas. This paper analyses how the cost of construction projects can be estimated using Monte Carlo Simulation. It investigates whether the different cost elements in a construction project follow a specific probability distribution. The research examines the effect of correlation between different project costs on the result of the Monte Carlo Simulation. The paper finds that Monte Carlo Simulation can be a helpful tool for risk managers and can be used for cost estimation of construction projects. The research has shown that cost distributions are positively skewed and cost elements seem to have some interdependent relationships.
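    The effect of correlation between cost elements that the paper examines can be reproduced in a few lines: correlated draws are generated via a Cholesky factor of the correlation matrix and summed per scenario. The numbers below are made up for illustration, and the marginals are normal for simplicity; since the paper finds real cost distributions to be positively skewed, lognormal marginals would be a more faithful choice.

```python
import numpy as np

def simulate_total_cost(means, sds, corr, n=100_000, seed=1):
    """Monte Carlo total-cost distribution for correlated cost elements.
    Correlation is induced with a Cholesky factor of the (positive
    definite) correlation matrix; marginals are normal for simplicity."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(np.asarray(corr))
    z = rng.standard_normal((n, len(means))) @ L.T   # correlated N(0,1)
    costs = np.asarray(means) + z * np.asarray(sds)  # scale to each element
    return costs.sum(axis=1)                         # total cost per scenario

# Two cost elements, positively correlated (illustrative numbers)
total = simulate_total_cost([100.0, 50.0], [10.0, 5.0],
                            [[1.0, 0.6], [0.6, 1.0]])
print(total.mean(), total.std())
```

With a 0.6 correlation the theoretical variance of the total is 100 + 25 + 2·0.6·50 = 185, so the standard deviation rises to about 13.6 versus 11.2 for independent elements, which is exactly the kind of widening of the cost distribution the paper attributes to interdependence.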

  10. SELF-ABSORPTION CORRECTIONS BASED ON MONTE CARLO SIMULATIONS

    Directory of Open Access Journals (Sweden)

    Kamila Johnová

    2016-12-01

    Full Text Available The main aim of this article is to demonstrate how Monte Carlo simulations are implemented in our gamma spectrometry laboratory at the Department of Dosimetry and Application of Ionizing Radiation in order to calculate the self-absorption within the samples. A model of a real HPGe detector created for MCNP simulations is presented in this paper. All of the parameters which may influence the self-absorption are first discussed theoretically and then described using the calculated results.

  11. Accelerating Monte Carlo molecular simulations by reweighting and reconstructing Markov chains: Extrapolation of canonical ensemble averages and second derivatives to different temperature and density conditions

    KAUST Repository

    Kadoura, Ahmad Salim; Sun, Shuyu; Salama, Amgad

    2014-01-01

    thermodynamically consistent technique to regenerate rapidly Monte Carlo Markov Chains (MCMCs) at different thermodynamic conditions from the existing data points that have been pre-computed with expensive classical simulation. This technique can speed up

  12. Timesaving techniques for decision of electron-molecule collisions in Monte Carlo simulation of electrical discharges

    International Nuclear Information System (INIS)

    Sugawara, Hirotake; Mori, Naoki; Sakai, Yosuke; Suda, Yoshiyuki

    2007-01-01

    Techniques to reduce the computational load for determining electron-molecule collisions in Monte Carlo simulations of electrical discharges are presented. By enhancing the detection efficiency of the no-collision case in the decision scheme for collisional events, the frequency of access to time-consuming subroutines that calculate the electron collision cross sections of the gas molecules for obtaining the collision probability can be decreased. A benchmark test and an estimation evaluating the present techniques have shown a practical timesaving efficiency
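    The no-collision fast path described above can be sketched as follows: if the scaled random draw already exceeds a cheap precomputed upper bound on the total collision frequency, a null collision is declared and the expensive cross-section subroutines are never called. The two-level bound and all names below are illustrative assumptions, not the authors' actual implementation.

```python
def decide_collision(r_uniform, nu_max, nu_bound, eval_frequencies):
    """Null-collision decision sketch for one time step.
    r_uniform:        U(0,1) random draw
    nu_max:           constant maximum collision frequency (null-collision
                      method), used to scale the draw
    nu_bound:         cheap precomputed upper bound on the total collision
                      frequency at the current electron energy
    eval_frequencies: expensive callable returning (process, frequency)
                      pairs from detailed cross sections
    Returns the colliding process name, or None for a null collision."""
    r = r_uniform * nu_max
    if r >= nu_bound:              # fast path: certainly a null collision,
        return None                # cross sections never evaluated
    acc = 0.0
    for name, nu in eval_frequencies():   # slow path: running sum over
        acc += nu                         # the detailed frequencies
        if r < acc:
            return name                   # real collision of this type
    return None                           # still a null collision
```

Because nu_bound is much smaller than nu_max in weakly collisional regions, most draws take the fast path, which is the source of the timesaving the abstract reports.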

  13. Monte-Carlo simulations of neutron shielding for the ATLAS forward region

    CERN Document Server

    Stekl, I; Kovalenko, V E; Vorobel, V; Leroy, C; Piquemal, F; Eschbach, R; Marquet, C

    2000-01-01

    The effectiveness of different types of neutron shielding for the ATLAS forward region has been studied by means of Monte-Carlo simulations and compared with the results of an experiment performed at the CERN PS. The simulation code is based on GEANT, FLUKA, MICAP and GAMLIB. GAMLIB is a new library including processes with gamma-rays produced in (n, gamma) and (n, n'gamma) neutron reactions and is interfaced to the MICAP code. The effectiveness of different types of shielding against neutrons and gamma-rays, composed of different materials such as pure polyethylene, borated polyethylene, lithium-filled polyethylene, lead and iron, was compared. The results from the Monte-Carlo simulations were compared to the results obtained from the experiment. The simulation results reproduce the experimental data well. This agreement supports the correctness of the simulation code used to describe the generation, spreading and absorption of neutrons (up to thermal energies) and gamma-rays in the shielding materials....

  14. Efficiencies of joint non-local update moves in Monte Carlo simulations of coarse-grained polymers

    Science.gov (United States)

    Austin, Kieran S.; Marenz, Martin; Janke, Wolfhard

    2018-03-01

    In this study four update methods are compared in their performance in a Monte Carlo simulation of polymers in continuum space. The efficiencies of the update methods and combinations thereof are compared with the aid of the autocorrelation time with a fixed (optimal) acceptance ratio. Results are obtained for polymer lengths N = 14, 28 and 42 and temperatures below, at and above the collapse transition. In terms of autocorrelation, the optimal acceptance ratio is approximately 0.4. Furthermore, an overview of the step sizes of the update methods that correspond to this optimal acceptance ratio is given. This shall serve as a guide for future studies that rely on efficient computer simulations.
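    The efficiency measure used above, the integrated autocorrelation time, can be estimated from a time series of any observable using the usual self-consistent window cutoff (sum lags only up to c times the current estimate). The following is a minimal direct lag-sum estimator for illustration; the study's own estimator may differ in windowing details.

```python
import numpy as np

def integrated_autocorr_time(x, c=5.0, max_lag=1000):
    """Integrated autocorrelation time tau = 1 + 2*sum_k rho(k) of a
    time series, truncating the sum with the self-consistent window
    criterion k >= c*tau to control the noise of large-lag terms."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    var = np.dot(x, x) / n
    tau = 1.0
    acc = 0.0
    for k in range(1, min(max_lag, n)):
        acc += np.dot(x[:-k], x[k:]) / ((n - k) * var)  # rho(k) estimate
        tau = 1.0 + 2.0 * acc
        if k >= c * tau:                                # window cutoff
            break
    return max(tau, 1.0)

rng = np.random.default_rng(0)
print(integrated_autocorr_time(rng.standard_normal(20_000)))  # near 1
```

For uncorrelated samples tau is close to 1; for an AR(1) process with coefficient rho it approaches the known value (1 + rho)/(1 - rho), so slowly decorrelating update moves show up directly as large tau, exactly the comparison criterion the study uses.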

  15. Failure Bounding And Sensitivity Analysis Applied To Monte Carlo Entry, Descent, And Landing Simulations

    Science.gov (United States)

    Gaebler, John A.; Tolson, Robert H.

    2010-01-01

    In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insights are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insights gained versus the computational cost. The first method investigated was failure domain bounding which aims to reduce the computational cost of assessing the failure probability. Next a variance-based sensitivity analysis was studied for the ability to identify which input variable uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis is used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.

  16. Optical coherence tomography: Monte Carlo simulation and improvement by optical amplification

    DEFF Research Database (Denmark)

    Tycho, Andreas

    2002-01-01

    An advanced novel Monte Carlo simulation model of the detection process of an optical coherence tomography (OCT) system is presented. For the first time it is shown analytically that the applicability of the incoherent Monte Carlo approach to model the heterodyne detection process of an OCT system...... is firmly justified. This is obtained by calculating the heterodyne mixing of the reference and sample beams in a plane conjugate to the discontinuity in the sample probed by the system. Using this approach, a novel expression for the OCT signal is derived, which only depends upon the intensity...... flexibility of Monte Carlo simulations, this new model is demonstrated to be excellent as a numerical phantom, i.e., as a substitute for otherwise difficult experiments. Finally, a new model of the signal-to-noise ratio (SNR) of an OCT system with optical amplification of the light reflected from the sample...

  17. Monte Carlo simulation models of breeding-population advancement.

    Science.gov (United States)

    J.N. King; G.R. Johnson

    1993-01-01

    Five generations of population improvement were modeled using Monte Carlo simulations. The model was designed to address questions that are important to the development of an advanced generation breeding population. Specifically we addressed the effects on both gain and effective population size of different mating schemes when creating a recombinant population for...

  18. Srna-Monte Carlo codes for proton transport simulation in combined and voxelized geometries

    CERN Document Server

    Ilic, R D; Stankovic, S J

    2002-01-01

    This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses the voxelized geometry of material zones and is specifically adapted for the application of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper, we will present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out applying them, along with the results of proton transport simulation obtaine...

  19. Modeling of molecular nitrogen collisions and dissociation processes for direct simulation Monte Carlo.

    Science.gov (United States)

    Parsons, Neal; Levin, Deborah A; van Duin, Adri C T; Zhu, Tong

    2014-12-21

    The Direct Simulation Monte Carlo (DSMC) method typically used for simulating hypersonic Earth re-entry flows requires accurate total collision cross sections and reaction probabilities. However, total cross sections are often determined from extrapolations of relatively low-temperature viscosity data, so their reliability is unknown for the high temperatures observed in hypersonic flows. Existing DSMC reaction models accurately reproduce experimental equilibrium reaction rates, but the applicability of these rates to the strong thermal nonequilibrium observed in hypersonic shocks is unknown. For hypersonic flows, these modeling issues are particularly relevant for nitrogen, the dominant species of air. To rectify this deficiency, the Molecular Dynamics/Quasi-Classical Trajectories (MD/QCT) method is used to accurately compute collision and reaction cross sections for the N2(1Σg+)-N2(1Σg+) collision pair for conditions expected in hypersonic shocks using a new potential energy surface developed using a ReaxFF fit to recent advanced ab initio calculations. The MD/QCT-computed reaction probabilities were found to exhibit better physical behavior and predict less dissociation than the baseline total collision energy reaction model for strong nonequilibrium conditions expected in a shock. The MD/QCT reaction model compared well with computed equilibrium reaction rates and shock-tube data. In addition, the MD/QCT-computed total cross sections were found to agree well with established variable hard sphere total cross sections.

  20. Modeling of molecular nitrogen collisions and dissociation processes for direct simulation Monte Carlo

    International Nuclear Information System (INIS)

    Parsons, Neal; Levin, Deborah A.; Duin, Adri C. T. van; Zhu, Tong

    2014-01-01

    The Direct Simulation Monte Carlo (DSMC) method typically used for simulating hypersonic Earth re-entry flows requires accurate total collision cross sections and reaction probabilities. However, total cross sections are often determined from extrapolations of relatively low-temperature viscosity data, so their reliability is unknown for the high temperatures observed in hypersonic flows. Existing DSMC reaction models accurately reproduce experimental equilibrium reaction rates, but the applicability of these rates to the strong thermal nonequilibrium observed in hypersonic shocks is unknown. For hypersonic flows, these modeling issues are particularly relevant for nitrogen, the dominant species of air. To rectify this deficiency, the Molecular Dynamics/Quasi-Classical Trajectories (MD/QCT) method is used to accurately compute collision and reaction cross sections for the N2(1Σg+)-N2(1Σg+) collision pair for conditions expected in hypersonic shocks using a new potential energy surface developed using a ReaxFF fit to recent advanced ab initio calculations. The MD/QCT-computed reaction probabilities were found to exhibit better physical behavior and predict less dissociation than the baseline total collision energy reaction model for strong nonequilibrium conditions expected in a shock. The MD/QCT reaction model compared well with computed equilibrium reaction rates and shock-tube data. In addition, the MD/QCT-computed total cross sections were found to agree well with established variable hard sphere total cross sections

  1. A review of the use and potential of the GATE Monte Carlo simulation code for radiation therapy and dosimetry applications

    Energy Technology Data Exchange (ETDEWEB)

    Sarrut, David, E-mail: david.sarrut@creatis.insa-lyon.fr [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon (France); Université Lyon 1 (France); Centre Léon Bérard (France); Bardiès, Manuel; Marcatili, Sara; Mauxion, Thibault [Inserm, UMR1037 CRCT, F-31000 Toulouse, France and Université Toulouse III-Paul Sabatier, UMR1037 CRCT, F-31000 Toulouse (France); Boussion, Nicolas [INSERM, UMR 1101, LaTIM, CHU Morvan, 29609 Brest (France); Freud, Nicolas; Létang, Jean-Michel [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Université Lyon 1, Centre Léon Bérard, 69008 Lyon (France); Jan, Sébastien [CEA/DSV/I2BM/SHFJ, Orsay 91401 (France); Loudos, George [Department of Medical Instruments Technology, Technological Educational Institute of Athens, Athens 12210 (Greece); Maigne, Lydia; Perrot, Yann [UMR 6533 CNRS/IN2P3, Université Blaise Pascal, 63171 Aubière (France); Papadimitroulas, Panagiotis [Department of Biomedical Engineering, Technological Educational Institute of Athens, 12210, Athens (Greece); Pietrzyk, Uwe [Institut für Neurowissenschaften und Medizin, Forschungszentrum Jülich GmbH, 52425 Jülich, Germany and Fachbereich für Mathematik und Naturwissenschaften, Bergische Universität Wuppertal, 42097 Wuppertal (Germany); Robert, Charlotte [IMNC, UMR 8165 CNRS, Universités Paris 7 et Paris 11, Orsay 91406 (France); and others

    2014-06-15

    In this paper, the authors review the applicability of the open-source GATE Monte Carlo simulation platform based on the GEANT4 toolkit for radiation therapy and dosimetry applications. The many applications of GATE for state-of-the-art radiotherapy simulations are described, including external beam radiotherapy, brachytherapy, intraoperative radiotherapy, hadrontherapy, molecular radiotherapy, and in vivo dose monitoring. Investigations that have been performed using GEANT4 only are also mentioned to illustrate the potential of GATE. The very practical feature of GATE, making it easy to model both a treatment and an imaging acquisition within the same framework, is emphasized. The computational times associated with several applications are provided to illustrate the practical feasibility of the simulations using current computing facilities.

  2. A review of the use and potential of the GATE Monte Carlo simulation code for radiation therapy and dosimetry applications

    International Nuclear Information System (INIS)

    Sarrut, David; Bardiès, Manuel; Marcatili, Sara; Mauxion, Thibault; Boussion, Nicolas; Freud, Nicolas; Létang, Jean-Michel; Jan, Sébastien; Loudos, George; Maigne, Lydia; Perrot, Yann; Papadimitroulas, Panagiotis; Pietrzyk, Uwe; Robert, Charlotte

    2014-01-01

    In this paper, the authors review the applicability of the open-source GATE Monte Carlo simulation platform based on the GEANT4 toolkit for radiation therapy and dosimetry applications. The many applications of GATE for state-of-the-art radiotherapy simulations are described, including external beam radiotherapy, brachytherapy, intraoperative radiotherapy, hadrontherapy, molecular radiotherapy, and in vivo dose monitoring. Investigations that have been performed using GEANT4 only are also mentioned to illustrate the potential of GATE. The very practical feature of GATE, making it easy to model both a treatment and an imaging acquisition within the same framework, is emphasized. The computational times associated with several applications are provided to illustrate the practical feasibility of the simulations using current computing facilities

  3. Monte-Carlo Simulation of Hydrogen Adsorption in Single-Wall Carbon Nano-Cones

    Directory of Open Access Journals (Sweden)

    Zohreh Ahadi

    2011-01-01

    Full Text Available The properties of hydrogen adsorption in single-walled carbon nano-cones are investigated in detail by Monte Carlo simulations. Our computational results show that the hydrogen storage capacity of single-walled carbon nano-cones is slightly smaller than that of single-walled carbon nanotubes under the same conditions. This indicates that the hydrogen storage capacity of single-walled carbon nano-cones is related to the cone angle. It seems that this type of structure cannot reach the 2010 goal of 6 wt% set by the U.S. Department of Energy. In addition, these results are discussed theoretically.

  4. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    Science.gov (United States)

    Lemaréchal, Yannick; Bert, Julien; Falconnet, Claire; Després, Philippe; Valeri, Antoine; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris

    2015-07-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10^6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications.

  5. GGEMS-Brachy: GPU GEant4-based Monte Carlo simulation for brachytherapy applications

    International Nuclear Information System (INIS)

    Lemaréchal, Yannick; Bert, Julien; Schick, Ulrike; Pradier, Olivier; Garcia, Marie-Paule; Boussion, Nicolas; Visvikis, Dimitris; Falconnet, Claire; Després, Philippe; Valeri, Antoine

    2015-01-01

    In brachytherapy, plans are routinely calculated using the AAPM TG43 formalism which considers the patient as a simple water object. An accurate modeling of the physical processes considering patient heterogeneity using Monte Carlo simulation (MCS) methods is currently too time-consuming and computationally demanding to be routinely used. In this work we implemented and evaluated an accurate and fast MCS on Graphics Processing Units (GPU) for brachytherapy low dose rate (LDR) applications. A previously proposed Geant4 based MCS framework implemented on GPU (GGEMS) was extended to include a hybrid GPU navigator, allowing navigation within voxelized patient specific images and analytically modeled 125I seeds used in LDR brachytherapy. In addition, dose scoring based on track length estimator including uncertainty calculations was incorporated. The implemented GGEMS-brachy platform was validated using a comparison with Geant4 simulations and reference datasets. Finally, a comparative dosimetry study based on the current clinical standard (TG43) and the proposed platform was performed on twelve prostate cancer patients undergoing LDR brachytherapy. Considering patient 3D CT volumes of 400 × 250 × 65 voxels and an average of 58 implanted seeds, the mean patient dosimetry study run time for a 2% dose uncertainty was 9.35 s (≈500 ms per 10^6 simulated particles) and 2.5 s when using one and four GPUs, respectively. The performance of the proposed GGEMS-brachy platform allows envisaging the use of Monte Carlo simulation based dosimetry studies in brachytherapy compatible with clinical practice. Although the proposed platform was evaluated for prostate cancer, it is equally applicable to other LDR brachytherapy clinical applications. Future extensions will allow its application in high dose rate brachytherapy applications. (paper)

  6. GPU-accelerated Gibbs ensemble Monte Carlo simulations of Lennard-Jonesium

    Science.gov (United States)

    Mick, Jason; Hailat, Eyad; Russo, Vincent; Rushaidat, Kamel; Schwiebert, Loren; Potoff, Jeffrey

    2013-12-01

    This work describes an implementation of canonical and Gibbs ensemble Monte Carlo simulations on graphics processing units (GPUs). The pair-wise energy calculations, which consume the majority of the computational effort, are parallelized using the energetic decomposition algorithm. While energetic decomposition is relatively inefficient for traditional CPU-bound codes, the algorithm is ideally suited to the architecture of the GPU. The performance of the CPU and GPU codes are assessed for a variety of CPU and GPU combinations for systems containing between 512 and 131,072 particles. For a system of 131,072 particles, the GPU-enabled canonical and Gibbs ensemble codes were 10.3 and 29.1 times faster (GTX 480 GPU vs. i5-2500K CPU), respectively, than an optimized serial CPU-bound code. Due to overhead from memory transfers from system RAM to the GPU, the CPU code was slightly faster than the GPU code for simulations containing less than 600 particles. The critical temperature Tc∗=1.312(2) and density ρc∗=0.316(3) were determined for the tail corrected Lennard-Jones potential from simulations of 10,000 particle systems, and found to be in exact agreement with prior mixed field finite-size scaling calculations [J.J. Potoff, A.Z. Panagiotopoulos, J. Chem. Phys. 109 (1998) 10914].
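    The pair-wise energy evaluation that dominates codes like the one above, here for the tail-corrected Lennard-Jones potential used in the paper's critical-point calculations, can be sketched in serial form as follows (minimum-image convention, N² pair loop). The GPU version parallelizes this loop per particle; this plain-Python sketch only shows the arithmetic being parallelized.

```python
import numpy as np

def lj_total_energy(pos, box, rcut, eps=1.0, sigma=1.0):
    """Total pair energy of a periodic Lennard-Jones system with a
    truncated potential plus the standard analytic tail correction.
    pos: (N, 3) coordinates; box: (3,) box lengths; rcut: cutoff."""
    n = len(pos)
    e = 0.0
    for i in range(n - 1):
        d = pos[i + 1:] - pos[i]
        d -= box * np.round(d / box)          # minimum-image convention
        r2 = (d * d).sum(axis=1)
        r2 = r2[r2 < rcut * rcut]             # truncate at the cutoff
        sr6 = (sigma * sigma / r2) ** 3
        e += 4.0 * eps * (sr6 * sr6 - sr6).sum()
    # standard analytic tail correction for the truncated potential
    rho = n / box.prod()
    src3 = (sigma / rcut) ** 3
    e += (8.0 / 3.0) * np.pi * n * rho * eps * sigma**3 * (src3**3 / 3.0 - src3)
    return e

# Dimer at the potential minimum separation 2^(1/6) sigma: energy ≈ -eps
pos = np.array([[0.0, 0.0, 0.0], [2.0 ** (1 / 6), 0.0, 0.0]])
print(lj_total_energy(pos, box=np.array([50.0, 50.0, 50.0]), rcut=10.0))
```

In a Gibbs ensemble code this routine (or its incremental single-particle variant) is evaluated for every displacement, volume-exchange, and particle-transfer trial move, which is why offloading it to the GPU yields the order-of-magnitude speedups the abstract reports.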

  7. Steady state likelihood ratio sensitivity analysis for stiff kinetic Monte Carlo simulations.

    Science.gov (United States)

    Núñez, M; Vlachos, D G

    2015-01-28

    Kinetic Monte Carlo simulation is an integral tool in the study of complex physical phenomena present in applications ranging from heterogeneous catalysis to biological systems to crystal growth and atmospheric sciences. Sensitivity analysis is useful for identifying important parameters and rate-determining steps, but the finite-difference application of sensitivity analysis is computationally demanding. Techniques based on the likelihood ratio method reduce the computational cost of sensitivity analysis by obtaining all gradient information in a single run. However, we show that disparity in time scales of microscopic events, which is ubiquitous in real systems, introduces drastic statistical noise into derivative estimates for parameters affecting the fast events. In this work, the steady-state likelihood ratio sensitivity analysis is extended to singularly perturbed systems by invoking partial equilibration for fast reactions, that is, by working on the fast and slow manifolds of the chemistry. Derivatives on each time scale are computed independently and combined into the desired sensitivity coefficients to considerably reduce the noise in derivative estimates for stiff systems. The approach is demonstrated in an analytically solvable linear system.

  8. Study of cold neutron sources: Implementation and validation of a complete computation scheme for research reactor using Monte Carlo codes TRIPOLI-4.4 and McStas

    International Nuclear Information System (INIS)

    Campioni, Guillaume; Mounier, Claude

    2006-01-01

    The main goal of this thesis on cold neutron sources (CNS) in research reactors was to create a complete set of tools for designing CNS efficiently. The work addresses the problem of running accurate simulations of experimental devices inside the reactor reflector that remain valid for parametric studies. On the one hand, deterministic codes have reasonable computation times but are problematic for geometrical description. On the other hand, Monte Carlo codes can compute on precise geometry, but their computation times are so long that parametric studies are impossible. To decrease this computation time, several developments were made in the Monte Carlo code TRIPOLI-4.4. An uncoupling technique is used to isolate a study zone within the complete reactor geometry. By recording boundary conditions (incoming flux), further simulations can be launched for parametric studies with a computation time reduced by a factor of 60 (in the case of the cold neutron source of the Orphee reactor). The short response time makes parametric studies with a Monte Carlo code practical. Moreover, using biasing methods, the flux can be recorded on the surface of the neutron guide entries (low solid angle) with a further gain in running time. Finally, the implementation of a coupling module between TRIPOLI-4.4 and the Monte Carlo code McStas, used in condensed matter research, makes it possible to obtain fluxes after transmission through the neutron guides, and thus the neutron flux received by the samples studied by condensed matter scientists. This set of developments, involving TRIPOLI-4.4 and McStas, represents a complete computation scheme for research reactors: from the nuclear core, where neutrons are created, to the exit of the neutron guides and the samples themselves. This complete calculation scheme is tested against ILL4 measurements of flux in cold neutron guides. (authors)

  9. Monte Carlo simulation of a clinical linear accelerator

    International Nuclear Information System (INIS)

    Lin, S.-Y.; Chu, T.-C.; Lin, J.-P.

    2001-01-01

    The effects of the physical parameters of an electron beam from a Siemens PRIMUS clinical linear accelerator (linac) on the dose distribution in water were investigated by Monte Carlo simulation. The EGS4 user code, OMEGA/BEAM, was used in this study. Various incident electron beams, for example, with different energies, spot sizes and distances from the point source, were simulated using the detailed linac head structure in the 6 MV photon mode. Approximately 10 million particles were collected in the scoring plane, which was set under the reticle, to form the so-called phase space file. The phase space file served as a source for simulating the dose distribution in water using DOSXYZ. Dose profiles at D_max (1.5 cm) and PDD curves were calculated after simulating about 1 billion histories for the dose profiles and 500 million histories for the percent depth dose (PDD) curves in a 30 × 30 × 30 cm³ water phantom. The simulation results were compared with data measured by CEA film and an ion chamber. The results show that the dose profiles are influenced by the energy and the spot size, while PDD curves are primarily influenced by the energy of the incident beam. The effect of the distance from the point source on the dose profile is not significant, and this distance is recommended to be set at infinity. We also recommend adjusting the beam energy using the PDD curves and then adjusting the spot size using the dose profiles, to maintain consistency between the Monte Carlo results and the measured data.

  10. Monte Carlo simulation of AB-copolymers with saturating bonds

    CERN Document Server

    Chertovich, A V; Khokhlov, A R; Bohr, J

    2003-01-01

    Structural transitions in a single AB-copolymer chain where saturating bonds can be formed between A- and B-units are studied by means of Monte Carlo computer simulations using the bond fluctuation model. Three transitions are found, coil-globule, coil-hairpin and globule-hairpin, depending on the nature of a particular AB-sequence: statistical random sequence, diblock sequence and 'random-complementary' sequence (one half of such an AB-sequence is random with Bernoulli statistics while the other half is complementary to the first one). The properties of random-complementary sequences are closer to those of diblock sequences than to the properties of random sequences. The model (although quite rough) is expected to represent some basic features of real RNA molecules, i.e. the formation of the secondary structure of RNA due to hydrogen bonding of corresponding bases and stacking interactions of the base pairs in helices. We introduce the notion of RNA-like copolymers and discuss in what sense the sequences studie...

  11. Experimental and Monte Carlo simulated spectra of a liquid-metal-jet x-ray source

    International Nuclear Information System (INIS)

    Marziani, M.; Gambaccini, M.; Di Domenico, G.; Taibi, A.; Cardarelli, P.

    2014-01-01

    A prototype x-ray system based on a liquid-metal-jet anode was evaluated within the framework of the LABSYNC project. The generated spectrum was measured using a CZT-based spectrometer and was compared with spectra simulated by three Monte Carlo codes: MCNPX, PENELOPE and EGS5. Notable differences in the simulated spectra were found. These are mainly attributable to differences in the models adopted for the electron-impact ionization cross section. The simulation that more closely reproduces the experimentally measured spectrum was provided by PENELOPE. - Highlights: • The x-ray spectrum of a liquid-jet x-ray anode was measured with a CZT spectrometer. • Results were compared with Monte Carlo simulations using MCNPX, PENELOPE, EGS5. • Notable differences were found among the Monte Carlo simulated spectra. • The key role was played by the electron-impact ionization cross-section model used. • The experimentally measured spectrum was closely reproduced by the PENELOPE code

  12. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Science.gov (United States)

    Guerra, J. G.; Rubiano, J. G.; Winter, G.; Guerra, A. G.; Alonso, H.; Arnedo, M. A.; Tejera, A.; Martel, P.; Bolivar, J. P.

    2017-06-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs.
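The optimization loop described in this abstract can be caricatured in a few lines. In the sketch below the Monte Carlo efficiency calculation is replaced by a toy analytic model and the multi-objective selection by a crude "improve both objectives" rule, so everything (function names, parameter ranges, the efficiency formula) is an illustrative assumption rather than the authors' method.

```python
import random

def toy_efficiency(params, geometry):
    """Stand-in for the Monte Carlo efficiency calculation: a toy analytic
    model mapping detector parameters (crystal radius, dead-layer fraction)
    to a full-energy-peak efficiency for one sample geometry.
    Purely illustrative -- the paper runs a real MC transport code here."""
    radius, dead = params
    base = 0.35 if geometry == "small_beaker" else 0.18
    return base * radius / (radius + 1.0) * (1.0 - dead)

def objectives(params, refs):
    """One objective per reference geometry: squared FEPE mismatch."""
    return tuple((toy_efficiency(params, g) - e) ** 2 for g, e in refs.items())

def evolve(refs, generations=200, pop=20, seed=0):
    """Minimal (1+lambda) evolutionary loop. The parent is replaced by any
    child that improves *both* objectives -- a crude stand-in for the true
    multi-objective (Pareto-based) selection used in the paper."""
    rng = random.Random(seed)
    parent = (rng.uniform(1.0, 5.0), rng.uniform(0.0, 0.1))
    best = objectives(parent, refs)
    for _ in range(generations):
        for _ in range(pop):
            child = (max(0.1, parent[0] + rng.gauss(0, 0.1)),
                     min(0.5, max(0.0, parent[1] + rng.gauss(0, 0.005))))
            obj = objectives(child, refs)
            if all(c <= b for c, b in zip(obj, best)):
                parent, best = child, obj
    return parent, best
```

The expensive step in practice is the efficiency evaluation itself, which is why the paper runs the evolutionary algorithm and the Monte Carlo code in parallel.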

  13. Study of TXRF experimental system by Monte Carlo simulation

    International Nuclear Information System (INIS)

    Costa, Ana Cristina M.; Leitao, Roberta G.; Lopes, Ricardo T.; Anjos, Marcelino J.; Conti, Claudio C.

    2011-01-01

    The Total-Reflection X-ray Fluorescence (TXRF) technique offers unique possibilities for studying the concentrations of a wide range of trace elements in various types of samples. Besides that, the TXRF technique is widely used to study trace elements in biological, medical and environmental samples due to its multielemental character as well as the simplicity of the sample preparation and quantification methods used. In general, a TXRF experimental setup is not simple and may require substantial experimental effort. On the other hand, portable TXRF systems have been developed in recent years, which motivated us to develop our own portable TXRF system. In this work we present a first step towards optimizing a TXRF experimental setup using Monte Carlo simulation with the MCNP code. The results show that the Monte Carlo simulation method can be used to investigate the development of a TXRF experimental system before its assembly. (author)

  14. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez

    2013-01-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained based on efficiency calibrations by Monte Carlo simulation using the DETEFF program.

  15. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    CERN Document Server

    Chapman, J; Duehrssen, M; Elsing, M; Froidevaux, D; Harrington, R; Jansky, R; Langenberg, R; Mandrysch, R; Marshall, Z; Ritsch, E; Salzburger, A

    2014-01-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run I relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently accounts for the largest share of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS experiment for Run II and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To benefit optimally from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  16. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, J.G., E-mail: jglezg2002@gmail.es [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Rubiano, J.G. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Winter, G. [Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Bolivar, J.P. [Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva (Spain)

    2017-06-21

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been

  17. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    International Nuclear Information System (INIS)

    Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P.; Bolivar, J.P.

    2017-01-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been

  18. Studies of urea geometry by means of ab initio methods and computer simulations of liquids

    OpenAIRE

    Cirino, José Jair Vianna; Bertran, Celso Aparecido

    2002-01-01

    A study was carried out on urea geometries using ab initio calculations and Monte Carlo computational simulation of liquids. The ab initio results showed that urea has a non-planar conformation in the gas phase, in which the hydrogen atoms are out of the plane formed by the heavy atoms. Free energies associated with the rotation of the amino groups of urea in water were obtained using the Monte Carlo method, in which the thermodynamic perturbation theory is implemented. The magnitud...

  19. Monte Carlo simulation of the ARGO

    International Nuclear Information System (INIS)

    Depaola, G.O.

    1997-01-01

    We use the GEANT Monte Carlo code to design an outline of the geometry and simulate the performance of the Argentine gamma-ray observer (ARGO), a telescope based on silicon strip detector technology. The γ-ray direction is determined by geometrical means and the angular resolution is calculated for small variations of the basic design. The results show that the angular resolution varies from a few degrees at low energies (≈50 MeV) to approximately 0.2° at high energies (>500 MeV). We also performed simulations using the energy spectra of the quasars PKS0208-512 and PKS0528+134 as the incoming γ-rays. Moreover, a method based on multiple scattering theory is also used to determine the incoming energy. We show that this method is applicable to energy spectra. (orig.)

  20. GEANT Monte Carlo simulations for the GREAT spectrometer

    International Nuclear Information System (INIS)

    Andreyev, A.N.; Butler, P.A.; Page, R.D.; Appelbe, D.E.; Jones, G.D.; Joss, D.T.; Herzberg, R.-D.; Regan, P.H.; Simpson, J.; Wadsworth, R.

    2004-01-01

    GEANT Monte Carlo simulations for the recently developed GREAT spectrometer are presented. Some novel applications of the spectrometer for γ-ray, conversion-electron and β-decay spectroscopy are discussed. The conversion-electron spectroscopy of heavy nuclei with strongly converted transitions and the extension of the recoil decay tagging method to β-decaying nuclei are considered in detail

  1. Monte Carlo simulation of a CZT detector

    International Nuclear Information System (INIS)

    Chun, Sung Dae; Park, Se Hwan; Ha, Jang Ho; Kim, Han Soo; Cho, Yoon Ho; Kang, Sang Mook; Kim, Yong Kyun; Hong, Duk Geun

    2008-01-01

    The CZT detector is one of the most promising radiation detectors for hard X-ray and γ-ray measurement. The energy spectrum of a CZT detector has to be simulated to optimize the detector design. A CZT detector was fabricated with dimensions of 5 × 5 × 2 mm³. A Peltier cooler with a size of 40 × 40 mm² was installed below the fabricated CZT detector to reduce its operating temperature. Energy spectra were measured with the 59.5 keV γ-rays from ²⁴¹Am. A Monte Carlo code was developed to simulate the energy spectrum of the planar-type CZT detector, and the result was compared with the measured spectrum. The simulation was extended to a CZT detector with strip electrodes. (author)

  2. Three-dimensional coupled Monte Carlo-discrete ordinates computational scheme for shielding calculations of large and complex nuclear facilities

    International Nuclear Information System (INIS)

    Chen, Y.; Fischer, U.

    2005-01-01

    Shielding calculations for advanced nuclear facilities such as accelerator-based neutron sources or fusion devices of the tokamak type are complicated by their complex geometries and large dimensions, including bulk shields several meters thick. While the complexity of the geometry can hardly be handled by the discrete ordinates method, the deep penetration of radiation through bulk shields is a severe challenge for the Monte Carlo particle transport technique. This work proposes a dedicated computational scheme for coupled Monte Carlo-Discrete Ordinates transport calculations to handle this kind of shielding problem. The Monte Carlo technique is used to simulate the particle generation and transport in the target region, with both complex geometry and reaction physics, and the discrete ordinates method is used to treat the deep penetration problem in the bulk shield. The coupling scheme has been implemented in a program system by loosely integrating the Monte Carlo transport code MCNP, the three-dimensional discrete ordinates code TORT and a newly developed coupling interface program for the mapping process. Test calculations were performed and compared with MCNP solutions, with satisfactory agreement between the two approaches. The program system has been used to treat the complicated shielding problem of the accelerator-based IFMIF neutron source. This successful application demonstrates that the coupling scheme, as implemented in the program system, is a useful computational tool for the shielding analysis of complex and large nuclear facilities. (authors)

  3. SaaS enabled admission control for MCMC simulation in cloud computing infrastructures

    Science.gov (United States)

    Vázquez-Poletti, J. L.; Moreno-Vozmediano, R.; Han, R.; Wang, W.; Llorente, I. M.

    2017-02-01

    Markov Chain Monte Carlo (MCMC) methods are widely used in the field of simulation and modelling of materials, producing applications that require a great amount of computational resources. Cloud computing represents a seamless source for these resources in the form of HPC. However, resource over-consumption can be an important drawback, especially if the cloud provisioning process is not appropriately optimized. In the present contribution we propose a two-level solution that, on the one hand, takes advantage of approximate computing to reduce the resource demand and, on the other, uses admission control policies to guarantee an optimal provision to running applications.

  4. Monte Carlo simulation for radiographic applications

    International Nuclear Information System (INIS)

    Tillack, G.R.; Bellon, C.

    2003-01-01

    Standard radiography simulators are based on the attenuation law complemented by build-up factors (BUF) to describe the interaction of radiation with material. The assumption of BUF implies that scattered radiation only reduces the contrast in radiographic images. This simplification holds for a wide range of applications, like weld inspection, as known from practical experience. But only a detailed description of the different underlying interaction mechanisms can explain effects like mottling and others that every radiographer has experienced in practice. Monte Carlo models can handle the primary and secondary interaction mechanisms contributing to the image formation process, such as photon interactions (absorption, incoherent and coherent scattering including electron-binding effects, pair production) and electron interactions (electron tracing including X-ray fluorescence and bremsstrahlung production). This opens up possibilities like the separation of influencing factors and an understanding of the functioning of the intensifying screens used in film radiography. The paper discusses the opportunities in applying the Monte Carlo method to investigate special features in radiography, in terms of selected examples. (orig.) [de]

  5. Segmented slant hole collimator for stationary cardiac SPECT: Monte Carlo simulations.

    Science.gov (United States)

    Mao, Yanfei; Yu, Zhicong; Zeng, Gengsheng L

    2015-09-01

    This work is a preliminary study of a stationary cardiac SPECT system. The goal of this research is to propose a stationary cardiac SPECT system using segmented slant-hole collimators and to perform computer simulations to test the feasibility. Compared to rotational SPECT, a stationary system has the benefit of acquiring temporally consistent projections. The most challenging issue in building a stationary system is providing sufficient projection view-angles. A GATE (GEANT4 application for tomographic emission) Monte Carlo model was developed to simulate a two-detector stationary cardiac SPECT that uses segmented slant-hole collimators. Each detector contains seven segmented slant-hole sections that slant to a common volume at the rotation center. Consequently, 14 view-angles over 180° were acquired without any gantry rotation. The NCAT phantom was used for data generation and a tailored maximum-likelihood expectation-maximization algorithm was used for image reconstruction. Effects of the limited number of view-angles and of data truncation were carefully evaluated in the paper. Simulation results indicated that the proposed segmented slant-hole stationary cardiac SPECT system is able to acquire sufficient data for cardiac imaging without a loss of image quality, even when the uptakes in the liver and kidneys are high. Seven views are acquired simultaneously at each detector, leading to a 5-fold sensitivity gain over the conventional dual-head system at the same total acquisition time, which in turn increases the signal-to-noise ratio by 19%. The segmented slant-hole SPECT system also showed a good performance in lesion detection. In our prototype system, a short hole-length was used to reduce the dead zone between neighboring collimator segments. The measured sensitivity gain is about 17-fold over the conventional dual-head system. The GATE Monte Carlo simulations confirm the feasibility of the proposed stationary cardiac SPECT system with segmented slant
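The maximum-likelihood expectation-maximization (MLEM) reconstruction mentioned in this abstract has a compact generic form, x ← x · Aᵀ(y / Ax) / Aᵀ1. Below is a textbook sketch of that update, not the tailored variant used in the paper; the toy system matrix stands in for the real SPECT projector.

```python
import numpy as np

def mlem(A, y, iterations=2000):
    """Plain MLEM iteration for emission tomography:
        x <- x * A^T(y / (A x)) / (A^T 1)
    A : (measurements x voxels) nonnegative system matrix
    y : measured counts. Generic textbook form, not the paper's variant."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                       # sensitivity image A^T 1
    for _ in range(iterations):
        proj = A @ x                           # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x = x * (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

The multiplicative update preserves nonnegativity automatically, which is one reason MLEM is the standard baseline for SPECT; handling the limited view-angles and truncated data of the stationary geometry is where the "tailored" part of the paper's algorithm comes in.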

  6. Segmented slant hole collimator for stationary cardiac SPECT: Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Mao, Yanfei, E-mail: ymao@ucair.med.utah.edu [Department of Radiology, Utah Center for Advanced Imaging Research (UCAIR), University of Utah, Salt Lake City, Utah 84108 and Department of Bioengineering, University of Utah, Salt Lake City, Utah 84112 (United States); Yu, Zhicong [Department of Radiology, Mayo Clinic, Rochester, Minnesota 55905 (United States); Zeng, Gengsheng L. [Department of Radiology, Utah Center for Advanced Imaging Research (UCAIR), University of Utah, Salt Lake City, Utah 84108 and Department of Engineering, Weber State University, Ogden, Utah 84408 (United States)

    2015-09-15

    Purpose: This work is a preliminary study of a stationary cardiac SPECT system. The goal of this research is to propose a stationary cardiac SPECT system using segmented slant-hole collimators and to perform computer simulations to test the feasibility. Compared to the rotational SPECT, a stationary system has a benefit of acquiring temporally consistent projections. The most challenging issue in building a stationary system is to provide sufficient projection view-angles. Methods: A GATE (GEANT4 application for tomographic emission) Monte Carlo model was developed to simulate a two-detector stationary cardiac SPECT that uses segmented slant-hole collimators. Each detector contains seven segmented slant-hole sections that slant to a common volume at the rotation center. Consequently, 14 view-angles over 180° were acquired without any gantry rotation. The NCAT phantom was used for data generation and a tailored maximum-likelihood expectation-maximization algorithm was used for image reconstruction. Effects of limited number of view-angles and data truncation were carefully evaluated in the paper. Results: Simulation results indicated that the proposed segmented slant-hole stationary cardiac SPECT system is able to acquire sufficient data for cardiac imaging without a loss of image quality, even when the uptakes in the liver and kidneys are high. Seven views are acquired simultaneously at each detector, leading to 5-fold sensitivity gain over the conventional dual-head system at the same total acquisition time, which in turn increases the signal-to-noise ratio by 19%. The segmented slant-hole SPECT system also showed a good performance in lesion detection. In our prototype system, a short hole-length was used to reduce the dead zone between neighboring collimator segments. The measured sensitivity gain is about 17-fold over the conventional dual-head system. Conclusions: The GATE Monte Carlo simulations confirm the feasibility of the proposed stationary cardiac

  7. Diagrammatic Monte Carlo simulations of staggered fermions at finite coupling

    CERN Document Server

    Vairinhos, Helvio

    2016-01-01

    Diagrammatic Monte Carlo has been a very fruitful tool for taming, and in some cases even solving, the sign problem in several lattice models. We have recently proposed a diagrammatic model for simulating lattice gauge theories with staggered fermions at arbitrary coupling, which extends earlier successful efforts to simulate lattice QCD at finite baryon density in the strong-coupling regime. Here we present the first numerical simulations of our model, using worm algorithms.

  8. Advanced scientific computational methods and their applications to nuclear technologies. (4) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (4)

    International Nuclear Information System (INIS)

    Sekimura, Naoto; Okita, Taira

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have served as the weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has therefore been prepared in serial form. This is the fourth issue, presenting an overview of scientific computational methods and introducing continuum simulation methods and their applications. Simulation methods for physical radiation effects on materials are reviewed, covering processes such as the binary collision approximation, molecular dynamics, the kinetic Monte Carlo method, the reaction rate method and dislocation dynamics. (T. Tanaka)

  9. A reappraisal of drug release laws using Monte Carlo simulations: the prevalence of the Weibull function.

    Science.gov (United States)

    Kosmidis, Kosmas; Argyrakis, Panos; Macheras, Panos

    2003-07-01

    To verify the Higuchi law and study drug release from cylindrical and spherical matrices by means of Monte Carlo computer simulation. A one-dimensional matrix, based on the theoretical assumptions of the derivation of the Higuchi law, was simulated and its time evolution was monitored. Cylindrical and spherical three-dimensional lattices were simulated, with sites at the boundary of the lattice denoted as leak sites. Particles were allowed to move inside the lattice using the random walk model, and excluded volume interactions between the particles were assumed. The time evolution of the system was monitored for different lattice sizes and different initial particle concentrations. The Higuchi law was verified using the Monte Carlo technique in a one-dimensional lattice. It was found that Fickian drug release from cylindrical matrices can be approximated well by the Weibull function, and a simple linear relation between the Weibull function parameters and the specific surface of the system was found. Drug release from a matrix, as the result of a diffusion process with excluded volume interactions between the drug molecules, can thus be described by a Weibull function. This model, although approximate and semiempirical, has the benefit of providing a simple physical connection between the model parameters and the system geometry, something missing from other semiempirical models.
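The simulation idea, particles performing an excluded-volume random walk on a lattice with leak sites at the boundary, can be sketched in one dimension. This is an illustrative toy version (a single leak boundary and a reflecting far wall), not the authors' code; lattice size and step counts are arbitrary.

```python
import random

def release_1d(length=200, steps=800, seed=1):
    """Monte Carlo drug release from a 1-D matrix: every site starts occupied,
    particles random-walk with excluded volume (a move onto an occupied site
    is rejected), any particle stepping past site 0 escapes through the leak
    boundary, and the far wall is reflecting. Returns the cumulative released
    fraction after each sweep."""
    rng = random.Random(seed)
    occ = [True] * length                 # one particle per site initially
    total = length
    released, history = 0, []
    for _ in range(steps):
        for i in rng.sample(range(length), length):   # random update order
            if not occ[i]:
                continue
            j = i + rng.choice((-1, 1))
            if j < 0:
                occ[i] = False            # escaped through the leak site
                released += 1
            elif j < length and not occ[j]:
                occ[i], occ[j] = False, True
        history.append(released / total)
    return history
```

At early times the released fraction from such a run grows roughly as the square root of time, which is the Higuchi behaviour the paper verifies before moving to cylindrical and spherical lattices and the Weibull fit.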

  10. A Monte Carlo simulation of the microdosimetric response for thick gas electron multiplier

    International Nuclear Information System (INIS)

    Hanu, A.; Byun, S.H.; Prestwich, W.V.

    2010-01-01

    The neutron microdosimetric responses of the thick gas electron multiplier (THGEM) detector were simulated. The THGEM is a promising device for microdosimetry, particularly for measuring the dose spectra of intense radiation fields and for collecting two-dimensional microdosimetric distributions. To investigate the response of the prototype THGEM microdosimetric detector, a simulation was developed using the Geant4 Monte Carlo code. The simulation calculates the deposited energy in the detector sensitive volume for an incident neutron beam. Both neutron energy and angular responses were computed for various neutron beam conditions. The energy response was compared with the reported experimental microdosimetric spectra as well as the evaluated fluence-to-kerma conversion coefficients. The effects of using non-tissue equivalent materials were also investigated by comparing the THGEM detector response with the response of an ideal detector in identical neutron field conditions. The result of the angular response simulations revealed severe angular dependencies for neutron energies above 100 keV. The simulation of a modified detector design gave an angular response pattern close to the ideal case, showing a fluctuation of less than 10% over the entire angular range.

  11. A polygon-surface reference Korean male phantom (PSRK-Man) and its direct implementation in Geant4 Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chan Hyeong; Jeong, Jong Hwi [Department of Nuclear Engineering, Hanyang University, 17 Haengdang-dong, Seongdong-gu, Seoul 133-791 (Korea, Republic of); Bolch, Wesley E [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL 32611 (United States); Cho, Kun-Woo [Korea Institute of Nuclear Safety, 19 Guseong-dong, Yuseong-gu, Daejeon 305-600 (Korea, Republic of); Hwang, Sung Bae, E-mail: chkim@hanyang.ac.kr [Department of Physical Therapy, Kyungbuk College, Hyucheon 2-dong, Yeongju-si, Gyeongbuk 750-712 (Korea, Republic of)

    2011-05-21

    Even though the hybrid phantom embodies both the anatomic reality of voxel phantoms and the deformability of stylized phantoms, it must be voxelized to be used in a Monte Carlo code for dose calculation or some imaging simulation, which incurs the inherent limitations of voxel phantoms. In the present study, a voxel phantom, VKH-Man (Visible Korean Human-Man), was converted to a polygon-surface phantom (PSRK-Man, Polygon-Surface Reference Korean-Man), which was then adjusted to the Reference Korean data. Subsequently, the PSRK-Man polygon phantom was directly, without any voxelization process, implemented in the Geant4 Monte Carlo code for dose calculations. The calculated dose values and computation time were then compared with those of HDRK-Man (High Definition Reference Korean-Man), a corresponding voxel phantom adjusted to the same Reference Korean data from the same VKH-Man voxel phantom. Our results showed that the calculated dose values of the PSRK-Man surface phantom agreed well with those of the HDRK-Man voxel phantom. The calculation speed for the PSRK-Man polygon phantom, though, was 70-150 times slower than that of the HDRK-Man voxel phantom; that speed could be acceptable in some applications, in that direct use of the surface phantom PSRK-Man in Geant4 does not require a separate voxelization process. Computing speed can be enhanced in the future either by optimizing the Monte Carlo transport kernel for polygon surfaces or by using modern computing technologies such as grid computing and general-purpose computing on graphics processing units (GPGPU) programming.

  12. A polygon-surface reference Korean male phantom (PSRK-Man) and its direct implementation in Geant4 Monte Carlo simulation

    International Nuclear Information System (INIS)

    Kim, Chan Hyeong; Jeong, Jong Hwi; Bolch, Wesley E; Cho, Kun-Woo; Hwang, Sung Bae

    2011-01-01

    Even though the hybrid phantom embodies both the anatomic reality of voxel phantoms and the deformability of stylized phantoms, it must be voxelized to be used in a Monte Carlo code for dose calculation or some imaging simulation, which incurs the inherent limitations of voxel phantoms. In the present study, a voxel phantom, VKH-Man (Visible Korean Human-Man), was converted to a polygon-surface phantom (PSRK-Man, Polygon-Surface Reference Korean-Man), which was then adjusted to the Reference Korean data. Subsequently, the PSRK-Man polygon phantom was directly, without any voxelization process, implemented in the Geant4 Monte Carlo code for dose calculations. The calculated dose values and computation time were then compared with those of HDRK-Man (High Definition Reference Korean-Man), a corresponding voxel phantom adjusted to the same Reference Korean data from the same VKH-Man voxel phantom. Our results showed that the calculated dose values of the PSRK-Man surface phantom agreed well with those of the HDRK-Man voxel phantom. The calculation speed for the PSRK-Man polygon phantom, though, was 70-150 times slower than that of the HDRK-Man voxel phantom; that speed could be acceptable in some applications, in that direct use of the surface phantom PSRK-Man in Geant4 does not require a separate voxelization process. Computing speed can be enhanced in the future either by optimizing the Monte Carlo transport kernel for polygon surfaces or by using modern computing technologies such as grid computing and general-purpose computing on graphics processing units (GPGPU) programming.

  13. MO-F-CAMPUS-I-03: GPU Accelerated Monte Carlo Technique for Fast Concurrent Image and Dose Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becchetti, M; Tian, X; Segars, P; Samei, E [Clinical Imaging Physics Group, Department of Radiology, Duke University Me, Durham, NC (United States)

    2015-06-15

    Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU-accelerated MC code for helical MDCT. Simulations were done with both uniform-density organs and with textured organs. The organ doses were validated against previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for an accurate representation of noise. A substantial speed-up was attained by using a low number of photon histories together with kernel denoising of the projections from the scattered photons. The resulting FBP-reconstructed images were validated against those acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. The corresponding images obtained with projection kernel smoothing required 3 orders of magnitude less computation time than a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered-photon projections in MC simulations allows organ dose and the corresponding image quality to be obtained with reasonable accuracy and substantially less computation time than is possible with standard simulation approaches.

  14. MO-F-CAMPUS-I-03: GPU Accelerated Monte Carlo Technique for Fast Concurrent Image and Dose Simulation

    International Nuclear Information System (INIS)

    Becchetti, M; Tian, X; Segars, P; Samei, E

    2015-01-01

    Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU-accelerated MC code for helical MDCT. Simulations were done with both uniform-density organs and with textured organs. The organ doses were validated against previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for an accurate representation of noise. A substantial speed-up was attained by using a low number of photon histories together with kernel denoising of the projections from the scattered photons. The resulting FBP-reconstructed images were validated against those acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. The corresponding images obtained with projection kernel smoothing required 3 orders of magnitude less computation time than a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered-photon projections in MC simulations allows organ dose and the corresponding image quality to be obtained with reasonable accuracy and substantially less computation time than is possible with standard simulation approaches.

  15. Computer Simulation of Electron Positron Annihilation Processes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, y

    2003-10-02

    With the launch of the Next Linear Collider coming closer and closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of the e{sup +}e{sup -} annihilation process at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well the interfaces between different sectors of physics, e.g., interactions happening at the parton level, well above the QCD scale, which are described by perturbative QCD, and interactions happening at a much lower energy scale, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. 
Our strategy is to create

  16. Monte Carlo and detector simulation in OOP [Object-Oriented Programming

    International Nuclear Information System (INIS)

    Atwood, W.B.; Blankenbecler, R.; Kunz, P.; Burnett, T.; Storr, K.M.

    1990-10-01

    Object-Oriented Programming techniques are explored with an eye toward applications in High Energy Physics codes. Two prototype examples are given: McOOP (a particle Monte Carlo generator) and GISMO (a detector simulation/analysis package).

  17. astroABC : An Approximate Bayesian Computation Sequential Monte Carlo sampler for cosmological parameter estimation

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, E.; Madigan, M.

    2017-04-01

    Given the complexity of modern cosmological parameter inference, where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated data sets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the likelihood is intractable or unknown. The ABC method is called "likelihood free" as it avoids explicit evaluation of the likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind, astroABC allows for massive parallelization using MPI, a framework that handles spawning of jobs across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features include: a Sequential Monte Carlo sampler; a method for iteratively adapting tolerance levels; local covariance estimates using scikit-learn's KDTree; modules for specifying an optimal covariance matrix for a component-wise or multivariate normal perturbation kernel; output and restart files backed up every iteration; user-defined metric and simulation methods; a module for specifying heterogeneous parameter priors, including non-standard prior PDFs; a module for specifying a constant, linear, log or exponential tolerance level; and well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC
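
    The "likelihood-free" idea underlying astroABC can be illustrated with a far simpler rejection-ABC sampler. astroABC itself uses a Sequential Monte Carlo scheme with adaptive tolerances; the Gaussian toy model, uniform prior range and fixed tolerance below are illustrative assumptions, not part of the package:

    ```python
    import random
    import statistics

    def abc_rejection(observed, n_accept=300, tol=0.1, seed=11):
        """Likelihood-free rejection ABC: keep prior draws whose simulated
        summary statistic lands within `tol` of the observed one."""
        rng = random.Random(seed)
        obs_mean = statistics.fmean(observed)
        accepted = []
        while len(accepted) < n_accept:
            theta = rng.uniform(-5.0, 5.0)                  # draw from the prior
            sim = [rng.gauss(theta, 1.0) for _ in range(len(observed))]
            if abs(statistics.fmean(sim) - obs_mean) < tol:  # distance on a summary
                accepted.append(theta)                       # no likelihood evaluated
        return accepted
    ```

    The accepted `theta` values approximate the posterior; an SMC sampler such as astroABC's replaces the fixed `tol` with a decreasing tolerance schedule and perturbs survivors between iterations.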

  18. A GPU-based large-scale Monte Carlo simulation method for systems with long-range interactions

    Science.gov (United States)

    Liang, Yihao; Xing, Xiangjun; Li, Yaohang

    2017-06-01

    In this work we present an efficient implementation of canonical Monte Carlo simulation for Coulomb many-body systems on graphics processing units (GPUs). Our method takes advantage of the GPU Single Instruction, Multiple Data (SIMD) architecture, and adopts the sequential updating scheme of the Metropolis algorithm. It makes no approximation in the computation of energy, and reaches a remarkable 440-fold speedup compared with the serial implementation on a CPU. We further use this method to simulate primitive-model electrolytes, and measure very precisely all ion-ion pair correlation functions at high concentrations. From these data, we extract the renormalized Debye length, renormalized valences of the constituent ions, and renormalized dielectric constants. These results demonstrate unequivocally physics beyond the classical Poisson-Boltzmann theory.

  19. The Linked Neighbour List (LNL) method for fast off-lattice Monte Carlo simulations of fluids

    Science.gov (United States)

    Mazzeo, M. D.; Ricci, M.; Zannoni, C.

    2010-03-01

    We present a new algorithm, called the linked neighbour list (LNL), useful to substantially speed up off-lattice Monte Carlo simulations of fluids by avoiding the computation of the molecular energy before every attempted move. We introduce a few variants of the LNL method targeted to minimise memory footprint or augment memory coherence and cache utilisation. Additionally, we present a few algorithms which drastically accelerate neighbour finding. We test our methods on the simulation of a dense off-lattice Gay-Berne fluid subjected to periodic boundary conditions, observing a speedup factor of about 2.5 with respect to a well-coded implementation based on a conventional link-cell method. We provide several implementation details of the different key data structures and algorithms used in this work.
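
    The core saving described here, evaluating only the energy terms involving a moved particle, can be sketched as follows. This is a plain O(N²) neighbour-list build with a Lennard-Jones pair energy, a hedged stand-in for the authors' optimised LNL data structures:

    ```python
    import math

    def build_neighbour_lists(pos, box, r_list):
        """O(N^2) neighbour-list build with periodic minimum image."""
        n = len(pos)
        nbr = [[] for _ in range(n)]
        for i in range(n):
            for j in range(i + 1, n):
                d2 = 0.0
                for a in range(3):
                    dx = pos[i][a] - pos[j][a]
                    dx -= box * round(dx / box)   # minimum-image convention
                    d2 += dx * dx
                if d2 < r_list * r_list:
                    nbr[i].append(j)
                    nbr[j].append(i)
        return nbr

    def local_energy(i, pos, nbr, box, r_cut):
        """Lennard-Jones energy of particle i with its listed neighbours only."""
        e = 0.0
        for j in nbr[i]:
            d2 = 0.0
            for a in range(3):
                dx = pos[i][a] - pos[j][a]
                dx -= box * round(dx / box)
                d2 += dx * dx
            if d2 < r_cut * r_cut:
                inv6 = (1.0 / d2) ** 3
                e += 4.0 * (inv6 * inv6 - inv6)   # 4(r^-12 - r^-6)
        return e
    ```

    In a Metropolis sweep one calls `local_energy` before and after a trial displacement, rebuilding the lists only when some particle has travelled farther than half the skin `r_list - r_cut`.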

  20. Efficient SPECT scatter calculation in non-uniform media using correlated Monte Carlo simulation

    International Nuclear Information System (INIS)

    Beekman, F.J.

    1999-01-01

    Accurate simulation of scatter in projection data of single photon emission computed tomography (SPECT) is computationally extremely demanding for activity distributions in non-uniformly dense media. This paper suggests how the computation time and memory requirements can be significantly reduced. First the scatter projection of a uniformly dense object (P_SDSE) is calculated using a previously developed accurate and fast method which includes all orders of scatter (slab-derived scatter estimation), and then P_SDSE is transformed towards the desired projection P, which is based on the non-uniform object. The transform of P_SDSE is based on two first-order Compton scatter Monte Carlo (MC) simulated projections: one based on the uniform object (P_u) and the other on the object with non-uniformities (P_ν). P is estimated by P̃ = P_SDSE P_ν / P_u. A tremendous decrease in noise in P̃ is achieved by tracking photon paths for P_ν identical to those which were tracked for the calculation of P_u and by using analytical rather than stochastic modelling of the collimator. The method was validated by comparing the results with standard MC-simulated scatter projections (P) of 99mTc and 201Tl point sources in a digital thorax phantom. After correction, excellent agreement was obtained between P̃ and P. The total computation time required to calculate an accurate scatter projection of an extended distribution in a thorax phantom on a PC is only a few tens of seconds per projection, which makes the method attractive for application in accurate scatter correction in clinical SPECT. Furthermore, the method removes the need for the excessive computer memory involved with previously proposed 3D model-based scatter correction methods. (author)
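
    The variance reduction obtained by tracking identical photon paths for the two MC projections is an instance of correlated sampling with common random numbers. A toy sketch (the two integrands below are arbitrary stand-ins, not SPECT models) shows the effect on the variance of a ratio estimate:

    ```python
    import random
    import statistics

    def ratio_estimates(n_rep=200, n_samp=500, seed=0):
        """Variance of a ratio estimate with common vs independent random numbers."""
        rng = random.Random(seed)
        f_u = lambda x: 1.0 + 0.5 * x          # stand-in for the 'uniform object'
        f_v = lambda x: 1.0 + 0.6 * x          # stand-in for the 'non-uniform object'
        corr, indep = [], []
        for _ in range(n_rep):
            xs = [rng.random() for _ in range(n_samp)]
            ys = [rng.random() for _ in range(n_samp)]
            mu = sum(map(f_u, xs)) / n_samp
            nu_same = sum(map(f_v, xs)) / n_samp   # same samples: correlated
            nu_diff = sum(map(f_v, ys)) / n_samp   # fresh samples: independent
            corr.append(nu_same / mu)
            indep.append(nu_diff / mu)
        return statistics.pvariance(corr), statistics.pvariance(indep)
    ```

    Because numerator and denominator fluctuate together when they share samples, their errors largely cancel in the ratio, which is exactly why reusing the photon paths of P_u for P_ν suppresses the noise in the correction factor.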

  1. A New Approach to Monte Carlo Simulations in Statistical Physics

    Science.gov (United States)

    Landau, David P.

    2002-08-01

    Monte Carlo simulations [1] have become a powerful tool for the study of diverse problems in statistical/condensed matter physics. Standard methods sample the probability distribution for the states of the system, most often in the canonical ensemble, and over the past several decades enormous improvements have been made in performance. Nonetheless, difficulties arise near phase transitions, due to critical slowing down near 2nd order transitions and to metastability near 1st order transitions, and these complications limit the applicability of the method. We shall describe a new Monte Carlo approach [2] that uses a random walk in energy space to determine the density of states directly. Once the density of states is known, all thermodynamic properties can be calculated. This approach can be extended to multi-dimensional parameter spaces and should be effective for systems with complex energy landscapes, e.g., spin glasses, protein folding models, etc. Generalizations should produce a broadly applicable optimization tool. 1. A Guide to Monte Carlo Simulations in Statistical Physics, D. P. Landau and K. Binder (Cambridge U. Press, Cambridge, 2000). 2. Fugao Wang and D. P. Landau, Phys. Rev. Lett. 86, 2050 (2001); Phys. Rev. E 64, 056101 (2001).
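
    The random walk in energy space of Ref. [2] (the Wang-Landau algorithm) can be sketched on a toy model whose density of states is known exactly: N non-interacting spins with E equal to the number of up spins, so g(E) = C(N, E). The flatness criterion and modification-factor schedule follow the commonly published recipe; the specific parameter values are illustrative:

    ```python
    import math
    import random

    def wang_landau(n_spins=10, flat=0.8, ln_f_final=1e-5, seed=3):
        """Wang-Landau estimate of ln g(E) for N non-interacting up/down spins."""
        rng = random.Random(seed)
        spins = [0] * n_spins
        E = 0                                   # E = number of up spins
        ln_g = [0.0] * (n_spins + 1)            # running estimate of ln g(E)
        hist = [0] * (n_spins + 1)
        ln_f = 1.0                              # initial modification factor
        while ln_f > ln_f_final:
            for _ in range(10000):
                i = rng.randrange(n_spins)
                E_new = E + (1 if spins[i] == 0 else -1)
                # accept with probability min(1, g(E_old)/g(E_new))
                if math.log(rng.random() + 1e-300) < ln_g[E] - ln_g[E_new]:
                    spins[i] ^= 1
                    E = E_new
                ln_g[E] += ln_f                 # update g and histogram at current E
                hist[E] += 1
            if min(hist) > flat * (sum(hist) / len(hist)):
                hist = [0] * (n_spins + 1)      # histogram flat: refine f
                ln_f *= 0.5
        return ln_g
    ```

    Up to an additive constant, the returned ln g(E) should track ln C(N, E); in a real application the density of states then yields all thermodynamic averages at any temperature.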

  2. Improved local lattice Monte Carlo simulation for charged systems

    Science.gov (United States)

    Jiang, Jian; Wang, Zhen-Gang

    2018-03-01

    Maggs and Rossetto [Phys. Rev. Lett. 88, 196402 (2002)] proposed a local lattice Monte Carlo algorithm for simulating charged systems based on Gauss's law, which scales with the particle number N as O(N). This method includes two degrees of freedom: the configuration of the mobile charged particles and the electric field. In this work, we consider two important issues in the implementation of the method, the acceptance rate of configurational change (particle move) and the ergodicity in the phase space sampled by the electric field. We propose a simple method to improve the acceptance rate of particle moves based on the superposition principle for electric field. Furthermore, we introduce an additional updating step for the field, named "open-circuit update," to ensure that the system is fully ergodic under periodic boundary conditions. We apply this improved local Monte Carlo simulation to an electrolyte solution confined between two low dielectric plates. The results show excellent agreement with previous theoretical work.

  3. Neutron spectrum unfolding using genetic algorithm in a Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Suman, Vitisha [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Sarkar, P.K., E-mail: pksarkar02@gmail.com [Manipal Centre for Natural Sciences, Manipal University, Manipal 576104 (India)

    2014-02-11

    A spectrum unfolding technique, GAMCD (Genetic Algorithm and Monte Carlo based spectrum Deconvolution), has been developed using the genetic algorithm methodology within the framework of Monte Carlo simulations. Each Monte Carlo history starts with initial solution vectors (the population), randomly generated points in the hyper-dimensional solution space that are related to the measured data by the response matrix of the detection system. The transition of the solution points in the solution space from one generation to another is governed by the genetic algorithm methodology, using the techniques of cross-over (mating) and mutation in a probabilistic manner to add new solution points to the population. The population size is kept constant by discarding solutions having lower fitness values (larger differences between measured and calculated results). Solutions having the highest fitness value at the end of each Monte Carlo history are averaged over all histories to obtain the final spectral solution. The present method shows promising results in neutron spectrum unfolding for both under-determined and over-determined problems, with simulated test data as well as measured data, when compared with some existing unfolding codes. An attractive advantage of the present method is the independence of the final spectra from the initial guess spectra.
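
    A stripped-down version of the cross-over/mutation/truncation loop described above might look like this. It is a toy least-squares unfolding, not GAMCD; the population size, parent pool and mutation width are illustrative assumptions:

    ```python
    import random

    def ga_unfold(R, m, pop_size=60, gens=300, seed=7):
        """Toy genetic-algorithm unfolding: find x >= 0 with R @ x close to m."""
        rng = random.Random(seed)
        n = len(R[0])
        def residual(x):                        # lower residual = higher fitness
            return sum((sum(R[i][j] * x[j] for j in range(n)) - m[i]) ** 2
                       for i in range(len(R)))
        pop = [[rng.random() for _ in range(n)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=residual)
            pop = pop[:pop_size // 2]           # discard the least-fit half
            while len(pop) < pop_size:
                a, b = rng.sample(pop[:10], 2)  # parents from the fittest pool
                cut = rng.randrange(1, n)
                child = a[:cut] + b[cut:]       # one-point cross-over
                j = rng.randrange(n)            # mutate one gene, keep x >= 0
                child[j] = max(0.0, child[j] + rng.gauss(0.0, 0.1))
                pop.append(child)
        pop.sort(key=residual)
        return pop[0], residual(pop[0])
    ```

    GAMCD additionally wraps such a search in Monte Carlo histories and averages the fittest solution over histories, which is what removes the dependence on an initial guess spectrum.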

  4. Neutron spectrum unfolding using genetic algorithm in a Monte Carlo simulation

    International Nuclear Information System (INIS)

    Suman, Vitisha; Sarkar, P.K.

    2014-01-01

    A spectrum unfolding technique, GAMCD (Genetic Algorithm and Monte Carlo based spectrum Deconvolution), has been developed using the genetic algorithm methodology within the framework of Monte Carlo simulations. Each Monte Carlo history starts with initial solution vectors (the population), randomly generated points in the hyper-dimensional solution space that are related to the measured data by the response matrix of the detection system. The transition of the solution points in the solution space from one generation to another is governed by the genetic algorithm methodology, using the techniques of cross-over (mating) and mutation in a probabilistic manner to add new solution points to the population. The population size is kept constant by discarding solutions having lower fitness values (larger differences between measured and calculated results). Solutions having the highest fitness value at the end of each Monte Carlo history are averaged over all histories to obtain the final spectral solution. The present method shows promising results in neutron spectrum unfolding for both under-determined and over-determined problems, with simulated test data as well as measured data, when compared with some existing unfolding codes. An attractive advantage of the present method is the independence of the final spectra from the initial guess spectra.

  5. Physical time scale in kinetic Monte Carlo simulations of continuous-time Markov chains.

    Science.gov (United States)

    Serebrinsky, Santiago A

    2011-03-01

    We rigorously establish a physical time scale for a general class of kinetic Monte Carlo algorithms for the simulation of continuous-time Markov chains. This class of algorithms encompasses rejection-free (or BKL) and rejection (or "standard") algorithms. For rejection algorithms, it was formerly considered that the availability of a physical time scale (instead of Monte Carlo steps) was empirical, at best. Use of Monte Carlo steps as a time unit now becomes completely unnecessary.
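
    The physical time scale referred to here attaches an exponentially distributed waiting time Δt = -ln(u)/R_tot to each event of a rejection-free kinetic MC run, where R_tot is the total rate of all channels. A minimal sketch on first-order decay (parameter values illustrative), whose half-life should come out as ln 2 / k:

    ```python
    import math
    import random

    def kmc_decay(n0=20000, k=1.0, seed=5):
        """Rejection-free kinetic MC for first-order decay.

        Returns the simulated physical time at which half the population
        has decayed (expected: ln 2 / k).
        """
        rng = random.Random(seed)
        n, t = n0, 0.0
        while n > n0 // 2:
            total_rate = n * k                              # n independent channels
            t += -math.log(1.0 - rng.random()) / total_rate  # exponential waiting time
            n -= 1                                           # execute one event
        return t
    ```

    Because every step advances the clock by a properly distributed waiting time, no conversion from "Monte Carlo steps" to physical time is needed, which is the point the abstract makes for continuous-time Markov chains in general.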

  6. CMS Monte Carlo production in the WLCG computing grid

    International Nuclear Information System (INIS)

    Hernandez, J M; Kreuzer, P; Hof, C; Khomitch, A; Mohapatra, A; Filippis, N D; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Weirdt, S D; Maes, J; Mulders, P v; Villella, I; Wakefield, S; Guan, W; Fanfani, A; Evans, D; Flossdorf, A

    2008-01-01

    Monte Carlo production in CMS has received a major boost in performance and scale since the CHEP06 conference. The production system has been re-engineered in order to incorporate the experience gained in running the previous system and to integrate production with the new CMS event data model, data management system and data processing framework. The system is interfaced to the two major computing Grids used by CMS, the LHC Computing Grid (LCG) and the Open Science Grid (OSG). Operational experience and integration aspects of the new CMS Monte Carlo production system are presented, together with an analysis of production statistics. The new system automatically handles job submission, resource monitoring, job queuing, job distribution according to the available resources, data merging, and registration of data into the data bookkeeping, data location, and data transfer and placement systems. Compared to the previous production system, automation, reliability and performance have been considerably improved. A more efficient use of computing resources and a better handling of the inherent Grid unreliability have resulted in an increase of production scale by about an order of magnitude, capable of running in parallel on the order of ten thousand jobs and yielding more than two million events per day.

  7. Accurate and precise determination of critical properties from Gibbs ensemble Monte Carlo simulations

    International Nuclear Information System (INIS)

    Dinpajooh, Mohammadhasan; Bai, Peng; Allan, Douglas A.; Siepmann, J. Ilja

    2015-01-01

    Since the seminal paper by Panagiotopoulos [Mol. Phys. 61, 813 (1987)], the Gibbs ensemble Monte Carlo (GEMC) method has been the most popular particle-based simulation approach for the computation of vapor-liquid phase equilibria. However, the validity of GEMC simulations in the near-critical region has been questioned because rigorous finite-size scaling approaches cannot be applied to simulations with fluctuating volume. Valleau [Mol. Simul. 29, 627 (2003)] has argued that GEMC simulations would lead to a spurious overestimation of the critical temperature. More recently, Patel et al. [J. Chem. Phys. 134, 024101 (2011)] opined that the use of analytical tail corrections would be problematic in the near-critical region. To address these issues, we perform extensive GEMC simulations for Lennard-Jones particles in the near-critical region, varying the system size, the overall system density, and the cutoff distance. For a system with N = 5500 particles, potential truncation at 8σ and analytical tail corrections, an extrapolation of GEMC simulation data at temperatures in the range from 1.27 to 1.305 yields T_c = 1.3128 ± 0.0016, ρ_c = 0.316 ± 0.004, and p_c = 0.1274 ± 0.0013, in excellent agreement with the thermodynamic limit determined by Potoff and Panagiotopoulos [J. Chem. Phys. 109, 10914 (1998)] using grand canonical Monte Carlo simulations and finite-size scaling. Critical properties estimated using GEMC simulations with different overall system densities (0.296 ≤ ρ_t ≤ 0.336) agree to within the statistical uncertainties. For simulations with tail corrections, data obtained using r_cut = 3.5σ yield T_c and p_c that are higher by 0.2% and 1.4% than simulations with r_cut = 5 and 8σ, but still with overlapping 95% confidence intervals. In contrast, GEMC simulations with a truncated and shifted potential show that r_cut = 8σ is insufficient to obtain accurate results. Additional GEMC simulations for hard-core square-well particles with various

  8. Multilevel parallel strategy on Monte Carlo particle transport for the large-scale full-core pin-by-pin simulations

    International Nuclear Information System (INIS)

    Zhang, B.; Li, G.; Wang, W.; Shangguan, D.; Deng, L.

    2015-01-01

    This paper introduces the strategy of multilevel hybrid parallelism of the JCOGIN infrastructure for Monte Carlo particle transport in large-scale full-core pin-by-pin simulations. Particle parallelism, domain-decomposition parallelism and MPI/OpenMP parallelism are designed and implemented. In testing, JMCT demonstrates the parallel scalability of JCOGIN, reaching a parallel efficiency of 80% on 120,000 cores for the pin-by-pin computation of the BEAVRS benchmark. (author)

  9. Study of the quantitative analysis approach of maintenance by the Monte Carlo simulation method

    International Nuclear Information System (INIS)

    Shimizu, Takashi

    2007-01-01

    This study examines the quantitative evaluation of the maintenance activities of a nuclear power plant by the Monte Carlo simulation method. The concept of the quantitative evaluation of maintenance developed by the Japan Society of Maintenology and the International Institute of Universality (IUU) is first summarized. A basic examination of the quantitative evaluation of maintenance was then carried out for a simple feed-water system by the Monte Carlo simulation method. (author)

  10. Computer simulations of equilibrium magnetization and microstructure in magnetic fluids

    Science.gov (United States)

    Rosa, A. P.; Abade, G. C.; Cunha, F. R.

    2017-09-01

    In this work, Monte Carlo and Brownian Dynamics simulations are developed to compute the equilibrium magnetization of a magnetic fluid under the action of a homogeneous applied magnetic field. The particles are free of inertia and modeled as hard spheres of equal diameter. Two different periodic boundary conditions are implemented: the minimum image method and the Ewald summation technique, replicating a finite number of particles throughout the suspension volume. A comparison of the equilibrium magnetization resulting from the minimum image approach and Ewald sums is performed using Monte Carlo simulations. The Monte Carlo simulations with minimum image and lattice sums are used to investigate suspension microstructure by computing the important radial pair-distribution function g0(r), which measures the probability density of finding a second particle at a distance r from a reference particle. This function provides relevant information on structure formation and its anisotropy through the suspension. The numerical results for g0(r) are compared with theoretical predictions based on quite a different approach in the absence of the field and dipole-dipole interactions. Very good quantitative agreement is found for a particle volume fraction of 0.15, providing a validation of the present simulations. In general, the investigated suspensions are dominated by structures like dimer and trimer chains, with trimers having a probability of forming an order of magnitude lower than that of dimers. Using Monte Carlo with lattice sums, the density distribution function g2(r) is also examined. Whenever this function is different from zero, it indicates structure anisotropy in the suspension. The dependence of the equilibrium magnetization on the applied field, the magnetic particle volume fraction, and the magnitude of the dipole-dipole magnetic interactions for both boundary conditions are explored in this work. Results show that at dilute regimes and with moderate dipole
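
    The minimum image convention compared against Ewald sums in the abstract, and the pair-distribution function it is used to compute, can be sketched as follows. This is a plain O(N²) histogram, valid for r < box/2 in a cubic box; the bin count and normalisation are standard textbook choices, not the authors':

    ```python
    import math

    def minimum_image(dx, box):
        """Fold one component of a separation into [-box/2, box/2)."""
        return dx - box * round(dx / box)

    def pair_distribution(pos, box, n_bins=50):
        """Radial pair-distribution function g(r) for positions in a cubic
        periodic box, using the minimum-image convention (valid for r < box/2)."""
        n = len(pos)
        r_max = box / 2.0
        dr = r_max / n_bins
        hist = [0] * n_bins
        for i in range(n):
            for j in range(i + 1, n):            # each pair counted once
                r2 = sum(minimum_image(pos[i][a] - pos[j][a], box) ** 2
                         for a in range(3))
                r = math.sqrt(r2)
                if r < r_max:
                    hist[int(r / dr)] += 1
        rho = n / box ** 3
        g = []
        for b in range(n_bins):
            shell = 4.0 / 3.0 * math.pi * (((b + 1) * dr) ** 3 - (b * dr) ** 3)
            g.append(hist[b] / (0.5 * n * rho * shell))  # normalise to ideal gas
        return g
    ```

    For an uncorrelated (ideal-gas) configuration g(r) ≈ 1 at all separations; chaining into dimers and trimers shows up as a peak at contact.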

  11. Random number generators for large-scale parallel Monte Carlo simulations on FPGA

    Science.gov (United States)

    Lin, Y.; Wang, F.; Liu, B.

    2018-05-01

    Through parallelization, field programmable gate arrays (FPGAs) can achieve unprecedented speeds in large-scale parallel Monte Carlo (LPMC) simulations. FPGAs present both new constraints and new opportunities for the implementation of random number generators (RNGs), which are key elements of any Monte Carlo (MC) simulation system. Using empirical and application-based tests, this study evaluates all four RNGs used in previous FPGA-based MC studies, together with newly proposed FPGA implementations of two well-known high-quality RNGs suitable for LPMC studies on FPGA. One of the newly proposed implementations, a parallel version of the additive lagged Fibonacci generator (Parallel ALFG), is found to be the best among the evaluated RNGs in fulfilling the needs of LPMC simulations on FPGA.
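The additive lagged Fibonacci generator evaluated in the record above follows the recurrence x_n = (x_{n-j} + x_{n-k}) mod 2^m. A minimal scalar sketch in Python (the lags (24, 55) and the LCG-filled lag table are common textbook choices, not the authors' FPGA parameters):

```python
class ALFG:
    """Additive lagged Fibonacci generator: x_n = (x_{n-24} + x_{n-55}) mod 2^32."""
    J, K, M = 24, 55, 2 ** 32

    def __init__(self, seed=1):
        # Fill the lag table with a simple linear congruential generator.
        state = seed & 0xFFFFFFFF
        self.buf = []
        for _ in range(self.K):
            state = (1664525 * state + 1013904223) % self.M
            self.buf.append(state)
        self.buf[0] |= 1   # at least one odd entry is needed for the full period
        self.i = 0         # index of the oldest entry, x_{n-K}

    def next_u32(self):
        j = (self.i - self.J) % self.K        # position of x_{n-J}
        x = (self.buf[j] + self.buf[self.i]) % self.M
        self.buf[self.i] = x                  # overwrite the oldest entry
        self.i = (self.i + 1) % self.K
        return x

    def random(self):
        """Uniform float in [0, 1)."""
        return self.next_u32() / self.M
```

A parallel FPGA version would instantiate many independent streams, for example by giving each stream its own lag table; that hardware-specific detail is omitted here.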

  12. DXRaySMCS: first user-friendly interface developed for prediction of diagnostic radiology X-ray spectra produced by Monte Carlo (MCNP-4C) simulation in Iran

    International Nuclear Information System (INIS)

    Bahreyni Toossi, M.T.; Zare, H.; Moradi Faradanbe, H.

    2008-01-01

    Accurate knowledge of the output energy spectrum of an x-ray tube is essential in many areas of radiological study. It forms the basis of almost all image-quality simulations and enables system designers to predict patient dose more accurately. Many radiological physics problems that can be solved by Monte Carlo simulation methods require an x-ray spectrum as input data. Computer simulation of x-ray spectra is one of the most important tools for investigating patient dose and image quality in diagnostic radiology systems. In this work, the general-purpose Monte Carlo N-Particle radiation transport computer code (MCNP-4C) was used to simulate x-ray spectra in diagnostic radiology; the electron's path in the target was followed until its energy fell below 10 keV. A user-friendly interface named 'Diagnostic X-ray Spectra by Monte Carlo Simulation (DXRaySMCS)' was developed to facilitate the application of the MCNP-4C code to diagnostic radiology spectrum prediction. The program provides a user-friendly interface for modifying the MCNP input file, launching the MCNP program to simulate electron and photon transport, and processing the MCNP output file to yield a summary of the results (relative photon number per energy bin). In this article, the development and characteristics of DXRaySMCS are outlined. As part of the validation process, output spectra for 46 diagnostic radiology system settings produced by DXRaySMCS were compared with the corresponding IPEM78 spectra. Generally, there is good agreement between the two sets of spectra: no statistically significant differences were observed between the IPEM78 reported spectra and the simulated spectra generated in this study. (author)

  13. Monte Carlo molecular simulation of phase-coexistence for oil production and processing

    KAUST Repository

    Li, Jun; Sun, Shuyu; Calo, Victor M.

    2011-01-01

    The Gibbs-NVT ensemble Monte Carlo method is used to simulate the liquid-vapor coexistence diagram, and the simulation results for methane agree well with experimental data over a wide range of temperatures. For systems with two components

  14. Monte Carlo simulations of the dose from imaging with GE eXplore 120 micro-CT using GATE

    Energy Technology Data Exchange (ETDEWEB)

    Bretin, Florian; Bahri, Mohamed Ali; Luxen, André; Phillips, Christophe; Plenevaux, Alain; Seret, Alain, E-mail: aseret@ulg.ac.be [Cyclotron Research Centre, University of Liège, Sart Tilman B30, Liège 4000 (Belgium)

    2015-10-15

    Purpose: Small animals are increasingly used as translational models in preclinical imaging studies involving microCT, during which the subjects can be exposed to large amounts of radiation. While the radiation levels are generally sublethal, studies have shown that low-level radiation can change physiological parameters in mice. In order to rule out any influence of radiation on the outcome of such experiments, or resulting deterministic effects in the subjects, the levels of radiation involved need to be addressed. The aim of this study was to investigate the radiation dose delivered by the GE eXplore 120 microCT non-invasively, using Monte Carlo simulations in GATE, and to compare the results with previously obtained experimental values. Methods: Tungsten X-ray spectra were simulated at 70, 80, and 97 kVp using an analytical tool, and their half-value layers were computed for spectrum validation against values measured experimentally on the physical X-ray tube. A Monte Carlo model of the microCT system was set up, and four protocols regularly applied to live-animal scanning were implemented. The computed tomography dose index (CTDI) inside a PMMA phantom was derived, and multiple field-of-view acquisitions were simulated using the PMMA phantom and representative mouse and rat models. Results: Simulated half-value layers agreed with experimentally obtained results within a 7% error window. The CTDI ranged from 20 to 56 mGy and closely matched experimental values. Derived organ doses in mice reached 459 mGy in bone and up to 200 mGy in soft-tissue organs using the highest-energy protocol. Dose levels in rats were lower owing to the larger mass of the animal compared with mice. The uncertainty of all dose simulations was below 14%. Conclusions: Monte Carlo simulation proved a valuable tool for investigating the 3D dose distribution in animals from microCT. Small animals, especially mice (due to their small volume), receive large amounts of radiation from the GE eXplore 120
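The CTDI quantities derived in the record above follow standard definitions: CTDI100 integrates the single-rotation axial dose profile over ±50 mm and divides by the nominal beam width, and the weighted CTDI combines the centre and mean peripheral values. A minimal sketch (the profile and beam width in the example are illustrative, not values from this study):

```python
import numpy as np

def ctdi100(z_mm, dose_mGy, beam_width_mm):
    """CTDI100 (mGy): single-rotation axial dose profile integrated over
    +/-50 mm along the rotation axis, divided by the nominal beam width n*T."""
    mask = np.abs(z_mm) <= 50.0
    z, d = z_mm[mask], dose_mGy[mask]
    integral = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(z))  # trapezoidal rule
    return integral / beam_width_mm

def ctdi_w(center, periphery):
    """Weighted CTDI: 1/3 centre + 2/3 mean peripheral CTDI100."""
    return center / 3.0 + 2.0 * periphery / 3.0
```

For instance, a centre CTDI100 of 10 mGy with a mean peripheral value of 13 mGy gives a weighted CTDI of 12 mGy.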

  15. Monte Carlo simulation of discrete γ-ray detectors

    International Nuclear Information System (INIS)

    Bakkali, A.; Tamda, N.; Parmentier, M.; Chavanelle, J.; Pousse, A.; Kastler, B.

    2005-01-01

    Needs in medical diagnosis, especially for early and reliable breast cancer detection, led us to consider developments in scintillation crystals and position-sensitive photomultiplier tubes (PSPMTs) in order to develop a high-resolution, medium-field γ-ray imaging device. The ideal detector for γ-rays, however, represents a compromise between many conflicting requirements. In order to optimize the different parameters involved in the detection process, we developed Monte Carlo simulation software. Its aim was to study the light distribution produced by a gamma photon interacting with a pixellated scintillation crystal coupled to a PSPMT array. Several crystal properties were taken into account, as well as the intrinsic response of the PSPMTs. Images obtained by simulation are compared with experimental results. The agreement between simulated and experimental results validates our simulation model

  16. Influence of modifications in the positioning of phantoms in the Monte Carlo computational simulation of the prostate brachytherapy

    International Nuclear Information System (INIS)

    Barbosa, Antonio Konrado de Santana; Vieira, Jose Wilson; Costa, Kleber Souza Silva; Lima, Fernando Roberto de Andrade

    2011-01-01

    Radiotherapy simulation procedures using Monte Carlo methods have proven increasingly important to the improvement of cancer-fighting strategies. Within this context, brachytherapy is one of the methods most used to ensure better quality of life when compared with other therapeutic modalities. These procedures are planned using sectional exams with the patient in the lying position. However, it is known that alteration of body posture after the procedure influences the localization of many organs. This study aimed to identify and measure the influence of such alterations in MC brachytherapy simulations. To do so, prostate brachytherapy with the Iodine-125 radionuclide was chosen as the model. Simulations were carried out with 10⁸ events using the EGSnrc code associated with the MASH phantom in the orthostatic and supine positions. Significant alterations were found, especially regarding the bladder, small intestine, and testicles. (author)

  17. Probabilistic approach to resource assessment in the Kerinci geothermal field using numerical simulation coupled with Monte Carlo simulation

    Science.gov (United States)

    Hidayat, Iki; Sutopo; Pratama, Heru Berian

    2017-12-01

    The Kerinci geothermal field is a single-phase liquid reservoir system in the Kerinci District, in the western part of Jambi Province. In this field, there are geothermal prospects identified by heat-source upflow inside a National Park area. The Kerinci field was planned by Pertamina Geothermal Energy for a 1×55 MWe development. To define the reservoir characterization, a numerical simulation of the Kerinci field was developed using the TOUGH2 software with information from the conceptual model. The pressure and temperature profiles measured in well KRC-B1 were validated against simulation data to reach the natural-state condition, yielding a good match. Based on the natural-state simulation, the resource of the Kerinci geothermal field is estimated using Monte Carlo simulation, with P10-P50-P90 results of 49.4 MW, 64.3 MW, and 82.4 MW, respectively. This paper is the first study in which the resource has been successfully estimated for the Kerinci geothermal field using numerical simulation coupled with Monte Carlo simulation.
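The P10-P50-P90 estimate reported in the record above comes from propagating input-parameter uncertainty through a Monte Carlo loop. A generic volumetric sketch follows; all distributions and ranges below are invented placeholders rather than Kerinci data, and Pxx here denotes the xx-th percentile, matching the ascending P10-P50-P90 ordering in the abstract:

```python
import numpy as np

def resource_assessment(n=100_000, seed=0):
    """Volumetric Monte Carlo capacity estimate; returns (P10, P50, P90) in MWe.

    All input ranges are illustrative placeholders, not field data.
    """
    rng = np.random.default_rng(seed)
    area_km2 = rng.triangular(5.0, 10.0, 15.0, n)        # prospect area
    thickness_km = rng.triangular(1.0, 1.5, 2.0, n)      # reservoir thickness
    mw_per_km3 = rng.triangular(5.0, 10.0, 15.0, n)      # power density (assumed)
    capacity = area_km2 * thickness_km * mw_per_km3      # MWe per realisation
    return tuple(np.percentile(capacity, [10, 50, 90]))
```

In a study such as the one above, the input distributions would instead be constrained by the calibrated natural-state reservoir model rather than chosen a priori.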

  18. Adjoint electron Monte Carlo calculations

    International Nuclear Information System (INIS)

    Jordan, T.M.

    1986-01-01

    Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment

  19. Electron Irradiation of Conjunctival Lymphoma: Monte Carlo Simulation of the Minute Dose Distribution and Technique Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Brualla, Lorenzo, E-mail: lorenzo.brualla@uni-due.de [NCTeam, Strahlenklinik, Universitaetsklinikum Essen, Essen (Germany); Zaragoza, Francisco J.; Sempau, Josep [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Barcelona (Spain); Wittig, Andrea [Department of Radiation Oncology, University Hospital Giessen and Marburg, Philipps-University Marburg, Marburg (Germany); Sauerwein, Wolfgang [NCTeam, Strahlenklinik, Universitaetsklinikum Essen, Essen (Germany)

    2012-07-15

    Purpose: External beam radiotherapy is the only conservative curative approach for Stage I non-Hodgkin lymphomas of the conjunctiva. The target volume is geometrically complex because it includes the eyeball and lid conjunctiva. Furthermore, the target volume is adjacent to radiosensitive structures, including the lens, lacrimal glands, cornea, retina, and papilla. Radiotherapy planning and optimization require accurate calculation of the dose in anatomical structures that are much smaller than those traditionally considered in radiotherapy. Neither conventional treatment planning systems nor dosimetric measurements can reliably determine the dose distribution in these small irradiated volumes. Methods and Materials: Monte Carlo simulations of a Varian Clinac 2100 C/D and of a human eye were performed using the PENELOPE and PENEASYLINAC codes. Dose distributions and dose-volume histograms were calculated for the bulbar conjunctiva, cornea, lens, retina, papilla, lacrimal gland, and anterior and posterior hemispheres. Results: The simulation results support selection of the most suitable treatment setup, which is an electron beam energy of 6 MeV with additional bolus and collimation by a cerrobend block with a central cylindrical hole of 3.0 cm diameter and a central cylindrical rod of 1.0 cm diameter. Conclusions: Monte Carlo simulation is a useful method for calculating the minute dose distribution in ocular tissue and for optimizing the electron irradiation technique in highly critical structures. Using a voxelized eye phantom based on patient computed tomography images, the dose distribution can be estimated with a standard statistical uncertainty of less than 2.4% in 3 min using a computing cluster with 30 cores, which makes this planning technique clinically relevant.

  20. Failure Probability Estimation of Wind Turbines by Enhanced Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Naess, Arvid

    2012-01-01

    This paper discusses the estimation of the failure probability of wind turbines required by codes of practice for their design. Standard Monte Carlo (SMC) simulation may conceptually be used for this purpose as an alternative to the popular Peaks-Over-Threshold (POT) method. However......, estimation of very low failure probabilities with SMC simulation leads to unacceptably high computational costs. In this study, an Enhanced Monte Carlo (EMC) method is proposed that overcomes this obstacle. The method has advantages over both POT and SMC in terms of its low computational cost and accuracy...... is controlled by the pitch controller. This provides a fair framework for comparison of the behavior and failure event of the wind turbine, with emphasis on the effect of the pitch controller. The Enhanced Monte Carlo method is then applied to the model, and the failure probabilities of the model are estimated...
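The computational-cost obstacle motivating the Enhanced Monte Carlo method in the record above can be made concrete with the standard estimator: the relative error of a crude Monte Carlo failure-probability estimate scales as sqrt((1-p)/(n·p)), so very small p demands enormous sample counts. A sketch of this SMC baseline (not the EMC algorithm itself):

```python
import numpy as np

def smc_failure_probability(limit_state, sampler, n, seed=0):
    """Crude Monte Carlo estimate of p_f = P(g(X) <= 0), with its coefficient
    of variation sqrt((1 - p) / (n * p)) as a measure of relative accuracy."""
    rng = np.random.default_rng(seed)
    g = limit_state(sampler(rng, n))          # limit-state values for n samples
    p = float(np.mean(g <= 0.0))              # fraction of failed samples
    cov = float(np.sqrt((1.0 - p) / (n * p))) if p > 0.0 else float("inf")
    return p, cov
```

For a target coefficient of variation δ one needs roughly n ≈ (1 − p)/(p·δ²) samples, i.e. on the order of 10⁹ for p = 10⁻⁷ at 10% accuracy, which is why EMC instead estimates failure probabilities at artificially lowered safety levels and extrapolates to the design level.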