WorldWideScience

Sample records for monte carlo-based program

  1. CARMEN: a Monte Carlo planning system based on linear programming from direct apertures; CARMEN: Un sistema de planificacion Monte Carlo basado en programacion lineal a partir de aberturas directas

    Energy Technology Data Exchange (ETDEWEB)

    Ureba, A.; Pereira-Barbeiro, A. R.; Jimenez-Ortega, E.; Baeza, J. A.; Salguero, F. J.; Leal, A.

    2013-07-01

    Monte Carlo (MC) calculation has been shown to improve the accuracy of dose computation compared with the analytical algorithms implemented in commercial treatment planning systems, especially in the non-standard situations typical of complex techniques such as IMRT and VMAT. Our treatment planning system, CARMEN, is based on full MC simulation of both the beam transport in the accelerator head and in the patient, and is designed for efficient operation in terms of both the accuracy of the estimate and the required computation times. (Author)

  2. Quantitative Monte Carlo-based holmium-166 SPECT reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Elschot, Mattijs; Smits, Maarten L. J.; Nijsen, Johannes F. W.; Lam, Marnix G. E. H.; Zonnenberg, Bernard A.; Bosch, Maurice A. A. J. van den; Jong, Hugo W. A. M. de [Department of Radiology and Nuclear Medicine, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands); Viergever, Max A. [Image Sciences Institute, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands)

    2013-11-15

    Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 ({sup 166}Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative {sup 166}Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum.Methods: A fast Monte Carlo (MC) simulator was developed for simulation of {sup 166}Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full {sup 166}Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (A{sup est}) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six {sup 166}Ho RE patients.Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80

  3. GPU-Monte Carlo based fast IMRT plan optimization

    Directory of Open Access Journals (Sweden)

    Yongbao Li

    2014-03-01

    Li Y, Shi F, Jiang S, Jia X. GPU-Monte Carlo based fast IMRT plan optimization. Int J Cancer Ther Oncol 2014; 2(2):020244. DOI: 10.14319/ijcto.0202.44

  4. Development of a Monte-Carlo based method for calculating the effect of stationary fluctuations

    DEFF Research Database (Denmark)

    Pettersen, E. E.; Demazire, C.; Jareteg, K.

    2015-01-01

    that corresponds to the real part of the neutron balance, and one that corresponds to the imaginary part. The two equivalent problems are in nature similar to two subcritical systems driven by external neutron sources, and can thus be treated as such in a Monte Carlo framework. The definition of these two...... of light water reactor conditions in an infinite lattice of fuel pins surrounded by water. The test case highlights flux gradients that are steeper in the Monte Carlo-based transport solution than in the diffusion-based solution. Compared to other Monte Carlo-based methods earlier proposed for carrying out...

  5. MCHITS: Monte Carlo based Method for Hyperlink Induced Topic Search on Networks

    Directory of Open Access Journals (Sweden)

    Zhaoyan Jin

    2013-10-01

    Full Text Available Hyperlink Induced Topic Search (HITS) is one of the most authoritative and widely used personalized ranking algorithms on networks. The HITS algorithm ranks nodes on a network by power iteration, which has a high computational complexity. This paper models the HITS algorithm with the Monte Carlo method and proposes Monte Carlo based algorithms for the HITS computation. Theoretical analysis and experiments show that Monte Carlo based approximate computation of the HITS ranking greatly reduces the required computing resources while maintaining high accuracy, and performs significantly better than related work.
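
    As a rough illustration of the general idea (a sketch, not the authors' MCHITS algorithm), hub and authority scores can be approximated by counting visits of short alternating forward/backward random walks instead of running full power iteration. The toy graph, walk length, and number of walks below are arbitrary choices.

```python
import random
from collections import defaultdict

def mc_hits(out_links, num_walks=100_000, walk_length=4, seed=0):
    """Monte Carlo approximation of HITS hub/authority scores.

    Instead of power iteration on A^T A, simulate short walks that alternate
    a forward step (hub endorses authority) and a backward step (pick another
    hub of that authority), and use normalized visit counts as scores.
    `out_links` maps node -> list of successor nodes.
    """
    rng = random.Random(seed)
    in_links = defaultdict(list)
    for u, targets in out_links.items():
        for v in targets:
            in_links[v].append(u)

    nodes = list(out_links.keys() | in_links.keys())
    hub, auth = defaultdict(int), defaultdict(int)

    for _ in range(num_walks):
        u = rng.choice(nodes)
        for _ in range(walk_length):
            if not out_links.get(u):            # dangling hub: stop this walk
                break
            v = rng.choice(out_links[u])        # forward step
            auth[v] += 1
            u = rng.choice(in_links[v])         # backward step
            hub[u] += 1

    h_tot = sum(hub.values()) or 1
    a_tot = sum(auth.values()) or 1
    return ({n: hub[n] / h_tot for n in nodes},
            {n: auth[n] / a_tot for n in nodes})

# toy link graph
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
hubs, auths = mc_hits(graph)
print(sorted(auths.items(), key=lambda kv: -kv[1]))
```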

  6. Monte Carlo based radial shield design of typical PWR reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gul, Anas; Khan, Rustam; Qureshi, M. Ayub; Azeem, Muhammad Waqar; Raza, S.A. [Pakistan Institute of Engineering and Applied Sciences, Islamabad (Pakistan). Dept. of Nuclear Engineering; Stummer, Thomas [Technische Univ. Wien (Austria). Atominst.

    2016-11-15

    Neutron and gamma flux and dose equivalent rate distributions are analysed in the radial shields of a typical PWR type reactor using the Monte Carlo radiation transport computer code MCNP5. The ENDF/B-VI continuous energy cross-section library has been employed for the criticality and shielding analysis. The computed results are in good agreement with the reference results (the maximum difference is less than 56 %). This implies that MCNP5 is a suitable tool for accurate prediction of neutron and gamma fluxes and dose rates in the radial shield around the core of PWR type reactors.

  7. A Monte Carlo-based model of gold nanoparticle radiosensitization

    Science.gov (United States)

    Lechtman, Eli Solomon

    The goal of radiotherapy is to operate within the therapeutic window - delivering doses of ionizing radiation to achieve locoregional tumour control, while minimizing normal tissue toxicity. A greater therapeutic ratio can be achieved by utilizing radiosensitizing agents designed to enhance the effects of radiation at the tumour. Gold nanoparticles (AuNP) represent a novel radiosensitizer with unique and attractive properties. AuNPs enhance local photon interactions, thereby converting photons into localized damaging electrons. Experimental reports of AuNP radiosensitization reveal this enhancement effect to be highly sensitive to irradiation source energy, cell line, and AuNP size, concentration and intracellular localization. This thesis explored the physics and some of the underlying mechanisms behind AuNP radiosensitization. A Monte Carlo simulation approach was developed to investigate the enhanced photoelectric absorption within AuNPs, and to characterize the escaping energy and range of the photoelectric products. Simulations revealed a 10³-fold increase in the rate of photoelectric absorption using low-energy brachytherapy sources compared to megavolt sources. For low-energy sources, AuNPs released electrons with ranges of only a few microns in the surrounding tissue. For higher energy sources, longer ranged photoelectric products travelled orders of magnitude farther. A novel radiobiological model called the AuNP radiosensitization predictive (ARP) model was developed based on the unique nanoscale energy deposition pattern around AuNPs. The ARP model incorporated detailed Monte Carlo simulations with experimentally determined parameters to predict AuNP radiosensitization. This model compared well to in vitro experiments involving two cancer cell lines (PC-3 and SK-BR-3), two AuNP sizes (5 and 30 nm) and two source energies (100 and 300 kVp). The ARP model was then used to explore the effects of AuNP intracellular localization using 1.9 and 100 nm Au

  8. Monte Carlo based treatment planning systems for Boron Neutron Capture Therapy in Petten, The Netherlands

    Science.gov (United States)

    Nievaart, V. A.; Daquino, G. G.; Moss, R. L.

    2007-06-01

    Boron Neutron Capture Therapy (BNCT) is a bimodal form of radiotherapy for the treatment of tumour lesions. Since the cancer cells in the treatment volume are targeted with 10B, a higher dose is given to these cancer cells due to the 10B(n,α)7Li reaction, in comparison with the surrounding healthy cells. In Petten (The Netherlands), at the High Flux Reactor, a specially tailored neutron beam has been designed and installed. Over 30 patients have been treated with BNCT in 2 clinical protocols: a phase I study for the treatment of glioblastoma multiforme and a phase II study on the treatment of malignant melanoma. Furthermore, activities concerning the extracorporeal treatment of metastases in the liver (from colorectal cancer) are in progress. The irradiation beam at the HFR contains both neutrons and gammas which, together with the complex geometries of both patient and beam set-up, demand very detailed treatment planning calculations. A well designed Treatment Planning System (TPS) should obey the following general scheme: (1) a pre-processing phase (CT and/or MRI scans to create the geometric solid model, cross-section files for neutrons and/or gammas); (2) calculations (3D radiation transport, estimation of neutron and gamma fluences, macroscopic and microscopic dose); (3) post-processing phase (displaying of the results, iso-doses and -fluences). Treatment planning in BNCT is performed making use of Monte Carlo codes incorporated in a framework, which also includes the pre- and post-processing phases. In particular, the glioblastoma multiforme protocol used BNCT_rtpe, while the melanoma metastases protocol uses NCTPlan. In addition, an ad hoc Positron Emission Tomography (PET) based treatment planning system (BDTPS) has been implemented in order to integrate the real macroscopic boron distribution obtained from PET scanning. BDTPS is patented and uses MCNP as the calculation engine. The precision obtained by the Monte Carlo based TPSs exploited at Petten

  9. Monte Carlo-based Noise Compensation in Coil Intensity Corrected Endorectal MRI

    CERN Document Server

    Lui, Dorothy; Haider, Masoom; Wong, Alexander

    2015-01-01

    Background: Prostate cancer is one of the most common forms of cancer found in males, making early diagnosis important. Magnetic resonance imaging (MRI) has been useful in visualizing and localizing tumor candidates, and with the use of endorectal coils (ERC), the signal-to-noise ratio (SNR) can be improved. The coils introduce intensity inhomogeneities, and the surface coil intensity correction built into MRI scanners is used to reduce these inhomogeneities. However, the correction typically performed at the MRI scanner level leads to noise amplification and noise level variations. Methods: In this study, we introduce a new Monte Carlo-based noise compensation approach for coil intensity corrected endorectal MRI which allows for effective noise compensation and preservation of details within the prostate. The approach accounts for the ERC SNR profile via a spatially-adaptive noise model for correcting non-stationary noise variations. Such a method is particularly useful for improving the image quality of coil i...

  10. New Monte Carlo-based method to evaluate fission fraction uncertainties for the reactor antineutrino experiment

    Science.gov (United States)

    Ma, X. B.; Qiu, R. M.; Chen, Y. X.

    2017-02-01

    Uncertainties regarding fission fractions are essential in understanding antineutrino flux predictions in reactor antineutrino experiments. A new Monte Carlo-based method to evaluate the covariance coefficients between isotopes is proposed. The covariance coefficients are found to vary with reactor burnup and may change from positive to negative because of balance effects in fissioning. For example, between 235U and 239Pu, the covariance coefficient changes from 0.15 to -0.13. Using the equation relating fission fraction and atomic density, consistent uncertainties in the fission fraction and covariance matrix were obtained. The antineutrino flux uncertainty is 0.55%, which does not vary with reactor burnup. The new value is about 8.3% smaller.
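
    A minimal sketch of the balance effect mentioned above: if the isotopic fission rates fluctuate independently but the fission fractions are constrained to sum to one, the sampled fractions of 235U and 239Pu come out negatively correlated. The rates and the 5% relative uncertainty below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical fission rates (arbitrary units) for 235U, 239Pu, 238U, 241Pu
# at one burnup step, each with an assumed 5% relative standard uncertainty.
mean_rates = np.array([0.55, 0.30, 0.10, 0.05])
samples = rng.normal(mean_rates, 0.05 * mean_rates, size=(100_000, 4))

# Fission fractions are the rates normalized to sum to one.
fractions = samples / samples.sum(axis=1, keepdims=True)

corr = np.corrcoef(fractions, rowvar=False)
print("corr(235U fraction, 239Pu fraction) =", round(corr[0, 1], 3))  # negative: balance effect
```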

  11. A Monte-Carlo based model of the AX-PET demonstrator and its experimental validation.

    Science.gov (United States)

    Solevi, P; Oliver, J F; Gillam, J E; Bolle, E; Casella, C; Chesi, E; De Leo, R; Dissertori, G; Fanti, V; Heller, M; Lai, M; Lustermann, W; Nappi, E; Pauss, F; Rudge, A; Ruotsalainen, U; Schinzel, D; Schneider, T; Séguinot, J; Stapnes, S; Weilhammer, P; Tuna, U; Joram, C; Rafecas, M

    2013-08-21

    AX-PET is a novel PET detector based on axially oriented crystals and orthogonal wavelength shifter (WLS) strips, both individually read out by silicon photo-multipliers. Its design decouples sensitivity and spatial resolution, by reducing the parallax error due to the layered arrangement of the crystals. Additionally the granularity of AX-PET enhances the capability to track photons within the detector yielding a large fraction of inter-crystal scatter events. These events, if properly processed, can be included in the reconstruction stage further increasing the sensitivity. Its unique features require dedicated Monte-Carlo simulations, enabling the development of the device, interpreting data and allowing the development of reconstruction codes. At the same time the non-conventional design of AX-PET poses several challenges to the simulation and modeling tasks, mostly related to the light transport and distribution within the crystals and WLS strips, as well as the electronics readout. In this work we present a hybrid simulation tool based on an analytical model and a Monte-Carlo based description of the AX-PET demonstrator. It was extensively validated against experimental data, providing excellent agreement.

  12. Dosimetric validation of a commercial Monte Carlo based IMRT planning system.

    Science.gov (United States)

    Grofsmid, Dennis; Dirkx, Maarten; Marijnissen, Hans; Woudstra, Evert; Heijmen, Ben

    2010-02-01

    Recently a commercial Monte Carlo based IMRT planning system (Monaco version 1.0.0) was released. In this study the dosimetric accuracy of this new planning system was validated. Absolute dose profiles, depth dose curves, and output factors calculated by Monaco were compared with measurements in a water phantom. Different static on-axis and off-axis fields were tested at various source-skin distances for 6, 10, and 18 MV photon beams. Four clinical IMRT plans were evaluated in a water phantom using a linear diode detector array and another six IMRT plans for different tumor sites in solid water using a 2D detector array. In order to evaluate the accuracy of the dose engine near tissue inhomogeneities absolute dose distributions were measured with Gafchromic EBT film in an inhomogeneous slab phantom. For an end-to-end test a four-field IMRT plan was applied to an anthropomorphic lung phantom with a simulated tumor peripherally located in the right lung. Gafchromic EBT film, placed in and around the tumor area, was used to evaluate the dose distribution. Generally, the measured and the calculated dose distributions agreed within 2% dose difference or 2 mm distance-to-agreement. But mainly at interfaces with bone, some larger dose differences could be observed. Based on the results of this study, the authors concluded that the dosimetric accuracy of Monaco is adequate for clinical introduction.
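
    The 2%/2 mm criterion quoted above is a gamma-type comparison; a generic one-dimensional version (a sketch, not the analysis software used in the study) is shown below, with toy "measured" and "calculated" profiles standing in for the water-phantom data.

```python
import numpy as np

def gamma_1d(x, d_ref, d_eval, dd=0.02, dta=2.0):
    """1D global gamma index: for each reference point, the minimum over all
    evaluated points of sqrt((dose difference / dd)^2 + (distance / dta)^2).
    dd is a fraction of the reference maximum; dta is in the units of x (mm)."""
    x, d_ref, d_eval = map(np.asarray, (x, d_ref, d_eval))
    dd_abs = dd * d_ref.max()
    return np.array([np.sqrt(((d_eval - di) / dd_abs) ** 2
                             + ((x - xi) / dta) ** 2).min()
                     for xi, di in zip(x, d_ref)])

# toy profiles: a flat-ish field vs. a 1% hotter, 0.5 mm shifted calculation
x = np.linspace(-50, 50, 201)                       # mm
measured = np.exp(-(x / 30.0) ** 8)
calculated = 1.01 * np.exp(-((x - 0.5) / 30.0) ** 8)

g = gamma_1d(x, measured, calculated)
print(f"gamma pass rate (gamma <= 1): {(g <= 1).mean() * 100:.1f} %")
```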

  13. A comprehensive revisit of the ρ meson with improved Monte-Carlo based QCD sum rules

    Science.gov (United States)

    Wang, Qi-Nan; Zhang, Zhu-Feng; Steele, T. G.; Jin, Hong-Ying; Huang, Zhuo-Ran

    2017-07-01

    We improve the Monte-Carlo based QCD sum rules by introducing the rigorous Hölder-inequality-determined sum rule window and a Breit-Wigner type parametrization for the phenomenological spectral function. In this improved sum rule analysis methodology, the sum rule analysis window can be determined without any assumptions on OPE convergence or the QCD continuum. Therefore, an unbiased prediction can be obtained for the phenomenological parameters (the hadronic mass and width etc.). We test the new approach in the ρ meson channel with re-examination and inclusion of α_s corrections to dimension-4 condensates in the OPE. We obtain results highly consistent with experimental values. We also discuss the possible extension of this method to some other channels. Supported by NSFC (11175153, 11205093, 11347020), the Open Foundation of the Most Important Subjects of Zhejiang Province, and the K. C. Wong Magna Fund in Ningbo University. TGS is supported by the Natural Sciences and Engineering Research Council of Canada (NSERC). Z. F. Zhang and Z. R. Huang are grateful to the University of Saskatchewan for its warm hospitality

  14. SU-E-T-761: TOMOMC, A Monte Carlo-Based Planning VerificationTool for Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chibani, O; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States)

    2015-06-15

    Purpose: Present a new Monte Carlo code (TOMOMC) to calculate 3D dose distributions for patients undergoing helical tomotherapy treatments. TOMOMC performs CT-based dose calculations using the actual dynamic variables of the machine (couch motion, gantry rotation, and MLC sequences). Methods: TOMOMC is based on the GEPTS (Gamma Electron and Positron Transport System) general-purpose Monte Carlo system (Chibani and Li, Med. Phys. 29, 2002, 835). First, beam models for the Hi-Art Tomotherapy machine were developed for the different beam widths (1, 2.5 and 5 cm). The beam model accounts for the exact geometry and composition of the different components of the linac head (target, primary collimator, jaws and MLCs). The beam models were benchmarked by comparing calculated PDDs and lateral/transversal dose profiles with ionization chamber measurements in water. See figures 1–3. The MLC model was tuned in such a way that the tongue-and-groove effect and inter-leaf and intra-leaf transmission are modeled correctly. See figure 4. Results: By simulating the exact patient anatomy and the actual treatment delivery conditions (couch motion, gantry rotation and MLC sinogram), TOMOMC is able to calculate the 3D patient dose distribution, which is in principle more accurate than the one from the treatment planning system (TPS) since it relies on the Monte Carlo method (gold standard). Dose volume parameters based on the Monte Carlo dose distribution can also be compared to those produced by the TPS. Attached figures show isodose lines for a H&N patient calculated by TOMOMC (transverse and sagittal views). Analysis of differences between TOMOMC and TPS is ongoing work for different anatomic sites. Conclusion: A new Monte Carlo code (TOMOMC) was developed for Tomotherapy patient-specific QA. The next step in this project is implementing GPU computing to speed up Monte Carlo simulation and make Monte Carlo-based treatment verification a practical solution.

  15. A Monte-Carlo based extension of the Meteor Orbit and Trajectory Software (MOTS) for computations of orbital elements

    Science.gov (United States)

    Albin, T.; Koschny, D.; Soja, R.; Srama, R.; Poppe, B.

    2016-01-01

    The Canary Islands Long-Baseline Observatory (CILBO) is a double station meteor camera system (Koschny et al., 2013; Koschny et al., 2014) that consists of 5 cameras. The two cameras considered in this report are ICC7 and ICC9, and are installed on Tenerife and La Palma. They point to the same atmospheric volume between both islands, allowing stereoscopic observation of meteors. Since its installation in 2011 and the start of operation in 2012, CILBO has detected over 15000 simultaneously observed meteors. Koschny and Diaz (2002) developed the Meteor Orbit and Trajectory Software (MOTS) to compute the trajectory of such meteors. The software uses the astrometric data from the detection software MetRec (Molau, 1998) and determines the trajectory in geodetic coordinates. This work presents a Monte-Carlo based extension of the MOTS code to compute the orbital elements of meteors simultaneously detected by CILBO.

  16. Monte Carlo based calibration of scintillation detectors for laboratory and in situ gamma ray measurements

    NARCIS (Netherlands)

    van der Graaf, E. R.; Limburg, J.; Koomans, R. L.; Tijs, M.

    2011-01-01

    The calibration of scintillation detectors for gamma radiation in a well characterized setup can be transferred to other geometries using Monte Carlo simulations to account for the differences between the calibration and the other geometry. In this study a calibration facility was used that is const

  17. Monte-Carlo based prediction of radiochromic film response for hadrontherapy dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Frisson, T. [Universite de Lyon, F-69622 Lyon (France); CREATIS-LRMN, INSA, Batiment Blaise Pascal, 7 avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France)], E-mail: frisson@creatis.insa-lyon.fr; Zahra, N. [Universite de Lyon, F-69622 Lyon (France); IPNL - CNRS/IN2P3 UMR 5822, Universite Lyon 1, Batiment Paul Dirac, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France); Lautesse, P. [Universite de Lyon, F-69622 Lyon (France); IPNL - CNRS/IN2P3 UMR 5822, Universite Lyon 1, Batiment Paul Dirac, 4 rue Enrico Fermi, F-69622 Villeurbanne Cedex (France); Sarrut, D. [Universite de Lyon, F-69622 Lyon (France); CREATIS-LRMN, INSA, Batiment Blaise Pascal, 7 avenue Jean Capelle, 69621 Villeurbanne Cedex (France); Centre Leon Berrard - 28 rue Laennec, F-69373 Lyon Cedex 08 (France)

    2009-07-21

    A model has been developed to calculate MD-55-V2 radiochromic film response to ion irradiation. This model is based on photon film response and film saturation by high local energy deposition computed by Monte-Carlo simulation. We have studied the response of the film to photon irradiation and we proposed a calculation method for hadron beams.

  18. Monte-Carlo based prediction of radiochromic film response for hadrontherapy dosimetry

    Science.gov (United States)

    Frisson, T.; Zahra, N.; Lautesse, P.; Sarrut, D.

    2009-07-01

    A model has been developed to calculate MD-55-V2 radiochromic film response to ion irradiation. This model is based on photon film response and film saturation by high local energy deposition computed by Monte-Carlo simulation. We have studied the response of the film to photon irradiation and we proposed a calculation method for hadron beams.

  19. Monte Carlo based calibration of scintillation detectors for laboratory and in situ gamma ray measurements

    NARCIS (Netherlands)

    van der Graaf, E. R.; Limburg, J.; Koomans, R. L.; Tijs, M.

    The calibration of scintillation detectors for gamma radiation in a well characterized setup can be transferred to other geometries using Monte Carlo simulations to account for the differences between the calibration and the other geometry. In this study a calibration facility was used that is

  20. Uncertainties in Monte Carlo-based absorbed dose calculations for an experimental benchmark.

    Science.gov (United States)

    Renner, F; Wulff, J; Kapsch, R-P; Zink, K

    2015-10-01

    There is a need to verify the accuracy of general purpose Monte Carlo codes like EGSnrc, which are commonly employed for investigations of dosimetric problems in radiation therapy. A number of experimental benchmarks have been published to compare calculated values of absorbed dose to experimentally determined values. However, there is a lack of absolute benchmarks, i.e. benchmarks without involved normalization which may cause some quantities to be cancelled. Therefore, at the Physikalisch-Technische Bundesanstalt a benchmark experiment was performed, which aimed at the absolute verification of radiation transport calculations for dosimetry in radiation therapy. A thimble-type ionization chamber in a solid phantom was irradiated by high-energy bremsstrahlung and the mean absorbed dose in the sensitive volume was measured per incident electron of the target. The characteristics of the accelerator and experimental setup were precisely determined and the results of a corresponding Monte Carlo simulation with EGSnrc are presented within this study. For a meaningful comparison, an analysis of the uncertainty of the Monte Carlo simulation is necessary. In this study uncertainties with regard to the simulation geometry, the radiation source, transport options of the Monte Carlo code and specific interaction cross sections are investigated, applying the general methodology of the Guide to the expression of uncertainty in measurement. Besides studying the general influence of changes in transport options of the EGSnrc code, uncertainties are analyzed by estimating the sensitivity coefficients of various input quantities in a first step. Secondly, standard uncertainties are assigned to each quantity which are known from the experiment, e.g. uncertainties for geometric dimensions. Data for more fundamental quantities such as photon cross sections and the I-value of electron stopping powers are taken from literature. The significant uncertainty contributions are identified as
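
    The sensitivity-coefficient bookkeeping described above can be sketched generically: perturb each input quantity by its standard uncertainty, re-evaluate the result, and combine the contributions in quadrature. The model function and all numbers below are placeholders, not the EGSnrc simulation or the actual uncertainty budget of the benchmark.

```python
import numpy as np

def dose_model(chamber_radius, beam_energy, wall_thickness):
    """Stand-in for 'absorbed dose per incident electron from a Monte Carlo run'
    (a made-up smooth function, used only to illustrate the bookkeeping)."""
    return 1.0e-15 * beam_energy**1.2 * np.exp(-0.3 * wall_thickness) / chamber_radius**0.1

nominal = dict(chamber_radius=3.05, beam_energy=25.0, wall_thickness=0.5)   # mm, MeV, mm
std_unc = dict(chamber_radius=0.01, beam_energy=0.10, wall_thickness=0.005)

d0 = dose_model(**nominal)
u2 = 0.0
for name, u in std_unc.items():
    shifted = dict(nominal, **{name: nominal[name] + u})
    c = (dose_model(**shifted) - d0) / u          # finite-difference sensitivity coefficient
    u2 += (c * u) ** 2                            # GUM quadrature combination
print(f"D = {d0:.3e} per electron, combined relative standard uncertainty {np.sqrt(u2) / d0:.2%}")
```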

  1. Using a Monte-Carlo-based approach to evaluate the uncertainty on fringe projection technique

    CERN Document Server

    Molimard, Jérôme

    2013-01-01

    A complete uncertainty analysis of a given fringe projection set-up has been performed using a Monte-Carlo approach. In particular, the calibration procedure is taken into account. Two applications are given: at a macroscopic scale, phase noise is predominant, whilst at a microscopic scale, both phase noise and calibration errors are important. Finally, the uncertainty found at the macroscopic scale is close to some experimental tests (~100 µm).

  2. A Monte Carlo-based radiation safety assessment for astronauts in an environment with confined magnetic field shielding.

    Science.gov (United States)

    Geng, Changran; Tang, Xiaobin; Gong, Chunhui; Guan, Fada; Johns, Jesse; Shu, Diyun; Chen, Da

    2015-12-01

    The active shielding technique has great potential for radiation protection in space exploration because it has the advantage of a significant mass saving compared with the passive shielding technique. This paper demonstrates a Monte Carlo-based approach to evaluating the shielding effectiveness of the active shielding technique using confined magnetic fields (CMFs). The International Commission on Radiological Protection reference anthropomorphic phantom, as well as the toroidal CMF, was modeled using the Monte Carlo toolkit Geant4. The penetrating primary particle fluence, organ-specific dose equivalent, and male effective dose were calculated for particles in galactic cosmic radiation (GCR) and solar particle events (SPEs). Results show that the SPE protons can be easily shielded against, even almost completely deflected, by the toroidal magnetic field. GCR particles can also be more effectively shielded against by increasing the magnetic field strength. Our results also show that the introduction of a structural Al wall in the CMF did not provide additional shielding for GCR; in fact it can weaken the total shielding effect of the CMF. This study demonstrated the feasibility of accurately determining the radiation field inside the environment and evaluating the organ dose equivalents for astronauts under active shielding using the CMF.

  3. Monte Carlo based geometrical model for efficiency calculation of an n-type HPGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Padilla Cabal, Fatima, E-mail: fpadilla@instec.c [Instituto Superior de Tecnologias y Ciencias Aplicadas, ' Quinta de los Molinos' Ave. Salvador Allende, esq. Luaces, Plaza de la Revolucion, Ciudad de la Habana, CP 10400 (Cuba); Lopez-Pino, Neivy; Luis Bernal-Castillo, Jose; Martinez-Palenzuela, Yisel; Aguilar-Mena, Jimmy; D' Alessandro, Katia; Arbelo, Yuniesky; Corrales, Yasser; Diaz, Oscar [Instituto Superior de Tecnologias y Ciencias Aplicadas, ' Quinta de los Molinos' Ave. Salvador Allende, esq. Luaces, Plaza de la Revolucion, Ciudad de la Habana, CP 10400 (Cuba)

    2010-12-15

    A procedure to optimize the geometrical model of an n-type detector is described. Sixteen lines from seven point sources ({sup 241}Am, {sup 133}Ba, {sup 22}Na, {sup 60}Co, {sup 57}Co, {sup 137}Cs and {sup 152}Eu) placed at three different source-to-detector distances (10, 20 and 30 cm) were used to calibrate a low-background gamma spectrometer between 26 and 1408 keV. Direct Monte Carlo techniques using the MCNPX 2.6 and GEANT 4 9.2 codes, and a semi-empirical procedure were performed to obtain theoretical efficiency curves. Since discrepancies were found between experimental and calculated data using the manufacturer's parameters of the detector, a detailed study of the crystal dimensions and the geometrical configuration was carried out. The relative deviation from experimental data decreases from a mean value of 18% to 4% after the parameters were optimized.

  4. A Monte Carlo-based treatment-planning tool for ion beam therapy

    CERN Document Server

    Böhlen, T T; Dosanjh, M; Ferrari, A; Haberer, T; Parodi, K; Patera, V; Mairan, A

    2013-01-01

    Ion beam therapy, as an emerging radiation therapy modality, requires continuous efforts to develop and improve tools for patient treatment planning (TP) and research applications. Dose and fluence computation algorithms using the Monte Carlo (MC) technique have served for decades as reference tools for accurate dose computations for radiotherapy. In this work, a novel MC-based treatment-planning (MCTP) tool for ion beam therapy using the pencil beam scanning technique is presented. It allows single-field and simultaneous multiple-fields optimization for realistic patient treatment conditions and for dosimetric quality assurance for irradiation conditions at state-of-the-art ion beam therapy facilities. It employs iterative procedures that allow for the optimization of absorbed dose and relative biological effectiveness (RBE)-weighted dose using radiobiological input tables generated by external RBE models. Using a re-implementation of the local effect model (LEM), the MCTP tool is able to perform TP studies u...

  5. Experimental validation of a rapid Monte Carlo based micro-CT simulator

    Science.gov (United States)

    Colijn, A. P.; Zbijewski, W.; Sasov, A.; Beekman, F. J.

    2004-09-01

    We describe a newly developed, accelerated Monte Carlo simulator of a small animal micro-CT scanner. Transmission measurements using aluminium slabs are employed to estimate the spectrum of the x-ray source. The simulator incorporating this spectrum is validated with micro-CT scans of physical water phantoms of various diameters, some containing stainless steel and Teflon rods. Good agreement is found between simulated and real data: normalized error of simulated projections, as compared to the real ones, is typically smaller than 0.05. Also the reconstructions obtained from simulated and real data are found to be similar. Thereafter, effects of scatter are studied using a voxelized software phantom representing a rat body. It is shown that the scatter fraction can reach tens of per cents in specific areas of the body and therefore scatter can significantly affect quantitative accuracy in small animal CT imaging.

  6. Monte Carlo based dosimetry for neutron capture therapy of brain tumors

    Science.gov (United States)

    Zaidi, Lilia; Belgaid, Mohamed; Khelifi, Rachid

    2016-11-01

    Boron Neutron Capture Therapy (BNCT) is a biologically targeted radiation therapy for cancer which combines neutron irradiation with a tumor-targeting agent labeled with boron-10, which has a high thermal neutron capture cross section. The tumor area is subjected to the neutron irradiation. After a thermal neutron capture, the excited 11B nucleus fissions into an alpha particle and a lithium recoil nucleus. The high Linear Energy Transfer (LET) emitted particles deposit their energy within a range of about 10 μm, which is of the same order as the cell diameter [1]; at the same time, other reactions due to neutron activation of body components are produced. In-phantom measurement of the physical dose distribution is very important for BNCT planning validation. Determination of the total absorbed dose requires complex calculations, which were carried out using the Monte Carlo MCNP code [2].

  7. Monte Carlo based statistical power analysis for mediation models: methods and software.

    Science.gov (United States)

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
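
    A stripped-down sketch of the approach (the paper's bmem package is in R; this is an illustrative Python analogue, not its implementation): each Monte Carlo replicate simulates data from a simple mediation model X -> M -> Y, bootstraps the indirect effect a*b, and power is the fraction of replicates whose percentile confidence interval excludes zero. The path coefficients, sample size, normal errors, and replication counts are assumptions kept small for speed.

```python
import numpy as np

rng = np.random.default_rng(1)

def mediation_power(a=0.3, b=0.3, n=100, n_rep=200, n_boot=300, alpha=0.05):
    """Monte Carlo power estimate for the indirect effect a*b (bootstrap CI test)."""
    hits = 0
    for _ in range(n_rep):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        idx = rng.integers(0, n, size=(n_boot, n))        # bootstrap resamples
        ab = np.empty(n_boot)
        for k in range(n_boot):
            xb, mb, yb = x[idx[k]], m[idx[k]], y[idx[k]]
            a_hat = np.polyfit(xb, mb, 1)[0]              # slope of M on X
            b_hat = np.linalg.lstsq(np.column_stack([mb, xb, np.ones(n)]),
                                    yb, rcond=None)[0][0]  # slope of Y on M, given X
            ab[k] = a_hat * b_hat
        lo, hi = np.percentile(ab, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        hits += (lo > 0) or (hi < 0)                      # CI excludes zero
    return hits / n_rep

print("estimated power:", mediation_power())
```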

  8. Monte Carlo based geometrical model for efficiency calculation of an n-type HPGe detector.

    Science.gov (United States)

    Cabal, Fatima Padilla; Lopez-Pino, Neivy; Bernal-Castillo, Jose Luis; Martinez-Palenzuela, Yisel; Aguilar-Mena, Jimmy; D'Alessandro, Katia; Arbelo, Yuniesky; Corrales, Yasser; Diaz, Oscar

    2010-12-01

    A procedure to optimize the geometrical model of an n-type detector is described. Sixteen lines from seven point sources ((241)Am, (133)Ba, (22)Na, (60)Co, (57)Co, (137)Cs and (152)Eu) placed at three different source-to-detector distances (10, 20 and 30 cm) were used to calibrate a low-background gamma spectrometer between 26 and 1408 keV. Direct Monte Carlo techniques using the MCNPX 2.6 and GEANT 4 9.2 codes, and a semi-empirical procedure were performed to obtain theoretical efficiency curves. Since discrepancies were found between experimental and calculated data using the manufacturer's parameters of the detector, a detailed study of the crystal dimensions and the geometrical configuration was carried out. The relative deviation from experimental data decreases from a mean value of 18% to 4% after the parameters were optimized.

  9. An approach to using conventional brachytherapy software for clinical treatment planning of complex, Monte Carlo-based brachytherapy dose distributions

    Energy Technology Data Exchange (ETDEWEB)

    Rivard, Mark J.; Melhus, Christopher S.; Granero, Domingo; Perez-Calatayud, Jose; Ballester, Facundo [Department of Radiation Oncology, Tufts University School of Medicine, Boston, Massachusetts 02111 (United States); Radiation Oncology Department, Physics Section, ' ' La Fe' ' University Hospital, Avenida Campanar 21, E-46009 Valencia (Spain); Department of Atomic, Molecular, and Nuclear Physics, University of Valencia, C/Dr. Moliner 50, E-46100 Burjassot, Spain and IFIC (University of Valencia-CSIC), C/Dr. Moliner 50, E-46100 Burjassot (Spain)

    2009-06-15

    dosimetry parameter data {<=}0.1 cm was required, and the virtual brachytherapy source data set included over 5000 data points. On the other hand, the lack of consideration for applicator heterogeneity effect caused conventional dose overestimates exceeding an order of magnitude in regions of clinical interest. This approach is rationalized by the improved dose estimates. In conclusion, a new technique was developed to incorporate complex Monte Carlo-based brachytherapy dose distributions into conventional TPS. These results are generalizable to other brachytherapy source types and other TPS.

  10. Monte Carlo based unit commitment procedures for the deregulated market environment

    Energy Technology Data Exchange (ETDEWEB)

    Granelli, G.P.; Marannino, P.; Montagna, M.; Zanellini, F. [Universita di Pavia, Pavia (Italy). Dipartimento di Ingegneria Elettrica

    2006-12-15

    The unit commitment problem, originally conceived in the framework of short term operation of vertically integrated utilities, needs a thorough re-examination in the light of the ongoing transition towards the open electricity market environment. In this work, the problem is re-formulated to adapt unit commitment to the viewpoint of a generation company (GENCO) which is no longer bound to satisfy its load, but is willing to maximize its profits. Moreover, with reference to the present day situation in many countries, the presence of a GENCO (the former monopolist) which is in the position of exerting market power requires a careful analysis to be carried out considering the different perspectives of a price taker and of a price maker GENCO. Unit commitment is thus shown to lead to a couple of distinct, yet slightly different problems. The unavoidable uncertainties in load profile and price behaviour over the time period of interest are also taken into account by means of a Monte Carlo simulation. Both the forecasted loads and prices are handled as random variables with a normal multivariate distribution. The correlation between the random input variables corresponding to successive hours of the day was considered by carrying out a statistical analysis of actual load and price data. The whole procedure was tested making use of reasonable approximations of the actual data of the thermal generation units available to some actual GENCOs operating in Italy. (author)
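
    The scenario-generation step described above (correlated multivariate-normal load and price profiles) can be sketched as follows; the hourly means, relative volatilities, the hour-to-hour correlation structure, and the toy profit rule are illustrative assumptions, not data from the Italian case study.

```python
import numpy as np

rng = np.random.default_rng(24)

hours = np.arange(24)
load_mean = 800 + 200 * np.sin(2 * np.pi * hours / 24)     # MW
price_mean = 40 + 15 * np.sin(2 * np.pi * hours / 24)      # EUR/MWh

# Correlation between successive hours modelled as rho^|i-j| (assumed).
rho = 0.9
corr = rho ** np.abs(np.subtract.outer(hours, hours))

def scenarios(mean, rel_sigma, n):
    """Draw n correlated daily profiles from a multivariate normal."""
    sigma = rel_sigma * mean
    return rng.multivariate_normal(mean, np.outer(sigma, sigma) * corr, size=n)

load = scenarios(load_mean, 0.05, 1000)
price = scenarios(price_mean, 0.10, 1000)

# Toy price-taker GENCO margin: 600 MW capacity, 25 EUR/MWh marginal cost.
profit = ((price - 25.0) * np.minimum(load, 600.0)).sum(axis=1)
print(f"expected daily profit: {profit.mean():,.0f} EUR, 5th percentile: {np.percentile(profit, 5):,.0f} EUR")
```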

  11. Monte Carlo based water/medium stopping-power ratios for various ICRP and ICRU tissues.

    Science.gov (United States)

    Fernández-Varea, José M; Carrasco, Pablo; Panettieri, Vanessa; Brualla, Lorenzo

    2007-11-07

    Water/medium stopping-power ratios, s(w,m), have been calculated for several ICRP and ICRU tissues, namely adipose tissue, brain, cortical bone, liver, lung (deflated and inflated) and spongiosa. The considered clinical beams were 6 and 18 MV x-rays and the field size was 10 x 10 cm(2). Fluence distributions were scored at a depth of 10 cm using the Monte Carlo code PENELOPE. The collision stopping powers for the studied tissues were evaluated employing the formalism of ICRU Report 37 (1984 Stopping Powers for Electrons and Positrons (Bethesda, MD: ICRU)). The Bragg-Gray values of s(w,m) calculated with these ingredients range from about 0.98 (adipose tissue) to nearly 1.14 (cortical bone), displaying a rather small variation with beam quality. Excellent agreement, to within 0.1%, is found with stopping-power ratios reported by Siebers et al (2000a Phys. Med. Biol. 45 983-95) for cortical bone, inflated lung and spongiosa. In the case of cortical bone, s(w,m) changes approximately 2% when either ICRP or ICRU compositions are adopted, whereas the stopping-power ratios of lung, brain and adipose tissue are less sensitive to the selected composition. The mass density of lung also influences the calculated values of s(w,m), reducing them by around 1% (6 MV) and 2% (18 MV) when going from deflated to inflated lung.

  12. A class of Monte-Carlo-based statistical algorithms for efficient detection of repolarization alternans.

    Science.gov (United States)

    Iravanian, Shahriar; Kanu, Uche B; Christini, David J

    2012-07-01

    Cardiac repolarization alternans is an electrophysiologic condition identified by a beat-to-beat fluctuation in action potential waveform. It has been mechanistically linked to instances of T-wave alternans, a clinically defined ECG alternation in T-wave morphology, and associated with the onset of cardiac reentry and sudden cardiac death. Many alternans detection algorithms have been proposed in the past, but the majority have been designed specifically for use with T-wave alternans. Action potential duration (APD) signals obtained from experiments (especially those derived from optical mapping) possess unique characteristics, which requires the development and use of a more appropriate alternans detection method. In this paper, we present a new class of algorithms, based on the Monte Carlo method, for the detection and quantitative measurement of alternans. Specifically, we derive a set of algorithms (one an analytical and more efficient version of the other) and compare its performance with the standard spectral method and the generalized likelihood ratio test algorithm using synthetic APD sequences and optical mapping data obtained from an alternans control experiment. We demonstrate the benefits of the new algorithm in the presence of Gaussian and Laplacian noise and frame-shift errors. The proposed algorithms are well suited for experimental applications, and furthermore, have low complexity and are implementable using fixed-point arithmetic, enabling potential use with implantable cardiac devices.
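
    A toy Monte Carlo surrogate test in the same spirit (not a reimplementation of the algorithms derived in the paper): the alternans statistic is the difference between even-beat and odd-beat mean APD, and its null distribution is built by randomly permuting the beat order.

```python
import numpy as np

rng = np.random.default_rng(0)

def alternans_p_value(apd, n_perm=10_000):
    """Monte Carlo permutation test for APD alternans."""
    apd = np.asarray(apd, float)
    stat = abs(apd[::2].mean() - apd[1::2].mean())
    null = np.empty(n_perm)
    for k in range(n_perm):
        p = rng.permutation(apd)
        null[k] = abs(p[::2].mean() - p[1::2].mean())
    return (null >= stat).mean()

# synthetic APD sequence (ms): 200 ms baseline, 1 ms alternans, 1.5 ms noise
beats = 200 + 1.0 * (-1.0) ** np.arange(64) + rng.normal(0, 1.5, 64)
print("p-value:", alternans_p_value(beats))
```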

  13. Markov chain Monte Carlo based analysis of post-translationally modified VDAC gating kinetics.

    Science.gov (United States)

    Tewari, Shivendra G; Zhou, Yifan; Otto, Bradley J; Dash, Ranjan K; Kwok, Wai-Meng; Beard, Daniel A

    2014-01-01

    The voltage-dependent anion channel (VDAC) is the main conduit for permeation of solutes (including nucleotides and metabolites) of up to 5 kDa across the mitochondrial outer membrane (MOM). Recent studies suggest that VDAC activity is regulated via post-translational modifications (PTMs). Yet the nature and effect of these modifications is not understood. Herein, single channel currents of wild-type, nitrosated, and phosphorylated VDAC are analyzed using a generalized continuous-time Markov chain Monte Carlo (MCMC) method. This developed method describes three distinct conducting states (open, half-open, and closed) of VDAC activity. Lipid bilayer experiments are also performed to record single VDAC activity under un-phosphorylated and phosphorylated conditions, and are analyzed using the developed stochastic search method. Experimental data show significant alteration in VDAC gating kinetics and conductance as a result of PTMs. The effect of PTMs on VDAC kinetics is captured in the parameters associated with the identified Markov model. Stationary distributions of the Markov model suggest that nitrosation of VDAC not only decreased its conductance but also significantly locked VDAC in a closed state. On the other hand, stationary distributions of the model associated with un-phosphorylated and phosphorylated VDAC suggest a reversal in channel conformation from relatively closed state to an open state. Model analyses of the nitrosated data suggest that faster reaction of nitric oxide with Cys-127 thiol group might be responsible for the biphasic effect of nitric oxide on basal VDAC conductance.
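
    As a minimal illustration of Markov-chain Monte Carlo applied to gating kinetics (far simpler than the generalized continuous-time model of the paper), the sketch below draws a posterior for a single closing rate from synthetic exponentially distributed open-dwell times using a random-walk Metropolis sampler; the true rate, the flat prior, and the proposal width are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic single-channel record: exponential open dwell times (closing rate 50 /s).
true_rate = 50.0
dwells = rng.exponential(1.0 / true_rate, size=500)

def log_post(rate):
    """Exponential likelihood with a flat prior on rate > 0 (assumed)."""
    if rate <= 0:
        return -np.inf
    return len(dwells) * np.log(rate) - rate * dwells.sum()

samples, rate = [], 10.0
for _ in range(20_000):
    proposal = rate + 2.0 * rng.normal()          # symmetric random-walk proposal
    if np.log(rng.random()) < log_post(proposal) - log_post(rate):
        rate = proposal                           # Metropolis accept
    samples.append(rate)

post = np.array(samples[5_000:])                  # discard burn-in
print(f"posterior mean closing rate: {post.mean():.1f} /s (true value {true_rate} /s)")
```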

  14. Iterative reconstruction using a Monte Carlo based system transfer matrix for dedicated breast positron emission tomography.

    Science.gov (United States)

    Saha, Krishnendu; Straus, Kenneth J; Chen, Yu; Glick, Stephen J

    2014-08-28

    To maximize sensitivity, it is desirable that ring Positron Emission Tomography (PET) systems dedicated for imaging the breast have a small bore. Unfortunately, due to parallax error this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo Simulation software was utilized to accurately model the system matrix for a breast PET system. A strategy for increasing the count statistics in the system matrix computation and for reducing the system element storage space was used by calculating only a subset of matrix elements and then estimating the rest of the elements by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at 45% reduced noise level and 1.5 to 3 times resolution performance improvement when compared to MLEM reconstruction using a simple line-integral model. The GATE based system matrix reconstruction technique promises to improve resolution and noise performance and reduce image distortion at FOV periphery compared to line-integral based system matrix reconstruction.
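
    The reconstruction step itself is ordinary MLEM with a precomputed system matrix; a small generic sketch is given below, with a random non-negative matrix standing in for the GATE-derived, block-circulant matrix of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def mlem(A, y, n_iter=50):
    """Basic MLEM iterations: x <- x * A^T(y / (A x)) / A^T 1."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                          # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ x
        ratio = np.where(proj > 0, y / proj, 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

A = rng.random((200, 50))                         # toy system matrix (bins x voxels)
x_true = rng.random(50)
y = rng.poisson(50 * (A @ x_true)) / 50.0         # noisy projection data
x_hat = mlem(A, y)
print("first voxels, reconstructed vs true:", np.round(x_hat[:5], 2), np.round(x_true[:5], 2))
```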

  15. Monte Carlo based verification of a beam model used in a treatment planning system

    Science.gov (United States)

    Wieslander, E.; Knöös, T.

    2008-02-01

    Modern treatment planning systems (TPSs) usually separate the dose modelling into a beam modelling phase, describing the beam exiting the accelerator, followed by a subsequent dose calculation in the patient. The aim of this work is to use the Monte Carlo code system EGSnrc to study the modelling of head scatter as well as the transmission through multi-leaf collimator (MLC) and diaphragms in the beam model used in a commercial TPS (MasterPlan, Nucletron B.V.). An Elekta Precise linear accelerator equipped with an MLC has been modelled in BEAMnrc, based on available information from the vendor regarding the material and geometry of the treatment head. The collimation in the MLC direction consists of leafs which are complemented with a backup diaphragm. The characteristics of the electron beam, i.e., energy and spot size, impinging on the target have been tuned to match measured data. Phase spaces from simulations of the treatment head are used to extract the scatter from, e.g., the flattening filter and the collimating structures. Similar data for the source models used in the TPS are extracted from the treatment planning system, thus a comprehensive analysis is possible. Simulations in a water phantom, with DOSXYZnrc, are also used to study the modelling of the MLC and the diaphragms by the TPS. The results from this study will be helpful to understand the limitations of the model in the TPS and provide knowledge for further improvements of the TPS source modelling.

  16. A Monte Carlo-based treatment planning tool for proton therapy

    Science.gov (United States)

    Mairani, A.; Böhlen, T. T.; Schiavi, A.; Tessonnier, T.; Molinelli, S.; Brons, S.; Battistoni, G.; Parodi, K.; Patera, V.

    2013-04-01

    In the field of radiotherapy, Monte Carlo (MC) particle transport calculations are recognized for their superior accuracy in predicting dose and fluence distributions in patient geometries compared to analytical algorithms which are generally used for treatment planning due to their shorter execution times. In this work, a newly developed MC-based treatment planning (MCTP) tool for proton therapy is proposed to support treatment planning studies and research applications. It allows for single-field and simultaneous multiple-field optimization in realistic treatment scenarios and is based on the MC code FLUKA. Relative biological effectiveness (RBE)-weighted dose is optimized either with the common approach using a constant RBE of 1.1 or using a variable RBE according to radiobiological input tables. A validated reimplementation of the local effect model was used in this work to generate radiobiological input tables. Examples of treatment plans in water phantoms and in patient-CT geometries together with an experimental dosimetric validation of the plans are presented for clinical treatment parameters as used at the Italian National Center for Oncological Hadron Therapy. To conclude, a versatile MCTP tool for proton therapy was developed and validated for realistic patient treatment scenarios against dosimetric measurements and commercial analytical TP calculations. It is aimed to be used in future for research and to support treatment planning at state-of-the-art ion beam therapy facilities.

  17. Iterative reconstruction using a Monte Carlo based system transfer matrix for dedicated breast positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Saha, Krishnendu [Ohio Medical Physics Consulting, Dublin, Ohio 43017 (United States); Straus, Kenneth J.; Glick, Stephen J. [Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts 01655 (United States); Chen, Yu. [Department of Radiation Oncology, Columbia University, New York, New York 10032 (United States)

    2014-08-28

    To maximize sensitivity, it is desirable that ring Positron Emission Tomography (PET) systems dedicated for imaging the breast have a small bore. Unfortunately, due to parallax error this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo Simulation software was utilized to accurately model the system matrix for a breast PET system. A strategy for increasing the count statistics in the system matrix computation and for reducing the system element storage space was used by calculating only a subset of matrix elements and then estimating the rest of the elements by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at 45% reduced noise level and 1.5 to 3 times resolution performance improvement when compared to MLEM reconstruction using a simple line-integral model. The GATE based system matrix reconstruction technique promises to improve resolution and noise performance and reduce image distortion at FOV periphery compared to line-integral based system matrix reconstruction.

  18. Quality assessment of Monte Carlo based system response matrices in PET

    Energy Technology Data Exchange (ETDEWEB)

    Cabello, J.; Gillam, J.E. [Valencia Univ. (Spain). Inst. de Fisica Corpuscular; Rafecas, M. [Valencia Univ. (Spain). Inst. de Fisica Corpuscular; Valencia Univ. (Spain). Dept. de Fisica Atomica, Molecular y Nuclear

    2011-07-01

    Iterative methods are currently accepted as the gold standard image reconstruction methods in nuclear medicine. The quality of the final reconstructed image greatly depends on how well physical processes are modelled in the System-Response-Matrix (SRM). The SRM can be obtained using experimental measurements, or calculated using Monte-Carlo (MC) or analytical methods. Nevertheless, independently of the method, the SRM is always contaminated by a certain level of error. MC based methods have recently gained popularity in the calculation of the SRM due to the significant increase in computer power exhibited by regular commercial computers. MC methods can produce high accuracy results, but are subject to statistical noise, which affects the precision of the results. By increasing the number of annihilations simulated, the level of noise observed in the SRM decreases, at the additional cost of increased simulation time and increased file size necessary to store the SRM. The latter also has a negative impact on reconstruction time. A study on the noise of the SRM has been performed from a spatial point of view, identifying specific regions subject to higher levels of noise. This study will enable the calculation of SRMs with different levels of statistics depending on the spatial location. A quantitative comparison of images, reconstructed using different SRM realizations, with similar and different levels of statistical quality, has been presented. (orig.)

  19. Monte Carlo based investigation of Berry phase for depth resolved characterization of biomedical scattering samples

    Energy Technology Data Exchange (ETDEWEB)

    Baba, Justin S [ORNL; John, Dwayne O [ORNL; Koju, Vijay [ORNL

    2015-01-01

    The propagation of light in turbid media is an active area of research with relevance to numerous investigational fields, e.g., biomedical diagnostics and therapeutics. The statistical random-walk nature of photon propagation through turbid media is ideal for computational modeling and simulation. Ready access to supercomputing resources provides a means for attaining brute force solutions to stochastic light-matter interactions entailing scattering, by facilitating timely propagation of sufficient (>10 million) photons while tracking characteristic parameters based on the incorporated physics of the problem. One such model that works well for isotropic but fails for anisotropic scatter, which is the case for many biomedical sample scattering problems, is the diffusion approximation. In this report, we address this by utilizing Berry phase (BP) evolution as a means for capturing anisotropic scattering characteristics of samples in the preceding depth where the diffusion approximation fails. We extend the polarization sensitive Monte Carlo method of Ramella-Roman et al. [1] to include the computationally intensive tracking of photon trajectory in addition to polarization state at every scattering event. To speed up the computations, which entail the appropriate rotations of reference frames, the code was parallelized using OpenMP. The results presented reveal that BP is strongly correlated to the photon penetration depth, thus potentiating the possibility of polarimetric depth-resolved characterization of highly scattering samples, e.g., biological tissues.
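
    Leaving aside polarization and Berry-phase tracking, the random-walk engine referred to above can be sketched as follows: photons take exponential steps in a semi-infinite medium, scatter according to a Henyey-Greenstein phase function, and the maximum penetration depth of each photon is recorded. The optical coefficients and anisotropy factor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

def rotate(d, ct, st, phi):
    """Standard direction update: deflect unit vector d by polar angle
    (cos = ct, sin = st) with azimuth phi."""
    ux, uy, uz = d
    if abs(uz) > 0.99999:                         # nearly along z: degenerate case
        return np.array([st * np.cos(phi), st * np.sin(phi), ct * np.sign(uz)])
    den = np.sqrt(1.0 - uz ** 2)
    return np.array([
        st * (ux * uz * np.cos(phi) - uy * np.sin(phi)) / den + ux * ct,
        st * (uy * uz * np.cos(phi) + ux * np.sin(phi)) / den + uy * ct,
        -st * np.cos(phi) * den + uz * ct,
    ])

def max_depths(n_photons=2000, mu_s=10.0, mu_a=0.1, g=0.9):
    """Maximum depth (mm) reached by each photon in a semi-infinite slab (z > 0);
    mu_s, mu_a in 1/mm and anisotropy g are assumed values."""
    mu_t = mu_s + mu_a
    out = np.empty(n_photons)
    for i in range(n_photons):
        pos, d, zmax = np.zeros(3), np.array([0.0, 0.0, 1.0]), 0.0
        while True:
            pos = pos + rng.exponential(1.0 / mu_t) * d
            zmax = max(zmax, pos[2])
            if pos[2] < 0.0 or rng.random() < mu_a / mu_t:   # escaped or absorbed
                break
            xi = rng.random()                                 # Henyey-Greenstein deflection
            ct = (1 + g ** 2 - ((1 - g ** 2) / (1 - g + 2 * g * xi)) ** 2) / (2 * g)
            d = rotate(d, ct, np.sqrt(max(0.0, 1 - ct ** 2)), 2 * np.pi * rng.random())
        out[i] = zmax
    return out

depths = max_depths()
print(f"mean maximum depth: {depths.mean():.2f} mm, 90th percentile: {np.percentile(depths, 90):.2f} mm")
```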

  20. Monte Carlo-based revised values of dose rate constants at discrete photon energies

    Directory of Open Access Journals (Sweden)

    T Palani Selvam

    2014-01-01

    Full Text Available Absorbed dose rate to water at 0.2 cm and 1 cm due to a point isotropic photon source as a function of photon energy is calculated using the EDKnrc user-code of the EGSnrc Monte Carlo system. This code system utilizes the widely used XCOM photon cross-section dataset for the calculation of absorbed dose to water. Using the above dose rates, dose rate constants are calculated. The air-kerma strength S_k needed for deriving the dose rate constant is based on the mass-energy absorption coefficient compilations of Hubbell and Seltzer published in the year 1995. A comparison of absorbed dose rates in water at the above distances to the published values reflects the differences in the photon cross-section dataset in the low-energy region (the difference is up to 2% in dose rate values at 1 cm in the energy range 30-50 keV and up to 4% at 0.2 cm at 30 keV). A maximum difference of about 8% is observed in the dose rate value at 0.2 cm at 1.75 MeV when compared to the published value. S_k calculations based on the compilation of Hubbell and Seltzer show a difference of up to 2.5% in the low-energy region (20-50 keV) when compared to the published values. The deviations observed in the values of dose rate and S_k affect the values of dose rate constants by up to 3%.

  1. An exercise in model validation: Comparing univariate statistics and Monte Carlo-based multivariate statistics

    Energy Technology Data Exchange (ETDEWEB)

    Weathers, J.B. [Shock, Noise, and Vibration Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: James.Weathers@ngc.com; Luck, R. [Department of Mechanical Engineering, Mississippi State University, 210 Carpenter Engineering Building, P.O. Box ME, Mississippi State, MS 39762-5925 (United States)], E-mail: Luck@me.msstate.edu; Weathers, J.W. [Structural Analysis Group, Northrop Grumman Shipbuilding, P.O. Box 149, Pascagoula, MS 39568 (United States)], E-mail: Jeffrey.Weathers@ngc.com

    2009-11-15

    The complexity of mathematical models used by practicing engineers is increasing due to the growing availability of sophisticated mathematical modeling tools and ever-improving computational power. For this reason, the need to define a well-structured process for validating these models against experimental results has become a pressing issue in the engineering community. This validation process is partially characterized by the uncertainties associated with the modeling effort as well as the experimental results. The net impact of the uncertainties on the validation effort is assessed through the 'noise level of the validation procedure', which can be defined as an estimate of the 95% confidence uncertainty bounds for the comparison error between actual experimental results and model-based predictions of the same quantities of interest. Although general descriptions associated with the construction of the noise level using multivariate statistics exist in the literature, a detailed procedure outlining how to account for the systematic and random uncertainties is not available. In this paper, the methodology used to derive the covariance matrix associated with the multivariate normal probability density function based on random and systematic uncertainties is examined, and a procedure used to estimate this covariance matrix using Monte Carlo analysis is presented. The covariance matrices are then used to construct approximate 95% confidence constant probability contours associated with comparison error results for a practical example. In addition, the example is used to show the drawbacks of using a first-order sensitivity analysis when nonlinear local sensitivity coefficients exist. Finally, the example is used to show the connection between the noise level of the validation exercise calculated using multivariate and univariate statistics.
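
    As an illustration of the Monte Carlo covariance estimation step described above, the sketch below combines an assumed shared systematic error with independent random errors for a few quantities of interest and estimates the covariance matrix of the comparison error from the samples. The uncertainty magnitudes and the simple additive error model are illustrative assumptions, not the procedure of the paper.

```python
# Hedged sketch of the general idea: estimate the covariance matrix of the
# comparison error E = experiment - model by Monte Carlo, combining assumed
# random (independent) and systematic (shared) uncertainty sources.
import numpy as np

rng = np.random.default_rng(1)
n_quantities, n_samples = 3, 100_000

sigma_random = np.array([0.5, 0.4, 0.6])     # assumed random std. dev. per quantity
sigma_systematic = 0.3                       # assumed systematic std. dev., shared

# One shared systematic draw per sample, independent random draws per quantity
systematic = rng.normal(0.0, sigma_systematic, size=(n_samples, 1))
random_err = rng.normal(0.0, sigma_random, size=(n_samples, n_quantities))
comparison_error = systematic + random_err

cov = np.cov(comparison_error, rowvar=False)  # MC estimate of the covariance matrix
print(cov)
```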

  2. Monte-Carlo based Uncertainty Analysis For CO2 Laser Microchanneling Model

    Science.gov (United States)

    Prakash, Shashi; Kumar, Nitish; Kumar, Subrata

    2016-09-01

    CO2 laser microchanneling has emerged as a potential technique for the fabrication of microfluidic devices on PMMA (poly-methyl-methacrylate). PMMA directly vaporizes when subjected to a high-intensity focused CO2 laser beam. This process results in a clean cut and an acceptable surface finish on the microchannel walls. Overall, the CO2 laser microchanneling process is cost effective and easy to implement. While fabricating microchannels on PMMA using a CO2 laser, the maximum depth of the fabricated microchannel is the key feature. A few analytical models are available to predict the maximum depth of the microchannels and the cut channel profile on a PMMA substrate using a CO2 laser. These models depend upon the values of the thermophysical properties of PMMA and the laser beam parameters. There are a number of variants of transparent PMMA available in the market with different values of thermophysical properties. Therefore, for applying such analytical models, the values of these thermophysical properties need to be known exactly. Although the values of the laser beam parameters are readily available, extensive experiments are required to determine the values of the thermophysical properties of PMMA. The unavailability of exact values of these property parameters restricts proper control over the microchannel dimensions for a given power and scanning speed of the laser beam. In order to have dimensional control over the maximum depth of fabricated microchannels, it is necessary to have an idea of the uncertainty associated with the predicted microchannel depth. In this research work, the uncertainty associated with the maximum depth dimension has been determined using the Monte Carlo method (MCM). The propagation of uncertainty with different powers and scanning speeds has been predicted. The relative impact of each thermophysical property has been determined using sensitivity analysis.
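
    The following is a hedged sketch of Monte Carlo uncertainty propagation through a simplified energy-balance depth model, d = P / (rho (c ΔT + Lv) v w), in which the thermophysical properties are sampled from assumed distributions while the beam parameters are held fixed. The model form, property values, and spreads are illustrative stand-ins, not the analytical model or PMMA reference data used in the study.

```python
# Hedged sketch of Monte Carlo uncertainty propagation through a simplified
# energy-balance depth model d = P / (rho * (c*dT + Lv) * v * w).
# All property values and their spreads are illustrative, not PMMA reference data.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

P = 3.0           # laser power (W), assumed known
v = 0.02          # scanning speed (m/s), assumed known
w = 200e-6        # beam width (m), assumed known

# thermophysical properties sampled with assumed uncertainties
rho = rng.normal(1180.0, 30.0, n)      # density (kg/m^3)
c = rng.normal(1466.0, 70.0, n)        # specific heat (J/kg/K)
Lv = rng.normal(1.0e6, 0.1e6, n)       # heat of vaporization (J/kg)
dT = rng.normal(340.0, 15.0, n)        # vaporization minus ambient temperature (K)

depth = P / (rho * (c * dT + Lv) * v * w)   # metres
print("depth = %.1f um +/- %.1f um (1 sigma)"
      % (depth.mean() * 1e6, depth.std() * 1e6))
```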

  3. Monte Carlo based protocol for cell survival and tumour control probability in BNCT

    Science.gov (United States)

    Ye, Sung-Joon

    1999-02-01

    A mathematical model to calculate the theoretical cell survival probability (nominally, the cell survival fraction) is developed to evaluate preclinical treatment conditions for boron neutron capture therapy (BNCT). A treatment condition is characterized by the neutron beam spectra, single or bilateral exposure, and the choice of boron carrier drug (boronophenylalanine (BPA) or boron sulfhydryl hydride (BSH)). The cell survival probability defined from Poisson statistics is expressed with the cell-killing yield, the (n,α) reaction density, and the tolerable neutron fluence. The radiation transport calculation from the neutron source to tumours is carried out using Monte Carlo methods: (i) reactor-based BNCT facility modelling to yield the neutron beam library at an irradiation port; (ii) dosimetry to limit the neutron fluence below a tolerance dose (10.5 Gy-Eq); (iii) calculation of the (n,α) reaction density in tumours. A shallow surface tumour could be effectively treated by single exposure producing an average cell survival probability of - for probable ranges of the cell-killing yield for the two drugs, while a deep tumour will require bilateral exposure to achieve comparable cell kills at depth. With very pure epithermal beams eliminating thermal, low-epithermal and fast neutrons, the cell survival can be decreased by factors of 2-10 compared with the unmodified neutron spectrum. A dominant effect of the cell-killing yield on tumour cell survival demonstrates the importance of the choice of boron carrier drug. However, these calculations do not indicate an unambiguous preference for one drug, due to the large overlap of tumour cell survival in the probable ranges of the cell-killing yield for the two drugs. The cell survival value averaged over a bulky tumour volume is used to predict the overall BNCT therapeutic efficacy, using a simple model of tumour control probability (TCP).

  4. Effect of statistical fluctuation in Monte Carlo based photon beam dose calculation on gamma index evaluation.

    Science.gov (United States)

    Graves, Yan Jiang; Jia, Xun; Jiang, Steve B

    2013-03-21

    The γ-index test has been commonly adopted to quantify the degree of agreement between a reference dose distribution and an evaluation dose distribution. Monte Carlo (MC) simulation has been widely used for radiotherapy dose calculation for both clinical and research purposes. The goal of this work is to investigate, both theoretically and experimentally, the impact of the MC statistical fluctuation on the γ-index test when the fluctuation exists in the reference, the evaluation, or both dose distributions. To first-order approximation, we theoretically demonstrated in a simplified model that the statistical fluctuation tends to overestimate γ-index values when it exists in the reference dose distribution and to underestimate γ-index values when it exists in the evaluation dose distribution, provided the original γ-index is relatively large compared with the statistical fluctuation. Our numerical experiments using realistic clinical photon radiation therapy cases have shown that (1) when performing a γ-index test between an MC reference dose and a non-MC evaluation dose, the average γ-index is overestimated and the γ passing rate decreases with increasing statistical noise level in the reference dose; (2) when performing a γ-index test between a non-MC reference dose and an MC evaluation dose, the average γ-index is underestimated within the clinically relevant range and the γ passing rate increases with increasing statistical noise level in the evaluation dose; (3) when performing a γ-index test between an MC reference dose and an MC evaluation dose, the γ passing rate is overestimated due to the statistical noise in the evaluation dose and underestimated due to the statistical noise in the reference dose. We conclude that the γ-index test should be used with caution when comparing dose distributions computed with MC simulation.
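
    To make the role of statistical noise concrete, the sketch below implements a one-dimensional global γ-index test (3%/3 mm) between a smooth toy reference profile and a noisy evaluation profile. The profiles, noise level, and criteria are illustrative; the study itself uses clinical 3D dose distributions.

```python
# Hedged 1D sketch of the gamma-index test used to illustrate how statistical noise
# added to either distribution shifts the computed gamma values; all inputs are toy data.
import numpy as np

def gamma_index_1d(x, dose_ref, dose_eval, dta=3.0, dd_frac=0.03):
    """Global gamma: dose criterion taken as a fraction of the reference maximum."""
    dd = dd_frac * dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dta) ** 2        # distance-to-agreement term
        dose2 = ((dose_eval - di) / dd) ** 2  # dose-difference term
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

x = np.linspace(0.0, 100.0, 501)                       # positions in mm
ref = 100.0 * np.exp(-((x - 50.0) / 20.0) ** 2)        # toy reference profile
rng = np.random.default_rng(3)
evl = ref + rng.normal(0.0, 1.0, x.size)               # evaluation profile with noise

g = gamma_index_1d(x, ref, evl)
print("passing rate:", np.mean(g <= 1.0))
```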

  5. Monte Carlo based NMR simulations of open fractures in porous media

    Science.gov (United States)

    Lukács, Tamás; Balázs, László

    2014-05-01

    According to the basic principles of nuclear magnetic resonance (NMR), a measurement's free induction decay curve has an exponential characteristic and its parameter is the transversal relaxation time, T2, given by the Bloch equations in the rotating frame. In our simulations we observe the particular case where the bulk's volume is negligible relative to the whole system and the vertical movement is essentially zero, hence the diffusion part of the T2 relation can be omitted. These small-aperture situations are common in sedimentary layers, and the smallness of the observed volume enables us to calculate with just the bulk relaxation and the surface relaxation. The simulation uses the Monte Carlo method, so it is based on a random-walk generator which produces the Brownian motion of the particles from uniformly distributed, pseudorandomly generated numbers. An attached differential equation accounts for the bulk relaxation, and the initial and iterated conditions guarantee the simulation's replicability and enable consistent estimations. We generate an initial geometry of a plane-parallel segment with known height and a given number of particles; the spatial distribution is set equal for each simulation, and the surface-to-volume ratio remains constant. It follows that, for a given thickness of the open fracture, the surface relaxivity is determinable from the fitted curve's parameter. The calculated T2 distribution curves also indicate the variability in the observed fracture situations. Varying the height of the lamina at a constant diffusion coefficient also produces a characteristic anomaly, and for comparison we have run the simulation with the same initial volume, number of particles and conditions in spherical bulks, whose profiles are clear and easy to understand. The surface relaxation enables us to estimate the interaction between the materials at the boundary for these two geometrically well-defined bulks, therefore the distribution takes as a
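
    A minimal sketch of the kind of random-walk relaxation simulation described above is given below: walkers diffuse between the two walls of a plane-parallel fracture, bulk relaxation acts at every time step, and walkers are relaxed with a fixed probability at each wall encounter as a stand-in for surface relaxivity. All numerical values are illustrative assumptions.

```python
# Hedged sketch of a random-walk NMR relaxation simulation in a plane-parallel
# fracture of aperture h; an effective T2 is recovered from a crude exponential fit.
import numpy as np

rng = np.random.default_rng(4)
n_walkers, n_steps = 20_000, 2_000
h = 50e-6            # fracture aperture (m)
D = 2.3e-9           # water self-diffusion coefficient (m^2/s)
dt = 1e-4            # time step (s)
T2_bulk = 2.0        # bulk relaxation time (s)
delta = 0.05         # killing probability per wall collision (surface relaxivity proxy)

z = rng.uniform(0.0, h, n_walkers)
alive = np.ones(n_walkers, dtype=bool)
step = np.sqrt(2.0 * D * dt)
signal = np.empty(n_steps)

for k in range(n_steps):
    z[alive] += rng.normal(0.0, step, alive.sum())      # 1-D Brownian displacement
    hit = alive & ((z < 0.0) | (z > h))                 # walkers that reached a wall
    z = np.clip(z, 0.0, h)                              # put them back onto the wall
    killed = hit & (rng.random(n_walkers) < delta)      # surface relaxation
    alive &= ~killed
    signal[k] = alive.mean() * np.exp(-(k + 1) * dt / T2_bulk)   # add bulk relaxation

t = dt * np.arange(1, n_steps + 1)
T2_eff = -1.0 / np.polyfit(t, np.log(signal), 1)[0]     # crude single-exponential fit
print("effective T2 (s):", T2_eff)
```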

  6. Reconciliation between experimental and Monte Carlo-based simulation of the pore size distribution in mesoporous silicon.

    Science.gov (United States)

    Tadvani, Jalil Khajepour; Falamaki, Cavus

    2008-07-23

    It is demonstrated for the first time that mesoporous PS structures obtained by the electrochemical etching of p(+)(100) oriented silicon wafers might assume the peculiarity of invariance of the first peak positions in their pore size distribution curves, albeit for current densities far from the electropolishing region and at constant electrolyte composition. A new Monte Carlo-based simulation model is presented that predicts reasonably the pore size distribution of the PS layers and the observed invariance of peak position with respect to changes in current density. The main highlight of the new model is the introduction of a 'light avalanche breakdown' process in a mathematical fashion. The model is also able to predict an absolute value of 4.23 Å for the smallest pore created experimentally. It is discussed that the latter value has an exact physical meaning: it corresponds with great accuracy to the width of a void created on the surface due to the exclusion of one Si atom.

  7. Reconciliation between experimental and Monte Carlo-based simulation of the pore size distribution in mesoporous silicon

    Energy Technology Data Exchange (ETDEWEB)

    Tadvani, Jalil Khajepour [Ceramics Department, Materials and Energy Research Center, PO Box 14155-4777, Tehran (Iran, Islamic Republic of); Falamaki, Cavus [Chemical Engineering Department, Amirkabir University of Technology, Hafez Avenue, PO Box 15875-4413, Tehran (Iran, Islamic Republic of)

    2008-07-23

    It is demonstrated for the first time that mesoporous PS structures obtained by the electrochemical etching of p{sup +}(100) oriented silicon wafers might assume the peculiarity of invariance of the first peak positions in their pore size distribution curves, albeit for current densities far from the electropolishing region and at constant electrolyte composition. A new Monte Carlo-based simulation model is presented that predicts reasonably the pore size distribution of the PS layers and the observed invariance of peak position with respect to changes in current density. The main highlight of the new model is the introduction of a 'light avalanche breakdown' process in a mathematical fashion. The model is also able to predict an absolute value of 4.23 Å for the smallest pore created experimentally. It is discussed that the latter value has an exact physical meaning: it corresponds with great accuracy to the width of a void created on the surface due to the exclusion of one Si atom.

  8. Monte Carlo-based diode design for correction-less small field dosimetry

    Science.gov (United States)

    Charles, P. H.; Crowe, S. B.; Kairn, T.; Knight, R. T.; Hill, B.; Kenny, J.; Langton, C. M.; Trapp, J. V.

    2013-07-01

    Due to their small collecting volume, diodes are commonly used in small field dosimetry. However, the relative sensitivity of a diode increases with decreasing small field size. Conversely, small air gaps have been shown to cause a significant decrease in the sensitivity of a detector as the field size is decreased. Therefore, this study uses Monte Carlo simulations to investigate introducing air upstream of diodes such that they measure with a constant sensitivity across all field sizes in small field dosimetry. Varying thicknesses of air were introduced onto the upstream end of two commercial diodes (PTW 60016 photon diode and PTW 60017 electron diode), as well as a theoretical unenclosed silicon chip, using field sizes as small as 5 mm × 5 mm. The metric $D_{w,Q}/D_{Det,Q}$ used in this study represents the ratio of the dose to a point of water to the dose to the diode active volume, for a particular field size and location. The optimal thickness of air required to provide a constant sensitivity across all small field sizes was found by plotting $D_{w,Q}/D_{Det,Q}$ as a function of introduced air gap size for various field sizes, and finding the intersection point of these plots; that is, the point at which $D_{w,Q}/D_{Det,Q}$ was constant for all field sizes. The optimal thickness of air was calculated to be 3.3, 1.15 and 0.10 mm for the photon diode, electron diode and unenclosed silicon chip, respectively. The variation in these results was due to the different design of each detector. When calculated with the new diode design incorporating the upstream air gap, $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ was equal to unity to within statistical uncertainty (0.5%) for all three diodes. Cross-axis profile measurements were also improved with the new detector design. The upstream air gap could be implemented on the commercial diodes via a cap consisting of the air cavity surrounded by water equivalent material. The

  9. Monte Carlo-based adaptive EPID dose kernel accounting for different field size responses of imagers.

    Science.gov (United States)

    Wang, Song; Gardner, Joseph K; Gordon, John J; Li, Weidong; Clews, Luke; Greer, Peter B; Siebers, Jeffrey V

    2009-08-01

    The aim of this study is to present an efficient method to generate imager-specific Monte Carlo (MC)-based dose kernels for amorphous silicon-based electronic portal imaging device dose prediction and to determine the effective backscattering thicknesses for such imagers. EPID field size-dependent responses were measured for five matched Varian accelerators from three institutions with 6 MV beams at a source-to-detector distance (SDD) of 105 cm. For two imagers, measurements were made with and without the imager mounted on the robotic supporting arm. Monoenergetic energy deposition kernels with 0-2.5 cm of water backscattering thickness were simultaneously computed by MC to a high precision. For each imager, the backscattering thickness required to match the measured field size responses was determined. The monoenergetic kernel method was validated by comparing measured and predicted field size responses at 150 cm SDD, 10 × 10 cm2 multileaf collimator (MLC) sliding window fields created with 5, 10, 20, and 50 mm gaps, and a head-and-neck (H&N) intensity modulated radiation therapy (IMRT) patient field. Field size responses for the five different imagers deviated by up to 1.3%. When the imagers were removed from the robotic arms, response deviations were reduced to 0.2%. All imager field size responses were captured by using between 1.0 and 1.6 cm of backscatter. The field size responses predicted by the imager-specific kernels matched measurements for all involved imagers with a maximal deviation of 0.34%. The maximal deviation between the predicted and measured field size responses at 150 cm SDD is 0.39%. The maximal deviation between the predicted and measured MLC sliding window fields is 0.39%. For the patient field, gamma analysis yielded that 99.0% of the pixels have gamma < 1 by the 2%, 2 mm criterion with a 3% dose threshold. Tunable imager-specific kernels can be generated rapidly and accurately in a single MC simulation. The resultant kernels are imager position

  10. Monte Carlo-based multiphysics coupling analysis of x-ray pulsar telescope

    Science.gov (United States)

    Li, Liansheng; Deng, Loulou; Mei, Zhiwu; Zuo, Fuchang; Zhou, Hao

    2015-10-01

    X-ray pulsar telescope (XPT) is a complex optical payload, which involves optical, mechanical, electrical and thermal disciplines. Multiphysics coupling analysis (MCA) plays an important role in improving the in-orbit performance. However, conventional MCA methods encounter two serious problems in dealing with the XPT. One is that neither the energy nor the reflectivity information of the X-rays can be taken into consideration, which misrepresents the essence of the XPT. The other is that the coupling data cannot be transferred automatically among the different disciplines, leading to computational inefficiency and high design cost. Therefore, a new MCA method for the XPT is proposed based on the Monte Carlo method and total-reflection theory. The main idea, procedures and operational steps of the proposed method are addressed in detail. Firstly, the method takes both the energy and reflectivity information of the X-rays into consideration simultaneously, formulates the thermal-structural coupling equation and the multiphysics coupling analysis model based on the finite element method, and then carries out the thermal-structural coupling analysis under different working conditions. Secondly, the mirror deformations are obtained using a construction geometry function, and a polynomial function is adopted to fit the deformed mirror and evaluate the fitting error. Thirdly, the focusing performance of the XPT is evaluated by the RMS. Finally, a Wolter-I XPT is taken as an example to verify the proposed MCA method. The simulation results show that the thermal-structural coupling deformation is larger than the others, and the law governing the effect of deformation on the focusing performance has been obtained. The focusing performance under thermal-structural, thermal and structural deformations degrades by 30.01%, 14.35% and 7.85%, respectively; the RMS of the dispersion spot is 2.9143 mm, 2.2038 mm and 2.1311 mm. As a result, the validity of the proposed method is verified through

  11. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y. [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain) and ID17 Biomedical Beamline, European Synchrotron Radiation Facility (ESRF), 6 rue Jules Horowitz B.P. 220, F-38043 Grenoble Cedex (France); Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, Barcelona E-08028 (Spain); Laboratoire Imagerie et modelisation en neurobiologie et cancerologie, UMR8165, Centre National de la Recherche Scientifique (CNRS), Universites Paris 7 et Paris 11, Bat 440., 15 rue Georges Clemenceau, F-91406 Orsay Cedex (France)

    2012-05-15

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-{mu}m-wide microbeams spaced by 200-400 {mu}m) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. Results: Good agreement between MC simulations and experimental results was achieved, even at

  12. Assessment of parameter uncertainty in hydrological model using a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis method

    Science.gov (United States)

    Zhang, Junlong; Li, Yongping; Huang, Guohe; Chen, Xi; Bao, Anming

    2016-07-01

    Without a realistic assessment of parameter uncertainty, decision makers may encounter difficulties in accurately describing hydrologic processes and assessing relationships between model parameters and watershed characteristics. In this study, a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis (MCMC-MFA) method is developed, which can not only generate samples of parameters from a well-constructed Markov chain and assess parameter uncertainties with straightforward Bayesian inference, but also investigate the individual and interactive effects of multiple parameters on model output by measuring the specific variations of hydrological responses. A case study is conducted to address parameter uncertainties in the Kaidu watershed of northwest China. Effects of multiple parameters and their interactions are quantitatively investigated using the MCMC-MFA with a three-level factorial experiment (81 runs in total). A variance-based sensitivity analysis method is used to validate the results of the parameters' effects. Results disclose that (i) the soil conservation service runoff curve number for moisture condition II (CN2) and the fraction of snow volume corresponding to 50% snow cover (SNO50COV) are the most significant factors for hydrological responses, implying that infiltration-excess overland flow and snow water equivalent represent important water inputs to the hydrological system of the Kaidu watershed; (ii) saturated hydraulic conductivity (SOL_K) and the soil evaporation compensation factor (ESCO) have obvious effects on hydrological responses, implying that the processes of percolation and evaporation impact the hydrological processes in this watershed; (iii) the interactions of ESCO and SNO50COV as well as CN2 and SNO50COV have an obvious effect, implying that snow cover can impact the generation of runoff on the land surface and the extraction of soil evaporative demand in lower soil layers. These findings can help enhance the hydrological model
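
    The sketch below illustrates only the Markov chain Monte Carlo ingredient of such a method: a random-walk Metropolis sampler drawing posterior samples for two parameters of a toy linear rainfall-runoff model. The model, synthetic data, priors, and proposal width are illustrative assumptions and do not represent the watershed model or the MCMC-MFA design used in the study.

```python
# Hedged sketch of the MCMC part only: random-walk Metropolis sampling of the
# posterior of two parameters of a toy rainfall-runoff model.
import numpy as np

rng = np.random.default_rng(5)
rain = rng.gamma(2.0, 5.0, 100)                      # synthetic forcing
true_theta = np.array([0.6, 2.0])                    # runoff coefficient, baseflow
obs = true_theta[0] * rain + true_theta[1] + rng.normal(0.0, 1.0, rain.size)

def log_posterior(theta):
    if not (0.0 < theta[0] < 1.0 and 0.0 < theta[1] < 10.0):   # uniform priors
        return -np.inf
    resid = obs - (theta[0] * rain + theta[1])
    return -0.5 * np.sum(resid ** 2)                 # Gaussian likelihood, sigma = 1

theta = np.array([0.5, 1.0])
logp = log_posterior(theta)
chain = []
for _ in range(20_000):
    proposal = theta + rng.normal(0.0, 0.02, 2)      # random-walk proposal
    logp_new = log_posterior(proposal)
    if np.log(rng.random()) < logp_new - logp:       # Metropolis acceptance rule
        theta, logp = proposal, logp_new
    chain.append(theta.copy())

chain = np.array(chain[5_000:])                      # discard burn-in
print("posterior means:", chain.mean(axis=0))
print("posterior std devs:", chain.std(axis=0))
```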

  13. Monte Carlo-based interval transformation analysis for multi-criteria decision analysis of groundwater management strategies under uncertain naphthalene concentrations and health risks

    Science.gov (United States)

    Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong

    2016-08-01

    A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) reducing computational time through the introduction of a Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives are considered in each duration on the basis of four criteria. Results indicated that the most desirable management strategy was action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management horizon.

  14. Quantifying acoustic doppler current profiler discharge uncertainty: A Monte Carlo based tool for moving-boat measurements

    Science.gov (United States)

    Mueller, David S.

    2017-01-01

    This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when
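
    The sketch below conveys the general Monte Carlo idea behind such a tool: perturb each discharge component by an assumed error model, recompute the total many times, and report the spread. The component magnitudes and uncertainty percentages are illustrative assumptions and are not QUant's error models.

```python
# Hedged sketch of propagating component uncertainties of a moving-boat ADCP
# discharge measurement by Monte Carlo; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

q_measured = 80.0            # measured portion of the discharge (m^3/s)
q_top, q_bottom = 8.0, 6.0   # extrapolated unmeasured portions near surface and bed
q_edges = 6.0                # estimated edge discharge

samples = (
    q_measured * (1.0 + rng.normal(0.0, 0.02, n))     # 2% relative error, measured area
    + q_top * (1.0 + rng.normal(0.0, 0.10, n))        # 10% on top extrapolation
    + q_bottom * (1.0 + rng.normal(0.0, 0.10, n))     # 10% on bottom extrapolation
    + q_edges * (1.0 + rng.normal(0.0, 0.15, n))      # 15% on edge estimates
)

q_mean = samples.mean()
u95 = 1.96 * samples.std()
print("Q = %.1f m^3/s, expanded (95%%) uncertainty = %.1f m^3/s (%.1f%%)"
      % (q_mean, u95, 100.0 * u95 / q_mean))
```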

  15. Two computational approaches for Monte Carlo based shutdown dose rate calculation with applications to the JET fusion machine

    Energy Technology Data Exchange (ETDEWEB)

    Petrizzi, L.; Batistoni, P.; Migliori, S. [Associazione EURATOM ENEA sulla Fusione, Frascati (Roma) (Italy); Chen, Y.; Fischer, U.; Pereslavtsev, P. [Association FZK-EURATOM Forschungszentrum Karlsruhe (Germany); Loughlin, M. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, Oxfordshire, OX (United Kingdom); Secco, A. [Nice Srl Via Serra 33 Camerano Casasco AT (Italy)

    2003-07-01

    In deuterium-deuterium (D-D) and deuterium-tritium (D-T) fusion plasmas, neutrons are produced, causing activation of JET machine components. For safe operation and maintenance it is important to be able to predict the induced activation and the resulting shutdown dose rates. This requires a suitable system of codes which is capable of simulating both the neutron-induced material activation during operation and the decay gamma radiation transport after shutdown in the proper 3-D geometry. Two methodologies to calculate the dose rate in fusion devices have been developed recently and applied to fusion machines, both using the MCNP Monte Carlo code. FZK has developed a more classical approach, the rigorous two-step (R2S) system, in which MCNP is coupled to the FISPACT inventory code with automated routing. ENEA, in collaboration with the ITER Team, has developed an alternative approach, the direct one-step (D1S) method. Neutron and decay gamma transport are handled in one single MCNP run, using an ad hoc cross-section library. The intention was to tightly couple the neutron-induced production of a radio-isotope and the emission of its decay gammas, for an accurate spatial distribution and a reliable calculated statistical error. The two methods have been used by the two Associations to calculate the dose rate at five positions of the JET machine, two inside the vacuum chamber and three outside, at cooling times between 1 second and 1 year after shutdown. The same MCNP model and irradiation conditions have been assumed. The exercise has been proposed and financed in the frame of the Fusion Technological Program of the JET machine. The aim is to supply the designers with the most reliable tool and data to calculate the dose rate on fusion machines. Results showed that there is good agreement: the differences range between 5% and 35%. The next step to be considered in 2003 will be an exercise in which the comparison will be done with dose-rate data from JET taken during and

  16. Monte Carlo based approach to the LS–NaI 4πβ–γ anticoincidence extrapolation and uncertainty.

    Science.gov (United States)

    Fitzgerald, R

    2016-03-01

    The 4πβ–γ anticoincidence method is used for the primary standardization of β−, β+, electron capture (EC), α, and mixed-mode radionuclides. Efficiency extrapolation using one or more γ ray coincidence gates is typically carried out by a low-order polynomial fit. The approach presented here is to use a Geant4-based Monte Carlo simulation of the detector system to analyze the efficiency extrapolation. New code was developed to account for detector resolution, direct γ ray interaction with the PMT, and implementation of experimental β-decay shape factors. The simulation was tuned to 57Co and 60Co data, then tested with 99mTc data, and used in measurements of 18F, 129I, and 124I. The analysis method described here offers a more realistic activity value and uncertainty than those indicated from a least-squares fit alone.
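
    For context, the sketch below shows the conventional step that the Monte Carlo analysis in this record augments: a low-order polynomial fit of the observed rate against the inefficiency parameter (1 - eff)/eff, extrapolated to zero to estimate the activity. The synthetic data and the quadratic efficiency dependence are illustrative assumptions.

```python
# Hedged sketch of the conventional efficiency-extrapolation step: fit a low-order
# polynomial to rate versus (1 - eff)/eff and extrapolate to zero (eff = 1).
import numpy as np

rng = np.random.default_rng(7)
true_activity = 5000.0                             # decays per second (assumed)

eff_beta = np.linspace(0.70, 0.95, 12)             # efficiencies varied by gating/threshold
x = (1.0 - eff_beta) / eff_beta                    # inefficiency parameter
rate = true_activity * (1.0 - 0.05 * x + 0.01 * x**2)   # toy efficiency dependence
rate += rng.normal(0.0, 5.0, rate.size)            # counting noise

coeffs = np.polyfit(x, rate, 2)                    # low-order (quadratic) fit
activity_estimate = np.polyval(coeffs, 0.0)        # extrapolate to x = 0
print("extrapolated activity: %.0f /s" % activity_estimate)
```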

  17. Monte Carlo-based subgrid parameterization of vertical velocity and stratiform cloud microphysics in ECHAM5.5-HAM2

    Directory of Open Access Journals (Sweden)

    J. Tonttila

    2013-08-01

    Full Text Available A new method for parameterizing the subgrid variations of vertical velocity and cloud droplet number concentration (CDNC) is presented for general circulation models (GCMs). These parameterizations build on top of existing parameterizations that create stochastic subgrid cloud columns inside the GCM grid cells, which can be employed by the Monte Carlo independent column approximation approach for radiative transfer. The new model version adds a description for vertical velocity in individual subgrid columns, which can be used to compute cloud activation and the subgrid distribution of the number of cloud droplets explicitly. Autoconversion is also treated explicitly in the subcolumn space. This provides a consistent way of simulating the cloud radiative effects with two-moment cloud microphysical properties defined at subgrid scale. The primary impact of the new parameterizations is to decrease the CDNC over polluted continents, while over the oceans the impact is smaller. Moreover, the lower CDNC induces a stronger autoconversion of cloud water to rain. The strongest reduction in CDNC and cloud water content over the continental areas promotes weaker shortwave cloud radiative effects (SW CREs) even after retuning the model. However, compared to the reference simulation, a slightly stronger SW CRE is seen e.g. over mid-latitude oceans, where CDNC remains similar to the reference simulation, and the in-cloud liquid water content is slightly increased after retuning the model.

  18. Optimisation of Simultaneous Tl-201/Tc-99m Dual Isotope Reconstruction with Monte-Carlo-Based Scatter Correction

    Directory of Open Access Journals (Sweden)

    Tuija Kangasmaa

    2012-01-01

    Full Text Available Simultaneous Tl-201/Tc-99m dual isotope myocardial perfusion SPECT is seriously hampered by down-scatter from Tc-99m into the Tl-201 energy window. This paper presents and optimises the ordered-subsets expectation-maximisation (OS-EM) based reconstruction algorithm, which corrects the down-scatter using an efficient Monte Carlo (MC) simulator. The algorithm starts by first reconstructing the Tc-99m image with attenuation, collimator response, and MC-based scatter correction. The reconstructed Tc-99m image is then used as an input for an efficient MC-based down-scatter simulation of Tc-99m photons into the Tl-201 window. This down-scatter estimate is finally used in the Tl-201 reconstruction to correct the crosstalk between the two isotopes. The mathematical 4D NCAT phantom and physical cardiac phantoms were used to optimise the number of OS-EM iterations where the scatter estimate is updated and the number of MC simulated photons. The results showed that two scatter update iterations and 10^5 simulated photons are enough for the Tc-99m and Tl-201 reconstructions, whereas 10^6 simulated photons are needed to generate good quality down-scatter estimates. With these parameters, the entire Tl-201/Tc-99m dual isotope reconstruction can be accomplished in less than 3 minutes.

  19. Monte Carlo-based fluorescence molecular tomography reconstruction method accelerated by a cluster of graphic processing units.

    Science.gov (United States)

    Quan, Guotao; Gong, Hui; Deng, Yong; Fu, Jianwei; Luo, Qingming

    2011-02-01

    High-speed fluorescence molecular tomography (FMT) reconstruction for 3-D heterogeneous media is still one of the most challenging problems in diffusive optical fluorescence imaging. In this paper, we propose a fast FMT reconstruction method that is based on Monte Carlo (MC) simulation and accelerated by a cluster of graphics processing units (GPUs). Based on the Message Passing Interface standard, we modified the MC code for fast FMT reconstruction, and different Green's functions representing the flux distribution in the media are calculated simultaneously by different GPUs in the cluster. A load-balancing method was also developed to increase the computational efficiency. By applying the Fréchet derivative, a Jacobian matrix is formed to reconstruct the distribution of the fluorochromes using the calculated Green's functions. Phantom experiments have shown that only 10 min are required to obtain reconstruction results with a cluster of 6 GPUs, rather than 6 h with a cluster of multiple dual-Opteron CPU nodes. Because of the high accuracy of the MC simulation and its suitability for 3-D heterogeneous media with refractive-index-mismatched boundaries, the GPU-cluster-accelerated method provides a reliable approach to high-speed reconstruction for FMT imaging.

  20. Optimisation of simultaneous tl-201/tc-99m dual isotope reconstruction with monte-carlo-based scatter correction.

    Science.gov (United States)

    Kangasmaa, Tuija; Kuikka, Jyrki; Sohlberg, Antti

    2012-01-01

    Simultaneous Tl-201/Tc-99m dual isotope myocardial perfusion SPECT is seriously hampered by down-scatter from Tc-99m into the Tl-201 energy window. This paper presents and optimises the ordered-subsets expectation-maximisation (OS-EM) based reconstruction algorithm, which corrects the down-scatter using an efficient Monte Carlo (MC) simulator. The algorithm starts by first reconstructing the Tc-99m image with attenuation, collimator response, and MC-based scatter correction. The reconstructed Tc-99m image is then used as an input for an efficient MC-based down-scatter simulation of Tc-99m photons into the Tl-201 window. This down-scatter estimate is finally used in the Tl-201 reconstruction to correct the crosstalk between the two isotopes. The mathematical 4D NCAT phantom and physical cardiac phantoms were used to optimise the number of OS-EM iterations where the scatter estimate is updated and the number of MC simulated photons. The results showed that two scatter update iterations and 10^5 simulated photons are enough for the Tc-99m and Tl-201 reconstructions, whereas 10^6 simulated photons are needed to generate good quality down-scatter estimates. With these parameters, the entire Tl-201/Tc-99m dual isotope reconstruction can be accomplished in less than 3 minutes.

  1. Fast and accurate Monte Carlo-based system response modeling for a digital whole-body PET

    Science.gov (United States)

    Sun, Xiangyu; Li, Yanzhao; Yang, Lingli; Wang, Shuai; Zhang, Bo; Xiao, Peng; Xie, Qingguo

    2017-03-01

    Recently, we have developed a digital whole-body PET scanner based on multi-voltage threshold (MVT) digitizers. To mitigate the impact of resolution-degrading factors, an accurate system response is calculated by Monte Carlo simulation, which is computationally expensive. To address this problem, we improve the method of using symmetries by simulating an axial wedge region. This approach takes full advantage of the intrinsic symmetries in the cylindrical PET system without significantly increasing the computational cost of handling the symmetries. A total of 4224 symmetries are exploited. It took 17 days to generate the system matrix on 160 cores of 2.5 GHz Xeon processors. Both simulation and experimental data are used to evaluate the accuracy of the system response modeling. The simulation studies show the full width at half maximum of a line source to be 2.1 mm at the center of the FOV and 3.8 mm at 200 mm from the center of the FOV. Experimental results show that the 2.4 mm rods in the Derenzo phantom image can be well distinguished.

  2. Monte Carlo-based quantitative structure-activity relationship models for toxicity of organic chemicals to Daphnia magna.

    Science.gov (United States)

    Toropova, Alla P; Toropov, Andrey A; Veselinović, Aleksandar M; Veselinović, Jovana B; Leszczynska, Danuta; Leszczynski, Jerzy

    2016-11-01

    Quantitative structure-activity relationships (QSARs) for toxicity of a large set of 758 organic compounds to Daphnia magna were built up. The simplified molecular input-line entry system (SMILES) was used to represent the molecular structure. The Correlation and Logic (CORAL) software was utilized as a tool to develop the QSAR models. These models are built up using the Monte Carlo method and according to the principle "QSAR is a random event" if one checks a group of random distributions in the visible training set and the invisible validation set. Three distributions of the data into the visible training, calibration, and invisible validation sets are examined. The predictive potentials (i.e., statistical characteristics for the invisible validation set of the best model) are as follows: n = 87, r^2 = 0.8377, root mean square error = 0.564. The mechanistic interpretations and the domain of applicability of built models are suggested and discussed. Environ Toxicol Chem 2016;35:2691-2697. © 2016 SETAC.

  3. Prediction in the face of uncertainty: a Monte Carlo-based approach for systems biology of cancer treatment.

    Science.gov (United States)

    Wierling, Christoph; Kühn, Alexander; Hache, Hendrik; Daskalaki, Andriani; Maschke-Dutz, Elisabeth; Peycheva, Svetlana; Li, Jian; Herwig, Ralf; Lehrach, Hans

    2012-08-15

    Cancer is known to be a complex disease and its therapy is difficult. Much information is available on molecules and pathways involved in cancer onset and progression and this data provides a valuable resource for the development of predictive computer models that can help to identify new potential drug targets or to improve therapies. Modeling cancer treatment has to take into account many cellular pathways usually leading to the construction of large mathematical models. The development of such models is complicated by the fact that relevant parameters are either completely unknown, or can at best be measured under highly artificial conditions. Here we propose an approach for constructing predictive models of such complex biological networks in the absence of accurate knowledge on parameter values, and apply this strategy to predict the effects of perturbations induced by anti-cancer drug target inhibitions on an epidermal growth factor (EGF) signaling network. The strategy is based on a Monte Carlo approach, in which the kinetic parameters are repeatedly sampled from specific probability distributions and used for multiple parallel simulations. Simulation results from different forms of the model (e.g., a model that expresses a certain mutation or mutation pattern or the treatment by a certain drug or drug combination) can be compared with the unperturbed control model and used for the prediction of the perturbation effects. This framework opens the way to experiment with complex biological networks in the computer, likely to save costs in drug development and to improve patient therapy.
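
    The sampling strategy described above can be illustrated with a deliberately tiny example: rate constants of a toy one-step activation model are drawn repeatedly from broad distributions, and an assumed drug effect (halving the activation rate) is compared against the unperturbed control across the whole ensemble. The model and the 50% inhibition are illustrative assumptions, not the EGF signaling network of the paper.

```python
# Hedged sketch of the Monte Carlo sampling strategy only: repeatedly sample
# kinetic parameters, run the model, and compare a perturbed case to the control.
import numpy as np

rng = np.random.default_rng(8)
n_models = 5_000

def steady_state(k_act, k_deact):
    # analytic steady state of the toy model dA/dt = k_act*(1 - A) - k_deact*A;
    # a real network would be integrated numerically instead
    return k_act / (k_act + k_deact)

k_act = 10.0 ** rng.uniform(-2.0, 2.0, n_models)      # log-uniform parameter sampling
k_deact = 10.0 ** rng.uniform(-2.0, 2.0, n_models)

control = steady_state(k_act, k_deact)
treated = steady_state(0.5 * k_act, k_deact)          # assumed 50% target inhibition

effect = treated / control
print("median relative activation after inhibition: %.2f" % np.median(effect))
print("fraction of sampled models with >25%% reduction: %.2f"
      % np.mean(effect < 0.75))
```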

  4. Patient-specific Monte Carlo-based dose-kernel approach for inverse planning in afterloading brachytherapy.

    Science.gov (United States)

    D'Amours, Michel; Pouliot, Jean; Dagnault, Anne; Verhaegen, Frank; Beaulieu, Luc

    2011-12-01

    Brachytherapy planning software relies on the Task Group report 43 dosimetry formalism. This formalism, based on a water approximation, neglects various heterogeneous materials present during treatment. Various studies have suggested that these heterogeneities should be taken into account to improve the treatment quality. The present study sought to demonstrate the feasibility of incorporating Monte Carlo (MC) dosimetry within an inverse planning algorithm to improve the dose conformity and increase the treatment quality. The method was based on precalculated dose kernels in full patient geometries, representing the dose distribution of a brachytherapy source at a single dwell position using MC simulations and the Geant4 toolkit. These dose kernels are used by the inverse planning by simulated annealing tool to produce a fast MC-based plan. A test was performed for an interstitial brachytherapy breast treatment using two different high-dose-rate brachytherapy sources: the microSelectron iridium-192 source and the electronic brachytherapy source Axxent operating at 50 kVp. A research version of the inverse planning by simulated annealing algorithm was combined with MC to provide a method to fully account for the heterogeneities in dose optimization, using the MC method. The effect of the water approximation was found to depend on photon energy, with greater dose attenuation for the lower energies of the Axxent source compared with iridium-192. For the latter, an underdosage of 5.1% for the dose received by 90% of the clinical target volume was found. A new method to optimize afterloading brachytherapy plans that uses MC dosimetric information was developed. Including computed tomography-based information in MC dosimetry in the inverse planning process was shown to take into account the full range of scatter and heterogeneity conditions. This led to significant dose differences compared with the Task Group report 43 approach for the Axxent source. Copyright © 2011

  5. Patient-Specific Monte Carlo-Based Dose-Kernel Approach for Inverse Planning in Afterloading Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    D'Amours, Michel [Departement de Radio-Oncologie et Centre de Recherche en Cancerologie de l'Universite Laval, Hotel-Dieu de Quebec, Quebec, QC (Canada); Department of Physics, Physics Engineering, and Optics, Universite Laval, Quebec, QC (Canada); Pouliot, Jean [Department of Radiation Oncology, University of California, San Francisco, School of Medicine, San Francisco, CA (United States); Dagnault, Anne [Departement de Radio-Oncologie et Centre de Recherche en Cancerologie de l'Universite Laval, Hotel-Dieu de Quebec, Quebec, QC (Canada); Verhaegen, Frank [Department of Radiation Oncology, Maastro Clinic, GROW Research Institute, Maastricht University Medical Centre, Maastricht (Netherlands); Department of Oncology, McGill University, Montreal, QC (Canada); Beaulieu, Luc, E-mail: beaulieu@phy.ulaval.ca [Departement de Radio-Oncologie et Centre de Recherche en Cancerologie de l'Universite Laval, Hotel-Dieu de Quebec, Quebec, QC (Canada); Department of Physics, Physics Engineering, and Optics, Universite Laval, Quebec, QC (Canada)

    2011-12-01

    Purpose: Brachytherapy planning software relies on the Task Group report 43 dosimetry formalism. This formalism, based on a water approximation, neglects various heterogeneous materials present during treatment. Various studies have suggested that these heterogeneities should be taken into account to improve the treatment quality. The present study sought to demonstrate the feasibility of incorporating Monte Carlo (MC) dosimetry within an inverse planning algorithm to improve the dose conformity and increase the treatment quality. Methods and Materials: The method was based on precalculated dose kernels in full patient geometries, representing the dose distribution of a brachytherapy source at a single dwell position using MC simulations and the Geant4 toolkit. These dose kernels are used by the inverse planning by simulated annealing tool to produce a fast MC-based plan. A test was performed for an interstitial brachytherapy breast treatment using two different high-dose-rate brachytherapy sources: the microSelectron iridium-192 source and the electronic brachytherapy source Axxent operating at 50 kVp. Results: A research version of the inverse planning by simulated annealing algorithm was combined with MC to provide a method to fully account for the heterogeneities in dose optimization, using the MC method. The effect of the water approximation was found to depend on photon energy, with greater dose attenuation for the lower energies of the Axxent source compared with iridium-192. For the latter, an underdosage of 5.1% for the dose received by 90% of the clinical target volume was found. Conclusion: A new method to optimize afterloading brachytherapy plans that uses MC dosimetric information was developed. Including computed tomography-based information in MC dosimetry in the inverse planning process was shown to take into account the full range of scatter and heterogeneity conditions. This led to significant dose differences compared with the Task Group report

  6. SU-E-T-175: Clinical Evaluations of Monte Carlo-Based Inverse Treatment Plan Optimization for Intensity Modulated Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chi, Y; Li, Y; Tian, Z; Gu, X; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States)

    2015-06-15

    Purpose: Pencil-beam or superposition-convolution type dose calculation algorithms are routinely used in inverse plan optimization for intensity modulated radiation therapy (IMRT). However, due to their limited accuracy in some challenging cases, e.g. lung, the resulting dose may lose its optimality after being recomputed using an accurate algorithm, e.g. Monte Carlo (MC). The objective of this study is to evaluate the feasibility and advantages of a new method to include MC in the treatment planning process. Methods: We developed a scheme to iteratively perform MC-based beamlet dose calculations and plan optimization. In the MC stage, a GPU-based dose engine was used and the number of particles sampled from a beamlet was proportional to its optimized fluence from the previous step. We tested this scheme in four lung cancer IMRT cases. For each case, the original plan dose, the plan dose re-computed by MC, and the dose optimized by our scheme were obtained. Clinically relevant dosimetric quantities in these three plans were compared. Results: Although the original plan achieved a satisfactory PTV dose coverage, after re-computing doses using the MC method, it was found that the PTV D95% was reduced by 4.60%-6.67%. After re-optimizing these cases with our scheme, the PTV coverage was improved to the same level as in the original plan, while the critical OAR coverages were maintained at clinically acceptable levels. Regarding computation time, it took on average 144 sec per case using only one GPU card, including both the MC-based beamlet dose calculation and the treatment plan optimization. Conclusion: The achieved dosimetric gains and high computational efficiency indicate the feasibility and advantages of the proposed MC-based IMRT optimization method. Comprehensive validations in more patient cases are in progress.

  7. Posture-specific phantoms representing female and male adults in Monte Carlo-based simulations for radiological protection

    Science.gov (United States)

    Cassola, V. F.; Kramer, R.; Brayner, C.; Khoury, H. J.

    2010-08-01

    Does the posture of a patient have an effect on the organ and tissue absorbed doses caused by x-ray examinations? This study aims to answer this question, based on Monte Carlo (MC) simulations of commonly performed x-ray examinations using adult phantoms modelled to represent humans in the standing as well as the supine posture. The recently published FASH (female adult mesh) and MASH (male adult mesh) phantoms have the standing posture. In a first step, both phantoms were updated with respect to their anatomy: glandular tissue was separated from adipose tissue in the breasts, visceral fat was separated from subcutaneous fat, cartilage was segmented in the ears, nose and around the thyroid, and the mass of the right lung is now 15% greater than that of the left lung. The updated versions are called FASH2_sta and MASH2_sta (sta = standing). Taking into account the gravitational effects on organ position and fat distribution, supine versions of the FASH2 and MASH2 phantoms have been developed in this study and called FASH2_sup and MASH2_sup. MC simulations of external whole-body exposure to monoenergetic photons and partial-body exposure to x-rays have been made with the standing and supine FASH2 and MASH2 phantoms. For external whole-body exposure in AP and PA projections with photon energies above 30 keV, the effective dose did not change by more than 5% when the posture changed from standing to supine or vice versa. Apart from that, the supine posture is quite rare in occupational radiation protection from whole-body exposure. However, in x-ray diagnostics the supine posture is frequently used for patients submitted to examinations. Changes of organ absorbed doses up to 60% were found in simulations of chest and abdomen radiographs when the posture changed from standing to supine or vice versa. A further increase of the differences between posture-specific organ and tissue absorbed doses with increasing whole-body mass is to be expected.

  8. A hybrid phantom Monte Carlo-based method for historical reconstruction of organ doses in patients treated with cobalt-60 for Hodgkin's lymphoma

    Science.gov (United States)

    Petroccia, Heather; Mendenhall, Nancy; Liu, Chihray; Hammer, Clifford; Culberson, Wesley; Thar, Tim; Mitchell, Tom; Li, Zuofeng; Bolch, Wesley

    2017-08-01

    Historical radiotherapy treatment plans lack the 3D image sets required for estimating mean organ doses to patients. Alternatively, Monte Carlo-based models of radiotherapy devices coupled with whole-body computational phantoms can permit estimates of historical in-field and out-of-field organ doses as needed for studies associating radiation exposure and late tissue toxicities. In recreating historical patient treatments with 60Co-based systems, the major components to be modeled include the source capsule, surrounding shielding layers, collimators (both fixed and adjustable), and trimmers as needed to vary field size. In this study, a computational model and experimental validation of the Theratron T-1000 are presented. Model validation is based upon in-field commissioning data collected at the University of Florida, published out-of-field data from the British Journal of Radiology (BJR) Supplement 25, and out-of-field measurements performed at the University of Wisconsin's Accredited Dosimetry Calibration Laboratory (UWADCL). The computational model of the Theratron T-1000 agrees with central-axis percentage depth dose data to within 2% for 6 × 6 to 30 × 30 cm2 fields. Out-of-field doses were found to vary between 0.6% and 2.4% of the central-axis dose at 10 cm from the field edge and between 0.42% and 0.97% of the central-axis dose at 20 cm from the field edge, all at 5 cm depth. Absolute and relative differences between computed and measured out-of-field doses varied between ±2.5% and ±100%, respectively, at distances up to 60 cm from the central axis. The source-term model was subsequently combined with patient-morphometry-matched computational hybrid phantoms as a method for estimating in-field and out-of-field organ doses for patients treated for Hodgkin's lymphoma. By changing field size and position, and adding patient-specific field-shaping blocks, more complex historical treatment set-ups can be recreated, particularly those

  9. Experimental Component Characterization, Monte-Carlo-Based Image Generation and Source Reconstruction for the Neutron Imaging System of the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, C A; Moran, M J

    2007-08-21

    The Neutron Imaging System (NIS) is one of seven ignition target diagnostics under development for the National Ignition Facility. The NIS is required to record hot-spot (13-15 MeV) and downscattered (6-10 MeV) images with a resolution of 10 microns and a signal-to-noise ratio (SNR) of 10 at the 20% contour. The NIS is a valuable diagnostic since the downscattered neutrons reveal the spatial distribution of the cold fuel during an ignition attempt, providing important information in the case of a failed implosion. The present study explores the parameter space of several line-of-sight (LOS) configurations that could serve as the basis for the final design. Six commercially available organic scintillators were experimentally characterized for their light emission decay profile and neutron sensitivity. The samples showed a long lived decay component that makes direct recording of a downscattered image impossible. The two best candidates for the NIS detector material are: EJ232 (BC422) plastic fibers or capillaries filled with EJ399B. A Monte Carlo-based end-to-end model of the NIS was developed to study the imaging capabilities of several LOS configurations and verify that the recovered sources meet the design requirements. The model includes accurate neutron source distributions, aperture geometries (square pinhole, triangular wedge, mini-penumbral, annular and penumbral), their point spread functions, and a pixelated scintillator detector. The modeling results show that a useful downscattered image can be obtained by recording the primary peak and the downscattered images, and then subtracting a decayed version of the former from the latter. The difference images need to be deconvolved in order to obtain accurate source distributions. The images are processed using a frequency-space modified-regularization algorithm and low-pass filtering. The resolution and SNR of these sources are quantified by using two surrogate sources. The simulations show that all LOS

  10. Inverse modeling of cloud-aerosol interactions -- Part 2: Sensitivity tests on liquid phase clouds using a Markov Chain Monte Carlo based simulation approach

    NARCIS (Netherlands)

    Partridge, D.G.; Vrugt, J.A.; Tunved, P.; Ekman, A.M.L.; Struthers, H.; Sooroshian, A.

    2012-01-01

    This paper presents a novel approach to investigate cloud-aerosol interactions by coupling a Markov chain Monte Carlo (MCMC) algorithm to an adiabatic cloud parcel model. Despite the number of numerical cloud-aerosol sensitivity studies previously conducted, few have used statistical analysis tools t

  11. Inversion of Schlumberger resistivity sounding data from the critically dynamic Koyna region using the Hybrid Monte Carlo-based neural network approach

    Directory of Open Access Journals (Sweden)

    S. Maiti

    2011-03-01

    Full Text Available The Koyna region is well known for its triggered seismic activity since the hazardous earthquake of M=6.3 that occurred around the Koyna reservoir on 10 December 1967. Understanding the shallow distribution of the resistivity pattern in such a seismically critical area is vital for mapping faults, fractures and lineaments. However, deducing the true resistivity distribution from apparent resistivity data lacks precise information due to intrinsic non-linearity in the data structures. Here we present a new technique based on Bayesian neural network (BNN) theory using the concept of the Hybrid Monte Carlo (HMC)/Markov Chain Monte Carlo (MCMC) simulation scheme. The new method is applied to invert one- and two-dimensional Direct Current (DC) vertical electrical sounding (VES) data acquired around the Koyna region in India. Prior to applying the method to actual resistivity data, it was tested on synthetic signals. In this approach the objective/cost function is optimized following the Hybrid Monte Carlo (HMC)/Markov Chain Monte Carlo (MCMC) sampling-based algorithm, and each trajectory is updated by approximating the Hamiltonian differential equations through a leapfrog discretization scheme. The stability of the new inversion technique was tested in the presence of correlated red noise, and the uncertainty of the result was estimated using the BNN code. The estimated true resistivity distribution was compared with the results of a singular value decomposition (SVD)-based conventional resistivity inversion. Results based on the HMC-based Bayesian neural network are in good agreement with the existing model results; in some cases they also provide more detailed and precise results, which appear to be justified by local geological and structural details. The new BNN approach based on HMC is faster and proved to be a promising inversion scheme to interpret complex and non-linear resistivity problems. The HMC-based BNN results
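
    The abstract describes optimizing the objective function with an HMC/MCMC sampler whose trajectories are integrated by a leapfrog scheme. As a loose illustration only (not the authors' BNN inversion code), the sketch below applies a generic Hamiltonian Monte Carlo update with a leapfrog integrator to a user-supplied log-posterior; the Gaussian target, step size and trajectory length are arbitrary assumptions.

```python
import numpy as np

def hmc_step(q, log_prob, grad_log_prob, eps=0.1, n_leapfrog=20, rng=np.random.default_rng()):
    """One Hamiltonian Monte Carlo update using leapfrog integration."""
    p = rng.standard_normal(q.shape)              # sample auxiliary momentum
    q_new, p_new = q.copy(), p.copy()
    # leapfrog discretization of Hamilton's equations
    p_new += 0.5 * eps * grad_log_prob(q_new)
    for _ in range(n_leapfrog - 1):
        q_new += eps * p_new
        p_new += eps * grad_log_prob(q_new)
    q_new += eps * p_new
    p_new += 0.5 * eps * grad_log_prob(q_new)
    # Metropolis acceptance on the joint Hamiltonian
    h_old = -log_prob(q) + 0.5 * np.dot(p, p)
    h_new = -log_prob(q_new) + 0.5 * np.dot(p_new, p_new)
    if rng.random() < np.exp(h_old - h_new):
        return q_new
    return q

# toy target: 2-D standard normal (a stand-in for a BNN posterior)
log_prob = lambda q: -0.5 * np.sum(q**2)
grad_log_prob = lambda q: -q
q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q, log_prob, grad_log_prob)
    samples.append(q)
print(np.mean(samples, axis=0), np.std(samples, axis=0))
```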

  12. Comparison of interatomic potentials of water via structure factors reconstructed from simulated partial radial distribution functions: a reverse Monte Carlo based approach

    Science.gov (United States)

    Steinczinger, Zsuzsanna; Jóvári, Pál; Pusztai, László

    2017-01-01

    Neutron- and x-ray weighted total structure factors of liquid water have been calculated on the basis of the intermolecular parts of partial radial distribution functions resulting from various computer simulations. The approach includes reverse Monte Carlo (RMC) modelling of these partials, using realistic flexible molecules, and the calculation of experimental diffraction data, including the intramolecular contributions, from the RMC particle configurations. The procedure has been applied to ten sets of intermolecular partial radial distribution functions obtained from various computer simulations, including one set from an ab initio molecular dynamics, of water. It is found that modern polarizable water potentials, such as SWM4-DP and BK3 are the most successful in reproducing measured diffraction data.

  13. Experimental validation of a Monte Carlo-based kV x-ray projection model for the Varian linac-mounted cone-beam CT imaging system

    Science.gov (United States)

    Lazos, Dimitrios; Pokhrel, Damodar; Su, Zhong; Lu, Jun; Williamson, Jeffrey F.

    2008-03-01

    Fast and accurate modeling of cone-beam CT (CBCT) x-ray projection data can improve CBCT image quality either by linearizing projection data for each patient prior to image reconstruction (thereby mitigating detector blur/lag, spectral hardening, and scatter artifacts) or indirectly by supporting rigorous comparative simulation studies of competing image reconstruction and processing algorithms. In this study, we compare Monte Carlo-computed x-ray projections with projections experimentally acquired from our Varian Trilogy CBCT imaging system for phantoms of known design. Our recently developed Monte Carlo photon-transport code, PTRAN, was used to compute primary and scatter projections for a cylindrical phantom of known diameter (NA model 76-410) with and without bow-tie filter and antiscatter grid for both full- and half-fan geometries. These simulations were based upon measured 120 kVp spectra, beam profiles, and the flat-panel detector (4030CB) point-spread function. Compound Poisson-process noise was simulated based upon measured beam output. Computed projections were compared to flat- and dark-field corrected 4030CB images, where scatter profiles were estimated by subtracting narrow axial-width from full axial-width 4030CB profiles. In agreement with the literature, the difference between simulated and measured projection data is of the order of 6-8%. The measurement of the scatter profiles is affected by the long tails of the detector PSF. Higher accuracy can be achieved mainly by improving the beam modeling and correcting the nonlinearities induced by the detector PSF.

  14. Monte Carlo-based investigations on the impact of removing the flattening filter on beam quality specifiers for photon beam dosimetry.

    Science.gov (United States)

    Czarnecki, Damian; Poppe, Björn; Zink, Klemens

    2017-06-01

    The impact of removing the flattening filter in clinical electron accelerators on the relationship between dosimetric quantities such as beam quality specifiers and the mean photon and electron energies of the photon radiation field was investigated by Monte Carlo simulations. The purpose of this work was to determine the uncertainties when using the well-known beam quality specifiers or energy-based beam specifiers as predictors of dosimetric photon field properties when removing the flattening filter. Monte Carlo simulations applying eight different linear accelerator head models with and without flattening filter were performed in order to generate realistic radiation sources and calculate field properties such as restricted mass collision stopping power ratios (L¯/ρ)airwater and mean photon and secondary electron energies. To study the impact of removing the flattening filter on the beam quality correction factor kQ, this factor was calculated by Monte Carlo simulations for detailed ionization chamber models. Stopping power ratios (L¯/ρ)airwater and kQ values for different ionization chambers were calculated as a function of TPR20,10 and %dd(10)x. Moreover, mean photon energies in air and at the point of measurement in water, as well as mean secondary electron energies at the point of measurement, were calculated. The results revealed that removing the flattening filter led to a change within 0.3% in the relationship between %dd(10)x and (L¯/ρ)airwater, whereas the relationship between TPR20,10 and (L¯/ρ)airwater changed by up to 0.8% for high energy photon beams. However, TPR20,10 remained a good predictor of (L¯/ρ)airwater for both types of linear accelerator. The mean photon energy below the linear accelerator head as well as at the point of measurement may not be suitable as a predictor of (L¯/ρ)airwater and kQ to merge the dosimetry of both linear accelerator types. It was possible to derive (L¯/ρ)airwater using the mean secondary electron energy
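
    For orientation, the beam quality specifiers discussed above are, by definition, simple ratios taken from depth-dose or tissue-phantom-ratio curves. The sketch below computes %dd(10)x and an approximation to TPR20,10 from a tabulated central-axis curve; the depth-dose values are invented placeholders, and the assumptions that the curve is the pure photon component and can stand in for a fixed-SDD measurement are simplifications for illustration.

```python
import numpy as np

def pdd_at(depth_cm, depths, dose):
    """Percentage depth dose at a given depth, normalized to the curve maximum."""
    d = np.interp(depth_cm, depths, dose)
    return 100.0 * d / dose.max()

# toy central-axis depth-dose curve (placeholder values, not measured data)
depths = np.array([0.0, 1.5, 5.0, 10.0, 20.0])      # cm
dose   = np.array([0.45, 1.00, 0.87, 0.67, 0.38])   # relative dose

# %dd(10)x: percentage depth dose at 10 cm depth (photon component assumed)
dd10x = pdd_at(10.0, depths, dose)

# TPR20,10: ratio of doses at 20 cm and 10 cm depth at fixed source-detector
# distance; here crudely approximated from the same depth-dose curve.
tpr_20_10 = np.interp(20.0, depths, dose) / np.interp(10.0, depths, dose)

print(f"%dd(10)x = {dd10x:.1f}%, TPR20,10 (approx.) = {tpr_20_10:.3f}")
```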

  15. Comparison of diffusion approximation and Monte Carlo based finite element models for simulating thermal responses to laser irradiation in discrete vessels.

    Science.gov (United States)

    Zhang, Rong; Verkruysse, Wim; Aguilar, Guillermo; Nelson, J Stuart

    2005-09-07

    Both diffusion approximation (DA) and Monte Carlo (MC) models have been used to simulate light distribution in multilayered human skin with or without discrete blood vessels. However, no detailed comparison of the light distribution, heat generation and induced thermal damage between these two models has been done for discrete vessels. Three models were constructed: (1) MC-based finite element method (FEM) model, referred to as MC-FEM; (2) DA-based FEM with simple scaling factors according to chromophore concentrations (SFCC) in the epidermis and vessels, referred to as DA-FEM-SFCC; and (3) DA-FEM with improved scaling factors (ISF) obtained by equalizing the total light energy depositions that are solved from the DA and MC models in the epidermis and vessels, respectively, referred to as DA-FEM-ISF. The results show that DA-FEM-SFCC underestimates the light energy deposition in the epidermis and vessels when compared to MC-FEM. The difference is nonlinearly dependent on wavelength, dermal blood volume fraction, vessel size and depth, etc. Thus, the temperature and damage profiles are also dramatically different. DA-FEM-ISF achieves much better results in calculating heat generation and induced thermal damage when compared to MC-FEM, and has the advantages of both calculation speed and accuracy. The disadvantage is that a multidimensional ISF table is needed for DA-FEM-ISF to be a practical modelling tool.

  16. Clinically applicable Monte Carlo-based biological dose optimization for the treatment of head and neck cancers with spot-scanning proton therapy

    CERN Document Server

    Tseung, H Wan Chan; Kreofsky, C R; Ma, D; Beltran, C

    2016-01-01

    Purpose: To demonstrate the feasibility of fast Monte Carlo (MC) based inverse biological planning for the treatment of head and neck tumors in spot-scanning proton therapy. Methods: Recently, a fast and accurate Graphics Processor Unit (GPU)-based MC simulation of proton transport was developed and used as the dose calculation engine in a GPU-accelerated IMPT optimizer. Besides dose, the dose-averaged linear energy transfer (LETd) can be simultaneously scored, which makes biological dose (BD) optimization possible. To convert from LETd to BD, a linear relation was assumed. Using this novel optimizer, inverse biological planning was applied to 4 patients: 2 small and 1 large thyroid tumor targets, and 1 glioma case. To create these plans, constraints were placed to maintain the physical dose (PD) within 1.25 times the prescription while maximizing target BD. For comparison, conventional IMRT and IMPT plans were created for each case in Eclipse (Varian, Inc). The same critical structure PD constraints were use...

  17. Hydration structure in concentrated aqueous lithium chloride solutions: A reverse Monte Carlo based combination of molecular dynamics simulations and diffraction data

    Science.gov (United States)

    Harsányi, I.; Pusztai, L.

    2012-11-01

    We report on a comparison of three interaction potential models of water (SPC/E, TIP4P-2005, and SWM4-DP) for describing the structure of concentrated aqueous lithium chloride solutions. Classical molecular dynamics simulations have been carried out and total scattering structure factors, calculated from the particle configurations, were compared with experimental diffraction data. Later, reverse Monte Carlo structural modelling was applied for refining molecular dynamics results, so that particle configurations consistent with neutron and X-ray diffraction data could be prepared that, at the same time, were as close as possible to the final stage of the molecular dynamics simulations. Partial radial distribution functions, first neighbors, and angular correlations were analysed further from the best fitting particle configurations. It was found that none of the water potential models describe the structure perfectly; overall, the SWM4-DP model seems to be the most promising. At the highest concentrations the SPC/E model appears to provide the best approximation of the water structure, whereas the TIP4P-2005 model proved to be the most successful for estimating the lithium-oxygen partial radial distribution function at each concentration.

  18. Inverse modelling of cloud-aerosol interactions – Part 2: Sensitivity tests on liquid phase clouds using a Markov chain Monte Carlo based simulation approach

    Directory of Open Access Journals (Sweden)

    D. G. Partridge

    2012-03-01

    Full Text Available This paper presents a novel approach to investigate cloud-aerosol interactions by coupling a Markov chain Monte Carlo (MCMC) algorithm to an adiabatic cloud parcel model. Despite the number of numerical cloud-aerosol sensitivity studies previously conducted, few have used statistical analysis tools to investigate the global sensitivity of a cloud model to input aerosol physiochemical parameters. Using numerically generated cloud droplet number concentration (CDNC) distributions (i.e. synthetic data) as cloud observations, this inverse modelling framework is shown to successfully estimate the correct calibration parameters, and their underlying posterior probability distribution.

    The employed analysis method provides a new, integrative framework to evaluate the global sensitivity of the derived CDNC distribution to the input parameters describing the lognormal properties of the accumulation mode aerosol and the particle chemistry. To a large extent, results from prior studies are confirmed, but the present study also provides some additional insights. There is a transition in relative sensitivity from very clean marine Arctic conditions, where the lognormal aerosol parameters representing the accumulation mode aerosol number concentration and mean radius are found to be most important for determining the CDNC distribution, to very polluted continental environments (aerosol concentration in the accumulation mode >1000 cm−3), where particle chemistry is more important than both number concentration and size of the accumulation mode.

    The competition and compensation between the cloud model input parameters illustrates that if the soluble mass fraction is reduced, the aerosol number concentration, geometric standard deviation and mean radius of the accumulation mode must increase in order to achieve the same CDNC distribution.

    This study demonstrates that inverse modelling provides a flexible, transparent and

  19. SU-E-T-632: Preliminary Study On Treating Nose Skin Using Energy and Intensity Modulated Electron Beams with Monte Carlo Based Dose Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Jin, L; Eldib, A; Li, J; Price, R; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States)

    2015-06-15

    Purpose: Uneven nose surfaces, underlying air cavities and the use of bolus present complexity and dose uncertainty when using a single electron energy beam to plan treatments of nose skin with a pencil beam-based planning system. This work demonstrates more accurate dose calculation and more optimal planning using energy- and intensity-modulated electron radiotherapy (MERT) delivered with a pMLC. Methods: An in-house developed Monte Carlo (MC)-based dose calculation/optimization planning system was employed for treatment planning. Phase space data (6, 9, 12 and 15 MeV) were used as an input source for MC dose calculations for the linac. To reduce the scatter-caused penumbra, a short SSD (61 cm) was used. Our previous work demonstrated good agreement in percentage depth dose and off-axis dose between calculations and film measurements for various field sizes. A MERT plan was generated for treating the nose skin using a patient geometry, and a dose volume histogram (DVH) was obtained. The work also shows the comparison of 2D dose distributions between a clinically used conventional single electron energy plan and the MERT plan. Results: The MERT plan resulted in improved target dose coverage as compared to the conventional plan, which demonstrated a target dose deficit at the field edge. The conventional plan showed higher normal tissue dose underneath the nose skin, while the MERT plan resulted in improved conformity and thus reduced normal tissue dose. Conclusion: This preliminary work illustrates that MC-based MERT planning is a promising technique for treating nose skin, not only providing more accurate dose calculation, but also offering improved target dose coverage and conformity. In addition, this technique may eliminate the necessity of bolus, which often produces dose delivery uncertainty due to the air gaps that may exist between the bolus and the skin.

  20. Evaluation of a commercial VMC++ Monte Carlo based treatment planning system for electron beams using EGSnrc/BEAMnrc simulations and measurements.

    Science.gov (United States)

    Edimo, P; Clermont, C; Kwato, M G; Vynckier, S

    2009-09-01

    In the present work, Monte Carlo (MC) models of electron beams (energies 4, 12 and 18 MeV) from an Elekta SL25 medical linear accelerator were simulated using the EGSnrc/BEAMnrc user code. The calculated dose distributions were benchmarked by comparison with measurements made in a water phantom for a wide range of open field sizes and insert combinations, at a single source-to-surface distance (SSD) of 100 cm. These BEAMnrc models were used to evaluate the accuracy of a commercial MC dose calculation engine for electron beam treatment planning (Oncentra MasterPlan Treatment Planning System (OMTPS) version 1.4, Nucletron) for two energies, 4 and 12 MeV. Output factors were furthermore measured in the water phantom and compared to BEAMnrc and OMTPS. The overall agreement between predicted and measured output factors was comparable for both BEAMnrc and OMTPS, except for a few asymmetric and/or small insert cutouts, where larger deviations between measurements and the values predicted from BEAMnrc as well as OMTPS computations were recorded. However, in the heterogeneous phantom, differences between BEAMnrc and measurements ranged from 0.5 to 2.0% between two ribs and 0.6-1.0% below the ribs, whereas the difference between OMTPS and measurements was in the same range (0.5-4.0%) in both areas. With respect to output factors, the overall agreement between BEAMnrc and measurements was usually within 1.0%, whereas differences up to nearly 3.0% were observed for OMTPS. This paper focuses on a comparison for clinical cases, including the effects of electron beam attenuation in a heterogeneous phantom. It therefore complements previously reported data (only based on measurements) in one other paper on commissioning of the VMC++ dose calculation engine. These results demonstrate that the VMC++ algorithm is more robust in predicting dose distributions than pencil beam-based algorithms for the electron beams investigated.

  1. Monte Carlo-based pricing of convertible bonds

    Institute of Scientific and Technical Information of China (English)

    赵洋; 赵立臣

    2009-01-01

    This paper applies the least-squares Monte Carlo method proposed by Longstaff et al. to price convertible bonds, so as to solve the problem of pricing the path-dependent clauses and American option features embedded in convertible bonds. Convertible bonds are complex hybrid securities subject to equity risk, credit risk, and interest rate risk. In the established pricing model, the assumption of constant volatility is relaxed and the volatility is estimated with a GARCH(1,1) model; following the TF model, the credit risk is represented by a credit risk spread, and the yield curve is estimated with the Nelson-Siegel method. The empirical research finds that convertible bonds in China are underpriced by 2% to 3%.
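
    For context only: the least-squares Monte Carlo approach of Longstaff and Schwartz estimates continuation values by regressing discounted future payoffs on basis functions of the simulated state and comparing them with immediate exercise values. The sketch below prices a plain American put under constant-volatility geometric Brownian motion, rather than the paper's convertible-bond model with GARCH volatility, credit spread and Nelson-Siegel curve; all parameter values are illustrative assumptions.

```python
import numpy as np

def lsm_american_put(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     n_steps=50, n_paths=20000, seed=0):
    """Least-squares Monte Carlo (Longstaff-Schwartz) price of an American put."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    disc = np.exp(-r * dt)

    # simulate geometric Brownian motion paths
    z = rng.standard_normal((n_paths, n_steps))
    log_s = np.log(S0) + np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    S = np.exp(log_s)

    cashflow = np.maximum(K - S[:, -1], 0.0)          # payoff at maturity
    for t in range(n_steps - 2, -1, -1):
        cashflow *= disc                              # discount one step back
        itm = K - S[:, t] > 0.0                       # regress only in-the-money paths
        if itm.sum() > 3:
            x = S[itm, t]
            coeffs = np.polyfit(x, cashflow[itm], 2)  # quadratic basis for continuation value
            continuation = np.polyval(coeffs, x)
            exercise = K - x
            do_exercise = exercise > continuation
            idx = np.where(itm)[0][do_exercise]
            cashflow[idx] = exercise[do_exercise]
    return disc * cashflow.mean()

print(f"LSM American put price ~ {lsm_american_put():.3f}")
```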

  2. Quasi-Monte Carlo based global uncertainty and sensitivity analysis in modeling free product migration and recovery from petroleum-contaminated aquifers.

    Science.gov (United States)

    He, Li; Huang, Gordon; Lu, Hongwei; Wang, Shuo; Xu, Yi

    2012-06-15

    This paper presents a global uncertainty and sensitivity analysis (GUSA) framework based on global sensitivity analysis (GSA) and generalized likelihood uncertainty estimation (GLUE) methods. Quasi-Monte Carlo (QMC) is employed by GUSA to obtain realizations of uncertain parameters, which are then input to the simulation model for analysis. Compared to GLUE, GUSA can not only evaluate the global sensitivity and uncertainty of modeling parameter sets, but also quantify the uncertainty in modeling prediction sets. Another advantage of GUSA lies in the alleviation of computational effort, since globally insensitive parameters can be identified and removed from the uncertain-parameter set. GUSA is applied to a practical petroleum-contaminated site in Canada to investigate free product migration and recovery processes under aquifer remediation operations. Results from the global sensitivity analysis show that (1) initial free product thickness has the most significant impact on total recovery volume but the least impact on residual free product thickness and recovery rate; (2) total recovery volume and recovery rate are sensitive to residual LNAPL phase saturations and soil porosity. Results from the uncertainty predictions reveal that the residual thickness would remain high and almost unchanged after about half a year of the skimmer-well scheme; the rather high residual thickness (0.73-1.56 m after 20 years) indicates that natural attenuation would not be suitable for the remediation. The largest total recovery volume would be from water pumping, followed by vacuum pumping, and then the skimmer. The recovery rates of the three schemes would rapidly decrease after 2 years (less than 0.05 m3/day), thus short-term remediation is not suggested.
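
    Quasi-Monte Carlo replaces pseudo-random draws with low-discrepancy sequences so that the uncertain-parameter space is covered more evenly with fewer model runs. As a small, generic sketch (not the GUSA code), the example below generates a Halton sequence and maps it onto hypothetical parameter ranges before feeding them to a placeholder simulation model; the parameter names, ranges and the model itself are invented for illustration.

```python
import numpy as np

def halton(n, base):
    """First n points of the van der Corput sequence in the given base."""
    seq = np.zeros(n)
    for i in range(1, n + 1):
        f, value, k = 1.0, 0.0, i
        while k > 0:
            f /= base
            value += f * (k % base)
            k //= base
        seq[i - 1] = value
    return seq

def halton_samples(n, bounds, bases=(2, 3, 5, 7, 11)):
    """Low-discrepancy samples scaled to the given (low, high) bounds."""
    dim = len(bounds)
    u = np.column_stack([halton(n, bases[d]) for d in range(dim)])
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# hypothetical uncertain parameters (names and ranges are placeholders)
bounds = [(0.5, 2.0),    # initial free-product thickness [m]
          (0.2, 0.4),    # soil porosity [-]
          (0.05, 0.3)]   # residual LNAPL saturation [-]
samples = halton_samples(256, bounds)

def simulation_model(theta):
    # placeholder for the groundwater/LNAPL recovery model
    return theta[0] * (1.0 - theta[2]) / theta[1]

outputs = np.array([simulation_model(t) for t in samples])
print(outputs.mean(), outputs.std())
```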

  3. A virtual photon source model of an Elekta linear accelerator with integrated mini MLC for Monte Carlo based IMRT dose calculation.

    Science.gov (United States)

    Sikora, M; Dohm, O; Alber, M

    2007-08-07

    A dedicated, efficient Monte Carlo (MC) accelerator head model for intensity modulated stereotactic radiosurgery treatment planning is needed to afford a highly accurate simulation of tiny IMRT fields. A virtual source model (VSM) of a mini multi-leaf collimator (MLC) (the Elekta Beam Modulator (EBM)) is presented, allowing efficient generation of particles even for small fields. The VSM of the EBM is based on a previously published virtual photon energy fluence model (VEF) (Fippel et al 2003 Med. Phys. 30 301) commissioned with large-field measurements in air and in water. The original commissioning procedure of the VEF, based on large-field measurements only, leads to inaccuracies for small fields. In order to improve the VSM, it was necessary to change the VEF model by developing (1) a method to determine the primary photon source diameter, relevant for output factor calculations, (2) a model of the influence of the flattening filter on the secondary photon spectrum and (3) a more realistic primary photon spectrum. The VSM is used to generate the source phase space data above the mini-MLC. The particles are then transmitted through the mini-MLC by a passive filter function, which significantly speeds up the generation of the phase space data after the mini-MLC used for calculation of the dose distribution in the patient. The improved VSM was commissioned for 6 and 15 MV beams. The results of the MC simulation are in very good agreement with measurements. A local difference of less than 2% between the MC simulation and diamond detector measurements of the output factors in water was achieved. The X, Y and Z profiles measured in water with an ion chamber (V = 0.125 cm3) and a diamond detector were used to validate the models. An overall agreement of 2%/2 mm in high dose regions and 3%/2 mm in low dose regions between measurement and MC simulation for field sizes from 0.8 x 0.8 cm2 to 16 x 21 cm2 was achieved. An IMRT plan film verification

  4. Inverse modeling of cloud-aerosol interactions – Part 2: Sensitivity tests on liquid phase clouds using a Markov Chain Monte Carlo based simulation approach

    Directory of Open Access Journals (Sweden)

    D. G. Partridge

    2011-07-01

    Full Text Available This paper presents a novel approach to investigate cloud-aerosol interactions by coupling a Markov Chain Monte Carlo (MCMC) algorithm to a pseudo-adiabatic cloud parcel model. Despite the number of numerical cloud-aerosol sensitivity studies previously conducted, few have used statistical analysis tools to investigate the sensitivity of a cloud model to input aerosol physiochemical parameters. Using synthetic data as observed values of the cloud droplet number concentration (CDNC) distribution, this inverse modelling framework is shown to successfully converge to the correct calibration parameters.

    The employed analysis method provides a new, integrative framework to evaluate the sensitivity of the derived CDNC distribution to the input parameters describing the lognormal properties of the accumulation mode and the particle chemistry. To a large extent, results from prior studies are confirmed, but the present study also provides some additional insightful findings. There is a clear transition from very clean marine Arctic conditions, where the aerosol parameters representing the mean radius and geometric standard deviation of the accumulation mode are found to be most important for determining the CDNC distribution, to very polluted continental environments (aerosol concentration in the accumulation mode >1000 cm−3), where particle chemistry is more important than both number concentration and size of the accumulation mode.

    The competition and compensation between the cloud model input parameters illustrate that if the soluble mass fraction is reduced, the number of particles, the geometric standard deviation and the mean radius of the accumulation mode must all increase in order to achieve the same CDNC distribution.

    For more polluted aerosol conditions, with a reduction in soluble mass fraction the parameter correlation becomes weaker and more non-linear over the range of possible solutions
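
    To make the inverse-modelling idea concrete, the sketch below calibrates the parameters of a toy forward model against synthetic observations using a random-walk Metropolis sampler, the same pattern as coupling an MCMC algorithm to a cloud parcel model. The forward model here is a simple invented placeholder (a response to aerosol number and mean radius across a few assumed updrafts), not the adiabatic parcel model used in the paper, and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
updrafts = np.array([0.1, 0.5, 1.0, 2.0])    # m/s, assumed sampling of updraft velocities

def forward_model(theta):
    """Placeholder for the cloud parcel model: CDNC from (log10 N_accum, mean radius)."""
    log_n, radius = theta
    return 40.0 * log_n * np.sqrt(updrafts) + 800.0 * radius * updrafts

# synthetic "observed" CDNC generated with known true parameters plus noise
theta_true = np.array([2.5, 0.08])           # log10(N [cm-3]), mean radius [um]
sigma_obs = 5.0
obs = forward_model(theta_true) + rng.normal(0.0, sigma_obs, size=updrafts.size)

def log_post(theta):
    if not (1.0 < theta[0] < 4.0 and 0.01 < theta[1] < 0.3):   # flat priors
        return -np.inf
    resid = obs - forward_model(theta)
    return -0.5 * np.sum((resid / sigma_obs) ** 2)

theta = np.array([2.0, 0.15])
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.05, 0.005])              # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
chain = np.array(chain[5000:])               # discard burn-in
print("posterior mean:", chain.mean(axis=0), "true:", theta_true)
```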

  5. Dosimetric verification and clinical evaluation of a new commercially available Monte Carlo-based dose algorithm for application in stereotactic body radiation therapy (SBRT) treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Fragoso, Margarida; Wen Ning; Kumar, Sanath; Liu Dezhi; Ryu, Samuel; Movsas, Benjamin; Munther, Ajlouni; Chetty, Indrin J, E-mail: ichetty1@hfhs.or [Henry Ford Health System, Detroit, MI (United States)

    2010-08-21

    Modern cancer treatment techniques, such as intensity-modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT), have greatly increased the demand for more accurate treatment planning (structure definition, dose calculation, etc) and dose delivery. The ability to use fast and accurate Monte Carlo (MC)-based dose calculations within a commercial treatment planning system (TPS) in the clinical setting is now becoming more of a reality. This study describes the dosimetric verification and initial clinical evaluation of a new commercial MC-based photon beam dose calculation algorithm, within the iPlan v.4.1 TPS (BrainLAB AG, Feldkirchen, Germany). Experimental verification of the MC photon beam model was performed with film and ionization chambers in water phantoms and in heterogeneous solid-water slabs containing bone and lung-equivalent materials for a 6 MV photon beam from a Novalis (BrainLAB) linear accelerator (linac) with a micro-multileaf collimator (m3 MLC). The agreement between calculated and measured dose distributions in the water phantom verification tests was, on average, within 2%/1 mm (high dose/high gradient) and was within ±4%/2 mm in the heterogeneous slab geometries. Example treatment plans in the lung show significant differences between the MC and one-dimensional pencil beam (PB) algorithms within iPlan, especially for small lesions in the lung, where electronic disequilibrium effects are emphasized. Other user-specific features in the iPlan system, such as options to select dose to water or dose to medium, and the mean variance level, have been investigated. Timing results for typical lung treatment plans show the total computation time (including that for processing and I/O) to be less than 10 min for 1-2% mean variance (running on a single PC with 8 Intel Xeon X5355 CPUs, 2.66 GHz). Overall, the iPlan MC algorithm is demonstrated to be an accurate and efficient dose algorithm, incorporating robust tools for MC
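
    Agreement criteria of the form quoted above (e.g. 2%/1 mm, ±4%/2 mm) are commonly evaluated with a gamma index that combines a dose-difference tolerance and a distance-to-agreement tolerance. The sketch below is a minimal 1D global gamma computation for two dose profiles; it is a generic illustration rather than part of the iPlan verification workflow, and the profiles are invented.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.02, dist_tol_mm=2.0):
    """Global 1D gamma index: dose tolerance is a fraction of the reference maximum."""
    d_max = d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dose_term = (d_eval - dr) / (dose_tol * d_max)
        dist_term = (x_eval - xr) / dist_tol_mm
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

# invented profiles: measured vs. calculated dose along one axis (mm, relative dose)
x = np.linspace(-50, 50, 201)
measured   = np.exp(-(x / 30.0) ** 4)
calculated = 1.01 * np.exp(-((x - 0.5) / 30.0) ** 4)

g = gamma_1d(x, measured, x, calculated, dose_tol=0.02, dist_tol_mm=2.0)
print(f"gamma pass rate (gamma <= 1): {100.0 * np.mean(g <= 1.0):.1f}%")
```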

  6. Reducing radiation dose to selected organs by selecting the tube start angle in MDCT helical scans: A Monte Carlo based study

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Di; Zankl, Maria; DeMarco, John J.; Cagnon, Chris H.; Angel, Erin; Turner, Adam C.; McNitt-Gray, Michael F. [David Geffen School of Medicine at UCLA, Los Angeles, California 90024 (United States); German Research Center for Environmental Health (GmbH), Institute of Radiation Protection, Helmholtz Zentrum Muenchen, Ingolstaedter Landstrasse 1, 85764 Neuherberg (Germany); David Geffen School of Medicine at UCLA, Los Angeles, California 90024 (United States)

    2009-12-15

    Purpose: Previous work has demonstrated that there are significant dose variations with a sinusoidal pattern on the periphery of a CTDI 32 cm phantom or on the surface of an anthropomorphic phantom when helical CT scanning is performed, resulting in the creation of "hot" spots or "cold" spots. The purpose of this work was to perform preliminary investigations into the feasibility of exploiting these variations to reduce dose to selected radiosensitive organs solely by varying the tube start angle in CT scans. Methods: Radiation doses to several radiosensitive organs (including breasts, thyroid, uterus, gonads, and eye lenses) resulting from MDCT scans were estimated using Monte Carlo simulation methods on voxelized patient models, including GSF's Baby, Child, and Irene. Dose to the fetus was also estimated using four pregnant female models based on CT images of pregnant patients. Whole-body scans were simulated using 120 kVp, 300 mAs, both 28.8 and 40 mm nominal collimations, and pitch values of 1.5, 1.0, and 0.75 under a wide range of start angles (0°-340° in 20° increments). The relationship between tube start angle and organ dose was examined for each organ, and the potential dose reduction was calculated. Results: Some organs exhibit a strong dose variation, depending on the tube start angle. For small peripheral organs (e.g., the eye lenses of the Baby phantom at pitch 1.5 with 40 mm collimation), the minimum dose can be 41% lower than the maximum dose, depending on the tube start angle. In general, larger dose reductions occur for smaller peripheral organs in smaller patients when wider collimation is used. Pitch 1.5 and pitch 0.75 have different mechanisms of dose reduction. For pitch 1.5 scans, the dose is usually lowest when the tube start angle is such that the x-ray tube is posterior to the patient when it passes the longitudinal location of the organ. For pitch 0.75 scans, the dose is lowest
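
    The dependence on tube start angle arises because, in a helical scan, the angular position of the tube when it passes a given longitudinal (z) location is fixed by the start angle, pitch and beam collimation. The sketch below uses this simple geometric relationship to list, for an assumed organ z-position, which start angles place the tube posterior to the patient at that slice; it is a schematic illustration, not the Monte Carlo dose calculation used in the study, and the scan and organ parameters are assumed.

```python
import numpy as np

def tube_angle_at_z(z_cm, start_angle_deg, pitch, collimation_cm):
    """Tube angle (deg, 0 = anterior) when the helical source passes longitudinal position z.

    Assumes one full rotation advances the table by pitch * collimation.
    """
    table_feed = pitch * collimation_cm               # cm of table travel per rotation
    return (start_angle_deg + 360.0 * z_cm / table_feed) % 360.0

# assumed scan: pitch 1.5, 4 cm nominal collimation, organ (e.g. eye lens) at z = 12 cm
pitch, collimation, z_organ = 1.5, 4.0, 12.0
for start in range(0, 360, 20):
    angle = tube_angle_at_z(z_organ, start, pitch, collimation)
    posterior = 90.0 < angle < 270.0                  # tube behind the patient
    print(f"start {start:3d} deg -> tube at {angle:6.1f} deg at organ slice"
          f" ({'posterior' if posterior else 'anterior'})")
```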

  7. SU-E-T-448: On the Perturbation Factor P-cav of the Markus Parallel Plate Ion Chambers in Clinical Electron Beams, Monte Carlo Based Reintegration of An Historical Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Voigts-Rhetz, P von; Zink, K [Technische Hochschule Mittelhessen - University of Applied Sciences, Giessen, Hessen (Germany)

    2014-06-01

    Purpose: All present dosimetry protocols recommend well-guarded parallel-plate ion chambers for electron dosimetry. For the guard-less Markus chamber an energy-dependent fluence perturbation correction pcav is given. This perturbation correction was experimentally determined by van der Plaetsen by comparing the read-out of a Markus and a NACP chamber, the latter assumed to be “perturbation-free”. The aim of the present study is a Monte Carlo-based reiteration of this experiment. Methods: Detailed models of four parallel-plate chambers (Roos, Markus, NACP and Advanced Markus) were designed using the Monte Carlo code EGSnrc and placed in a water phantom. For all chambers the dose to the active volume filled with low-density water was calculated for 13 clinical electron spectra (E0 = 6-21 MeV) at the depth of maximum dose and at the reference depth under reference conditions. In all cases the chamber's reference point was positioned at the depth of measurement. Moreover, the dose to water DW was calculated in a small water voxel positioned at the same depth. Results: The calculated dose ratio D_NACP/D_Markus, which according to van der Plaetsen reflects the fluence perturbation correction of the Markus chamber, deviates less from unity than the values given by van der Plaetsen but exhibits a similar energy dependence. The same holds for the dose ratios of the other well-guarded chambers. However, in comparison to water, the Markus chamber reveals the smallest overall perturbation correction, which is nearly energy independent at both investigated depths. Conclusion: The simulations principally confirm the energy dependence of the dose ratio D_NACP/D_Markus as published by van der Plaetsen. But, as shown by our simulations of the ratio D_W/D_Markus, the conclusion drawn in all dosimetry protocols is questionable: in contrast to all well-guarded chambers, the guard-less Markus chamber reveals the smallest overall perturbation

  8. SU-E-T-561: Monte Carlo-Based Organ Dose Reconstruction Using Pre-Contoured Human Model for Hodgkins Lymphoma Patients Treated by Cobalt-60 External Beam Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Jung, J; Pelletier, C [East Carolina University, Greenville, NC (United States); Lee, C [University of Michigan, Ann Arbor, MI (United States); Kim, J [University of Pittsburgh Medical Center, Pittsburgh, PA (United States); Pyakuryal, A; Lee, C [National Cancer Institute, Rockville, MD (United States)

    2015-06-15

    Purpose: Organ doses for Hodgkin’s lymphoma patients treated with cobalt-60 radiation were estimated using an anthropomorphic model and Monte Carlo modeling. Methods: A cobalt-60 treatment unit modeled in the BEAMnrc Monte Carlo code was used to produce phase space data. The Monte Carlo simulation was verified against percent depth dose measurements in water at various field sizes. Radiation transport through the lung blocks was modeled by adjusting the weights of the phase space data. We imported a pre-contoured adult female hybrid model and generated a treatment plan. The adjusted phase space data and the human model were imported into the XVMC Monte Carlo code for dose calculation. The organ mean doses were estimated and dose volume histograms were plotted. Results: The percent depth dose agreement between measurement and calculation in the water phantom was within 2% for all field sizes. The mean organ doses of the heart, left breast, right breast, and spleen for the selected case were 44.3, 24.1, 14.6 and 3.4 Gy, respectively, with a midline prescription dose of 40.0 Gy. Conclusion: Organ doses were estimated for the patient group whose three-dimensional images are not available. This development may open the door to more accurate dose reconstruction and estimates of uncertainties in secondary cancer risk for Hodgkin’s lymphoma patients. This work was partially supported by the intramural research program of the National Institutes of Health, National Cancer Institute, Division of Cancer Epidemiology and Genetics.

  9. QWalk: A Quantum Monte Carlo Program for Electronic Structure

    CERN Document Server

    Wagner, Lucas K; Mitas, Lubos

    2007-01-01

    We describe QWalk, a new computational package capable of performing Quantum Monte Carlo electronic structure calculations for molecules and solids with many electrons. We describe the structure of the program and its implementation of Quantum Monte Carlo methods. It is open-source, licensed under the GPL, and available at the web site http://www.qwalk.org

  10. 1993 farming and grazing program plans for Monte Vista NWR

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Plans for farming and grazing at Monte Vista National Wildlife Refuge for 1993. This program will use rotations of small grain, field peas, and legumes as a farming...

  11. SPANDY: a Monte Carlo program for gas target scattering geometry

    Energy Technology Data Exchange (ETDEWEB)

    Jarmie, N.; Jett, J.H.; Niethammer, A.C.

    1977-02-01

    A Monte Carlo computer program is presented that simulates a two-slit gas target scattering geometry. The program is useful in estimating effects due to finite geometry and multiple scattering in the target foil. Details of the program are presented and experience with a specific example is discussed.

  12. Monte Carlo-based assessment of the trade-off between spatial resolution, field-of-view and scattered radiation in the variable resolution X-ray CT scanner

    NARCIS (Netherlands)

    Arabi, Hossein; Asl, Ali Reza Kamali; Ay, Mohammad Reza; Zaidi, Habib

    Objective: The purpose of this work is to evaluate the impact of optimization of magnification on performance parameters of the variable resolution X-ray (VRX) CT scanner. Methods: A realistic model based on an actual VRX CT scanner was implemented in the GATE Monte Carlo simulation platform. To

  13. Monte Carlo-based assessment of the trade-off between spatial resolution, field-of-view and scattered radiation in the variable resolution X-ray CT scanner

    NARCIS (Netherlands)

    Arabi, Hossein; Asl, Ali Reza Kamali; Ay, Mohammad Reza; Zaidi, Habib

    2015-01-01

    Objective: The purpose of this work is to evaluate the impact of optimization of magnification on performance parameters of the variable resolution X-ray (VRX) CT scanner. Methods: A realistic model based on an actual VRX CT scanner was implemented in the GATE Monte Carlo simulation platform. To evalu

  14. Development of ray tracing visualization program by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Otani, Takayuki [Japan Atomic Energy Research Inst., Tokyo (Japan); Hasegawa, Yukihiro

    1997-09-01

    The ray tracing algorithm is a powerful method to synthesize three-dimensional computer graphics. In conventional ray tracing algorithms, a view point is used as the starting point of ray tracing, from which rays are tracked up to the light sources through the center points of pixels on the view screen to calculate the intensities of the pixels. This manner, however, makes it difficult to define the configuration of the light source as well as to strictly simulate the reflections of the rays. To resolve these problems, we have developed a new ray tracing method which traces rays from a light source, not from a view point, using the Monte Carlo method, which is widely applied in nuclear fields. Moreover, we adopted variance reduction techniques in the program, using the specialized machine (Monte-4) for particle transport Monte Carlo, so that the computational time could be successfully reduced. (author)
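
    A minimal sketch of the idea of tracing from the light source rather than from the view point: photons are emitted from a point source in random directions, and those that hit a ground plane deposit energy into an image grid, giving a Monte Carlo estimate of the illumination pattern. This is a generic toy example, not the JAERI program or its Monte-4 implementation, and no variance reduction is applied; the geometry and photon count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# point light source above the ground plane z = 0; accumulate hits on a 64x64 grid
light_pos = np.array([0.0, 0.0, 5.0])
n_photons = 200_000
grid = np.zeros((64, 64))
extent = 10.0                                     # plane covers [-10, 10] in x and y

# sample isotropic directions and keep only downward-going photons
dirs = rng.normal(size=(n_photons, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
dirs = dirs[dirs[:, 2] < 0.0]

# intersect each ray with the plane z = 0
t = -light_pos[2] / dirs[:, 2]
hits = light_pos + t[:, None] * dirs
inside = (np.abs(hits[:, 0]) < extent) & (np.abs(hits[:, 1]) < extent)
ix = ((hits[inside, 0] + extent) / (2 * extent) * grid.shape[0]).astype(int)
iy = ((hits[inside, 1] + extent) / (2 * extent) * grid.shape[1]).astype(int)
np.add.at(grid, (ix, iy), 1.0)                    # tally photon hits per pixel

print("brightest cell count:", grid.max(), "total recorded photons:", int(grid.sum()))
```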

  15. Development of ray tracing visualization program by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Higuchi, Kenji; Otani, Takayuki [Japan Atomic Energy Research Inst., Tokyo (Japan); Hasegawa, Yukihiro

    1997-09-01

    The ray tracing algorithm is a powerful method to synthesize three-dimensional computer graphics. In conventional ray tracing algorithms, a view point is used as the starting point of ray tracing, from which rays are tracked up to the light sources through the center points of pixels on the view screen to calculate the intensities of the pixels. This manner, however, makes it difficult to define the configuration of the light source as well as to strictly simulate the reflections of the rays. To resolve these problems, we have developed a new ray tracing method which traces rays from a light source, not from a view point, using the Monte Carlo method, which is widely applied in nuclear fields. Moreover, we adopted variance reduction techniques in the program, using the specialized machine (Monte-4) for particle transport Monte Carlo, so that the computational time could be successfully reduced. (author)

  16. Monte Carlo Simulation Program from the World Petroleum Assessment 2000, DDS-60 (Emc2.xls)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Monte Carlo programs described in chapter MC, Monte Carlo Simulation Method. Emc2.xls was the program used to calculate the estimates of undiscovered resources for...

  17. Monte Carlo Simulation Program from the World Petroleum Assessment 2000, DDS-60 (emcee.xls)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Monte Carlo programs described in chapter MC, Monte Carlo Simulation Method. Emc2.xls was the program used to calculate the estimates of undiscovered resources for...

  18. Monte Carlo Simulation Program from the World Petroleum Assessment 2000, DDS-60 (Emc2.xls).

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Monte Carlo programs described in chapter MC, Monte Carlo Simulation Method. Emc2.xls was the program used to calculate the estimates of undiscovered resources for...

  19. Monte Carlo Simulation Program from the World Petroleum Assessment 2000, DDS-60 (emcee.xls)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Monte Carlo programs described in chapter MC, Monte Carlo Simulation Method. Emc2.xls was the program used to calculate the estimates of undiscovered resources for...

  20. Feasibility of Diagrammatic Monte-Carlo based on weak-coupling expansion in asymptotically free theories: case study of $O(N)$ sigma-model in the large-$N$ limit

    CERN Document Server

    Buividovich, P V

    2015-01-01

    We discuss the feasibility of applying Diagrammatic Monte-Carlo algorithms to the weak-coupling expansions of asymptotically free quantum field theories, taking the large-$N$ limit of the $O(N)$ sigma-model as the simplest example where exact results are available. We use stereographic mapping from the sphere to the real plane to set up the perturbation theory, which results in a small bare mass term proportional to the coupling $\lambda$. Counting the powers of coupling associated with higher-order interaction vertices, we arrive at the double-series representation for the dynamically generated mass gap in powers of both $\lambda$ and $\log(\lambda)$, which converges quite quickly to the exact non-perturbative answer. We also demonstrate that it is feasible to obtain the coefficients of these double series by a Monte-Carlo sampling in the space of Feynman diagrams. In particular, the sign problem of such sampling becomes milder at small $\lambda$, that is, close to the continuum limit.

  1. A Monte Carlo-based dosimetric study of the GZP 60Co source

    Institute of Scientific and Technical Information of China (English)

    王先良; 袁珂; 唐斌; 康盛伟; 黎杰; 肖明勇; 李晓兰; 李林涛; 王培

    2016-01-01

    Objective: To simulate and calculate the dosimetric parameters of the GZP 60Co source that has been used clinically in high-dose-rate brachytherapy. Methods: The EGSnrc Monte Carlo software was used to simulate and calculate the dosimetric parameters of the well-known BEBIG 60Co source (Co0.A86). The results were compared with the published parameters to verify the feasibility of the method. A Monte Carlo model of the GZP 60Co source for high-dose-rate brachytherapy was then established to simulate and calculate its dosimetric parameters in the same way. Results: For the BEBIG 60Co source, the results were in good agreement with the standard data. The air-kerma strength per unit activity (SK/A) and the dose rate constant (Λ) deviated from the standard by 0.2% and 1.0%, respectively, and the curves of the radial dose function gL(r) and the anisotropy function F(r,θ) matched well. For the GZP 60Co source, the SK/A and Λ values were 3.011×10-7 cGy cm2 h-1 Bq-1 and 1.118 cGy h-1 U-1 for channels 1 and 2, and 3.002×10-7 cGy cm2 h-1 Bq-1 and 1.110 cGy h-1 U-1 for channel 3. The values of gL(r), F(r,θ) and the dose rate per unit air-kerma strength in the water phantom are listed following the AAPM recommendations. Conclusion: The results can be used in treatment planning systems for the GZP 60Co source and for quality control of the GZP 60Co source.
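
    For orientation, brachytherapy source parameters such as SK, Λ, gL(r) and F(r,θ) enter the AAPM TG-43 dose-rate formalism. The sketch below evaluates the simpler point-source form, D(r,θ) = SK·Λ·(r0/r)²·g(r)·F(r,θ), with interpolation of tabulated g and F values; the numerical tables and the assumed SK are invented placeholders, not the GZP or BEBIG data, and the line-source geometry function is deliberately omitted.

```python
import numpy as np

# placeholder TG-43 tables (NOT real source data)
r_tab = np.array([0.5, 1.0, 2.0, 3.0, 5.0])          # cm
g_tab = np.array([1.01, 1.00, 0.98, 0.96, 0.91])      # radial dose function g(r)
theta_tab = np.array([0.0, 30.0, 60.0, 90.0])         # deg
F_tab = np.array([0.92, 0.96, 0.99, 1.00])            # anisotropy function (simplified 1D table)

def dose_rate(r_cm, theta_deg, sk=40000.0, lam=1.11, r0=1.0):
    """TG-43 point-source approximation: dose rate in cGy/h at (r, theta).

    sk  : air-kerma strength [U] (assumed value)
    lam : dose-rate constant [cGy/(h*U)] (assumed value)
    """
    g = np.interp(r_cm, r_tab, g_tab)
    F = np.interp(theta_deg, theta_tab, F_tab)
    return sk * lam * (r0 / r_cm) ** 2 * g * F

print(f"dose rate at r = 2 cm, theta = 90 deg: {dose_rate(2.0, 90.0):.1f} cGy/h")
```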

  2. A comparative study on the risk of second primary cancers in out-of-field organs associated with radiotherapy of localized prostate carcinoma using Monte Carlo-based accelerator and patient models

    Energy Technology Data Exchange (ETDEWEB)

    Bednarz, Bryan; Athar, Basit; Xu, X. George [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02108 and Department of Mechanical Aerospace and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy, New York 12180 (United States)

    2010-05-15

    Purpose: A physician's decision regarding an ideal treatment approach (i.e., radiation, surgery, and/or hormonal) for prostate carcinoma is traditionally based on a variety of metrics. One of these metrics is the risk of radiation-induced second primary cancer following radiation treatments. The aim of this study was to investigate the significance of second cancer risks in out-of-field organs from 3D-CRT and IMRT treatments of prostate carcinoma compared to baseline cancer risks in these organs. Methods: Monte Carlo simulations were performed using a detailed medical linear accelerator model and an anatomically realistic adult male whole-body phantom. A four-field box treatment, a four-field box treatment plus a six-field boost, and a seven-field IMRT treatment were simulated. Using BEIR VII risk models, the age-dependent lifetime attributable risks to various organs outside the primary beam with a known predilection for cancer were calculated using organ-averaged equivalent doses. Results: The four-field box treatment had the lowest treatment-related second primary cancer risks to organs outside the primary beam, ranging from 7.3×10-9 to 2.54×10-5 %/MU depending on the patient's age at exposure and the second primary cancer site. The risks to organs outside the primary beam from the four-field box plus six-field boost and from the seven-field IMRT were nearly equivalent. The risks from the four-field box plus six-field boost ranged from 1.39×10-8 to 1.80×10-5 %/MU, and those from the seven-field IMRT ranged from 1.60×10-9 to 1.35×10-5 %/MU. The second cancer risks in all organs considered were below the baseline risks for each plan. Conclusions: The treatment-related second cancer risks in organs outside the primary beam due to 3D-CRT and IMRT are small. New risk assessment techniques need to be investigated to address the concern of radiation-induced second cancers from prostate treatments, particularly focusing on risks to organs inside the

  3. A comparative study on the risk of second primary cancers in out-of-field organs associated with radiotherapy of localized prostate carcinoma using Monte Carlo-based accelerator and patient models

    Science.gov (United States)

    Bednarz, Bryan; Athar, Basit; Xu, X. George

    2010-01-01

    Purpose: A physician’s decision regarding an ideal treatment approach (i.e., radiation, surgery, and∕or hormonal) for prostate carcinoma is traditionally based on a variety of metrics. One of these metrics is the risk of radiation-induced second primary cancer following radiation treatments. The aim of this study was to investigate the significance of second cancer risks in out-of-field organs from 3D-CRT and IMRT treatments of prostate carcinoma compared to baseline cancer risks in these organs. Methods: Monte Carlo simulations were performed using a detailed medical linear accelerator model and an anatomically realistic adult male whole-body phantom. A four-field box treatment, a four-field box treatment plus a six-field boost, and a seven-field IMRT treatment were simulated. Using BEIR VII risk models, the age-dependent lifetime attributable risks to various organs outside the primary beam with a known predilection for cancer were calculated using organ-averaged equivalent doses. Results: The four-field box treatment had the lowest treatment-related second primary cancer risks to organs outside the primary beam, ranging from 7.3×10−9 to 2.54×10−5%∕MU depending on the patient's age at exposure and the second primary cancer site. The risks to organs outside the primary beam from the four-field box plus six-field boost and from the seven-field IMRT were nearly equivalent. The risks from the four-field box plus six-field boost ranged from 1.39×10−8 to 1.80×10−5%∕MU, and those from the seven-field IMRT ranged from 1.60×10−9 to 1.35×10−5%∕MU. The second cancer risks in all organs considered were below the baseline risks for each plan. Conclusions: The treatment-related second cancer risks in organs outside the primary beam due to 3D-CRT and IMRT are small. New risk assessment techniques need to be investigated to address the concern of radiation-induced second cancers from prostate treatments, particularly focusing on risks to organs inside the primary beam

  4. Monte Carlo Based Toy Model for Fission Process

    CERN Document Server

    Kurniadi, R; Viridi, S

    2014-01-01

    Fission yields have traditionally been calculated by two approaches, a macroscopic approach and a microscopic approach. This work proposes another calculation approach in which the nucleus is treated as a toy model. The toy model of fission yield is a preliminary method that uses random numbers as the backbone of the calculation. Because the nucleus is treated as a toy model, the fission process does not completely represent the real fission process in nature. A fission event is modeled by one random number, which is taken as the width of the probability distribution of nucleon positions in the compound nucleus when the fission process starts. The toy model is built from Gaussian-distributed random numbers that randomize the distances between particles and a central point. The scission process starts by splitting the compound-nucleus central point into two parts, a left central point and a right central point. These three points have different Gaussian distribution parameters such as the means (μCN, μL, μR) and standard d
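
    A loose numerical reading of the toy model described above: nucleon positions are drawn from a Gaussian centred on the compound nucleus, a scission point splits them into left and right groups, and the resulting fragment masses build a yield histogram. The specific parameter values and the mass-assignment rule below are assumptions made for illustration and do not reproduce the authors' model.

```python
import numpy as np

rng = np.random.default_rng(42)

A_CN = 236                    # nucleons in the compound nucleus (e.g. 235U + n)
mu_cn, sigma_cn = 0.0, 1.0    # Gaussian for nucleon positions (arbitrary units)
n_events = 50_000

light_fragments = np.empty(n_events, dtype=int)
for i in range(n_events):
    # one random number per event sets the width of the nucleon position distribution
    width = abs(rng.normal(sigma_cn, 0.2))
    positions = rng.normal(mu_cn, width, size=A_CN)
    # scission: split the compound nucleus at a randomly shifted central point
    scission_point = rng.normal(mu_cn, 0.3 * width)
    a_left = int(np.sum(positions < scission_point))
    light_fragments[i] = min(a_left, A_CN - a_left)   # record the lighter fragment

counts, edges = np.histogram(light_fragments, bins=np.arange(60, 121))
print("most probable light-fragment mass:", edges[np.argmax(counts)])
```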

  5. Development and evaluation of Monte Carlo-based SPECT reconstruction

    NARCIS (Netherlands)

    Xiao, J.

    2009-01-01

    Single Photon Emission Computed Tomography (SPECT) is one of the most applied molecular imaging techniques to diagnose human diseases, e.g., of the heart, the brain or in oncology. For example, cardiac SPECT imaging plays a central role in diagnosing coronary heart diseases by providing clinicians w

  6. A Markov Chain Monte Carlo Based Method for System Identification

    Energy Technology Data Exchange (ETDEWEB)

    Glaser, R E; Lee, C L; Nitao, J J; Hanley, W G

    2002-10-22

    This paper describes a novel methodology for the identification of mechanical systems and structures from vibration response measurements. It combines prior information, observational data and predictive finite element models to produce configurations and system parameter values that are most consistent with the available data and model. Bayesian inference and a Metropolis simulation algorithm form the basis for this approach. The resulting process enables the estimation of distributions of both individual parameters and system-wide states. Attractive features of this approach include its ability to: (1) provide quantitative measures of the uncertainty of a generated estimate; (2) function effectively when exposed to degraded conditions including: noisy data, incomplete data sets and model misspecification; (3) allow alternative estimates to be produced and compared, and (4) incrementally update initial estimates and analysis as more data becomes available. A series of test cases based on a simple fixed-free cantilever beam is presented. These results demonstrate that the algorithm is able to identify the system, based on the stiffness matrix, given applied force and resultant nodal displacements. Moreover, it effectively identifies locations on the beam where damage (represented by a change in elastic modulus) was specified.
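
    As a compact analogue of the described Bayesian identification approach, the sketch below estimates the elastic modulus of a fixed-free cantilever from noisy tip-deflection data with a Metropolis sampler, using the standard tip-deflection formula δ = F·L³/(3·E·I) as the predictive model in place of a finite element model. Beam dimensions, load and noise level are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# cantilever beam (assumed geometry and load)
L, I, F = 1.0, 8.3e-9, 100.0          # length [m], second moment of area [m^4], tip force [N]
E_true = 70e9                          # Pa (aluminium-like)
sigma_noise = 5e-5                     # measurement noise on deflection [m]

def tip_deflection(E):
    return F * L**3 / (3.0 * E * I)

# synthetic measurements of the tip displacement
data = tip_deflection(E_true) + rng.normal(0.0, sigma_noise, size=20)

def log_post(log10_E):
    if not (9.0 < log10_E < 12.0):                       # broad flat prior on log10(E)
        return -np.inf
    resid = data - tip_deflection(10.0 ** log10_E)
    return -0.5 * np.sum((resid / sigma_noise) ** 2)

x = 10.5                                                 # initial guess: E = 10^10.5 Pa
chain = []
for _ in range(30000):
    prop = x + rng.normal(0.0, 0.02)                     # random-walk Metropolis proposal
    if np.log(rng.random()) < log_post(prop) - log_post(x):
        x = prop
    chain.append(x)
E_samples = 10.0 ** np.array(chain[10000:])              # discard burn-in
print(f"posterior mean E = {E_samples.mean():.3e} Pa (true {E_true:.3e} Pa)")
```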

  7. A Monte Carlo Based Analysis of Optimal Design Criteria

    Science.gov (United States)

    2011-11-09

    The analysis used MATLAB's fmincon or SolvOpt, developed by A. Kuntsevich and F. Kappel, with four variations of the constraint implementation.

  8. A Monte Carlo based spent fuel analysis safeguards strategy assessment

    Energy Technology Data Exchange (ETDEWEB)

    Fensin, Michael L [Los Alamos National Laboratory; Tobin, Stephen J [Los Alamos National Laboratory; Swinhoe, Martyn T [Los Alamos National Laboratory; Menlove, Howard O [Los Alamos National Laboratory; Sandoval, Nathan P [Los Alamos National Laboratory

    2009-01-01

    Safeguarding nuclear material involves the detection of diversions of significant quantities of nuclear materials, and the deterrence of such diversions by the risk of early detection. There are a variety of motivations for quantifying plutonium in spent fuel assemblies by means of nondestructive assay (NDA), including the following: strengthening the International Atomic Energy Agency's ability to safeguard nuclear facilities, shipper/receiver difference, input accountability at reprocessing facilities, and burnup credit at repositories. Many NDA techniques exist for measuring signatures from spent fuel; however, no single NDA technique can, in isolation, quantify elemental plutonium and other actinides of interest in spent fuel. A study has been undertaken to determine the best integrated combination of cost-effective techniques for quantifying plutonium mass in spent fuel for nuclear safeguards. A standardized assessment process was developed to compare the effective merits and faults of 12 different detection techniques in order to integrate a few techniques and to down-select among the techniques in preparation for experiments. The process involves generating a basis burnup/enrichment/cooling-time-dependent spent fuel assembly library, creating diversion scenarios, developing detector models and quantifying the capability of each NDA technique. Because hundreds of input and output files must be managed in coupling the data transitions for the different facets of the assessment process, a graphical user interface (GUI) was developed that automates the process. This GUI allows users to visually create diversion scenarios with varied replacement materials, and to generate a MCNPX fixed-source detector assessment input file. The end result of the assembly library assessment is to select a set of common source terms and diversion scenarios for quantifying the capability of each of the 12 NDA techniques. We present here the generalized assessment process, the techniques employed to automate the coupled facets of the assessment process, and the standard burnup/enrichment/cooling-time-dependent spent fuel assembly library. We also clearly define the diversion scenarios that will be analyzed during the standardized assessments. Though this study is currently limited to generic PWR assemblies, it is expected that the results of the assessment will yield adequate knowledge of spent fuel analysis strategies to help the down-select process for other reactor types.

  9. Hardware acceleration of Monte Carlo-based simulations

    OpenAIRE

    Echeverría Aramendi, Pedro

    2011-01-01

    During the last years there has been an enormous advance in FPGAs. Traditionally, FPGAs have been used mainly for prototyping as they offer significant advantages at a suitable low cost: flexibility and verification easiness. Their flexibility allows the implementation of different generations of a given application and provides space to designers to modify implementations until the very last moment, or even correct mistakes once the product has been released. Second, the verification of a de...

  10. Benchmarking of Proton Transport in Super Monte Carlo Simulation Program

    Science.gov (United States)

    Wang, Yongfeng; Li, Gui; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Wu, Yican

    2014-06-01

    The Monte Carlo (MC) method has traditionally been applied in nuclear design and analysis due to its capability of dealing with complicated geometries and multi-dimensional physics problems as well as obtaining accurate results. The Super Monte Carlo Simulation Program (SuperMC) is developed by the FDS Team in China for fusion, fission, and other nuclear applications. Simulations of radiation transport, isotope burn-up, material activation, radiation dose, and biological damage can be performed using SuperMC. Complicated geometries and the whole physical process of various types of particles over a broad energy range can be well handled. Bi-directional automatic conversion between general CAD models and fully formed SuperMC input files is supported by MCAM, a CAD/image-based automatic modeling program for neutronics and radiation transport simulation. Mixed visualization of dynamic 3D data sets and geometry models is supported by RVIS, a nuclear radiation virtual simulation and assessment system. Continuous-energy cross section data from the hybrid evaluated nuclear data library HENDL are utilized to support the simulation. Neutronics calculations of fixed-source and criticality design parameters for reactors of complex geometry and material distribution, based on the transport of neutrons and photons, were already achieved in the former version of SuperMC. Recently, proton transport has also been integrated into SuperMC for energies up to 10 GeV. The physical processes considered for proton transport include electromagnetic and hadronic processes. The electromagnetic processes include ionization, multiple scattering, bremsstrahlung, and pair production. Public evaluated data from HENDL are used in some electromagnetic processes. In hadronic physics, the Bertini intra-nuclear cascade model with excitons, a preequilibrium model, a nucleus explosion model, a fission model, and an evaporation model are incorporated to treat the intermediate energy nuclear

  11. A new Monte Carlo based measure error analysis and station position optimization method for photoelectrical tracking system

    Institute of Scientific and Technical Information of China (English)

    张振铎; 张彬

    2012-01-01

    A new measurement error analysis method for photoelectric tracking and measurement systems, based on the Monte Carlo method, is presented. An accurate Verilog-A model of the photoelectric theodolite is established through coordinate transformation, including collimation error, vertical-axis error, horizontal-axis error, sensor-induced error and encoder-induced error. Worst-case analysis and Monte Carlo analysis are used to calculate the effect of the various error sources on system performance. The station placement for dual-station intersection measurement is also optimized: taking into account both the theodolite error sources and the station position errors, the Monte Carlo method is used to compute the optimal station placement for a specific ballistic trajectory. The method provides useful guidance for the design of photoelectric tracking and measurement systems.
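
    As an illustration of the kind of Monte Carlo error propagation this record describes (not the authors' Verilog-A model), the Python sketch below perturbs the azimuth readings of two hypothetical theodolite stations and propagates the errors through a dual-station intersection; the station coordinates, target position and 5-arcsecond angular noise are assumed values.

        import numpy as np

        rng = np.random.default_rng(1)

        # Two stations and a target in a horizontal plane (units: m); the geometry is arbitrary.
        s1, s2 = np.array([0.0, 0.0]), np.array([5000.0, 0.0])
        target = np.array([2000.0, 8000.0])
        sigma_angle = np.deg2rad(5.0 / 3600.0)   # 5 arc-second angular error per theodolite

        def intersect(a1, a2):
            """Intersect the lines of sight defined by azimuths a1 (from s1) and a2 (from s2)."""
            d1 = np.array([np.sin(a1), np.cos(a1)])
            d2 = np.array([np.sin(a2), np.cos(a2)])
            A = np.column_stack((d1, -d2))
            t = np.linalg.solve(A, s2 - s1)
            return s1 + t[0] * d1

        a1_true = np.arctan2(*(target - s1))     # azimuths measured from north (+y)
        a2_true = np.arctan2(*(target - s2))

        # Monte Carlo propagation of the angular measurement errors
        errors = []
        for _ in range(10000):
            p = intersect(a1_true + rng.normal(0.0, sigma_angle),
                          a2_true + rng.normal(0.0, sigma_angle))
            errors.append(np.linalg.norm(p - target))

        print(f"RMS position error: {np.sqrt(np.mean(np.square(errors))):.3f} m")

    Repeating the same propagation for candidate station positions and comparing the resulting RMS errors is, in essence, the station-placement optimization the abstract describes.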

  12. Monte Carlo direct view factor and generalized radiative heat transfer programs

    Science.gov (United States)

    Mc Williams, J. L.; Scates, J. H.

    1969-01-01

    Computer programs find the direct view factor from one surface segment to another using the Monte Carlo technique, and the radiative-transfer coefficients between surface segments. An advantage of the programs is the great generality of the problems they can treat and the rapidity of solution from problem conception to receipt of results.
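
    A minimal Python sketch of the Monte Carlo view-factor idea (not the NASA programs themselves): rays are emitted from random points on one unit square with cosine-weighted directions and counted if they strike a parallel, coaxial unit square one unit away. The geometry is assumed for the example.

        import numpy as np

        rng = np.random.default_rng(2)

        # Two unit squares, parallel and coaxial, separated by h (assumed example geometry).
        h, n_rays = 1.0, 500000

        # Random emission points on the lower square (z = 0)
        x0 = rng.uniform(-0.5, 0.5, n_rays)
        y0 = rng.uniform(-0.5, 0.5, n_rays)

        # Cosine-weighted (Lambertian) emission directions about the surface normal +z
        phi = rng.uniform(0.0, 2.0 * np.pi, n_rays)
        sin_t = np.sqrt(rng.uniform(0.0, 1.0, n_rays))   # sin(theta) = sqrt(U) gives cosine weighting
        cos_t = np.sqrt(1.0 - sin_t ** 2)

        # Intersection of each ray with the plane z = h of the upper square
        t = h / cos_t
        x = x0 + t * sin_t * np.cos(phi)
        y = y0 + t * sin_t * np.sin(phi)
        hits = np.count_nonzero((np.abs(x) <= 0.5) & (np.abs(y) <= 0.5))

        # The analytic view factor for this geometry is close to 0.20; the estimate converges to it.
        print(f"estimated view factor F12 = {hits / n_rays:.4f}")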

  13. SPHINX v 1.1 Monte Carlo Program for Polarized Nucleon-Nucleon Collisions (update)

    CERN Document Server

    Güllenstern, Stefan; Martin, Oliver; Górnicki, Pawel; Mankiewicz, Lech; Schäfer, Andreas

    1996-01-01

    We present the updated long write-up for version 1.1 of the SPHINX Monte Carlo. The program can be used to simulate polarized nucleon - nucleon collisions at high energies. Spins of colliding particles are taken into account. The program allows the calculation of cross sections for various processes.

  14. Monte Carlo programs and other utilities for high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Palounek, A.P.T. (Lawrence Berkeley Lab., CA (USA)); Youssef, S. (Florida State Univ., Tallahassee, FL (USA). Supercomputer Computations Research Inst.)

    1990-05-01

    The Software Standards and Documentation Group of the Workshop on Physics and Detector Simulation for SSC Experiments has compiled a list of physics generators, detector simulations, and related programs. This is not meant to be an exhaustive compilation, nor is any judgment made about program quality; it is a starting point for a more complete bibliography. Where possible we have included an author and source for the code. References for most programs are in the final section.

  15. Computer program uses Monte Carlo techniques for statistical system performance analysis

    Science.gov (United States)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
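
    A small Python sketch of the same idea, illustration only and not the 1967 program itself: disturbances of three hypothetical components are drawn from their full statistical distributions and combined into an overall performance figure (here an additive pointing error); every distribution and threshold below is an assumption made for the example.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100000

        # Hypothetical unit: three components whose individual errors add into a pointing error (mrad).
        gimbal   = rng.normal(0.0, 0.30, n)        # Gaussian misalignment
        sensor   = rng.uniform(-0.25, 0.25, n)     # uniformly distributed bias
        actuator = rng.triangular(-0.4, 0.0, 0.4, n)

        total = gimbal + sensor + actuator         # simple additive error model

        print(f"mean = {total.mean():+.4f} mrad, std = {total.std():.4f} mrad")
        print(f"99th-percentile |error| = {np.percentile(np.abs(total), 99):.4f} mrad")
        print(f"P(|error| > 1 mrad) = {np.mean(np.abs(total) > 1.0):.4%}")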

  16. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    Energy Technology Data Exchange (ETDEWEB)

    Alioli, Simone [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Nason, Paolo [INFN, Milano-Bicocca (Italy); Oleari, Carlo [INFN, Milano-Bicocca (Italy); Milano-Bicocca Univ. (Italy); Re, Emanuele [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology

    2010-02-15

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized, and a description of what a user should provide in order to use it. (orig.)

  17. Toward a Monte Carlo program for simulating vapor-liquid phase equilibria from first principles

    Energy Technology Data Exchange (ETDEWEB)

    McGrath, M; Siepmann, J I; Kuo, I W; Mundy, C J; Vandevondele, J; Sprik, M; Hutter, J; Mohamed, F; Krack, M; Parrinello, M

    2004-10-20

    Efficient Monte Carlo algorithms are combined with the Quickstep energy routines of CP2K to develop a program that allows for Monte Carlo simulations in the canonical, isobaric-isothermal, and Gibbs ensembles using a first principles description of the physical system. Configurational-bias Monte Carlo techniques and pre-biasing using an inexpensive approximate potential are employed to increase the sampling efficiency and to reduce the frequency of expensive ab initio energy evaluations. The new Monte Carlo program has been validated through extensive comparison with molecular dynamics simulations using the programs CPMD and CP2K. Preliminary results for the vapor-liquid coexistence properties (T = 473 K) of water using the Becke-Lee-Yang-Parr exchange and correlation energy functionals, a triple-zeta valence basis set augmented with two sets of d-type or p-type polarization functions, and Goedecker-Teter-Hutter pseudopotentials are presented. The preliminary results indicate that this description of water leads to an underestimation of the saturated liquid density and heat of vaporization and, correspondingly, an overestimation of the saturated vapor pressure.

  18. A Proposal for a Standard Interface Between Monte Carlo Tools And One-Loop Programs

    Energy Technology Data Exchange (ETDEWEB)

    Binoth, T.; /Edinburgh U.; Boudjema, F.; /Annecy, LAPP; Dissertori, G.; Lazopoulos, A.; /Zurich, ETH; Denner, A.; /PSI, Villigen; Dittmaier, S.; /Freiburg U.; Frederix, R.; Greiner, N.; Hoeche, Stefan; /Zurich U.; Giele, W.; Skands, P.; Winter, J.; /Fermilab; Gleisberg, T.; /SLAC; Archibald, J.; Heinrich, G.; Krauss, F.; Maitre, D.; /Durham U., IPPP; Huber, M.; /Munich, Max Planck Inst.; Huston, J.; /Michigan State U.; Kauer, N.; /Royal Holloway, U. of London; Maltoni, F.; /Louvain U., CP3 /Milan Bicocca U. /INFN, Turin /Turin U. /Granada U., Theor. Phys. Astrophys. /CERN /NIKHEF, Amsterdam /Heidelberg U. /Oxford U., Theor. Phys.

    2011-11-11

    Many highly developed Monte Carlo tools for the evaluation of cross sections based on tree matrix elements exist and are used by experimental collaborations in high energy physics. As the evaluation of one-loop matrix elements has recently been undergoing enormous progress, the combination of one-loop matrix elements with existing Monte Carlo tools is on the horizon. This would lead to phenomenological predictions at the next-to-leading order level. This note summarises the discussion of the next-to-leading order multi-leg (NLM) working group on this issue which has been taking place during the workshop on Physics at TeV Colliders at Les Houches, France, in June 2009. The result is a proposal for a standard interface between Monte Carlo tools and one-loop matrix element programs.

  19. A proposal for a standard interface between Monte Carlo tools and one-loop programs

    Energy Technology Data Exchange (ETDEWEB)

    Binoth, T.; Boudjema, F.; Dissertori, G.; Lazopoulos, A.; Denner, A.; Dittmaier, S.; Frederix, R.; Greiner, N.; Hoche, S.; Giele, W.; Skands, P.

    2010-01-01

    Many highly developed Monte Carlo tools for the evaluation of cross sections based on tree matrix elements exist and are used by experimental collaborations in high energy physics. As the evaluation of one-loop matrix elements has recently been undergoing enormous progress, the combination of one-loop matrix elements with existing Monte Carlo tools is on the horizon. This would lead to phenomenological predictions at the next-to-leading order level. This note summarizes the discussion of the next-to-leading order multi-leg (NLM) working group on this issue which has been taking place during the workshop on Physics at TeV colliders at Les Houches, France, in June 2009. The result is a proposal for a standard interface between Monte Carlo tools and one-loop matrix element programs.

  20. Quantum Monte Carlo programming for atoms, molecules, clusters, and solids

    CERN Document Server

    Schattke, Wolfgang

    2013-01-01

    In one source, this textbook provides quick and comprehensive access to quantitative calculations in materials science. The authors address both newcomers and researchers who would like to become familiar with QMC in order to apply it in their research. As such, they cover the basic theory required for applying the method and describe how to transfer this knowledge into calculations. The book includes a series of problems of increasing difficulty with associated stand-alone programs, which will be available for free download.

  1. A Monte Carlo study of temperature-programmed desorption spectra with attractive lateral interactions

    CERN Document Server

    Jansen, A P J

    1995-01-01

    We present results of a Monte Carlo study of temperature-programmed desorption in a model system with attractive lateral interactions. It is shown that even for weak interactions there are large shifts of the peak maximum temperatures with initial coverage. The system has a transition temperature below which the desorption has a negative order. An analytical expression for this temperature is derived. The relation between the model and real systems is discussed.

  2. The research program of the Liquid Scintillation Detector (LSD) in the Mont Blanc Laboratory

    Science.gov (United States)

    Dadykin, V. L.; Yakushev, V. F.; Korchagin, P. V.; Korchagin, V. B.; Malgin, A. S.; Ryassny, F. G.; Ryazhskaya, O. G.; Talochkin, V. P.; Zatsepin, G. T.; Badino, G.

    1985-01-01

    A massive (90 tons) liquid scintillation detector (LSD) has been running since October 1984 in the Mont Blanc Laboratory at a depth of 5,200 hg/sq cm of standard rock. The research program of the experiment covers a variety of topics in particle physics and astrophysics. The performance of the detector and the main fields of research are presented, and the preliminary results are discussed.

  3. TITAN: a computer program for accident occurrence frequency analyses by component Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Yasushi [Department of Fuel Cycle Safety Research, Nuclear Safety Research Center, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Tamaki, Hitoshi [Department of Safety Research Technical Support, Tokai Research Establishment, Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan); Kanai, Shigeru [Fuji Research Institute Corporation, Tokyo (Japan)

    2000-04-01

    In a plant system consisting of complex equipment and components, such as a reprocessing facility, there may be a grace time between an initiating event and a resulting serious accident, allowing operating personnel to take remedial actions and thus terminate the ongoing accident sequence. The component Monte Carlo simulation computer program TITAN has been developed to analyze such complex reliability models, including the grace time, without difficulty and to obtain an accident occurrence frequency. First, the basic methods of the component Monte Carlo simulation for obtaining an accident occurrence frequency are introduced, and then the basic performance, such as precision, convergence and parallelization of the calculation, is shown through calculation of a prototype accident sequence model. As an example to illustrate applicability to a real-scale plant model, a red oil explosion in a model of a German reprocessing plant is simulated to show that TITAN can give an accident occurrence frequency with relatively good accuracy. Moreover, results of uncertainty analyses by TITAN are presented to show a further aspect of its performance, and a proposal is made for a new input-data format adapted to the component Monte Carlo simulation. The present paper describes the calculational method, its performance, its applicability to a real-scale plant, and the new proposal for the TITAN code. In the appendixes, a conventional analytical method for obtaining a strict solution of the accident occurrence frequency is shown for comparison with the Monte Carlo method, which avoids such complex and laborious calculation. The user's manual and the list/structure of the program are also contained in the appendixes to facilitate use of the TITAN computer program. (author)
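
    The following toy Python sketch (not TITAN itself) illustrates how a component-level Monte Carlo can fold a grace time into an accident occurrence frequency: initiating events are sampled per history, and only those whose remedial action is not completed within the grace time count as accidents. The event rate, grace time and repair-time distribution are assumed values.

        import numpy as np

        rng = np.random.default_rng(4)

        lambda_init = 0.1        # initiating events per year (assumed)
        grace_time  = 8.0        # hours available for recovery (assumed)
        mttr        = 4.0        # mean time to complete the remedial action, exponential (hours)
        mission     = 1.0        # mission time per history (years)
        n_hist      = 100000

        accidents = 0
        for _ in range(n_hist):
            n_events = rng.poisson(lambda_init * mission)
            for _ in range(n_events):
                recovery = rng.exponential(mttr)
                if recovery > grace_time:    # recovery not finished in time: the sequence completes
                    accidents += 1
                    break                    # count at most one accident per history

        freq = accidents / (n_hist * mission)
        print(f"estimated accident occurrence frequency: {freq:.2e} per year")
        # first-order analytic check for small frequencies: lambda * P(recovery > grace time)
        print(f"analytic check: {lambda_init * np.exp(-grace_time / mttr):.2e} per year")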

  4. Monte Carlo techniques in radiation therapy

    CERN Document Server

    Verhaegen, Frank

    2013-01-01

    Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book, the first of its kind, addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...

  5. Quantum Monte-Carlo programming for atoms, molecules, clusters, and solids

    Energy Technology Data Exchange (ETDEWEB)

    Schattke, Wolfgang [Kiel Univ. (Germany). Inst. of Theoretical Physics and Astrophysics; Ikerbasque Foundation/Donostia International Physics Center, San Sebastian (Spain); Diez Muino, Ricardo [Centro de Fisica de Materiales CSIC-UPV/EHU (Spain); Donostia International Physics Center, San Sebastian (Spain)

    2013-11-01

    This is a book that initiates the reader into the basic concepts and practical applications of Quantum Monte Carlo. Because of the simplicity of its theoretical concept, the authors focus on the variational Quantum Monte Carlo scheme. The reader is enabled to proceed from simple examples such as the hydrogen atom to advanced ones such as the lithium solid. In between, several intermediate steps are introduced, including the hydrogen molecule (2 electrons) and the lithium atom (3 electrons), before expanding to an arbitrary number of electrons to finally treat the three-dimensional periodic array of lithium atoms in a crystal. The book is unique because it provides both theory and numerical programs. It pedagogically explains how to transfer into computational tools what is usually described in a theoretical textbook. It also includes the detailed physical understanding of the methodology that cannot be found in a code manual. The combination of both aspects allows the reader to assimilate the fundamentals of Quantum Monte Carlo not only by reading but also by practice.

  6. Development of ANJOYMC Program for Automatic Generation of Monte Carlo Cross Section Libraries

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kang Seog; Lee, Chung Chan

    2007-03-15

    The NJOY code, developed at Los Alamos National Laboratory, generates cross section libraries in ACE format for Monte Carlo codes such as MCNP and McCARD by processing evaluated nuclear data in ENDF/B format. It takes a long time to prepare all the NJOY input files for hundreds of nuclides at various temperatures, and errors can creep into the input files. In order to solve these problems, the ANJOYMC program has been developed. From a simple user input deck, the program not only generates all the NJOY input files automatically but also generates a batch file to perform all the NJOY calculations. The ANJOYMC program is written in Fortran 90 and can be executed under the Windows and Linux operating systems on a personal computer. Cross section libraries in ACE format can thus be generated quickly and without errors from a simple user input deck.
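
    The sketch below illustrates, in Python rather than Fortran 90, the kind of automation the record describes: a simple user deck (a nuclide list and a temperature list) is expanded into one input file per case plus a batch script. The deck template is a simplified placeholder written for this example and is not a complete or validated NJOY input; the MAT numbers, suffixes and file names are likewise only illustrative.

        from pathlib import Path

        # Simplified placeholder deck; a production NJOY input would carry many more cards/options.
        TEMPLATE = """moder
        20 -21
        reconr
        -21 -22
        'pendf for {nuclide}'/
        {mat} 0/
        0.001/
        0/
        broadr
        -21 -22 -23
        {mat} 1/
        0.001/
        {temperature}/
        0/
        acer
        -21 -23 0 30 31
        1 0 1 .{suffix}/
        '{nuclide} at {temperature} K'/
        {mat} {temperature}/
        stop
        """

        nuclides = {"U235": 9228, "U238": 9237, "H1": 125}   # nuclide -> ENDF MAT number (examples)
        temperatures = [293.6, 600.0, 900.0]

        batch_lines = []
        for nuc, mat in nuclides.items():
            for i, temp in enumerate(temperatures):
                name = f"njoy_{nuc}_{int(temp)}K.inp"
                Path(name).write_text(
                    TEMPLATE.format(nuclide=nuc, mat=mat, temperature=temp, suffix=f"{i:02d}c"))
                batch_lines.append(f"njoy < {name} > {name}.log")

        # One batch file drives all the generated cases in sequence.
        Path("run_all.sh").write_text("\n".join(batch_lines) + "\n")
        print(f"wrote {len(batch_lines)} NJOY inputs and run_all.sh")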

  7. STARlight: A Monte Carlo simulation program for ultra-peripheral collisions of relativistic ions

    Science.gov (United States)

    Klein, Spencer R.; Nystrand, Joakim; Seger, Janet; Gorbunov, Yuri; Butterworth, Joey

    2017-03-01

    Ultra-peripheral collisions (UPCs) have been a significant source of study at RHIC and the LHC. In these collisions, the two colliding nuclei interact electromagnetically, via two-photon or photonuclear interactions, but not hadronically; they effectively miss each other. Photonuclear interactions produce vector meson states or more general photonuclear final states, while two-photon interactions can produce lepton or meson pairs, or single mesons. In these interactions, the collision geometry plays a major role. We present a program, STARlight, that calculates the cross-sections for a variety of UPC final states and also creates, via Monte Carlo simulation, events for use in determining detector efficiency.

  8. STARlight: A Monte Carlo simulation program for ultra-peripheral collisions of relativistic ions

    CERN Document Server

    Klein, Spencer R; Seger, Janet; Gorbunov, Yuri; Butterworth, Joey

    2016-01-01

    Ultra-peripheral collisions (UPCs) have been a significant source of study at RHIC and the LHC. In these collisions, the two colliding nuclei interact electromagnetically, via two-photon or photonuclear interactions, but not hadronically; they effectively miss each other. Photonuclear interactions produce vector meson states or more general photonuclear final states, while two-photon interactions can produce lepton or meson pairs, or single mesons. In these interactions, the collision geometry plays a major role. We present a program, STARlight, that calculates the cross-sections for a variety of UPC final states and also creates, via Monte Carlo simulation, events for use in determining detector efficiency.

  9. New techniques in Monte Carlo simulation: experience with a prototype of generic programming application to Geant4 physics processes

    CERN Document Server

    Pia, Maria Grazia; Begalli, Marcia; Quintieri, Lina; Saracco, Paolo; Sudhakar, Manju; Weidenspointner, Georg; Zoglauer, Andreas

    2010-01-01

    An investigation is in progress to evaluate extensively and quantitatively the possible benefits and drawbacks of new programming paradigms in a Monte Carlo simulation environment, namely in the domain of physics modeling. The prototype design and extensive benchmarks, including a variety of rigorous quantitative metrics, are presented. The results of this research project allow the evaluation of new software techniques for their possible adoption in Monte Carlo simulation on objective, quantitative ground.

  10. Pareto Optimal Solutions for Stochastic Dynamic Programming Problems via Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    R. T. N. Cardoso

    2013-01-01

    Full Text Available A heuristic algorithm is proposed for a class of stochastic discrete-time continuous-variable dynamic programming problems subject to non-Gaussian disturbances. Instead of using the expected values of the objective function, the random nature of the decision variables is kept along the process, while Pareto fronts weighted by all quantiles of the objective function are determined. Thus, decision makers are able to choose any quantile they wish. This new idea is carried out by using Monte Carlo simulations embedded in an approximate algorithm proposed for deterministic dynamic programming problems. The new method is tested on instances of the classical inventory control problem. The results obtained attest to the efficiency and efficacy of the algorithm in solving these important stochastic optimization problems.
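
    As a toy illustration of the quantile-based Pareto idea (not the authors' algorithm), the Python sketch below evaluates a base-stock inventory policy by Monte Carlo under a spiky, non-Gaussian demand model, keeps a cost quantile instead of the mean, and retains the non-dominated trade-offs between cost and fill rate; the demand model, cost coefficients and decision grid are all assumed.

        import numpy as np

        rng = np.random.default_rng(5)

        def simulate(order_up_to, n_runs=1000, horizon=52):
            """Monte Carlo evaluation of a base-stock inventory policy under non-Gaussian demand."""
            costs, fill = [], []
            for _ in range(n_runs):
                stock, cost, served, demanded = order_up_to, 0.0, 0, 0
                for _ in range(horizon):
                    demand = rng.poisson(20) + rng.binomial(1, 0.05) * rng.poisson(60)  # rare spikes
                    sold = min(stock, demand)
                    leftover = stock - sold
                    cost += 0.2 * leftover + 1.0 * (order_up_to - leftover)  # holding + replenishment
                    served += sold
                    demanded += demand
                    stock = order_up_to                        # order back up to the base level
                costs.append(cost)
                fill.append(served / demanded)
            return np.percentile(costs, 90), np.mean(fill)     # keep a cost quantile, not the mean

        # Sweep the decision variable and keep the non-dominated (low cost, high fill-rate) points.
        points = [(s, *simulate(s)) for s in range(20, 121, 10)]
        pareto = [p for p in points
                  if not any(q[1] <= p[1] and q[2] >= p[2] and q != p for q in points)]
        for s, c90, f in pareto:
            print(f"order-up-to {s:3d}: 90th-percentile cost {c90:8.1f}, fill rate {f:.3f}")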

  11. Litrani a General Purpose Monte-Carlo Program Simulating Light Propagation In Isotropic or Anisotropic Media

    CERN Document Server

    Gentit, François-Xavier

    2001-01-01

    Litrani is a general purpose Monte-Carlo program simulating light propagation in any type of setup describable by the shapes provided by ROOT. Each shape may be made of a different material. Dielectric constant, absorption length and diffusion length of materials may depend upon wavelength. Dielectric constant and absorption length may be anisotropic. Each face of a volume is either partially or totally in contact with a face of another volume, or covered with some wrapping having defined characteristics of absorption, reflection and diffusion. When in contact with another face of another volume, the possibility exists to have a thin slice of width d and index n between the 2 faces. The program has various sources of light: spontaneous photons, photons coming from an optical fibre, photons generated by the crossing of particles or photons generated by an electromagnetic shower. The time and wavelength spectra of emitted photons may reproduce any scintillation spectrum. As detectors, phototubes, APD, or any ge...

  12. CAD-based Monte Carlo Program for Integrated Simulation of Nuclear System SuperMC

    Science.gov (United States)

    Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin

    2014-06-01

    The Monte Carlo (MC) method has distinct advantages in simulating complicated nuclear systems and is envisioned as a routine method for nuclear design and analysis in the future. High-fidelity simulation with the MC method, coupled with multi-physics simulation, has a significant impact on the safety, economy and sustainability of nuclear systems. However, great challenges to current MC methods and codes prevent their application in real engineering projects. SuperMC is a CAD-based Monte Carlo program for integrated simulation of nuclear systems developed by the FDS Team, China, making use of hybrid MC-deterministic methods and advanced computer technologies. The design aim, architecture and main methodology of SuperMC are presented in this paper. SuperMC2.1, the latest version for neutron, photon and coupled neutron-photon transport calculations, has been developed and validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model. SuperMC is still evolving toward a general and routine tool for nuclear systems.

  13. Applying Monte Carlo Concept and Linear Programming in Modern Portfolio Theory to Obtain Best Weighting Structure

    Directory of Open Access Journals (Sweden)

    Tumpal Sihombing

    2013-01-01

    Full Text Available The world is entering an era of recession in which the trend is bearish and the market is not so favorable. The capital markets in every major country have experienced great losses and people have suffered in their investments. The Jakarta Composite Index (JCI) has shown a great downturn over the past year, and its trend remains bearish. Therefore, rational investors should consider restructuring their portfolios to place a bigger proportion in bonds and cash instead of stocks. Investors can apply the modern portfolio theory of Harry Markowitz to find the optimum asset allocation for their portfolio. Higher return is always associated with higher risk. This study shows investors how to find the lowest-risk portfolio by providing them with several portfolio weighting structures. In this way, investors can compare and make decisions based on risk-return considerations as well as opportunity cost. Keywords: Modern portfolio theory, Monte Carlo, linear programming
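
    A compact Python sketch of the Monte Carlo side of such an approach (the linear-programming refinement is omitted): random long-only weight structures are generated and the lowest-risk structure meeting a target return is kept. The asset statistics and the 6% target are invented for the example.

        import numpy as np

        rng = np.random.default_rng(6)

        # Synthetic annualized statistics for three asset classes (stocks, bonds, cash).
        mu    = np.array([0.10, 0.05, 0.02])
        sigma = np.array([0.25, 0.08, 0.01])
        corr  = np.array([[1.0, 0.2, 0.0],
                          [0.2, 1.0, 0.1],
                          [0.0, 0.1, 1.0]])
        cov = np.outer(sigma, sigma) * corr

        target_return = 0.06
        best = None
        for _ in range(50000):
            w = rng.dirichlet(np.ones(3))        # random long-only weights summing to 1
            ret = w @ mu
            if ret < target_return:
                continue                         # enforce the minimum-return constraint
            risk = np.sqrt(w @ cov @ w)
            if best is None or risk < best[0]:
                best = (risk, ret, w)

        risk, ret, w = best
        print(f"lowest-risk structure meeting {target_return:.0%} return:")
        print(f"  weights (stocks/bonds/cash) = {np.round(w, 3)}, risk = {risk:.2%}, return = {ret:.2%}")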

  14. DOSIS: a Monte Carlo simulation program for dose related studies in mammography.

    Science.gov (United States)

    Delis, H; Spyrou, G; Panayiotakis, G; Tzanakos, G

    2005-06-01

    Dosimetric studies in mammography are addressed by means of a Monte Carlo simulation program. The core of this program (DOSIS: dosimetry simulation studies) is a simulation model developed using FORTRAN 90, enriched with a graphical user interface developed in MS Visual Basic. User defined mammographic technique parameters affecting breast dose are imported to the simulation model and the produced results are provided by means of both absolute (surface dose, exposure at detector plane) and relative quantities (percentage depth dose, isodose curves). The program functionality has been demonstrated in the evaluation of various mammographic examination techniques. Specifically, the influence of tube voltage and filtration on the surface dose and the exposure at detector plane has been studied utilizing a water phantom. Increase of tube voltage from 25 to 30 kVp for a Mo/Mo system resulted in a 42% decrease of the surface dose for a thick breast (6 cm), without changing the exposure at the detector plane. Use of 1.02 mm Al filter for a W anode system operating at 30 kVp resulted in a 19.1% decrease of the surface dose delivered to a 5 cm water equivalent breast. Overall, W/Al systems appear to have improved dosimetric performance, resulting up to a 65% decrease of surface dose compared to Mo/Mo systems, for identical exposures at the detector plane and breast thicknesses.

  15. PCXMC, a Monte Carlo program for calculating patient doses in medical x-ray examinations

    Energy Technology Data Exchange (ETDEWEB)

    Tapiovaara, M.; Siiskonen, T.

    2008-11-15

    PCXMC is a Monte Carlo program for calculating patients' organ doses and effective doses in medical x-ray examinations. The organs and tissues considered in the program are: active bone marrow, adrenals, brain, breasts, colon (upper and lower large intestine), extrathoracic airways, gall bladder, heart, kidneys, liver, lungs, lymph nodes, muscle, oesophagus, oral mucosa, ovaries, pancreas, prostate, salivary glands, skeleton, skin, small intestine, spleen, stomach, testicles, thymus, thyroid, urinary bladder and uterus. The program calculates the effective dose with both the present tissue weighting factors of ICRP Publication 103 (2007) and the old tissue weighting factors of ICRP Publication 60 (1991). The anatomical data are based on the mathematical hermaphrodite phantom models of Cristy and Eckerman (1987), which describe patients of six different ages: new-born, 1, 5, 10, 15-year-old and adult patients. Some changes are made to these phantoms in order to make them more realistic for external irradiation conditions and to enable the calculation of the effective dose according to the new ICRP Publication 103 tissue weighting factors. The phantom sizes are adjustable to mimic patients of an arbitrary weight and height. PCXMC allows a free adjustment of the x-ray beam projection and other examination conditions of projection radiography and fluoroscopy

  16. An automated Monte-Carlo based method for the calculation of cascade summing factors

    Science.gov (United States)

    Jackson, M. J.; Britton, R.; Davies, A. V.; McLarty, J. L.; Goodwin, M.

    2016-10-01

    A versatile method has been developed to calculate cascade summing factors for use in quantitative gamma-spectrometry analysis procedures. The proposed method is based solely on Evaluated Nuclear Structure Data File (ENSDF) nuclear data, an X-ray energy library, and accurate efficiency characterisations for single detector counting geometries. The algorithm, which accounts for γ-γ, γ-X, γ-511 and γ-e- coincidences, can be applied to any design of gamma spectrometer and can be expanded to incorporate any number of nuclides. Efficiency characterisations can be derived from measured or mathematically modelled functions, and can accommodate both point and volumetric source types. The calculated results are shown to be consistent with an industry standard gamma-spectrometry software package. Additional benefits including calculation of cascade summing factors for all gamma and X-ray emissions, not just the major emission lines, are also highlighted.

  17. Monte Carlo based performance assessment of different animal PET architectures using pixellated CZT detectors

    Energy Technology Data Exchange (ETDEWEB)

    Visvikis, D. [INSERM U650, LaTIM, University Hospital Medical School, F-29609 Brest (France)]. E-mail: Visvikis.Dimitris@univ-brest.fr; Lefevre, T. [INSERM U650, LaTIM, University Hospital Medical School, F-29609 Brest (France); Lamare, F. [INSERM U650, LaTIM, University Hospital Medical School, F-29609 Brest (France); Kontaxakis, G. [ETSI Telecomunicacion Universidad Politecnica de Madrid, Ciudad Universitaria, s/n 28040, Madrid (Spain); Santos, A. [ETSI Telecomunicacion Universidad Politecnica de Madrid, Ciudad Universitaria, s/n 28040, Madrid (Spain); Darambara, D. [Department of Physics, School of Engineering and Physical Sciences, University of Surrey, Guildford (United Kingdom)

    2006-12-20

    The majority of present positron emission tomography (PET) animal systems are based on the coupling of high-density scintillators and light detectors. A disadvantage of these detector configurations is the compromise between image resolution, sensitivity and energy resolution. In addition, current combined imaging devices are based on simply placing different apparatus back-to-back and in axial alignment without any significant level of software or hardware integration. The use of semiconductor CdZnTe (CZT) detectors is a promising alternative to scintillators for gamma-ray imaging systems. At the same time, CZT detectors have the potential properties necessary for the construction of a truly integrated imaging device (PET/SPECT/CT). The aim of this study was to assess the performance of different small animal PET scanner architectures based on CZT pixellated detectors and compare their performance with that of state-of-the-art existing PET animal scanners. Different scanner architectures were modelled using GATE (Geant4 Application for Tomographic Emission). Particular scanner design characteristics included an overall cylindrical scanner format of 8 and 24 cm in axial and transaxial field of view, respectively, and a temporal coincidence window of 8 ns. Different individual detector modules were investigated, considering pixel pitches down to 0.625 mm and detector thicknesses from 1 to 5 mm. Modified NEMA NU2-2001 protocols were used in order to simulate performance based on mouse, rat and monkey imaging conditions. These protocols allowed us to directly compare the performance of the proposed geometries with the latest generation of current small animal systems. The results attained demonstrate the potential for higher NECR with CZT-based scanners in comparison to scintillator-based animal systems.

  18. Fast online Monte Carlo-based IMRT planning for the MRI linear accelerator

    NARCIS (Netherlands)

    Bol, G.H.; Hissoiny, S.; Lagendijk, J. J. W.; Raaymakers, B. W.

    2012-01-01

    The MRI accelerator, a combination of a 6 MV linear accelerator with a 1.5 T MRI, facilitates continuous patient anatomy updates regarding translations, rotations and deformations of targets and organs at risk. Accounting for these demands high speed, online intensity-modulated radiotherapy (IMRT) r

  19. Monte-Carlo-based studies of a polarized positron source for International Linear Collider (ILC)

    Science.gov (United States)

    Dollan, Ralph; Laihem, Karim; Schälicke, Andreas

    2006-04-01

    The full exploitation of the physics potential of an International Linear Collider (ILC) requires the development of a polarized positron beam. New concepts of polarized positron sources are based on the development of circularly polarized photon sources. The polarized photons create electron-positron pairs in a thin target and transfer their polarization state to the outgoing leptons. To achieve a high level of positron polarization the understanding of the production mechanisms in the target is crucial. Therefore, a general framework for the simulation of polarized processes with GEANT4 is under development. In this contribution the current status of the project and its application to a study of the positron production process for the ILC is presented.

  20. Monte Carlo based studies of a polarized positron source for international linear collider (ILC).

    OpenAIRE

    Schälicke, A.; Dollan, R.; Laihem, K.

    2006-01-01

    The full exploitation of the physics potential of an International Linear Collider (ILC) requires the development of a polarized positron beam. New concepts of polarized positron sources are based on the development of circularly polarized photon sources. The polarized photons create electron-positron pairs in a thin target and transfer their polarization state to the outgoing leptons. To achieve a high level of positron polarization the understanding of the production mechanisms in the targe...

  1. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  2. Monte Carlo simulations of temperature-programmed and isothermal desorption from single-crystal surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Lombardo, S.J. (California Inst. of Tech., Pasadena, CA (USA). Dept. of Chemical Engineering Lawrence Berkeley Lab., CA (USA))

    1990-08-01

    The kinetics of temperature-programmed and isothermal desorption have been simulated with a Monte Carlo model. Included in the model are the elementary steps of adsorption, surface diffusion, and desorption. Interactions between adsorbates and the metal, as well as interactions between the adsorbates, are taken into account with the Bond-Order-Conservation-Morse-Potential method. The shape, number, and location of the TPD peaks predicted by the simulations are shown to be sensitive to the binding energy, coverage, and coordination of the adsorbates. In addition, lateral interactions between adsorbates are seen to strongly affect the distribution of adsorbates on the surface. Temperature-programmed desorption spectra of a single type of adsorbate have been simulated for the following adsorbate-metal systems: CO on Pd(100); H₂ on Mo(100); and H₂ on Ni(111). The model predictions are in good agreement with experimental observation. TPD spectra have also been simulated for two species coadsorbed on a surface; the model predictions are in qualitative agreement with the experimental results for H₂ coadsorbed with strongly bound atomic species on Mo(100) and Fe(100) surfaces as well as for CO and H₂ coadsorbed on Ni(100) and Rh(100) surfaces. Finally, the desorption kinetics of CO from Pd(100) and Ni(100) in the presence of gas-phase CO have been examined. The effect of pressure is seen to lead to an increase in the rate of desorption relative to the rate observed in the absence of gas-phase CO. This increase arises as a consequence of higher coverages and therefore stronger lateral interactions between the adsorbed CO molecules.
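
    The following greatly simplified Python sketch (a per-site rate model on a square lattice, not the BOC-MP simulation of the record) shows how attractive nearest-neighbour interactions shift a simulated TPD trace: the desorption barrier of each adsorbate is raised by its occupied neighbours, so the peak temperature moves with initial coverage. All energetic parameters, the heating rate and the lattice size are assumed.

        import numpy as np

        rng = np.random.default_rng(7)

        L, theta0 = 60, 0.5                 # lattice size, initial coverage
        E0, w, nu = 1.30, 0.05, 1.0e13      # desorption barrier (eV), pair interaction (eV), prefactor (1/s)
        kB, beta, T = 8.617e-5, 5.0, 300.0  # Boltzmann constant (eV/K), heating rate (K/s), start T (K)
        dt = 0.01                           # time step (s)

        occ = (rng.random((L, L)) < theta0).astype(int)
        temps, rates = [], []
        while T < 600.0 and occ.sum() > 0:
            # Count occupied nearest neighbours; attractive interactions raise the barrier
            nn = (np.roll(occ, 1, 0) + np.roll(occ, -1, 0) +
                  np.roll(occ, 1, 1) + np.roll(occ, -1, 1))
            e_des = E0 + w * nn
            p_des = 1.0 - np.exp(-nu * np.exp(-e_des / (kB * T)) * dt)
            desorb = (rng.random((L, L)) < p_des) & (occ == 1)
            temps.append(T)
            rates.append(desorb.sum() / (L * L * dt))   # desorption rate in ML/s
            occ[desorb] = 0
            T += beta * dt

        peak_T = temps[int(np.argmax(rates))]
        print(f"TPD peak at ~{peak_T:.0f} K for initial coverage {theta0}")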

  3. Creation of a GUI for Zori, a Quantum Monte Carlo program, using Rappture

    Energy Technology Data Exchange (ETDEWEB)

    Olivares-Amaya, R.; Salomon Ferrer, R.; Lester Jr., W.A.; Amador-Bedolla, C.

    2007-12-01

    In their research laboratories, academic institutions produce some of the most advanced software for scientific applications. However, this software is usually developed only for local application in the research laboratory or for method development. In spite of having the latest advances in the particular field of science, such software often lacks adequate documentation and therefore is difficult to use by anyone other than the code developers. As such codes become more complex, so typically do the input files and command statements necessary to operate them. Many programs offer the flexibility of performing calculations based on different methods that have their own set of variables and options to be specified. Moreover, situations can arise in which certain options are incompatible with each other. For this reason, users outside the development group can be unaware of how the program runs in detail. The opportunity can be lost to make the software readily available outside of the laboratory of origin. This is a long-standing problem in scientific programming. Rappture, Rapid Application Infrastructure [1], is a new GUI development kit that enables a developer to build an I/O interface for a specific application. This capability enables users to work only with the generated GUI and avoids the problem of the user needing to learn details of the code. Further, it reduces input errors by explicitly specifying the variables required. Zori, a quantum Monte Carlo (QMC) program, developed by the Lester group at the University of California, Berkeley [2], is one of the few free tools available for this field. Like many scientific computer packages, Zori suffers from the problems described above. Potential users outside the research group have acquired it, but some have found the code difficult to use. Furthermore, new members of the Lester group usually have to take considerable time learning all the options the code has to offer before they can use it successfully. In

  4. A comparison of Monte-Carlo simulation programs with experiment: the effect of a focusing guide on resolution

    Energy Technology Data Exchange (ETDEWEB)

    Wildes, A.R.; Farhi, E.; Anderson, I.; Hoghoj, P.; Brochier, A. [Institut Laue-Langevin, BP 156, 38042 Grenoble Cedex 9 (France); Saroun, J. [Institut Laue-Langevin, BP 156, 38042 Grenoble Cedex 9 (France); Nuclear Physics Institute, 25068 Rez near Prague (Czech Republic)

    2002-07-01

    Two Monte-Carlo neutron instrument simulation programs, RESTRAX and McSTAS, were used to determine the effect of using a converging supermirror guide between a monochromator and sample on the divergence of the incident beam. The results are compared with the test results on implementing such a focusing guide on the IN14 cold neutron spectrometer, Institut Laue-Langevin. The measured non-trivial incident beam divergence in both real and reciprocal space is reproduced by both the programs, giving confidence in the accuracy of the calculations and highlighting the dangers of using such devices on high-resolution instruments. (orig.)

  5. A comparison of Monte-Carlo simulation programs with experiment: the effect of a focusing guide on resolution

    Science.gov (United States)

    Wildes, A. R.; Saroun, J.; Farhi, E.; Anderson, I.; Hoghoj, P.; Brochier, A.

    Two Monte-Carlo neutron instrument simulation programs, RESTRAX and McSTAS, were used to determine the effect of using a converging supermirror guide between a monochromator and sample on the divergence of the incident beam. The results are compared with the test results on implementing such a focusing guide on the IN14 cold neutron spectrometer, Institut Laue-Langevin. The measured non-trivial incident beam divergence in both real and reciprocal space is reproduced by both the programs, giving confidence in the accuracy of the calculations and highlighting the dangers of using such devices on high-resolution instruments.

  6. A comparison of Monte-Carlo simulation programs with experiment: the effect of a focusing guide on resolution

    CERN Document Server

    Wildes, A R; Anderson, I; Hoghoj, P; Brochier, A; Saroun, J

    2002-01-01

    Two Monte-Carlo neutron instrument simulation programs, RESTRAX and McSTAS, were used to determine the effect of using a converging supermirror guide between a monochromator and sample on the divergence of the incident beam. The results are compared with the test results on implementing such a focusing guide on the IN14 cold neutron spectrometer, Institut Laue-Langevin. The measured non-trivial incident beam divergence in both real and reciprocal space is reproduced by both the programs, giving confidence in the accuracy of the calculations and highlighting the dangers of using such devices on high-resolution instruments. (orig.)

  7. Monte Carlo simulations of molecular gas flow: some applications in accelerator vacuum technology using a versatile personal computer program

    Energy Technology Data Exchange (ETDEWEB)

    Pace, A.; Poncet, A. (European Organization for Nuclear Research, Geneva (Switzerland))

    1990-01-01

    The Monte Carlo technique has been used extensively in the past to solve the problem of molecular flow through vacuum pipes or structures with specific boundary conditions for which analytical or even approximate solutions do not exist. Starting from a specific program written in 1975, the idea germinated over the years to produce handy, rather general, problem solving applications capable of running efficiently on modern microcomputers, mainly for ease of transportability and interactivity. Here, the latest version is described. The capabilities and limitations of these tools are presented through a few practical cases of conductance and pumping speed calculations pertinent to accelerator vacuum technology. (author).
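
    As a minimal illustration of the classic molecular-flow problem such programs address (not the CERN program itself), the Python sketch below estimates the transmission probability of a cylindrical tube by following test particles that enter with a cosine-law distribution and are re-emitted diffusely at every wall collision; the tube length-to-radius ratio and particle count are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(8)

        def cosine_direction():
            """Sample a direction with a cosine (Lambertian) distribution about the local normal."""
            phi = rng.uniform(0.0, 2.0 * np.pi)
            sin_t = np.sqrt(rng.uniform())
            cos_t = np.sqrt(1.0 - sin_t ** 2)
            return sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t

        def transmission_probability(length_over_radius, n_particles=50000):
            R, Lz, transmitted = 1.0, length_over_radius, 0
            for _ in range(n_particles):
                # Enter through the inlet (z = 0) at a random point, cosine-distributed into the tube
                r, ang = R * np.sqrt(rng.uniform()), rng.uniform(0.0, 2.0 * np.pi)
                x, y, z = r * np.cos(ang), r * np.sin(ang), 0.0
                dx, dy, dz = cosine_direction()
                while True:
                    a = dx * dx + dy * dy             # distance to the cylindrical wall along (dx, dy)
                    if a == 0.0:
                        t_wall = np.inf
                    else:
                        b = x * dx + y * dy
                        t_wall = (-b + np.sqrt(b * b + a * (R * R - x * x - y * y))) / a
                    z_hit = z + t_wall * dz
                    if dz > 0 and z_hit >= Lz:
                        transmitted += 1              # left through the outlet
                        break
                    if dz < 0 and z_hit <= 0.0:
                        break                         # back-scattered out of the inlet
                    # Diffuse re-emission from the wall about the inward normal
                    x, y, z = x + t_wall * dx, y + t_wall * dy, z_hit
                    nx, ny = -x / R, -y / R
                    u, v, w = cosine_direction()      # w is the component along the inward normal
                    dx = u * (-ny) + w * nx
                    dy = u * nx + w * ny
                    dz = v
            return transmitted / n_particles

        # Clausing's transmission probability for L/R = 4 is roughly 0.36.
        print(f"L/R = 4: transmission probability ~ {transmission_probability(4.0):.3f}")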

  8. [Montérégie Comprehensive Cancer Care Centre: integrating nurse navigators in Montérégie's oncology teams: one aspect of implementing the Cancer Control Program--Part 1].

    Science.gov (United States)

    Plante, Anne; Joannette, Sonia

    2009-01-01

    The oncology patient navigator role was developed to ensure both continuity and consultation in the delivery of care to cancer patients and their families. In Québec, this role is filled by a nurse. This first article in a series of two aims to explain why nurses were selected as patient navigators and to describe how this new role has been integrated in the Montérégie Region. The Québec Cancer Control Program, the definition established for the oncology nurse navigator role and the implementation of an integrated care network based on the Montérégie experience will be discussed.

  9. Geometrical and Monte Carlo projectors in 3D PET reconstruction

    OpenAIRE

    Aguiar, Pablo; Rafecas López, Magdalena; Ortuno, Juan Enrique; Kontaxakis, George; Santos, Andrés; Pavía, Javier; Ros, Domènec

    2010-01-01

    Purpose: In the present work, the authors compare geometrical and Monte Carlo projectors in detail. The geometrical projectors considered were the conventional geometrical Siddon ray-tracer (S-RT) and the orthogonal distance-based ray-tracer (OD-RT), based on computing the orthogonal distance from the center of image voxel to the line-of-response. A comparison of these geometrical projectors was performed using different point spread function (PSF) models. The Monte Carlo-based method under c...

  10. A Monte Carlo program to calculate the exposure rate from airborne radioactive gases inside a nuclear reactor containment building.

    Science.gov (United States)

    Sherbini, S; Tamasanis, D; Sykes, J; Porter, S W

    1986-12-01

    A program was developed to calculate the exposure rate resulting from airborne gases inside a reactor containment building. The calculations were performed at the location of a wall-mounted area radiation monitor. The program uses Monte Carlo techniques and accounts for both the direct and scattered components of the radiation field at the detector. The scattered component was found to contribute about 30% of the total exposure rate at 50 keV and dropped to about 7% at 2000 keV. The results of the calculations were normalized to unit activity per unit volume of air in the containment. This allows the exposure rate readings of the area monitor to be used to estimate the airborne activity in containment in the early phases of an accident. Such estimates, coupled with containment leak rates, provide a method to obtain a release rate for use in offsite dose projection calculations.

  11. A Grand Canonical Monte Carlo simulation program for computing ion distributions around biomolecules in hard sphere solvents

    Energy Technology Data Exchange (ETDEWEB)

    2017-02-24

    The GIBS software program is a Grand Canonical Monte Carlo (GCMC) simulation program (written in C++) that can be used (1) to compute the excess chemical potential of ions and the mean activity coefficients of salts in homogeneous electrolyte solutions and (2) to compute the distribution of ions around fixed macromolecules such as nucleic acids and proteins. The solvent can be represented as neutral hard spheres or as a dielectric continuum. The ions are represented as charged hard spheres that can interact via Coulomb, hard-sphere, or Lennard-Jones potentials. In addition to hard-sphere repulsions, the ions can also be made to interact with the solvent hard spheres via short-ranged attractive square-well potentials.
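
    A stripped-down Python sketch of the grand canonical machinery the record describes, reduced to a pure hard-sphere fluid (the Coulomb and square-well terms of GIBS are omitted): insertion and deletion moves are accepted with the standard GCMC ratios at a fixed activity. The box size, sphere diameter and activity are assumed values.

        import numpy as np

        rng = np.random.default_rng(9)

        L, diameter, activity = 10.0, 1.0, 0.3   # box edge, sphere diameter, activity z (1/volume)
        V = L ** 3
        pos = np.empty((0, 3))

        def overlaps(trial, coords):
            """Hard-sphere overlap test with minimum-image periodic boundaries."""
            if coords.shape[0] == 0:
                return False
            d = coords - trial
            d -= L * np.round(d / L)
            return np.any(np.sum(d * d, axis=1) < diameter ** 2)

        samples = []
        for step in range(100000):
            n = pos.shape[0]
            if rng.random() < 0.5:                                   # attempt insertion
                trial = rng.uniform(0.0, L, 3)
                if not overlaps(trial, pos) and rng.random() < activity * V / (n + 1):
                    pos = np.vstack([pos, trial])
            elif n > 0:                                              # attempt deletion
                i = rng.integers(n)
                if rng.random() < n / (activity * V):
                    pos = np.delete(pos, i, axis=0)
            if step > 20000 and step % 100 == 0:
                samples.append(pos.shape[0])

        print(f"<N> = {np.mean(samples):.1f} particles, number density = {np.mean(samples) / V:.4f}")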

  12. Hybrid Parallel Programming Models for AMR Neutron Monte-Carlo Transport

    Science.gov (United States)

    Dureau, David; Poëtte, Gaël

    2014-06-01

    This paper deals with High Performance Computing (HPC) applied to neutron transport theory on complex geometries, using both an Adaptive Mesh Refinement (AMR) algorithm and a Monte-Carlo (MC) solver. Several parallelism models are presented and analyzed in this context, among them shared-memory and distributed-memory ones such as Domain Replication and Domain Decomposition, together with hybrid strategies. The study is illustrated by weak and strong scalability tests on complex benchmarks on several thousand cores of the petaflopic supercomputer Tera100.
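
    The Python sketch below illustrates the Domain Replication strategy in miniature (it is not the code studied in the record, and it uses processes rather than MPI ranks): every worker owns a full copy of a toy 1-D slab geometry, transports an independent batch of particles, and the partial tallies are reduced at the end. The slab parameters are invented for the example.

        import numpy as np
        from multiprocessing import Pool

        SLAB_THICKNESS, SIGMA_T, SCATTER_PROB = 10.0, 0.5, 0.7   # cm, 1/cm, scattering probability

        def transport_batch(args):
            """Transport one batch of neutrons through a replicated copy of the slab geometry."""
            seed, n_particles = args
            rng = np.random.default_rng(seed)
            leaked = 0
            for _ in range(n_particles):
                x, mu = 0.0, rng.uniform(0.0, 1.0)               # start on the left face, moving right
                while True:
                    x += mu * rng.exponential(1.0 / SIGMA_T)     # fly to the next collision site
                    if x < 0.0:
                        break                                    # escaped back through the left face
                    if x > SLAB_THICKNESS:
                        leaked += 1                              # escaped through the right face (tally)
                        break
                    if rng.random() > SCATTER_PROB:
                        break                                    # absorbed
                    mu = rng.uniform(-1.0, 1.0)                  # isotropic scattering
            return leaked

        if __name__ == "__main__":
            n_workers, per_worker = 4, 100000
            with Pool(n_workers) as pool:
                tallies = pool.map(transport_batch, [(seed, per_worker) for seed in range(n_workers)])
            total = n_workers * per_worker
            print(f"transmission probability = {sum(tallies) / total:.4f} "
                  f"({total} histories on {n_workers} replicas)")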

  13. Kinetic Monte Carlo simulations of temperature programed desorption of O/Rh(111).

    Science.gov (United States)

    Franz, T; Mittendorfer, F

    2010-05-21

    We present a kinetic Monte Carlo simulation based on ab initio calculations for the thermal desorption of oxygen from a Rh(111) surface. Several models have been used for the parametrization of the interaction between the adsorbed atoms. We find that models based on a parametrization with only pairwise interactions have a relatively large error in the predicted adsorption energies. This error can be significantly reduced by including three- and four-body interactions. In addition, we find that a significant number of atoms adsorb in a second adsorption site, the hcp-hollow site, at elevated temperature. Consequently, only a many-body multisite model of the oxygen interactions yields appropriate desorption spectra over the full coverage range, while simpler models only capture the correct shape in the low-coverage case. Our parametrization allows us to predict the adsorption energies of an arbitrary configuration of adsorbates with a mean average error of less than 6 meV/atom.

  14. Monte Carlo scatter correction for SPECT

    Science.gov (United States)

    Liu, Zemei

    The goal of this dissertation is to present a quantitatively accurate and computationally fast scatter correction method that is robust and easily accessible for routine applications in SPECT imaging. A Monte Carlo based scatter estimation method is investigated and developed further. The Monte Carlo simulation program SIMIND (Simulating Medical Imaging Nuclear Detectors) was specifically developed to simulate clinical SPECT systems. The SIMIND scatter estimation (SSE) method was developed further using a multithreading technique to distribute the scatter estimation task across multiple threads running concurrently on multi-core CPUs to accelerate the scatter estimation process. An analytical collimator that ensures less noise was used during SSE. The research includes the addition to SIMIND of charge transport modeling in cadmium zinc telluride (CZT) detectors. Phenomena associated with radiation-induced charge transport, including charge trapping, charge diffusion, charge sharing between neighboring detector pixels, as well as uncertainties in the detection process, are addressed. Experimental measurements and simulation studies were designed for scintillation crystal based SPECT and CZT based SPECT systems to verify and evaluate the expanded SSE method. Jaszczak Deluxe and Anthropomorphic Torso Phantoms (Data Spectrum Corporation, Hillsborough, NC, USA) were used for experimental measurements, and digital versions of the same phantoms were employed during simulations to mimic experimental acquisitions. This study design enabled easy comparison of experimental and simulated data. The results have consistently shown that the SSE method performed similarly to or better than the triple energy window (TEW) and effective scatter source estimation (ESSE) methods in experiments on all the clinical SPECT systems. The SSE method is proven to be a viable method for scatter estimation for routine clinical use.

  15. An Educational MONTE CARLO Simulation/Animation Program for the Cosmic Rays Muons and a Prototype Computer-Driven Hardware Display.

    Science.gov (United States)

    Kalkanis, G.; Sarris, M. M.

    1999-01-01

    Describes an educational software program for the study of and detection methods for the cosmic ray muons passing through several light transparent materials (i.e., water, air, etc.). Simulates muons and Cherenkov photons' paths and interactions and visualizes/animates them on the computer screen using Monte Carlo methods/techniques which employ…

  16. Continuing health education in the Family Health Program in Montes Claros: intentions, realities and possibilities

    Directory of Open Access Journals (Sweden)

    Alcione Gonçalves Ribeiro Vieira

    2010-11-01

    Full Text Available The study addresses Continuing Health Education and consisted of analyzing the in-service training experience of two teams of the Family Health Program in the municipality of Montes Claros, taking the guidelines of the National Policy on Continuing Health Education as a reference. The methodology adopted was qualitative research, and the research subjects were health workers. The guiding axis was the education and work processes and, within these, the participatory processes and the knowledge and experiences related to health practices in the daily routine of the units. The study pointed to the existence of in-service training that integrates the educational process with health practices, but developed in an incipient and non-systematized way. It also found the development of training and qualification courses to update workers, which were characterized by the use of traditional methodologies and by being concentrated, in terms of planning and formulation of themes and contents, at the central level of the Municipal Health Secretariat, disconnected from the concrete practices of the Health Units. The educational process directed at the population was characterized by the prescription of behaviors considered technically more correct. The results obtained showed that the Continuing Education proposal was not fully implemented in the teams under study, although educational actions reflecting its guidelines were present. One of the challenges identified was broadening the participation of social actors in making these units viable and building them as a space for work relations and training.

  17. A fast Monte Carlo program for pulsed-neutron capture-gamma tools

    Energy Technology Data Exchange (ETDEWEB)

    Hovgaard, J.

    1992-02-01

    A fast model for the pulsed-neutron capture-gamma tool has been developed. It is believed that the program produces valid results even though some approximations have been introduced. A full γ-photon transport simulation, which is under preparation, has for instance not yet been included. Simulations performed so far have shown that the model fully lives up to expectations with respect to computing time and accuracy. (au).

  18. A fast Monte Carlo program for pulsed-neutron capture-gamma tools

    Energy Technology Data Exchange (ETDEWEB)

    Hovgaard, J.

    1992-02-01

    A fast model for the pulsed-neutron capture-gamma tool has been developed. It is believed that the program produces valid results even though some approximations have been introduced. A full γ-photon transport simulation, which is under preparation, has for instance not yet been included. Simulations performed so far have shown that the model fully lives up to expectations with respect to computing time and accuracy. (au).

  19. Adaptation of a Fortran-Based Monte-Carlo Microscopic Black Hole Simulation Program to C++ Based Root

    Science.gov (United States)

    Jenkins, C. M.; Godang, R.; Cavaglia, M.; Cremaldi, L.; Summers, D.

    2008-10-01

    The 14 TeV center-of-mass proton-proton collisions at the LHC open the possibility of new physics, including the possible formation of microscopic black holes. A Fortran-based Monte Carlo event generator called CATFISH (Collider grAviTational FIeld Simulator for black Holes) has been developed at the University of Mississippi to study signatures of microscopic black hole production (http://www.phy.olemiss.edu/GR/catfish). This black hole event generator includes many of the currently accepted theoretical results for microscopic black hole formation. High energy physics data analysis is shifting from Fortran to C++ as the CERN data analysis packages HBOOK and PAW are no longer supported; the C++-based ROOT framework is replacing them. Work done at the University of South Alabama has resulted in a successful inclusion of CATFISH into ROOT. The methods used to interface the Fortran-based CATFISH with the C++-based ROOT will be presented. Benchmark histograms will be presented demonstrating the conversion. Preliminary results will be presented for selecting black hole candidate events in 14 TeV center-of-mass proton-proton collisions.

  20. Monte Carlo-Based Dose Calculation in Postprostatectomy Image-Guided Intensity Modulated Radiotherapy: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Ashley Rankine

    2015-01-01

    Full Text Available Step-and-shoot (S&S) intensity-modulated radiotherapy (IMRT) using the XiO treatment planning system (TPS) has been routinely used for patients receiving postprostatectomy radiotherapy (PPRT). After installing the Monaco TPS, a pilot study was undertaken with five patients to compare XiO with Monaco (V2.03) for PPRT with respect to plan quality for S&S as well as volumetric-modulated arc therapy (VMAT). Monaco S&S showed higher mean clinical target volume (CTV) coverage (99.85%) than both XiO S&S (97.98%, P = 0.04) and Monaco VMAT (99.44%, P = 0.02). Rectal V60Gy volumes were lower for Monaco S&S compared to XiO (46.36% versus 58.06%, P = 0.001) and Monaco VMAT (46.36% versus 54.66%, P = 0.02). Rectal V60Gy volume was lowest for Monaco S&S and superior to XiO (mean 19.89% versus 31.25%, P = 0.02). Rectal V60Gy volumes were lower for Monaco VMAT compared to XiO (21.09% versus 31.25%, P = 0.02). Other organ-at-risk (OAR) parameters were comparable between TPSs. Compared to XiO S&S, Monaco S&S plans had fewer segments (78.6 versus 116.8 segments, P = 0.02), lower total monitor units (MU) (677.6 MU versus 770.7 MU, P = 0.01), and shorter beam-on times (5.7 min versus 7.6 min, P = 0.03). This pilot study suggests that Monaco S&S improves CTV coverage, OAR doses, and planning and treatment times for PPRT.

  1. Measurement of Light Collection of CMS PbWO4 Crystals Comparison with the Cristal Monte Carlo Simulation Program and Further Evaluation

    CERN Document Server

    Drobychev, Gleb; Peigneux, Jean-Pierre; Rivoalan, P

    1998-01-01

    The signal for light collection, limited by the effective area of the present APD detectors, has been measured using a photomultiplier and a Co-60 source. A comparison of the methods and experimental results for simple geometrical situations has been made with the results of the CRISTAL Monte Carlo program, which is also used to evaluate more complicated geometries, including the evaluation of a wavelength shifter used with an APD as a photodetector.

  2. Spike Inference from Calcium Imaging using Sequential Monte Carlo Methods

    OpenAIRE

    NeuroData; Paninski, L

    2015-01-01

    Vogelstein JT, Paninski L. Spike Inference from Calcium Imaging using Sequential Monte Carlo Methods. Statistical and Applied Mathematical Sciences Institute (SAMSI) Program on Sequential Monte Carlo Methods, 2008

  3. Application of adjoint Monte Carlo to accelerate simulations of mono-directional beams in treatment planning for Boron Neutron Capture Therapy

    NARCIS (Netherlands)

    Nievaart, V.A.; Legrady, D.; Moss, R.L.; Kloosterman, J.L.; Van der Hagen, T.H.; Van Dam, H.

    2007-01-01

    This paper deals with the application of the adjoint transport theory in order to optimize Monte Carlo based radiotherapy treatment planning. The technique is applied to Boron Neutron Capture Therapy where most often mixed beams of neutrons and gammas are involved. In normal forward Monte Carlo simu…

  4. Monte Carlo integration on GPU

    OpenAIRE

    Kanzaki, J.

    2010-01-01

    We use a graphics processing unit (GPU) for fast computation of Monte Carlo integrations. Two widely used Monte Carlo integration programs, VEGAS and BASES, are parallelized on the GPU. Using $W^{+}$ plus multi-gluon production processes at the LHC, we test integrated cross sections and execution times for programs in FORTRAN and C on the CPU and for those on the GPU. The integrated results agree with each other within statistical errors. Programs on the GPU run about 50 times faster than those in C...
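
    For orientation, the estimator that VEGAS and BASES refine with adaptive importance sampling is the plain Monte Carlo average sketched below; this CPU-only Python sketch uses an arbitrary toy integrand and is unrelated to the authors' GPU code.

```python
import numpy as np

def mc_integrate(f, dim, n_samples, seed=0):
    """Plain Monte Carlo estimate of the integral of f over the unit hypercube [0,1]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_samples, dim))               # uniform points in the hypercube
    values = f(x)
    estimate = values.mean()
    std_error = values.std(ddof=1) / np.sqrt(n_samples)
    return estimate, std_error

if __name__ == "__main__":
    # Toy integrand standing in for a squared matrix element.
    integrand = lambda x: np.exp(-np.sum(x**2, axis=1))
    est, err = mc_integrate(integrand, dim=4, n_samples=200_000)
    print(f"integral over [0,1]^4 ≈ {est:.5f} ± {err:.5f}")
```

    Importance-sampling codes such as VEGAS reduce the reported standard error by concentrating the sample points where the integrand is large; the estimator and error formula stay the same.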

  5. The Virtual Monte Carlo

    CERN Document Server

    Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas

    2003-01-01

    The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.

  6. EGS code system: computer programs for the Monte Carlo simulation of electromagnetic cascade showers. Version 3. [EGS, PEGS, TESTSR, in MORTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Ford, R.L.; Nelson, W.R.

    1978-06-01

    A code to simulate almost any electron-photon transport problem conceivable is described. The report begins with a lengthy historical introduction and a description of the shower generation process. Then the detailed physics of the shower processes and the methods used to simulate them are presented. Ideas of sampling theory, transport techniques, particle interactions in general, and programming details are discussed. Next, EGS calculations are compared with various experiments and other Monte Carlo results. The remainder of the report consists of user manuals for the EGS, PEGS, and TESTSR codes; options, input specifications, and typical output are included. 38 figures, 12 tables. (RWR)

  7. Monte Carlo Treatment Planning for Molecular Targeted Radiotherapy within the MINERVA System

    Energy Technology Data Exchange (ETDEWEB)

    Lehmann, J; Siantar, C H; Wessol, D E; Wemple, C A; Nigg, D; Cogliati, J; Daly, T; Descalle, M; Flickinger, T; Pletcher, D; DeNardo, G

    2004-09-22

    can only be properly accounted for by image-based, patient-specific treatment planning as has been common in external beam radiation therapy for many years. MINERVA offers 3D Monte Carlo based MTR treatment planning as its first integrated operational capability. The new MINERVA system will ultimately incorporate capabilities for a comprehensive list of radiation therapies. In progress are modules for external beam photon-electron therapy and Boron Neutron Capture Therapy (BNCT). Brachytherapy and Protontherapy are planned. Through the open Application Programming Interface (API) other groups can add their own modules and share them with the community.

  8. Monte Carlo treatment planning for molecular targeted radiotherapy within the MINERVA system

    Energy Technology Data Exchange (ETDEWEB)

    Lehmann, Joerg [University of California, Lawrence Livermore National Laboratory, 7000 East Ave, Livermore, CA 94550 (United States); Siantar, Christine Hartmann [University of California, Lawrence Livermore National Laboratory, 7000 East Ave, Livermore, CA 94550 (United States); Wessol, Daniel E [Idaho National Engineering and Environmental Laboratory, PO Box 1625, Idaho Falls, ID 83415-3885 (United States); Wemple, Charles A [Idaho National Engineering and Environmental Laboratory, PO Box 1625, Idaho Falls, ID 83415-3885 (United States); Nigg, David [Idaho National Engineering and Environmental Laboratory, PO Box 1625, Idaho Falls, ID 83415-3885 (United States); Cogliati, Josh [Department of Computer Science, Montana State University, Bozeman, MT 59717 (United States); Daly, Tom [University of California, Lawrence Livermore National Laboratory, 7000 East Ave, Livermore, CA 94550 (United States); Descalle, Marie-Anne [University of California, Lawrence Livermore National Laboratory, 7000 East Ave, Livermore, CA 94550 (United States); Flickinger, Terry [University of California, Lawrence Livermore National Laboratory, 7000 East Ave, Livermore, CA 94550 (United States); Pletcher, David [University of California, Lawrence Livermore National Laboratory, 7000 East Ave, Livermore, CA 94550 (United States); DeNardo, Gerald [University of California Davis, School of Medicine, Sacramento, CA 95817 (United States)

    2005-03-07

    drug pharmacokinetics in MTR can only be properly accounted for by image-based, patient-specific treatment planning, as has been common in external beam radiation therapy for many years. MINERVA offers 3D Monte Carlo-based MTR treatment planning as its first integrated operational capability. The new MINERVA system will ultimately incorporate capabilities for a comprehensive list of radiation therapies. In progress are modules for external beam photon-electron therapy and boron neutron capture therapy (BNCT). Brachytherapy and proton therapy are planned. Through the open application programming interface (API), other groups can add their own modules and share them with the community.

  9. The Monte Carlo Program KoralW version 1.51 and The Concurrent Monte Carlo KoralW\\&YFSWW3 with All Background Graphs and First Order Corrections to W-Pair Production

    CERN Document Server

    Jadach, Stanislaw; Skrzypek, Maciej; Ward, B F L; Was, Zbigniew

    2001-01-01

    The version 1.51 of the Monte Carlo (MC) program KoralW for all $e^+e^-\\to f_1\\bar f_2 f_3\\bar f_4$ processes is presented. The most important change since the previous version 1.42 is the facility for writing MC events on the mass storage device and re-processing them later on. In the re-processing one may modify parameters of the Standard Model in order to fit them to experimental data. Another important new feature is a possibility of including complete ${\\cal O}(\\alpha)$ corrections to double-resonant W-pair component-processes in addition to all background (non-WW) graphs. The inclusion is done with the help of the YFSWW3 MC event generator for fully exclusive differential distributions (event-per-event). Technically, it is done in such a way that YFSWW3 runs concurrently with KoralW as a separate slave process, reading momenta of the MC event generated by KoralW and returning the correction weight to KoralW. KoralW introduces the ${\\cal O}(\\alpha)$ correction using this weight, and finishes processing t...

  10. RCP01 - A Monte Carlo program for solving neutron and photon transport problems in three dimensional geometry with detailed energy description and depletion capability

    Energy Technology Data Exchange (ETDEWEB)

    Ondis, L.A., II; Tyburski, L.J.; Moskowitz, B.S.

    2000-03-01

    The RCP01 Monte Carlo program is used to analyze many geometries of interest in nuclear design and analysis of light water moderated reactors such as the core in its pressure vessel with complex piping arrangement, fuel storage arrays, shipping and container arrangements, and neutron detector configurations. Written in FORTRAN and in use on a variety of computers, it is capable of estimating steady state neutron or photon reaction rates and neutron multiplication factors. The energy range covered in neutron calculations is that relevant to the fission process and subsequent slowing-down and thermalization, i.e., 20 MeV to 0 eV. The same energy range is covered for photon calculations.

  11. Monts Jura Jazz Festival

    CERN Multimedia

    2012-01-01

    The 5th edition of the "Monts Jura Jazz Festival" will take place at the Esplanade du Lac in Divonne-les-Bains, France, on September 21 and 22. This festival, organized by the CERN Jazz Club and supported by the CERN Staff Association, is becoming a major musical event in the Geneva region. International jazz artists like Didier Lockwood and David Reinhardt are part of this year's outstanding program. The full program and e-tickets are available on the festival website. Don't miss this great festival!

  12. Parallel implementation of inverse adding-doubling and Monte Carlo multi-layered programs for high performance computing systems with shared and distributed memory

    Science.gov (United States)

    Chugunov, Svyatoslav; Li, Changying

    2015-09-01

    Parallel implementations of two numerical tools popular in optical studies of biological materials, the Inverse Adding-Doubling (IAD) program and the Monte Carlo Multi-Layered (MCML) program, were developed and tested in this study. The implementation was based on the Message Passing Interface (MPI) and standard C. Parallel versions of the IAD and MCML programs were compared to their sequential counterparts in validation and performance tests. Additionally, the portability of the programs was tested using a local high performance computing (HPC) cluster, the Penguin-On-Demand HPC cluster, and an Amazon EC2 cluster. Parallel IAD was tested with up to 150 parallel cores using 1223 input datasets. It demonstrated linear scalability, and the speedup was proportional to the number of parallel cores (up to 150x). Parallel MCML was tested with up to 1001 parallel cores using problem sizes of 10^4-10^9 photon packets. It demonstrated classical performance curves featuring communication overhead and a performance saturation point. An optimal performance curve was derived for parallel MCML as a function of problem size. The typical speedup achieved for parallel MCML (up to 326x) demonstrated a linear increase with problem size. The precision of the MCML results was estimated in a series of tests: a problem size of 10^6 photon packets was found optimal for calculations of total optical response and 10^8 photon packets for spatially-resolved results. The presented parallel versions of the MCML and IAD programs are portable on multiple computing platforms. The parallel programs could significantly speed up the simulation for scientists and be utilized to their full potential in computing systems that are readily available without additional costs.
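
    The work-splitting pattern described here amounts to distributing independent photon packets across processes and summing their partial tallies. The sketch below imitates that pattern with Python's multiprocessing pool instead of MPI/C, and traces packets through a deliberately simplified forward-only slab, so the optical properties and the tally are illustrative only and do not reproduce the MCML model.

```python
import numpy as np
from multiprocessing import Pool

def run_packets(args):
    """Trace a batch of photon packets through a toy forward-only slab and tally absorption."""
    n_packets, seed = args
    rng = np.random.default_rng(seed)
    mu_a, mu_s, thickness = 0.1, 10.0, 0.2        # illustrative optical properties (1/mm, mm)
    mu_t = mu_a + mu_s
    absorbed = 0.0
    for _ in range(n_packets):
        depth, weight = 0.0, 1.0
        while weight > 1e-4:
            depth += rng.exponential(1.0 / mu_t)  # free path to the next interaction
            if depth >= thickness:
                break                             # packet leaves the slab
            absorbed += weight * mu_a / mu_t      # deposit part of the packet weight
            weight *= mu_s / mu_t                 # implicit-capture weight reduction
    return absorbed

if __name__ == "__main__":
    n_total, n_workers = 100_000, 4
    batches = [(n_total // n_workers, seed) for seed in range(n_workers)]
    with Pool(n_workers) as pool:                 # each batch runs in its own process
        total_absorbed = sum(pool.map(run_packets, batches))
    print("absorbed fraction ≈", total_absorbed / n_total)
```

    Because the packets are statistically independent, the per-process seeds only need to differ; combining the partial tallies by summation is exact, which is what makes this kind of Monte Carlo embarrassingly parallel.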

  13. Evaluation of the material assignment method used by a Monte Carlo treatment planning system.

    Science.gov (United States)

    Isambert, A; Brualla, L; Lefkopoulos, D

    2009-12-01

    An evaluation of the conversion process from Hounsfield units (HU) to material composition in computerised tomography (CT) images, employed by the Monte Carlo based treatment planning system ISOgray (DOSIsoft), is presented. A boundary in the HU for the material conversion between "air" and "lung" materials was determined based on a study using 22 patients. The dosimetric consequence of the new boundary was quantitatively evaluated for a lung patient plan.

  14. Monts Jura Jazz Festival

    CERN Multimedia

    Jazz Club

    2012-01-01

    The 5th edition of the "Monts Jura Jazz Festival" will take place on September 21st and 22nd 2012 at the Esplanade du Lac in Divonne-les-Bains. This festival is organized by the "CERN Jazz Club" with the support of the "CERN Staff Association". It is a major musical event in the French/Swiss area and proposes a world-class program with jazz artists such as D. Lockwood and D. Reinhardt. More information on http://www.jurajazz.com.

  15. Simulation of a Quality Control Jaszczak Phantom with SIMIND Monte Carlo and Adding the Phantom as an Accessory to the Program

    Directory of Open Access Journals (Sweden)

    Jalil Pirayesh Islamian

    2012-03-01

    Full Text Available Introduction: Quality control is an important procedure in nuclear medicine imaging. A Jaszczak SPECT phantom provides consistent performance information for any SPECT or PET system. This article describes the simulation of a Jaszczak phantom and the creation of an executable phantom file for comparative assessment of SPECT cameras using the SIMIND Monte Carlo simulation program, which is well established for SPECT. Materials and Methods: The simulation was based on the Deluxe model of the Jaszczak phantom with its defined geometry. Quality control tests were provided together with an initial imaging example and suggested use for the assessment of parameters such as spatial resolution, limits of lesion detection, and contrast, in comparison with a Siemens E.Cam SPECT system. Results: The phantom simulation was verified by matching tomographic spatial resolution, image contrast, and uniformity against the experimental SPECT of the phantom, using filtered backprojection reconstructed images of the spheres and rods. The measured contrasts of the rods were 0.774, 0.627, 0.575, 0.372, 0.191, and 0.132 for rod diameters of 31.8, 25.4, 19.1, 15.9, 12.7, and 9.5 mm, respectively. The calculated contrasts of the simulated rods were 0.661, 0.527, 0.487, 0.400, 0.23, and 0.2 for cold rods and 0.92, 0.91, 0.88, 0.81, 0.76, and 0.56 for hot rods. The reconstructed tomographic spatial resolution of both the experimental and simulated SPECT of the phantom was about 9.5 mm. An executable phantom file and an input phantom file were created for the SIMIND Monte Carlo program. Conclusion: This phantom may be used for simulated SPECT systems and would be ideal for verification of simulated systems against real ones by comparing the results of quality control and image evaluation. It is also envisaged that this phantom could be used with a range of radionuclide doses in simulation situations such as cold, hot, and background uptakes for the assessment of detection

  16. The Specific Bias in Dynamic Monte Carlo Simulations of Nuclear Reactor

    Science.gov (United States)

    Yamamoto, Toshihisa; Endo, Hiroshi; Ishizu, Tomoko; Tatewaki, Isao

    2014-06-01

    During the development of a Monte-Carlo-based dynamic code system, we have encountered two major Monte-Carlo-specific problems. One is the breakdown due to "false super-criticality", which is caused by an accidentally large eigenvalue arising from statistical error even though the reactor is in fact not supercritical. The other problem, which is the main topic of this paper, is that the statistical error in the power level obtained from the reactivity calculated with a Monte Carlo code is not symmetric about its mean but always positively biased. This means that the bias accumulates as the calculation proceeds and consequently results in over-estimation of the final power level. It should be noted that the bias will not be eliminated by refining the time step as long as the variance is not zero. A preliminary investigation of this matter using the one-group-precursor point kinetic equations was made, and it was concluded that the bias in the power level is approximately proportional to the product of the variance of the Monte Carlo calculation and the elapsed time. This conclusion was verified with some numerical experiments. This outcome is important in quantifying the required precision of Monte-Carlo-based reactivity calculations.
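
    The positive bias follows from Jensen's inequality: a symmetric statistical error on the Monte Carlo reactivity, once fed through the exponential power response, averages to more than the true power, and the excess grows with the accumulated reactivity variance, i.e. with elapsed time. The toy numbers below are illustrative only and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_steps, dt = 50_000, 100, 0.1      # illustrative values only
sigma = 0.5                                    # std. dev. of the Monte Carlo reactivity error

# The true reactivity is zero, so the true power stays at 1.0 in every trial.
# Each trial advances log(P) with an independent, zero-mean reactivity error per step.
log_power = np.zeros(n_trials)
for _ in range(n_steps):
    log_power += rng.normal(0.0, sigma, n_trials) * dt

mean_power = np.exp(log_power).mean()
# For zero-mean Gaussian noise, E[exp(X)] = exp(Var[X]/2), so the over-estimation
# grows with the accumulated reactivity variance, i.e. with elapsed time.
analytic = np.exp(0.5 * sigma**2 * dt**2 * n_steps)
print(f"simulated mean power {mean_power:.4f} vs analytic {analytic:.4f} (true value 1.0)")
```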

  17. RSW-MCFP: A Resource-Oriented Solid Waste Management System for a Mixed Rural-Urban Area through Monte Carlo Simulation-Based Fuzzy Programming

    Directory of Open Access Journals (Sweden)

    P. Li

    2013-01-01

    Full Text Available The growth of the global population and economy continually increases waste volumes and consequently creates challenges in handling and disposing of solid wastes. This becomes more challenging in mixed rural-urban areas (i.e., areas of mixed land use for rural and urban purposes), where both agricultural waste (e.g., manure) and municipal solid waste are generated. The efficiency and confidence of decisions in current management practices rely significantly on accurate information and subjective judgments, which are usually compromised by uncertainties. This study proposes a resource-oriented solid waste management system for mixed rural-urban areas. The system is featured by a novel Monte Carlo simulation-based fuzzy programming approach. The developed system was tested on a real-world case with consideration of various resource-oriented treatment technologies and the associated uncertainties. The modeling results indicated that the community-based bio-coal and household-based CH4 facilities were necessary and would become predominant in the waste management system. The 95% confidence intervals of the waste loadings to the CH4 and bio-coal facilities were [387, 450] and [178, 215] tonne/day (mixed flow), respectively. In general, the developed system has high capability in supporting solid waste management for mixed rural-urban areas in a cost-efficient and sustainable manner under uncertainty.

  18. Monte Carlo uncertainty analyses for integral beryllium experiments

    CERN Document Server

    Fischer, U; Tsige-Tamirat, H

    2000-01-01

    The novel Monte Carlo technique for calculating point detector sensitivities has been applied to two representative beryllium transmission experiments with the objective of investigating the sensitivity of important responses, such as the neutron multiplication, and of assessing the related uncertainties due to the underlying cross-section data uncertainties. As an important result, it has been revealed that the neutron multiplication power of beryllium can be predicted with good accuracy using state-of-the-art nuclear data evaluations. Severe discrepancies do exist for the spectral neutron flux distribution, which would translate into significant uncertainties in the calculated neutron spectra and in the nuclear blanket performance in blanket design calculations. With regard to this, it is suggested to re-analyse the secondary energy and angle distribution data of beryllium by means of Monte Carlo based sensitivity and uncertainty calculations. Related code development work is underway.

  19. Visibility assessment : Monte Carlo characterization of temporal variability.

    Energy Technology Data Exchange (ETDEWEB)

    Laulainen, N.; Shannon, J.; Trexler, E. C., Jr.

    1997-12-12

    Current techniques for assessing the benefits of certain anthropogenic emission reductions are largely influenced by limitations in emissions data and atmospheric modeling capability and by the highly variant nature of meteorology. These data and modeling limitations are likely to continue for the foreseeable future, during which time important strategic decisions need to be made. Statistical atmospheric quality data and apportionment techniques are used in Monte-Carlo models to offset serious shortfalls in emissions, entrainment, topography, statistical meteorology data and atmospheric modeling. This paper describes the evolution of Department of Energy (DOE) Monte-Carlo based assessment models and the development of statistical inputs. A companion paper describes techniques which are used to develop the apportionment factors used in the assessment models.

  20. RCP01: a Monte Carlo program for solving neutron and photon transport problems in three-dimensional geometry with detailed energy description (LWBR development program). [For CDC-6600 and -7600, in FORTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Candelore, N R; Gast, R C; Ondis, II, L A

    1978-08-01

    The RCP01 Monte Carlo program for the CDC-7600 and CDC-6600 performs fixed source or eigenfunction neutron reaction rate calculations, or photon reaction rate calculations, for complex geometries. The photon calculations may be linked to the neutron reaction rate calculations. For neutron calculations, the full energy range is treated as required for neutron birth by the fission process and the subsequent neutron slowing down and thermalization, i.e., 10 MeV to 0 eV; for photon calculations the same energy range is treated. The detailed cross sections required for the neutron or photon collision processes are provided by RCPL1. This report provides details of the various types of neutron and photon starts and collisions, the common geometry tracking, and the input required. 37 figures, 1 table.

  1. Application of the subgroup method to multigroup Monte Carlo calculations

    Science.gov (United States)

    Martin, Nicolas

    effects of the scattering reaction consistent with the subgroup method. In this study, we generalize the Discrete Angle Technique, already proposed for homogeneous, multigroup cross sections, to isotopic cross sections on the form of probability tables. In this technique, the angular density is discretized into probability tables. Similarly to the cross-section case, a moment approach is used to compute the probability tables for the scattering cosine. (4) The introduction of a leakage model based on the B1 fundamental mode approximation. Unlike deterministic lattice packages, most Monte Carlo-based lattice physics codes do not include leakage models. However the generation of homogenized and condensed group constants (cross sections, diffusion coefficients) require the critical flux. This project has involved the development of a program into the DRAGON framework, written in Fortran 2003 and wrapped with a driver in C, the GANLIB 5. Choosing Fortran 2003 has permitted the use of some modern features, such as the definition of objects and methods, data encapsulation and polymorphism. The validation of the proposed code has been performed by comparison with other numerical methods: (1) The continuous-energy Monte Carlo method of the SERPENT code. (2) The Collision Probability (CP) method and the discrete ordinates (SN) method of the DRAGON lattice code. (3) The multigroup Monte Carlo code MORET, coupled with the DRAGON code. Benchmarks used in this work are representative of some industrial configurations encountered in reactor and criticality-safety calculations: (1)Pressurized Water Reactors (PWR) cells and assemblies. (2) Canada-Deuterium Uranium Reactors (CANDU-6) clusters. (3) Critical experiments from the ICSBEP handbook (International Criticality Safety Benchmark Evaluation Program).

  2. Proton Upset Monte Carlo Simulation

    Science.gov (United States)

    O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.

    2009-01-01

    The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment, based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (>200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled, and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.

  3. Improved Monte Carlo Renormalization Group Method

    Science.gov (United States)

    Gupta, R.; Wilson, K. G.; Umrigar, C.

    1985-01-01

    An extensive program to analyze critical systems using an Improved Monte Carlo Renormalization Group Method (IMCRG) being undertaken at LANL and Cornell is described. Here we first briefly review the method and then list some of the topics being investigated.

  4. Monte Vista NWR Water Use Report- 1964

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes water use at Monte Vista NWR for 1964. The document includes summaries of 1964 water use, 1965 water program recommendations, and proposed...

  5. Extended Ensemble Monte Carlo

    OpenAIRE

    Iba, Yukito

    2000-01-01

    "Extended Ensemble Monte Carlo" is a generic term that indicates a set of algorithms which are now popular in a variety of fields in physics and statistical information processing. Exchange Monte Carlo (Metropolis-Coupled Chain, Parallel Tempering), Simulated Tempering (Expanded Ensemble Monte Carlo), and Multicanonical Monte Carlo (Adaptive Umbrella Sampling) are typical members of this family. Here we give a cross-disciplinary survey of these algorithms with special emphasis on the great f...
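
    As a concrete illustration of one family member, exchange Monte Carlo (parallel tempering), the sketch below runs Metropolis chains at several temperatures on a toy double-well energy and occasionally swaps neighbouring replicas; the temperature ladder, step size and energy function are arbitrary choices for illustration.

```python
import numpy as np

def energy(x):
    return 5.0 * (x**2 - 1.0)**2              # toy double-well potential

rng = np.random.default_rng(2)
temps = np.array([0.1, 0.3, 1.0, 3.0])        # temperature ladder (illustrative)
x = rng.normal(size=temps.size)               # one walker per temperature
cold_samples = []

for sweep in range(20_000):
    # Metropolis update within each replica.
    prop = x + rng.normal(0.0, 0.5, size=x.size)
    log_acc = np.minimum(0.0, (energy(x) - energy(prop)) / temps)
    x = np.where(rng.random(x.size) < np.exp(log_acc), prop, x)
    # Attempt a state swap between a random pair of neighbouring temperatures.
    i = rng.integers(0, temps.size - 1)
    delta = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (energy(x[i]) - energy(x[i + 1]))
    if rng.random() < np.exp(min(0.0, delta)):
        x[i], x[i + 1] = x[i + 1], x[i]
    cold_samples.append(x[0])                  # record the coldest replica

cold = np.array(cold_samples[5_000:])
print("coldest replica: mean |x| =", np.abs(cold).mean())   # close to 1 for this potential
```

    The hot replicas cross the barrier between the two wells easily, and the swap moves pass those crossings down to the cold replica, which a single low-temperature Metropolis chain would rarely achieve on its own.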

  6. Monte Carlo-based Spencer-Attix and Bragg-Gray tissue-to-air stopping power ratios for ISO beta sources.

    Science.gov (United States)

    Selvam, T Palani; Vandana, S; Bakshi, A K; Babu, D A R

    2016-02-01

    Spencer-Attix (SA) and Bragg-Gray (BG) mass-collision-stopping-power ratios of tissue to air are calculated using a modified version of the EGSnrc-based SPRRZnrc user code for the International Organization for Standardization (ISO) beta sources 147Pm, 85Kr, 90Sr/90Y and 106Ru/106Rh. The ratios are calculated at 5 and 70 µm depths along the central axis of the unit-density ICRU four-element tissue phantom as a function of the air-cavity length of the extrapolation chamber, l = 0.025-0.25 cm. The study shows that the BG values are independent of l and agree well with the ISO-reported values for the above sources. The overall variation in the SA values is about 0.3% for all the investigated sources when l is varied from 0.025 to 0.25 cm. As the beta energy increases, the SA stopping-power ratio for a given cavity length decreases; for example, the SA values for 147Pm are higher by about 2% compared with the corresponding values for the 106Ru/106Rh source. SA stopping-power ratios are higher than the BG stopping-power ratios, and the degree of variation depends on the type of source and the value of l; for example, the difference is up to 0.7% at l = 0.025 cm for the 90Sr/90Y source.

  7. Reduction of radiation risks in patients undergoing some X-ray examinations by using optimal projections: A Monte Carlo program-based mathematical calculation

    Directory of Open Access Journals (Sweden)

    A Chaparian

    2014-01-01

    Full Text Available The objectives of this paper were the calculation and comparison of the effective doses, the risks of exposure-induced cancer, and dose reduction in the gonads for male and female patients in different projections of some X-ray examinations. Radiographies of the lumbar spine [in the eight projections of anteroposterior (AP), posteroanterior (PA), right lateral (RLAT), left lateral (LLAT), right anterior-posterior oblique (RAO), left anterior-posterior oblique (LAO), right posterior-anterior oblique (RPO), and left posterior-anterior oblique (LPO)], the abdomen (in the two projections of AP and PA), and the pelvis (in the two projections of AP and PA) were investigated. A solid-state dosimeter was used for measuring the entrance skin exposure. A Monte Carlo program was used for the calculation of the effective doses, the risks of radiation-induced cancer, and the doses to the gonads for the different projections. The results of this study showed that the PA projection in abdomen, lumbar spine, and pelvis radiographies caused 50%-57% lower effective doses than the AP projection and a 50%-60% reduction in radiation risks. Use of the LAO projection in lumbar spine X-ray examinations caused a 53% lower effective dose than the RPO projection and 56% and 63% reductions in radiation risk for males and females, respectively, and the RAO projection caused a 28% lower effective dose than the LPO projection and 52% and 39% reductions in radiation risk for males and females, respectively. Regarding dose reduction in the gonads, using the PA position rather than AP in radiographies of the abdomen, lumbar spine, and pelvis can reduce ovarian doses in women by 38%, 31%, and 25%, respectively, and testicular doses in males by 76%, 86%, and 94%, respectively. Also, for oblique projections of lumbar spine X-ray examinations, employing LAO rather than RPO and RAO rather than LPO demonstrated 22% and 13% reductions in ovarian doses and 66% and 54% reductions in the

  8. Fast quantum Monte Carlo on a GPU

    CERN Document Server

    Lutsyshyn, Y

    2013-01-01

    We present a scheme for the parallelization of quantum Monte Carlo on graphical processing units, focusing on bosonic systems and variational Monte Carlo. We use asynchronous execution schemes with shared memory persistence and obtain an excellent acceleration: compared with single-core execution, the GPU-accelerated code runs over 100 times faster. The CUDA code is provided along with the package that is necessary to execute variational Monte Carlo for a system representing liquid helium-4. The program was benchmarked on several models of Nvidia GPU, including the Fermi GTX560 and M2090, and the latest Kepler-architecture K20 GPU. Kepler-specific optimization is discussed.

  9. Dose optimization based on linear programming implemented in a system for treatment planning in Monte Carlo; Optimizacion de dosis basada en programacion lineal implemenetada en un un sistema para la planificacion de tratamiento en Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Ureba, A.; Palma, B. A.; Leal, A.

    2011-07-01

    To develop a more time-efficient optimization method, based on linear programming, designed to implement a multi-objective penalty function that also permits a solution for simultaneous integrated boost situations, considering two target volumes simultaneously.

  10. Study on the response of thermoluminescent dosemeters to synchrotron radiation: experimental method and Monte Carlo calculations.

    Science.gov (United States)

    Bakshi, A K; Chatterjee, S; Palani Selvam, T; Dhabekar, B S

    2010-07-01

    In the present study, the energy dependence of the response of some popular thermoluminescent dosemeters (TLDs), such as LiF:Mg,Ti, LiF:Mg,Cu,P and CaSO4:Dy, has been investigated for synchrotron radiation in the energy range of 10-34 keV. The study utilised experimental, Monte Carlo and analytical methods. The Monte Carlo calculations were based on the EGSnrc and FLUKA codes. The calculated energy responses of all the TLDs using the EGSnrc and FLUKA codes show excellent agreement with each other. The analytically calculated response shows good agreement with the Monte Carlo calculated response in the low-energy region. In the case of CaSO4:Dy, the Monte Carlo-calculated energy response is smaller by a factor of 3 at all energies in comparison with the experimental response when polytetrafluoroethylene (PTFE) (75% by wt) is included in the Monte Carlo calculations. When PTFE is ignored in the Monte Carlo calculations, the difference between the calculated and experimental responses decreases (both responses are comparable above 25 keV). For the LiF-based TLDs, the Monte Carlo-based response shows reasonable agreement with the experimental response.

  11. Monte Carlo fundamentals

    Energy Technology Data Exchange (ETDEWEB)

    Brown, F.B.; Sutton, T.M.

    1996-02-01

    This report is composed of the lecture notes from the first half of a 32-hour graduate-level course on Monte Carlo methods offered at KAPL. These notes, prepared by two of the principal developers of KAPL's RACER Monte Carlo code, cover the fundamental theory, concepts, and practices for Monte Carlo analysis. In particular, a thorough grounding in the basic fundamentals of Monte Carlo methods is presented, including random number generation, random sampling, the Monte Carlo approach to solving transport problems, computational geometry, collision physics, tallies, and eigenvalue calculations. Furthermore, modern computational algorithms for vector and parallel approaches to Monte Carlo calculations are covered in detail, including fundamental parallel and vector concepts, the event-based algorithm, master/slave schemes, parallel scaling laws, and portability issues.

  12. Monte Carlo methods

    OpenAIRE

    Bardenet, R.

    2012-01-01

    ISBN: 978-2-7598-1032-1; Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow these integrals to be computed numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which rejection sampling, importance sampling and Markov chain Monte Carlo (MCMC) methods. We give intuition on the theoretic...
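
    Of the algorithms reviewed, rejection sampling is the simplest to state: propose from an easy distribution and accept with probability proportional to the target-to-envelope ratio. A minimal sketch with an arbitrary unnormalised target and a uniform proposal follows.

```python
import numpy as np

def rejection_sample(target, proposal_sampler, proposal_pdf, bound, n, rng):
    """Draw n samples from the (unnormalised) target density by rejection.

    `bound` must satisfy target(x) <= bound * proposal_pdf(x) for all x.
    """
    out = []
    while len(out) < n:
        x = proposal_sampler(rng)
        if rng.random() * bound * proposal_pdf(x) <= target(x):
            out.append(x)
    return np.array(out)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    target = lambda x: np.exp(-0.5 * x**2) * (1 + np.sin(3 * x)**2)   # unnormalised density
    # Uniform proposal on [-5, 5] has density 1/10; the target never exceeds 2,
    # so bound = 20 guarantees target(x) <= bound * proposal_pdf(x).
    samples = rejection_sample(
        target,
        proposal_sampler=lambda r: r.uniform(-5, 5),
        proposal_pdf=lambda x: 1.0 / 10.0,
        bound=20.0,
        n=5_000,
        rng=rng,
    )
    print("sample mean:", samples.mean(), "sample std:", samples.std())
```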

  13. Exploring Monte Carlo methods

    CERN Document Server

    Dunn, William L

    2012-01-01

    Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle proble

  14. Monte Carlo Simulation of Counting Experiments.

    Science.gov (United States)

    Ogden, Philip M.

    A computer program to perform a Monte Carlo simulation of counting experiments was written. The program was based on a mathematical derivation which started with counts in a time interval. The time interval was subdivided to form a binomial distribution with no two counts in the same subinterval. Then the number of subintervals was extended to…
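
    The binomial construction described here, subdividing the counting interval until no bin can hold more than one count, can be sketched in a few lines; the rate and interval below are arbitrary, and the simulated count distribution approaches a Poisson distribution as the number of subintervals grows.

```python
import numpy as np

def simulate_counts(rate, t, n_sub, n_experiments, rng):
    """Counts in [0, t] when the interval is split into n_sub bins, each holding at most
    one count with probability rate*t/n_sub (the binomial construction described above)."""
    return rng.binomial(n_sub, rate * t / n_sub, size=n_experiments)

rng = np.random.default_rng(4)
rate, t = 7.0, 1.0                     # illustrative: mean rate 7 counts/s over a 1 s interval
for n_sub in (10, 100, 10_000):
    counts = simulate_counts(rate, t, n_sub, 100_000, rng)
    print(f"n_sub={n_sub:6d}: mean={counts.mean():.3f}  variance={counts.var():.3f}")
# As n_sub grows, the mean and variance both approach rate*t, as expected for a Poisson law.
```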

  15. Improved version of the PHOBOS Glauber Monte Carlo

    CERN Document Server

    Loizides, C; Steinberg, P

    2014-01-01

    Glauber models are used to calculate geometric quantities in the initial state of heavy ion collisions, such as the impact parameter, the number of participating nucleons and the initial eccentricity. Experimental heavy-ion collaborations, in particular at RHIC and the LHC, use Glauber Model calculations for various geometric observables. In this document, we describe the assumptions inherent to the approach and provide an updated implementation (v2) of the Monte Carlo based Glauber Model calculation, which was originally used by the PHOBOS collaboration. The main improvements w.r.t. the earlier version (arXiv:0805.4411) are the inclusion of tritium, Helium-3, and Uranium, as well as the treatment of deformed nuclei and Glauber-Gribov fluctuations of the proton in p+A collisions. A users' guide (updated to reflect changes in v2) is provided for running various calculations.

  16. Coupled carrier-phonon nonequilibrium dynamics in terahertz quantum cascade lasers: a Monte Carlo analysis

    Science.gov (United States)

    Iotti, Rita C.; Rossi, Fausto

    2013-07-01

    The operation of state-of-the-art optoelectronic quantum devices may be significantly affected by the presence of a nonequilibrium quasiparticle population to which the carrier subsystem is unavoidably coupled. This situation is particularly evident in new-generation semiconductor-heterostructure-based quantum emitters, operating both in the mid-infrared as well as in the terahertz (THz) region of the electromagnetic spectrum. In this paper, we present a Monte Carlo-based global kinetic approach, suitable for the investigation of a combined carrier-phonon nonequilibrium dynamics in realistic devices, and discuss its application with a prototypical resonant-phonon THz emitting quantum cascade laser design.

  17. MCOR - Monte Carlo depletion code for reference LWR calculations

    Energy Technology Data Exchange (ETDEWEB)

    Puente Espel, Federico, E-mail: fup104@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Tippayakul, Chanatip, E-mail: cut110@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Ivanov, Kostadin, E-mail: kni1@psu.edu [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States); Misu, Stefan, E-mail: Stefan.Misu@areva.com [AREVA, AREVA NP GmbH, Erlangen (Germany)

    2011-04-15

    Research highlights: Introduction of a reference Monte Carlo based depletion code with extended capabilities. Verification and validation results for MCOR. Utilization of MCOR for benchmarking deterministic lattice physics (spectral) codes. Abstract: The MCOR (MCnp-kORigen) code system is a Monte Carlo based depletion system for reference fuel assembly and core calculations. The MCOR code is designed as an interfacing code that provides depletion capability to the LANL Monte Carlo code by coupling two codes: MCNP5 with the AREVA NP depletion code, KORIGEN. The physical quality of both codes is unchanged. The MCOR code system has been maintained and continuously enhanced since it was initially developed and validated. The verification of the coupling was made by evaluating the MCOR code against similarly sophisticated code systems like MONTEBURNS, OCTOPUS and TRIPOLI-PEPIN. After its validation, the MCOR code has been further improved with important features. The MCOR code presents several valuable capabilities such as: (a) a predictor-corrector depletion algorithm, (b) utilization of KORIGEN as the depletion module, (c) individual depletion calculation of each burnup zone (no burnup zone grouping is required, which is particularly important for the modeling of gadolinium rings), and (d) on-line burnup cross-section generation by the Monte Carlo calculation for 88 isotopes and usage of the KORIGEN libraries for PWR and BWR typical spectra for the remaining isotopes. Besides the capabilities just mentioned, the newest enhancements of the MCOR code focus on the possibility of executing the MCNP5 calculation in sequential or parallel mode, a user-friendly automatic re-start capability, a modification of the burnup step size evaluation, and a post-processor and test-matrix, to name just the most important. The article describes the capabilities of the MCOR code system, from its design and development to its latest improvements and further ameliorations. Additionally

  18. MORSE Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  19. Quantum Monte Carlo simulation

    OpenAIRE

    Wang, Yazhen

    2011-01-01

    Contemporary scientific studies often rely on the understanding of complex quantum systems via computer simulation. This paper initiates the statistical study of quantum simulation and proposes a Monte Carlo method for estimating analytically intractable quantities. We derive the bias and variance for the proposed Monte Carlo quantum simulation estimator and establish the asymptotic theory for the estimator. The theory is used to design a computational scheme for minimizing the mean square er...

  20. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  1. Intrinsic fluorescence of protein in turbid media using empirical relation based on Monte Carlo lookup table

    Science.gov (United States)

    Einstein, Gnanatheepam; Udayakumar, Kanniyappan; Aruna, Prakasarao; Ganesan, Singaravelu

    2017-03-01

    Fluorescence of protein has been widely used in diagnostic oncology for characterizing cellular metabolism. However, the intensity of the fluorescence emission is affected by the absorbers and scatterers in tissue, which may lead to errors in estimating the exact protein content of the tissue. Extraction of intrinsic fluorescence from measured fluorescence has been achieved by different methods; among them, Monte Carlo based methods yield the highest accuracy. In this work, we have attempted to generate a lookup table for the Monte Carlo simulation of fluorescence emission by protein. Furthermore, we fitted the generated lookup table using an empirical relation. The empirical relation between measured and intrinsic fluorescence is validated using tissue phantom experiments. The proposed relation can be used for estimating the intrinsic fluorescence of protein for real-time diagnostic applications, thereby improving the clinical interpretation of fluorescence spectroscopic data.

  2. A parallel systematic-Monte Carlo algorithm for exploring conformational space.

    Science.gov (United States)

    Perez-Riverol, Yasset; Vera, Roberto; Mazola, Yuliet; Musacchio, Alexis

    2012-01-01

    Computational algorithms to explore the conformational space of small molecules are a complex and computationally demanding field in chemoinformatics. In this paper, a hybrid algorithm to explore the conformational space of organic molecules is presented. This hybrid algorithm is based on a systematic search approach combined with a Monte Carlo based method in order to obtain an ensemble of low-energy conformations simulating the flexibility of small chemical compounds. The Monte Carlo method uses the Metropolis criterion to accept or reject a conformation, with an in-house implementation of the MMFF94s force field to calculate the conformational energy. A parallel design of this algorithm, based on the message passing interface (MPI) paradigm, was implemented. The results showed a performance increase in terms of speed and efficiency.
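
    The Monte Carlo half of such a hybrid scheme is essentially a Metropolis accept/reject step over perturbed conformations. The sketch below scores random torsion perturbations with a toy cosine energy standing in for the MMFF94s evaluation, which is not reproduced here; the function names and parameters are illustrative only.

```python
import numpy as np

def torsion_energy(angles):
    """Toy stand-in for a force-field energy: a sum of 3-fold cosine torsion terms."""
    return float(np.sum(1.0 + np.cos(3.0 * angles)))

def metropolis_conformers(n_torsions, n_steps, kT, rng):
    angles = rng.uniform(-np.pi, np.pi, n_torsions)
    e = torsion_energy(angles)
    kept = []
    for _ in range(n_steps):
        trial = angles + rng.normal(0.0, 0.3, n_torsions)         # perturb the torsions
        e_trial = torsion_energy(trial)
        if rng.random() < np.exp(min(0.0, -(e_trial - e) / kT)):  # Metropolis criterion
            angles, e = trial, e_trial
            kept.append((e, angles.copy()))
    kept.sort(key=lambda pair: pair[0])
    return kept[:10]                                              # low-energy ensemble

rng = np.random.default_rng(5)
for e, _ in metropolis_conformers(n_torsions=5, n_steps=20_000, kT=0.6, rng=rng)[:3]:
    print(f"accepted conformer energy: {e:.3f}")
```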

  3. Variance Reduction Techniques in Monte Carlo Methods

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Ridder, A.A.N.; Rubinstein, R.Y.

    2010-01-01

    Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed, even though computer speed has been increasing dramatically, ever since the intr

  5. Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.; McNitt-Gray, Michael F. [Departments of Biomedical Physics and Radiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States); Mueller, Jonathon W. [United States Air Force, Keesler Air Force Base, Biloxi, Mississippi 39534 (United States); Cody, Dianna D. [University of Texas M.D. Anderson Cancer Center, Houston, Texas 77030 (United States); DeMarco, John J. [Departments of Biomedical Physics and Radiation Oncology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States)

    2015-02-15

    Purpose: The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. Methods: MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. Results: The calculated mean percent difference between TLD measurements and Monte Carlo simulations was −4.9% with standard deviation of 8.7% and a range of −22.7% to 5.7%. Conclusions: The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.

  6. Monte Carlo simulation of neutron scattering instruments

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, P.A.

    1995-12-31

    A library of Monte Carlo subroutines has been developed for the purpose of design of neutron scattering instruments. Using small-angle scattering as an example, the philosophy and structure of the library are described and the programs are used to compare instruments at continuous wave (CW) and long-pulse spallation source (LPSS) neutron facilities. The Monte Carlo results give a count-rate gain of a factor between 2 and 4 using time-of-flight analysis. This is comparable to scaling arguments based on the ratio of wavelength bandwidth to resolution width.

  7. Monte Vista and Alamosa NWR Water Use Report- 1980

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes water use at Monte Vista and Alamosa NWR for 1980. The document includes summaries of 1980 water use, 1981 water program recommendations, and...

  8. Monte Vista and Alamosa NWR Water Use Report- 1983

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes water use at Monte Vista and Alamosa NWR for 1983. The document includes summaries of 1983 water use, 1984 water program recommendations, and...

  9. Monte Vista and Alamosa NWR Water Use Report- 1986

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes water use at Monte Vista and Alamosa NWR for 1986. The document includes summaries of 1986 water use, 1987 water program recommendations, and...

  10. Monte Vista and Alamosa NWR Water Use Report- 1987

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes water use at Monte Vista and Alamosa NWR for 1987. The document includes summaries of 1987 water use, 1988 water program recommendations, and...

  11. Monte Vista and Alamosa NWR Water Use Report- 1984

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This report summarizes water use at Monte Vista and Alamosa NWR for 1984. The document includes summaries of 1984 water use, 1985 water program recommendations, and...

  12. CosmoPMC: Cosmology Population Monte Carlo

    CERN Document Server

    Kilbinger, Martin; Cappe, Olivier; Cardoso, Jean-Francois; Fort, Gersende; Prunet, Simon; Robert, Christian P; Wraith, Darren

    2011-01-01

    We present the public release of the Bayesian sampling algorithm for cosmology, CosmoPMC (Cosmology Population Monte Carlo). CosmoPMC explores the parameter space of various cosmological probes, and also provides a robust estimate of the Bayesian evidence. CosmoPMC is based on an adaptive importance sampling method called Population Monte Carlo (PMC). Various cosmology likelihood modules are implemented, and new modules can be added easily. The importance-sampling algorithm is written in C, and fully parallelised using the Message Passing Interface (MPI). Due to very little overhead, the wall-clock time required for sampling scales approximately with the number of CPUs. The CosmoPMC package contains post-processing and plotting programs, and in addition a Monte-Carlo Markov chain (MCMC) algorithm. The sampling engine is implemented in the library pmclib, and can be used independently. The software is available for download at http://www.cosmopmc.info.

  13. Study of the Transition Flow Regime using Monte Carlo Methods

    Science.gov (United States)

    Hassan, H. A.

    1999-01-01

    This NASA Cooperative Agreement presents a study of the Transition Flow Regime Using Monte Carlo Methods. The topics included in this final report are: 1) New Direct Simulation Monte Carlo (DSMC) procedures; 2) The DS3W and DS2A Programs; 3) Papers presented; 4) Miscellaneous Applications and Program Modifications; 5) Solution of Transitional Wake Flows at Mach 10; and 6) Turbulence Modeling of Shock-Dominated Flows with a k-Enstrophy Formulation.

  14. Cloud-based Monte Carlo modelling of BSSRDF for the rendering of human skin appearance (Conference Presentation)

    Science.gov (United States)

    Doronin, Alexander; Rushmeier, Holly E.; Meglinski, Igor; Bykov, Alexander V.

    2016-03-01

    We present a new Monte Carlo based approach for the modelling of the Bidirectional Scattering-Surface Reflectance Distribution Function (BSSRDF) for accurate rendering of human skin appearance. Variations in both the skin tissue structure and the major chromophores are taken into account for different ethnic and age groups. The computational solution utilizes HTML5, accelerated by graphics processing units (GPUs), and is therefore convenient for practical use on most modern computer-based devices and operating systems. The results of imitating human skin reflectance spectra, the corresponding skin colours, and examples of 3D face rendering are presented and compared with the results of phantom studies.

  15. Monte Carlo Option Pricing

    Directory of Open Access Journals (Sweden)

    Cecilia Maya

    2004-12-01

    Full Text Available The Monte Carlo method is applied to several cases of financial option pricing. The method yields a good approximation when its accuracy is compared with that of other numerical methods. The estimate produced by the crude Monte Carlo version can be made even more accurate by resorting to variance reduction techniques, among which antithetic variates and control variates are suggested. However, these techniques require greater computational effort, so they must be evaluated in terms not only of their accuracy but also of their efficiency.
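
    A minimal sketch of the crude estimator and its antithetic-variate refinement for a European call under geometric Brownian motion, the standard textbook setting; the strike, volatility and rate below are arbitrary illustrative values and are not taken from the article.

```python
import numpy as np

def mc_call_price(s0, k, r, sigma, t, n_paths, antithetic, rng):
    """Monte Carlo price of a European call under geometric Brownian motion."""
    z = rng.standard_normal(n_paths)
    drift = (r - 0.5 * sigma**2) * t
    diff = sigma * np.sqrt(t)
    disc = np.exp(-r * t)
    if antithetic:
        # Average each path with its mirrored (-z) partner before estimating the error,
        # so the reported standard error reflects the negative correlation correctly.
        pay_plus = np.maximum(s0 * np.exp(drift + diff * z) - k, 0.0)
        pay_minus = np.maximum(s0 * np.exp(drift - diff * z) - k, 0.0)
        payoff = 0.5 * disc * (pay_plus + pay_minus)
    else:
        payoff = disc * np.maximum(s0 * np.exp(drift + diff * z) - k, 0.0)
    return payoff.mean(), payoff.std(ddof=1) / np.sqrt(payoff.size)

rng = np.random.default_rng(6)
for anti in (False, True):
    price, err = mc_call_price(100.0, 105.0, 0.05, 0.2, 1.0, 200_000, anti, rng)
    print(f"antithetic={anti}: price ≈ {price:.3f} ± {err:.3f}")
```

    With the same number of random draws, the antithetic estimator typically reports a noticeably smaller standard error, which is the accuracy-versus-effort trade-off the abstract refers to.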

  16. Monte Carlo and nonlinearities

    CERN Document Server

    Dauchet, Jérémi; Blanco, Stéphane; Caliot, Cyril; Charon, Julien; Coustet, Christophe; Hafi, Mouna El; Eymet, Vincent; Farges, Olivier; Forest, Vincent; Fournier, Richard; Galtier, Mathieu; Gautrais, Jacques; Khuong, Anaïs; Pelissier, Lionel; Piaud, Benjamin; Roger, Maxime; Terrée, Guillaume; Weitz, Sebastian

    2016-01-01

    The Monte Carlo method is widely used to numerically predict systems behaviour. However, its powerful incremental design assumes a strong premise which has severely limited application so far: the estimation process must combine linearly over dimensions. Here we show that this premise can be alleviated by projecting nonlinearities on a polynomial basis and increasing the configuration-space dimension. Considering phytoplankton growth in light-limited environments, radiative transfer in planetary atmospheres, electromagnetic scattering by particles and concentrated-solar-power-plant productions, we prove the real world usability of this advance on four test-cases that were so far regarded as impracticable by Monte Carlo approaches. We also illustrate an outstanding feature of our method when applied to sharp problems with interacting particles: handling rare events is now straightforward. Overall, our extension preserves the features that made the method popular: addressing nonlinearities does not compromise o...

  17. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
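
    The "simple example: estimating π" mentioned in the outline can be written in a few lines; this is our own minimal sketch, not material from the presentation.

    import random

    def estimate_pi(n_samples, seed=0):
        """Estimate pi from the fraction of uniform points in the unit square
        that fall inside the quarter disc x^2 + y^2 <= 1."""
        rng = random.Random(seed)
        inside = sum(
            1 for _ in range(n_samples)
            if rng.random() ** 2 + rng.random() ** 2 <= 1.0
        )
        return 4.0 * inside / n_samples

    # The error shrinks roughly as 1/sqrt(n), per the Central Limit Theorem.
    for n in (1_000, 100_000, 1_000_000):
        print(n, estimate_pi(n))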

  18. Super Monte Carlo Simulation Program for Nuclear and Radiation Process: SuperMC

    Institute of Scientific and Technical Information of China (English)

    吴宜灿; 孙光耀; 吴斌; 杨琪; 陈朝斌; 党同强; 方菱; 裴曦; 王芳; 汪进; 蒋洁琼; 宋婧; 汪建业; 赵柱民; FDS团队; 胡丽琴; 龙鹏程; 何桃; 程梦云; 郑华庆; 郝丽娟; 俞盛朋

    2016-01-01

    The Monte Carlo method has distinct advantages in simulating complicated nuclear systems. However, great challenges to current MC methods and codes prevent its application in engineering projects, such as difficulties in the accurate modeling of complex geometries and material distributions, slow convergence of the calculation, and prompt and effective analysis of massive data. The Super Monte Carlo Simulation Program for Nuclear and Radiation Process (SuperMC) is designed to perform comprehensive neutronics calculations, taking radiation transport as the core and including depletion, radiation source term/dose/biohazard, material activation and transmutation, etc. It supports multi-physics coupling calculations including thermo-hydraulics, structural mechanics, chemistry, biology, etc. Key techniques including automatic and accurate modeling, highly efficient calculation and 4D visualization were developed, and more than 2000 international benchmark models and experiments were used to verify and validate SuperMC. SuperMC has been widely used in reactor engineering projects. This paper introduces an overview of SuperMC development.

  19. LMC: Logarithmantic Monte Carlo

    Science.gov (United States)

    Mantz, Adam B.

    2017-06-01

    LMC is a Markov Chain Monte Carlo engine in Python that implements adaptive Metropolis-Hastings and slice sampling, as well as the affine-invariant method of Goodman & Weare, in a flexible framework. It can be used for simple problems, but the main use case is problems where expensive likelihood evaluations are provided by less flexible third-party software, which benefit from parallelization across many nodes at the sampling level. The parallel/adaptive methods use communication through MPI, or alternatively by writing/reading files, and mostly follow the approaches pioneered by CosmoMC (ascl:1106.025).
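
    For readers unfamiliar with the sampling approach such engines build on, the sketch below shows a plain random-walk Metropolis-Hastings step in Python; it is a generic illustration under our own assumptions, not LMC's adaptive implementation, and the target density is a hypothetical stand-in for an expensive third-party likelihood.

    import numpy as np

    def metropolis(log_post, x0, n_steps, step=0.5, seed=0):
        """Random-walk Metropolis sampler for a user-supplied log-posterior."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        lp = log_post(x)
        chain = np.empty((n_steps, x.size))
        for i in range(n_steps):
            proposal = x + step * rng.standard_normal(x.size)
            lp_prop = log_post(proposal)
            if np.log(rng.random()) < lp_prop - lp:   # accept with prob min(1, ratio)
                x, lp = proposal, lp_prop
            chain[i] = x
        return chain

    # Hypothetical target: a 2-D standard normal posterior.
    chain = metropolis(lambda x: -0.5 * np.dot(x, x), [0.0, 0.0], 20_000)
    print(chain.mean(axis=0), chain.std(axis=0))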

  20. Time management for Monte-Carlo tree search in Go

    NARCIS (Netherlands)

    Baier, Hendrik; Winands, Mark H M

    2012-01-01

    The dominant approach for programs playing the game of Go is nowadays Monte-Carlo Tree Search (MCTS). While MCTS allows for fine-grained time control, little has been published on time management for MCTS programs under tournament conditions. This paper investigates the effects that various time-man

  1. Improving PWR core simulations by Monte Carlo uncertainty analysis and Bayesian inference

    CERN Document Server

    Castro, Emilio; Buss, Oliver; Garcia-Herranz, Nuria; Hoefer, Axel; Porsch, Dieter

    2016-01-01

    A Monte Carlo-based Bayesian inference model is applied to the prediction of reactor operation parameters of a PWR nuclear power plant. In this non-perturbative framework, high-dimensional covariance information describing the uncertainty of microscopic nuclear data is combined with measured reactor operation data in order to provide statistically sound, well founded uncertainty estimates of integral parameters, such as the boron letdown curve and the burnup-dependent reactor power distribution. The performance of this methodology is assessed in a blind test approach, where we use measurements of a given reactor cycle to improve the prediction of the subsequent cycle. As it turns out, the resulting improvement of the prediction quality is impressive. In particular, the prediction uncertainty of the boron letdown curve, which is of utmost importance for the planning of the reactor cycle length, can be reduced by one order of magnitude by including the boron concentration measurement information of the previous...

  2. Neutrino oscillation parameter sampling with MonteCUBES

    Science.gov (United States)

    Blennow, Mattias; Fernandez-Martinez, Enrique

    2010-01-01

    We present MonteCUBES ("Monte Carlo Utility Based Experiment Simulator"), a software package designed to sample the neutrino oscillation parameter space through Markov Chain Monte Carlo algorithms. MonteCUBES makes use of the GLoBES software so that the existing experiment definitions for GLoBES, describing long baseline and reactor experiments, can be used with MonteCUBES. MonteCUBES consists of two main parts: the first is a C library, written as a plug-in for GLoBES, implementing the Markov Chain Monte Carlo algorithm to sample the parameter space. The second part is a user-friendly graphical Matlab interface to easily read, analyze, plot and export the results of the parameter space sampling.
    Program summary
    Program title: MonteCUBES (Monte Carlo Utility Based Experiment Simulator)
    Catalogue identifier: AEFJ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFJ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 69 634
    No. of bytes in distributed program, including test data, etc.: 3 980 776
    Distribution format: tar.gz
    Programming language: C
    Computer: MonteCUBES builds and installs on 32 bit and 64 bit Linux systems where GLoBES is installed
    Operating system: 32 bit and 64 bit Linux
    RAM: Typically a few MBs
    Classification: 11.1
    External routines: GLoBES [1,2] and routines/libraries used by GLoBES
    Subprograms used: Cat Id ADZI_v1_0, Title GLoBES, Reference CPC 177 (2007) 439
    Nature of problem: Since neutrino masses do not appear in the standard model of particle physics, many models of neutrino masses also induce other types of new physics, which could affect the outcome of neutrino oscillation experiments. In general, these new physics imply high-dimensional parameter spaces that are difficult to explore using classical methods such as multi-dimensional projections and minimizations, such as those

  3. MCMini: Monte Carlo on GPGPU

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Laboratory

    2012-07-25

    MCMini is a proof of concept that demonstrates the possibility for Monte Carlo neutron transport using OpenCL with a focus on performance. This implementation, written in C, shows that tracing particles and calculating reactions on a 3D mesh can be done in a highly scalable fashion. These results demonstrate a potential path forward for MCNP or other Monte Carlo codes.

  4. Monte Carlo methods for electromagnetics

    CERN Document Server

    Sadiku, Matthew NO

    2009-01-01

    Until now, novices had to painstakingly dig through the literature to discover how to use Monte Carlo techniques for solving electromagnetic problems. Written by one of the foremost researchers in the field, Monte Carlo Methods for Electromagnetics provides a solid understanding of these methods and their applications in electromagnetic computation. Including much of his own work, the author brings together essential information from several different publications. Using a simple, clear writing style, the author begins with a historical background and review of electromagnetic theory. After addressing probability and statistics, he introduces the finite difference method as well as the fixed and floating random walk Monte Carlo methods. The text then applies the Exodus method to Laplace's and Poisson's equations and presents Monte Carlo techniques for handling Neumann problems. It also deals with whole field computation using the Markov chain, applies Monte Carlo methods to time-varying diffusion problems, and ...
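
    To make the fixed random walk idea concrete, here is a small Python sketch (ours, not from the book) that estimates the potential at an interior node of a square region governed by Laplace's equation; the boundary values and grid size are hypothetical.

    import random

    def laplace_potential(x0, y0, n, walks=20_000, seed=0):
        """Fixed random walk estimate of the potential at interior node (x0, y0) of an
        n-by-n grid with Dirichlet boundaries: V = 100 on the top edge, 0 elsewhere.
        Each walk steps to a random neighbour until it reaches the boundary, and the
        estimate is the average of the boundary potentials reached."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(walks):
            x, y = x0, y0
            while 0 < x < n and 0 < y < n:
                dx, dy = rng.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
                x, y = x + dx, y + dy
            total += 100.0 if y == n else 0.0
        return total / walks

    # At the centre of the square the exact potential is 25 V by symmetry.
    print(laplace_potential(5, 5, 10))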

  5. Monte-Carlo simulation-based statistical modeling

    CERN Document Server

    Chen, John

    2017-01-01

    This book brings together expert researchers engaged in Monte-Carlo simulation-based statistical modeling, offering them a forum to present and discuss recent issues in methodological development as well as public health applications. It is divided into three parts, with the first providing an overview of Monte-Carlo techniques, the second focusing on missing data Monte-Carlo methods, and the third addressing Bayesian and general statistical modeling using Monte-Carlo simulations. The data and computer programs used here will also be made publicly available, allowing readers to replicate the model development and data analysis presented in each chapter, and to readily apply them in their own research. Featuring highly topical content, the book has the potential to impact model development and data analyses across a wide spectrum of fields, and to spark further research in this direction.

  6. Synchrotron stereotactic radiotherapy: dosimetry by Fricke gel and Monte Carlo simulations.

    Science.gov (United States)

    Boudou, Caroline; Biston, Marie-Claude; Corde, Stéphanie; Adam, Jean-François; Ferrero, Claudio; Estève, François; Elleaume, Hélène

    2004-11-21

    Synchrotron stereotactic radiotherapy (SSR) consists of loading the tumour with a high atomic number (Z) element and exposing it to monochromatic x-rays from a synchrotron source (50-100 keV) in stereotactic conditions. The dose distribution results from both the stereotactic monochromatic x-ray irradiation and the presence of the high-Z element. The purpose of this preliminary study was to evaluate the two-dimensional dose distribution resulting solely from the irradiation geometry, using Monte Carlo simulations and a Fricke gel dosimeter. The verification of a Monte Carlo-based dosimetry was first assessed by depth dose measurements in a water tank. We thereafter used a Fricke dosimeter to compare Monte Carlo simulations with dose measurements. The Fricke dosimeter is a solution containing ferrous ions which are oxidized to ferric ions under ionizing radiation, proportionally to the absorbed dose. A cylindrical phantom filled with Fricke gel was irradiated in stereotactic conditions over several slices with a continuous beam (beam section = 0.1 x 1 cm2). The phantom and calibration vessels were then imaged by nuclear magnetic resonance. The measured doses were fairly consistent with those predicted by Monte Carlo simulations. However, the measured maximum absolute dose was underestimated by 10% relative to the calculation. The loss of information in the high-dose region is explained by the diffusion of ferric ions. Monte Carlo simulation is the most accurate tool for dosimetry involving complex geometries made of heterogeneous materials. Although the technique requires improvements, gel dosimetry remains an essential tool for the experimental verification of dose distribution in SSR with millimetre precision.

  7. Metropolis Methods for Quantum Monte Carlo Simulations

    OpenAIRE

    Ceperley, D. M.

    2003-01-01

    Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of different ways for the simulation of continuum quantum many-body systems. This paper will consider some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: Variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e. diffusion Monte Carlo with rejection), multilevel sampling in path integral Monte Carlo, the sampling of permutations, ...

  8. Lectures on Monte Carlo methods

    CERN Document Server

    Madras, Neal

    2001-01-01

    Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati

  9. High-Throughput Computation and the Applicability of Monte Carlo Integration in Fatigue Load Estimation of Floating Offshore Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Graf, Peter A.; Stewart, Gordon; Lackner, Matthew; Dykes, Katherine; Veers, Paul

    2016-05-01

    Long-term fatigue loads for floating offshore wind turbines are hard to estimate because they require the evaluation of the integral of a highly nonlinear function over a wide variety of wind and wave conditions. Current design standards involve scanning over a uniform rectangular grid of metocean inputs (e.g., wind speed and direction and wave height and period), which becomes intractable in high dimensions as the number of required evaluations grows exponentially with dimension. Monte Carlo integration offers a potentially efficient alternative because it has theoretical convergence proportional to the inverse of the square root of the number of samples, which is independent of dimension. In this paper, we first report on the integration of the aeroelastic code FAST into NREL's systems engineering tool, WISDEM, and the development of a high-throughput pipeline capable of sampling from arbitrary distributions, running FAST on a large scale, and postprocessing the results into estimates of fatigue loads. Second, we use this tool to run a variety of studies aimed at comparing grid-based and Monte Carlo-based approaches to calculating long-term fatigue loads. We observe that for more than a few dimensions, the Monte Carlo approach can represent a large improvement in computational efficiency, but that as nonlinearity increases, the effectiveness of Monte Carlo is correspondingly reduced. The present work sets the stage for future research focusing on using advanced statistical methods for analysis of wind turbine fatigue as well as extreme loads.
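
    The dimensional argument in the abstract can be illustrated with a toy integrand (our own sketch, not the WISDEM/FAST pipeline): the tensor-grid cost grows exponentially with the number of metocean dimensions, while the Monte Carlo sample size, and hence its ~1/sqrt(N) error, is chosen independently of dimension.

    import numpy as np

    def grid_mean(f, dim, pts_per_dim):
        """Average of f over the unit hypercube on a tensor grid
        (cost grows as pts_per_dim ** dim)."""
        axes = [np.linspace(0.0, 1.0, pts_per_dim)] * dim
        pts = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, dim)
        return f(pts).mean(), len(pts)

    def mc_mean(f, dim, n_samples, seed=0):
        """Monte Carlo average of f over the unit hypercube (error ~ 1/sqrt(n_samples))."""
        pts = np.random.default_rng(seed).random((n_samples, dim))
        return f(pts).mean(), n_samples

    # Hypothetical smooth integrand standing in for a fatigue-load response.
    f = lambda x: np.prod(np.sin(np.pi * x), axis=-1)
    for d in (2, 4, 6):
        print(d, grid_mean(f, d, 9), mc_mean(f, d, 10_000))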

  10. Quantum Monte Carlo for vibrating molecules

    Energy Technology Data Exchange (ETDEWEB)

    Brown, W.R. [Univ. of California, Berkeley, CA (United States). Chemistry Dept.]|[Lawrence Berkeley National Lab., CA (United States). Chemical Sciences Div.

    1996-08-01

    Quantum Monte Carlo (QMC) has successfully computed the total electronic energies of atoms and molecules. The main goal of this work is to use correlation function quantum Monte Carlo (CFQMC) to compute the vibrational state energies of molecules given a potential energy surface (PES). In CFQMC, an ensemble of random walkers simulates the diffusion and branching processes of the imaginary-time time-dependent Schroedinger equation in order to evaluate the matrix elements. The program QMCVIB was written to perform multi-state VMC and CFQMC calculations and was employed for several calculations of the H_2O and C_3 vibrational states, using 7 PESs, 3 trial wavefunction forms, two methods of non-linear basis function parameter optimization, and both serial and parallel computers. Different wavefunction forms were required to construct accurate trial wavefunctions for H_2O and C_3. For C_3, the non-linear parameters were optimized with respect to the sum of the energies of several low-lying vibrational states, and the Monte Carlo data were collected into blocks to stabilize the statistical error estimates. Accurate vibrational state energies were computed using both serial and parallel QMCVIB programs. Comparison of vibrational state energies computed from the three C_3 PESs suggested that a non-linear equilibrium geometry PES is the most accurate and that discrete potential representations may be used to conveniently determine vibrational state energies.

  11. Monte Carlo simulations of nanoscale Ne+ ion beam sputtering: investigating the influence of surface effects, interstitial formation, and the nanostructural evolution

    Science.gov (United States)

    Mahady, Kyle; Tan, Shida; Greenzweig, Yuval; Livengood, Richard; Raveh, Amir; Rack, Philip

    2017-01-01

    We present an updated version of our Monte-Carlo based code for the simulation of ion beam sputtering. This code simulates the interaction of energetic ions with a target, and tracks the cumulative damage, enabling it to simulate the dynamic evolution of nanostructures as material is removed. The updated code described in this paper is significantly faster, permitting the inclusion of new features, namely routines to handle interstitial atoms, and to reduce the surface energy as the structure would otherwise develop energetically unfavorable surface porosity. We validate our code against the popular Monte-Carlo code SRIM-TRIM, and study the development of nanostructures from Ne+ ion beam milling in a copper target.

  12. Plasma physics code contribution to the Mont-Blanc project

    OpenAIRE

    Sáez, Xavier; Soba, Alejandro; Mantsinen, Mervi

    2015-01-01

    This work develops strategies for adapting a particle-in-cell code to heterogeneous computer architectures and, in particular, to an ARM-based prototype of the Mont-Blanc project using OmpSs programming model and the OpenMP and OpenCL languages.

  13. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros

    2016-08-29

    In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context. That is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
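
    A minimal sketch of the plain multilevel Monte Carlo telescoping identity that the abstract builds on (the SMC extension of the paper is not reproduced here): the quantity estimated is E[X_T] for a geometric Brownian motion discretized by forward Euler, and all parameter values and function names are hypothetical.

    import numpy as np

    MU, SIGMA, X0, T = 0.05, 0.2, 1.0, 1.0   # hypothetical SDE parameters

    def euler_paths(h, n_paths, rng, dW=None):
        """Forward Euler end-points X_T of dX = MU*X dt + SIGMA*X dW with step h.
        Supplying dW lets a coarse path reuse the fine path's Brownian increments."""
        n_steps = int(round(T / h))
        if dW is None:
            dW = rng.standard_normal((n_steps, n_paths)) * np.sqrt(h)
        x = np.full(n_paths, X0)
        for k in range(n_steps):
            x = x + MU * x * h + SIGMA * x * dW[k]
        return x, dW

    def mlmc_estimate(L, n_paths, h0=0.5, seed=0):
        """Telescoping estimator: level-0 mean plus corrections E[P_l - P_{l-1}]
        computed from coupled fine/coarse discretizations."""
        rng = np.random.default_rng(seed)
        total = euler_paths(h0, n_paths, rng)[0].mean()
        for l in range(1, L + 1):
            fine, dW = euler_paths(h0 / 2 ** l, n_paths, rng)
            dW_coarse = dW.reshape(-1, 2, n_paths).sum(axis=1)  # pair up fine increments
            coarse, _ = euler_paths(h0 / 2 ** (l - 1), n_paths, rng, dW=dW_coarse)
            total += (fine - coarse).mean()
        return total

    print(mlmc_estimate(L=4, n_paths=20_000))   # exact value is exp(MU*T) ≈ 1.0513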

  14. PEPSI — a Monte Carlo generator for polarized leptoproduction

    Science.gov (United States)

    Mankiewicz, L.; Schäfer, A.; Veltri, M.

    1992-09-01

    We describe PEPSI (Polarized Electron Proton Scattering Interactions), a Monte Carlo program for polarized deep inelastic leptoproduction mediated by electromagnetic interaction, and explain how to use it. The code is a modification of the LEPTO 4.3 Lund Monte Carlo for unpolarized scattering. The hard virtual gamma-parton scattering is generated according to the polarization-dependent QCD cross-section of the first order in α_S. PEPSI requires the standard polarization-independent JETSET routines to simulate the fragmentation into final hadrons.

  15. Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz Ram model

    Science.gov (United States)

    Morin, Mario A.; Ficarazzo, Francesco

    2006-04-01

    Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications. These technologies include increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables such as rock mass properties, site geology, in situ fracturing and blasting parameters, and as such has no complete theoretical solution for its prediction. However, empirical models for the estimation of the size distribution of rock fragments have been developed. In this study, a Monte Carlo-based blast fragmentation simulator, based on the Kuz-Ram fragmentation model, has been developed to predict the entire fragmentation size distribution, taking into account intact and jointed rock properties, the type and properties of explosives, and the drilling pattern. Results produced by this simulator were quite favorable when compared with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as increase our confidence in these empirical models. This understanding will translate into improvements in blasting operations, their corresponding costs and the overall economics of open pit mines and rock quarries.
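
    The following Python sketch shows, under our own assumptions, how a Monte Carlo layer can be wrapped around the Kuz-Ram relations described above: uncertain inputs (here only the rock factor and powder factor, with hypothetical distributions) are sampled, pushed through the Kuznetsov mean-size equation and a Rosin-Rammler size curve, and summarized as percentile fragmentation curves. It is an illustration, not the simulator developed in the paper.

    import numpy as np

    def rosin_rammler_passing(x, x50, n):
        """Cumulative fraction passing size x for a Rosin-Rammler distribution with
        median size x50 and uniformity index n (the Kuz-Ram size curve)."""
        xc = x50 / np.log(2.0) ** (1.0 / n)          # characteristic size
        return 1.0 - np.exp(-(x / xc) ** n)

    def monte_carlo_kuz_ram(x_eval, n_trials=10_000, seed=0):
        """Propagate assumed input uncertainty through the Kuznetsov equation and
        return the 10th/50th/90th percentile passing curves."""
        rng = np.random.default_rng(seed)
        A = rng.normal(7.0, 1.0, n_trials)            # rock factor (assumed spread)
        K = rng.normal(0.6, 0.05, n_trials)           # powder factor, kg/m^3 (assumed)
        Q, E = 120.0, 100.0                           # charge per hole (kg), relative weight strength
        x50 = A * K ** -0.8 * Q ** (1.0 / 6.0) * (115.0 / E) ** 0.95   # mean size, cm
        curves = rosin_rammler_passing(x_eval[None, :], x50[:, None], 1.4)
        return np.percentile(curves, [10, 50, 90], axis=0)

    sizes = np.linspace(1.0, 100.0, 50)               # fragment sizes in cm
    p10, p50, p90 = monte_carlo_kuz_ram(sizes)
    print(p50[::10])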

  16. A Monte Carlo model for 3D grain evolution during welding

    Science.gov (United States)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-09-01

    Welding is one of the most wide-spread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. The model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.

  17. Equilibrium Statistics: Monte Carlo Methods

    Science.gov (United States)

    Kröger, Martin

    Monte Carlo methods use random numbers, or ‘random’ sequences, to sample from a known shape of a distribution, or to extract a distribution by other means, and, in the context of this book, to (i) generate representative equilibrated samples prior to being subjected to external fields, or (ii) evaluate high-dimensional integrals. Recipes for both topics, and some more general methods, are summarized in this chapter. It is important to realize that Monte Carlo should be as artificial as possible to be efficient and elegant. Advanced Monte Carlo ‘moves’, required to optimize the speed of algorithms for a particular problem at hand, are outside the scope of this brief introduction. One particular modern example is the wavelet-accelerated MC sampling of polymer chains [406].

  18. Monte Carlo systems used for treatment planning and dose verification

    Energy Technology Data Exchange (ETDEWEB)

    Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany); Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama); Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)

    2017-04-15

    General-purpose radiation transport Monte Carlo codes have been used for estimation of the absorbed dose distribution in external photon and electron beam radiotherapy patients for several decades. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation time required. To take advantage of radiation transport Monte Carlo codes applied to routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and the simulation time required to compute a set of benchmark calculations. (orig.)

  19. Monte Carlo tests of the ELIPGRID-PC algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, J.R.

    1995-04-01

    The standard tool for calculating the probability of detecting pockets of contamination called hot spots has been the ELIPGRID computer code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM® PC. However, no known independent validation of the ELIPGRID algorithm exists. This document describes a Monte Carlo simulation-based validation of a modified version of the ELIPGRID-PC code. The modified ELIPGRID-PC code is shown to match Monte Carlo-calculated hot-spot detection probabilities to within ±0.5% for 319 out of 320 test cases. The one exception, a very thin elliptical hot spot located within a rectangular sampling grid, differed from the Monte Carlo-calculated probability by about 1%. These results provide confidence in the ability of the modified ELIPGRID-PC code to accurately predict hot-spot detection probabilities within an acceptable range of error.
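
    For readers who want to see what such a Monte Carlo check looks like, here is a small Python sketch (our own, not the code used in the report) that estimates the probability of a square sampling grid hitting an elliptical hot spot by randomizing the hot spot's centre and orientation; the sizes and trial count are hypothetical.

    import math, random

    def detection_probability(semi_major, shape, spacing, n_trials=50_000, seed=0):
        """Probability that an ellipse (semi-major axis, shape = minor/major ratio)
        with random centre and orientation contains at least one node of a square
        sampling grid with the given spacing."""
        rng = random.Random(seed)
        a, b = semi_major, semi_major * shape
        reach = int(a // spacing) + 2                 # grid nodes to test around the centre
        hits = 0
        for _ in range(n_trials):
            cx, cy = rng.uniform(0, spacing), rng.uniform(0, spacing)
            theta = rng.uniform(0, math.pi)
            cos_t, sin_t = math.cos(theta), math.sin(theta)
            found = False
            for i in range(-reach, reach + 1):
                for j in range(-reach, reach + 1):
                    x, y = i * spacing - cx, j * spacing - cy
                    u = x * cos_t + y * sin_t         # rotate node into the ellipse frame
                    v = -x * sin_t + y * cos_t
                    if (u / a) ** 2 + (v / b) ** 2 <= 1.0:
                        found = True
                        break
                if found:
                    break
            hits += found
        return hits / n_trials

    print(detection_probability(semi_major=1.0, shape=0.5, spacing=2.0))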

  20. Monte Carlo simulation of laser attenuation characteristics in fog

    Science.gov (United States)

    Wang, Hong-Xia; Sun, Chao; Zhu, You-zhang; Sun, Hong-hui; Li, Pan-shi

    2011-06-01

    Based on the Mie scattering theory and the gamma size distribution model, the scattering extinction parameter of a spherical fog droplet is calculated. For the transmission attenuation of the laser in fog, a Monte Carlo simulation model is established, and the dependence of the attenuation ratio on visibility and field angle is computed and analysed using a program developed in MATLAB. The results of the Monte Carlo method in this paper are compared with the results of the single scattering method. The results show that the influence of multiple scattering needs to be considered when visibility is low, since single scattering calculations then have larger errors. The phenomenon of multiple scattering can be interpreted better when the Monte Carlo method is used to calculate the attenuation ratio of the laser transmitting in fog.

  1. Monte Carlo Simulation in Statistical Physics An Introduction

    CERN Document Server

    Binder, Kurt

    2010-01-01

    Monte Carlo Simulation in Statistical Physics deals with the computer simulation of many-body systems in condensed-matter physics and related fields of physics, chemistry and beyond (traffic flows, stock market fluctuations, etc.). Using random numbers generated by a computer, probability distributions are calculated, allowing the estimation of the thermodynamic properties of various systems. This book describes the theoretical background to several variants of these Monte Carlo methods and gives a systematic presentation from which newcomers can learn to perform such simulations and to analyze their results. The fifth edition covers Classical as well as Quantum Monte Carlo methods. Furthermore a new chapter on the sampling of free-energy landscapes has been added. To help students in their work a special web server has been installed to host programs and discussion groups (http://wwwcp.tphys.uni-heidelberg.de). Prof. Binder was awarded the Berni J. Alder CECAM Award for Computational Physics 2001 as well ...

  2. In Silico Generation of Peptides by Replica Exchange Monte Carlo: Docking-Based Optimization of Maltose-Binding-Protein Ligands.

    Directory of Open Access Journals (Sweden)

    Anna Russo

    Full Text Available Short peptides can be designed in silico and synthesized through automated techniques, making them advantageous and versatile protein binders. A number of docking-based algorithms allow for a computational screening of peptides as binders. Here we developed ex-novo peptides targeting the maltose site of the Maltose Binding Protein, the prototypical system for the study of protein-ligand recognition. We used a Monte Carlo based protocol to computationally evolve a set of octapeptides starting from a polyalanine sequence. We screened in silico the candidate peptides and characterized their binding abilities by surface plasmon resonance, fluorescence and electrospray ionization mass spectrometry assays. These experiments showed the designed binders to recognize their target with micromolar affinity. We finally discuss the obtained results in the light of further improvement in the ex-novo optimization of peptide-based binders.

  3. Physics study of microbeam radiation therapy with PSI-version of Monte Carlo code GEANT as a new computational tool

    CERN Document Server

    Stepanek, J; Laissue, J A; Lyubimova, N; Di Michiel, F; Slatkin, D N

    2000-01-01

    Microbeam radiation therapy (MRT) is a currently experimental method of radiotherapy which is mediated by an array of parallel microbeams of synchrotron-wiggler-generated X-rays. Suitably selected, nominally supralethal doses of X-rays delivered to parallel microslices of tumor-bearing tissues in rats can be either palliative or curative while causing little or no serious damage to contiguous normal tissues. Although the pathogenesis of MRT-mediated tumor regression is not understood, as in all radiotherapy such understanding will be based ultimately on our understanding of the relationships among the following three factors: (1) microdosimetry, (2) damage to normal tissues, and (3) therapeutic efficacy. Although physical microdosimetry is feasible, published information on MRT microdosimetry to date is computational. This report describes Monte Carlo-based computational MRT microdosimetry using photon and/or electron scattering and photoionization cross-section data in the 1 eV through 100 GeV range distrib...

  4. Monte Carlo Hamiltonian: Linear Potentials

    Institute of Scientific and Technical Information of China (English)

    LUO Xiang-Qian; LIU Jin-Jiang; HUANG Chun-Qing; JIANG Jun-Qin; Helmut KROGER

    2002-01-01

    We further study the validity of the Monte Carlo Hamiltonian method. The advantage of the method, in comparison with the standard Monte Carlo Lagrangian approach, is its capability to study the excited states. We consider two quantum mechanical models: a symmetric one, V(x) = |x|/2; and an asymmetric one, V(x) = ∞ for x < 0 and V(x) = x for x ≥ 0. The results for the spectrum, wave functions and thermodynamical observables are in agreement with the analytical or Runge-Kutta calculations.

  5. Status of Monte-Carlo Event Generators

    Energy Technology Data Exchange (ETDEWEB)

    Hoeche, Stefan; /SLAC

    2011-08-11

    Recent progress on general-purpose Monte-Carlo event generators is reviewed with emphasis on the simulation of hard QCD processes and subsequent parton cascades. Describing full final states of high-energy particle collisions in contemporary experiments is an intricate task. Hundreds of particles are typically produced, and the reactions involve both large and small momentum transfer. The high-dimensional phase space makes an exact solution of the problem impossible. Instead, one typically resorts to regarding events as factorized into different steps, ordered descending in the mass scales or invariant momentum transfers which are involved. In this picture, a hard interaction, described through fixed-order perturbation theory, is followed by multiple Bremsstrahlung emissions off initial- and final-state particles and, finally, by the hadronization process, which binds QCD partons into color-neutral hadrons. Each of these steps can be treated independently, which is the basic concept inherent to general-purpose event generators. Their development is nowadays often focused on an improved description of radiative corrections to hard processes through perturbative QCD. In this context, the concept of jets is introduced, which allows sprays of hadronic particles in detectors to be related to the partons in perturbation theory. In this talk, we briefly review recent progress on perturbative QCD in event generation. The main focus lies on the general-purpose Monte-Carlo programs HERWIG, PYTHIA and SHERPA, which will be the workhorses for LHC phenomenology. A detailed description of the physics models included in these generators can be found in [8]. We also discuss matrix-element generators, which provide the parton-level input for general-purpose Monte Carlo.

  6. Experimental and Monte Carlo evaluation of Eclipse treatment planning system for effects on dose distribution of the hip prostheses

    Energy Technology Data Exchange (ETDEWEB)

    Çatlı, Serap, E-mail: serapcatli@hotmail.com [Gazi University, Faculty of Sciences, 06500 Teknikokullar, Ankara (Turkey); Tanır, Güneş [Gazi University, Faculty of Sciences, 06500 Teknikokullar, Ankara (Turkey)

    2013-10-01

    The present study aimed to investigate the effects of titanium, titanium alloy, and stainless steel hip prostheses on dose distribution based on the Monte Carlo simulation method, as well as the accuracy of the Eclipse treatment planning system (TPS) at 6 and 18 MV photon energies. In the present study the pencil beam convolution (PBC) method implemented in the Eclipse TPS was compared to the Monte Carlo method and ionization chamber measurements. The present findings show that if high-Z material is used in prosthesis, large dose changes can occur due to scattering. The variance in dose observed in the present study was dependent on material type, density, and atomic number, as well as photon energy; as photon energy increased back scattering decreased. The dose perturbation effect of hip prostheses was significant and could not be predicted accurately by the PBC method for hip prostheses. The findings show that for accurate dose calculation the Monte Carlo-based TPS should be used in patients with hip prostheses.

  7. Experimental and Monte Carlo evaluation of Eclipse treatment planning system for effects on dose distribution of the hip prostheses.

    Science.gov (United States)

    Çatlı, Serap; Tanır, Güneş

    2013-01-01

    The present study aimed to investigate the effects of titanium, titanium alloy, and stainless steel hip prostheses on dose distribution based on the Monte Carlo simulation method, as well as the accuracy of the Eclipse treatment planning system (TPS) at 6 and 18 MV photon energies. In the present study the pencil beam convolution (PBC) method implemented in the Eclipse TPS was compared to the Monte Carlo method and ionization chamber measurements. The present findings show that if high-Z material is used in prosthesis, large dose changes can occur due to scattering. The variance in dose observed in the present study was dependent on material type, density, and atomic number, as well as photon energy; as photon energy increased back scattering decreased. The dose perturbation effect of hip prostheses was significant and could not be predicted accurately by the PBC method for hip prostheses. The findings show that for accurate dose calculation the Monte Carlo-based TPS should be used in patients with hip prostheses.

  8. Monte-Carlo scatter correction for cone-beam computed tomography with limited scan field-of-view

    Science.gov (United States)

    Bertram, Matthias; Sattel, Timo; Hohmann, Steffen; Wiegert, Jens

    2008-03-01

    In flat detector cone-beam computed tomography (CBCT), scattered radiation is a major source of image degradation, making accurate a posteriori scatter correction inevitable. A potential solution to this problem is provided by computerized scatter correction based on Monte-Carlo simulations. Using this technique, the detected distributions of X-ray scatter are estimated for various viewing directions using Monte-Carlo simulations of an intermediate reconstruction. However, as a major drawback, for standard CBCT geometries and with standard size flat detectors such as mounted on interventional C-arms, the scan field of view is too small to accommodate the human body without lateral truncations, and thus this technique cannot be readily applied. In this work, we present a novel method for constructing a model of the object in a laterally and possibly also axially extended field of view, which enables meaningful application of Monte-Carlo based scatter correction even in case of heavy truncations. Evaluation is based on simulations of a clinical CT data set of a human abdomen, which strongly exceeds the field of view of the simulated C-arm based CBCT imaging geometry. By using the proposed methodology, almost complete removal of scatter-caused inhomogeneities is demonstrated in reconstructed images.

  9. Monte Carlo Particle Lists: MCPL

    CERN Document Server

    Kittelmann, Thomas; Knudsen, Erik B; Willendrup, Peter; Cai, Xiao Xiao; Kanaki, Kalliopi

    2016-01-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.

  10. Validation of the Monte Carlo criticality program KENO IV and the Hansen-Roach sixteen-energy-group-cross sections for high-assay uranium systems. [KENO IV criticality code

    Energy Technology Data Exchange (ETDEWEB)

    Handley, G. R.; Masters, L. C.; Stachowiak, R. V.

    1981-04-10

    Validation of the Monte Carlo criticality code, KENO IV, and the Hansen-Roach sixteen-energy-group cross sections was accomplished by calculating the effective neutron multiplication constant, k_eff, of 29 experimentally critical assemblies which had uranium enrichments of 92.6% or higher in the uranium-235 isotope. The experiments were chosen so that a large variety of geometries and of neutron energy spectra were covered. Problems in calculating the k_eff of systems with high-uranium-concentration uranyl nitrate solution that were minimally reflected or unreflected resulted in the separate examination of five cases.

  11. Applications of Monte Carlo Methods in Calculus.

    Science.gov (United States)

    Gordon, Sheldon P.; Gordon, Florence S.

    1990-01-01

    Discusses the application of probabilistic ideas, especially Monte Carlo simulation, to calculus. Describes some applications using the Monte Carlo method: Riemann sums; maximizing and minimizing a function; mean value theorems; and testing conjectures. (YP)
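
    A Monte Carlo estimate of a definite integral, close in spirit to the Riemann-sum application listed above, takes only a few lines; the sketch below is ours rather than the article's.

    import math
    import random

    def mc_integral(f, a, b, n_samples, seed=0):
        """Monte Carlo estimate of the integral of f on [a, b]:
        (b - a) times the average of f at uniformly random points."""
        rng = random.Random(seed)
        return (b - a) * sum(f(rng.uniform(a, b)) for _ in range(n_samples)) / n_samples

    # Classic calculus example: the integral of sin(x) on [0, pi] is exactly 2.
    for n in (100, 10_000, 1_000_000):
        print(n, mc_integral(math.sin, 0.0, math.pi, n))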

  12. Haplotype association analyses in resources of mixed structure using Monte Carlo testing

    Directory of Open Access Journals (Sweden)

    Thomas Alun

    2010-12-01

    Full Text Available Abstract Background Genomewide association studies have resulted in a great many genomic regions that are likely to harbor disease genes. Thorough interrogation of these specific regions is the logical next step, including regional haplotype studies to identify risk haplotypes upon which the underlying critical variants lie. Pedigrees ascertained for disease can be powerful for genetic analysis due to the cases being enriched for genetic disease. Here we present a Monte Carlo based method to perform haplotype association analysis. Our method, hapMC, allows for the analysis of full-length and sub-haplotypes, including imputation of missing data, in resources of nuclear families, general pedigrees, case-control data or mixtures thereof. Both traditional association statistics and transmission/disequilibrium statistics can be performed. The method includes a phasing algorithm that can be used in large pedigrees and optional use of pseudocontrols. Results Our new phasing algorithm substantially outperformed the standard expectation-maximization algorithm that is ignorant of pedigree structure, and hence is preferable for resources that include pedigree structure. Through simulation we show that our Monte Carlo procedure maintains the correct type 1 error rates for all resource types. Power comparisons suggest that transmission-disequilibrium statistics are superior for performing association in resources of only nuclear families. For mixed structure resources, however, the newly implemented pseudocontrol approach appears to be the best choice. Results also indicated the value of large high-risk pedigrees for association analysis, which, in the simulations considered, were comparable in power to case-control resources of the same sample size. Conclusions We propose hapMC as a valuable new tool to perform haplotype association analyses, particularly for resources of mixed structure. The availability of meta-association and haplotype-mining modules in

  13. Determination of the detective quantum efficiency of gamma camera systems: a Monte Carlo study.

    Science.gov (United States)

    Eriksson, Ida; Starck, Sven-Ake; Båth, Magnus

    2010-01-01

    The purpose of the present work was to investigate the validity of using the Monte Carlo technique for determining the detective quantum efficiency (DQE) of a gamma camera system and to use this technique in investigating the DQE behaviour of a gamma camera system and its dependency on a number of relevant parameters. The Monte Carlo-based software SIMIND, simulating a complete gamma camera system, was used in the present study. The modulation transfer function (MTF) of the system was determined from simulated images of a point source of (99m)Tc, positioned at different depths in a water phantom. Simulations were performed using different collimators and energy windows. The MTF of the system was combined with the photon yield and the sensitivity, obtained from the simulations, to form the frequency-dependent DQE of the system. As figure-of-merit (FOM), the integral of the 2D DQE was used. The simulated DQE curves agreed well with published data. As expected, there was a strong dependency of the shape and magnitude of the DQE curve on the collimator, energy window and imaging position. The highest FOM was obtained for a lower energy threshold of 127 keV for objects close to the detector and 131 keV for objects deeper in the phantom, supporting an asymmetric window setting to reduce scatter. The Monte Carlo software SIMIND can be used to determine the DQE of a gamma camera system from a simulated point source alone. The optimal DQE results in the present study were obtained for parameter settings close to the clinically used settings.

  14. (U) Introduction to Monte Carlo Methods

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-20

    Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.

  15. Theoretical study of transition state structure and reaction enthalpy of the F + H2-->HF + H reaction by a diffusion quantum Monte Carlo approach.

    Science.gov (United States)

    Lu, Shih-I

    2005-05-15

    Ab initio calculations of transition state structure and reaction enthalpy of the F + H2-->HF + H reaction have been carried out by the fixed-node diffusion quantum Monte Carlo method in this study. The Monte Carlo sampling is based on the Ornstein-Uhlenbeck random walks guided by a trial wave function constructed from the floating spherical Gaussian orbitals and spherical Gaussian geminals. The Monte Carlo calculated barrier height of 1.09(16) kcal/mol is consistent with the experimental values, 0.86(10)/1.18(10) kcal/mol, and the calculated value from the multireference-type coupled-cluster (MRCC) calculation with the aug-cc-pVQZ(F)/cc-pVQZ(H) basis set, 1.11 kcal/mol. The Monte Carlo-based calculation also gives a similar value of the reaction enthalpy, -32.00(4) kcal/mol, compared with the experimental value, -32.06(17) kcal/mol, and the calculated value from a MRCC/aug-cc-pVQZ(F)/cc-pVQZ(H) calculation, -31.94 kcal/mol. This study clearly indicates a further application of the random-walk-based approach in the field of quantum chemical calculation.

  16. Density matrix quantum Monte Carlo

    CERN Document Server

    Blunt, N S; Spencer, J S; Foulkes, W M C

    2013-01-01

    This paper describes a quantum Monte Carlo method capable of sampling the full density matrix of a many-particle system, thus granting access to arbitrary reduced density matrices and allowing expectation values of complicated non-local operators to be evaluated easily. The direct sampling of the density matrix also raises the possibility of calculating previously inaccessible entanglement measures. The algorithm closely resembles the recently introduced full configuration interaction quantum Monte Carlo method, but works all the way from infinite to zero temperature. We explain the theory underlying the method, describe the algorithm, and introduce an importance-sampling procedure to improve the stochastic efficiency. To demonstrate the potential of our approach, the energy and staggered magnetization of the isotropic antiferromagnetic Heisenberg model on small lattices and the concurrence of one-dimensional spin rings are compared to exact or well-established results. Finally, the nature of the sign problem...

  17. Efficient kinetic Monte Carlo simulation

    Science.gov (United States)

    Schulze, Tim P.

    2008-02-01

    This paper concerns kinetic Monte Carlo (KMC) algorithms that have a single-event execution time independent of the system size. Two methods are presented: one that combines the use of inverted-list data structures with rejection Monte Carlo, and a second that combines inverted lists with the Marsaglia-Norman-Cannon algorithm. The resulting algorithms apply to models with rates that are determined by the local environment but are otherwise arbitrary, time-dependent and spatially heterogeneous. While especially useful for crystal growth simulation, the algorithms are presented from the point of view that KMC is the numerical task of simulating a single realization of a Markov process, allowing application to a broad range of areas where heterogeneous random walks are the dominant simulation cost.
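
    The rejection idea referred to above can be sketched in a few lines of Python (our own toy, without the inverted-list machinery of the paper): each attempt picks a site uniformly and accepts it with probability rate/r_max, so the per-attempt cost does not grow with system size. The model, rates and helper names are hypothetical, and r_max is assumed never to be exceeded.

    import math
    import random

    def rejection_kmc(rates, t_end, execute, seed=0):
        """Rejection kinetic Monte Carlo over a fixed set of sites."""
        rng = random.Random(seed)
        sites = list(rates)
        r_max = max(rates.values())            # assumed to remain an upper bound
        t = 0.0
        while t < t_end:
            t += -math.log(rng.random()) / (len(sites) * r_max)   # next attempt time
            site = rng.choice(sites)
            if rng.random() < rates[site] / r_max:                # accept or reject
                execute(site, rates)

    # Hypothetical two-site-type model that simply counts executed events.
    counts = {"fast": 0, "slow": 0}

    def bump(site, rates):
        counts[site] += 1

    rejection_kmc({"fast": 1.0, "slow": 0.1}, t_end=1000.0, execute=bump)
    print(counts)   # events occur roughly in proportion to the rates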

  18. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes a multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607-617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al., Adaptive Monte Carlo algorithms for stopped diffusion, in Multiscale Methods in Science and Engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59-88, Springer, Berlin, 2005; Kyoung-Sook Moon et al., Stoch. Anal. Appl. 23(3):511-558, 2005; Kyoung-Sook Moon et al., An adaptive algorithm for ordinary, stochastic and partial differential equations, in Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325-343, Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al., Comm. Pure Appl. Math. 54(10):1169-1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL): from O(TOL^-3) when using a single-level version of the adaptive algorithm to O((TOL^-1 log(TOL))^2).

  19. The CCFM Monte Carlo generator CASCADE 2.2.0

    CERN Document Server

    Jung, H; Deak, M; Grebenyuk, A; Hautmann, F; Hentschinski, M; Knutsson, A; Kraemer, M; Kutak, K; Lipatov, A; Zotov, N

    2010-01-01

    CASCADE is a full hadron level Monte Carlo event generator for ep, γp, pp̄ and pp processes, which uses the CCFM evolution equation for the initial state cascade in a backward evolution approach supplemented with off-shell matrix elements for the hard scattering. A detailed program description is given, with emphasis on parameters the user wants to change and variables which completely specify the generated events.

  20. Monte Carlo simulations to replace film dosimetry in IMRT verification

    OpenAIRE

    Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil beam based treatment planning system (TPS) using an appropriate quality assu...

  1. A separable shadow Hamiltonian hybrid Monte Carlo method.

    Science.gov (United States)

    Sweet, Christopher R; Hampton, Scott S; Skeel, Robert D; Izaguirre, Jesús A

    2009-11-07

    Hybrid Monte Carlo (HMC) is a rigorous sampling method that uses molecular dynamics (MD) as a global Monte Carlo move. The acceptance rate of HMC decays exponentially with system size. The shadow hybrid Monte Carlo (SHMC) was previously introduced to reduce this performance degradation by sampling instead from the shadow Hamiltonian defined for MD when using a symplectic integrator. SHMC's performance is limited by the need to generate momenta for the MD step from a nonseparable shadow Hamiltonian. We introduce the separable shadow Hamiltonian hybrid Monte Carlo (S2HMC) method based on a formulation of the leapfrog/Verlet integrator that corresponds to a separable shadow Hamiltonian, which allows efficient generation of momenta. S2HMC gives the acceptance rate of a fourth order integrator at the cost of a second-order integrator. Through numerical experiments we show that S2HMC consistently gives a speedup greater than two over HMC for systems with more than 4000 atoms for the same variance. By comparison, SHMC gave a maximum speedup of only 1.6 over HMC. S2HMC has the additional advantage of not requiring any user parameters beyond those of HMC. S2HMC is available in the program PROTOMOL 2.1. A Python version, adequate for didactic purposes, is also in MDL (http://mdlab.sourceforge.net/s2hmc).
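
    For orientation, the sketch below shows the basic HMC move that both SHMC and S2HMC refine: a leapfrog trajectory used as a Metropolis proposal on the joint position-momentum space. It is a generic Python illustration under our own assumptions, not the PROTOMOL implementation, and the target density is a hypothetical stand-in for a molecular system.

    import numpy as np

    def hmc(log_prob, grad_log_prob, x0, n_samples, eps=0.1, n_leapfrog=20, seed=0):
        """Hybrid/Hamiltonian Monte Carlo with a leapfrog integrator."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        samples = np.empty((n_samples, x.size))
        for i in range(n_samples):
            p = rng.standard_normal(x.size)                # fresh momenta each trajectory
            x_new, p_new = x.copy(), p.copy()
            p_new += 0.5 * eps * grad_log_prob(x_new)      # initial half kick
            for _ in range(n_leapfrog):
                x_new += eps * p_new                       # drift
                p_new += eps * grad_log_prob(x_new)        # kick
            p_new -= 0.5 * eps * grad_log_prob(x_new)      # trim the extra half kick
            dH = (log_prob(x_new) - 0.5 * p_new @ p_new) - (log_prob(x) - 0.5 * p @ p)
            if np.log(rng.random()) < dH:                  # Metropolis accept/reject
                x = x_new
            samples[i] = x
        return samples

    # Hypothetical target: a standard normal in 5 dimensions.
    s = hmc(lambda x: -0.5 * x @ x, lambda x: -x, np.zeros(5), 5_000)
    print(s.mean(axis=0), s.var(axis=0))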

  2. A Monte Carlo Method for Multi-Objective Correlated Geometric Optimization

    Science.gov (United States)

    2014-05-01

    Performs a Monte Carlo optimization to provide geospatial intelligence on entity placement using the OpenCL framework. The solutions for optimal ... given threat and target positions, and a Monte Carlo method development in the OpenCL programming model for vendor-agnostic architecture support and ... Keywords: geometric optimization, Monte Carlo method, parallel computing, OpenCL.

  3. Radiation shielding design for neutron diffractometers assisted by Monte Carlo methods

    Science.gov (United States)

    Osborn, John C.; Ersez, Tunay; Braoudakis, George

    2006-11-01

    Monte Carlo simulations may be used to model radiation shielding for neutron diffractometers. The use of the MCNP computer program to assess shielding for a diffractometer is discussed. A comparison is made of shielding requirements for radiation generated by several materials commonly used in neutron optical elements and beam stops, including lithium-6 based absorbers where the Monte Carlo method can model the effects of fast neutrons generated by this material.

  4. Monte Carlo approach to turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Dueben, P.; Homeier, D.; Muenster, G. [Muenster Univ. (Germany). Inst. fuer Theoretische Physik; Jansen, K. [DESY, Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Mesterhazy, D. [Humboldt Univ., Berlin (Germany). Inst. fuer Physik

    2009-11-15

    The behavior of the one-dimensional random-force-driven Burgers equation is investigated in the path integral formalism on a discrete space-time lattice. We show that by means of Monte Carlo methods one may evaluate observables, such as structure functions, as ensemble averages over different field realizations. The regularization of shock solutions to the zero-viscosity limit (Hopf-equation) eventually leads to constraints on lattice parameters required for the stability of the simulations. Insight into the formation of localized structures (shocks) and their dynamics is obtained. (orig.)

  5. Approaching Chemical Accuracy with Quantum Monte Carlo

    OpenAIRE

    Petruzielo, Frank R.; Toulouse, Julien; Umrigar, C. J.

    2012-01-01

    A quantum Monte Carlo study of the atomization energies for the G2 set of molecules is presented. Basis size dependence of diffusion Monte Carlo atomization energies is studied with a single determinant Slater-Jastrow trial wavefunction formed from Hartree-Fock orbitals. With the largest basis set, the mean absolute deviation from experimental atomization energies for the G2 set is 3.0 kcal/mol. Optimizing the orbitals within variational Monte Carlo improves the agreem...

  6. Mean field simulation for Monte Carlo integration

    CERN Document Server

    Del Moral, Pierre

    2013-01-01

    In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; and adaptive and interacting Markov chain Monte Carlo methodologies.

  7. Optical Monte Carlo modeling of a true portwine stain anatomy

    Science.gov (United States)

    Barton, Jennifer K.; Pfefer, T. Joshua; Welch, Ashley J.; Smithies, Derek J.; Nelson, Jerry; van Gemert, Martin J.

    1998-04-01

    A unique Monte Carlo program capable of accommodating an arbitrarily complex geometry was used to determine the energy deposition in a true port wine stain anatomy. Serial histologic sections taken from a biopsy of a dark red, laser therapy resistant stain were digitized and used to create the program input for simulation at wavelengths of 532 and 585 nm. At both wavelengths, the greatest energy deposition occurred in the superficial blood vessels, and subsequently decreased with depth as the laser beam was attenuated. However, more energy was deposited in the epidermis and superficial blood vessels at 532 nm than at 585 nm.

  8. Monte Carlo Treatment Planning for Advanced Radiotherapy

    DEFF Research Database (Denmark)

    Cronholm, Rickard

    and validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii) and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc) to a Monte Carlo input file (iii). A protocol...... previous algorithms since it uses delineations of structures in order to include and/or exclude certain media in various anatomical regions. This method has the potential to reduce anatomically irrelevant media assignment. In house MATLAB scripts translating the treatment plan parameters to Monte Carlo...

  9. 1-D EQUILIBRIUM DISCRETE DIFFUSION MONTE CARLO

    Energy Technology Data Exchange (ETDEWEB)

    T. EVANS; ET AL

    2000-08-01

    We present a new hybrid Monte Carlo method for 1-D equilibrium diffusion problems in which the radiation field coexists with matter in local thermodynamic equilibrium. This method, the Equilibrium Discrete Diffusion Monte Carlo (EqDDMC) method, combines Monte Carlo particles with spatially discrete diffusion solutions. We verify the EqDDMC method with computational results from three slab problems. The EqDDMC method represents an incremental step toward applying this hybrid methodology to non-equilibrium diffusion, where it could be simultaneously coupled to Monte Carlo transport.

  10. Error in Monte Carlo, quasi-error in Quasi-Monte Carlo

    OpenAIRE

    Kleiss, R. H. P.; Lazopoulos, A.

    2006-01-01

    While the Quasi-Monte Carlo method of numerical integration achieves smaller integration error than standard Monte Carlo, its use in particle physics phenomenology has been hindered by the absence of a reliable way to estimate that error. The standard Monte Carlo error estimator relies on the assumption that the points are generated independently of each other and, therefore, fails to account for the error improvement advertised by the Quasi-Monte Carlo method. We advocate the construction of an estimator of stochastic nature, based on the ensemble of pointsets with a particular discrepancy value.

  11. Uncertainties in s-process nucleosynthesis in massive stars determined by Monte Carlo variations

    Science.gov (United States)

    Nishimura (西村信哉), N.; Hirschi, R.; Rauscher, T.; Murphy, A. St. J.; Cescutti, G.

    2017-08-01

    The s-process in massive stars produces the weak component of the s-process (nuclei up to A ∼ 90), in amounts that match solar abundances. For heavier isotopes, such as barium, production through neutron capture is significantly enhanced in very metal-poor stars with fast rotation. However, detailed theoretical predictions for the resulting final s-process abundances have important uncertainties caused both by the underlying uncertainties in the nuclear physics (principally neutron-capture reaction and β-decay rates) as well as by the stellar evolution modelling. In this work, we investigated the impact of nuclear-physics uncertainties relevant to the s-process in massive stars. Using a Monte Carlo based approach, we performed extensive nuclear reaction network calculations that include newly evaluated upper and lower limits for the individual temperature-dependent reaction rates. We found that most of the uncertainty in the final abundances is caused by uncertainties in the neutron-capture rates, while β-decay rate uncertainties affect only a few nuclei near s-process branchings. The s-process in rotating metal-poor stars shows quantitatively different uncertainties and key reactions, although the qualitative characteristics are similar. We confirmed that our results do not significantly change at different metallicities for fast rotating massive stars in the very low metallicity regime. We highlight which of the identified key reactions are realistic candidates for improved measurement by future experiments.

  12. Calculation of photon pulse height distribution using deterministic and Monte Carlo methods

    Science.gov (United States)

    Akhavan, Azadeh; Vosoughi, Naser

    2015-12-01

    Radiation transport techniques used in radiation detection systems fall into one of two categories, namely probabilistic and deterministic. While probabilistic methods are typically used for pulse height distribution simulation, recreating the behavior of each individual particle, the deterministic approach, which approximates the macroscopic behavior of particles by solving the Boltzmann transport equation, is being developed because of its potential advantages in computational efficiency for complex radiation detection problems. In the current work, the linear transport equation is solved using two methods: the collided components of the scalar flux algorithm, applied by iterating on the scattering source, and the ANISN deterministic computer code. The approach is presented in one dimension with anisotropic scattering orders up to P8 and angular quadrature orders up to S16. The multi-group gamma cross-section library required for this numerical transport simulation is also generated in an appropriate discrete form. Finally, photon pulse height distributions are indirectly calculated by deterministic methods and compare favorably with those from Monte Carlo based codes, namely MCNPX and FLUKA.

  13. Ultrasound modulated light blood flow measurement using intensity autocorrelation function: a Monte-Carlo simulation

    Science.gov (United States)

    Tsalach, A.; Metzger, Y.; Breskin, I.; Zeitak, R.; Shechter, R.

    2014-03-01

    Development of techniques for continuous measurement of regional blood flow, and in particular cerebral blood flow (CBF), is essential for monitoring critical care patients. Recently, a novel technique based on ultrasound modulation of light was developed for non-invasive, continuous CBF monitoring (termed ultrasound-tagged light (UTL or UT-NIRS)), and shown to correlate with readings of 133Xe SPECT [1] and laser Doppler [2]. Coherent light is introduced into the tissue concurrently with an ultrasound (US) field. Displacement of scattering centers within the sampled volume, induced by Brownian motion, blood flow and the US field, affects the photons' temporal correlation. Hence, the temporal fluctuations of the obtained speckle pattern provide dynamic information about the blood flow. We developed a comprehensive simulation combining the effects of Brownian motion, US and flow on the obtained speckle pattern. Photon trajectories within the tissue are generated using a Monte-Carlo based model. Then, the temporal changes in the optical path due to displacement of scattering centers are determined, and the corresponding interference pattern over time is derived. Finally, the light intensity autocorrelation function of a single speckle is calculated, from which the tissue decorrelation time is determined. The simulation results are compared with in-vitro experiments using a digital correlator, demonstrating decorrelation time prediction within the 95% confidence interval. This model may assist in the development of optical methods for blood flow measurement and, particularly, methods using the acousto-optic effect.

  14. Development of Monte Carlo Methods for Investigating Migration of Radionuclides in Contaminated Environments

    Energy Technology Data Exchange (ETDEWEB)

    Avrorin, E. N.; Tsvetokhin, A. G.; Xenofontov, A. I.; Kourbatova, E. I.; Regens, J. L.

    2002-02-26

    This paper presents the results of an ongoing research and development project conducted by Russian institutions in Moscow and Snezhinsk, supported by the International Science and Technology Center (ISTC), in collaboration with the University of Oklahoma. The joint study focuses on developing and applying analytical tools to effectively characterize contaminant transport and assess risks associated with migration of radionuclides and heavy metals in the water column and sediments of large reservoirs or lakes. The analysis focuses on the development and evaluation of theoretical-computational models that describe the distribution of radioactive wastewater within a reservoir and characterize the associated radiation field, as well as estimate doses received from radiation exposure. In particular, Monte Carlo-based computational methods are developed and evaluated to increase the precision of results and to reduce computing time for estimating the characteristics of the radiation field emitted from the contaminated wastewater layer. The calculated migration of radionuclides is used to estimate distributions of radiation doses that could be received by an exposed population, based on exposure to radionuclides from specified volumes of discrete aqueous sources. The calculated dose distributions can be used to support near-term and long-term decisions about priorities for environmental remediation and stewardship.

  15. Monte Carlo simulation of the kinetic effects on GaAs/GaAs(001) MBE growth

    Science.gov (United States)

    Ageev, Oleg A.; Solodovnik, Maxim S.; Balakirev, Sergey V.; Mikhaylin, Ilya A.; Eremenko, Mikhail M.

    2017-01-01

    The molecular beam epitaxial growth of GaAs on the GaAs(001)-(2×4) surface is investigated using a kinetic Monte Carlo-based method. The developed algorithm makes it possible to focus on the kinetic effects in a wide range of growth conditions and enables considerable computational speedup. The simulation results show that the growth rate has a dramatic influence upon both the island morphology and the Ga surface diffusion length. The average island size reduces with increasing growth rate, while the island density increases with increasing growth rate as well as with the As4/Ga beam equivalent pressure ratio. As the growth rate increases, the island density becomes less dependent upon the As4/Ga pressure ratio and approaches a saturation value. We also discuss three characteristics of Ga surface diffusion, namely the diffusion length of a Ga adatom deposited first, the average diffusion length, and the island spacing as an average distance between islands. The calculations show that the As4/Ga pressure ratio dependences of these characteristics obey the same law, but with different coefficients. An increase of the As4/Ga pressure ratio leads to a decrease in both the diffusion length and the island spacing. However, its influence becomes stronger with increasing growth rate for the first Ga adatom diffusion length and weaker for the average diffusion length and for the island spacing.

  16. Monte Carlo simulation of a compact microbeam radiotherapy system based on carbon nanotube field emission technology.

    Science.gov (United States)

    Schreiber, Eric C; Chang, Sha X

    2012-08-01

    Microbeam radiation therapy (MRT) is an experimental radiotherapy technique that has shown potent antitumor effects with minimal damage to normal tissue in animal studies. This unique form of radiation is currently produced only in a few large synchrotron accelerator research facilities in the world. To promote widespread translational research on this promising treatment technology we have proposed, and are in the initial development stages of, a compact MRT system based on carbon nanotube field emission x-ray technology. We report on a Monte Carlo based feasibility study of the compact MRT system design. Monte Carlo calculations were performed using EGSnrc-based codes. The proposed small animal research MRT device design includes carbon nanotube cathodes shaped to match the corresponding MRT collimator apertures, a common reflection anode with filter, and an MRT collimator. Each collimator aperture is sized to deliver a beam width ranging from 30 to 200 μm at 18.6 cm source-to-axis distance. Design parameters studied with Monte Carlo include electron energy, cathode design, anode angle, filtration, and collimator design. Calculations were performed for single and multibeam configurations. Increasing the energy from 100 kVp to 160 kVp increased the photon fluence through the collimator by a factor of 1.7. Both energies produced a largely uniform fluence along the long dimension of the microbeam, with 5% decreases in intensity near the edges. The isocentric dose rate for 160 kVp was calculated to be 700 Gy∕min∕A in the center of a 3 cm diameter target. Scatter contributions resulting from collimator size were found to produce only small (<7%) changes in the dose rate for field widths greater than 50 μm. Dose versus depth was weakly dependent on filtration material. The peak-to-valley ratio varied from 10 to 100 as the separation between adjacent microbeams varied from 150 to 1000 μm. Monte Carlo simulations demonstrate that the proposed compact MRT system

  17. Modelling of scintillator based flat-panel detectors with Monte-Carlo simulations

    Science.gov (United States)

    Reims, N.; Sukowski, F.; Uhlmann, N.

    2011-01-01

    Scintillator based flat panel detectors are state of the art in the field of industrial X-ray imaging applications. Choosing the proper system and setup parameters for the vast range of different applications can be a time consuming task, especially when developing new detector systems. Since the system behaviour cannot always be foreseen easily, Monte-Carlo (MC) simulations are key to gaining further knowledge of system components and their behaviour under different imaging conditions. In this work we used two Monte-Carlo based models to examine an indirect converting flat panel detector, specifically the Hamamatsu C9312SK. We focused on the signal generation in the scintillation layer and its influence on the spatial resolution of the whole system. The models differ significantly in their level of complexity. The first model gives a global description of the detector based on different parameters characterizing the spatial resolution. With relatively small effort, a simulation model can be developed that matches the real detector with respect to signal transfer. The second model allows a more detailed insight into the system. It is based on the well established cascade theory, i.e. describing the detector as a cascade of elementary gain and scattering stages, which represent the built-in components and their signal transfer behaviour. In comparison to the first model, the influence of single components, especially the important light spread behaviour in the scintillator, can be analysed in a more differentiated way. Although the implementation of the second model is more time consuming, both models have in common that a relatively small number of system manufacturer parameters are needed. The results of both models were in good agreement with the measured parameters of the real system.

  18. A Monte Carlo template-based analysis for very high definition imaging atmospheric Cherenkov telescopes as applied to the VERITAS telescope array

    CERN Document Server


    2015-01-01

    We present a sophisticated likelihood reconstruction algorithm for shower-image analysis of imaging Cherenkov telescopes. The reconstruction algorithm is based on the comparison of the camera pixel amplitudes with the predictions from a Monte Carlo based model. Shower parameters are determined by maximising a likelihood function over the shower fit parameters, using a numerical non-linear optimisation technique. A related reconstruction technique has already been developed by the CAT and the H.E.S.S. experiments, and provides a more precise direction and energy reconstruction of the photon-induced shower than analyses based on the second moments of the camera image. Examples are shown of the performance of the analysis on simulated gamma-ray data from the VERITAS array.

  19. A Monte Carlo simulation method for the assessment of undiscovered, conventional oil and gas: Chapter 26 in Petroleum systems and geologic assessment of oil and gas in the San Joaquin Basin Province, California

    Science.gov (United States)

    Charpentier, Ronald R.; Klett, T.R.

    2007-01-01

    The U.S. Geological Survey has developed two Monte Carlo programs for assessment of undiscovered conventional oil and gas resources. EMCEE (for Energy Monte Carlo) and Emc2 (for Energy Monte Carlo program 2) are programs that calculate probabilistic estimates of undiscovered resources based on input distributions for numbers and sizes of undiscovered fields. Emc2 uses specific types of distributions for the input, whereas EMCEE allows greater flexibility of the input distribution types.
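
    A minimal sketch of the general aggregation idea behind such programs (not the EMCEE or Emc2 code itself): draw the number of undiscovered fields and their sizes from assumed input distributions, sum them in each trial, and report fractiles. All distributions and units below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 20_000
totals = np.empty(n_trials)

for i in range(n_trials):
    # Number of undiscovered fields: illustrative triangular distribution.
    n_fields = int(round(rng.triangular(left=5, mode=20, right=60)))
    # Field sizes (million barrels of oil): illustrative lognormal distribution.
    sizes = rng.lognormal(mean=2.0, sigma=1.0, size=n_fields)
    totals[i] = sizes.sum()

# Customary fractiles of the aggregated undiscovered-resource distribution.
f95, f50, f5 = np.percentile(totals, [5, 50, 95])
print(f"F95 = {f95:.0f}, F50 = {f50:.0f}, F5 = {f5:.0f} MMBO (illustrative)")
```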

  20. Langevin Monte Carlo filtering for target tracking

    NARCIS (Netherlands)

    Iglesias Garcia, Fernando; Bocquel, Melanie; Driessen, Hans

    2015-01-01

    This paper introduces the Langevin Monte Carlo Filter (LMCF), a particle filter with a Markov chain Monte Carlo algorithm which draws proposals by simulating Hamiltonian dynamics. This approach is well suited to non-linear filtering problems in high dimensional state spaces where the bootstrap filte
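
    For contrast with the Langevin/Hamiltonian proposal scheme described above, the following is a minimal generic bootstrap particle filter in Python (not the LMCF algorithm): propagate particles through assumed dynamics, weight by an assumed Gaussian likelihood, and resample. The toy model and noise levels are illustrative.

```python
import numpy as np

def bootstrap_pf(y, f, h, q_std, r_std, n_particles=500, rng=None):
    """Generic bootstrap particle filter: propagate, weight, estimate, resample."""
    rng = rng or np.random.default_rng()
    x = rng.standard_normal(n_particles)                      # initial particle cloud
    estimates = []
    for yk in y:
        x = f(x) + q_std * rng.standard_normal(n_particles)   # proposal = prior dynamics
        w = np.exp(-0.5 * ((yk - h(x)) / r_std) ** 2)          # Gaussian likelihood weights
        w /= w.sum()
        estimates.append(np.dot(w, x))                         # posterior-mean estimate
        x = x[rng.choice(n_particles, n_particles, p=w)]       # multinomial resampling
    return np.array(estimates)

# Toy 1-D tracking problem: x_k = 0.9 x_{k-1} + noise, observed with noise.
rng = np.random.default_rng(0)
truth, obs = [0.0], []
for _ in range(50):
    truth.append(0.9 * truth[-1] + 0.3 * rng.standard_normal())
    obs.append(truth[-1] + 0.5 * rng.standard_normal())
est = bootstrap_pf(np.array(obs), lambda x: 0.9 * x, lambda x: x, 0.3, 0.5, rng=rng)
```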

  1. An introduction to Monte Carlo methods

    NARCIS (Netherlands)

    Walter, J. -C.; Barkema, G. T.

    2015-01-01

    Monte Carlo simulations are methods for simulating statistical systems. The aim is to generate a representative ensemble of configurations to access thermodynamical quantities without the need to solve the system analytically or to perform an exact enumeration. The main principles of Monte Carlo sim

  2. An introduction to Monte Carlo methods

    NARCIS (Netherlands)

    Walter, J. -C.; Barkema, G. T.

    2015-01-01

    Monte Carlo simulations are methods for simulating statistical systems. The aim is to generate a representative ensemble of configurations to access thermodynamical quantities without the need to solve the system analytically or to perform an exact enumeration. The main principles of Monte Carlo sim

  3. Challenges of Monte Carlo Transport

    Energy Technology Data Exchange (ETDEWEB)

    Long, Alex Roberts [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-10

    These are slides from a presentation for the Parallel Summer School at Los Alamos National Laboratory. Solving discretized partial differential equations (PDEs) of interest can require a large number of computations. We can identify concurrency to allow parallel solution of discrete PDEs. Simulated particle histories can be used to solve the Boltzmann transport equation. Particle histories are independent in neutral particle transport, making them amenable to parallel computation. Physical parameters and method type determine the data dependencies of particle histories. Data requirements shape parallel algorithms for Monte Carlo. Then, Parallel Computational Physics and Parallel Monte Carlo are discussed and, finally, the results are given. The mesh passing method greatly simplifies the IMC implementation and allows simple load-balancing. Using MPI windows and passive, one-sided RMA further simplifies the implementation by removing target synchronization. The author is very interested in implementations of PGAS that may allow further optimization for one-sided, read-only memory access (e.g. OpenSHMEM). The MPICH_RMA_OVER_DMAPP option and library is required to make one-sided messaging scale on Trinitite; Moonlight scales poorly. Interconnect-specific libraries or functions are likely necessary to ensure performance. BRANSON has been used to directly compare the current standard method to a proposed method on idealized problems. The mesh passing algorithm performs well on problems that are designed to show the scalability of the particle passing method. BRANSON can now run load-imbalanced, dynamic problems. Potential avenues of improvement in the mesh passing algorithm will be implemented and explored. A suite of test problems that stress DD methods will elucidate a possible path forward for production codes.

  4. The MC21 Monte Carlo Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Sutton TM, Donovan TJ, Trumbull TH, Dobreff PS, Caro E, Griesheimer DP, Tyburski LJ, Carpenter DC, Joo H

    2007-01-09

    MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or "tool of last resort" and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities.

  5. Deterministic sensitivity analysis for first-order Monte Carlo simulations: a technical note.

    Science.gov (United States)

    Geisler, Benjamin P; Siebert, Uwe; Gazelle, G Scott; Cohen, David J; Göhler, Alexander

    2009-01-01

    Monte Carlo microsimulations have gained increasing popularity in decision-analytic modeling because they can incorporate discrete events. Although deterministic sensitivity analyses are essential for interpretation of results, it remains difficult to combine these alongside Monte Carlo simulations in standard modeling packages without enormous time investment. Our purpose was to facilitate one-way deterministic sensitivity analysis of TreeAge Markov state-transition models requiring first-order Monte Carlo simulations. Using TreeAge Pro Suite 2007 and Microsoft Visual Basic for EXCEL, we constructed a generic script that enables one to perform automated deterministic one-way sensitivity analyses in EXCEL employing microsimulation models. In addition, we constructed a generic EXCEL-worksheet that allows for use of the script with little programming knowledge. Linking TreeAge Pro Suite 2007 and Visual Basic enables the performance of deterministic sensitivity analyses of first-order Monte Carlo simulations. There are other potentially interesting applications for automated analysis.

  6. Monte Carlo Simulation as a Research Management Tool

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, L. J.

    1986-06-01

    Monte Carlo simulation provides a research manager with a performance monitoring tool to supplement the standard schedule- and resource-based tools such as the Program Evaluation and Review Technique (PERT) and Critical Path Method (CPM). The value of the Monte Carlo simulation in a research environment is that it 1) provides a method for ranking competing processes, 2) couples technical improvements to the process economics, and 3) provides a mechanism to determine the value of research dollars. In this paper the Monte Carlo simulation approach is developed and applied to the evaluation of three competing processes for converting lignocellulosic biomass to ethanol. The technique is shown to be useful for ranking the processes and illustrating the importance of the timeframe of the analysis on the decision process. The results show that acid hydrolysis processes have higher potential for near-term application (2-5 years), while the enzymatic hydrolysis approach has an equal chance to be competitive in the long term (beyond 10 years).
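
    A minimal sketch of the ranking idea described above, not the study's actual model: sample uncertain production costs for competing processes and estimate the probability that each is cheapest. The process names, distributions, and numbers below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Hypothetical production costs ($/gal) with uncertain parameters.
cost = {
    "acid hydrolysis A":    rng.normal(1.20, 0.15, n),
    "acid hydrolysis B":    rng.normal(1.30, 0.20, n),
    "enzymatic hydrolysis": rng.normal(1.25, 0.35, n),
}

names = list(cost)
stacked = np.vstack([cost[k] for k in names])
winners = stacked.argmin(axis=0)              # cheapest process in each trial

for i, name in enumerate(names):
    print(f"{name:22s} P(lowest cost) = {(winners == i).mean():.2f}")
```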

  7. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides; Simulacion Monte Carlo: herramienta para la calibracion en determinaciones analiticas de radionucleidos

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez, E-mail: cphr@cphr.edu.cu [Centro de Proteccion e Higiene de las Radiaciones (CPHR), La Habana (Cuba)

    2013-07-01

    This work shows how the traceability of the analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density, and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Agency for Atomic Energy (IAEA) are presented. In these exercises (an intercomparison and a proficiency test) all reported analytical results were obtained based on calibrations in efficiency by Monte Carlo simulation using the DETEFF program.

  8. Monte Carlo approaches to light nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of {sup 16}O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.

  9. Monte carlo simulation for soot dynamics

    KAUST Repository

    Zhou, Kun

    2012-01-01

    A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.

  10. Lattice gauge theories and Monte Carlo simulations

    CERN Document Server

    Rebbi, Claudio

    1983-01-01

    This volume is the most up-to-date review on Lattice Gauge Theories and Monte Carlo Simulations. It consists of two parts. Part one is an introductory lecture on the lattice gauge theories in general, Monte Carlo techniques and on the results to date. Part two consists of important original papers in this field. These selected reprints involve the following: Lattice Gauge Theories, General Formalism and Expansion Techniques, Monte Carlo Simulations. Phase Structures, Observables in Pure Gauge Theories, Systems with Bosonic Matter Fields, Simulation of Systems with Fermions.

  11. Quantum Monte Carlo for minimum energy structures

    CERN Document Server

    Wagner, Lucas K

    2010-01-01

    We present an efficient method to find minimum energy structures using energy estimates from accurate quantum Monte Carlo calculations. This method involves a stochastic process formed from the stochastic energy estimates from Monte Carlo that can be averaged to find precise structural minima while using inexpensive calculations with moderate statistical uncertainty. We demonstrate the applicability of the algorithm by minimizing the energy of the H2O-OH- complex and showing that the structural minima from quantum Monte Carlo calculations affect the qualitative behavior of the potential energy surface substantially.

  12. Proton therapy Monte Carlo SRNA-VOX code

    Directory of Open Access Journals (Sweden)

    Ilić Radovan D.

    2012-01-01

    The most powerful feature of the Monte Carlo method is the possibility of simulating all individual particle interactions in three dimensions and performing numerical experiments with a preset error. These facts were the motivation behind the development of a general-purpose Monte Carlo SRNA program for proton transport simulation in technical systems described by standard geometrical forms (plane, sphere, cone, cylinder, cube). Some of the possible applications of the SRNA program are: (a) a general code for proton transport modeling, (b) design of accelerator-driven systems, (c) simulation of proton scattering and degrading shapes and composition, (d) research on proton detectors; and (e) radiation protection at accelerator installations. This wide range of possible applications of the program demands the development of various versions of SRNA-VOX codes for proton transport modeling in voxelized geometries and has, finally, resulted in the ISTAR package for the calculation of deposited energy distribution in patients on the basis of CT data in radiotherapy. All of the said codes are capable of using 3-D proton sources with an arbitrary energy spectrum in an interval of 100 keV to 250 MeV.

  13. Monte Carlo Frameworks Building Customisable High-performance C++ Applications

    CERN Document Server

    Duffy, Daniel J

    2011-01-01

    This is one of the first books that describe all the steps that are needed in order to analyze, design and implement Monte Carlo applications. It discusses the financial theory as well as the mathematical and numerical background that is needed to write flexible and efficient C++ code using state-of-the-art design and system patterns, object-oriented and generic programming models in combination with standard libraries and tools. Includes a CD containing the source code for all examples. It is strongly advised that you experiment with the code by compiling it and extending it to suit your needs.

  14. 11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing

    CERN Document Server

    Nuyens, Dirk

    2016-01-01

    This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.

  15. PyMercury: Interactive Python for the Mercury Monte Carlo Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Iandola, F N; O' Brien, M J; Procassini, R J

    2010-11-29

    Monte Carlo particle transport applications are often written in low-level languages (C/C++) for optimal performance on clusters and supercomputers. However, this development approach often sacrifices straightforward usability and testing in the interest of fast application performance. To improve usability, some high-performance computing applications employ mixed-language programming with high-level and low-level languages. In this study, we consider the benefits of incorporating an interactive Python interface into a Monte Carlo application. With PyMercury, a new Python extension to the Mercury general-purpose Monte Carlo particle transport code, we improve application usability without diminishing performance. In two case studies, we illustrate how PyMercury improves usability and simplifies testing and validation in a Monte Carlo application. In short, PyMercury demonstrates the value of interactive Python for Monte Carlo particle transport applications. In the future, we expect interactive Python to play an increasingly significant role in Monte Carlo usage and testing.

  16. Film of the year - animated film "Mont Blanc" / Verni Leivak

    Index Scriptorium Estoniae

    Leivak, Verni, 1966-

    2002-01-01

    The Estonian Film Journalists' Association awarded the title of best film of 2001 to Priit Tender's animated film "Mont Blanc" (Eesti Joonisfilm, 2001). The article also covers film critics' preferences among the films shown in cinemas and on television in 2001.

  17. Simulation and the Monte Carlo method

    CERN Document Server

    Rubinstein, Reuven Y

    2016-01-01

    Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...

  18. To Monte Carlo despite the crashes / Aare Arula

    Index Scriptorium Estoniae

    Arula, Aare

    2007-01-01

    See also Tehnika dlja Vsehh no. 3, pp. 26-27. Karl Siitan and his crew, who set off from Tallinn for the Monte Carlo rally on 26 January 1937, faced adventures that nearly cost them their lives.

  19. To Monte Carlo despite the crashes / Aare Arula

    Index Scriptorium Estoniae

    Arula, Aare

    2007-01-01

    See also Tehnika dlja Vsehh no. 3, pp. 26-27. Karl Siitan and his crew, who set off from Tallinn for the Monte Carlo rally on 26 January 1937, faced adventures that nearly cost them their lives.

  20. Monte Carlo simulations for plasma physics

    Energy Technology Data Exchange (ETDEWEB)

    Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X. [National Inst. for Fusion Science, Toki, Gifu (Japan)

    2000-07-01

    Plasma behaviours are very complicated and the analyses are generally difficult. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral beam injection heating (NBI heating), electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique, and good agreement can be obtained with experimental results. Recently, the Monte Carlo method has been developed to study fast-particle transport associated with heating and the generation of the radial electric field. Further, it is applied to investigating neoclassical transport in plasmas with steep gradients of density and temperature, which is beyond conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)

  1. Predator trapping on Monte Vista NWR

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This letter summarizes the status of predator trapping on Monte Vista National Wildlife Refuge in light of the referendum passed in the State of Colorado banning...

  2. Quantum Monte Carlo Calculations of Light Nuclei

    CERN Document Server

    Pieper, Steven C

    2007-01-01

    During the last 15 years, there has been much progress in defining the nuclear Hamiltonian and applying quantum Monte Carlo methods to the calculation of light nuclei. I describe both aspects of this work and some recent results.

  3. Monte Carlo methods for particle transport

    CERN Document Server

    Haghighat, Alireza

    2015-01-01

    The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...

  4. Smart detectors for Monte Carlo radiative transfer

    CERN Document Server

    Baes, Maarten

    2008-01-01

    Many optimization techniques have been invented to reduce the noise that is inherent in Monte Carlo radiative transfer simulations. As the typical detectors used in Monte Carlo simulations do not take into account all the information contained in the impacting photon packages, there is still room to optimize this detection process and the corresponding estimate of the surface brightness distributions. We want to investigate how all the information contained in the distribution of impacting photon packages can be optimally used to decrease the noise in the surface brightness distributions and hence to increase the efficiency of Monte Carlo radiative transfer simulations. We demonstrate that the estimate of the surface brightness distribution in a Monte Carlo radiative transfer simulation is similar to the estimate of the density distribution in an SPH simulation. Based on this similarity, a recipe is constructed for smart detectors that take full advantage of the exact location of the impact of the photon pack...

  5. Quantum Monte Carlo approaches for correlated systems

    CERN Document Server

    Becca, Federico

    2017-01-01

    Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It provides a clear overview of variational wave functions and features a detailed presentation of stochastic sampling, including Markov chains and Langevin dynamics, which is developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...

  6. Pheasant hunting on the Monte Vista NWR

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This letter to the Alamosa/Monte Vista NWR Refuge Manager discusses the need to alter management of pheasants in the area to halt the continued decline in population...

  7. Film of the year - animated film "Mont Blanc" / Verni Leivak

    Index Scriptorium Estoniae

    Leivak, Verni, 1966-

    2002-01-01

    The Estonian Film Journalists' Association awarded the title of best film of 2001 to Priit Tender's animated film "Mont Blanc" (Eesti Joonisfilm, 2001). The article also covers film critics' preferences among the films shown in cinemas and on television in 2001.

  8. LCG Monte-Carlo Data Base

    CERN Document Server

    Bartalini, P.; Kryukov, A.; Selyuzhenkov, Ilya V.; Sherstnev, A.; Vologdin, A.

    2004-01-01

    We present the Monte-Carlo events Data Base (MCDB) project and its development plans. MCDB facilitates communication between authors of Monte-Carlo generators and experimental users. It also provides convenient book-keeping and easy access to generator-level samples. The first release of MCDB is now operational for the CMS collaboration. In this paper we review the main ideas behind MCDB and discuss future plans to develop this Data Base further within the CERN LCG framework.

  9. Monte Carlo Algorithms for Linear Problems

    OpenAIRE

    DIMOV, Ivan

    2000-01-01

    MSC Subject Classification: 65C05, 65U05. Monte Carlo methods are a powerful tool in many fields of mathematics, physics and engineering. It is known that these methods give statistical estimates for the functional of the solution by performing random sampling of a certain chance variable whose mathematical expectation is the desired functional. Monte Carlo methods are methods for solving problems using random variables. In the book [16] edited by Yu. A. Shreider one can find the followin...

  10. The Feynman Path Goes Monte Carlo

    OpenAIRE

    Sauer, Tilman

    2001-01-01

    Path integral Monte Carlo (PIMC) simulations have become an important tool for the investigation of the statistical mechanics of quantum systems. I discuss some of the history of applying the Monte Carlo method to non-relativistic quantum systems in path-integral representation. The feasibility of the method was, in principle, well established by the early eighties, and a number of algorithmic improvements have been introduced in the last two decades.

  11. Monte Carlo Hamiltonian:Inverse Potential

    Institute of Scientific and Technical Information of China (English)

    LUO Xiang-Qian; CHENG Xiao-Ni; Helmut KRÖGER

    2004-01-01

    The Monte Carlo Hamiltonian method developed recently allows one to investigate the ground state and low-lying excited states of a quantum system, using a Monte Carlo (MC) algorithm with importance sampling. However, the conventional MC algorithm has some difficulties when applied to inverse potentials. We propose to use an effective potential and an extrapolation method to solve the problem. We present examples from the hydrogen system.

  12. Self-consistent kinetic lattice Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Horsfield, A.; Dunham, S.; Fujitani, Hideaki

    1999-07-01

    The authors present a brief description of a formalism for modeling point defect diffusion in crystalline systems using a Monte Carlo technique. The main approximations required to construct a practical scheme are briefly discussed, with special emphasis on the proper treatment of charged dopants and defects. This is followed by tight binding calculations of the diffusion barrier heights for charged vacancies. Finally, an application of the kinetic lattice Monte Carlo method to vacancy diffusion is presented.
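
    A minimal residence-time (BKL-style) kinetic lattice Monte Carlo sketch for a single vacancy hopping on a square lattice, not the authors' formalism: the attempt frequency, barrier, and temperature are illustrative assumptions, and in a real model the event rates would depend on the local environment and charge state.

```python
import numpy as np

nu0, Ea, kT = 1e13, 0.6, 0.025                  # attempt freq (Hz), barrier (eV), kT (eV); assumed
rate = nu0 * np.exp(-Ea / kT)                   # rate of each of the four hop directions
moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

rng = np.random.default_rng(7)
pos, t = np.zeros(2), 0.0

for _ in range(100_000):
    rates = np.full(4, rate)                    # per-event rates (environment-dependent in general)
    total = rates.sum()
    event = rng.choice(4, p=rates / total)      # pick an event with probability rate/total
    pos = pos + moves[event]
    t += -np.log(rng.random()) / total          # advance clock by an exponential waiting time

r2 = float(np.sum(pos**2))                      # squared displacement of this realization
print(f"time = {t:.3e} s, crude D estimate = {r2 / (4 * t):.3e} lattice units^2/s")
```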

  13. Error in Monte Carlo, quasi-error in Quasi-Monte Carlo

    CERN Document Server

    Kleiss, R H

    2006-01-01

    While the Quasi-Monte Carlo method of numerical integration achieves smaller integration error than standard Monte Carlo, its use in particle physics phenomenology has been hindered by the absence of a reliable way to estimate that error. The standard Monte Carlo error estimator relies on the assumption that the points are generated independently of each other and, therefore, fails to account for the error improvement advertised by the Quasi-Monte Carlo method. We advocate the construction of an estimator of stochastic nature, based on the ensemble of pointsets with a particular discrepancy value. We investigate the consequences of this choice and give some first empirical results on the suggested estimators.
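
    A small numerical illustration of the point being made, assuming SciPy's scrambled Sobol generator as the quasi-random source: plain Monte Carlo comes with the usual sigma/sqrt(n) error estimate, while the quasi-Monte Carlo estimate is typically more accurate but the i.i.d.-based estimator no longer applies. The test integrand is an assumption.

```python
import numpy as np
from scipy.stats import qmc

# Integrate f(x, y) = x*y over the unit square; the exact value is 0.25.
f = lambda u: u[:, 0] * u[:, 1]
n = 2**12
rng = np.random.default_rng(3)

# Plain Monte Carlo: points are independent, so sigma/sqrt(n) is a valid error estimate.
u_mc = rng.random((n, 2))
vals = f(u_mc)
mc_est, mc_err = vals.mean(), vals.std(ddof=1) / np.sqrt(n)

# Quasi-Monte Carlo with a scrambled Sobol sequence: usually a smaller true error,
# but the independence assumption behind the estimator above no longer holds.
u_qmc = qmc.Sobol(d=2, scramble=True, seed=3).random_base2(m=12)
qmc_est = f(u_qmc).mean()

print(f"MC : {mc_est:.5f} +/- {mc_err:.5f} (true error {abs(mc_est - 0.25):.2e})")
print(f"QMC: {qmc_est:.5f}                 (true error {abs(qmc_est - 0.25):.2e})")
```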

  14. kmos: A lattice kinetic Monte Carlo framework

    Science.gov (United States)

    Hoffmann, Max J.; Matera, Sebastian; Reuter, Karsten

    2014-07-01

    Kinetic Monte Carlo (kMC) simulations have emerged as a key tool for microkinetic modeling in heterogeneous catalysis and other materials applications. Systems, where site-specificity of all elementary reactions allows a mapping onto a lattice of discrete active sites, can be addressed within the particularly efficient lattice kMC approach. To this end we describe the versatile kmos software package, which offers a most user-friendly implementation, execution, and evaluation of lattice kMC models of arbitrary complexity in one- to three-dimensional lattice systems, involving multiple active sites in periodic or aperiodic arrangements, as well as site-resolved pairwise and higher-order lateral interactions. Conceptually, kmos achieves a maximum runtime performance which is essentially independent of lattice size by generating code for the efficiency-determining local update of available events that is optimized for a defined kMC model. For this model definition and the control of all runtime and evaluation aspects kmos offers a high-level application programming interface. Usage proceeds interactively, via scripts, or a graphical user interface, which visualizes the model geometry, the lattice occupations and rates of selected elementary reactions, while allowing on-the-fly changes of simulation parameters. We demonstrate the performance and scaling of kmos with the application to kMC models for surface catalytic processes, where for given operation conditions (temperature and partial pressures of all reactants) central simulation outcomes are catalytic activity and selectivities, surface composition, and mechanistic insight into the occurrence of individual elementary processes in the reaction network.

  15. Acceleration of Monte Carlo EM Algorithm

    Institute of Scientific and Technical Information of China (English)

    罗季

    2008-01-01

    The EM algorithm is a data augmentation algorithm that has been widely used in recent years to compute posterior mode estimates. However, obtaining a closed-form expression for the integral in its E-step is sometimes difficult, or even impossible, which limits how broadly it can be applied. The Monte Carlo EM algorithm solves this problem well by evaluating the E-step integral with Monte Carlo simulation, which greatly extends its applicability. Nevertheless, both the EM algorithm and the Monte Carlo EM algorithm converge only linearly, at a rate governed by the fraction of missing information, so when the proportion of missing data is high, convergence becomes very slow. The Newton-Raphson algorithm, by contrast, converges quadratically in the neighbourhood of the posterior mode. This paper proposes an accelerated Monte Carlo EM algorithm that combines the Monte Carlo EM algorithm with the Newton-Raphson algorithm: the E-step is still realized by Monte Carlo simulation, and the resulting algorithm is shown to converge quadratically near the posterior mode. It therefore retains the advantages of the Monte Carlo EM algorithm while improving its convergence rate. Numerical examples compare the accelerated Monte Carlo EM algorithm with the EM and Monte Carlo EM algorithms, further demonstrating its good performance.
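
    A minimal Monte Carlo EM sketch in Python for a toy censored-data problem, illustrating only the Monte Carlo E-step described above (the Newton-Raphson acceleration of the paper is not included); the model, censoring point, and sample sizes are illustrative assumptions.

```python
import numpy as np
from scipy.stats import truncnorm

# Toy problem: estimate the mean mu of N(mu, 1) data when values above a censoring
# point c are only known to exceed c. The E-step expectation of each censored value
# is approximated by Monte Carlo draws from the truncated normal.
rng = np.random.default_rng(0)
mu_true, c, n = 1.0, 1.5, 200
z = rng.normal(mu_true, 1.0, n)
obs, n_cens = z[z <= c], int(np.sum(z > c))       # observed values and number censored

mu = obs.mean()                                   # starting value
for _ in range(50):
    # E-step (Monte Carlo): mean of draws from N(mu, 1) truncated to (c, inf).
    a = (c - mu) / 1.0
    draws = truncnorm.rvs(a, np.inf, loc=mu, scale=1.0,
                          size=(n_cens, 500), random_state=rng)
    e_cens = draws.mean(axis=1)
    # M-step: complete-data maximum likelihood estimate of mu.
    mu = (obs.sum() + e_cens.sum()) / n

print(f"Monte Carlo EM estimate of mu: {mu:.3f} (true value {mu_true})")
```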

  16. Teaching Markov Chain Monte Carlo: Revealing the Basic Ideas behind the Algorithm

    Science.gov (United States)

    Stewart, Wayne; Stewart, Sepideh

    2014-01-01

    For many scientists, researchers and students Markov chain Monte Carlo (MCMC) simulation is an important and necessary tool to perform Bayesian analyses. The simulation is often presented as a mathematical algorithm and then translated into an appropriate computer program. However, this can result in overlooking the fundamental and deeper…
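
    The basic idea the article refers to can be stated in a few lines of Python; the sketch below is a generic random-walk Metropolis sampler for a one-dimensional target (an assumption for illustration), not the authors' teaching materials.

```python
import numpy as np

def metropolis(log_target, x0, n_samples, step=0.5, rng=None):
    """Random-walk Metropolis: propose symmetrically, accept with min(1, ratio)."""
    rng = rng or np.random.default_rng()
    x, logp = float(x0), log_target(x0)
    chain = np.empty(n_samples)
    for i in range(n_samples):
        x_prop = x + step * rng.standard_normal()      # symmetric random-walk proposal
        logp_prop = log_target(x_prop)
        if np.log(rng.random()) < logp_prop - logp:    # Metropolis acceptance rule
            x, logp = x_prop, logp_prop
        chain[i] = x                                   # repeat the current state if rejected
    return chain

# Example: sample a standard normal target, then check the moments after burn-in.
chain = metropolis(lambda x: -0.5 * x**2, x0=3.0, n_samples=20_000)
print(chain[2000:].mean(), chain[2000:].std())         # should be close to 0 and 1
```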

  17. McStas 1.1: A tool for building neutron Monte Carlo simulations

    DEFF Research Database (Denmark)

    Lefmann, K.; Nielsen, K.; Tennant, D.A.

    2000-01-01

    McStas is a project to develop general tools for the creation of simulations of neutron scattering experiments. In this paper, we briefly introduce McStas and describe a particular application of the program: the Monte Carlo calculation of the resolution function of a standard triple-axis neutron scattering instrument. The method compares well with the analytical calculations of Popovici.

  18. McStas 1.1: a tool for building neutron Monte Carlo simulations

    Science.gov (United States)

    Lefmann, K.; Nielsen, K.; Tennant, A.; Lake, B.

    2000-03-01

    McStas is a project to develop general tools for the creation of simulations of neutron scattering experiments. In this paper, we briefly introduce McStas and describe a particular application of the program: the Monte Carlo calculation of the resolution function of a standard triple-axis neutron scattering instrument. The method compares well with the analytical calculations of Popovici.

  19. Markov Chain Monte Carlo Estimation of Item Parameters for the Generalized Graded Unfolding Model

    Science.gov (United States)

    de la Torre, Jimmy; Stark, Stephen; Chernyshenko, Oleksandr S.

    2006-01-01

    The authors present a Markov Chain Monte Carlo (MCMC) parameter estimation procedure for the generalized graded unfolding model (GGUM) and compare it to the marginal maximum likelihood (MML) approach implemented in the GGUM2000 computer program, using simulated and real personality data. In the simulation study, test length, number of response…

  20. Libraries and Development Environments for Monte Carlo Simulations of Lattice Gauge Theories on Parallel Computers

    Science.gov (United States)

    Decker, K. M.; Jayewardena, C.; Rehmann, R.

    We describe the library lgtlib, and lgttool, the corresponding development environment for Monte Carlo simulations of lattice gauge theory on multiprocessor vector computers with shared memory. We explain why distributed memory parallel processor (DMPP) architectures are particularly appealing for compute-intensive scientific applications, and introduce the design of a general application and program development environment system for scientific applications on DMPP architectures.

  1. Monte Carlo Evaluation of Tritium Beta Spectrum Energy Deposition in Gallium Nitride (GaN) Direct Energy Conversion Devices

    Science.gov (United States)

    2014-09-01

    Monte Carlo Evaluation of Tritium Beta Spectrum Energy Deposition in Gallium Nitride (GaN) Direct Energy Conversion Devices, by Marc Litz; ARL-TR-7082, September 2014.

  2. A Positive-Weight Next-to-Leading-Order Monte Carlo for e+e- Annihilation to Hadrons

    CERN Document Server

    Latunde-Dada, O; Webber, Bryan R; Gieseke, Stefan; Latunde-Dada, Oluseyi; Webber, Bryan

    2007-01-01

    We apply the positive-weight Monte Carlo method of Nason for simulating QCD processes accurate to Next-To-Leading Order to the case of e+e- annihilation to hadrons. The method entails the generation of the hardest gluon emission first and then subsequently adding a `truncated' shower before the emission. We have interfaced our result to the Herwig++ shower Monte Carlo program and obtained better results than those obtained with Herwig++ at leading order with a matrix element correction.

  3. Streamlining resummed QCD calculations using Monte Carlo integration

    CERN Document Server

    Farhi, David; Freytsis, Marat; Schwartz, Matthew D

    2015-01-01

    Some of the most arduous and error-prone aspects of precision resummed calculations are related to the partonic hard process, having nothing to do with the resummation. In particular, interfacing to parton-distribution functions, combining various channels, and performing the phase space integration can be limiting factors in completing calculations. Conveniently, however, most of these tasks are already automated in many Monte Carlo programs, such as MadGraph, Alpgen or Sherpa. In this paper, we show how such programs can be used to produce distributions of partonic kinematics with associated color structures representing the hard factor in a resummed distribution. These distributions can then be used to weight convolutions of jet, soft and beam functions producing a complete resummed calculation. In fact, only around 1000 unweighted events are necessary to produce precise distributions. A number of examples and checks are provided, including $e^+e^-$ two- and four-jet event shapes, $n$-jettiness and jet-mas...

  4. Spatial distribution sampling and Monte Carlo simulation of radioactive isotopes

    CERN Document Server

    Krainer, Alexander Michael

    2015-01-01

    This work focuses on the implementation of a program for random sampling of uniformly spatially distributed isotopes for Monte Carlo particle simulations, specifically FLUKA. With FLUKA it is possible to calculate the radionuclide production in high energy fields. The decay of these nuclides, and therefore the resulting radiation field, however, can only be simulated in the same geometry. This work provides the tool to simulate the decay of the produced nuclides in other geometries. With that, the radiation field from an irradiated object can be simulated in arbitrary environments. The sampling of isotope mixtures was tested by simulating a 50/50 mixture of $^{137}$Cs and $^{60}$Co. These isotopes are both well known and therefore provide a first reliable benchmark in that respect. The sampling of uniformly distributed coordinates was tested using the histogram test for various spatial distributions. The advantages and disadvantages of the program compared to standard methods are demonstrated in the real life ca...

  5. Approaching Chemical Accuracy with Quantum Monte Carlo

    CERN Document Server

    Petruzielo, F R; Umrigar, C J

    2012-01-01

    A quantum Monte Carlo study of the atomization energies for the G2 set of molecules is presented. Basis size dependence of diffusion Monte Carlo atomization energies is studied with a single determinant Slater-Jastrow trial wavefunction formed from Hartree-Fock orbitals. With the largest basis set, the mean absolute deviation from experimental atomization energies for the G2 set is 3.0 kcal/mol. Optimizing the orbitals within variational Monte Carlo improves the agreement between diffusion Monte Carlo and experiment, reducing the mean absolute deviation to 2.1 kcal/mol. Moving beyond a single determinant Slater-Jastrow trial wavefunction, diffusion Monte Carlo with a small complete active space Slater-Jastrow trial wavefunction results in near chemical accuracy. In this case, the mean absolute deviation from experimental atomization energies is 1.2 kcal/mol. It is shown from calculations on systems containing phosphorus that the accuracy can be further improved by employing a larger active space.

  6. New simpler method of matching NLO corrections with parton shower Monte Carlo

    CERN Document Server

    Jadach, S; Sapeta, S; Siodmok, A; Skrzypek, M

    2016-01-01

    Next steps in the development of the KrkNLO method of implementing NLO QCD corrections to hard processes in parton shower Monte Carlo programs are presented. This new method is a simpler alternative to other well-known approaches, such as MC@NLO and POWHEG. The KrkNLO method owes its simplicity to the use of parton distribution functions (PDFs) in a new, so-called Monte Carlo (MC), factorization scheme, which was recently fully defined for the first time. Preliminary numerical results for the Higgs-boson production process are also presented.

  7. Monte Carlo studies of positron implantation in elemental metallic and multilayer systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, V.J.; Welch, D.O.; Lynn, K.G.

    1992-01-01

    We have used a Monte Carlo computer code developed at Brookhaven [1,2] to study the implantation profiles of 1-10 keV positrons incident on a wide range of semi-infinite metals and multilayer systems. Our Monte Carlo program accounts for elastic scattering as well as inelastic scattering from core and valence electrons, and includes the excitation of plasmons. The implantation profiles of positrons in many metals as well as Pd/Al and Al/Co/Si multilayers are presented. Scaling relations and closed-form expressions representing the implantation profiles are also discussed.

  8. Monte Carlo studies of positron implantation in elemental metallic and multilayer systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, V.J.; Welch, D.O.; Lynn, K.G.

    1992-12-01

    We have used a Monte Carlo computer code developed at Brookhaven [1,2] to study the implantation profiles of 1-10 keV positrons incident on a wide range of semi-infinite metals and multilayer systems. Our Monte Carlo program accounts for elastic scattering as well as inelastic scattering from core and valence electrons, and includes the excitation of plasmons. The implantation profiles of positrons in many metals as well as Pd/Al and Al/Co/Si multilayers are presented. Scaling relations and closed-form expressions representing the implantation profiles are also discussed.

  9. Multiplatform application for calculating a combined standard uncertainty using a Monte Carlo method

    Science.gov (United States)

    Niewinski, Marek; Gurnecki, Pawel

    2016-12-01

    The paper presents a new computer program for calculating a combined standard uncertainty. It implements the algorithm described in JCGM 101:2008, which is concerned with the use of a Monte Carlo method as an implementation of the propagation of distributions for uncertainty evaluation. The accuracy of the calculation has been obtained by using high-quality random number generators. The paper describes the main principles of the program and compares the obtained results with the example problems presented in JCGM Supplement 1.
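
    The propagation of distributions that JCGM 101:2008 describes reduces to drawing the input quantities from their assigned PDFs, evaluating the measurement model for each draw, and summarising the resulting output distribution. A minimal Python sketch with a hypothetical two-input model might look as follows; it is not the program described in the record, only an illustration of the algorithm.

    ```python
    import numpy as np

    def mc_uncertainty(model, input_samplers, trials=10**6, coverage=0.95):
        """Propagation of distributions in the spirit of JCGM 101: draw input values
        from their assigned PDFs, evaluate the model, and summarise the output."""
        draws = np.column_stack([sampler(trials) for sampler in input_samplers])
        y = model(*draws.T)
        lo, hi = np.percentile(y, [100 * (1 - coverage) / 2, 100 * (1 + coverage) / 2])
        return y.mean(), y.std(ddof=1), (lo, hi)

    # Toy measurement model Y = V / I with V ~ N(5.0 V, 0.02 V), I ~ N(2.0 A, 0.01 A)
    rng = np.random.default_rng(1)
    mean, u, interval = mc_uncertainty(
        lambda v, i: v / i,
        [lambda n: rng.normal(5.0, 0.02, n), lambda n: rng.normal(2.0, 0.01, n)],
    )
    print(f"y = {mean:.4f}, u(y) = {u:.4f}, 95% interval = {interval}")
    ```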

  10. CT-Based Brachytherapy Treatment Planning using Monte Carlo Simulation Aided by an Interface Software

    Directory of Open Access Journals (Sweden)

    Vahid Moslemi

    2011-03-01

    Full Text Available Introduction: In brachytherapy, radioactive sources are placed close to the tumor; therefore, small changes in their positions can cause large changes in the dose distribution. This emphasizes the need for computerized treatment planning. The usual method for treatment planning of cervix brachytherapy uses conventional radiographs in the Manchester system. Nowadays, because of their advantages in locating the source positions and the surrounding tissues, CT and MRI images are replacing conventional radiographs. In this study, we used CT images in Monte Carlo based dose calculation for brachytherapy treatment planning, using interface software to create the geometry file required in the MCNP code. The aim of using the interface software is to facilitate and speed up the geometry set-up for simulations based on the patient’s anatomy. This paper examines the feasibility of this method in cervix brachytherapy and assesses its accuracy and speed. Material and Methods: For dosimetric measurements regarding the treatment plan, a pelvic phantom was made from polyethylene in which the treatment applicators could be placed. For simulations using CT images, the phantom was scanned at 120 kVp. Using interface software written in MATLAB, the CT images were converted into an MCNP input file and the simulation was then performed. Results: Using the interface software, preparation time for the simulations of the applicator and surrounding structures was approximately 3 minutes; the corresponding time needed in the conventional MCNP geometry entry being approximately 1 hour. The discrepancy in the simulated and measured doses to point A was 1.7% of the prescribed dose. The corresponding dose differences between the two methods in rectum and bladder were 3.0% and 3.7% of the prescribed dose, respectively. Comparing the results of simulation using the interface software with those of simulation using the standard MCNP geometry entry showed a less than 1

  11. A Monte Carlo Method for Making the SDSS u-Band Magnitude More Accurate

    Science.gov (United States)

    Gu, Jiayin; Du, Cuihua; Zuo, Wenbo; Jing, Yingjie; Wu, Zhenyu; Ma, Jun; Zhou, Xu

    2016-10-01

    We develop a new Monte Carlo-based method to convert the Sloan Digital Sky Survey (SDSS) u-band magnitude to the South Galactic Cap u-band Sky Survey (SCUSS) u-band magnitude. Due to the increased accuracy of SCUSS u-band measurements, the converted u-band magnitude becomes more accurate compared with the original SDSS u-band magnitude, in particular at the faint end. The average u-magnitude error (for both SDSS and SCUSS) of numerous main-sequence stars with 0.2 < g - r < 0.8 increases as the g-band magnitude becomes fainter. When g = 19.5, the average magnitude error of the SDSS u is 0.11. When g = 20.5, the average SDSS u error rises to 0.22. However, at this magnitude, the average magnitude error of the SCUSS u is just half as much as that of the SDSS u. The SDSS u-band magnitudes of main-sequence stars with 0.2 < g - r < 0.8 and 18.5 < g < 20.5 are converted, therefore the maximum average error of the converted u-band magnitudes is 0.11. The potential application of this conversion is to derive a more accurate photometric metallicity calibration from SDSS observations, especially for the more distant stars. Thus, we can explore stellar metallicity distributions either in the Galactic halo or some stream stars.

  12. Random Numbers and Monte Carlo Methods

    Science.gov (United States)

    Scherer, Philipp O. J.

    Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
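
    For readers new to the topic, a bare-bones random-walk Metropolis sampler illustrates the idea of sampling from a distribution known only up to normalisation; the target density and step size below are arbitrary illustrative choices, not anything specific to the book.

    ```python
    import math
    import random

    def metropolis(log_density, x0, steps, step_size=1.0):
        """Minimal random-walk Metropolis sampler for an unnormalised 1D density."""
        x, samples = x0, []
        for _ in range(steps):
            proposal = x + random.uniform(-step_size, step_size)
            # Accept with probability min(1, p(proposal)/p(x))
            if random.random() < math.exp(min(0.0, log_density(proposal) - log_density(x))):
                x = proposal
            samples.append(x)
        return samples

    # Sample a standard normal via its unnormalised log-density -x^2/2
    chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, steps=50000)
    mean = sum(chain) / len(chain)
    var = sum((s - mean) ** 2 for s in chain) / len(chain)
    print(mean, var)   # should approach 0 and 1
    ```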

  13. SMCTC: Sequential Monte Carlo in C++

    Directory of Open Access Journals (Sweden)

    Adam M. Johansen

    2009-04-01

    Full Text Available Sequential Monte Carlo methods are a very general class of Monte Carlo methods for sampling from sequences of distributions. Simple examples of these algorithms are used very widely in the tracking and signal processing literature. Recent developments illustrate that these techniques have much more general applicability, and can be applied very effectively to statistical inference problems. Unfortunately, these methods are often perceived as being computationally expensive and difficult to implement. This article seeks to address both of these problems. A C++ template class library for the efficient and convenient implementation of very general Sequential Monte Carlo algorithms is presented. Two example applications are provided: a simple particle filter for illustrative purposes and a state-of-the-art algorithm for rare event estimation.
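
    SMCTC itself is a C++ template library; purely as an illustration of the kind of algorithm it implements, the following Python sketch is a bootstrap particle filter for a simple linear-Gaussian state-space model. The model, parameter values and multinomial resampling scheme are illustrative assumptions and not the library's API.

    ```python
    import numpy as np

    def bootstrap_filter(y, n_particles=1000, phi=0.9, q=1.0, r=1.0, rng=None):
        """Bootstrap particle filter for x_t = phi*x_{t-1} + N(0,q), y_t = x_t + N(0,r).
        Returns the filtering means E[x_t | y_1..t]."""
        rng = rng or np.random.default_rng(0)
        x = rng.normal(0.0, np.sqrt(q / (1 - phi**2)), n_particles)  # stationary prior
        means = []
        for obs in y:
            x = phi * x + rng.normal(0.0, np.sqrt(q), n_particles)   # propagate
            logw = -0.5 * (obs - x) ** 2 / r                          # weight by likelihood
            w = np.exp(logw - logw.max())
            w /= w.sum()
            means.append(np.sum(w * x))
            x = rng.choice(x, size=n_particles, p=w)                  # multinomial resampling
        return np.array(means)

    # Synthetic data from the same model, then filtering
    rng = np.random.default_rng(1)
    x_true = [0.0]
    for _ in range(99):
        x_true.append(0.9 * x_true[-1] + rng.normal())
    y_obs = np.array(x_true) + rng.normal(size=100)
    print(bootstrap_filter(y_obs, rng=np.random.default_rng(2))[:5])
    ```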

  14. Shell model the Monte Carlo way

    Energy Technology Data Exchange (ETDEWEB)

    Ormand, W.E.

    1995-03-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined.

  15. Quantum Monte Carlo with variable spins.

    Science.gov (United States)

    Melton, Cody A; Bennett, M Chandler; Mitas, Lubos

    2016-06-28

    We investigate the inclusion of variable spins in electronic structure quantum Monte Carlo, with a focus on diffusion Monte Carlo with Hamiltonians that include spin-orbit interactions. Following our previous introduction of fixed-phase spin-orbit diffusion Monte Carlo, we thoroughly discuss the details of the method and elaborate upon its technicalities. We present a proof for an upper-bound property for complex nonlocal operators, which allows for the implementation of T-moves to ensure the variational property. We discuss the time step biases associated with our particular choice of spin representation. Applications of the method are also presented for atomic and molecular systems. We calculate the binding energies and geometry of the PbH and Sn2 molecules, as well as the electron affinities of the 6p row elements in close agreement with experiments.

  16. A brief introduction to Monte Carlo simulation.

    Science.gov (United States)

    Bonate, P L

    2001-01-01

    Simulation affects our life every day through our interactions with the automobile, airline and entertainment industries, just to name a few. The use of simulation in drug development is relatively new, but its use is increasing in relation to the speed at which modern computers run. One well known example of simulation in drug development is molecular modelling. Another use of simulation that is being seen recently in drug development is Monte Carlo simulation of clinical trials. Monte Carlo simulation differs from traditional simulation in that the model parameters are treated as stochastic or random variables, rather than as fixed values. The purpose of this paper is to provide a brief introduction to Monte Carlo simulation methods.

  17. Quantum Monte Carlo with Variable Spins

    CERN Document Server

    Melton, Cody A; Mitas, Lubos

    2016-01-01

    We investigate the inclusion of variable spins in electronic structure quantum Monte Carlo, with a focus on diffusion Monte Carlo with Hamiltonians that include spin-orbit interactions. Following our previous introduction of fixed-phase spin-orbit diffusion Monte Carlo (FPSODMC), we thoroughly discuss the details of the method and elaborate upon its technicalities. We present a proof for an upper-bound property for complex nonlocal operators, which allows for the implementation of T-moves to ensure the variational property. We discuss the time step biases associated with our particular choice of spin representation. Applications of the method are also presented for atomic and molecular systems. We calculate the binding energies and geometry of the PbH and Sn$_2$ molecules, as well as the electron affinities of the 6$p$ row elements in close agreement with experiments.

  18. Quantum speedup of Monte Carlo methods.

    Science.gov (United States)

    Montanaro, Ashley

    2015-09-08

    Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently.

  19. Adiabatic optimization versus diffusion Monte Carlo methods

    Science.gov (United States)

    Jarret, Michael; Jordan, Stephen P.; Lackey, Brad

    2016-10-01

    Most experimental and theoretical studies of adiabatic optimization use stoquastic Hamiltonians, whose ground states are expressible using only real nonnegative amplitudes. This raises a question as to whether classical Monte Carlo methods can simulate stoquastic adiabatic algorithms with polynomial overhead. Here we analyze diffusion Monte Carlo algorithms. We argue that, based on differences between L1 and L2 normalized states, these algorithms suffer from certain obstructions preventing them from efficiently simulating stoquastic adiabatic evolution in generality. In practice however, we obtain good performance by introducing a method that we call Substochastic Monte Carlo. In fact, our simulations are good classical optimization algorithms in their own right, competitive with the best previously known heuristic solvers for MAX-k -SAT at k =2 ,3 ,4 .

  20. Self-learning Monte Carlo method

    Science.gov (United States)

    Liu, Junwei; Qi, Yang; Meng, Zi Yang; Fu, Liang

    2017-01-01

    Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large size systems close to the phase transition, for which local updates perform badly. In this Rapid Communication, we propose a general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from the training data generated in trial simulations and then used to speed up the actual simulation. We demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup.

  1. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

    This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters’ or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...

  2. Evaluation of the interindividual human variation in bioactivation of methyleugenol using physiologically based kinetic modeling and Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Al-Subeihi, Ala' A.A., E-mail: subeihi@yahoo.com [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); BEN-HAYYAN-Aqaba International Laboratories, Aqaba Special Economic Zone Authority (ASEZA), P. O. Box 2565, Aqaba 77110 (Jordan); Alhusainy, Wasma; Kiwamoto, Reiko; Spenkelink, Bert [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Bladeren, Peter J. van [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands); Nestec S.A., Avenue Nestlé 55, 1800 Vevey (Switzerland); Rietjens, Ivonne M.C.M.; Punt, Ans [Division of Toxicology, Wageningen University, Tuinlaan 5, 6703 HE Wageningen (Netherlands)

    2015-03-01

    The present study aims at predicting the level of formation of the ultimate carcinogenic metabolite of methyleugenol, 1′-sulfooxymethyleugenol, in the human population by taking variability in key bioactivation and detoxification reactions into account using Monte Carlo simulations. Depending on the metabolic route, variation was simulated based on kinetic constants obtained from incubations with a range of individual human liver fractions or by combining kinetic constants obtained for specific isoenzymes with literature reported human variation in the activity of these enzymes. The results of the study indicate that formation of 1′-sulfooxymethyleugenol is predominantly affected by variation in i) P450 1A2-catalyzed bioactivation of methyleugenol to 1′-hydroxymethyleugenol, ii) P450 2B6-catalyzed epoxidation of methyleugenol, iii) the apparent kinetic constants for oxidation of 1′-hydroxymethyleugenol, and iv) the apparent kinetic constants for sulfation of 1′-hydroxymethyleugenol. Based on the Monte Carlo simulations a so-called chemical-specific adjustment factor (CSAF) for intraspecies variation could be derived by dividing different percentiles by the 50th percentile of the predicted population distribution for 1′-sulfooxymethyleugenol formation. The obtained CSAF value at the 90th percentile was 3.2, indicating that the default uncertainty factor of 3.16 for human variability in kinetics may adequately cover the variation within 90% of the population. Covering 99% of the population requires a larger uncertainty factor of 6.4. In conclusion, the results showed that adequate predictions on interindividual human variation can be made with Monte Carlo-based PBK modeling. For methyleugenol this variation was observed to be in line with the default variation generally assumed in risk assessment. - Highlights: • Interindividual human differences in methyleugenol bioactivation were simulated. • This was done using in vitro incubations, PBK modeling
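
    The CSAF arithmetic in the record is simply a ratio of percentiles of the simulated population distribution. The sketch below shows only that step, on a lognormal stand-in for the model output; it does not reproduce the PBK model or the kinetic constants themselves.

    ```python
    import numpy as np

    # Hypothetical stand-in for the PBK model output (amount of metabolite formed per
    # individual): here just a lognormal population distribution, not the actual model.
    rng = np.random.default_rng(0)
    formation = rng.lognormal(mean=0.0, sigma=0.6, size=100_000)

    p50, p90, p99 = np.percentile(formation, [50, 90, 99])
    csaf_90 = p90 / p50   # chemical-specific adjustment factor covering 90% of the population
    csaf_99 = p99 / p50
    print(round(csaf_90, 2), round(csaf_99, 2))
    ```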

  3. Sampling uncertainty evaluation for data acquisition board based on Monte Carlo method

    Science.gov (United States)

    Ge, Leyi; Wang, Zhongyu

    2008-10-01

    Evaluating the data acquisition board sampling uncertainty is a difficult problem in the field of signal sampling. This paper first analyzes the sources of data acquisition board sampling uncertainty, then introduces a simulation approach to data acquisition board sampling uncertainty evaluation based on the Monte Carlo method and puts forward a model relating the sampling uncertainty results to the number of samples and the number of simulation runs. For different sample numbers and different signal ranges, the authors implement a random sampling uncertainty evaluation program for a PCI-6024E data acquisition board to execute the simulation. The results of the proposed Monte Carlo simulation method are in good agreement with the GUM ones, which demonstrates the validity of the Monte Carlo method.

  4. MCNP-REN: a Monte Carlo tool for neutron detector design

    CERN Document Server

    Abhold, M E

    2002-01-01

    The development of neutron detectors makes extensive use of the predictions of detector response through the use of Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model fails to accurately predict detector response in common applications. For this reason, the general Monte Carlo code developed at Los Alamos National Laboratory, Monte Carlo N-Particle (MCNP), was modified to simulate the pulse streams that would be generated by a neutron detector and normally analyzed by a shift register. This modified code, MCNP-Random Exponentially Distributed Neutron Source (MCNP-REN), along with the Time Analysis Program, predicts neutron detector response without using the point reactor model, making it unnecessary for the user to decide whether or not the assumptions of the point model are met for their application. MCNP-REN is capable of simulating standard neutron coincidence counting as well as neutron multiplicity counting. Measurements of mixed oxide fresh fuel w...

  5. Parallel Markov chain Monte Carlo simulations.

    Science.gov (United States)

    Ren, Ruichao; Orkoulas, G

    2007-06-07

    With strict detailed balance, parallel Monte Carlo simulation through domain decomposition cannot be validated with conventional Markov chain theory, which describes an intrinsically serial stochastic process. In this work, the parallel version of Markov chain theory and its role in accelerating Monte Carlo simulations via cluster computing is explored. It is shown that sequential updating is the key to improving efficiency in parallel simulations through domain decomposition. A parallel scheme is proposed to reduce interprocessor communication or synchronization, which slows down parallel simulation with increasing number of processors. Parallel simulation results for the two-dimensional lattice gas model show substantial reduction of simulation time for systems of moderate and large size.

  6. Monte Carlo Hamiltonian: Linear Potentials

    Institute of Scientific and Technical Information of China (English)

    LUO Xiang-Qian; Helmut KROEGER; et al.

    2002-01-01

    We further study the validity of the Monte Carlo Hamiltonian method. The advantage of the method, in comparison with the standard Monte Carlo Lagrangian approach, is its capability to study the excited states. We consider two quantum mechanical models: a symmetric one, V(x)=|x|/2, and an asymmetric one, V(x)=∞ for x<0 and V(x)=x/2 for x≥0. The results for the spectrum, wave functions and thermodynamical observables are in agreement with the analytical or Runge-Kutta calculations.

  7. Monte Carlo dose distributions for radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Perucha, M.; Leal, A.; Rincon, M.; Carrasco, E. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica; Sanchez-Doblado, F. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica]|[Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Nunez, L. [Clinica Puerta de Hierro, Madrid (Spain). Servicio de Radiofisica; Arrans, R.; Sanchez-Calzado, J.A.; Errazquin, L. [Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Sanchez-Nieto, B. [Royal Marsden NHS Trust (United Kingdom). Joint Dept. of Physics]|[Inst. of Cancer Research, Sutton, Surrey (United Kingdom)

    2001-07-01

    The precision of radiosurgery treatment planning systems is limited by the approximations of their algorithms and by their dosimetric input data. This fact is especially important in small fields. However, the Monte Carlo method is an accurate alternative as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distribution of both a planning system and Monte Carlo. Relative shifts have been measured and, furthermore, dose-volume histograms have been calculated for target and adjacent organs at risk. (orig.)

  8. Monte Carlo simulations of organic photovoltaics.

    Science.gov (United States)

    Groves, Chris; Greenham, Neil C

    2014-01-01

    Monte Carlo simulations are a valuable tool to model the generation, separation, and collection of charges in organic photovoltaics where charges move by hopping in a complex nanostructure and Coulomb interactions between charge carriers are important. We review the Monte Carlo techniques that have been applied to this problem, and describe the results of simulations of the various recombination processes that limit device performance. We show how these processes are influenced by the local physical and energetic structure of the material, providing information that is useful for design of efficient photovoltaic systems.

  9. The Rational Hybrid Monte Carlo Algorithm

    CERN Document Server

    Clark, M A

    2006-01-01

    The past few years have seen considerable progress in algorithmic development for the generation of gauge fields including the effects of dynamical fermions. The Rational Hybrid Monte Carlo (RHMC) algorithm, where Hybrid Monte Carlo is performed using a rational approximation in place of the usual inverse quark matrix kernel, is one of these developments. This algorithm has been found to be extremely beneficial in many areas of lattice QCD (chiral fermions, finite temperature, Wilson fermions etc.). We review the algorithm and some of these benefits, and we compare against other recent algorithm developments. We conclude with an update of the Berlin wall plot comparing costs of all popular fermion formulations.

  10. The Rational Hybrid Monte Carlo algorithm

    Science.gov (United States)

    Clark, Michael

    2006-12-01

    The past few years have seen considerable progress in algorithmic development for the generation of gauge fields including the effects of dynamical fermions. The Rational Hybrid Monte Carlo (RHMC) algorithm, where Hybrid Monte Carlo is performed using a rational approximation in place of the usual inverse quark matrix kernel, is one of these developments. This algorithm has been found to be extremely beneficial in many areas of lattice QCD (chiral fermions, finite temperature, Wilson fermions etc.). We review the algorithm and some of these benefits, and we compare against other recent algorithm developments. We conclude with an update of the Berlin wall plot comparing costs of all popular fermion formulations.

  11. Fast sequential Monte Carlo methods for counting and optimization

    CERN Document Server

    Rubinstein, Reuven Y; Vaisman, Radislav

    2013-01-01

    A comprehensive account of the theory and application of Monte Carlo methods. Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the

  12. Monte Carlo reference data sets for imaging research: Executive summary of the report of AAPM Research Committee Task Group 195.

    Science.gov (United States)

    Sechopoulos, Ioannis; Ali, Elsayed S M; Badal, Andreu; Badano, Aldo; Boone, John M; Kyprianou, Iacovos S; Mainegra-Hing, Ernesto; McMillan, Kyle L; McNitt-Gray, Michael F; Rogers, D W O; Samei, Ehsan; Turner, Adam C

    2015-10-01

    The use of Monte Carlo simulations in diagnostic medical imaging research is widespread due to its flexibility and ability to estimate quantities that are challenging to measure empirically. However, any new Monte Carlo simulation code needs to be validated before it can be used reliably. The type and degree of validation required depends on the goals of the research project, but, typically, such validation involves either comparison of simulation results to physical measurements or to previously published results obtained with established Monte Carlo codes. The former is complicated due to nuances of experimental conditions and uncertainty, while the latter is challenging due to typical graphical presentation and lack of simulation details in previous publications. In addition, entering the field of Monte Carlo simulations in general involves a steep learning curve. It is not a simple task to learn how to program and interpret a Monte Carlo simulation, even when using one of the publicly available code packages. This Task Group report provides a common reference for benchmarking Monte Carlo simulations across a range of Monte Carlo codes and simulation scenarios. In the report, all simulation conditions are provided for six different Monte Carlo simulation cases that involve common x-ray based imaging research areas. The results obtained for the six cases using four publicly available Monte Carlo software packages are included in tabular form. In addition to a full description of all simulation conditions and results, a discussion and comparison of results among the Monte Carlo packages and the lessons learned during the compilation of these results are included. This abridged version of the report includes only an introductory description of the six cases and a brief example of the results of one of the cases. This work provides an investigator the necessary information to benchmark his/her Monte Carlo simulation software against the reference cases included here

  13. Monte Carlo methods in AB initio quantum chemistry quantum Monte Carlo for molecules

    CERN Document Server

    Lester, William A; Reynolds, PJ

    1994-01-01

    This book presents the basic theory and application of the Monte Carlo method to the electronic structure of atoms and molecules. It assumes no previous knowledge of the subject, only a knowledge of molecular quantum mechanics at the first-year graduate level. A working knowledge of traditional ab initio quantum chemistry is helpful, but not essential.Some distinguishing features of this book are: Clear exposition of the basic theory at a level to facilitate independent study. Discussion of the various versions of the theory: diffusion Monte Carlo, Green's function Monte Carlo, and release n

  14. Use of Monte Carlo Methods in brachytherapy; Uso del metodo de Monte Carlo en braquiterapia

    Energy Technology Data Exchange (ETDEWEB)

    Granero Cabanero, D.

    2015-07-01

    The Monte Carlo method has become a fundamental tool for brachytherapy dosimetry, mainly because it avoids the difficulties associated with experimental dosimetry. In brachytherapy, the main handicap of experimental dosimetry is the high dose gradient near the sources, so that small uncertainties in the positioning of the detectors lead to large uncertainties in the dose. This presentation will mainly review the procedure for calculating dose distributions around a source using the Monte Carlo method, showing the difficulties inherent in these calculations. In addition we will briefly review other applications of the Monte Carlo method in brachytherapy dosimetry, such as its use in advanced calculation algorithms, shielding calculations, or obtaining dose distributions around applicators. (Author)

  15. On the use of stochastic approximation Monte Carlo for Monte Carlo integration

    KAUST Repository

    Liang, Faming

    2009-03-01

    The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for the problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.

  16. Monte Carlo simulation. The water regime in the gas diffusion layer of a PEM fuel cell; Monte-Carlo-Simulation. Wasserhaushalt in der GDL einer PEM-Brennstoffzelle

    Energy Technology Data Exchange (ETDEWEB)

    Seidenberger, Katrin; Wilhelm, Florian; Scholta, Joachim [Zentrum fuer Sonnenenergie- und Wasserstoff-Forschung Baden-Wuerttemberg (ZSW), Ulm (Germany)

    2011-04-15

    The life of a fuel cell is determined by the life of its components. A Monte Carlo model developed by Zentrum fuer Sonnenenergie- und Wasserstoff-Forschung Baden-Wuerttemberg (ZSW) focuses on the gas diffusion layer (GDL). The simulation program assumes a medium-scale water distribution, thus enabling the detection of water accumulation in the GDL. The results can be compared with experimental data, e.g. from synchrotron tomography measurements, and verified.

  17. Monte Carlo simulation code modernization

    CERN Document Server

    CERN. Geneva

    2015-01-01

    The continual development of sophisticated transport simulation algorithms allows increasingly accurate description of the effect of the passage of particles through matter. This modelling capability finds applications in a large spectrum of fields from medicine to astrophysics, and of course HEP. These new capabilities however come at the cost of a greater computational intensity of the new models, which has the effect of increasing the demands on computing resources. This is particularly true for HEP, where the demand for more simulation is driven by the need for both more accuracy and more precision, i.e. better models and more events. Usually HEP has relied on the "Moore's law" evolution, but for almost ten years the increase in clock speed has stalled and computing capacity comes in the form of hardware architectures of many-core or accelerated processors. To harness these opportunities we need to adapt our code to concurrent programming models taking advantage of both SIMD and SIMT architectures. Th...

  18. Virtual detector characterisation with Monte-Carlo simulations

    Science.gov (United States)

    Sukowski, F.; Yaneu Yaneu, J. F.; Salamon, M.; Ebert, S.; Uhlmann, N.

    2009-08-01

    In the field of X-ray imaging, flat-panel detectors, which convert X-rays into electrical signals, are widely used. For different applications, detectors differ in several specific parameters that can be used for characterizing the detector. At the Development Center X-ray Technology EZRT we studied the question of how well these characteristics can be determined by only knowing the layer composition of a detector. In order to determine the required parameters, the Monte-Carlo (MC) simulation program ROSI [J. Giersch et al., Nucl. Instr. and Meth. A 509 (2003) 151] was used while taking into account all primary and secondary particle interactions as well as the focal spot size of the X-ray tube. For the study, the Hamamatsu C9311DK [Technical Datasheet Hamamatsu C9311DK flat panel sensor, Hamamatsu Photonics, (www.hamamatsu.com)], a scintillator-based detector, and the Ajat DIC 100TL [Technical description of Ajat DIC 100TL, Ajat Oy Ltd., (www.ajat.fi)], a direct converting semiconductor detector, were used. The layer compositions of the two detectors were implemented into the MC simulation program. The following characteristics were measured [N. Uhlmann et al., Nucl. Instr. and Meth. A 591 (2008) 46] and compared to simulation results: the basic spatial resolution (BSR), the modulation transfer function (MTF), the contrast sensitivity (CS) and the specific material thickness range (SMTR). To take scattering of optical photons into account, DETECT2000 [C. Moisan et al., DETECT2000—A Program for Modeling Optical Properties of Scintillators, Department of Electrical and Computer Engineering, Laval University, Quebec City, 2000], another Monte-Carlo simulation program, was used.

  19. Monte Carlo Simulation of X-rays Multiple Refractive Scattering from Fine Structure Objects imaged with the DEI Technique

    CERN Document Server

    Khromova, A N; Arfelli, F; Menk, R H; Besch, H J; Plothow-Besch, H; 10.1109/NSSMIC.2004.1466758

    2010-01-01

    In this work we present a novel 3D Monte Carlo photon transport program for simulation of multiple refractive scattering based on the refractive properties of X-rays in highly scattering media, like lung tissue. Multiple scattering not only reduces the quality of the image, but also contains information on the internal structure of the object. This information can be exploited utilizing image modalities such as Diffraction Enhanced Imaging (DEI). To study the effect of multiple scattering, a Monte Carlo program was developed that simulates multiple refractive scattering of X-ray photons on monodisperse PMMA (poly-methyl-methacrylate) microspheres representing alveoli in lung tissue. Finally, the results of the Monte Carlo program were compared to measurements taken at the SYRMEP beamline at Elettra (Trieste, Italy) on special phantoms, showing good agreement between the two data sets.

  20. A comparison of Monte Carlo generators

    CERN Document Server

    Golan, Tomasz

    2014-01-01

    A comparison of GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: proton multiplicity, total visible energy, most energetic proton momentum, and $\pi^+$ two-dimensional energy vs cosine distribution.

  1. Monte Carlo Tools for Jet Quenching

    OpenAIRE

    Zapp, Korinna

    2011-01-01

    A thorough understanding of jet quenching on the basis of multi-particle final states and jet observables requires new theoretical tools. This talk summarises the status and prospects of the theoretical description of jet quenching in terms of Monte Carlo generators.

  2. An Introduction to Monte Carlo Methods

    Science.gov (United States)

    Raeside, D. E.

    1974-01-01

    Reviews the principles of Monte Carlo calculation and random number generation in an attempt to introduce the direct and the rejection method of sampling techniques as well as the variance-reduction procedures. Indicates that the increasing availability of computers makes it possible for a wider audience to learn about these powerful methods. (CC)
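
    As a concrete companion to the rejection method mentioned in the record, a minimal Python implementation with an arbitrary truncated-normal target could read as follows; the density and interval are illustrative choices.

    ```python
    import math
    import random

    def rejection_sample(pdf, pdf_max, lo, hi, n):
        """Rejection method: propose x uniformly on [lo, hi] and accept it with
        probability pdf(x)/pdf_max, where pdf_max bounds the target density."""
        out = []
        while len(out) < n:
            x = random.uniform(lo, hi)
            if random.random() * pdf_max <= pdf(x):
                out.append(x)
        return out

    # Example: sample from a (truncated) standard normal on [-5, 5]
    normal = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
    samples = rejection_sample(normal, normal(0.0), -5.0, 5.0, 10_000)
    print(sum(samples) / len(samples))   # should be close to 0
    ```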

  3. Scalable Domain Decomposed Monte Carlo Particle Transport

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.

  4. Monte Carlo methods beyond detailed balance

    NARCIS (Netherlands)

    Schram, Raoul D.; Barkema, Gerard T.

    2015-01-01

    Monte Carlo algorithms are nearly always based on the concept of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed balance algorithms, starting from a conventional algorithm satisfying

  5. An analysis of Monte Carlo tree search

    CSIR Research Space (South Africa)

    James, S

    2017-02-01

    Full Text Available Monte Carlo Tree Search (MCTS) is a family of directed search algorithms that has gained widespread attention in recent years. Despite the vast amount of research into MCTS, the effect of modifications on the algorithm, as well as the manner...

  6. Monte Carlo simulation experiments on box-type radon dosimeter

    Science.gov (United States)

    Jamil, Khalid; Kamran, Muhammad; Illahi, Ahsan; Manzoor, Shahid

    2014-11-01

    Epidemiological studies show that inhalation of radon gas (222Rn) may be carcinogenic especially to mine workers, people living in closed indoor energy-conserved environments and underground dwellers. It is, therefore, of paramount importance to measure the 222Rn concentrations (Bq/m3) in indoor environments. For this purpose, box-type passive radon dosimeters employing an ion track detector like CR-39 are widely used. The fraction of the number of radon alphas emitted in the volume of the box-type dosimeter that results in latent track formation on CR-39 is the latent track registration efficiency. The latent track registration efficiency is ultimately required to evaluate the radon concentration, which consequently determines the effective dose and the radiological hazards. In this research, Monte Carlo simulation experiments were carried out to study the alpha latent track registration efficiency for a box-type radon dosimeter as a function of the dosimeter's dimensions and the range of alpha particles in air. Two different self-developed Monte Carlo simulation techniques were employed, namely: (a) the surface ratio (SURA) method and (b) the ray hitting (RAHI) method. Monte Carlo simulation experiments revealed that there are two types of efficiencies, i.e. intrinsic efficiency (ηint) and alpha hit efficiency (ηhit). The ηint depends only on the dimensions of the dosimeter, while ηhit depends on both the dimensions of the dosimeter and the range of the alpha particles. The total latent track registration efficiency is the product of both intrinsic and hit efficiencies. It has been concluded that if the diagonal length of the box-type dosimeter is kept smaller than the range of the alpha particle, then a hit efficiency of 100% is achieved. Nevertheless the intrinsic efficiency keeps playing its role. The Monte Carlo simulation experimental results have been found helpful to understand the intricate track registration mechanisms in the box-type dosimeter. This paper explains how the radon concentration from the
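
    A generic ray-hitting style estimate of the hit efficiency can be sketched in a few lines: sample emission points uniformly in the box, draw isotropic directions, and count the alphas whose straight-line path reaches the CR-39 plane within their range in air. The geometry (detector covering the bottom face) and the quoted alpha range are illustrative assumptions, not the paper's SURA/RAHI implementations.

    ```python
    import math
    import random

    def hit_efficiency(a, b, c, alpha_range, n=200_000):
        """Monte Carlo estimate of the fraction of alphas, emitted uniformly and
        isotropically inside an a x b x c box, that reach a CR-39 detector covering
        the bottom face (z = 0) within their range in air."""
        hits = 0
        for _ in range(n):
            x, y, z = random.uniform(0, a), random.uniform(0, b), random.uniform(0, c)
            # isotropic emission direction
            cos_t = random.uniform(-1.0, 1.0)
            phi = random.uniform(0.0, 2.0 * math.pi)
            sin_t = math.sqrt(1.0 - cos_t * cos_t)
            dx, dy, dz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
            if dz >= 0.0:
                continue                  # moving away from the detector plane
            t = z / -dz                   # path length to the z = 0 plane
            if t > alpha_range:
                continue                  # alpha stops in air before reaching CR-39
            xi, yi = x + t * dx, y + t * dy
            if 0.0 <= xi <= a and 0.0 <= yi <= b:
                hits += 1
        return hits / n

    # Dimensions in cm; ~4 cm is a rough range in air for the 222Rn alpha (assumption)
    print(hit_efficiency(5.0, 5.0, 5.0, alpha_range=4.0))
    ```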

  7. PREFACE: First European Workshop on Monte Carlo Treatment Planning

    Science.gov (United States)

    Reynaert, Nick

    2007-07-01

    The "First European Workshop on Monte Carlo treatment planning", was an initiative of the European working group on Monte Carlo treatment planning (EWG-MCTP). It was organised at Ghent University (Belgium) on 22-25October 2006. The meeting was very successful and was attended by 150 participants. The impressive list of invited speakers and the scientific contributions (posters and oral presentations) have led to a very interesting program, that was well appreciated by all attendants. In addition, the presence of seven vendors of commercial MCTP software systems provided serious added value to the workshop. For each vendor, a representative has given a presentation in a dedicated session, explaining the current status of their system. It is clear that, for "traditional" radiotherapy applications (using photon or electron beams), Monte Carlo dose calculations have become the state of the art, and are being introduced into almost all commercial treatment planning systems. Invited lectures illustrated that scientific challenges are currently associated with 4D applications (e.g. respiratory motion) and the introduction of MC dose calculations in inverse planning. But it was striking that the Monte Carlo technique is also becoming very important in more novel treatment modalities such as BNCT, hadron therapy, stereotactic radiosurgery, Tomotherapy, etc. This emphasizes the continuous growing interest in MCTP. The people who attended the dosimetry session will certainly remember the high level discussion on the determination of correction factors for different ion chambers, used in small fields. The following proceedings will certainly confirm the high scientific level of the meeting. I would like to thank the members of the local organizing committee for all the hard work done before, during and after this meeting. The organisation of such an event is not a trivial task and it would not have been possible without the help of all my colleagues. I would also like to thank

  8. Monte Carlo simulation experiments on box-type radon dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, Khalid, E-mail: kjamil@comsats.edu.pk; Kamran, Muhammad; Illahi, Ahsan; Manzoor, Shahid

    2014-11-11

    Epidemiological studies show that inhalation of radon gas ({sup 222}Rn) may be carcinogenic especially to mine workers, people living in closed indoor energy-conserved environments and underground dwellers. It is, therefore, of paramount importance to measure the {sup 222}Rn concentrations (Bq/m{sup 3}) in indoor environments. For this purpose, box-type passive radon dosimeters employing an ion track detector like CR-39 are widely used. The fraction of the number of radon alphas emitted in the volume of the box-type dosimeter that results in latent track formation on CR-39 is the latent track registration efficiency. The latent track registration efficiency is ultimately required to evaluate the radon concentration, which consequently determines the effective dose and the radiological hazards. In this research, Monte Carlo simulation experiments were carried out to study the alpha latent track registration efficiency for a box-type radon dosimeter as a function of the dosimeter’s dimensions and the range of alpha particles in air. Two different self-developed Monte Carlo simulation techniques were employed, namely: (a) the surface ratio (SURA) method and (b) the ray hitting (RAHI) method. Monte Carlo simulation experiments revealed that there are two types of efficiencies, i.e. intrinsic efficiency (η{sub int}) and alpha hit efficiency (η{sub hit}). The η{sub int} depends only on the dimensions of the dosimeter, while η{sub hit} depends on both the dimensions of the dosimeter and the range of the alpha particles. The total latent track registration efficiency is the product of both intrinsic and hit efficiencies. It has been concluded that if the diagonal length of the box-type dosimeter is kept smaller than the range of the alpha particle, then a hit efficiency of 100% is achieved. Nevertheless the intrinsic efficiency keeps playing its role. The Monte Carlo simulation experimental results have been found helpful to understand the intricate track registration mechanisms in the box-type dosimeter. This paper

  9. Monte Carlo radiation transport in external beam radiotherapy

    OpenAIRE

    Çeçen, Yiğit

    2013-01-01

    The use of Monte Carlo in radiation transport is an effective way to predict absorbed dose distributions. Monte Carlo modeling has contributed to a better understanding of photon and electron transport by radiotherapy physicists. The aim of this review is to introduce Monte Carlo as a powerful radiation transport tool. In this review, photon and electron transport algorithms for Monte Carlo techniques are investigated and a clinical linear accelerator model is studied for external beam radiot...

  10. MONTE CARLO SIMULATION OF MULTIFOCAL STOCHASTIC SCANNING SYSTEM

    Directory of Open Access Journals (Sweden)

    LIXIN LIU

    2014-01-01

    Full Text Available Multifocal multiphoton microscopy (MMM has greatly improved the utilization of excitation light and imaging speed due to parallel multiphoton excitation of the samples and simultaneous detection of the signals, which allows it to perform three-dimensional fast fluorescence imaging. Stochastic scanning can provide continuous, uniform and high-speed excitation of the sample, which makes it a suitable scanning scheme for MMM. In this paper, the graphical programming language — LabVIEW is used to achieve stochastic scanning of the two-dimensional galvo scanners by using white noise signals to control the x and y mirrors independently. Moreover, the stochastic scanning process is simulated by using Monte Carlo method. Our results show that MMM can avoid oversampling or subsampling in the scanning area and meet the requirements of uniform sampling by stochastically scanning the individual units of the N × N foci array. Therefore, continuous and uniform scanning in the whole field of view is implemented.
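
    The uniformity check described above can be mimicked with a simple Monte Carlo: map white-noise x/y deflections onto an N x N foci grid and inspect the spread of visit counts. The sketch below is a simplified stand-in for the LabVIEW/galvo system, not the authors' code.

    ```python
    import numpy as np

    def simulate_stochastic_scan(n_foci=10, n_samples=1_000_000, rng=None):
        """Monte Carlo check of sampling uniformity for stochastic scanning:
        white-noise x/y deflections are mapped onto an N x N foci grid and visits
        per focus are accumulated. A small coefficient of variation of the visit
        counts indicates uniform coverage."""
        rng = rng or np.random.default_rng(0)
        xy = rng.random((n_samples, 2))            # uniform white noise per mirror axis
        idx = np.floor(xy * n_foci).astype(int)
        counts = np.zeros((n_foci, n_foci), dtype=int)
        np.add.at(counts, (idx[:, 0], idx[:, 1]), 1)
        return counts.std() / counts.mean()

    print(simulate_stochastic_scan())   # ~0.01 for 10^6 samples over a 10 x 10 grid
    ```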

  11. Kinetic Monte Carlo modelling of neutron irradiation damage in iron

    Energy Technology Data Exchange (ETDEWEB)

    Gamez, L. [Instituto de Fusion Nuclear, UPM, Madrid (Spain); Departamento de Fisica Aplicada, ETSII, UPM, Madrid (Spain)], E-mail: linarejos.gamez@upm.es; Martinez, E. [Instituto de Fusion Nuclear, UPM, Madrid (Spain); Lawrence Livermore National Laboratory, LLNL, CA 94550 (United States); Perlado, J.M.; Cepas, P. [Instituto de Fusion Nuclear, UPM, Madrid (Spain); Caturla, M.J. [Departamento de Fisica Aplicada, Universidad de Alicante, Alicante (Spain); Victoria, M. [Instituto de Fusion Nuclear, UPM, Madrid (Spain); Marian, J. [Lawrence Livermore National Laboratory, LLNL, CA 94550 (United States); Arevalo, C. [Instituto de Fusion Nuclear, UPM, Madrid (Spain); Hernandez, M.; Gomez, D. [CIEMAT, Madrid (Spain)

    2007-10-15

    Ferritic steels (FeCr-based alloys) are key materials needed to fulfill the requirements expected in future nuclear fusion facilities, both for magnetic and inertial confinement, and in advanced fission reactors (GIV) and transmutation systems. Research in this field is currently a critical aspect of the European research program and abroad. Experimental and multiscale simulation methodologies are going hand in hand in increasing the knowledge of materials performance. At DENIM, progress is being made on specific parts of this linked simulation methodology, both for defect energetics and diffusion and for dislocation dynamics. In this study, results obtained from kinetic Monte Carlo simulations of neutron-irradiated Fe under different conditions are presented, using modified ad hoc parameters. Significant agreement with experimental measurements has been found for some of the parameterizations and mechanisms considered. The results of these simulations are discussed and compared with previous calculations.
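
    For orientation, the core of any such kinetic Monte Carlo calculation is the residence-time (BKL/Gillespie) loop: pick an event with probability proportional to its rate and advance the clock by an exponentially distributed waiting time. The rates below are placeholders, not the Fe parameterization used in the study, and the state updates that real defect kinetics require are omitted.

    ```python
    import math
    import random

    def kmc(rates, steps, rng=random):
        """Minimal residence-time (BKL/Gillespie) kinetic Monte Carlo loop.
        `rates` maps event names to rates (1/s); event-specific state updates are
        omitted, so this only illustrates event selection and the time advance."""
        t, history = 0.0, []
        events, r = list(rates), [rates[e] for e in rates]
        total = sum(r)
        for _ in range(steps):
            # choose an event with probability proportional to its rate
            u, acc, chosen = rng.random() * total, 0.0, events[-1]
            for e, ri in zip(events, r):
                acc += ri
                if u < acc:
                    chosen = e
                    break
            history.append((t, chosen))
            # advance the clock by an exponentially distributed residence time
            t += -math.log(rng.random()) / total
        return history

    # Illustrative (not physical) rates for vacancy hop, SIA hop, and cluster emission
    print(kmc({"vacancy_hop": 1e6, "sia_hop": 1e9, "emission": 1e2}, steps=5))
    ```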

  12. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    Full Text Available In this paper we treat the reliability assessment problem of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: a low-penetration simulation and a high-penetration simulation. The load shedding strategy and the simulation process are introduced in detail for each FMEA process. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
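
    A crude sequential Monte Carlo reliability estimate for a radial feeder can be sketched as follows. The failure/repair data, the assumption that a section outage interrupts all downstream customers, and the omission of the DG islanding and load-shedding logic are all simplifications relative to the paper.

    ```python
    import numpy as np

    def mc_reliability(failure_rates, repair_hours, customers, years=10_000, rng=None):
        """Crude Monte Carlo estimate of SAIFI / SAIDI for a radial feeder in which
        each section outage is assumed to interrupt all downstream customers."""
        rng = rng or np.random.default_rng(0)
        total_customers = sum(customers)
        interruptions = 0.0
        customer_hours = 0.0
        for sec, (lam, mttr) in enumerate(zip(failure_rates, repair_hours)):
            downstream = sum(customers[sec:])            # customers affected by this section
            n_fail = rng.poisson(lam * years)            # failures over the simulated period
            durations = rng.exponential(mttr, n_fail)    # repair time per failure
            interruptions += n_fail * downstream
            customer_hours += durations.sum() * downstream
        saifi = interruptions / (total_customers * years)
        saidi = customer_hours / (total_customers * years)
        return saifi, saidi

    # Three feeder sections: failures/yr, mean repair time (h), customers per section
    print(mc_reliability([0.1, 0.15, 0.2], [4.0, 3.0, 5.0], [200, 150, 100]))
    ```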

  13. Hybrid Monte Carlo with Chaotic Mixing

    CERN Document Server

    Kadakia, Nirag

    2016-01-01

    We propose a hybrid Monte Carlo (HMC) technique applicable to high-dimensional multivariate normal distributions that effectively samples along chaotic trajectories. The method is predicated on the freedom of choice of the HMC momentum distribution, and due to its mixing properties, exhibits sample-to-sample autocorrelations that decay far faster than those in the traditional hybrid Monte Carlo algorithm. We test the methods on distributions of varying correlation structure, finding that the proposed technique produces superior covariance estimates, is less reliant on step-size tuning, and can even function with sparse or no momentum re-sampling. The method presented here is promising for more general distributions, such as those that arise in Bayesian learning of artificial neural networks and in the state and parameter estimation of dynamical systems.
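
    For reference, the traditional hybrid Monte Carlo baseline that the proposed technique modifies (it keeps the leapfrog dynamics but exploits the freedom in the momentum distribution) can be written compactly; the Gaussian target below is only a test case, not the distributions studied in the record.

    ```python
    import numpy as np

    def hmc(grad_logp, logp, x0, n_samples=5000, eps=0.1, n_leapfrog=20, rng=None):
        """Traditional hybrid Monte Carlo with Gaussian momenta and leapfrog integration."""
        rng = rng or np.random.default_rng(0)
        x = np.array(x0, dtype=float)
        samples = []
        for _ in range(n_samples):
            p = rng.normal(size=x.shape)                 # resample momenta
            x_new, p_new = x.copy(), p.copy()
            # leapfrog integration of the Hamiltonian dynamics
            p_new += 0.5 * eps * grad_logp(x_new)
            for _ in range(n_leapfrog - 1):
                x_new += eps * p_new
                p_new += eps * grad_logp(x_new)
            x_new += eps * p_new
            p_new += 0.5 * eps * grad_logp(x_new)
            # Metropolis accept/reject on the total energy change
            dH = (logp(x_new) - 0.5 * p_new @ p_new) - (logp(x) - 0.5 * p @ p)
            if rng.random() < np.exp(min(0.0, dH)):
                x = x_new
            samples.append(x.copy())
        return np.array(samples)

    # Target: 2D standard normal
    logp = lambda x: -0.5 * x @ x
    grad = lambda x: -x
    chain = hmc(grad, logp, x0=[3.0, -3.0])
    print(chain.mean(axis=0), chain.var(axis=0))   # should approach [0, 0] and [1, 1]
    ```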

  14. Monte Carlo study of real time dynamics

    CERN Document Server

    Alexandru, Andrei; Bedaque, Paulo F; Vartak, Sohan; Warrington, Neill C

    2016-01-01

    Monte Carlo studies involving real-time dynamics are severely restricted by the sign problem that emerges from the highly oscillatory phase of the path integral. In this letter, we present a new method to compute real-time quantities on the lattice using the Schwinger-Keldysh formalism via Monte Carlo simulations. The key idea is to deform the path integration domain to a complex manifold where the phase oscillations are mild and the sign problem is manageable. We use the previously introduced "contraction algorithm" to create a Markov chain on this alternative manifold. We substantiate our approach by analyzing the quantum mechanical anharmonic oscillator. Our results are in agreement with the exact ones obtained by diagonalization of the Hamiltonian. The method we introduce is generic and in principle applicable to quantum field theory, albeit very slow. We discuss some possible improvements that should speed up the algorithm.

  15. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-05

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.
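
    The multilevel idea itself, independent of the SMC coupling introduced in the record, is the telescoping estimator E[P_L] = E[P_0] + sum over l of E[P_l - P_{l-1}], with fine and coarse samples coupled at each level. A generic sketch for an Euler-discretised geometric Brownian motion (an assumed toy problem, not the authors' setting) is given below.

    ```python
    import numpy as np

    def mlmc_gbm_mean(levels=5, n_per_level=(50_000, 20_000, 8_000, 3_000, 1_200),
                      s0=1.0, r=0.05, sigma=0.2, T=1.0, rng=None):
        """Multilevel Monte Carlo estimate of E[S_T] for geometric Brownian motion,
        using Euler-Maruyama with 2^l time steps on level l and coupling fine/coarse
        paths through shared Brownian increments."""
        rng = rng or np.random.default_rng(0)
        estimate = 0.0
        for level, n in zip(range(levels), n_per_level):
            m_fine = 2 ** level
            dt_f = T / m_fine
            dw = rng.normal(0.0, np.sqrt(dt_f), size=(n, m_fine))
            s_fine = np.full(n, s0)
            for k in range(m_fine):
                s_fine = s_fine * (1 + r * dt_f + sigma * dw[:, k])
            if level == 0:
                estimate += s_fine.mean()
            else:
                # coarse path uses the same Brownian increments, summed in pairs
                m_coarse = m_fine // 2
                dt_c = T / m_coarse
                dw_c = dw[:, 0::2] + dw[:, 1::2]
                s_coarse = np.full(n, s0)
                for k in range(m_coarse):
                    s_coarse = s_coarse * (1 + r * dt_c + sigma * dw_c[:, k])
                estimate += (s_fine - s_coarse).mean()   # telescoping correction term
        return estimate

    print(mlmc_gbm_mean())   # should be close to s0 * exp(r*T), roughly 1.051
    ```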

  16. Monte Carlo Simulation for Particle Detectors

    CERN Document Server

    Pia, Maria Grazia

    2012-01-01

    Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...

  17. An enhanced Monte Carlo outlier detection method.

    Science.gov (United States)

    Zhang, Liangxiao; Li, Peiwu; Mao, Jin; Ma, Fei; Ding, Xiaoxia; Zhang, Qi

    2015-09-30

    Outlier detection is crucial in building a highly predictive model. In this study, we proposed an enhanced Monte Carlo outlier detection method by establishing cross-prediction models based on determinate normal samples and analyzing the distribution of prediction errors individually for dubious samples. One simulated and three real datasets were used to illustrate and validate the performance of our method, and the results indicated that this method outperformed Monte Carlo outlier detection in outlier diagnosis. After these outliers were removed, the value of validation by Kovats retention indices and the root mean square error of prediction decreased from 3.195 to 1.655, and the average cross-validation prediction error decreased from 2.0341 to 1.2780. This method helps establish a good model by eliminating outliers. © 2015 Wiley Periodicals, Inc.
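
    The basic Monte Carlo outlier detection scheme that the record builds on can be sketched with repeated random splits and a plain least-squares model; the enhanced cross-prediction step on the determinate normal samples is not reproduced here, and the data below are synthetic.

    ```python
    import numpy as np

    def mc_outlier_scores(X, y, n_splits=500, train_frac=0.7, rng=None):
        """Basic Monte Carlo outlier detection: fit a linear model on many random
        subsets and collect each sample's out-of-subset prediction errors; samples
        with unusually large mean absolute error are flagged as dubious."""
        rng = rng or np.random.default_rng(0)
        n = len(y)
        Xb = np.column_stack([np.ones(n), X])           # add intercept
        errors = [[] for _ in range(n)]
        for _ in range(n_splits):
            train = rng.choice(n, size=int(train_frac * n), replace=False)
            test = np.setdiff1d(np.arange(n), train)
            beta, *_ = np.linalg.lstsq(Xb[train], y[train], rcond=None)
            resid = y[test] - Xb[test] @ beta
            for i, r in zip(test, resid):
                errors[i].append(abs(r))
        return np.array([np.mean(e) for e in errors])   # mean absolute error per sample

    # Toy data with one gross outlier at index 0
    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=60)
    y[0] += 5.0
    scores = mc_outlier_scores(X, y)
    print(scores.argmax())   # expected to point at sample 0
    ```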

  18. Composite biasing in Monte Carlo radiative transfer

    CERN Document Server

    Baes, Maarten; Lunttila, Tuomas; Bianchi, Simone; Camps, Peter; Juvela, Mika; Kuiper, Rolf

    2016-01-01

    Biasing or importance sampling is a powerful technique in Monte Carlo radiative transfer, and can be applied in different forms to increase the accuracy and efficiency of simulations. One of the drawbacks of the use of biasing is the potential introduction of large weight factors. We discuss a general strategy, composite biasing, to suppress the appearance of large weight factors. We use this composite biasing approach for two different problems faced by current state-of-the-art Monte Carlo radiative transfer codes: the generation of photon packages from multiple components, and the penetration of radiation through high optical depth barriers. In both cases, the implementation of the relevant algorithms is trivial and does not interfere with any other optimisation techniques. Through simple test models, we demonstrate the general applicability, accuracy and efficiency of the composite biasing approach. In particular, for the penetration of high optical depths, the gain in efficiency is spectacular for the spe...

  19. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.

    2015-10-01

    In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.

  20. Monte Carlo simulations on SIMD computer architectures

    Energy Technology Data Exchange (ETDEWEB)

    Burmester, C.P.; Gronsky, R. [Lawrence Berkeley Lab., CA (United States); Wille, L.T. [Florida Atlantic Univ., Boca Raton, FL (United States). Dept. of Physics

    1992-03-01

    Algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures.
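
    As a small data-parallel analogue of the lattice partitioning described above (and not the MasPar implementation itself), the sketch below performs a checkerboard Metropolis sweep of the 2D Ising model with NumPy: the two sublattices are updated alternately, so all sites of one colour can be updated simultaneously without conflicting with their neighbours. Lattice size and temperature are arbitrary.

        # Checkerboard (data-parallel) Metropolis sweep for the 2D Ising model.
        import numpy as np

        rng = np.random.default_rng(3)
        L, beta = 64, 0.4
        spins = rng.choice([-1, 1], size=(L, L))
        checker = np.add.outer(np.arange(L), np.arange(L)) % 2

        def sweep(spins):
            for color in (0, 1):                      # update one sublattice at a time
                nb = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0)
                      + np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
                dE = 2.0 * spins * nb                 # energy change if flipped
                flip = (checker == color) & (rng.random((L, L)) < np.exp(-beta * dE))
                spins = np.where(flip, -spins, spins)
            return spins

        for _ in range(200):
            spins = sweep(spins)
        print("magnetization per spin:", spins.mean())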

  1. Inhomogeneous Monte Carlo simulations of dermoscopic spectroscopy

    Science.gov (United States)

    Gareau, Daniel S.; Li, Ting; Jacques, Steven; Krueger, James

    2012-03-01

    Clinical skin-lesion diagnosis uses dermoscopy: 10X epiluminescence microscopy. Skin appearance ranges from black to white with shades of blue, red, gray and orange. Color is an important diagnostic criterion for diseases including melanoma. Melanin and blood content and distribution impact the diffuse spectral remittance (300-1000 nm). Skin layers (immersion medium, stratum corneum, spinous epidermis, basal epidermis and dermis) as well as laterally asymmetric features (e.g. melanocytic invasion) were modeled in an inhomogeneous Monte Carlo model.

  2. Handbook of Markov chain Monte Carlo

    CERN Document Server

    Brooks, Steve

    2011-01-01

    ""Handbook of Markov Chain Monte Carlo"" brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.

  3. Accelerated Monte Carlo by Embedded Cluster Dynamics

    Science.gov (United States)

    Brower, R. C.; Gross, N. A.; Moriarty, K. J. M.

    1991-07-01

    We present an overview of the new methods for embedding Ising spins in continuous fields to achieve accelerated cluster Monte Carlo algorithms. The methods of Brower and Tamayo and Wolff are summarized and variations are suggested for the O(N) models based on multiple embedded Z2 spin components and/or correlated projections. Topological features are discussed for the XY model and numerical simulations presented for d=2, d=3 and mean field theory lattices.

  4. An introduction to Monte Carlo methods

    Science.gov (United States)

    Walter, J.-C.; Barkema, G. T.

    2015-01-01

    Monte Carlo simulations are methods for simulating statistical systems. The aim is to generate a representative ensemble of configurations to access thermodynamical quantities without the need to solve the system analytically or to perform an exact enumeration. The main principles of Monte Carlo simulations are ergodicity and detailed balance. The Ising model is a lattice spin system with nearest neighbor interactions that is appropriate to illustrate different examples of Monte Carlo simulations. It displays a second order phase transition between disordered (high temperature) and ordered (low temperature) phases, leading to different strategies of simulations. The Metropolis algorithm and the Glauber dynamics are efficient at high temperature. Close to the critical temperature, where the spins display long range correlations, cluster algorithms are more efficient. We introduce the rejection free (or continuous time) algorithm and describe in detail an interesting alternative representation of the Ising model using graphs instead of spins with the so-called Worm algorithm. We conclude with an important discussion of dynamical effects such as thermalization and correlation time.
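
    A minimal single-spin-flip simulation of the 2D Ising model with a choice between the Metropolis and Glauber acceptance rules mentioned above; both satisfy detailed balance with respect to the Boltzmann distribution. This is an illustrative sketch with arbitrary lattice size and temperature, not code from the article.

        # Single-spin-flip dynamics for the 2D Ising model (Metropolis or Glauber).
        import numpy as np

        rng = np.random.default_rng(4)
        L, beta, rule = 32, 0.35, "metropolis"
        s = rng.choice([-1, 1], size=(L, L))

        def accept(dE):
            if rule == "metropolis":
                return dE <= 0 or rng.random() < np.exp(-beta * dE)
            return rng.random() < 1.0 / (1.0 + np.exp(beta * dE))   # Glauber (heat bath)

        for step in range(200_000):
            i, j = rng.integers(L), rng.integers(L)
            nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
            dE = 2.0 * s[i, j] * nb          # energy change if spin (i, j) is flipped
            if accept(dE):
                s[i, j] = -s[i, j]

        print("magnetization per spin:", s.mean())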

  5. Monte Carlo modeling of spatially complex wrist tissue for the optimization of optical pulse oximeters

    Science.gov (United States)

    Robinson, Mitchell; Butcher, Ryan; Coté, Gerard L.

    2017-02-01

    Monte Carlo modeling of photon propagation has been used in the examination of particular areas of the body to further enhance the understanding of light propagation through tissue. This work seeks to improve upon the established simulation methods through more accurate representations of the simulated tissues in the wrist as well as the characteristics of the light source. The Monte Carlo simulation program was developed using Matlab. Generation of different tissue domains, such as muscle, vasculature, and bone, was performed in Solidworks, where each domain was saved as a separate .stl file that was read into the program. The light source was altered to give considerations to both viewing angle of the simulated LED as well as the nominal diameter of the source. It is believed that the use of these more accurate models generates results that more closely match those seen in-vivo, and can be used to better guide the design of optical wrist-worn measurement devices.

  6. ISAJET: a Monte Carlo event generator for pp and anti pp interactions. Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Paige, F.E.; Protopopescu, S.D.

    1982-09-01

    ISAJET is a Monte Carlo computer program which simulates pp and anti pp reactions at high energy. It can generate minimum bias events representative of the total inelastic cross section, high PT hadronic events, and Drell-Yan events with a virtual {gamma}, W{sup +-}, or Z{sup 0}. It is based on perturbative QCD and phenomenological models for jet fragmentation.

  7. The CCFM Monte Carlo generator CASCADE Version 2.2.03

    Energy Technology Data Exchange (ETDEWEB)

    Jung, H. [DESY, Hamburg (Germany); University of Antwerp, Antwerp (Belgium); Baranov, S. [Lebedev Physics Institute, Moscow (Russian Federation); Deak, M. [University of Madrid, Instituto de Fisica Teorica UAM/CSIC, Madrid (Spain); Grebenyuk, A.; Hentschinski, M.; Knutsson, A.; Kraemer, M. [DESY, Hamburg (Germany); Hautmann, F. [University of Oxford, Oxford (United Kingdom); Kutak, K. [University of Antwerp, Antwerp (Belgium); Lipatov, A.; Zotov, N. [Moscow State University, SINP, Moscow (Russian Federation)

    2010-12-15

    Cascade is a full hadron level Monte Carlo event generator for ep, {gamma}p and p anti p and pp processes, which uses the CCFM evolution equation for the initial state cascade in a backward evolution approach supplemented with off-shell matrix elements for the hard scattering. A detailed program description is given, with emphasis on parameters the user wants to change and common block variables which completely specify the generated events. (orig.)

  8. Quasi-Monte Carlo methods for lattice systems: A first look

    Science.gov (United States)

    Jansen, K.; Leovey, H.; Ammon, A.; Griewank, A.; Müller-Preussker, M.

    2014-03-01

    We investigate the applicability of quasi-Monte Carlo methods to Euclidean lattice systems for quantum mechanics in order to improve the asymptotic error behavior of observables for such theories. In most cases the error of an observable calculated by averaging over random observations generated from an ordinary Markov chain Monte Carlo simulation behaves like N^{-1/2}, where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this behavior for certain problems to N^{-1}, or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling. Catalogue identifier: AERJ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERJ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence version 3 No. of lines in distributed program, including test data, etc.: 67759 No. of bytes in distributed program, including test data, etc.: 2165365 Distribution format: tar.gz Programming language: C and C++. Computer: PC. Operating system: Tested on GNU/Linux, should be portable to other operating systems with minimal efforts. Has the code been vectorized or parallelized?: No RAM: The memory usage directly scales with the number of samples and dimensions: Bytes used = "number of samples" × "number of dimensions" × 8 Bytes (double precision). Classification: 4.13, 11.5, 23. External routines: FFTW 3 library (http://www.fftw.org) Nature of problem: Certain physical models formulated as a quantum field theory through the Feynman path integral, such as quantum chromodynamics, require a non-perturbative treatment of the path integral. The only known approach that achieves this is the lattice regularization. In this formulation the path integral is discretized to a finite, but very high dimensional integral. So far only Monte
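
    The error-scaling contrast described above can be illustrated with a self-contained sketch that compares pseudo-random sampling with a simple Halton (quasi-random) sequence on a smooth two-dimensional integral; the integrand and sample sizes are assumptions chosen only for illustration.

        # Plain Monte Carlo vs a simple Halton sequence for a smooth 2-D integral;
        # for regular integrands the quasi-random error decays close to N^-1.
        import numpy as np

        def halton(n, base):
            """First n points of the van der Corput sequence in the given base."""
            seq = np.zeros(n)
            for i in range(n):
                f, k, x = 1.0, i + 1, 0.0
                while k > 0:
                    f /= base
                    x += f * (k % base)
                    k //= base
                seq[i] = x
            return seq

        f = lambda u, v: np.exp(u) * np.cos(np.pi * v) ** 2    # exact value: (e-1)/2
        exact = (np.e - 1.0) / 2.0
        rng = np.random.default_rng(5)

        for N in (1_000, 10_000, 100_000):
            u, v = rng.random(N), rng.random(N)
            qu, qv = halton(N, 2), halton(N, 3)
            print(N,
                  "MC error:", abs(f(u, v).mean() - exact),
                  "QMC error:", abs(f(qu, qv).mean() - exact))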

  9. Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method

    CERN Document Server

    2002-01-01

    This report condenses basic theories and advanced applications of neutron/gamma ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and cross section libraries used in continuous energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, design of ITER, experiment analyses of fast critical assembly, core analyses of JMTR, simulation of pulsed neutron experiment, core analyses of HTTR, duct streaming calculations, bulk shielding calculations, neutron/gamma ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes, and parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.

  10. Belo Monte hydropower project: current studies; AHE Belo Monte: os estudos atuais

    Energy Technology Data Exchange (ETDEWEB)

    Figueira Netto, Carlos Alberto de Moya [CNEC Engenharia S.A., Sao Paulo, SP (Brazil); Rezende, Paulo Fernando Vieira Souto [Centrais Eletricas Brasileiras S.A. (ELETROBRAS), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    This article presents the evolution of the studies of the Belo Monte Hydro Power Project (HPP) from the initial inventory studies of the Xingu River in 1979 to the current studies concluding the Technical, Economic and Environmental Feasibility Studies of the Belo Monte Hydro Power Project, as authorized by the Brazilian National Congress. The current studies characterize the Belo Monte HPP with an installed capacity of 11,181.3 MW (20 units of 550 MW in the main power house and 7 units of 25.9 MW in the additional power house), connected to the Brazilian Interconnected Power Grid, allowing the generation of 4,796 mean MW of firm energy without depending on any flow rate regularization of the upstream Xingu River, flooding only 441 km{sup 2}, of which approximately 200 km{sup 2} correspond to the normal annual wet season flooding of the Xingu River. (author)

  11. Thermodynamic properties of van der Waals fluids from Monte Carlo simulations and perturbative Monte Carlo theory.

    Science.gov (United States)

    Díez, A; Largo, J; Solana, J R

    2006-08-21

    Computer simulations have been performed for fluids with van der Waals potential, that is, hard spheres with attractive inverse power tails, to determine the equation of state and the excess energy. On the other hand, the first- and second-order perturbative contributions to the energy and the zero- and first-order perturbative contributions to the compressibility factor have been determined too from Monte Carlo simulations performed on the reference hard-sphere system. The aim was to test the reliability of this "exact" perturbation theory. It has been found that the results obtained from the Monte Carlo perturbation theory for these two thermodynamic properties agree well with the direct Monte Carlo simulations. Moreover, it has been found that results from the Barker-Henderson [J. Chem. Phys. 47, 2856 (1967)] perturbation theory are in good agreement with those from the exact perturbation theory.

  12. Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce.

    Science.gov (United States)

    Pratx, Guillem; Xing, Lei

    2011-12-01

    Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively-parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258 × speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes.
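
    The sketch below mimics the map/reduce split described above with Python's multiprocessing module rather than Hadoop: "map" tasks simulate independent batches of photon histories in a homogeneous scattering medium and return partial absorption tallies, and the "reduce" step sums them. The optical coefficients, batch sizes and weight cut-off are assumptions; this is not the ported MC321 code.

        # Map/reduce-style Monte Carlo photon tally with multiprocessing.
        import math
        import numpy as np
        from multiprocessing import Pool

        MU_A, MU_S = 0.5, 1.0        # absorption and scattering coefficients (1/cm), assumed
        NBINS, DR = 40, 0.25         # radial tally bins

        def map_task(args):
            """'Map': simulate a batch of photon histories, return an absorption histogram."""
            seed, n_photons = args
            rng = np.random.default_rng(seed)
            hist = np.zeros(NBINS)
            mu_t, albedo = MU_A + MU_S, MU_S / (MU_A + MU_S)
            for _ in range(n_photons):
                x = y = z = 0.0
                w = 1.0
                while w > 1e-4:
                    step = rng.exponential(1.0 / mu_t)
                    cos_t = 2.0 * rng.random() - 1.0          # isotropic scattering
                    sin_t = math.sqrt(1.0 - cos_t * cos_t)
                    phi = 2.0 * math.pi * rng.random()
                    x += step * sin_t * math.cos(phi)
                    y += step * sin_t * math.sin(phi)
                    z += step * cos_t
                    r_bin = min(int(math.sqrt(x*x + y*y + z*z) / DR), NBINS - 1)
                    hist[r_bin] += w * (1.0 - albedo)         # deposit the absorbed fraction
                    w *= albedo
            return hist

        if __name__ == "__main__":
            jobs = [(seed, 2000) for seed in range(8)]
            with Pool(4) as pool:                # "map" phase on 4 worker processes
                parts = pool.map(map_task, jobs)
            absorbed = np.sum(parts, axis=0)     # "reduce" phase: sum the partial tallies
            print("total absorbed weight:", absorbed.sum(), "of", sum(n for _, n in jobs))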

  13. Direct Monte Carlo Method Simulation of the Synthesis of Carbon Particle Through Coagulation in the Detonation of Explosives

    Institute of Scientific and Technical Information of China (English)

    马峰; 恽寿榕; 黄风雷

    2003-01-01

    A model is constructed and used in computing the coagulation probability of free carbon during the detonation of explosives. A direct simulation Monte Carlo (DSMC) program is constructed to simulate the coagulation of free carbon particles. The evaluation of the distribution spectrum of particles in the system is obtained. The simulation result is consistent with the experimental curve.

  14. SIMIND Monte Carlo simulation of a single photon emission CT

    Directory of Open Access Journals (Sweden)

    Bahreyni Toossi M

    2010-01-01

    In this study, we simulated a Siemens E.CAM SPECT system using the SIMIND Monte Carlo program to acquire its experimental characterization in terms of energy resolution, sensitivity, spatial resolution and imaging of phantoms using 99m Tc. The experimental and simulation data for SPECT imaging were acquired from a point source and a Jaszczak phantom. Verification of the simulation was done by comparing two sets of images and related data obtained from the actual and simulated systems. Image quality was assessed by comparing image contrast and resolution. Simulated and measured energy spectra (with or without a collimator) and spatial resolution from point sources in air were compared. The resulting energy spectra present similar peaks for the gamma energy of 99m Tc at 140 keV. The FWHM was calculated to be 14.01 keV for the simulation and 13.80 keV for the experimental data, corresponding to energy resolutions of 10.01% and 9.86%, respectively, compared to the specified 9.9% for both systems. Sensitivities of the real and virtual gamma cameras were calculated to be 85.11 and 85.39 cps/MBq, respectively. The energy spectra of both simulated and real gamma cameras were matched. Images obtained from the Jaszczak phantom, experimentally and by simulation, showed similarity in contrast and resolution. SIMIND Monte Carlo could successfully simulate the Siemens E.CAM gamma camera. The results validate the use of the simulated system for further investigation, including modification, planning, and developing a SPECT system to improve the quality of images.

  15. Venus - Maxwell Montes and Cleopatra Crater

    Science.gov (United States)

    1991-01-01

    This Magellan full-resolution image shows Maxwell Montes, and is centered at 65 degrees north latitude and 6 degrees east longitude. Maxwell is the highest mountain on Venus, rising almost 11 kilometers (6.8 miles) above mean planetary radius. The western slopes (on the left) are very steep, whereas the eastern slopes descend gradually into Fortuna Tessera. The broad ridges and valleys making up Maxwell and Fortuna suggest that the topography resulted from compression. Most of Maxwell Montes has a very bright radar return; such bright returns are common on Venus at high altitudes. This phenomenon is thought to result from the presence of a radar reflective mineral such as pyrite. Interestingly, the highest area on Maxwell is less bright than the surrounding slopes, suggesting that the phenomenon is limited to a particular elevation range. The pressure, temperature, and chemistry of the atmosphere vary with altitude; the material responsible for the bright return probably is only stable in a particular range of atmospheric conditions and therefore a particular elevation range. The prominent circular feature in eastern Maxwell is Cleopatra. Cleopatra is a double-ring impact basin about 100 kilometers (62 miles) in diameter and 2.5 kilometers (1.5 miles) deep. A steep-walled, winding channel a few kilometers wide breaks through the rough terrain surrounding the crater rim. A large amount of lava originating in Cleopatra flowed through this channel and filled valleys in Fortuna Tessera. Cleopatra is superimposed on the structures of Maxwell Montes and appears to be undeformed, indicating that Cleopatra is relatively young.

  16. A Monte Carlo algorithm for degenerate plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Turrell, A.E., E-mail: a.turrell09@imperial.ac.uk; Sherlock, M.; Rose, S.J.

    2013-09-15

    A procedure for performing Monte Carlo calculations of plasmas with an arbitrary level of degeneracy is outlined. It has possible applications in inertial confinement fusion and astrophysics. Degenerate particles are initialised according to the Fermi–Dirac distribution function, and scattering is via a Pauli blocked binary collision approximation. The algorithm is tested against degenerate electron–ion equilibration, and the degenerate resistivity transport coefficient from unmagnetised first order transport theory. The code is applied to the cold fuel shell and alpha particle equilibration problem of inertial confinement fusion.
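
    Initialising particles according to a Fermi-Dirac distribution, as described above, can be done with simple rejection sampling; the sketch below draws kinetic energies from a density proportional to sqrt(E)/(exp((E-mu)/T)+1) in arbitrary units. The chemical potential, temperature and energy cut-off are assumptions, and the code is not taken from the paper.

        # Rejection sampling of energies from a Fermi-Dirac distribution
        # (3-D density of states ~ sqrt(E)), arbitrary units.
        import numpy as np

        rng = np.random.default_rng(6)
        mu, T, E_max = 1.0, 0.1, 5.0            # chemical potential, temperature (assumed)

        def fd_density(E):
            return np.sqrt(E) / (np.exp((E - mu) / T) + 1.0)

        g_max = fd_density(np.linspace(1e-6, E_max, 4000)).max()   # envelope height

        def sample_energies(n):
            out = np.empty(0)
            while out.size < n:
                E = rng.uniform(0.0, E_max, size=2 * n)
                keep = rng.uniform(0.0, g_max, size=2 * n) < fd_density(E)
                out = np.concatenate([out, E[keep]])
            return out[:n]

        E = sample_energies(100_000)
        print("mean energy:", E.mean(), " (degenerate limit ~ 3/5 * mu =", 0.6 * mu, ")")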

  17. A note on simultaneous Monte Carlo tests

    DEFF Research Database (Denmark)

    Hahn, Ute

    In this short note, Monte Carlo tests of goodness of fit for data of the form X(t), t ∈ I are considered, that reject the null hypothesis if X(t) leaves an acceptance region bounded by an upper and lower curve for some t in I. A construction of the acceptance region is proposed that complies with a given target level of rejection and yields exact p-values. The construction is based on pointwise quantiles, estimated from simulated realizations of X(t) under the null hypothesis.
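
    A generic sketch of a pointwise-quantile acceptance region in this spirit: the pointwise level is tuned on null simulations so that the global rejection rate matches the target level. It is only an illustration of the idea (the note's exact construction and exact p-values are not reproduced), and the Gaussian random-walk curves are assumptions.

        # Pointwise-quantile envelope for a functional Monte Carlo goodness-of-fit test.
        import numpy as np

        rng = np.random.default_rng(7)
        M, npts, target = 999, 100, 0.05
        null_curves = np.cumsum(rng.normal(size=(M, npts)), axis=1)    # X(t) under H0
        data_curve = np.cumsum(rng.normal(size=npts))                  # observed X(t)

        def global_rate(alpha_pt):
            lo = np.quantile(null_curves, alpha_pt / 2, axis=0)
            hi = np.quantile(null_curves, 1 - alpha_pt / 2, axis=0)
            exits = ((null_curves < lo) | (null_curves > hi)).any(axis=1)
            return exits.mean(), lo, hi

        # tune the pointwise level so the envelope has ~5% global type I error
        alphas = np.linspace(1e-4, target, 200)
        alpha_pt = max(a for a in alphas if global_rate(a)[0] <= target)
        rate, lo, hi = global_rate(alpha_pt)
        reject = ((data_curve < lo) | (data_curve > hi)).any()
        print(f"pointwise level {alpha_pt:.4f} -> global level {rate:.3f}, reject H0: {reject}")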

  18. Archimedes, the Free Monte Carlo simulator

    CERN Document Server

    Sellier, Jean Michel D

    2012-01-01

    Archimedes is the GNU package for Monte Carlo simulations of electron transport in semiconductor devices. The first release appeared in 2004 and since then it has been improved with many new features like quantum corrections, magnetic fields, new materials, GUI, etc. This document represents the first attempt to have a complete manual. Many of the Physics models implemented are described and a detailed description is presented to make the user able to write his/her own input deck. Please, feel free to contact the author if you want to contribute to the project.

  19. Cluster hybrid Monte Carlo simulation algorithms

    Science.gov (United States)

    Plascak, J. A.; Ferrenberg, Alan M.; Landau, D. P.

    2002-06-01

    We show that addition of Metropolis single spin flips to the Wolff cluster-flipping Monte Carlo procedure leads to a dramatic increase in performance for the spin-1/2 Ising model. We also show that adding Wolff cluster flipping to the Metropolis or heat bath algorithms in systems where just cluster flipping is not immediately obvious (such as the spin-3/2 Ising model) can substantially reduce the statistical errors of the simulations. A further advantage of these methods is that systematic errors introduced by the use of imperfect random-number generation may be largely healed by hybridizing single spin flips with cluster flipping.

  20. Introduction to Cluster Monte Carlo Algorithms

    Science.gov (United States)

    Luijten, E.

    This chapter provides an introduction to cluster Monte Carlo algorithms for classical statistical-mechanical systems. A brief review of the conventional Metropolis algorithm is given, followed by a detailed discussion of the lattice cluster algorithm developed by Swendsen and Wang and the single-cluster variant introduced by Wolff. For continuum systems, the geometric cluster algorithm of Dress and Krauth is described. It is shown how their geometric approach can be generalized to incorporate particle interactions beyond hardcore repulsions, thus forging a connection between the lattice and continuum approaches. Several illustrative examples are discussed.
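
    As an illustration of the single-cluster (Wolff) variant mentioned above, the sketch below grows a cluster of aligned spins, adding each aligned neighbour with probability 1 - exp(-2*beta*J), and flips the whole cluster. Lattice size, coupling and temperature are arbitrary choices, not values from the chapter.

        # Minimal Wolff single-cluster update for the 2D Ising model.
        import numpy as np
        from collections import deque

        rng = np.random.default_rng(8)
        L, beta, J = 32, 0.45, 1.0
        s = rng.choice([-1, 1], size=(L, L))
        p_add = 1.0 - np.exp(-2.0 * beta * J)    # bond-activation probability

        def wolff_step(s):
            i, j = rng.integers(L), rng.integers(L)
            seed_spin = s[i, j]
            cluster = {(i, j)}
            queue = deque([(i, j)])
            while queue:
                x, y = queue.popleft()
                for nx, ny in ((x+1) % L, y), ((x-1) % L, y), (x, (y+1) % L), (x, (y-1) % L):
                    if (nx, ny) not in cluster and s[nx, ny] == seed_spin and rng.random() < p_add:
                        cluster.add((nx, ny))
                        queue.append((nx, ny))
            for x, y in cluster:                 # flip the whole cluster
                s[x, y] = -seed_spin
            return s

        for _ in range(1000):
            s = wolff_step(s)
        print("magnetization per spin:", s.mean())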

  1. Monte Carlo simulation for the transport beamline

    Energy Technology Data Exchange (ETDEWEB)

    Romano, F.; Cuttone, G.; Jia, S. B.; Varisano, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania (Italy); Attili, A.; Marchetto, F.; Russo, G. [INFN, Sezione di Torino, Via P.Giuria, 1 10125 Torino (Italy); Cirrone, G. A. P.; Schillaci, F.; Scuderi, V. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Institute of Physics Czech Academy of Science, ELI-Beamlines project, Na Slovance 2, Prague (Czech Republic); Carpinelli, M. [INFN Sezione di Cagliari, c/o Dipartimento di Fisica, Università di Cagliari, Cagliari (Italy); Tramontana, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Università di Catania, Dipartimento di Fisica e Astronomia, Via S. Sofia 64, Catania (Italy)

    2013-07-26

    In the framework of the ELIMED project, Monte Carlo (MC) simulations are widely used to study the physical transport of charged particles generated by laser-target interactions and to preliminarily evaluate fluence and dose distributions. An energy selection system and the experimental setup for the TARANIS laser facility in Belfast (UK) have already been simulated with the GEANT4 (GEometry ANd Tracking) MC toolkit. Preliminary results are reported here. Future developments are planned to implement MC-based 3D treatment planning in order to optimize the number of shots and the dose delivery.

  2. Mosaic crystal algorithm for Monte Carlo simulations

    CERN Document Server

    Seeger, P A

    2002-01-01

    An algorithm is presented for calculating reflectivity, absorption, and scattering of mosaic crystals in Monte Carlo simulations of neutron instruments. The algorithm uses multi-step transport through the crystal with an exact solution of the Darwin equations at each step. It relies on the kinematical model for Bragg reflection (with parameters adjusted to reproduce experimental data). For computation of thermal effects (the Debye-Waller factor and coherent inelastic scattering), an expansion of the Debye integral as a rapidly converging series of exponential terms is also presented. Any crystal geometry and plane orientation may be treated. The algorithm has been incorporated into the neutron instrument simulation package NISP. (orig.)

  3. Diffusion quantum Monte Carlo for molecules

    Energy Technology Data Exchange (ETDEWEB)

    Lester, W.A. Jr.

    1986-07-01

    A quantum mechanical Monte Carlo method has been used for the treatment of molecular problems. The imaginary-time Schroedinger equation written with a shift in zero energy (E{sub T} - V(R)) can be interpreted as a generalized diffusion equation with a position-dependent rate or branching term. Since diffusion is the continuum limit of a random walk, one may simulate the Schroedinger equation with a function psi (note, not psi{sup 2}) as a density of "walks." The walks undergo an exponential birth and death as given by the rate term. 16 refs., 2 tabs.
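
    A toy version of the diffusion-plus-branching picture described above, for the one-dimensional harmonic oscillator (hbar = m = omega = 1): walkers diffuse with Gaussian steps and are replicated or killed according to exp(-(V - E_T) dt), and a growth estimator tracks the ground-state energy (exactly 0.5 for this potential). Time step, population target and feedback gain are assumptions.

        # Toy diffusion Monte Carlo for the 1-D harmonic oscillator.
        import numpy as np

        rng = np.random.default_rng(9)
        dt, n_target, n_steps = 0.01, 2000, 4000
        walkers = rng.normal(size=n_target)          # initial walker positions
        E_T = 0.5 * np.mean(walkers**2)              # initial trial (reference) energy
        E_samples = []

        for step in range(n_steps):
            walkers = walkers + rng.normal(scale=np.sqrt(dt), size=walkers.size)  # diffusion
            V = 0.5 * walkers**2
            weights = np.exp(-(V - E_T) * dt)                    # birth/death rate term
            E_growth = E_T - np.log(weights.mean()) / dt         # growth estimator of E0
            copies = (weights + rng.random(walkers.size)).astype(int)
            walkers = np.repeat(walkers, copies)                 # branching step
            E_T = E_growth + 0.1 * np.log(n_target / max(walkers.size, 1))  # population control
            if step > n_steps // 2:
                E_samples.append(E_growth)

        print("estimated ground-state energy:", np.mean(E_samples), "(exact: 0.5)")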

  4. Exascale Monte Carlo R&D

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Laboratory

    2012-07-24

    Overview of this presentation is (1) Exascale computing - different technologies, getting there; (2) high-performance proof-of-concept MCMini - features and results; and (3) OpenCL toolkit - Oatmeal (OpenCL Automatic Memory Allocation Library) - purpose and features. Despite driver issues, OpenCL seems like a good, hardware agnostic tool. MCMini demonstrates the possibility for GPGPU-based Monte Carlo methods - it shows great scaling for HPC application and algorithmic equivalence. Oatmeal provides a flexible framework to aid in the development of scientific OpenCL codes.

  5. Household water use and conservation models using Monte Carlo techniques

    Science.gov (United States)

    Cahill, R.; Lund, J. R.; DeOreo, B.; Medellín-Azuara, J.

    2013-10-01

    The increased availability of end use measurement studies allows for mechanistic and detailed approaches to estimating household water demand and conservation potential. This study simulates water use in a single-family residential neighborhood using end-water-use parameter probability distributions generated from Monte Carlo sampling. This model represents existing water use conditions in 2010 and is calibrated to 2006-2011 metered data. A two-stage mixed integer optimization model is then developed to estimate the least-cost combination of long- and short-term conservation actions for each household. This least-cost conservation model provides an estimate of the upper bound of reasonable conservation potential for varying pricing and rebate conditions. The models were adapted from previous work in Jordan and are applied to a neighborhood in San Ramon, California, in the eastern San Francisco Bay Area. The existing conditions model produces seasonal use results very close to the metered data. The least-cost conservation model suggests clothes washer rebates are among the most cost-effective rebate programs for indoor uses. Retrofit of faucets and toilets is also cost-effective and holds the highest potential for water savings from indoor uses. This mechanistic modeling approach can improve understanding of water demand and estimate the cost-effectiveness of water conservation programs.

  6. State-of-the-art Monte Carlo 1988

    Energy Technology Data Exchange (ETDEWEB)

    Soran, P.D.

    1988-06-28

    Particle transport calculations in highly dimensional and physically complex geometries, such as detector calibration, radiation shielding, space reactors, and oil-well logging, generally require Monte Carlo transport techniques. Monte Carlo particle transport can be performed on a variety of computers ranging from APOLLOs to VAXs. Some of the hardware and software developments, which now permit Monte Carlo methods to be routinely used, are reviewed in this paper. The development of inexpensive, large, fast computer memory, coupled with fast central processing units, permits Monte Carlo calculations to be performed on workstations, minicomputers, and supercomputers. The Monte Carlo renaissance is further aided by innovations in computer architecture and software development. Advances in vectorization and parallelization architecture have resulted in the development of new algorithms which have greatly reduced processing times. Finally, the renewed interest in Monte Carlo has spawned new variance reduction techniques which are being implemented in large computer codes. 45 refs.

  7. Monte Carlo Simulations: Number of Iterations and Accuracy

    Science.gov (United States)

    2015-07-01

    [Abstract not recoverable from the source record; only report front matter was extracted. The technical note (ARL-TN-0684, US Army Research Laboratory, July 2015) discusses how the number of Monte Carlo iterations relates to the accuracy of the resulting estimates.]
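
    A standard way of relating the number of Monte Carlo iterations to the accuracy of the estimate is to grow the sample until the 95% confidence half-width, roughly 1.96 s / sqrt(N), falls below a tolerance; the sketch below does this for a stand-in simulation. The distribution, batch size and tolerance are assumptions, and this is not the code of the ARL note.

        # Grow the Monte Carlo sample until the confidence half-width meets a tolerance.
        import numpy as np

        rng = np.random.default_rng(10)
        simulate = lambda n: rng.exponential(2.0, n)   # stand-in for one batch of MC draws

        tol, batch = 0.01, 1000                         # target half-width and batch size
        samples = simulate(batch)
        while 1.96 * samples.std(ddof=1) / np.sqrt(samples.size) > tol:
            samples = np.concatenate([samples, simulate(batch)])

        half_width = 1.96 * samples.std(ddof=1) / np.sqrt(samples.size)
        print(f"N = {samples.size}, mean = {samples.mean():.4f} +/- {half_width:.4f} (95% CI)")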

  8. Discrete diffusion Monte Carlo for frequency-dependent radiative transfer

    Energy Technology Data Exchange (ETDEWEB)

    Densmore, Jeffrey D [Los Alamos National Laboratory; Kelly, Thompson G [Los Alamos National Laboratory; Urbatish, Todd J [Los Alamos National Laboratory

    2010-11-17

    Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations. In this paper, we develop an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold. Above this threshold we employ standard Monte Carlo. With a frequency-dependent test problem, we confirm the increased efficiency of our new DDMC technique.

  9. Alternative Monte Carlo Approach for General Global Illumination

    Institute of Scientific and Technical Information of China (English)

    徐庆; 李朋; 徐源; 孙济洲

    2004-01-01

    An alternative Monte Carlo strategy for the computation of the global illumination problem was presented. The proposed approach provided a new and optimal way of solving Monte Carlo global illumination based on the zero-variance importance sampling procedure. A new importance-driven Monte Carlo global illumination algorithm in the framework of the new computing scheme was developed and implemented. Results, which were obtained by rendering test scenes, show that this new framework and the newly derived algorithm are effective and promising.

  10. Validation of Compton Scattering Monte Carlo Simulation Models

    CERN Document Server

    Weidenspointner, Georg; Hauf, Steffen; Hoff, Gabriela; Kuster, Markus; Pia, Maria Grazia; Saracco, Paolo

    2014-01-01

    Several models for the Monte Carlo simulation of Compton scattering on electrons are quantitatively evaluated with respect to a large collection of experimental data retrieved from the literature. Some of these models are currently implemented in general purpose Monte Carlo systems; some have been implemented and evaluated for possible use in Monte Carlo particle transport for the first time in this study. Here we present first and preliminary results concerning total and differential Compton scattering cross sections.

  11. Applications of the Monte Carlo simulation in dosimetry and medical physics problems; Aplicaciones de la simulacion Monte Carlo en dosimetria y problemas de fisica medica

    Energy Technology Data Exchange (ETDEWEB)

    Rojas C, E. L., E-mail: leticia.rojas@inin.gob.m [ININ, Gerencia de Ciencias Ambientales, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2010-07-01

    At present, the use of computers to solve important problems extends to all areas, whether social, economic, engineering, or basic and applied science. With appropriate handling of computational programs and information, calculations and simulations of real models can be carried out in order to study them and to solve theoretical or applied problems. Processes that contain random variables are susceptible to being approached with the Monte Carlo method. This is a numerical method that, thanks to improvements in computer processors, can now be applied to many more tasks than at the beginning of its practical application (in the early 1950s). In this work, the application of the Monte Carlo method to the simulation of the interaction of radiation with matter is addressed, in order to investigate dosimetric aspects of some problems in the medical physics area. An introduction reviewing some historical data and general concepts related to Monte Carlo simulation is also included. (Author)

  12. A kinetic Monte Carlo study of desorption of H2 from graphite (0001)

    CERN Document Server

    Gavardi, E; Hornekaer, L; 10.1016/j.cplett.2009.07.003

    2009-01-01

    The formation of H2 in the interstellar medium proceeds on the surfaces of silicate or carbonaceous particles. To gain deeper insight into its formation on the latter substrate, this letter focuses on H2 desorption from graphite (0001) in Temperature-Programmed-Desorption Monte-Carlo simulations. The results are compared to experimental results which show two main peaks and an intermediate shoulder for high initial coverage. The simulation program includes barriers obtained by ab-initio methods and is further optimised to match two independent experimental observations. The simulations reproduce the two experimentally observed desorption peaks. Additionally, a possible origin of the intermediate peak is given.
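
    The sketch below is a minimal kinetic Monte Carlo model of first-order temperature-programmed desorption with an Arrhenius rate k(T) = nu exp(-E/(kB T)) and a linear heating ramp, treating the rate as constant over each event interval. It only illustrates the simulation style discussed above; the prefactor, barrier and heating rate are assumptions and the actual ab-initio barriers and lateral interactions of the paper are not included.

        # Kinetic Monte Carlo sketch of first-order temperature-programmed desorption.
        import numpy as np

        rng = np.random.default_rng(11)
        kB = 8.617e-5                 # Boltzmann constant (eV/K)
        nu, E_des = 1e13, 1.0         # attempt frequency (1/s) and barrier (eV), assumed
        T0, ramp = 300.0, 1.0         # start temperature (K) and heating rate (K/s)
        N = 10_000                    # number of adsorbed molecules

        t, desorbed_T = 0.0, []
        while N > 0:
            T = T0 + ramp * t
            total_rate = N * nu * np.exp(-E_des / (kB * T))   # total desorption rate
            t += rng.exponential(1.0 / total_rate)            # waiting time to next event
            N -= 1
            desorbed_T.append(T0 + ramp * t)

        hist, edges = np.histogram(desorbed_T, bins=60)
        print("peak desorption temperature ~%.0f K" % edges[np.argmax(hist)])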

  13. Three dimensional Monte-Carlo modeling of laser-tissue interaction

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, N A; Kim, B M; London, R A; Trauner, K B

    1999-03-12

    A full three dimensional Monte-Carlo program was developed for the analysis of laser-tissue interactions. This project was performed as a part of the LATIS3D (3-D Laser-Tissue Interaction) project. The accuracy was verified against results from a public domain two dimensional axisymmetric program. The code was used for simulation of light transport in a simplified human knee geometry. Using real human knee meshes, which will be extracted from MRI images in the near future, a full analysis of dosimetry and surgical strategies for photodynamic therapy of rheumatoid arthritis will follow.

  14. Monte Carlo simulation for the micellar behavior of amphiphilic comb-like copolymers

    Institute of Scientific and Technical Information of China (English)

    冯莺; 隋家贤; 赵季若; 陈欣方

    2000-01-01

    Micellar behaviors in 2D and 3D lattice models of amphiphilic comb-like copolymers in a water phase and in water/oil mixtures were simulated. A dynamical algorithm together with chain reptation movements was used in the simulation. A three-dimensional display program was written and the free energy was estimated by the Monte Carlo technique. The results demonstrate that the reduced interaction energy greatly influences the morphological structures of micelle and emulsion systems; 3D simulation can display more direct images of the morphological structures; and the amphiphilic comb-like polymers with a hydrophobic main chain and hydrophilic side chains have lower energy in water than in oil.

  15. Application of the direct simulation Monte Carlo method to the full shuttle geometry

    Science.gov (United States)

    Bird, G. A.

    1990-01-01

    A new set of programs has been developed for the application of the direct simulation Monte Carlo (or DSMC) method to rarefied gas flows with complex three-dimensional boundaries. The programs are efficient in terms of the computational load and also in terms of the effort required to set up particular cases. This efficiency is illustrated through computations of the flow about the Shuttle Orbiter. The general flow features are illustrated for altitudes from 170 to 100 km. Also, the computed lift-drag ratio during re-entry is compared with flight measurements.

  16. Development of advanced geometric models and acceleration techniques for Monte Carlo simulation in Medical Physics

    OpenAIRE

    Badal Soler, Andreu

    2008-01-01

    General-purpose Monte Carlo simulation codes are currently used in a wide variety of applications. However, the geometric models implemented in most codes impose certain limitations on the shape of the objects that can be defined. These models are not well suited to describing the arbitrary surfaces found in anatomical structures or in certain medical devices and, consequently, some applications that require the use of highly detailed geometric models...

  17. [From the asylums to the community: the reform process of National Colony "Dr. Manuel A. Montes de Oca"].

    Science.gov (United States)

    Rossetto, Jorge

    2009-01-01

    Since 2004, a profound transformation of the asylum care model, characterized by overcrowding, a lack of discharges and the absence of rehabilitation and social reinsertion programs, has been under way at the National Colony "Dr. Manuel A. Montes de Oca". During this period, a plan that contemplates several programs and projects aimed at restoring the rights of institutionalized people with mental disabilities and promoting opportunities for social inclusion has been implemented.

  18. Development of CT scanner models for patient organ dose calculations using Monte Carlo methods

    Science.gov (United States)

    Gu, Jianwei

    There is a serious and growing concern about the CT dose delivered by diagnostic CT examinations or image-guided radiation therapy imaging procedures. To better understand and to accurately quantify radiation dose due to CT imaging, Monte Carlo based CT scanner models are needed. This dissertation describes the development, validation, and application of detailed CT scanner models including a GE LightSpeed 16 MDCT scanner and two image guided radiation therapy (IGRT) cone beam CT (CBCT) scanners, kV CBCT and MV CBCT. The modeling process considered the energy spectrum, beam geometry and movement, and bowtie filter (BTF). The methodology of validating the scanner models using reported CTDI values was also developed and implemented. Finally, the organ doses to different patients undergoing CT scan were obtained by integrating the CT scanner models with anatomically-realistic patient phantoms. The tube current modulation (TCM) technique was also investigated for dose reduction. It was found that for RPI-AM, the thyroid, kidneys and thymus received the largest doses of 13.05, 11.41 and 11.56 mGy/100 mAs from the chest scan, abdomen-pelvis scan and CAP scan, respectively, using 120 kVp protocols. For RPI-AF, the thymus, small intestine and kidneys received the largest doses of 10.28, 12.08 and 11.35 mGy/100 mAs from the chest scan, abdomen-pelvis scan and CAP scan, respectively, using 120 kVp protocols. The dose to the fetus of the 3 month pregnant patient phantom was 0.13 mGy/100 mAs and 0.57 mGy/100 mAs from the chest and kidney scan, respectively. For the chest scan of the 6 month patient phantom and the 9 month patient phantom, the fetal doses were 0.21 mGy/100 mAs and 0.26 mGy/100 mAs, respectively. For MDCT with TCM schemas, the fetal dose can be reduced by 14%-25%. To demonstrate the applicability of the method proposed in this dissertation for modeling the CT scanner, an additional MDCT scanner was modeled and validated by using the measured CTDI values. These results demonstrated that the

  19. Multiple Monte Carlo Testing with Applications in Spatial Point Processes

    DEFF Research Database (Denmark)

    Mrkvička, Tomáš; Myllymäki, Mari; Hahn, Ute

    The rank envelope test (Myllym\"aki et al., Global envelope tests for spatial processes, arXiv:1307.0239 [stat.ME]) is proposed as a solution to the multiple testing problem for Monte Carlo tests. Three different situations are recognized: 1) a few univariate Monte Carlo tests, 2) a Monte Carlo test with a function as the test statistic, 3) several Monte Carlo tests with functions as test statistics. The rank test has correct (global) type I error in each case and it is accompanied with a p-value and with a graphical interpretation which shows which subtest or which distances of the used test function...

  20. THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE

    Energy Technology Data Exchange (ETDEWEB)

    WATERS, LAURIE S. [Los Alamos National Laboratory; MCKINNEY, GREGG W. [Los Alamos National Laboratory; DURKEE, JOE W. [Los Alamos National Laboratory; FENSIN, MICHAEL L. [Los Alamos National Laboratory; JAMES, MICHAEL R. [Los Alamos National Laboratory; JOHNS, RUSSELL C. [Los Alamos National Laboratory; PELOWITZ, DENISE B. [Los Alamos National Laboratory

    2007-01-10

    MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B, and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code tracking neutrons, photons and electrons, and using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and the excellent neutronics capabilities allow new opportunities to simulate devices of interest to experimental particle physics, particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C, and also discusses ongoing code development.

  1. Multi-Index Monte Carlo (MIMC)

    KAUST Repository

    Haji Ali, Abdul Lateef

    2015-01-07

    We propose and analyze a novel Multi-Index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles’s seminal work, instead of using first-order differences as in MLMC, we use in MIMC high-order mixed differences to reduce the variance of the hierarchical differences dramatically. Under standard assumptions on the convergence rates of the weak error, variance and work per sample, the optimal index set turns out to be of Total Degree (TD) type. When using such sets, MIMC yields new and improved complexity results, which are natural generalizations of Giles’s MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence.

  2. Chemical application of diffusion quantum Monte Carlo

    Science.gov (United States)

    Reynolds, P. J.; Lester, W. A., Jr.

    1983-10-01

    The diffusion quantum Monte Carlo (QMC) method gives a stochastic solution to the Schroedinger equation. As an example the singlet-triplet splitting of the energy of the methylene molecule CH2 is given. The QMC algorithm was implemented on the CYBER 205, first as a direct transcription of the algorithm running on our VAX 11/780, and second by explicitly writing vector code for all loops longer than a crossover length C. The speed of the codes relative to one another as a function of C, and relative to the VAX is discussed. Since CH2 has only eight electrons, most of the loops in this application are fairly short. The longest inner loops run over the set of atomic basis functions. The CPU time dependence obtained versus the number of basis functions is discussed and compared with that obtained from traditional quantum chemistry codes and that obtained from traditional computer architectures. Finally, preliminary work on restructuring the algorithm to compute the separate Monte Carlo realizations in parallel is discussed.

  3. Multi-Index Monte Carlo (MIMC)

    KAUST Repository

    Haji Ali, Abdul Lateef

    2016-01-06

    We propose and analyze a novel Multi-Index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles's seminal work, instead of using first-order differences as in MLMC, we use in MIMC high-order mixed differences to reduce the variance of the hierarchical differences dramatically. Under standard assumptions on the convergence rates of the weak error, variance and work per sample, the optimal index set turns out to be of Total Degree (TD) type. When using such sets, MIMC yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence, O(TOL^-2).

  4. Discrete range clustering using Monte Carlo methods

    Science.gov (United States)

    Chatterji, G. B.; Sridhar, B.

    1993-01-01

    For automatic obstacle avoidance guidance during rotorcraft low altitude flight, a reliable model of the nearby environment is needed. Such a model may be constructed by applying surface fitting techniques to the dense range map obtained by active sensing using radars. However, for covertness, passive sensing techniques using electro-optic sensors are desirable. As opposed to the dense range map obtained via active sensing, passive sensing algorithms produce reliable range at sparse locations, and therefore, surface fitting techniques to fill the gaps in the range measurement are not directly applicable. Both for automatic guidance and as a display for aiding the pilot, these discrete ranges need to be grouped into sets which correspond to objects in the nearby environment. The focus of this paper is on using Monte Carlo methods for clustering range points into meaningful groups. One of the aims of the paper is to explore whether simulated annealing methods offer significant advantage over the basic Monte Carlo method for this class of problems. We compare three different approaches and present application results of these algorithms to a laboratory image sequence and a helicopter flight sequence.
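
    One simple Monte Carlo strategy of the kind compared above is simulated-annealing assignment of the sparse range points to a fixed number of groups: a random reassignment is accepted with the Metropolis rule at a slowly decreasing temperature, with the within-cluster sum of squares as the cost. The synthetic points, number of groups and cooling schedule below are assumptions for illustration only.

        # Simulated-annealing clustering of 2-D points into k groups.
        import numpy as np

        rng = np.random.default_rng(14)
        pts = np.vstack([rng.normal([0, 0], 0.3, (30, 2)),
                         rng.normal([3, 1], 0.3, (30, 2)),
                         rng.normal([1, 4], 0.3, (30, 2))])
        k = 3
        labels = rng.integers(k, size=len(pts))

        def cost(labels):
            c = 0.0
            for g in range(k):
                m = pts[labels == g]
                if len(m):
                    c += ((m - m.mean(axis=0)) ** 2).sum()   # within-cluster sum of squares
            return c

        T, current = 5.0, cost(labels)
        for step in range(20_000):
            i, new_g = rng.integers(len(pts)), rng.integers(k)
            trial = labels.copy()
            trial[i] = new_g
            c_new = cost(trial)
            if c_new < current or rng.random() < np.exp(-(c_new - current) / T):
                labels, current = trial, c_new        # Metropolis acceptance
            T *= 0.9995                               # geometric cooling schedule

        print("final within-cluster cost:", current)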

  5. Quantum Monte Carlo Calculations of Neutron Matter

    CERN Document Server

    Carlson, J; Ravenhall, D G

    2003-01-01

    Uniform neutron matter is approximated by a cubic box containing a finite number of neutrons, with periodic boundary conditions. We report variational and Green's function Monte Carlo calculations of the ground state of fourteen neutrons in a periodic box using the Argonne $\\vep $ two-nucleon interaction at densities up to one and a half times the nuclear matter density. The effects of the finite box size are estimated using variational wave functions together with cluster expansion and chain summation techniques. They are small at subnuclear densities. We discuss the expansion of the energy of a low-density neutron gas in powers of its Fermi momentum. This expansion is strongly modified by the large nn scattering length, and does not begin with the Fermi-gas kinetic energy as assumed in both Skyrme and relativistic mean field theories. The leading term of the neutron gas energy is ~ half the Fermi-gas kinetic energy. The quantum Monte Carlo results are also used to calibrate the accuracy of variational calculations ...

  6. Information Geometry and Sequential Monte Carlo

    CERN Document Server

    Sim, Aaron; Stumpf, Michael P H

    2012-01-01

    This paper explores the application of methods from information geometry to the sequential Monte Carlo (SMC) sampler. In particular the Riemannian manifold Metropolis-adjusted Langevin algorithm (mMALA) is adapted for the transition kernels in SMC. Similar to its function in Markov chain Monte Carlo methods, the mMALA is a fully adaptable kernel which allows for efficient sampling of high-dimensional and highly correlated parameter spaces. We set up the theoretical framework for its use in SMC with a focus on the application to the problem of sequential Bayesian inference for dynamical systems as modelled by sets of ordinary differential equations. In addition, we argue that defining the sequence of distributions on geodesics optimises the effective sample sizes in the SMC run. We illustrate the application of the methodology by inferring the parameters of simulated Lotka-Volterra and Fitzhugh-Nagumo models. In particular we demonstrate that compared to employing a standard adaptive random walk kernel, the SM...

  7. STUDY ON THE SEQUENCE STRUCTURE OF SBR BY 13C-NMR METHOD Ⅵ MONTE CARLO SIMULATION OF THE SEQUENCE OF SBR

    Institute of Scientific and Technical Information of China (English)

    YU Dingsheng; WU Mingguang; JIAO Shuke

    1993-01-01

    A Monte Carlo simulation program for the binary copolymerization of E-SBR (emulsion-polymerized styrene-butadiene rubber) was written according to the terminal model. The simulation results obtained by this program were in good agreement with the experimental ones. Detailed microstructure information on the E-SBR molecular chain is provided.
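
    In the terminal model, the probability of adding monomer 1 or 2 depends only on the chain's terminal unit through the reactivity ratios and the feed composition, so the sequence can be generated as a two-state Markov chain; the sketch below does this and tallies composition and dyad fractions. The reactivity ratios and feed fraction are illustrative assumptions, not the E-SBR values of the paper.

        # Terminal-model Monte Carlo generation of a binary copolymer sequence.
        import numpy as np

        rng = np.random.default_rng(12)
        r1, r2 = 0.5, 1.4             # reactivity ratios (assumed)
        f1 = 0.3                      # mole fraction of monomer 1 in the feed
        f2 = 1.0 - f1

        # transition probabilities of the terminal-model Markov chain
        p11 = r1 * f1 / (r1 * f1 + f2)      # chain ending in 1 adds another 1
        p22 = r2 * f2 / (r2 * f2 + f1)      # chain ending in 2 adds another 2

        chain = [1 if rng.random() < f1 else 2]
        for _ in range(100_000):
            last = chain[-1]
            stay = p11 if last == 1 else p22
            chain.append(last if rng.random() < stay else 3 - last)

        chain = np.array(chain)
        print("copolymer composition F1:", (chain == 1).mean())
        dyads = chain[:-1] * 10 + chain[1:]
        for d in (11, 12, 21, 22):
            print(f"dyad {d}: {(dyads == d).mean():.3f}")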

  8. Use of Monte Carlo Methods for determination of isodose curves in brachytherapy; Uso de tecnicas Monte Carlo para determinacao de curvas de isodose em braquiterapia

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Jose Wilson

    2001-08-01

    Brachytherapy is a special form of cancer treatment in which the radioactive source is placed very close to or inside the tumor with the objective of causing the necrosis of the cancerous tissue. The intensity of the cell response to radiation varies according to the tissue type and degree of differentiation. Since malignant cells are less differentiated than normal ones, they are more sensitive to radiation. This is the basis of radiotherapy techniques. Institutes that work with the application of high dose rates use sophisticated computer programs to calculate the dose necessary to achieve the necrosis of the tumor while, at the same time, minimizing the irradiation of neighboring tissues and organs. With knowledge of the characteristics of the source and the tumor, it is possible to trace isodose curves with the necessary information for planning brachytherapy in patients. The objective of this work is, using Monte Carlo techniques, to develop a computer program - ISODOSE - which allows the determination of isodose curves around linear radioactive sources used in brachytherapy. The development of ISODOSE is important because the available commercial programs are, in general, very expensive and practically inaccessible to small clinics. The use of Monte Carlo techniques is viable because they avoid problems inherent to analytic solutions such as, for instance, the integration of functions with singularities in their domain. The results of ISODOSE were compared with similar data found in the literature and also with those obtained at the radiotherapy institutes of the 'Hospital do Cancer do Recife' and of the 'Hospital Portugues do Recife'. ISODOSE presented good performance, mainly due to the Monte Carlo techniques, which allowed a quite detailed drawing of the isodose curves around linear sources. (author)
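
    A geometry-only sketch of the idea: relative dose around a linear source is estimated by sampling emission points uniformly along the source and averaging the inverse-square point kernel at each grid position, after which an isodose curve is a level set of the resulting map. Attenuation, scatter and the actual source data are not modelled, so this is only an illustration, not the ISODOSE program.

        # Monte Carlo average of the inverse-square kernel around a linear source.
        import numpy as np

        rng = np.random.default_rng(13)
        L_src, n_emit = 5.0, 5000                    # source length (cm), samples (assumed)
        ys = np.linspace(-6, 6, 121)                 # grid along the source axis
        xs = np.linspace(0.2, 6, 59)                 # radial distances (avoid r = 0)

        z_emit = rng.uniform(-L_src / 2, L_src / 2, n_emit)   # emission points on the line
        dose = np.zeros((ys.size, xs.size))
        for i, y in enumerate(ys):
            for j, x in enumerate(xs):
                d2 = x**2 + (y - z_emit) ** 2
                dose[i, j] = np.mean(1.0 / (4.0 * np.pi * d2))   # inverse-square kernel

        dose /= dose.max()                           # relative dose map
        level = 0.1                                  # 10% isodose level
        print("grid points within 2% of the 10% isodose line:",
              int(np.sum(np.abs(dose - level) < 0.02)))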

  9. Epistasis Test in Meta-Analysis: A Multi-Parameter Markov Chain Monte Carlo Model for Consistency of Evidence.

    Directory of Open Access Journals (Sweden)

    Chin Lin

    Conventional genome-wide association studies (GWAS) have been proven to be a successful strategy for identifying genetic variants associated with complex human traits. However, there is still a large heritability gap between GWAS and transitional family studies. The "missing heritability" has been suggested to be due to lack of studies focused on epistasis, also called gene-gene interactions, because individual trials have often had insufficient sample size. Meta-analysis is a common method for increasing statistical power. However, sufficient detailed information is difficult to obtain. A previous study employed a meta-regression-based method to detect epistasis, but it faced the challenge of inconsistent estimates. Here, we describe a Markov chain Monte Carlo-based method, called "Epistasis Test in Meta-Analysis" (ETMA), which uses genotype summary data to obtain consistent estimates of epistasis effects in meta-analysis. We defined a series of conditions to generate simulation data and tested the power and type I error rates in ETMA, individual data analysis and conventional meta-regression-based method. ETMA not only successfully facilitated consistency of evidence but also yielded acceptable type I error and higher power than conventional meta-regression. We applied ETMA to three real meta-analysis data sets. We found significant gene-gene interactions in the renin-angiotensin system and the polycyclic aromatic hydrocarbon metabolism pathway, with strong supporting evidence. In addition, glutathione S-transferase (GST) mu 1 and theta 1 were confirmed to exert independent effects on cancer. We concluded that the application of ETMA to real meta-analysis data was successful. Finally, we developed an R package, etma, for the detection of epistasis in meta-analysis [etma is available via the Comprehensive R Archive Network (CRAN) at https://cran.r-project.org/web/packages/etma/index.html].

  10. Epistasis Test in Meta-Analysis: A Multi-Parameter Markov Chain Monte Carlo Model for Consistency of Evidence.

    Science.gov (United States)

    Lin, Chin; Chu, Chi-Ming; Su, Sui-Lung

    2016-01-01

    Conventional genome-wide association studies (GWAS) have been proven to be a successful strategy for identifying genetic variants associated with complex human traits. However, there is still a large heritability gap between GWAS and transitional family studies. The "missing heritability" has been suggested to be due to lack of studies focused on epistasis, also called gene-gene interactions, because individual trials have often had insufficient sample size. Meta-analysis is a common method for increasing statistical power. However, sufficient detailed information is difficult to obtain. A previous study employed a meta-regression-based method to detect epistasis, but it faced the challenge of inconsistent estimates. Here, we describe a Markov chain Monte Carlo-based method, called "Epistasis Test in Meta-Analysis" (ETMA), which uses genotype summary data to obtain consistent estimates of epistasis effects in meta-analysis. We defined a series of conditions to generate simulation data and tested the power and type I error rates in ETMA, individual data analysis and conventional meta-regression-based method. ETMA not only successfully facilitated consistency of evidence but also yielded acceptable type I error and higher power than conventional meta-regression. We applied ETMA to three real meta-analysis data sets. We found significant gene-gene interactions in the renin-angiotensin system and the polycyclic aromatic hydrocarbon metabolism pathway, with strong supporting evidence. In addition, glutathione S-transferase (GST) mu 1 and theta 1 were confirmed to exert independent effects on cancer. We concluded that the application of ETMA to real meta-analysis data was successful. Finally, we developed an R package, etma, for the detection of epistasis in meta-analysis [etma is available via the Comprehensive R Archive Network (CRAN) at https://cran.r-project.org/web/packages/etma/index.html].

  11. Investigation of correction factors for non-reference conditions in ion chamber photon dosimetry with Monte-Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, Joerg [Klinik fuer Strahlendiagnostik, Medizinisches Zentrum fuer Radiologie, Philipps Univ. Marburg (Germany); Inst. fuer Medizinische Physik und Strahlenschutz (IMPS), Fachhochschule Giessen-Friedberg (Germany); Heverhagen, Johannes T. [Klinik fuer Strahlendiagnostik, Medizinisches Zentrum fuer Radiologie, Philipps Univ. Marburg (Germany); Karle, Heiko [Klinik und Poliklinik fuer Radioonkologie sowie Strahlentherapie, Universitaetsmedizin der Johannes Gutenberg-Univ., Mainz (Germany); Zink, Klemens [Inst. fuer Medizinische Physik und Strahlenschutz (IMPS), Fachhochschule Giessen-Friedberg (Germany)

    2010-07-01

    Current dosimetry protocols require geometrical reference conditions for the determination of absorbed dose in external radiotherapy. Whenever these geometrical conditions cannot be maintained, the application of additional corrections becomes necessary, in principle. The current DIN6800-2 protocol includes a corresponding factor k{sub NR}, but numerical values are lacking and no definite information about the magnitude of this correction is available yet. This study presents Monte Carlo-based calculations within the 6 MV-X photon field of a linear accelerator for a commonly used ion chamber (PTW31010), employing the EGSnrc code system. The linear accelerator model was matched to measurements, showed good agreement, and was used as a realistic source. The individual perturbation correction factors as well as the resulting correction factor k{sub NR} were calculated as a function of depth for three field sizes, as a function of central axis distance for the largest field, and within the build-up region. The behaviour of the ion chamber was further investigated for an idealized hypothetical field boundary. Within the field of the linear accelerator, where charged particle equilibrium is achieved, the factor k{sub NR} was generally below {proportional_to}0.5%. In the build-up region a depth-dependent correction of up to 2% was calculated when positioning the chamber according to DIN6800-2. Minimizing the depth dependence of the corrections in the build-up region led to a slightly different positioning of the ion chamber than currently recommended. In regions of the hypothetical field boundary with missing charged particle equilibrium and high dose gradients, the ion chamber response changed by up to {proportional_to}40%, caused by the comparatively large volume (0.125 cm{sup 3}) of the investigated chamber. (orig.)

  12. Quantum Monte Carlo Endstation for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Lubos Mitas

    2011-01-26

    The NCSU research group has been focused on accomplishing the key goals of this initiative: establishing a new generation of quantum Monte Carlo (QMC) computational tools as a part of the Endstation petaflop initiative for use at the DOE ORNL computational facilities and by the computational electronic structure community at large; carrying out high-accuracy quantum Monte Carlo demonstration projects applying these tools to forefront electronic structure problems in molecular and solid systems; expanding the impact of QMC methods and approaches; and explaining and enhancing the impact of these advanced computational approaches. In particular, we have developed a quantum Monte Carlo code (QWalk, www.qwalk.org) which was significantly expanded and optimized using funds from this support and has at present become an actively used tool in the petascale regime by ORNL researchers and beyond. These developments have been built upon efforts undertaken by the PI's group and collaborators over the last decade. The code was optimized and tested extensively on a number of parallel architectures including the petaflop ORNL Jaguar machine. We have developed and redesigned a number of code modules, such as evaluation of wave functions and orbitals, calculation of pfaffians and introduction of backflow coordinates, together with the overall organization of the code and random walker distribution over multicore architectures. We have addressed several bottlenecks such as load balancing and verified the efficiency and accuracy of the calculations with the other groups of the Endstation team. The QWalk package contains about 50,000 lines of high-quality object-oriented C++ and also includes interfaces to data files from other conventional electronic structure codes such as Gamess, Gaussian, Crystal and others. This grant supported the PI for one month during summers, a full-time postdoc and partially three graduate students over the period of the grant duration; it has resulted in 13

  13. Monte Carlo analysis of radiative transport in oceanographic lidar measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cupini, E.; Ferro, G. [ENEA, Divisione Fisica Applicata, Centro Ricerche Ezio Clementel, Bologna (Italy); Ferrari, N. [Bologna Univ., Bologna (Italy). Dipt. Ingegneria Energetica, Nucleare e del Controllo Ambientale

    2001-07-01

    The analysis of oceanographic lidar system measurements is often carried out with semi-empirical methods, since there is only a rough understanding of the effects of many environmental variables. The development of techniques for interpreting the accuracy of lidar measurements is needed to evaluate the effects of various environmental situations, as well as of different experimental geometric configurations and boundary conditions. A Monte Carlo simulation model represents a tool that is particularly well suited for answering these important questions. The PREMAR-2F Monte Carlo code has been developed taking into account the main molecular and non-molecular components of the marine environment. The laser radiation interaction processes of diffusion, re-emission, refraction and absorption are treated. In particular, the following are considered: Rayleigh elastic scattering, produced by atoms and molecules that are small with respect to the laser emission wavelength (i.e. water molecules); Mie elastic scattering, arising from atoms or molecules with dimensions comparable to the laser wavelength (hydrosols); Raman inelastic scattering, typical of water; absorption by water and by inorganic (sediments) and organic (phytoplankton and CDOM) hydrosols; and the fluorescence re-emission of chlorophyll and yellow substances. PREMAR-2F is an extension of a code for the simulation of radiative transport in atmospheric environments (PREMAR-2). The approach followed in PREMAR-2 was to combine conventional Monte Carlo techniques with analytical estimates of the probability that the receiver collects a contribution from photons coming back after an interaction in the field of view of the lidar fluorosensor collecting apparatus. This offers an effective means for modelling a lidar system with realistic geometric constraints. The retrieved semianalytic Monte Carlo radiative transfer model has been developed in the frame of the Italian Research Program for Antarctica (PNRA) and it is

  14. Monte Carlo模拟在聚合反应速率常数估算中的运用%Estimation of rate constants for polymerization based on Monte Carlo simulation

    Institute of Scientific and Technical Information of China (English)

    罗正鸿; 詹晓力; 阳永荣

    2006-01-01

    The application of the Monte Carlo method in estimating rate constants for polymerization is described. A general program for Monte Carlo simulation was first set up according to the elementary reactions, after which the rate constants could be automatically adjusted and optimized by comparing experimental and simulated data with an error expression that met a given minimum criterion. Such a process allows the rate constants to be estimated without a kinetic model specified in advance. The technique was applied to estimate the rate constants of the bulk polymerization of styrene catalyzed by a rare earth catalyst. The estimated results showed that the Monte Carlo method is feasible and effective for estimating rate constants in polymerization engineering.
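
    The abstract outlines the workflow only at a high level. The Python sketch below is a hypothetical, minimal illustration of that workflow under stated assumptions: a Gillespie-type Monte Carlo simulation of a toy initiation/propagation/termination scheme, coupled to a crude random search that adjusts the rate constants until the error between simulated and fabricated "experimental" conversion data is minimized. The reaction scheme, parameter ranges and data are assumptions, not the authors' styrene/rare-earth system.

```python
import math
import random

# Hypothetical elementary reactions for a toy chain polymerization
# (not the authors' styrene/rare-earth system):
#   initiation   I        -> 2 R*      rate k_i * I
#   propagation  R* + M   -> R*        rate k_p * R * M
#   termination  R* + R*  -> dead      rate k_t * R * (R - 1) / 2
def simulate_conversion(k_i, k_p, k_t, t_end, n_I=100, n_M=5000, seed=1):
    """Gillespie-type Monte Carlo; returns monomer conversion at t_end."""
    rng = random.Random(seed)
    I, M, R = n_I, n_M, 0
    t = 0.0
    while True:
        a = [k_i * I, k_p * R * M, k_t * R * (R - 1) / 2.0]
        a_tot = sum(a)
        if a_tot == 0.0:
            break
        dt = -math.log(rng.random()) / a_tot       # time to the next event
        if t + dt > t_end:
            break
        t += dt
        r = rng.random() * a_tot                   # pick which event occurs
        if r < a[0]:
            I -= 1; R += 2                         # initiation
        elif r < a[0] + a[1]:
            M -= 1                                 # propagation
        else:
            R -= 2                                 # termination
    return 1.0 - M / float(n_M)

# Fabricated "experimental" conversion data at a few sampling times.
times, conv_exp = [5.0, 10.0, 20.0], [0.18, 0.33, 0.55]

def error(k_i, k_p, k_t):
    """Sum-of-squares error between simulated and experimental conversions."""
    return sum((simulate_conversion(k_i, k_p, k_t, t) - c) ** 2
               for t, c in zip(times, conv_exp))

# Crude random-search adjustment of the rate constants: keep the set that
# minimizes the error expression (a stand-in for the automatic optimization
# loop described in the abstract).
rng = random.Random(0)
best, best_err = None, float("inf")
for _ in range(50):
    trial = (10.0 ** rng.uniform(-4, -2),   # k_i
             10.0 ** rng.uniform(-5, -3),   # k_p
             10.0 ** rng.uniform(-4, -2))   # k_t
    e = error(*trial)
    if e < best_err:
        best, best_err = trial, e
print("best (k_i, k_p, k_t):", best, " error:", round(best_err, 4))
```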

  15. Validation of PET-SORTEO Monte Carlo simulations for the geometries of the MicroPET R4 and Focus 220 PET scanners

    Energy Technology Data Exchange (ETDEWEB)

    Lartizien, C [CREATIS Laboratory, UMR CNRS 5220, Inserm U630, INSA-Lyon, Universite Lyon 1, F69621 Villeurbanne (France); Kuntner, C [Department of Radiopharmaceuticals, Austrian Research Center GmbH-ARC, Seibersdorf (Austria); Goertzen, A L [McConnell Brain Imaging Centre, Montreal (Canada); Evans, A C [McConnell Brain Imaging Centre, Montreal (Canada); Reilhac, A [CERMEP, Bron (France)

    2007-08-21

    PET-SORTEO is a Monte Carlo-based simulator that enables the fast generation of realistic PET data for the geometry of the ECAT EXACT HR+ scanner. In order to address the increasing need for simulation models of animal PET imaging systems, our aim is to adapt and configure this simulation tool for small animal PET scanners, especially for the widely distributed microPET R4 and Focus 220 systems manufactured by Siemens Preclinical Solutions. We propose a simulation model that can produce realistic rodent images in order to evaluate and optimize acquisition and reconstruction protocols. The first part of this study presents the validation of SORTEO against the geometries of the R4 and the Focus 220 systems. This validation is carried out against actual measurements performed on the R4 scanner at the Montreal Neurological Institute in Canada and on the Focus 220 system of Department of radiopharmaceuticals of the Austrian Research Center in Seibersdorf. The comparison of simulated and experimental performance measurements includes spatial resolution, energy spectra, scatter fraction and count rates. In the second part of the study, we demonstrate the ability to rapidly generate realistic whole-body radioactive distributions using the MOBY phantom and give comparative example case studies of the same rodent model simulated with PET-SORTEO for the R4 and Focus 220 systems.

  16. Analysis of different Monte Carlo simulation codes for its use in radiotherapy; Analisis de diferentes codigos de simulacion Monte Carlo parea su uso en radioterapia

    Energy Technology Data Exchange (ETDEWEB)

    Azorin V, C.G.; Rivera M, T. [CICATA-IPN, Legaria, Mexico D.F. (Mexico)]. e-mail: claudiaazorin@yahoo.com.mx

    2007-07-01

    Full text: At the present time there are many computer programs that simulate the interaction of radiation with matter using the Monte Carlo method. In the present work, a comparative analysis of four of these codes (MCNPX, EGS4, GEANT, PENELOPE) is carried out for their later use in the development of a simple algorithm that simulates the energy deposited when radiation passes through matter in patients undergoing radiotherapy. The results of the analysis show that the studied simulators model the interaction of almost all types of particles with matter, although they differ among themselves in, among other aspects, the energy intervals they handle, the programming language in which they are written, and the platform on which they are executed. (Author)

  17. Monte Carlo模拟薄膜生长的研究%Study of Thin Film Growth by Monte Carlo Simulation

    Institute of Scientific and Technical Information of China (English)

    彭冬生; 冯玉春; 牛憨笨

    2006-01-01

    This paper reviews the application of the Monte Carlo method to thin film growth and the latest progress in this area, briefly discusses the types of Monte Carlo algorithms and their respective characteristics, and, building on the characteristics of the Monte Carlo method, proposes a model and a treatment approach for simulating thin film growth. The main problems that remain to be solved in Monte Carlo simulation of thin film growth are also summarized.

  18. MONTE CARLO ANALYSES OF THE YALINA THERMAL FACILITY WITH SERPENT STEREOLITHOGRAPHY GEOMETRY MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, A.; Gohar, Y.

    2015-01-01

    This paper analyzes the YALINA Thermal subcritical assembly of Belarus using two different Monte Carlo transport programs, SERPENT and MCNP. The MCNP model is based on combinatorial geometry and a universe hierarchy, while the SERPENT model is based on Stereolithography geometry. The latter consists of unstructured triangulated surfaces defined by their normals and vertices. This geometry format is used by 3D printers, and it was created using the CUBIT software, MATLAB scripts, and C code. All the Monte Carlo simulations have been performed using the ENDF/B-VII.0 nuclear data library. Both MCNP and SERPENT share the same geometry specifications, which describe the facility details without using any material homogenization. Three different configurations have been studied with different numbers of fuel rods; the three fuel configurations use 216, 245, or 280 fuel rods, respectively. The numerical simulations show that the agreement between SERPENT and MCNP results is within a few tens of pcm.

  19. A new Monte Carlo code for absorption simulation of laser-skin tissue interaction

    Institute of Scientific and Technical Information of China (English)

    Afshan Shirkavand; Saeed Sarkar; Marjaneh Hejazi; Leila Ataie-Fashtami; Mohammad Reza Alinaghizadeh

    2007-01-01

    In laser clinical applications, the processes of photon absorption and thermal energy diffusion in the target tissue and its surrounding tissue during laser irradiation are crucial. Such information allows the selection of proper operating parameters, such as laser power and exposure time, for an optimal therapeutic outcome. The Monte Carlo method is a useful tool for studying laser-tissue interaction and for simulating energy absorption in tissue during laser irradiation. We use the principles of this technique to write a new code in MATLAB 6.5 and then validate it against the Monte Carlo multi-layer (MCML) code. The new code proved to be accurate. It can be used to calculate the total power absorbed in the region of interest, and it can be combined with other computer programs for heat modelling.
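
    The authors' MATLAB code is not reproduced in the abstract. As a hedged illustration of the kind of calculation involved, the Python sketch below implements a minimal single-layer photon-weight random walk in the spirit of MCML-type codes; the optical properties are hypothetical placeholders, and surface refractive-index mismatch and Russian roulette are omitted.

```python
import math
import random

# Single-layer photon-absorption Monte Carlo in the spirit of MCML-type
# codes (not the authors' MATLAB implementation). Optical properties are
# hypothetical placeholders.
MU_A, MU_S, G = 1.0, 10.0, 0.9        # absorption, scattering [1/cm], anisotropy
MU_T = MU_A + MU_S

def sample_hg(g, rng):
    """Sample cos(theta) from the Henyey-Greenstein phase function."""
    if g == 0.0:
        return 2.0 * rng.random() - 1.0
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * rng.random())
    return (1.0 + g * g - s * s) / (2.0 * g)

def run(n_photons=20000, dz=0.01, n_bins=300, seed=0):
    rng = random.Random(seed)
    absorbed = [0.0] * n_bins          # depth-resolved absorbed weight
    for _ in range(n_photons):
        x = y = z = 0.0
        ux, uy, uz = 0.0, 0.0, 1.0     # launched normally into the tissue
        w = 1.0
        while w > 1e-4:                # simple weight cutoff (no roulette)
            step = -math.log(rng.random()) / MU_T
            x, y, z = x + ux * step, y + uy * step, z + uz * step
            if z < 0.0:                # escaped back through the surface
                break
            dw = w * MU_A / MU_T       # weight deposited at this interaction
            w -= dw
            absorbed[min(int(z / dz), n_bins - 1)] += dw
            ct = sample_hg(G, rng)     # scatter into a new direction
            st = math.sqrt(max(0.0, 1.0 - ct * ct))
            phi = 2.0 * math.pi * rng.random()
            if abs(uz) > 0.99999:
                ux, uy, uz = st * math.cos(phi), st * math.sin(phi), (ct if uz > 0.0 else -ct)
            else:
                d = math.sqrt(1.0 - uz * uz)
                ux, uy, uz = (st * (ux * uz * math.cos(phi) - uy * math.sin(phi)) / d + ux * ct,
                              st * (uy * uz * math.cos(phi) + ux * math.sin(phi)) / d + uy * ct,
                              -st * math.cos(phi) * d + uz * ct)
    return absorbed

profile = run()
print("total absorbed fraction: %.3f" % (sum(profile) / 20000.0))
```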

  20. MCViNE -- An object oriented Monte Carlo neutron ray tracing simulation package

    CERN Document Server

    Lin, Jiao Y Y; Granroth, Garrett E; Abernathy, Douglas L; Lumsden, Mark D; Winn, Barry; Aczel, Adam A; Aivazis, Michael; Fultz, Brent

    2015-01-01

    MCViNE (Monte-Carlo VIrtual Neutron Experiment) is a versatile Monte Carlo (MC) neutron ray-tracing program that provides researchers with tools for performing computer modeling and simulations that mirror real neutron scattering experiments. By adopting modern software engineering practices such as using composite and visitor design patterns for representing and accessing neutron scatterers, and using recursive algorithms for multiple scattering, MCViNE is flexible enough to handle sophisticated neutron scattering problems including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can take advantage of simulation components in linear-chain-based MC ray tracing packages widely used in instrument design and optimization, as well as NumPy-based components that make prototypes useful and easy to develop. These developments have enabled us to carry out detailed simulations of neutron scatteri...

  1. Milagro Version 2 An Implicit Monte Carlo Code for Thermal Radiative Transfer: Capabilities, Development, and Usage

    Energy Technology Data Exchange (ETDEWEB)

    T.J. Urbatsch; T.M. Evans

    2006-02-15

    We have released Version 2 of Milagro, an object-oriented, C++ code that performs radiative transfer using Fleck and Cummings' Implicit Monte Carlo method. Milagro, a part of the Jayenne program, is a stand-alone driver code used as a methods research vehicle and to verify its underlying classes. These underlying classes are used to construct Implicit Monte Carlo packages for external customers. Milagro-2 represents a design overhaul that allows better parallelism and extensibility. New features in Milagro-2 include verified momentum deposition, restart capability, graphics capability, exact energy conservation, and improved load balancing and parallel efficiency. A users' guide also describes how to configure, make, and run Milagro-2.

  2. Coarse-grained stochastic processes and Monte Carlo simulations in lattice systems

    CERN Document Server

    Katsoulakis, M A; Vlachos, D G

    2003-01-01

    In this paper we present a new class of coarse-grained stochastic processes and Monte Carlo simulations, derived directly from microscopic lattice systems and describing mesoscopic length scales. As our primary example, we mainly focus on a microscopic spin-flip model for the adsorption and desorption of molecules on a surface adjacent to a gas phase, although a similar analysis carries over to other processes. The new model can capture large scale structures, while retaining microscopic information on intermolecular forces and particle fluctuations. The requirement of detailed balance is utilized as a systematic design principle to guarantee correct noise fluctuations for the coarse-grained model. We carry out a rigorous asymptotic analysis of the new system using techniques from large deviations and present detailed numerical comparisons of coarse-grained and microscopic Monte Carlo simulations. The coarse-grained stochastic algorithms provide large computational savings without increasing programming ...

  3. Monte Carlo simulations to replace film dosimetry in IMRT verification.

    Science.gov (United States)

    Goetzfried, Thomas; Rickhey, Mark; Treutwein, Marius; Koelbl, Oliver; Bogner, Ludwig

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program, with regard to possible benefits compared to commonly used film dosimetry. Twenty-five head-and-neck IMRT plans were recalculated by a pencil-beam based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of diode measurements and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and the MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference), with dose deviations up to 10%. Corresponding values were significantly reduced to 0.34 ± 0.09 for the MC dose calculation. The total time needed for the two verification procedures is comparable; however, MC simulations are by far less labor intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to eclipse film dosimetry more and more in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended at least in the IMRT introduction phase. Copyright © 2010. Published by Elsevier GmbH.
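
    For readers unfamiliar with the 3%/3 mm criterion quoted above, the Python sketch below computes a simplified global gamma index on a one-dimensional dose profile (exhaustive search, no interpolation, no low-dose threshold). It illustrates the metric only and is not the evaluation software used in the study.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, spacing_mm, dd=0.03, dta_mm=3.0):
    """Simplified global 1D gamma index.

    dose_ref, dose_eval : 1D arrays on the same grid
    spacing_mm          : grid spacing in mm
    dd                  : dose-difference criterion (fraction of max reference dose)
    dta_mm              : distance-to-agreement criterion in mm
    """
    dose_norm = dd * dose_ref.max()
    x = np.arange(len(dose_ref)) * spacing_mm
    gamma = np.empty_like(dose_ref, dtype=float)
    for i, (xi, d_ref) in enumerate(zip(x, dose_ref)):
        # gamma at a reference point: minimum over all evaluated points of
        # the combined dose-difference / distance metric
        dist2 = ((x - xi) / dta_mm) ** 2
        dd2 = ((dose_eval - d_ref) / dose_norm) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dd2))
    return gamma

# Toy example: evaluated profile slightly scaled and shifted
x = np.linspace(0, 100, 201)                  # 0.5 mm grid
ref = np.exp(-((x - 50) / 15) ** 2)           # reference "dose" profile
ev = 1.02 * np.exp(-((x - 51) / 15) ** 2)     # 2% scaling, 1 mm shift
g = gamma_1d(ref, ev, spacing_mm=0.5)
print("mean gamma: %.2f, pass rate (gamma <= 1): %.1f%%"
      % (g.mean(), 100 * (g <= 1).mean()))
```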

  4. Recent Developments in Quantum Monte Carlo: Methods and Applications

    Science.gov (United States)

    Aspuru-Guzik, Alan; Austin, Brian; Domin, Dominik; Galek, Peter T. A.; Handy, Nicholas; Prasad, Rajendra; Salomon-Ferrer, Romelia; Umezawa, Naoto; Lester, William A.

    2007-12-01

    The quantum Monte Carlo method in the diffusion Monte Carlo form has become recognized for its capability of describing the electronic structure of atomic, molecular and condensed matter systems to high accuracy. This talk will briefly outline the method with emphasis on recent developments connected with trial function construction, linear scaling, and applications to selected systems.

  5. QUANTUM MONTE-CARLO SIMULATIONS - ALGORITHMS, LIMITATIONS AND APPLICATIONS

    NARCIS (Netherlands)

    DERAEDT, H

    1992-01-01

    A survey is given of Quantum Monte Carlo methods currently used to simulate quantum lattice models. The formalisms employed to construct the simulation algorithms are sketched. The origin of fundamental (minus sign) problems which limit the applicability of the Quantum Monte Carlo approach is shown

  6. Quantum Monte Carlo Simulations : Algorithms, Limitations and Applications

    NARCIS (Netherlands)

    Raedt, H. De

    1992-01-01

    A survey is given of Quantum Monte Carlo methods currently used to simulate quantum lattice models. The formalisms employed to construct the simulation algorithms are sketched. The origin of fundamental (minus sign) problems which limit the applicability of the Quantum Monte Carlo approach is shown

  7. Reporting Monte Carlo Studies in Structural Equation Modeling

    NARCIS (Netherlands)

    Boomsma, Anne

    2013-01-01

    In structural equation modeling, Monte Carlo simulations have been used increasingly over the last two decades, as an inventory from the journal Structural Equation Modeling illustrates. Reaching out to a broad audience, this article provides guidelines for reporting Monte Carlo studies in that field.

  8. Practical schemes for accurate forces in quantum Monte Carlo

    NARCIS (Netherlands)

    Moroni, S.; Saccani, S.; Filippi, Claudia

    2014-01-01

    While the computation of interatomic forces has become a well-established practice within variational Monte Carlo (VMC), the use of the more accurate Fixed-Node Diffusion Monte Carlo (DMC) method is still largely limited to the computation of total energies on structures obtained at a lower level of

  9. Efficiency and accuracy of Monte Carlo (importance) sampling

    NARCIS (Netherlands)

    Waarts, P.H.

    2003-01-01

    Monte Carlo analysis is often regarded as the simplest and most accurate reliability method. Besides, it is the most transparent method. The only problem is the accuracy in relation to the efficiency: Monte Carlo becomes less efficient, or less accurate, when very low probabilities are to be computed.
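
    A minimal Python illustration of the efficiency problem mentioned above: estimating a small normal tail probability with crude Monte Carlo versus importance sampling from a shifted proposal. The target probability and sample size are arbitrary choices for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
threshold = 4.5                       # P(Z > 4.5) ~ 3.4e-6 for a standard normal

# Crude Monte Carlo: almost no samples fall in the failure region, so the
# estimate is dominated by statistical noise (often exactly zero).
z = rng.standard_normal(n)
p_crude = np.mean(z > threshold)

# Importance sampling: draw from a normal shifted to the failure region and
# reweight each sample by the likelihood ratio f(z)/g(z).
z_is = rng.normal(loc=threshold, scale=1.0, size=n)
log_w = -0.5 * z_is**2 + 0.5 * (z_is - threshold)**2   # log of phi(z)/phi(z - threshold)
w = np.exp(log_w)
p_is = np.mean((z_is > threshold) * w)
se_is = np.std((z_is > threshold) * w) / np.sqrt(n)

print(f"crude MC estimate   : {p_crude:.2e}")
print(f"importance sampling : {p_is:.2e} +/- {se_is:.1e}")
```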

  10. The Monte Carlo Method. Popular Lectures in Mathematics.

    Science.gov (United States)

    Sobol', I. M.

    The Monte Carlo Method is a method of approximately solving mathematical and physical problems by the simulation of random quantities. The principal goal of this booklet is to suggest to specialists in all areas that they will encounter problems which can be solved by the Monte Carlo Method. Part I of the booklet discusses the simulation of random…

  11. Forest canopy BRDF simulation using Monte Carlo method

    NARCIS (Netherlands)

    Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.

    2006-01-01

    Monte Carlo method is a random statistic method, which has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopy in the field of visible remote sensing. The random process between photons and forest canopy was designed using Monte Carlo method.

  12. Sensitivity of Monte Carlo simulations to input distributions

    Energy Technology Data Exchange (ETDEWEB)

    RamoRao, B. S.; Srikanta Mishra, S.; McNeish, J.; Andrews, R. W.

    2001-07-01

    The sensitivity of the results of a Monte Carlo simulation to the shapes and moments of the probability distributions of the input variables is studied. An economical computational scheme is presented as an alternative to the replicate Monte Carlo simulations and is explained with an illustrative example. (Author) 4 refs.

  13. Quantum Monte Carlo using a Stochastic Poisson Solver

    Energy Technology Data Exchange (ETDEWEB)

    Das, D; Martin, R M; Kalos, M H

    2005-05-06

    Quantum Monte Carlo (QMC) is an extremely powerful method to treat many-body systems. Usually quantum Monte Carlo has been applied in cases where the interaction potential has a simple analytic form, like the 1/r Coulomb potential. However, in a complicated environment such as a semiconductor heterostructure, the evaluation of the interaction itself becomes a non-trivial problem. Obtaining the potential from any grid-based finite-difference method for every walker and every step is infeasible. We demonstrate an alternative approach of solving the Poisson equation by a classical Monte Carlo within the overall quantum Monte Carlo scheme. We have developed a modified "Walk On Spheres" algorithm using Green's function techniques, which can efficiently account for the interaction energy of walker configurations, typical of quantum Monte Carlo algorithms. This stochastically obtained potential can be easily incorporated within popular quantum Monte Carlo techniques like variational Monte Carlo (VMC) or diffusion Monte Carlo (DMC). We demonstrate the validity of this method by studying a simple problem, the polarization of a helium atom in the electric field of an infinite capacitor.
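
    The abstract does not detail the authors' Green's-function-based implementation, so the Python sketch below illustrates only the basic Walk On Spheres idea for the Laplace equation in a ball with Dirichlet data. The domain, boundary condition and tolerance are assumptions chosen so that the exact harmonic solution is known for comparison.

```python
import math
import random

# Walk on Spheres for the Laplace equation in a ball of radius R with
# Dirichlet data g on the boundary: the solution at a point is the average
# boundary value hit by walks that, at each step, jump to a uniform point on
# the largest sphere contained in the domain and centred at the current point.
R = 1.0
EPS = 1e-3                                  # absorption shell thickness

def boundary_value(x, y, z):
    # Hypothetical boundary condition: g = z on the sphere of radius R.
    return z

def distance_to_boundary(x, y, z):
    return R - math.sqrt(x * x + y * y + z * z)

def wos_estimate(p, n_walks=20000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        x, y, z = p
        while True:
            d = distance_to_boundary(x, y, z)
            if d < EPS:                     # close enough: absorb on the boundary
                total += boundary_value(x, y, z)
                break
            # jump to a uniformly distributed point on the sphere of radius d
            ct = 2.0 * rng.random() - 1.0
            st = math.sqrt(1.0 - ct * ct)
            phi = 2.0 * math.pi * rng.random()
            x += d * st * math.cos(phi)
            y += d * st * math.sin(phi)
            z += d * ct
    return total / n_walks

# The exact harmonic extension of g = z is u(x, y, z) = z, so the estimate
# at (0, 0, 0.5) should be close to 0.5.
print(wos_estimate((0.0, 0.0, 0.5)))
```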

  14. Further experience in Bayesian analysis using Monte Carlo Integration

    NARCIS (Netherlands)

    H.K. van Dijk (Herman); T. Kloek (Teun)

    1980-01-01

    textabstractAn earlier paper [Kloek and Van Dijk (1978)] is extended in three ways. First, Monte Carlo integration is performed in a nine-dimensional parameter space of Klein's model I [Klein (1950)]. Second, Monte Carlo is used as a tool for the elicitation of a uniform prior on a finite region by

  15. New Approaches and Applications for Monte Carlo Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Aufiero, Manuele; Bidaud, Adrien; Kotlyar, Dan; Leppänen, Jaakko; Palmiotti, Giuseppe; Salvatores, Massimo; Sen, Sonat; Shwageraus, Eugene; Fratoni, Massimiliano

    2017-02-01

    This paper presents some of the recent and new advancements in the extension of Monte Carlo Perturbation Theory methodologies and applications. In particular, the discussed problems involve burnup calculation, perturbation calculation based on continuous energy functions, and Monte Carlo Perturbation Theory in loosely coupled systems.

  16. Forest canopy BRDF simulation using Monte Carlo method

    NARCIS (Netherlands)

    Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.

    2006-01-01

    Monte Carlo method is a random statistic method, which has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopy in the field of visible remote sensing. The random process between photons and forest canopy was designed using Monte Carlo method.

  17. Practical schemes for accurate forces in quantum Monte Carlo

    NARCIS (Netherlands)

    Moroni, S.; Saccani, S.; Filippi, C.

    2014-01-01

    While the computation of interatomic forces has become a well-established practice within variational Monte Carlo (VMC), the use of the more accurate Fixed-Node Diffusion Monte Carlo (DMC) method is still largely limited to the computation of total energies on structures obtained at a lower level of

  18. CERN Summer Student Report 2016 Monte Carlo Data Base Improvement

    CERN Document Server

    Caciulescu, Alexandru Razvan

    2016-01-01

    During my Summer Student project I worked on improving the Monte Carlo Data Base and MonALISA services for the ALICE Collaboration. The project included learning the infrastructure for tracking and monitoring of the Monte Carlo productions as well as developing a new RESTful API for seamless integration with the JIRA issue tracking framework.

  19. Morse Monte Carlo Radiation Transport Code System

    Energy Technology Data Exchange (ETDEWEB)

    Emmett, M.B.

    1975-02-01

    The report contains descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  20. Variational Monte Carlo study of pentaquark states

    Energy Technology Data Exchange (ETDEWEB)

    Mark W. Paris

    2005-07-01

    Accurate numerical solution of the five-body Schrodinger equation is effected via variational Monte Carlo. The spectrum is assumed to exhibit a narrow resonance with strangeness S=+1. A fully antisymmetrized and pair-correlated five-quark wave function is obtained for the assumed non-relativistic Hamiltonian which has spin, isospin, and color dependent pair interactions and many-body confining terms which are fixed by the non-exotic spectra. Gauge field dynamics are modeled via flux tube exchange factors. The energy determined for the ground states with J=1/2 and negative (positive) parity is 2.22 GeV (2.50 GeV). A lower energy negative parity state is consistent with recent lattice results. The short-range structure of the state is analyzed via its diquark content.

  1. Monte Carlo simulation of neutron scattering instruments

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, P.A.; Daemen, L.L.; Hjelm, R.P. Jr.

    1998-12-01

    A code package consisting of the Monte Carlo Library MCLIB, the executing code MC{_}RUN, the web application MC{_}Web, and various ancillary codes is proposed as an open standard for simulation of neutron scattering instruments. The architecture of the package includes structures to define surfaces, regions, and optical elements contained in regions. A particle is defined by its vector position and velocity, its time of flight, its mass and charge, and a polarization vector. The MC{_}RUN code handles neutron transport and bookkeeping, while the action on the neutron within any region is computed using algorithms that may be deterministic, probabilistic, or a combination. Complete versatility is possible because the existing library may be supplemented by any procedures a user is able to code. Some examples are shown.
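
    As a minimal sketch of the particle record described above (vector position and velocity, time of flight, mass, charge, and a polarization vector), the Python dataclass below also adds a simple free-flight update. The field names and the fly() helper are illustrative assumptions, not the actual MCLIB structures.

```python
from dataclasses import dataclass, field
import math

@dataclass
class Neutron:
    """Minimal particle record of the kind described for MCLIB
    (illustrative field names, not the actual library structures)."""
    r: list                              # position [m]
    v: list                              # velocity [m/s]
    tof: float = 0.0                     # time of flight [s]
    mass: float = 1.675e-27              # kg
    charge: float = 0.0
    pol: list = field(default_factory=lambda: [0.0, 0.0, 1.0])
    weight: float = 1.0

    def fly(self, distance: float) -> None:
        """Free flight over 'distance' along the current velocity;
        updates the position and accumulates the time of flight."""
        speed = math.sqrt(sum(c * c for c in self.v))
        t = distance / speed
        self.r = [ri + vi * t for ri, vi in zip(self.r, self.v)]
        self.tof += t

# Usage: a 1000 m/s neutron flying 10 m along a guide axis.
n = Neutron(r=[0.0, 0.0, 0.0], v=[0.0, 0.0, 1000.0])
n.fly(10.0)
print(n.r, n.tof)   # -> [0.0, 0.0, 10.0], 0.01 s
```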

  2. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches, as assessed by calculation of molecular energies and entropies. We also show transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol.

  3. Experimental Monte Carlo Quantum Process Certification

    CERN Document Server

    Steffen, L; Fedorov, A; Baur, M; Wallraff, A

    2012-01-01

    Experimental implementations of quantum information processing have now reached a level of sophistication where quantum process tomography is impractical. The number of experimental settings as well as the computational cost of the data post-processing now translates to days of effort to characterize even experiments with as few as 8 qubits. Recently a more practical approach to determine the fidelity of an experimental quantum process has been proposed, where the experimental data is compared directly to an ideal process using Monte Carlo sampling. Here we present an experimental implementation of this scheme in a circuit quantum electrodynamics setup to determine the fidelity of two qubit gates, such as the cphase and the cnot gate, and three qubit gates, such as the Toffoli gate and two sequential cphase gates.

  4. Gas discharges modeling by Monte Carlo technique

    Directory of Open Access Journals (Sweden)

    Savić Marija

    2010-01-01

    Full Text Available The basic assumption of the Townsend theory - that ions produce secondary electrons - is valid only in a very narrow range of the reduced electric field E/N. In accordance with the revised Townsend theory that was suggested by Phelps and Petrović, secondary electrons are produced in collisions of ions, fast neutrals, metastable atoms or photons with the cathode, or in gas phase ionizations by fast neutrals. In this paper we tried to build up a Monte Carlo code that can be used to calculate secondary electron yields for different types of particles. The obtained results are in good agreement with the analytical results of Phelps and Petrović [Plasma Sourc. Sci. Technol. 8 (1999) R1].

  5. On nonlinear Markov chain Monte Carlo

    CERN Document Server

    Andrieu, Christophe; Doucet, Arnaud; Del Moral, Pierre; 10.3150/10-BEJ307

    2011-01-01

    Let $\mathscr{P}(E)$ be the space of probability measures on a measurable space $(E,\mathcal{E})$. In this paper we introduce a class of nonlinear Markov chain Monte Carlo (MCMC) methods for simulating from a probability measure $\pi\in\mathscr{P}(E)$. Nonlinear Markov kernels (see [Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications (2004) Springer]) $K:\mathscr{P}(E)\times E\rightarrow\mathscr{P}(E)$ can be constructed to, in some sense, improve over MCMC methods. However, such nonlinear kernels cannot be simulated exactly, so approximations of the nonlinear kernels are constructed using auxiliary or potentially self-interacting chains. Several nonlinear kernels are presented and it is demonstrated that, under some conditions, the associated approximations exhibit a strong law of large numbers; our proof technique is via the Poisson equation and Foster-Lyapunov conditions. We investigate the performance of our approximations with some simulations.

  6. Monte Carlo exploration of warped Higgsless models

    Energy Technology Data Exchange (ETDEWEB)

    Hewett, JoAnne L.; Lillie, Benjamin; Rizzo, Thomas Gerard [Stanford Linear Accelerator Center, 2575 Sand Hill Rd., Menlo Park, CA, 94025 (United States)]. E-mail: rizzo@slac.stanford.edu

    2004-10-01

    We have performed a detailed Monte Carlo exploration of the parameter space for a warped Higgsless model of electroweak symmetry breaking in 5 dimensions. This model is based on the SU(2){sub L} x SU(2){sub R} x U(1){sub B-L} gauge group in an AdS{sub 5} bulk with arbitrary gauge kinetic terms on both the Planck and TeV branes. Constraints arising from precision electroweak measurements and collider data are found to be relatively easy to satisfy. We show, however, that the additional requirement of perturbative unitarity up to the cut-off, {approx_equal} 10 TeV, in W{sub L}{sup +}W{sub L}{sup -} elastic scattering in the absence of dangerous tachyons eliminates all models. If successful models of this class exist, they must be highly fine-tuned. (author)

  7. Monte Carlo Exploration of Warped Higgsless Models

    CERN Document Server

    Hewett, J L; Rizzo, T G

    2004-01-01

    We have performed a detailed Monte Carlo exploration of the parameter space for a warped Higgsless model of electroweak symmetry breaking in 5 dimensions. This model is based on the $SU(2)_L\times SU(2)_R\times U(1)_{B-L}$ gauge group in an AdS$_5$ bulk with arbitrary gauge kinetic terms on both the Planck and TeV branes. Constraints arising from precision electroweak measurements and collider data are found to be relatively easy to satisfy. We show, however, that the additional requirement of perturbative unitarity up to the cut-off, $\simeq 10$ TeV, in $W_L^+W_L^-$ elastic scattering in the absence of dangerous tachyons eliminates all models. If successful models of this class exist, they must be highly fine-tuned.

  8. Monte Carlo Implementation of Polarized Hadronization

    CERN Document Server

    Matevosyan, Hrayr H; Thomas, Anthony W

    2016-01-01

    We study the polarized quark hadronization in a Monte Carlo (MC) framework based on the recent extension of the quark-jet framework, where a self-consistent treatment of the quark polarization transfer in a sequential hadronization picture has been presented. Here, we first adopt this approach for MC simulations of the hadronization process with a finite number of produced hadrons, expressing the relevant probabilities in terms of the eight leading twist quark-to-quark transverse momentum dependent (TMD) splitting functions (SFs) for the elementary $q \to q'+h$ transition. We present explicit expressions for the unpolarized and Collins fragmentation functions (FFs) of unpolarized hadrons emitted at rank two. Further, we demonstrate that all the current spectator-type model calculations of the leading twist quark-to-quark TMD SFs violate the positivity constraints, and propose a quark-model based ansatz for these input functions that circumvents the problem. We validate our MC framework by explicitly proving the absence o...

  9. Commensurabilities between ETNOs: a Monte Carlo survey

    CERN Document Server

    Marcos, C de la Fuente

    2016-01-01

    Many asteroids in the main and trans-Neptunian belts are trapped in mean motion resonances with Jupiter and Neptune, respectively. As a side effect, they experience accidental commensurabilities among themselves. These commensurabilities define characteristic patterns that can be used to trace the source of the observed resonant behaviour. Here, we explore systematically the existence of commensurabilities between the known ETNOs using their heliocentric and barycentric semimajor axes, their uncertainties, and Monte Carlo techniques. We find that the commensurability patterns present in the known ETNO population resemble those found in the main and trans-Neptunian belts. Although based on small number statistics, such patterns can only be properly explained if most, if not all, of the known ETNOs are subjected to the resonant gravitational perturbations of yet undetected trans-Plutonian planets. We show explicitly that some of the statistically significant commensurabilities are compatible with the Planet Nin...

  10. Variable length trajectory compressible hybrid Monte Carlo

    CERN Document Server

    Nishimura, Akihiko

    2016-01-01

    Hybrid Monte Carlo (HMC) generates samples from a prescribed probability distribution in a configuration space by simulating Hamiltonian dynamics, followed by the Metropolis (-Hastings) acceptance/rejection step. Compressible HMC (CHMC) generalizes HMC to a situation in which the dynamics is reversible but not necessarily Hamiltonian. This article presents a framework to further extend the algorithm. Within the existing framework, each trajectory of the dynamics must be integrated for the same amount of (random) time to generate a valid Metropolis proposal. Our generalized acceptance/rejection mechanism allows a more deliberate choice of the integration time for each trajectory. The proposed algorithm in particular enables an effective application of variable step size integrators to HMC-type sampling algorithms based on reversible dynamics. The potential of our framework is further demonstrated by another extension of HMC which reduces the wasted computations due to unstable numerical approximations and corr...
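
    As background for the compressible and variable-trajectory-length generalizations described above, the Python sketch below shows plain HMC for a one-dimensional Gaussian target: leapfrog integration of Hamiltonian dynamics followed by a Metropolis accept/reject step. It is the baseline algorithm only, not the paper's extension, and the step size and trajectory length are arbitrary.

```python
import numpy as np

# Plain HMC for a one-dimensional standard-normal target, U(q) = q^2 / 2.
# This is the baseline algorithm that compressible HMC and the variable
# trajectory-length extension generalize; it is not the paper's method.
def U(q):      return 0.5 * q * q
def grad_U(q): return q

def hmc_step(q, rng, step_size=0.2, n_leapfrog=20):
    p = rng.standard_normal()                  # resample the momentum
    q_new, p_new = q, p
    # Leapfrog integration of the Hamiltonian dynamics
    p_new -= 0.5 * step_size * grad_U(q_new)
    for _ in range(n_leapfrog - 1):
        q_new += step_size * p_new
        p_new -= step_size * grad_U(q_new)
    q_new += step_size * p_new
    p_new -= 0.5 * step_size * grad_U(q_new)
    # Metropolis accept/reject on the change in total energy
    dH = (U(q_new) + 0.5 * p_new ** 2) - (U(q) + 0.5 * p ** 2)
    return q_new if np.log(rng.random()) < -dH else q

rng = np.random.default_rng(1)
q, samples = 0.0, []
for _ in range(5000):
    q = hmc_step(q, rng)
    samples.append(q)
print("sample mean %.3f, sample variance %.3f (target: 0, 1)"
      % (np.mean(samples), np.var(samples)))
```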

  11. Lunar Regolith Albedos Using Monte Carlos

    Science.gov (United States)

    Wilson, T. L.; Andersen, V.; Pinsky, L. S.

    2003-01-01

    The analysis of planetary regoliths for their backscatter albedos produced by cosmic rays (CRs) is important for space exploration and its potential contributions to science investigations in fundamental physics and astrophysics. Albedos affect all such experiments and the personnel that operate them. Groups have analyzed the production rates of various particles and elemental species by planetary surfaces when bombarded with Galactic CR fluxes, both theoretically and by means of various transport codes, some of which have emphasized neutrons. Here we report on the preliminary results of our current Monte Carlo investigation into the production of charged particles, neutrons, and neutrinos by the lunar surface using FLUKA. In contrast to previous work, the effects of charm are now included.

  12. Nuclear reactions in Monte Carlo codes.

    Science.gov (United States)

    Ferrari, A; Sala, P R

    2002-01-01

    The physics foundations of hadronic interactions as implemented in most Monte Carlo codes are presented together with a few practical examples. The description of the relevant physics is presented schematically split into the major steps in order to stress the different approaches required for the full understanding of nuclear reactions at intermediate and high energies. Due to the complexity of the problem, only a few semi-qualitative arguments are developed in this paper. The description will be necessarily schematic and somewhat incomplete, but hopefully it will be useful for a first introduction into this topic. Examples are shown mostly for the high energy regime, where all mechanisms mentioned in the paper are at work and to which perhaps most of the readers are less accustomed. Examples for lower energies can be found in the references.

  13. Helminthiases in Montes Claros. Preliminary survey

    Directory of Open Access Journals (Sweden)

    Rina Girard Kaminsky

    1976-04-01

    Full Text Available A preliminary survey was conducted for the presence of helminths in the city of Montes Claros, M. G., Brazil. Three groups of persons were examined by the direct smear, Kato thick film and MIFC techniques; one group by direct smear and Kato only. General findings were: a high prevalence of hookworm, followed by ascariasis, S. mansoni, S. stercoralis and very light infections with T. trichiurá. E. vermicularis and H. nana were ranking parasites at an orphanage, with some hookworm and S. mansoni infections as well. At a pig slaughter house, the dominant parasites were hookworm and S. mansoni. Pig cysticercosis was an incidental finding worth mentioning for the health hazard it represents for humans as well as an economic loss. From the comparative results between the Kato and the MIF the former proved itself again as a more sensitive and reliable concentration method for helminth eggs, of low cost and easy performance.

  14. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches, as assessed by calculation of molecular energies and entropies. We also show transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol.

  15. Geometric Monte Carlo and Black Janus Geometries

    CERN Document Server

    Bak, Dongsu; Kim, Kyung Kiu; Min, Hyunsoo; Song, Jeong-Pil

    2016-01-01

    We describe an application of the Monte Carlo method to the Janus deformation of the black brane background. We present numerical results for three and five dimensional black Janus geometries with planar and spherical interfaces. In particular, we argue that the 5D geometry with a spherical interface has an application in understanding the finite temperature bag-like QCD model via the AdS/CFT correspondence. The accuracy and convergence of the algorithm are evaluated with respect to the grid spacing. The systematic errors of the method are determined using an exact solution of 3D black Janus. This numerical approach for solving linear problems is unaffected by the initial guess of the trial solution and can handle an arbitrary geometry under various boundary conditions in the presence of source fields.

  16. Modeling neutron guides using Monte Carlo simulations

    CERN Document Server

    Wang, D Q; Crow, M L; Wang, X L; Lee, W T; Hubbard, C R

    2002-01-01

    Four neutron guide geometries, straight, converging, diverging and curved, were characterized using Monte Carlo ray-tracing simulations. The main areas of interest are the transmission of the guides at various neutron energies and the intrinsic time-of-flight (TOF) peak broadening. Use of a delta-function time pulse from a uniform Lambert neutron source allows one to quantitatively simulate the effect of guides' geometry on the TOF peak broadening. With a converging guide, the intensity and the beam divergence increases while the TOF peak width decreases compared with that of a straight guide. By contrast, use of a diverging guide decreases the intensity and the beam divergence, and broadens the width (in TOF) of the transmitted neutron pulse.

  17. Accurate barrier heights using diffusion Monte Carlo

    CERN Document Server

    Krongchon, Kittithat; Wagner, Lucas K

    2016-01-01

    Fixed node diffusion Monte Carlo (DMC) has been performed on a test set of forward and reverse barrier heights for 19 non-hydrogen-transfer reactions, and the nodal error has been assessed. The DMC results are robust to changes in the nodal surface, as assessed by using different mean-field techniques to generate single determinant wave functions. Using these single determinant nodal surfaces, DMC results in errors of 1.5(5) kcal/mol on barrier heights. Using the large data set of DMC energies, we attempted to find good descriptors of the fixed node error. It does not correlate with a number of descriptors including change in density, but does correlate with the gap between the highest occupied and lowest unoccupied orbital energies in the mean-field calculation.

  18. Accelerated GPU based SPECT Monte Carlo simulations

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-01

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99m Tc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency

  19. Monte Carlo modelling of TRIGA research reactor

    Science.gov (United States)

    El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.

    2010-10-01

    The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively support the various fields of basic nuclear research, manpower training, and the production of radioisotopes for use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and the validation of the results by comparisons with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core with literally no physical approximation. Continuous energy cross-section data from the more recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3) as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code were used. The cross-section libraries were generated using the NJOY99 system updated to its most recent patch file "up259". The consistency and accuracy of both the Monte Carlo simulation and the neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rod worths, as well as power peaking factors were used in the validation process. Results of the calculations are analysed and discussed.

  20. Accelerated GPU based SPECT Monte Carlo simulations.

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m) Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational

  1. Kontrola tačnosti rezultata u simulacijama Monte Karlo / Accuracy control in Monte Carlo simulations

    Directory of Open Access Journals (Sweden)

    Nebojša V. Nikolić

    2010-04-01

    Full Text Available The paper presents an application of the Automated Independent Replication with Gathering Statistics of the Stochastic Processes Method in achieving and controlling the accuracy of simulation results in Monte Carlo queuing simulations. The method is based on the application of the basic theorems of the theory of probability and mathematical statistics. The accuracy of the simulation results is directly linked with the number of independent replications of the simulation experiments.

  2. Fission Matrix Capability for MCNP Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Carney, Sean E. [Los Alamos National Laboratory; Brown, Forrest B. [Los Alamos National Laboratory; Kiedrowski, Brian C. [Los Alamos National Laboratory; Martin, William R. [Los Alamos National Laboratory

    2012-09-05

    In a Monte Carlo criticality calculation, before the tallying of quantities can begin, a converged fission source (the fundamental eigenvector of the fission kernel) is required. Tallies of interest may include powers, absorption rates, leakage rates, or the multiplication factor (the fundamental eigenvalue of the fission kernel, k{sub eff}). Just as in the power iteration method of linear algebra, if the dominance ratio (the ratio of the first and zeroth eigenvalues) is high, many iterations of neutron history simulations are required to isolate the fundamental mode of the problem. Optically large systems have large dominance ratios, and systems containing poor neutron communication between regions are also slow to converge. The fission matrix method, implemented into MCNP[1], addresses these problems. When Monte Carlo random walk from a source is executed, the fission kernel is stochastically applied to the source. Random numbers are used for: distances to collision, reaction types, scattering physics, fission reactions, etc. This method is used because the fission kernel is a complex, 7-dimensional operator that is not explicitly known. Deterministic methods use approximations/discretization in energy, space, and direction to the kernel. Consequently, they are faster. Monte Carlo directly simulates the physics, which necessitates the use of random sampling. Because of this statistical noise, common convergence acceleration methods used in deterministic methods do not work. In the fission matrix method, we are using the random walk information not only to build the next-iteration fission source, but also a spatially-averaged fission kernel. Just like in deterministic methods, this involves approximation and discretization. The approximation is the tallying of the spatially-discretized fission kernel with an incorrect fission source. We address this by making the spatial mesh fine enough that this error is negligible. As a consequence of discretization we get a
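
    Once the spatially averaged fission kernel has been tallied, the fundamental fission source and k_eff can be extracted with an ordinary power iteration, as in the hedged Python sketch below. The 3x3 matrix is a made-up illustration rather than MCNP output, and the tallying of the matrix from random-walk histories is not shown.

```python
import numpy as np

# Once a spatially averaged fission matrix F has been tallied (F[i, j] being
# the expected number of fission neutrons born in region i per fission
# neutron born in region j), the fundamental mode and k_eff follow from an
# ordinary power iteration. The 3x3 matrix below is a made-up illustration,
# not MCNP output.
F = np.array([[0.60, 0.20, 0.02],
              [0.20, 0.55, 0.20],
              [0.02, 0.20, 0.60]])

def fundamental_mode(F, tol=1e-12, max_iter=10000):
    s = np.ones(F.shape[0]) / F.shape[0]   # initial fission source guess
    k_old = 0.0
    for _ in range(max_iter):
        s_next = F @ s
        k = s_next.sum()                   # eigenvalue estimate (s sums to 1)
        s = s_next / k                     # renormalize the fission source
        if abs(k - k_old) < tol:
            break
        k_old = k
    return k, s

k_eff, source = fundamental_mode(F)
print("k_eff = %.5f, fission source = %s" % (k_eff, np.round(source, 4)))
```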

  3. Vectorized Monte Carlo methods for reactor lattice analysis

    Science.gov (United States)

    Brown, F. B.

    1984-01-01

    Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for use on the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.

  4. Quantum Monte Carlo methods algorithms for lattice models

    CERN Document Server

    Gubernatis, James; Werner, Philipp

    2016-01-01

    Featuring detailed explanations of the major algorithms used in quantum Monte Carlo simulations, this is the first textbook of its kind to provide a pedagogical overview of the field and its applications. The book provides a comprehensive introduction to the Monte Carlo method, its use, and its foundations, and examines algorithms for the simulation of quantum many-body lattice problems at finite and zero temperature. These algorithms include continuous-time loop and cluster algorithms for quantum spins, determinant methods for simulating fermions, power methods for computing ground and excited states, and the variational Monte Carlo method. Also discussed are continuous-time algorithms for quantum impurity models and their use within dynamical mean-field theory, along with algorithms for analytically continuing imaginary-time quantum Monte Carlo data. The parallelization of Monte Carlo simulations is also addressed. This is an essential resource for graduate students, teachers, and researchers interested in ...

  5. Monte-Carlo Simulation of Ising Model

    Institute of Scientific and Technical Information of China (English)

    吴国军; 胡经国

    2000-01-01

    On a planar square lattice, within the framework of the Ising model, the phase diagrams of a ferromagnetic system under helical, semi-free and free boundary conditions were simulated on an IBM PC using the Monte-Carlo method, and the results were compared with those obtained under periodic boundary conditions.
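
    For reference, a standard single-spin-flip Metropolis simulation of the two-dimensional Ising model with periodic boundaries (the baseline case the record compares against) looks like the sketch below; lattice size, temperature and sweep count are arbitrary, and other boundary conditions would only change how neighbours are looked up.

```python
# Metropolis Monte Carlo of the 2-D Ising model with periodic boundaries (sketch).
import numpy as np

rng = np.random.default_rng(2)
L, T, sweeps = 32, 2.269, 500                 # lattice size, temperature (J/k_B), MC sweeps
spins = rng.choice([-1, 1], size=(L, L))

for _ in range(sweeps):
    for _ in range(L * L):                    # one sweep = L*L single-spin attempts
        i, j = rng.integers(0, L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb           # energy change of flipping spin (i, j), J = 1
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

print("magnetization per spin:", abs(int(spins.sum())) / (L * L))
```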

  6. Monte Carlo 2000 Conference : Advanced Monte Carlo for Radiation Physics, Particle Transport Simulation and Applications

    CERN Document Server

    Baräo, Fernando; Nakagawa, Masayuki; Távora, Luis; Vaz, Pedro

    2001-01-01

    This book focusses on the state of the art of Monte Carlo methods in radiation physics and particle transport simulation and applications, the latter involving, in particular, the use and development of electron-gamma, neutron-gamma and hadronic codes. Besides the basic theory and the methods employed, special attention is paid to algorithm development for modeling and to the analysis of experiments and measurements in a variety of fields ranging from particle physics to medical physics.

  7. Srna - Monte Carlo codes for proton transport simulation in combined and voxelized geometries

    Directory of Open Access Journals (Sweden)

    Ilić Radovan D.

    2002-01-01

    Full Text Available This paper describes new Monte Carlo codes for proton transport simulations in complex geometrical forms and in materials of different composition. The SRNA codes were developed for three-dimensional (3D) dose distribution calculation in proton therapy and dosimetry. The model of these codes is based on the theory of proton multiple scattering and a simple model of compound-nucleus decay. The developed package consists of two codes: SRNA-2KG and SRNA-VOX. The first code simulates proton transport in combined geometry that can be described by planes and second-order surfaces. The second one uses a voxelized geometry of material zones and is specifically adapted to the use of patient computed tomography data. Transition probabilities for both codes are given by the SRNADAT program. In this paper we present the models and algorithms of our programs, as well as the results of the numerical experiments we have carried out with them, along with the results of proton transport simulations obtained with the PETRA and GEANT programs. The simulation of proton beam characterization by means of a multi-layer Faraday cup and of the spatial distribution of positron emitters obtained with our program indicates that Monte Carlo techniques are approaching application in clinical practice.
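
    As a rough illustration of the kind of physics such codes sample per step (not the SRNA algorithm itself), the sketch below walks a proton through water on a fixed voxel-sized step, drawing a small-angle deflection from the Highland multiple-scattering formula; the constant stopping power and all parameter values are simplifications chosen for the example.

```python
# Toy proton random walk with Highland multiple scattering (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(9)
X0 = 36.08                    # radiation length of water, cm (approximate)
step = 0.1                    # step length, cm (one voxel)
E, m_p = 150.0, 938.272       # proton kinetic energy (MeV) and rest mass (MeV/c^2)

x, z, theta = 0.0, 0.0, 0.0   # lateral position, depth, accumulated deflection
while E > 10.0 and z < 30.0:
    p = np.sqrt(E * (E + 2.0 * m_p))          # momentum, MeV/c
    beta = p / (E + m_p)
    # Highland formula for the RMS multiple-scattering angle over one step.
    theta0 = (13.6 / (beta * p)) * np.sqrt(step / X0) * (1.0 + 0.038 * np.log(step / X0))
    theta += rng.normal(0.0, theta0)
    z += step * np.cos(theta)
    x += step * np.sin(theta)
    E -= step * 5.0           # crude constant stopping power (~5 MeV/cm, toy value)

print(f"stopped at depth {z:.1f} cm with lateral offset {x:.2f} cm")
```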

  8. Monte Carlo solution of the volume-integral equation of electromagnetic scattering

    Science.gov (United States)

    Peltoniemi, J.; Muinonen, K.

    2014-07-01

    regular quadratures. Because of the oscillating singularity of the Green's function, the quadrature must match exactly the canceling patterns of the integrand, and any improper quadrature leads to large errors. Monte Carlo-based integration thus appears a very poor choice, but we take the challenge and formulate the integration applying a three-finger rule to catch the singularity. Our other selections are the least-squares technique and a plane-wave basis, though both can be freely and easily changed. The singularity is treated fully numerically, and the radius ρ is assumed so small that the correction terms do not contribute. Any other choice only worsens the accuracy, without a significant gain in speed. As with any other technique, we can solve small spheres over a range of size parameters x and refractive indices. In speed, this technique does not compete with faster techniques such as ADDA, but in some random cases the accuracy can be even better (probably due to a sub-optimal singularity formula in ADDA; applying numerical integration there as well would probably make ADDA the winner in all cases). We continue towards more complicated cases and multiple scattering to see if further improvements can be made.
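
    The record's specific quadrature (the "three-finger rule") is not spelled out here, but the general trick of taming an integrable singularity before Monte Carlo sampling can be shown in one dimension: a change of variables removes the 1/sqrt(r) blow-up, after which plain uniform sampling behaves well. The integrand below is invented for the illustration.

```python
# Monte Carlo integration of a singular integrand, before and after a
# variance-removing change of variables (generic illustration).
# Target: integral of cos(r)/sqrt(r) over [0, 1], which is about 1.8090.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Naive sampling: the estimator has infinite variance near r = 0 and converges erratically.
r = rng.random(n)
naive = np.mean(np.cos(r) / np.sqrt(r))

# Substitution r = u^2, dr = 2u du  =>  the integrand becomes 2*cos(u^2), which is bounded.
u = rng.random(n)
smooth = np.mean(2.0 * np.cos(u ** 2))

print("naive estimate:      ", naive)
print("transformed estimate:", smooth)
```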

  9. Parallelization of a Monte Carlo particle transport simulation code

    Science.gov (United States)

    Hadjidoukas, P.; Bousis, C.; Emfietzoglou, D.

    2010-05-01

    We have developed a high-performance version of the Monte Carlo particle transport simulation code MC4. The original application code, developed in Visual Basic for Applications (VBA) for Microsoft Excel, was first rewritten in the C programming language to improve code portability. Several pseudo-random number generators have also been integrated and studied. The new MC4 version was then parallelized for shared- and distributed-memory multiprocessor systems using the Message Passing Interface. Two parallel pseudo-random number generator libraries (SPRNG and DCMT) have been seamlessly integrated. The performance speedup of parallel MC4 has been studied on a variety of parallel computing architectures, including an Intel Xeon server with 4 dual-core processors, a Sun cluster consisting of 16 nodes of 2 dual-core AMD Opteron processors, and a 200 dual-processor HP cluster. For large problem sizes, which are limited only by the physical memory of the multiprocessor server, the speedup results are almost linear on all systems. We have validated the parallel implementation against the serial VBA and C implementations using the same random number generator. Our experimental results on the transport and energy loss of electrons in a water medium show that the serial and parallel codes are equivalent in accuracy. The present improvements allow the study of higher particle energies with more accurate physical models and improve statistics, since more particle tracks can be simulated in a low response time.
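
    The parallelization pattern described above is simple to sketch: each worker runs an independent batch of histories with its own random stream, and the partial tallies are combined at the end. The example below uses Python's multiprocessing module as a stand-in for the MPI approach in the paper; the toy energy-loss model and all parameter values are invented.

```python
# Embarrassingly parallel Monte Carlo with independent streams per worker (sketch).
import multiprocessing as mp
import random


def batch(args):
    """Simulate n toy electron histories; return the summed deposited 'energy'."""
    seed, n = args
    rng = random.Random(seed)                  # independent random stream per worker
    dep = 0.0
    for _ in range(n):
        e = 1.0                                # arbitrary starting energy
        while e > 0.01:
            loss = e * rng.uniform(0.1, 0.5)   # toy energy-loss model
            dep += loss
            e -= loss
    return dep


if __name__ == "__main__":
    workers, histories = 4, 100_000
    with mp.Pool(workers) as pool:
        parts = pool.map(batch, [(seed, histories // workers) for seed in range(workers)])
    print("mean deposited energy per history:", sum(parts) / histories)
```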

  10. Application of advanced Monte Carlo Methods in numerical dosimetry.

    Science.gov (United States)

    Reichelt, U; Henniger, J; Lange, C

    2006-01-01

    Many tasks in different sectors of dosimetry are very complex and highly sensitive to changes in the radiation field. Often, only the simulation of radiation transport is capable of describing the radiation field completely. Down to sub-cellular dimensions, the energy deposition by cascades of secondary electrons is the main pathway for damage induction in matter, and a large number of interactions take place until such electrons are slowed down to thermal energies. For some photon-transport problems, a large number of photon histories also needs to be processed. For these reasons the efficient non-analogue Monte Carlo program AMOS has been developed for photon and electron transport. Various applications and benchmarks are presented to show its capabilities. For radiotherapy purposes, the radiation field of a brachytherapy source is calculated according to the American Association of Physicists in Medicine Task Group Report 43 (AAPM/TG43). As additional examples, results for the detector efficiency of a high-purity germanium (HPGe) detector and a dose estimate for an X-ray shielding for radiation protection are shown.
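
    "Non-analogue" here means that variance-reduction tricks replace the literal simulation of every event. The sketch below illustrates one such classic trick, implicit capture with Russian roulette, on a toy forward-only photon slab problem; it is a generic textbook device, not an algorithm taken from AMOS.

```python
# Implicit capture with Russian roulette on a toy slab-transmission problem (sketch).
import numpy as np

rng = np.random.default_rng(4)
n = 50_000
sigma_t, sigma_a, slab = 1.0, 0.4, 3.0        # toy cross sections (1/cm) and thickness (cm)

transmitted_weight = 0.0
for _ in range(n):
    x, w = 0.0, 1.0
    while True:
        x += rng.exponential(1.0 / sigma_t)   # flight to the next collision (forward only)
        if x > slab:
            transmitted_weight += w           # score the weighted leakage
            break
        w *= 1.0 - sigma_a / sigma_t          # implicit capture: reduce the weight
        if w < 0.1:                           # Russian roulette on low-weight histories
            if rng.random() < 0.5:
                break
            w *= 2.0

print("transmission estimate:", transmitted_weight / n)
```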

  11. Searching for efficient Markov chain Monte Carlo proposal kernels.

    Science.gov (United States)

    Yang, Ziheng; Rodríguez, Carlos E

    2013-11-26

    Markov chain Monte Carlo (MCMC) or the Metropolis-Hastings algorithm is a simulation algorithm that has made modern Bayesian statistical inference possible. Nevertheless, the efficiency of different Metropolis-Hastings proposal kernels has rarely been studied except for the Gaussian proposal. Here we propose a unique class of Bactrian kernels, which avoid proposing values that are very close to the current value, and compare their efficiency with a number of proposals for simulating different target distributions, with efficiency measured by the asymptotic variance of a parameter estimate. The uniform kernel is found to be more efficient than the Gaussian kernel, whereas the Bactrian kernel is even better. When optimal scales are used for both, the Bactrian kernel is at least 50% more efficient than the Gaussian. Implementation in a Bayesian program for molecular clock dating confirms the general applicability of our results to generic MCMC algorithms. Our results refute a previous claim that all proposals had nearly identical performance and will prompt further research into efficient MCMC proposals.
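
    A minimal way to see the comparison is a random-walk Metropolis sampler on a simple target with interchangeable proposal kernels. In the sketch below the "Bactrian" proposal is built as an equal mixture of two Gaussians displaced away from the current point (one common construction with mean zero and unit-scaled variance), and the lag-1 autocorrelation is used as a crude efficiency proxy; the scales and the target are arbitrary choices, not the paper's setup.

```python
# Random-walk Metropolis with Gaussian versus Bactrian proposals (illustrative sketch).
import numpy as np

rng = np.random.default_rng(5)
log_target = lambda x: -0.5 * x * x           # standard normal target (up to a constant)


def metropolis(proposal, n=100_000, x0=0.0):
    x, chain = x0, np.empty(n)
    for i in range(n):
        y = x + proposal()
        if np.log(rng.random()) < log_target(y) - log_target(x):
            x = y
        chain[i] = x
    return chain


sigma, m = 2.4, 0.95                          # step scale; Bactrian "hump" offset
gauss = lambda: sigma * rng.standard_normal()
bactrian = lambda: sigma * (np.sign(rng.random() - 0.5) * m
                            + np.sqrt(1 - m * m) * rng.standard_normal())

for name, prop in [("Gaussian", gauss), ("Bactrian", bactrian)]:
    c = metropolis(prop)
    lag1 = np.corrcoef(c[:-1], c[1:])[0, 1]
    print(f"{name:9s} proposal: lag-1 autocorrelation {lag1:.3f} (lower is better)")
```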

  12. Foam A General Purpose Cellular Monte Carlo Event Generator

    CERN Document Server

    Jadach, Stanislaw

    2003-01-01

    A general-purpose, self-adapting Monte Carlo (MC) event generator (simulator) is described. The high efficiency of the MC, that is, a small maximum weight or variance of the MC weight, is achieved by dividing the integration domain into small cells. The cells can be n-dimensional simplices, hyperrectangles, or Cartesian products of them. The grid of cells, called the "foam", is produced by binary splitting of the cells. The choice of the next cell to be divided and the position/direction of the division hyperplane are driven by an algorithm which optimizes the ratio of the maximum weight to the average weight or (optionally) the total variance. The algorithm is able to deal, in principle, with an arbitrary pattern of singularities in the distribution. Like any MC generator, it can also be used for MC integration. With a typical personal computer CPU, the program is able to perform adaptive integration/simulation at a relatively small number of dimensions (≤ 16). With the continu...
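
    A drastically simplified, one-dimensional cousin of this cell-splitting idea is sketched below: the domain is repeatedly split at the midpoint of whichever cell currently shows the largest spread, and the resulting grid gives a stratified estimate of the integral. It only conveys the flavour of the approach; Foam's actual weight-optimization and n-dimensional cell machinery are not reproduced.

```python
# Adaptive binary cell splitting followed by a stratified MC estimate (toy sketch).
import numpy as np

rng = np.random.default_rng(6)
f = lambda x: 1.0 / np.sqrt(np.abs(x - 0.3) + 1e-3)   # sharply peaked integrand on [0, 1]


def explore(cell, n=500):
    """Rough MC estimate of a cell's integral and of its spread (used to pick splits)."""
    a, b = cell
    y = f(rng.uniform(a, b, n))
    return (b - a) * y.mean(), (b - a) * y.std()


cells = [(0.0, 1.0)]
for _ in range(30):                          # grow the "foam" by 30 binary splits
    spreads = [explore(c)[1] for c in cells]
    k = int(np.argmax(spreads))              # split the cell that dominates the spread
    a, b = cells.pop(k)
    mid = 0.5 * (a + b)
    cells += [(a, mid), (mid, b)]

# Stratified final estimate: each cell is sampled independently and the contributions summed.
parts = [explore(c, 2000)[0] for c in cells]
print("cells:", len(cells), " integral estimate:", round(sum(parts), 4))
```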

  13. Monte Carlo simulations of the SANS instrument in Petten

    Energy Technology Data Exchange (ETDEWEB)

    Uca, O. [European Commission, Joint Research Centre, Institute for Energy, Westerduinweg 3, 1755 LE, Petten (Netherlands)], E-mail: oktay.uca@jrc.nl; Ohms, C. [European Commission, Joint Research Centre, Institute for Energy, Westerduinweg 3, 1755 LE, Petten (Netherlands)], E-mail: carsten.ohms@jrc.nl

    2008-11-30

    The small-angle neutron-scattering facility at the 45 MW high-flux reactor in Petten, The Netherlands, was constructed in the late 1980s. It has a q-range of 5×10⁻³ to 0.4 Å⁻¹, operating at a fixed wavelength of 4.75 Å, which is realized by six pairs of double pyrolytic graphite monochromators. In this paper, we study the flux gain for the instrument installed at a neutron guide by Monte Carlo simulations using the program packages McStas [L. Lefmann, K. Nielsen, Neutron News 10 (1999) 320; P. Willendrup, E. Farhi and K. Lefmann, Physica B 350 (2004) 735] and Vitess [G. Zsigmond et al., Nucl. Instrum. Methods A 529 (2004) 218; (http://www.hmi.de/projects/ess/vitess/)]. In doing so, the instrument is relocated from its current position to the HB10 radial beam tube, the double monochromator is replaced by a velocity selector and neutron guides are used for transporting the neutrons.

  14. Monte Carlo simulations of the SANS instrument in Petten

    Science.gov (United States)

    Uca, O.; Ohms, C.

    2008-11-01

    The small-angle neutron-scattering facility at the 45 MW high-flux reactor in Petten, The Netherlands, was constructed in the late 1980s. It has a q-range of 5×10⁻³ to 0.4 Å⁻¹, operating at a fixed wavelength of 4.75 Å, which is realized by six pairs of double pyrolytic graphite monochromators. In this paper, we study the flux gain for the instrument installed at a neutron guide by Monte Carlo simulations using the program packages McStas [L. Lefmann, K. Nielsen, Neutron News 10 (1999) 320; P. Willendrup, E. Farhi and K. Lefmann, Physica B 350 (2004) 735] and Vitess [G. Zsigmond et al., Nucl. Instrum. Methods A 529 (2004) 218; http://www.hmi.de/projects/ess/vitess/]. In doing so, the instrument is relocated from its current position to the HB10 radial beam tube, the double monochromator is replaced by a velocity selector and neutron guides are used for transporting the neutrons.

  15. Monte Carlo simulation of gamma ray tomography for image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Guedes, Karlos A.N.; Moura, Alex; Dantas, Carlos; Melo, Silvio; Lima, Emerson, E-mail: karlosguedes@hotmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Meric, Ilker [University of Bergen (Norway)

    2015-07-01

    Monte Carlo simulations of an object of known density and shape were validated against gamma-ray tomography in static experiments. An aluminum half-moon piece placed inside a steel pipe served as the MC simulation test object and was also measured by means of gamma-ray transmission. The wall effect of the steel pipe, due to the irradiation geometry of a single source-detector-pair tomography, was evaluated by comparison with theoretical data. The MCNPX code requires a geometry to be defined for each photon trajectory, which in practice prevents its direct use for tomographic reconstruction simulation. The solution was a program written in the Delphi language that automates the creation of input files. Simulations of tomography data with the automated MCNPX code were carried out and validated against experimental data; the data produced in this sequence were stored in a databank. The experimental setup used a Cesium-137 isotopic radioactive source (7.4 × 10⁹ Bq) and a NaI(Tl) scintillation detector with a (51 × 51) × 10⁻³ m crystal coupled to a multichannel analyzer, together with a stainless steel tube of 0.154 m internal diameter and 0.014 m wall thickness. The results show that the MCNPX simulation code adapted to automated input files is useful for generating the data matrix M(θ,t) of a computerized gamma-ray tomography for any object of known density and regular shape. Experimental validation used the RMSE of the gamma-ray paths and of the attenuation coefficient data. (author)
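
    The automation step the record describes, generating one MCNPX input deck per projection so the whole scan can be simulated in batch, is easy to sketch. The original program was written in Delphi; the file names, orbit radius and deck template below are placeholders (not a complete or valid MCNPX deck), with only the source position and the 0.662 MeV Cs-137 line hinting at the real setup.

```python
# Batch generation of placeholder input decks, one per tomographic projection (sketch).
import math
import pathlib

template = """gamma tomography, projection at {angle:.1f} deg
c --- placeholder cell/surface/data cards would go here ---
sdef pos={xs:.3f} {ys:.3f} 0 erg=0.662
nps 1e7
"""

outdir = pathlib.Path("decks")
outdir.mkdir(exist_ok=True)
radius = 0.20                                  # source orbit radius in metres (assumed)

for k in range(36):                            # 36 projections, 10 degrees apart
    angle = 10.0 * k
    xs = radius * math.cos(math.radians(angle))
    ys = radius * math.sin(math.radians(angle))
    (outdir / f"proj_{k:02d}.inp").write_text(
        template.format(angle=angle, xs=xs, ys=ys))

print("wrote", len(list(outdir.glob("*.inp"))), "input decks")
```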

  16. Monte-Carlo methods for pricing European-style options

    Institute of Scientific and Technical Information of China (English)

    张丽虹

    2015-01-01

    We discuss Monte-Carlo methods for pricing European-style options. Based on the Black-Scholes model and risk-neutral valuation, we first describe in detail how the Monte-Carlo simulation method is used to price standard European options. We then discuss how control variates and antithetic variates can be introduced to improve the accuracy of the Monte-Carlo method. Finally, the Monte-Carlo methods are applied to price standard European options as well as European binary, lookback and Asian options, and the advantages and drawbacks of the methods are discussed.
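
    The risk-neutral pricing recipe referred to above is standard: simulate terminal asset prices under Black-Scholes dynamics, discount the average payoff, and optionally pair each normal draw with its negative (antithetic variates) to reduce the variance. The sketch below prices a European call with arbitrary example parameters.

```python
# Monte Carlo pricing of a European call, with and without antithetic variates (sketch).
import numpy as np

rng = np.random.default_rng(7)
S0, K, r, sigma, T, n = 100.0, 105.0, 0.03, 0.2, 1.0, 200_000

z = rng.standard_normal(n)
drift = (r - 0.5 * sigma**2) * T


def discounted_payoff(z):
    ST = S0 * np.exp(drift + sigma * np.sqrt(T) * z)     # terminal price under risk-neutral GBM
    return np.exp(-r * T) * np.maximum(ST - K, 0.0)      # discounted call payoff


plain = discounted_payoff(z)
antithetic = 0.5 * (discounted_payoff(z) + discounted_payoff(-z))   # pairwise average

for name, est in [("plain", plain), ("antithetic", antithetic)]:
    se = est.std(ddof=1) / np.sqrt(n)
    print(f"{name:11s}: price ≈ {est.mean():.4f}, std. error ≈ {se:.4f}")
```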

  17. Monte Carlo Methods in Materials Science Based on FLUKA and ROOT

    Science.gov (United States)

    Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor

    2003-01-01

    A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application, and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlo codes can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade) to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which Monte Carlo codes are particularly suited is the study of secondary radiation produced as albedos in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials with the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although a possible improvement of the DPMJET event generator for energies of 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy-ion interactions below 3 GeV/A. The ROOT interface is being developed in conjunction with the

  18. Iterative acceleration methods for Monte Carlo and deterministic criticality calculations

    Energy Technology Data Exchange (ETDEWEB)

    Urbatsch, T.J.

    1995-11-01

    If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors.

  19. A Monte Carlo approach for estimating measurement uncertainty using standard spreadsheet software.

    Science.gov (United States)

    Chew, Gina; Walczyk, Thomas

    2012-03-01

    Despite the importance of stating the measurement uncertainty in chemical analysis, concepts are still not widely applied by the broader scientific community. The Guide to the expression of uncertainty in measurement approves the use of both the partial derivative approach and the Monte Carlo approach. There are two limitations to the partial derivative approach. Firstly, it involves the computation of first-order derivatives of each component of the output quantity. This requires some mathematical skills and can be tedious if the mathematical model is complex. Secondly, it is not able to predict the probability distribution of the output quantity accurately if the input quantities are not normally distributed. Knowledge of the probability distribution is essential to determine the coverage interval. The Monte Carlo approach performs random sampling from probability distributions of the input quantities; hence, there is no need to compute first-order derivatives. In addition, it gives the probability density function of the output quantity as the end result, from which the coverage interval can be determined. Here we demonstrate how the Monte Carlo approach can be easily implemented to estimate measurement uncertainty using a standard spreadsheet software program such as Microsoft Excel. It is our aim to provide the analytical community with a tool to estimate measurement uncertainty using software that is already widely available and that is so simple to apply that it can even be used by students with basic computer skills and minimal mathematical knowledge.
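
    The three steps the abstract describes (sample the input quantities from their assigned distributions, evaluate the measurement model, and read the standard uncertainty and coverage interval off the output distribution) translate directly into a few lines of code. The sketch below uses an invented mass/volume example in Python; a spreadsheet implementation follows exactly the same pattern with one row per trial.

```python
# Monte Carlo propagation of measurement uncertainty (illustrative example model).
import numpy as np

rng = np.random.default_rng(8)
n = 100_000

# Example measurement model: concentration c = m / V with uncertain mass and volume.
m = rng.normal(10.00, 0.02, n)          # mass in mg, standard uncertainty 0.02 mg (assumed)
V = rng.normal(25.00, 0.05, n)          # volume in mL, standard uncertainty 0.05 mL (assumed)
c = m / V                               # output quantity, mg/mL

low, high = np.percentile(c, [2.5, 97.5])
print(f"c = {c.mean():.4f} mg/mL, u(c) = {c.std(ddof=1):.4f} mg/mL")
print(f"95% coverage interval: [{low:.4f}, {high:.4f}] mg/mL")
```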

  20. CPMC-Lab: A MATLAB package for Constrained Path Monte Carlo calculations

    Science.gov (United States)

    Nguyen, Huy; Shi, Hao; Xu, Jie; Zhang, Shiwei

    2014-12-01

    We describe CPMC-Lab, a MATLAB program for the constrained-path and phaseless auxiliary-field Monte Carlo methods. These methods have allowed applications ranging from the study of strongly correlated models, such as the Hubbard model, to ab initio calculations in molecules and solids. The present package implements the full ground-state constrained-path Monte Carlo (CPMC) method in MATLAB with a graphical interface, using the Hubbard model as an example. The package can perform calculations in finite supercells in any number of dimensions, under periodic or twist boundary conditions. Importance sampling and all other algorithmic details of a total energy calculation are included and illustrated. This open-source tool allows users to experiment with various model and run parameters and visualize the results. It provides a direct and interactive environment to learn the method and study the code with minimal setup overhead. Furthermore, the package can be easily generalized for auxiliary-field quantum Monte Carlo (AFQMC) calculations in many other models of correlated electron systems, and can serve as a template for developing a production code for AFQMC total energy calculations in real materials. Several illustrative studies are carried out in one- and two-dimensional lattices on the total energy, kinetic energy, potential energy, and charge and spin gaps.